I am working on a PHP project that is almost complete and has been uploaded to the production server for a client demo. The client is now requesting some changes, so I make those changes on my local server and later upload them to the production server. Since a single change can touch more than one file, it is really difficult for me to update the production server via FTP.
Is there any way to synchronize the changes made on my local server with the production server?
Is there any way to configure SVN on the production server?
Please help.
Any help will be highly appreciated.
Thanks
Well, I think you already have your answer: SVN. What you need is an SVN server (it can live on the production server or somewhere else), and if you have SSH access to your production server, all you need to do is log in to a console and update to the new version. For that you'll have to learn the basic SVN commands, so you won't override configuration files or upload/download files you don't need (like uploaded images, etc.).
Good luck!
Can't you do a 'checkout' of your project to the server? I think that would be the simplest thing to do.
When you change something in your system, you just commit and then update on the production server. Either that, or do an scp or rsync.
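For example, a minimal sketch of both options (host names and paths are placeholders):

# Option 1: with a working copy checked out on the production server (over SSH)
svn update /var/www/myproject

# Option 2: without SVN on the server, push only what changed from the local copy
rsync -avz --exclude 'config.php' --exclude 'uploads/' ./ user@production:/var/www/myproject/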
I need to run PHP scripts on the server without having to update the files it stores, i.e. perform testing on the real server before deployment. The server has access to a database which is inaccessible from outside. For this reason I can't run my scripts locally; I need to run them within the server's environment, but I don't want to update the files stored on the server. Is there any way to do so? Is there a tool for remote PHP debugging?
There are several ways to achieve this:
You can export the database from the live server and import it into your local server for testing and debugging purposes, for example with the commands sketched below.
You can upload the code into a separate folder or subdomain on the server so it connects to the database and can be tested against the live server configuration. Once you are satisfied, replace the live files.
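A minimal sketch of that export/import step (database names, users, and hosts are placeholders):

# On the live server: dump the production database to a file
mysqldump -u live_user -p live_database > live_database.sql

# On the local machine: load that dump into a local test database
mysql -u root -p local_database < live_database.sql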
I have found a solution: XDebug for PHP. However, as was mentioned in the comments, testing against a copy of the DB and using virtualization is the more common approach, which I personally will probably stick to.
For those who are still determined to go the "hard" way, here is a link to a HOWTO on XDebug installation for PHP on Ubuntu: http://ubuntuforums.org/showthread.php?t=525257.
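Once installed, remote debugging is usually switched on with a few php.ini settings along these lines (a sketch for the Xdebug 2 series; the extension path, IDE host and port are placeholders):

; load the extension (the exact .so path depends on your PHP build)
zend_extension=/usr/lib/php5/20090626/xdebug.so
; let Xdebug connect back to the IDE running on your machine
xdebug.remote_enable=1
xdebug.remote_host=192.168.1.10
xdebug.remote_port=9000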
I'm using Apache, PHP and MySQL. I maintain websites by connecting to the real server remotely, but I want to separate the real and development servers.
I will use GitHub and set up a local development environment. Then I will maintain the websites on my local system and push the source files to GitHub and to the real server.
I'm curious: the source files will be placed on GitHub, so how do I manage the database config files and the board's uploaded files?
Take advantage of the .gitignore file.
Take uploaded files: they only mean something to production if they were uploaded in production, and they only mean something to dev if they were uploaded in dev. If git is set to ignore this directory, then it won't let you add/commit/push these uploaded files.
For database and config files, we generally name them like config.php.dist (.dist for distribution) and then rename them on the production side. That way you always have a base config file with the necessary keys, and it does not override what is in production and being used. In the dev config you set your settings to reflect the dev environment and dev database; in production you set the production values.
Here is some documentation on using gitignore: https://help.github.com/articles/ignoring-files/
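A minimal sketch of that setup (the file and directory names are only examples):

# .gitignore - keep environment-specific files out of the repository
/uploads/
config.php

<?php
// config.php.dist - committed template; copy to config.php and fill in per environment
define('DB_HOST', 'localhost');
define('DB_NAME', 'your_database');
define('DB_USER', 'your_user');
define('DB_PASS', 'your_password');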
I am developing a PHP script locally using XAMPP. It works on my local server, but after being uploaded to the Linux live server, many things don't work. Most of the problems are caused by the way file paths are written.
For example:
require('inc/config.php') works in XAMPP but does not work on the live server, or vice versa. So I need to change it to require('config.php') to make it callable on the live server.
That seems to ruin my whole work and makes the time I invested feel wasted.
My question:
What common solution can I use to prevent that kind of problem? Is using a full path to call a file the best practice?
Is there any local development environment for Windows, like XAMPP, that can simulate the Linux server structure, so that nothing needs to be fixed after finishing development on the local server and uploading it to the Linux live server?
Please, somebody help.
Best Regards
You don't need to change anything; just give your script the proper path to your includes in index.php using set_include_path().
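A minimal sketch of that approach, plus the related __DIR__ trick (directory names are placeholders; __DIR__ requires PHP 5.3+):

<?php
// index.php - make includes resolve the same way on every server
set_include_path(__DIR__ . '/inc' . PATH_SEPARATOR . get_include_path());

// a relative require is now found through the include path:
require_once 'config.php';

// alternatively, build the path explicitly, relative to the current file:
// require_once __DIR__ . '/inc/config.php';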
When I've worked on Drupal sites before, if there was internal access to the server, or if remote desktop access was available, I've always developed on the machine the site would run from when live, and just not made it public on the server.
However, what is the best thing to do if you don't have access to the server yet, for example if the client hasn't got anything in place?
I need to be able to build and test the solution on my local machine, or on my VPS which I have RDP access to, and be able to move it over with as much ease as possible to the clients server when ready.
Any tips or best practices? As far as I'm aware, Drupal doesn't have any specific migration tools, though I could be wrong.
I don't work with Drupal, but for Prestashop, Wordpress, Zencart, etc. I always use the same workflow:
I set up a vhost on my virtual server, usually using a subdomain of my own domain (like customer.mydomain.com), install the software with its DB etc. on the server, and set up FTP access.
I keep a local copy of the files, which I maintain in a local git repository, pushing to GitHub mainly for backup purposes.
I work with ZendStudio, configure a remote server, and set it up to upload the files when I save them, so I can check them pretty much as if I were working locally. But the main advantage of this approach is that I can share the project with the customer as it progresses.
When I have to move to the final server, at least with WordPress, I have to search/replace the domain name, which WordPress saves in the DB. But I do it locally: I download the entire DB as an SQL file through phpMyAdmin, open it, search and replace (see the sketch below), and upload it again via phpMyAdmin to the permanent server.
With ZenCart and others, the problem is the config file, which stores some paths. For long projects or long-term customers I modify the config file to use one set of config details or another depending on the server name.
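A sketch of that search/replace on the downloaded dump (domains and file name are placeholders; note that a plain text replace can break the PHP-serialized values WordPress stores, so check those):

# swap the development domain for the live one before re-importing the dump
sed -i 's|http://customer.mydomain.com|http://www.customerdomain.com|g' database_dump.sql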
Adding to the above comment...
Check the "backup and migrate" module and the "backup files" module. "Backup and migrate" is useful in any setup.
With this I was able to do a bare-bones Drupal install and then migrate/replace the database with the one backed up from my local system. If the databases are named differently you will still need to edit settings.php.
"Backup files" is useful for themes and content assets like images etc., but is essentially just a wrapper around gzip.
I typically develop on my local machine and then upload to the server once complete.
All you need to do is change the folder name in /sites/ and change the settings.php file to reflect the server settings/domain.
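For illustration, a Drupal 7-style database block in settings.php looks roughly like this (all names and credentials are placeholders; Drupal 6 uses $db_url instead):

<?php
// sites/example.com/settings.php - per-environment database settings
$databases['default']['default'] = array(
  'driver'   => 'mysql',
  'database' => 'production_db',
  'username' => 'db_user',
  'password' => 'db_password',
  'host'     => 'localhost',
);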
Something you should be aware of:
If you are uploading files on your local installation, the file paths will be wrong on the server and you will need to execute a one-off MySQL replace query (see the sketch after this list).
Make sure you use relative paths in any hard-coded links.
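A sketch of such a one-off query, assuming a Drupal 6-style files table with a filepath column and the /sites/ folder rename mentioned above (table, column, and paths are assumptions; adjust for your Drupal version):

-- rewrite stored file paths after renaming the /sites/ folder
UPDATE files
SET filepath = REPLACE(filepath, 'sites/localhost/files', 'sites/example.com/files');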
Here is how our current PHP development solution is set up:
Each developer works on their local machine.
Each developer commits their changes to a common SVN server (intranet).
A commit hook uploads the changes to the staging server and performs validation tasks.
When the product is ready, it is manually deployed to the production server via SFTP.
Note: most, if not all, of the time I don't have SSH access to the server, only SFTP.
I could automate the deployment to the production server in the same way the staging server is updated, but this solution only works one way. How can I revert to a previous revision in case of problems?
How can I improve this solution?
Thanks and sorry for my English.
If you can set up the production server to access the SVN repo via a secure channel, such as HTTPS with WebDAV, maybe try the following:
Create a script on the production server that lets you enter a tag directory and/or revision number/date and perform an svn export (see the sketch after this list). This way, the prod server is pulling the changes from SVN.
Now, if you have a way to have this script called securely from, say, a commit script: voila, you have automation.
Most importantly, you do not want an automatic update performed on the prod server that you were not planning for.
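A minimal sketch of such an export script (the repository URL, web root, and authentication handling are placeholders and would need hardening):

#!/bin/sh
# deploy_tag.sh - export a given release tag over the web root
TAG="$1"                              # e.g. x.y
if [ -z "$TAG" ]; then
    echo "usage: $0 <release-tag>" >&2
    exit 1
fi
svn export --force \
    "https://mySvnRepo/yourWebSite/tags/releases/$TAG" \
    /var/www/yourWebsite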
To solve this:
The commit script should only call the prod update script when something is committed to "/path/to/tags/release/dir" (see the hook sketch below).
Make sure only appropriate change control staff (or whoever currently controls the manual prod deployment) have the ability to perform an svn copy to this directory in the repo.
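A sketch of that check in the repository's post-commit hook (the path pattern and the way the production script is reached are assumptions; the ssh call in particular is just a placeholder for whatever secure channel you actually have):

#!/bin/sh
# post-commit hook: Subversion passes the repository path and the new revision
REPOS="$1"
REV="$2"

# only react when this revision added a new directory under tags/releases/
TAG_PATH=$(svnlook changed -r "$REV" "$REPOS" | grep '^A.*tags/releases/' | head -n 1 | awk '{print $2}')
if [ -n "$TAG_PATH" ]; then
    # trigger the production-side export for the new tag
    ssh deploy@production "/usr/local/bin/deploy_tag.sh $(basename "$TAG_PATH")"
fi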
For example, say your repo is set up as:
/yourWebsite
--> /branches
--> /trunk
--> /tags
----> /releases
The commit that would trigger the auto deployment to prod would be something like:
svn copy https://mySvnRepo/yourWebSite/trunk \
https://mySvnRepo/yourWebSite/tags/releases/x.y \
-m "Tagging for production deployment"
Rolling back can be achieved by making a commit to a previous release's directory. Note, however, that this will not cause new files that were added to be rolled back, since svn export does not delete files that are no longer in the tag.
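With the export script sketched above, another option is simply to re-run it by hand for an earlier, known-good tag (the tag name is a placeholder):

# redeploy an earlier release tag over the current files
/usr/local/bin/deploy_tag.sh x.y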
Of course, your mileage may vary; this is only a suggestion for your investigation.
You should take time to consider the security implications and potential for disaster if set up incorrectly.
Hope this helps, even if only to get you thinking of other solutions.