A bit of a theory question leading to something possibly a bit more practical. I currently have a web application developed in PHP using the CodeIgniter framework.
The application is now in use by 5 separate customers, each with their own specific amendments that extend the base/core code I have created. I'm now at the point where, whenever I make any updates to the development version, I have to painstakingly hand-copy each of the separate files/directories to each customer.
I'd like a bit of input on how best to automate this process. The scenario: the applications run on a Windows server with PHP 5.4 connecting to a MySQL database. All domains/applications, including the dev version, are on the same server and share the same IP address.
Would I be correct in thinking a batch file would be the best way to do this? If so, can someone point me in the right direction for creating one? Another thought I've had is a download button on each site that starts the process of updating itself, but again I'm not entirely sure how to go about this.
Any input is greatly appreciated.
Thanks
It really depends on how extensive your modifications are for the individual clients. Is 90% shared code? 50%?
If 90% of the code is common code:
I would put my core codebase in one folder and then just use custom files for the modifications.
In the core code you need a way to tell the clients apart. Let's assume you have a 'clientid' that is unique to the client.
Example:
Let's say we are in the core code and about to include a file called user_admin.php, but this file is different for some of your clients (not all):
function custominclude($file, $clientid) {
    // Use the client-specific version of the file if one exists...
    $customfile = '/custom/' . $clientid . '/' . $file;
    if (file_exists($customfile)) {
        include($customfile);
    } else {
        // ...otherwise fall back to the core version.
        include($file);
    }
}

custominclude('user_admin.php', $clientid);
EDIT based on your comment below.
To sync two (or more) folders on a Windows server, use a utility like GoodSync (http://www.goodsync.com/). It lets you sync folders one-way or both ways, on the same machine or between different ones, and it should handle your task easily. You can set it up to run on a schedule, or trigger a sync manually whenever you want.
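If you would rather script the copy yourself (for example behind the update button you mentioned), here is a rough sketch of a one-way sync in plain PHP. The paths are assumptions to adapt, and it deliberately skips any custom/ tree, following the convention above, so a client's own modifications are never overwritten:

<?php
// sync_core.php - hypothetical one-way sync (dev core -> one client).
function sync_core($source, $target) {
    $iterator = new RecursiveIteratorIterator(
        new RecursiveDirectoryIterator($source, FilesystemIterator::SKIP_DOTS),
        RecursiveIteratorIterator::SELF_FIRST
    );
    foreach ($iterator as $item) {
        $subpath = $iterator->getSubPathName();
        // never touch client-specific modifications
        if ($subpath === 'custom' || strpos($subpath, 'custom' . DIRECTORY_SEPARATOR) === 0) {
            continue;
        }
        $dest = $target . DIRECTORY_SEPARATOR . $subpath;
        if ($item->isDir()) {
            if (!is_dir($dest)) {
                mkdir($dest, 0755, true);
            }
        } elseif (!file_exists($dest) || $item->getMTime() > filemtime($dest)) {
            copy($item->getPathname(), $dest); // copy only new or changed files
        }
    }
}

// example paths - adjust to your server's layout
sync_core('C:/sites/dev/app', 'C:/sites/client1/app');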
Related
My friend and I are in different countries and have been developing a LAMP web app for several weeks. All this time we have been sharing source code over FTP, and the PHP files have become messy this way. I have heard about CVS and have been reading about it, but I still cannot figure out how exactly it works.
How could CVS help me in this matter?
I would much appreciate it if someone could point me in the right direction.
OK, here comes a very simple explanation of VCS. After using it for a while you'll laugh at this explanation, but for now I guess it should help you.
What are the problems with your current FTP file sharing?
If two people upload the same file, one of the copies gets overwritten
After uploading, you'll only see who changed the file (the last time), but not where it was changed
You can't provide information about the changes (apart from putting comments in the files themselves)
You can't go back in time; once overwritten, old files are lost
With version control you can solve these problems:
Files either get merged into one new file or get overwritten, but the old file will still be stored so you can roll back if needed
You can see who made which changes, and when
You can add comments when you "upload" your files describing what got changed (without storing these comments inside the files)
You can always go back in time and restore old "uploads"/changes
You can also create small side projects by branching. This basically lets you split your project into smaller pieces and work on them separately.
So at the beginning of your work, you usually bring your local sources up to date by pulling in all the changes that have been made. Then you do your work, and afterwards you update the online version with your changes so that other developers can pull them and continue working on them or integrate them into their own current changes.
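With git, for example, that daily cycle looks roughly like this (the file name and commit message are just placeholders):

git pull                            # get everyone else's latest changes
... edit your files ...
git add user_admin.php              # stage what you changed
git commit -m "fix login redirect"  # record the change with a comment
git push                            # publish it for the other developers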
How to implement this sorcery?
You could google for "how to set up git" or "how to set up svn", but as a beginner I would recommend using an online service. Here is a list of services: https://git.wiki.kernel.org/index.php/GitHosting
My personal preference for closed-source projects with a small number of developers is https://bitbucket.org/. Some of these services also provide a small wiki page and a bug-tracking tool. If you want to use Bitbucket, here is some very easy-to-understand documentation: https://confluence.atlassian.com/display/BITBUCKET/Bitbucket+101
Important to know:
Soon you'll learn that you don't really "upload files" as I've written multiple times; rather, you change lines of code. And you don't "upload" your changes, you "commit" them.
While CVS could help, not many developers would recommend it for new projects. It has largely been replaced by Subversion (svn), and even that is falling out of favour. Many projects these days use distributed version control with git or Mercurial (hg).
A good introduction to git can be found in the free online book Pro Git.
In any case, these things are all version control systems. They help to synchronize the code between developers, and also let you track
who changed code,
when it was changed,
why it was changed, and
how it was changed.
This is very important on projects with multiple developers, but there is value in using such a system even when working on your own.
I was wondering how to start coding a script in PHP that will be used on many websites.
So should I start by creating the database first, and then create the PHP files that will process data from it?
And should I start thinking about an install wizard for this script right away, or create one later, when I finish the project?
I'm really confused about how to start a project. Can you please give me some advice?
and thanks everyone :D
should I start first by creating the database?
If you are going to use a database in your PHP script, then yes, you should install a database first. MySQL is a good start.
and then start creating php files that will process data from the database?
I would start on one server first, and create one PHP file called index.php that will do a database query. Then work your way to multiple PHP files from there.
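As a rough sketch, that first index.php could look something like this, assuming the mysqli extension is available; the credentials, database and table names are placeholders:

<?php
// index.php - minimal first version: connect, query, display.
$db = new mysqli('localhost', 'myuser', 'mypassword', 'mydatabase');
if ($db->connect_error) {
    die('Connection failed: ' . $db->connect_error);
}
$result = $db->query('SELECT id, name FROM users');
while ($row = $result->fetch_assoc()) {
    echo $row['id'] . ': ' . htmlspecialchars($row['name']) . "<br>\n";
}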
and should I start thinking of an install wizard for this script at first, or later when I finish the project I'll create one.
Installing PHP files is, 90% of the time, as simple as copying them onto your new server. I wouldn't worry about an install wizard just yet.
Another general tip since you are a beginner: install WampServer, a web server, PHP and MySQL server in one that runs on your local computer. This is great for developing because you can just put your PHP files in WAMP's www directory (e.g. C:\wamp\www), edit them, and see the result directly in your browser at http://localhost/. Then when you are happy you can upload to the server, or multiple servers (just by copying).
Most PHP software does not have, or need for that matter, what you would call an install wizard.
I would suggest you develop in whichever way feels most natural to you.
Some people find it easier to start with the database design, while others prefer to write some code first and then expand the db schema further. There really is no right way to do it.
Starting a PHP project can be as easy as creating a text file and pumping out lines of code; however, if you plan on creating a sizeable project, I would suggest a fully featured IDE.
Decide what dependencies your script has.
Decide which minimum version of PHP the script will be compatible with.
Work out a script which queries the user's setup to detect whether these conditions are met (e.g. does it rely on the mysql extension being installed?); a rough sketch of such a check follows this list.
Detail how to meet each of the dependencies in case they are missing.
State the minimum supported version if your script detects it is running below that version number.
Test it on your target Operating Systems.
Run a script which creates the database, and test whether it was created. Provide detailed instructions on how to do this manually and how to grant the correct privileges.
If necessary give them a config file which permits them to enter key information such as doc_root etc.
Conform to common wisdom such as short_open_tag = Off, or else override these settings explicitly. Imagine the user is on shared hosting and running with safe_mode = On.
Try to follow your own instructions and re-install it on your localhost, then on a live server - ideally on a variety of OSs too.
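Here is a rough sketch of the detection script mentioned in the list above; the minimum version and extension list are made-up examples to replace with your script's real dependencies:

<?php
// check_requirements.php - hypothetical pre-install check.
$minPhp   = '5.3.0';                  // assumed minimum PHP version
$required = array('mysqli', 'gd');    // assumed required extensions

if (version_compare(PHP_VERSION, $minPhp, '<')) {
    die("This script requires PHP $minPhp or newer; you are running " . PHP_VERSION . ".\n");
}
foreach ($required as $ext) {
    if (!extension_loaded($ext)) {
        die("Missing required PHP extension: $ext - see the install notes on enabling it.\n");
    }
}
echo "All requirements met.\n";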
I am developing (as a solo web developer) a rather large web-based system which needs to run at various different locations. Unfortunately, because some clients are on dialup, we have had to do it this way rather than host a central server for them all. Each client is part of our VPN, and those on dialup/ISDN get dialled on demand from our Cisco router. All clients are accessible within a matter of seconds.
I was wondering what the best way would be to release an update to all these clients at once. Automation would be great, as there are 23+ locations to deploy the system to, each of which is used very regularly. Because of this, when deploying I need to display an 'updating' page so that the clients don't try to access the system while the update is only partially complete.
Any thoughts on what would be the best solution?
EDIT: Found FileSyncTask which allows me to rsync with Phing. Going to use that.
There's also a case here for maintaining a "master" code repository (in SVN, CVS, or maybe git). This isn't your standard "keep revisions of your code in the repo and allow rollbacks" setup: this repo holds your current production code only. Once an update is ready, you check the working, updated code into the master repo. All of your servers then check the repo on a scheduled basis to see if it has changed, downloading the new code if a change is found. That check process could even turn on the maintenance.php file (that symcbean suggested) before starting the repo download, and remove the file once the download is complete.
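A minimal sketch of that scheduled check, assuming the production docroot is an svn working copy and that a maintenance.php.disabled template sits next to it (both the paths and that file name are hypothetical):

<?php
// update_check.php - run on a schedule (cron / Task Scheduler).
$root = '/var/www/html';
// turn the maintenance page on before touching the code
copy($root . '/maintenance.php.disabled', $root . '/maintenance.php');
// pull whatever is currently in the master repo
system('svn update ' . escapeshellarg($root));
// back to normal
unlink($root . '/maintenance.php');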
At the company I work for, we work with huge web-based systems which are both Java and PHP. For all systems we have our development environments and production environments.
This company has over 200 developers, so I guess you can imagine the size of the products we develop.
What we have done is use ANT and RPM build archives for creating deployment packages. This is done quite easily. I haven't done this myself, but it might be worth your while to look into.
Because we use Linux systems we can easily deploy RPM packages; the setup scripts within an RPM package make sure everything ends up in the correct place. You also get proper version handling and a release process.
Hope this helped you.
Br,
Paul
There are two parts to this; let's deal with the simple one first:
I need to display a 'updating' page
If you need to disable the entire site while maintaining transactional integrity, and publish a message to the users from the server being updated, then the only practical way to do this is via an auto-prepended file - this needs to be configured in advance (note: I believe this can be done using a .htaccess file, without having to restart the webserver for a new PHP config):
<?php
// Runs before every page: if the lock file exists, show the
// maintenance page instead of the requested one.
if (file_exists($_SERVER['DOCUMENT_ROOT'] . '/maintenance.php')) {
    include_once($_SERVER['DOCUMENT_ROOT'] . '/maintenance.php');
    exit;
}
Then just drop maintenance.php into your webroot and that file will be displayed instead of the expected one. Note that it should probably include a session_start() and an auto-refresh to ensure the session does not expire. You might want to extend the above to allow a grace period in which POSTs will still be processed, e.g. by adding a second PHP file.
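With Apache and mod_php, for instance, the prepend can be configured in .htaccess with a single line (the path to the file above is a placeholder):

php_value auto_prepend_file "/var/www/html/prepend.php"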
In terms of deploying to remote sites, I'd recommend using rsync over ssh for copying the content files, invoked via a controlling script (sketched after this list) which:
applies the lock file(s) as shown above
runs rsync to replicate the files
runs any database deployment script
removes the lock file(s)
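A rough sketch of such a controlling script, written in PHP for familiarity; the host list, remote paths and migration script are all assumptions to adapt:

<?php
// deploy.php - hypothetical controlling script for the steps above.
$hosts = array('site1.example.com', 'site2.example.com');

foreach ($hosts as $host) {
    // 1. apply the lock file so users see the maintenance page
    system("scp maintenance.php deploy@$host:/var/www/html/maintenance.php");
    // 2. replicate the files; --exclude keeps the lock file in place
    system("rsync -az --delete --exclude maintenance.php ./build/ deploy@$host:/var/www/html/");
    // 3. run any database deployment script
    system("ssh deploy@$host 'php /var/www/html/scripts/migrate.php'");
    // 4. remove the lock file again
    system("ssh deploy@$host 'rm /var/www/html/maintenance.php'");
}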
If each site has a different setup, then I'd recommend either managing the site-specific stuff via a hierarchy of include paths, or even maintaining a complete image of each site locally.
C.
I have a problem with maintenance of my PHP-based website, which is built on the Zend Framework. When I upload a new copy or version, then during the upload - especially while crucial files like models and controllers are being overwritten - the site, understandably, won't run.
Is there a way to upload a new version of the website without running into this issue?
My updates are quite regular - say, once or twice a week.
You can make use of the fact that renaming directories is quick and easy even through FTP. What I usually do is:
Have two directories, website_live and website_upload
website_live contains the live website (obviously)
Upload contents to website_upload
Rename website_live to website_old (or whatever)
Rename website_upload to website_live
Done! Downtime is less than two seconds if you rename quickly.
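If you want to script it, steps 4 and 5 boil down to two rename calls; a minimal PHP sketch, assuming it runs from the directory that contains both folders:

<?php
// switch_live.php - swap the freshly uploaded version in.
rename('website_live', 'website_old');     // step 4
rename('website_upload', 'website_live');  // step 5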
It gets a bit more complex if the old version contains uploaded content (e.g. from a CMS) that you need to carry over to the new one. That's cumbersome to do manually every time, but basically it comes down to simple rename operations too (renaming works effortlessly over FTP as well).
This is a task that can be automated quite nicely using a simple deployment script. If you're on Linux, setting up a shell script for this is easy. On Windows, a very nice tool I've worked with to do automated FTP synchronizing, renaming and error handling - even with non-technical people starting the process - is ScriptFTP. It comes with a good scripting language, and good documentation. It's not free, though.
If you're looking to get into hard-core automated PHP deployment, I've been doing some research in that field recently. Maybe the answers to my recent bounty question can give you inspiration.
I need to develop a project that would allow me to create many instances of a website, but each copy needs to be a separate website. I could upload the same code to many different accounts, but I would prefer to have only one copy of the code. Each website would be an "instance", so to speak. That way I could upload the code once and update all the websites at the same time.
For technical reasons I need to use PHP (but I'm interested in the other options too, for my own knowledge), and I thought Jelix could be a good choice of framework. Are there better options out there?
You can have all code in one directory, and then create virtual subdirectories in all your web sites, which all point to this directory. This is how Microsoft solves the problem in SharePoint.
The easiest bet is to have all the websites link to one server (perhaps distributed).
Pass the calling URL through your webserver to generate configuration information; use the requested URL to define the differences between each site.
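A minimal sketch of that idea in PHP; the host names and per-site settings are hypothetical:

<?php
// config.php - choose per-site settings from the requested host name.
$configs = array(
    'site-a.example.com' => array('db' => 'site_a', 'theme' => 'blue'),
    'site-b.example.com' => array('db' => 'site_b', 'theme' => 'red'),
);
$host = $_SERVER['HTTP_HOST'];
if (!isset($configs[$host])) {
    die('Unknown site: ' . htmlspecialchars($host));
}
$config = $configs[$host]; // the shared code reads everything from here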
Beyond that, the framework is almost immaterial to the question, so I'll leave it to someone else to answer.
Just remember: if you make 20 copies of the same code, it will take 20x as long to fix bugs.
If you're using UNIX or Linux for a web server, you could create one master copy of the PHP code and then use symbolic links to those files from separate directories, with a virtual website set up in Apache for each. You could also put site-specific config files under those directories, but the bulk of the PHP code would resolve via symbolic links to the "master" code.
I'm not sure what kind of websites you're talking about, but why not use an already-developed application like WordPress or any other CMS? The code is identical on every website, and you can easily update it. The website-specific data lives only in the single configuration file and the MySQL database.