PHP method to transfer a whole directory using FTP?

I've been trying to come up with an automatic update system for a CMS that I'm building, which will potentially be installed on numerous servers (likely with different configurations). The idea is to keep the current version uploaded to my server in a predetermined directory. The distributed systems check that remote directory every so often to see whether a new version has been uploaded. If the version number of the uploaded version is greater than the one a particular system has, it prompts the admin to update. The files are then copied via FTP into a tmp directory, copied from tmp to replace the older versions of each file, the tmp directory is deleted, and the system's version number is increased.
The problem is that I haven't found a way to transfer whole directories via PHP's FTP functions. I know I could zip the directory and transfer it that way, but I haven't found a reliable way to unzip files across the various server environments.
I know I can write my own method that just walks each directory and transfers each file one at a time, which is fine. I just wanted to know whether someone has already written something to this effect, or knows of an alternative solution, before I dive into it.
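Roughly the kind of thing I have in mind (a rough sketch using the stock FTP extension; the host, credentials and paths are placeholders, and the ftp_size() check for spotting directories is a common heuristic rather than a guarantee):

<?php
// Rough sketch: mirror a remote directory into a local tmp directory, one file at a time.
function ftp_mirror($conn, string $remoteDir, string $localDir): void
{
    if (!is_dir($localDir)) {
        mkdir($localDir, 0755, true);
    }

    $items = ftp_nlist($conn, $remoteDir);
    if ($items === false) {
        throw new RuntimeException("Could not list $remoteDir");
    }

    foreach ($items as $item) {
        $name = basename($item);
        if ($name === '.' || $name === '..') {
            continue;
        }
        $remotePath = rtrim($remoteDir, '/') . '/' . $name;
        $localPath  = rtrim($localDir, '/') . '/' . $name;

        if (ftp_size($conn, $remotePath) === -1) {          // most servers report -1 for directories
            ftp_mirror($conn, $remotePath, $localPath);     // recurse into the subdirectory
        } else {
            ftp_get($conn, $localPath, $remotePath, FTP_BINARY);
        }
    }
}

$conn = ftp_connect('updates.example.com');                 // placeholder host
ftp_login($conn, 'username', 'password');                   // placeholder credentials
ftp_pasv($conn, true);
ftp_mirror($conn, '/releases/current', '/path/to/cms/tmp');
ftp_close($conn);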
Thanks for your help!

Instead of sending a directory, why not zip the folder to save bandwidth and time, then unzip it on the destination servers? (Deleted since this was not an option for the OP.)
You could use PHAR archives for your app deployment (if you want to be bleeding-edge modern)
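A minimal sketch of packaging a release that way (it assumes phar.readonly = 0 in php.ini; the file and path names are just examples):

<?php
// Minimal sketch: pack a release directory into a single .phar archive.
$phar = new Phar('cms-release.phar');
$phar->buildFromDirectory('/path/to/release');            // adds every file under the directory
$phar->setStub(Phar::createDefaultStub('index.php'));     // run index.php when the archive is executed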
Or you could write an update script for Phing that gets the files from your server. You could then even do checkouts from SVN instead of placing the build in a directory.

Try using exec() to run lftp from the command line:
/usr/bin/lftp -e 'o ftp://ftp.example.com/path/to/remote/directory && mirror --verbose && quit'
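Wrapped in PHP it might look roughly like this (host, credentials and directories are placeholders):

<?php
// Rough sketch: run lftp from PHP and check the exit code.
$cmd = "/usr/bin/lftp -e 'open ftp://user:pass@ftp.example.com; "
     . "mirror --verbose /path/to/remote/directory /path/to/local/tmp; quit'";
exec($cmd . ' 2>&1', $output, $exitCode);

if ($exitCode !== 0) {
    error_log("lftp mirror failed:\n" . implode("\n", $output));
}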

Related

Unable to run a TYPO3 website on localhost

I need to upload a website to a server, but before uploading it I want to run it on my localhost (I'm using XAMPP).
I was given an archive containing the website files, and based on the structure of the files I realized TYPO3 is used.
Since I had never used TYPO3 before, I tried installing it by following these steps:
here
After the installation I ran the sample, and so far I have no problem running it in my web browser.
But when I try to run the website that I need to upload,
I get an error.
Here are the files inside the archive.
Here is what I get after extracting it. As you can see, I wasn't able to extract typo3_src, which is the target of index.php, t3lib and typo3, all of which are symlinks.
Also, typo3 is not a folder, unlike in the sample I ran.
I'm not sure whether the archive is broken/corrupted or whether I need to do something first before it will work. Can someone please help me with this issue?
You do not need to deal with those unix symlinks at all.
Just figure out, or ask the person who provided you with the system, which version of TYPO3 you need to use.
Then go to https://sourceforge.net/projects/typo3/files/TYPO3%20Source%20and%20Dummy/ and download the appropriate zip package.
Decompress the package, delete all symlinks (typo3, t3lib, index.php) and just copy the corresponding resources from the package.
You then need to deal with the database and database connection settings in typo3conf/localconf.php or typo3conf/LocalConfiguration.php.
If it all works, you can then upload it to the server as it is, without using any symlinks.
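For reference, the database settings mentioned above look roughly like this in typo3conf/LocalConfiguration.php on the 6.x/7.x branches (all values here are placeholders; older versions use $typo_db_* variables in localconf.php instead):

<?php
// Fragment of typo3conf/LocalConfiguration.php (roughly the 6.x/7.x layout); values are placeholders.
return [
    'DB' => [
        'database' => 'typo3_site',
        'host'     => 'localhost',
        'username' => 'db_user',
        'password' => 'db_password',
    ],
    // ...the rest of the generated configuration stays untouched
];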

Joomla 3.2 installation file/directory ownership issues

I am trying to install Joomla 3.2 on the host provided by my university. I have one MySQL database (with a username and password) and an FTP username and password to transfer data, and that is all: no cPanel installed and no SSH available.
If I extract the file Joomla_3.2.0-Stable-Full_Package.zip and FTP the contents to the website, the owner of these files/folders is the user "ftp_username". I can only continue to install Joomla if I set up the FTP layer.
This works, except that from time to time I get messages like "JFTP::rename: Bad response / Rename failed". More specifically, I get this message when using Kunena.
Moreover, I have read online that the FTP layer shouldn't normally be used. Also, tmp/cache/logs create permission problems.
If I remove the FTP layer (by editing configuration.php) then I cannot modify anything, since Joomla cannot modify the files (owned by ftp_username). Of course, I could change all permissions to 777, but that would be suicide...
I found a post explaining a bit of the situation I'm in: here! The advice on using "chmod 4770" in particular seems feasible, but I don't know how secure it is (I haven't tried it).
Does anyone have an idea how I can make this work?
cross-posted here
Thanks in advance
I found a way to bypass this problem. Not worth it if you have an alternative provider!
1. I installed Joomla with the FTP layer.
2. I installed the eXtplorer plugin.
3. I used the plugin to upload Joomla_3.2.0-Stable-Full_Package.zip to the server. The file is now owned by the Apache user.
4. I deleted all the (Joomla installation) files on the server except the zip file.
5. I uploaded a PHP script that unzips the file using PHP's ZipArchive (http://php.net/manual/en/ziparchive.open.php); a sketch of such a script follows this list. The installation folders/files are now owned by the Apache user.
6. I reinstalled Joomla without the FTP layer.
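The unzip script from step 5 can be as small as this rough sketch (the archive name and target path are whatever applies in your case):

<?php
// One-off unzip script (rough sketch). Because the web server runs it,
// the extracted files end up owned by the Apache user.
$zip = new ZipArchive();
if ($zip->open(__DIR__ . '/Joomla_3.2.0-Stable-Full_Package.zip') === true) {
    $zip->extractTo(__DIR__);   // unpack next to the script, i.e. into the web root
    $zip->close();
    echo 'Extracted successfully';
} else {
    echo 'Could not open the archive';
}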
I guess I could have used some PHP uploader script instead of steps 1-4, but I already had Joomla installed with the FTP layer.
That's it. Up and working. If you have an alternative provider, don't bother.
It seems to me that the hosting option provided by the university may be too restrictive to work for pretty much any CMS. You need to be able to chmod and potentially chown all your files in bulk so as to avoid insanity.
There may be an FTP program out there that will chmod your files as you upload them. If your group is www-data (or whatever group Apache needs), then you can chmod your folders to 775 and your files to 664 and you should be good to go.
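You could also do the chmod yourself with a small PHP script over the same FTP account that owns the files, provided the server supports SITE CHMOD. A rough sketch (host, credentials, paths and the directory-detection heuristic are all assumptions):

<?php
// Rough sketch: recursively set 775 on directories and 664 on files over FTP.
function ftp_chmod_tree($conn, string $dir): void
{
    ftp_chmod($conn, 0775, $dir);
    foreach (ftp_nlist($conn, $dir) ?: [] as $item) {
        $name = basename($item);
        if ($name === '.' || $name === '..') {
            continue;
        }
        $path = rtrim($dir, '/') . '/' . $name;
        if (ftp_size($conn, $path) === -1) {     // most servers report -1 for directories
            ftp_chmod_tree($conn, $path);
        } else {
            ftp_chmod($conn, 0664, $path);
        }
    }
}

$conn = ftp_connect('ftp.example.edu');          // placeholder host
ftp_login($conn, 'ftp_username', 'ftp_password');
ftp_pasv($conn, true);
ftp_chmod_tree($conn, '/public_html');
ftp_close($conn);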

Using version control but still accessing the files the normal way

I am trying hard to use version control, but I am so used to the old method of editing files directly via FTP that I am confused about what to do. So I am thinking of one solution; please help me work out whether it is possible or not.
I have this user folder on a Linux VPS (a single VPS only):
/home/user/public_html2
That will be linked to http://demo.server.com.
That directory will be version controlled, with the repo in the folder /usr/bin/repo/, so that I can commit the changes.
I will also have another directory,
/home/user/public_html, whose contents are updated whenever a commit is made in /home/user/public_html2.
That /home/user/public_html will be linked to http://main.server.com.
That way I keep working and editing files in public_html2 via FTP as normal and testing the code as it is, but the main site is version controlled.
Is this possible?
When you are using a version control system, why do you need to go to an FTP location to edit the files? It defeats the whole purpose.
Rather, check out the files from the repository onto your local machine and, once the changes are done, commit the code.
If you don't want the version-control metadata while working with the files locally, simply export the code instead of checking it out.
The problem I see with your approach is that the files/directories may go out of sync. Also, if there are multiple users working, this approach would fall apart.
I would suggest keeping the working copy of the files on your local machine and using that, instead of mirroring them to another folder.
Maybe it's possible. But once you start to use a version control system, you should keep accessing your version-controlled files through that system.
Or better: if you want to access the files the "normal way", as you said, that should be fine as long as you don't commit any changes. But if you want to modify them, you should always go through your version control system; otherwise you could have problems with file versions.
You need a deployment script. Create a script to upload the files to the appropriate locations. Run this script whenever you wish to see your changes reflected on the site. If you wish to maintain two separate locations on a server running different versions, simply add a command line argument to the script to choose which location to upload to.
Use rsync instead of FTP if you can, to avoid transferring unchanged files every time.
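A rough sketch of such a deployment script, assuming rsync is available on the VPS (the source directory, targets and rsync options are placeholders based on the directories mentioned in the question):

<?php
// deploy.php - rough sketch. Usage: php deploy.php demo   or   php deploy.php main
$targets = [
    'demo' => '/home/user/public_html2/',
    'main' => '/home/user/public_html/',
];

$name = $argv[1] ?? 'demo';
if (!isset($targets[$name])) {
    fwrite(STDERR, "Unknown target: $name\n");
    exit(1);
}

// -a preserves permissions/times, -v is verbose, --delete removes files gone from the source.
$cmd = sprintf(
    'rsync -av --delete --exclude=.svn %s %s',
    escapeshellarg('/home/user/checkout/'),       // placeholder working-copy/export location
    escapeshellarg($targets[$name])
);
passthru($cmd, $exitCode);
exit($exitCode);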

How to avoid crashing your web app while replacing files?

Let's say you have a big web app with a lot of traffic, and you don't want the app to crash or people to see the PHP or MySQL errors that happen while files are being replaced over FTP. How do you avoid that? How do you keep executing the old version of a file until the replacement is done?
Thanks
You can follow at least one of these two approaches:
Use an opcode cache (like APC) with mtime checking turned off (apc.stat = 0), so until you clear the cache manually the old versions are served from memory.
Use a virtual host whose webroot is a symlink to your project directory. Say you store your project at /home/project/www, and /home/project/public_html is your real webroot, symlinked to www. Create /home/project/www2, check out the files there, set everything up and do whatever you want; after that, just switch the symlink (a sketch of the switch follows below).
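A sketch of the symlink switch itself, using the paths from the second option; rename() over an existing symlink on the same filesystem is atomic, so visitors never see a half-switched site:

<?php
// Rough sketch: point the webroot symlink at the freshly prepared release.
$newRelease = '/home/project/www2';
$webroot    = '/home/project/public_html';

$tmpLink = $webroot . '.tmp';
@unlink($tmpLink);                   // clear any leftover from a previous run
symlink($newRelease, $tmpLink);      // build the new link under a temporary name
rename($tmpLink, $webroot);          // atomically replace the old link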
I use git to upload my changes to a staging website on the same server; after testing, I then push them to the production website. None of the files are changed until they have all been received. On the plus side, it only sends the changes, compressed, so I don't even have to send an entire file.
The staging area isn't strictly required, but I work with a lot of servers and sometimes the specific configuration on a server trips me up (mostly I just find that an extension isn't installed).
I'm sure you can do the same with another version control system. You need to be careful, though. The tutorial I linked specifically stores the git information OUTSIDE the document root. Otherwise someone can just clone all the source code for your website.
If you prefer SVN, the .svn directory in every folder can be a little annoying. Make sure that people can't download what they shouldn't be able to.
Deploy your app into a temporary directory. Then, after you are done, just rename the original app directory to app.old and rename the directory where you deployed your files to app.
Note that this should work fine in Unix environments. It will only work if all of the above directories are on the same filesystem. In rare cases users might see a 404 error if they happen to hit the app after you have renamed the original directory to .old and before you have renamed the temp directory to the original app directory.
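In PHP the swap itself boils down to two rename() calls (the directory names are illustrative):

<?php
// Rough sketch of the swap described above; both renames must stay on the same filesystem.
rename('/var/www/app', '/var/www/app.old');     // brief window where /var/www/app does not exist
rename('/var/www/app_new', '/var/www/app');     // the freshly deployed copy goes live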

php, user-uploaded files, version control, and website deployment

I have a website whose code I update regularly. I keep it in version control. When I want to deploy a new version of the site, I do an export and then symlink the served directory name to the directory of the deployment.
There is a place where users can upload files, and I noticed once, after I had deployed a new version, that the user files were gone! Of course, I hadn't added them to the repository, and since the served site came from an export, they hadn't been uploaded into a version-controlled directory anyway.
PHP doesn't yet have integrated SVN functionality, so I couldn't do much programmatically with user-uploaded files. My solution was to create an additional website, files.website.com, which sits in a directory parallel to the served website and is served out of a directory that is under version control. That way the uploads don't get obliterated when I upgrade the website. From time to time, I manually add uploaded files to the SVN project, delete user-deleted ones, and commit the new version. I'm working on a shell script to run from cron to do this, but it isn't my forte, so it's on the back burner as it's not a pressing need.
Is there a better way to do this?
I usually don't keep user-generated data/files in SVN, only the code and the DB schema/test data. What I usually do to deploy is an rsync from an up-to-date working copy, excluding the upload dir and the .svn dirs. IMO content should be handled by more traditional filesystem/DB backup mechanisms, not version control.
EDIT:
Just to be clear, your symlinking strategy seems like good practice; you're just missing the backup part, I think. I'd probably just tar and gzip the uploaded stuff in the cron job instead of interacting with SVN, and then probably have a separate job that uses mysqldump to dump the DB and gzips that as well.
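A rough sketch of what that cron job could run, written in PHP to match the rest of the stack (paths, database name and credentials are placeholders):

<?php
// backup.php - rough sketch of a nightly backup job; run it from cron.
$stamp   = date('Ymd-His');
$uploads = '/var/www/site/uploads';
$backups = '/var/backups';

// Archive the user-uploaded files.
exec(sprintf('tar -czf %s %s',
    escapeshellarg("$backups/uploads-$stamp.tar.gz"),
    escapeshellarg($uploads)
));

// Dump and compress the database.
exec(sprintf('mysqldump --user=%s --password=%s %s | gzip > %s',
    escapeshellarg('backup_user'),
    escapeshellarg('secret'),
    escapeshellarg('site_db'),
    escapeshellarg("$backups/db-$stamp.sql.gz")
));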
I would continue the practice of exporting the site as upgrades are needed, but have a symbolic link pointing to a directory outside of the version-controlled tree that holds the user-uploaded content. That way, when you do an export, you only need to recreate the symlink if it gets blown away. You should then, of course, be backing up that user content as needed.
Rather than manually doing the export and managing symlinks, you could make your deployment directory a subversion checkout (from a production branch). That way, deploying is as simple as checking in your updates to the production branch.
This works as long as you have sufficient control of your subversion server and hosting setup, and your subversion repository is "ready to run." In this situation, your user directory could be an empty placeholder in subversion and would be left alone by the update process that runs on commit (business as usual for svn update). I'd still recommend (as mentioned by #Flash84x and #prodigitalson) a separate process to back up the user content.
There's an Ars Technica article with a description of how to set this up.
Update: If you follow this approach, make sure that your web server does not allow access to the .svn files in the deployment checkout.
