Our website relies on images from one of our manufacturers. The image directories are massive, and getting them via FTP is an all-day job. Now that we've downloaded the entire directory, we'd like to be able to periodically download only the files and directories that are new or have changed since the last time we downloaded them. We're thinking about writing a script that checks the modification date of files and only downloads the latest versions.
Since this can't be the first time this problem has been encountered or solved, I thought I'd post this and see if anyone knows of existing solutions that can be applied here. An existing solution would need to be compatible with FreeBSD and/or LAMP.
Is there any reason you can't use rsync?
You could do this with wput.
As user77413 noted in another comment, this should work...
wget --mirror ftp://username:password@siteurl.com/path
The default number of retries is 20; you can increase this with --tries=100
I've been having a weird issue lately and I'm not sure what causes it. I've got a dedicated box with a few websites on it, and it seems like memcache only works on half of them.
Two WordPress websites (both have W3 Total Cache with the same settings); one of them is stuck while the other works fine. If I manually clear the cache from W3 it will work until the next post.
At first I thought it was the plugin's fault, but then I noticed similar issues on some of my other sites. For example, when I try to update a PHP file, the old version keeps being served unless the file size is different.
Another case was DIR, which would still point to the old folder name (the one used when it was uploaded) unless I edited all the files in that folder that used DIR.
Any ideas/suggestions?
PS: All these things started happening after I installed PHP cURL on my server.
I've been having problems with PHP session variables since starting work on a new website for the company I work for. I phoned the hosting provider (1and1) and was told it was because I needed to copy an INI file into each directory I create.
I recall facing this problem before and somehow finding a list of PHP versions that don't require copying the INI file into each sub-directory to use sessions. I just can't seem to find that information on Google (despite searching for a few hours). Does anyone know which PHP version I need to downgrade to in order to use session variables again without manually copying the INI file into each of the several thousand directories I have?
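For reference, when a shared host does require a per-directory php.ini, the session-related directives involved usually look something like this (the save path is an assumption; it must be a directory the PHP process can write to):

```ini
; Hypothetical per-directory php.ini for a shared host.
; session.save_path must point at a writable directory, or
; session_start() will fail silently or with a warning.
session.save_path = /home/user/tmp/sessions
session.use_cookies = 1
```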
I called them back and somebody else answered, who, after I described my problem, said he had changed something in the account's PHP setup. It now works fine. I guess the person I spoke to previously just didn't have the expertise to help.
Happy days :)
OK, this might seem like a bad idea, or an obvious one. But imagine a CMS like phpBB, and imagine you were building it. I'd create just one file, called PHPBB.install.php; running it would create all the needed folders and files with PHP. The user runs it just once, and every file and folder of the app is created by that PHP file.
Why do this?
Mostly because it's cleaner, and you can be fairly sure it creates everything as you intend (obviously after checking everything about the server first). Also, with all the files backed up inside one file, you could restore the app very easily by deleting everything and running PHPBB.install.php again. Backing up files like this would also let you prevent errors: how? When an error occurs in a file, that file is restored to its original state and automatically re-run.
It would be too heavy!
The installation would happen only once, and you'd be sure the user can't forget to place the files correctly. The error prevention would be worth the cost, and it too would happen only once.
Now the questions:
Does this technique exist? If so, what's its name?
Why would you discourage it?
As others have said, an installer.
It requires the web server to have permission to write to the filesystem, and ends up having the files owned by the user the web server runs as. Even when one has the ability to change filesystem permissions, it's usually a longer process than just extracting an archive and having the initial setup verify permissions.
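For contrast, the archive route described above needs no web-server write access at all; a sketch with throwaway names (everything here is invented for illustration):

```shell
# Build a throwaway release archive (names are hypothetical) ...
mkdir -p /tmp/pkg-demo/app
echo "<?php // app code" > /tmp/pkg-demo/app/index.php
tar -czf /tmp/pkg-demo/release.tar.gz -C /tmp/pkg-demo app

# ... and extract it as your own user, so the files are owned by you
# rather than by the account the web server runs as:
mkdir -p /tmp/pkg-demo/webroot
tar -xzf /tmp/pkg-demo/release.tar.gz -C /tmp/pkg-demo/webroot
```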
Does this technique exist? If so, what's its name?
I'd advise reading about __halt_compiler(). It allows you to mix PHP code with non-PHP data that is not parsed, so you can have PHP code (the "installer") and binary data (e.g., the compressed contents of all the files) in a single PHP file.
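A minimal sketch of that idea: a builder script that writes out a single self-extracting installer.php. The stub reads everything after its own __halt_compiler(); token via the __COMPILER_HALT_OFFSET__ constant; the file names and payload here are invented for the demo.

```php
<?php
// Build a tiny self-extracting installer. The stub below is the whole
// "installer" program; everything appended after its __halt_compiler();
// statement is raw payload the PHP parser never sees.
$stub = <<<'STUB'
<?php
$data = file_get_contents(__FILE__, false, null, __COMPILER_HALT_OFFSET__);
foreach (json_decode($data, true) as $path => $b64) {
    if (!is_dir(dirname($path))) {
        mkdir(dirname($path), 0755, true); // recreate the folder structure
    }
    file_put_contents($path, base64_decode($b64));
}
__halt_compiler();
STUB;

// Bundle one demo file and write the finished single-file installer:
$files = ['demo/hello.txt' => base64_encode("hello\n")];
file_put_contents('installer.php', $stub . json_encode($files));

// Running "php installer.php" once recreates demo/hello.txt.
```

In a real installer you would bundle compressed archives rather than base64-encoded JSON, but the mechanism is the same: one PHP file that carries both the extraction code and the data.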
1 - Yes, there is a single install file in phpBB. You run through an online wizard defining your settings, and then it installs automatically.
http://www.phpbb.com/support/documents.php?mode=install&version=3&sid=908f5766fc04868ccb985c1b1e6dee4b#quickinstall
2 - The only reason to discourage it would be if you want the user to understand exactly how the system works. Automatically installing it means the user has no need to understand the nitty-gritty of it all; of course, many see this as a good thing.
For various dull reasons, I'd like to assay a script that looks at the files in a directory, copies the filename of the latest one, and inserts it into a MySQL table. It should also check whether the insert has already been done.
I am a web tinkerer (I work in construction), so my question may seem a bit ingenuous, but what functions do I need to get the filenames of files in a particular directory? I can see how to check whether the insert has already been done, plus the DB insert bits. I just want to learn how to get hold of the latest filename.
Afterthought: is there a way to run the script automatically, or on completion of a successful FTP upload to the directory in question?
Tom
I would recommend the SPL DirectoryIterator instead of glob().
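With DirectoryIterator, finding the most recently modified file is a short loop; a sketch (the directory path in the usage comment is an assumption):

```php
<?php
// Return the name of the most recently modified file in a directory,
// using SPL's DirectoryIterator. Returns null if the directory holds
// no regular files.
function latestFile(string $dir): ?string
{
    $latestName = null;
    $latestTime = 0;
    foreach (new DirectoryIterator($dir) as $item) {
        if ($item->isFile() && $item->getMTime() > $latestTime) {
            $latestTime = $item->getMTime();
            $latestName = $item->getFilename();
        }
    }
    return $latestName;
}

// Usage (path is an assumption):
// echo latestFile('/home/user/uploads');
```

isFile() skips the "." and ".." entries and subdirectories for you, which is the main convenience over glob() here.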
To answer your second question:
Afterthought: is there a way to run the script automatically or on completion of a successful FTP upload to the directory in question?
This depends on the type of server you're running on. Since you're programming in PHP, I'll assume it's a Unix or Linux based machine, in which case you'll want to read up on crontab, which is the normal way of running scripts at a specific time.
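For example, a crontab entry that runs a PHP script every fifteen minutes might look like this (the script path is an assumption); you install it with crontab -e:

```
# m    h  dom mon dow  command
*/15   *  *   *   *    /usr/bin/php /home/user/scripts/import_latest.php
```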
It would be difficult on the server end to know when a client has finished uploading via FTP, as you don't really know how many files they might upload.
I am running the site at www.euroworker.no; it's a Linux server and the site has a backend editor. It's a Smarty/PHP site, and when I try to update a few of the .tpl files (two or three), they don't update. I have tried uploading through FTP and that doesn't work either.
It runs on the livecart system.
Any ideas?
Thanks!
Most likely, Smarty is fetching the template from the cache and not rebuilding it. If it's a one-time thing, just empty the compile directory or directories (templates_c). If it happens more often, you may have to adjust Smarty's caching behaviour in the configuration (among others, $smarty->caching and $smarty->cache_lifetime).
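If you'd rather do this from code than by deleting files over FTP, a small sketch that empties a compile directory; the directory name is Smarty's default and the path is an assumption, since LiveCart may keep templates_c somewhere else:

```php
<?php
// Delete all compiled templates in a Smarty compile directory so the
// templates are rebuilt from the .tpl sources on the next request.
// Returns the number of files removed.
function clearCompiledTemplates(string $dir): int
{
    $removed = 0;
    foreach (glob($dir . '/*.php') ?: [] as $compiled) {
        if (unlink($compiled)) {
            $removed++;
        }
    }
    return $removed;
}

// Usage (path is an assumption):
// clearCompiledTemplates('/var/www/cache/templates_c');
```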
Are you saying that when you attempt to upload a new version it isn't updating the file? Or it's updating the file but the browser output does not conform to the new standards?
If it's the latter problem, delete all the files in your templates_c directory. If it's the former problem, er, you might want to check out Server Fault or Super User.