I have a site with thousands of posts. Each post has 0-4 pictures, stored in a folder hierarchy. Now I need to move all the pictures into another folder system with a different hierarchy logic.
I could easily write a PHP script for this, but I doubt a single request could successfully run with that much work to do. So how can I keep a PHP script like this running long enough, or what are the other possibilities?
You can write it in PHP, because PHP on the command line has no execution time limit. The script will run endlessly if it needs that long.
Contact your hoster for shell access.
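For reference, a minimal sketch of such a CLI migration script. The old and new layouts here are hypothetical (old_uploads/<post_id>/pic.jpg moved into folders keyed by the first letter of the filename); adapt destPath() to your own hierarchy.

```php
<?php
// migrate.php — run from the shell with: php migrate.php

// Hypothetical mapping from old path to new path.
function destPath(string $file): string
{
    $name = basename($file);
    return 'new_uploads/' . strtolower($name[0]) . '/' . $name;
}

$files = glob('old_uploads/*/*.{jpg,jpeg,png,gif}', GLOB_BRACE);
foreach ($files as $file) {
    $dest = destPath($file);
    if (!is_dir(dirname($dest))) {
        mkdir(dirname($dest), 0755, true);
    }
    // rename() is a metadata operation on the same filesystem,
    // so even thousands of files move quickly without copying data.
    rename($file, $dest);
}
```

Because rename() does not copy file contents within one filesystem, tens of thousands of images can be moved in seconds.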
I have launched a classified ads website, but the site is getting too slow when loading. I have a single upload folder and a thumb_cache folder, each containing 4,680 images. I am on regular shared hosting. Could this be why the site is slowing down? Can over 4,000 image files in one upload folder slow down a site? I developed the site in PHP; would PHP have a hard time finding images in a folder with over 4,000 image files?
How should I organize the upload directory for better performance? Should I automatically create a folder within the upload folder for each ad with PHP?
I get Warning:
imagepng() [function.imagepng]: Unable to open
'/home/content/72/9959172/html/thumb_cache/
185x200__width__uploaded_files^classified_img^tractor61354.PNG' for writing:
Stale NFS file handle in
/home/content/72/9959172/html/al/includes/funcs_lib.inc.php on line 1168
The answer depends highly on your server configuration. Basically, yes, it is possible for this to be a problem under certain server installations.
In any case, it can and will become a problem once you start thinking about backups and maintenance.
Many servers and applications shard files into subdirectories using two-letter prefixes of the file's hash. For example, a file with the hash 75f5164e4bd2de372b99a3e2e2718aed would be placed under the 75/f5/ folder. Keeping each directory small this way avoids such problems.
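A sketch of that scheme in PHP (the function name is illustrative, not a standard API):

```php
<?php
// Map a content hash to a two-level directory, e.g.
// 75f5164e4bd2de372b99a3e2e2718aed -> uploads/75/f5
function shardedDir(string $baseDir, string $hash): string
{
    return $baseDir . '/' . substr($hash, 0, 2) . '/' . substr($hash, 2, 2);
}

// Usage: hash the file's contents, then store it under the sharded path.
// $hash = md5_file($uploadedFile);
// $dir  = shardedDir('uploads', $hash);
// mkdir($dir, 0755, true);
```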
No, this should not be an issue; I have stored around 10,000 images in one folder. It only becomes a problem if you load all the images in that folder on a single page, and even that can be solved with a jQuery lazy-load plugin.
PHP has nothing to do with how files are stored by your OS, unless you are retrieving them in a loop and processing them.
I've just published a client website. Its primary purpose is distributing content from other sources, so it regularly pulls in text, videos, images, and audio from external feeds.
It also has an option for client to manually add content to be distributed.
Using PHP, all of this makes heavy use of copy() to fetch files from another server and move_uploaded_file() to store manually uploaded files, and it also uses the SimpleImage image manipulation class to make multiple copies, crop, and so on.
Now to the problem: amongst all of this, some temp files are not being deleted. This locks up the server fairly quickly, because when /tmp is full it causes things like MySQL errors and stops pages loading.
I've spent a lot of time googling, which always leads to one claim: "temp files are deleted when the script is finished executing". That is clearly not the case here.
Is there anything I can do to make sure any temporary files created by the scripts are deleted?
I've spoken to my server admin, who suggested a cron job that empties /tmp every 24 hours. I don't know whether that is a good workaround, but it's certainly not the solution, as I believe the files should be getting deleted anyway. What could be preventing the files from being deleted?
Regardless of anything else you come up with, the cron idea is still a good one, as you want to make sure /tmp gets cleaned up. Have the cron job delete anything older than 24 hours, rather than deleting everything every 24 hours, assuming that frees enough space.
As for temp files being deleted when the script is done: as far as I know, this only happens when tmpfile() was used to create the temp file in the first place. Files created in /tmp by other means (and there are many other means) will not just go away because the script finished.
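The difference can be sketched like this (paths and names are illustrative):

```php
<?php
// 1) tmpfile(): PHP removes the file automatically when the handle
//    is closed, or when the script ends.
$fh   = tmpfile();
fwrite($fh, 'scratch data');
$path = stream_get_meta_data($fh)['uri'];
fclose($fh);                      // the file at $path is gone now

// 2) tempnam() — and any copies you create yourself — must be
//    unlinked explicitly; register_shutdown_function() also covers
//    scripts that exit early.
$tmp = tempnam(sys_get_temp_dir(), 'img_');
register_shutdown_function(function () use ($tmp) {
    if (file_exists($tmp)) {
        unlink($tmp);
    }
});
```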
I am currently wondering how I can back up a folder containing 8,000+ images without the script timing out. The folder holds around 1.5 GB of data, which we need to back up ourselves every so often.
I have tried the zip functionality provided by PHP, but it times out the request due to the huge number of files to back up; it does work with smaller amounts.
I am running this script through an HTTP request; would running it as a cron job avoid the timeout?
Does anyone have any recommendations?
I would not use PHP for that.
If you are on Linux, I would set up a cron job to run a program like rsync periodically.
A nice introduction about rsync.
Edit: If you do want or need to go the PHP way, also consider plain copying instead of zipping. Zip normally doesn't compress images much, and since you already have a database, you can check the current directory against the database and do a differential backup (copy only the new files). That way, only your initial backup would take a long time.
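A sketch of that differential copy, assuming a flat source directory (names and paths are examples, not from the original setup):

```php
<?php
// Copy only the files that are not in the backup directory yet,
// and report how many were copied this run.
function differentialBackup(string $src, string $dst): int
{
    $copied = 0;
    foreach (glob($src . '/*') as $file) {
        if (!is_file($file)) {
            continue;
        }
        $target = $dst . '/' . basename($file);
        if (!file_exists($target)) {
            if (!is_dir($dst)) {
                mkdir($dst, 0755, true);
            }
            copy($file, $target);
            $copied++;
        }
    }
    return $copied;
}
```

On the second and later runs only new files are touched, so each incremental backup stays fast regardless of the folder's total size.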
You can post the code so we can optimize it. Other than that, you should change php.ini (the configuration file) to remove or increase the timeout (the longest time a script may run on your server).
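For reference, the timeout can be raised either in php.ini or at runtime; the values below are examples, so size them to your workload:

```php
<?php
// In php.ini:
//   max_execution_time = 300   ; seconds; 0 = no limit
//
// Or at the top of the backup script itself:
set_time_limit(0);                // remove the execution time limit
ini_set('memory_limit', '512M');  // large archives may also need more memory
```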
I'm practicing some file upload with PHP and I was uploading a file numerous times, so I wanted to make sure I wasn't taking up a lot of space on the server. I had a while loop go over every file in the php tmp directory, and there were 103,988 entries.
Is this more than normal? I had assumed the tmp directory was for files that are automatically deleted after a certain amount of time. Am I supposed to be managing this folder somehow?
Part of the reason I ask is because I'm writing an app that takes a users file, changes some things, and serves it back to them. I want the file to be deleted once they leave, but I'm not sure what the best way to do it is. Should I have a folder I put all the files in and use cron to delete files older than a certain time?
General rule is that you should clean up after yourself whenever possible.
If you aren't sure that you can remove temporary files every time, it is a good idea to have a cron job doing this for you once in a while.
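A sketch of such a periodic sweeper in PHP (the directory and the one-hour cutoff are examples):

```php
<?php
// purge_old.php — run periodically, e.g. hourly from cron:
// delete files in a work directory older than $maxAge seconds,
// and report how many were removed.
function purgeOldFiles(string $dir, int $maxAge): int
{
    $deleted = 0;
    foreach (glob($dir . '/*') as $file) {
        if (is_file($file) && time() - filemtime($file) > $maxAge) {
            unlink($file);
            $deleted++;
        }
    }
    return $deleted;
}

// Example invocation:
// purgeOldFiles('/var/www/app/user_files', 3600);
```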
I need to accept a large number of images from a 3rd party, and I already have an apache server up and running. As the 3rd party is not tech-savvy, I would like to give them a simple web form to upload files.
They don't need to be able to access the files they've uploaded, although I suppose it would be nice for them to verify what they've already sent, especially given the large number of files.
There is also no requirement to upload all files at once, and I think I can talk them through packaging the files into 4-5 zip archives, so single-file upload would be acceptable.
If I need to write a PHP script myself then so be it, but I was wondering if such a standalone script already exists in the wild, nice and polished etc :)
Thanks!
Nice ajax file manager:
http://www.ajaxplorer.info/wordpress/demo/
Others:
http://devsnippets.com/article/7-free-powerful-file-managers.html
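If you end up writing it yourself, a minimal upload page can be as small as this sketch. It has no authentication, so add at least a password check before sharing the URL; the safeName() helper is an illustrative name, not a built-in.

```php
<?php
// upload.php — minimal single-file upload page (a sketch only).

function safeName(string $clientName): string
{
    // Strip NULs and directory components: "../../etc/passwd" -> "passwd".
    return basename(str_replace("\0", '', $clientName));
}

$uploadDir = __DIR__ . '/uploads/';

if (($_SERVER['REQUEST_METHOD'] ?? '') === 'POST'
    && isset($_FILES['file'])
    && $_FILES['file']['error'] === UPLOAD_ERR_OK
) {
    move_uploaded_file($_FILES['file']['tmp_name'],
                       $uploadDir . safeName($_FILES['file']['name']));
    echo 'Uploaded: ' . htmlspecialchars(safeName($_FILES['file']['name']));
}

echo '<form method="post" enctype="multipart/form-data">'
   . '<input type="file" name="file">'
   . '<button type="submit">Upload</button>'
   . '</form>';
```

For multi-gigabyte zips, remember that upload_max_filesize and post_max_size in php.ini also have to be raised.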