I need to write a script that reads a big .csv and generates a file for each row.
Since I can't change the timeout with set_time_limit on this server, my current script, which is run manually in the browser over HTTP, fetches 10 rows, executes my code for each one, and when it reaches the end auto-reloads to process the next 10 rows, and so on until it reaches the end of the file.
Now I need to convert this script into a web service, but I don't know how to keep it from timing out.
set_time_limit(900); // does not work
From reading here, it seems there is a restriction:
This function has no effect when PHP is running in safe mode. There is no workaround other than turning off safe mode or changing the time limit in the php.ini.
Do you have safe mode enabled? If so, it seems you may have to disable it. You can also see other values that do the same/similar actions here, though they also require safe mode to be disabled.
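A quick way to check from within the script itself (on the old PHP versions where safe mode still exists):

// A truthy value ("1" / "On") means safe mode is on and set_time_limit() is silently ignored.
var_dump(ini_get('safe_mode'));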
I made a script to zip 400 files from my website, but when I run it, it takes a long time (and that is OK); the problem is that if it takes too long, the PHP script stops working.
How am I supposed to zip 4000 files without my website crashing? Maybe I need to create a progress bar?
Hmm.. help? :)
Long work (like zipping 4000 files, sending emails, ...) should not be done in PHP scripts that will keep the user's browser waiting.
Your user may cancel loading the page, and even if they don't, it's not great to have an Apache thread locked up for a long time.
Setting up a pool of workers outside of Apache to do this kind of work asynchronously is usually the way to go. Have a look at tools like RabbitMQ and Celery.
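As a rough illustration, a producer that hands the zip job to a queue might look like this, assuming php-amqplib is installed via Composer and a RabbitMQ broker runs locally with default credentials (the queue name and payload shape are made up):

require __DIR__ . '/vendor/autoload.php';

use PhpAmqpLib\Connection\AMQPStreamConnection;
use PhpAmqpLib\Message\AMQPMessage;

$connection = new AMQPStreamConnection('localhost', 5672, 'guest', 'guest');
$channel = $connection->channel();
$channel->queue_declare('zip_jobs', false, true, false, false); // durable queue (hypothetical name)

// Hand the heavy work to a background worker instead of doing it in the web request.
$payload = json_encode(['files' => $fileList]); // $fileList comes from your application
$channel->basic_publish(new AMQPMessage($payload, ['delivery_mode' => 2]), '', 'zip_jobs');

$channel->close();
$connection->close();

A separate worker process (run from the CLI, not Apache) would consume 'zip_jobs' and do the actual zipping.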
In a PHP installation you have the directive max_execution_time, which is defined in your php.ini file. This directive sets the maximum execution time of any of your PHP scripts, so you might want to increase it or make it unlimited (no time limit). You have two ways of doing that: you can modify your php.ini, but that's not always available; or you can use the set_time_limit or ini_set functions at runtime. Note that depending on your hosting service, you may not be able to do either.
I think you should look at PHP's set_time_limit or the max_execution_time setting:
ini_set('max_execution_time', 300);
// or if safe_mode is off in your php.ini
set_time_limit(0);
Try to find reasonable settings for zipping your whole bundle, and set the values accordingly.
The PHP interpreter is limited in its own execution time, by default to a certain value. That's why it stops suddenly. If you change that setting at the beginning of your PHP script, it will work better; try it!
You could also invoke the php executable in CLI mode to handle that process, with functions like shell_exec.
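For instance, a sketch of kicking off a CLI worker from the web request and detaching from it (the worker script path is hypothetical):

// Fire-and-forget: start the worker via the CLI binary and discard its output,
// so the web request returns immediately while the zipping runs in the background.
shell_exec('php /path/to/zip_worker.php > /dev/null 2>&1 &');
echo 'Job started, check back later for the finished archive.';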
I have a script that updates my database with listings from eBay. The number of sellers it grabs items from is always different, and some sellers have over 30,000 listings. I need to be able to grab all of these listings in one go.
I already have all the data pulling/storing working since I've created the client side app for this. Now I need an automated way to go through each seller in the DB and pull their listings.
My idea was to use CRON to execute the PHP script which will then populate the database.
I keep getting Internal Server Error pages when I'm trying to execute a script that takes a very long time to execute.
I've already set
ini_set('memory_limit', '2G');
set_time_limit(0);
error_reporting(E_ALL);
ini_set('display_errors', true);
in the script but it still keeps failing at about the 45 second mark. I've checked ini_get_all() and the settings are sticking.
Are there any other settings I need to adjust so that the script can run for as long as it needs to?
Note the warnings from the set_time_limit function:
This function has no effect when PHP is running in safe mode. There is no workaround other than turning off safe mode or changing the time limit in the php.ini.
Are you running in safe mode? Try turning it off.
This is the bigger one:
The set_time_limit() function and the configuration directive max_execution_time only affect the execution time of the script itself. Any time spent on activity that happens outside the execution of the script such as system calls using system(), stream operations, database queries, etc. is not included when determining the maximum time that the script has been running. This is not true on Windows where the measured time is real.
Are you using external system calls to make the requests to eBay, or long calls to the database?
Profile your PHP script and look for particularly long operations (> 45 seconds), then try to break those operations into smaller chunks.
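For example, a rough way to time a suspect section (the 5-second threshold is arbitrary):

// Wrap a suspect section and log anything that takes unusually long.
$start = microtime(true);
// ... eBay request or long database query goes here ...
$elapsed = microtime(true) - $start;
if ($elapsed > 5) {
    error_log(sprintf('Slow operation: %.1f seconds', $elapsed));
}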
Well, as it turns out, I overlooked the fact that I was testing the script through the browser, which means Apache was handling the PHP process via mod_fcgid, which had a timeout of exactly 45 seconds.
Executing the script directly from shell and CRON works just fine.
I have made a PHP script which will probably take about 3 hours to complete. I run it from the browser, and after about 45 minutes it stops doing anything. I know this because it polls certain web addresses and then saves data to the database; it simply stops putting any data into the database, which leads me to the conclusion that it has stopped. The browser still shows the page as loading, though, and it never finishes.
There aren't any errors, so it is probably some kind of timeout... but where it occurs is a mystery, as is how to prevent it. In my case I can't use the CLI; I must initiate the script from the browser.
I have tried to put
set_time_limit(0);
But it had no apparent effect. Any suggestions as to what could cause the timeout, and how to fix it?
Try this:
set_time_limit(0);
ignore_user_abort(true);
ini_set('max_execution_time', 0);
Most webhosts kill processes that run for a certain length of time. This is intended as a failsafe against infinite loops.
Ask your host about this and see if there's any way it can be disabled for this particular script of yours. In some cases, the killer doesn't apply to Cron tasks, or processes run by SSH. However, this varies from host to host.
Might be the browser that's timing out, not sure if browsers do that or not, but then I've also never had a page require so much time.
Suggestion: I'm assuming you are running a loop. Load a page then run each iteration of the loop in an ajax call to another page, not firing the next iteration until the previous one returns.
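A minimal sketch of such an endpoint (the offset parameter and the process_batch helper are hypothetical stand-ins for your own per-iteration logic):

// chunk.php - each AJAX call processes one batch and reports where to resume.
$offset = isset($_GET['offset']) ? (int) $_GET['offset'] : 0;
$batch  = 50; // items per request; keep it well under the timeout

$finished = process_batch($offset, $batch); // hypothetical wrapper around your per-item work

header('Content-Type: application/json');
echo json_encode([
    'nextOffset' => $offset + $batch,
    'finished'   => $finished,
]);

The page that starts the job keeps calling this endpoint with the returned nextOffset until finished is true.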
There's a setting in PHP to kill processes after some time. This is especially important for shared servers (you do not want one process slowing down the whole server).
You need to ask your host if you can make modifications to php.ini (or through .htaccess), in particular to the max_execution_time setting.
If you are using sessions, then you would also need to look at 'session.cookie_lifetime' and not just set_time_limit. If you are accumulating data in an array, it might also fill up memory.
Without more info on how your script handles the task, it would be difficult to identify.
I have a massive amount of data that needs to be read from MySQL, analyzed, and, based on the results, split up and stored in new records.
Five records take about 20 seconds, but the records vary in length, so I can't really estimate how long the program will take. However, I have calculated that the process should not take much longer than 5 hours, so I'd like to run it overnight and feel quite sure that when I come back to the office the next morning the program is done.
Assuming the code is fail-safe (I know, right ;), how should I set up the Apache / PHP / MySQL settings so that when I execute the script I can be sure the program will not time out and/or run out of RAM?
(It is basically running in a loop, fetching sets of 100 rows until it can't anymore, so I am hoping that the variables being reset at the beginning of each iteration will keep the memory usage constant.)
The actual size of the database when dumped is 14 MB, so the volume of data is not that high.
(On a side note, it might also be that I haven't assigned the maximum resources in the server settings, so maybe that's why it takes 20 seconds to run 5 records.)
Make sure you have removed any max_execution_time limits by setting this to 0 (unlimited) in your PHP.ini or by calling set_time_limit(0). This will ensure that PHP doesn't stop the script mid-execution.
If at all possible, you should run the script from the CLI so that you don't have to worry about Apache timing your request out (it shouldn't, but it might).
Since you are working with only 15 MB of data I wouldn't worry about memory usage (128 MB is the default in PHP). If you are really worried, you can remove memory limits in PHP by setting memory_limit to either a higher number or -1 (unlimited memory).
Keep in mind modifying the PHP.ini will affect all scripts that are interpreted by that installation. I prefer to use the appropriate ini setting functions at the top of my scripts to prevent dangerous global changes.
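For example, a couple of per-script overrides at the top of the batch script (the values are illustrative):

set_time_limit(0);               // no execution-time limit for this script only
ini_set('memory_limit', '256M'); // raise only as far as you actually need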
On a side note: This doesn't really sound like a job for PHP. I'm not trying to discourage your use of PHP here, but there are other languages that are better suited for command line usage.
Better to make your script exit after a while and then restart it, storing the point where it left off last time. This ensures you do not have memory leaks, that the script does not run out of memory due to some error in garbage collection, and that execution continues after an unexpected failure.
A simple shell command would be:
while [ 1 ]; do php myPhpScript.php a; done
You can add other checks to ensure it keeps running properly.
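A minimal checkpoint sketch for myPhpScript.php, assuming progress can be keyed by a row offset kept in a small state file (the file name and the fetch_rows/process_row helpers are hypothetical):

// Resume from the last saved offset, process one bounded batch,
// persist progress, then exit so the shell loop restarts the script.
$stateFile = __DIR__ . '/progress.txt';                  // hypothetical state file
$offset = is_file($stateFile) ? (int) file_get_contents($stateFile) : 0;

$rows = fetch_rows($offset, 100);                        // hypothetical: your SELECT ... LIMIT 100
if (empty($rows)) {
    @unlink($stateFile);                                 // all done; stop the outer loop manually
    exit(0);
}

foreach ($rows as $row) {
    process_row($row);                                   // hypothetical per-row analysis/insert
}

file_put_contents($stateFile, $offset + count($rows));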
I'd like to point out that, by default, PHP scripts run via the CLI have no time limit, unlike scripts run through CGI, mod_php, etc.
And, as stated, avoid running this via Apache.
However, if you MUST do this, consider breaking it down. You can make a page that processes 5-10 results, appends to the dump file, then prints either a meta refresh or some JavaScript to reload the page with a parameter telling it where it's up to, and continues until done.
Not recommended though.
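If you do go this route anyway, a rough sketch of the meta-refresh variant might look like this (process_batch and the offset parameter are hypothetical):

// Process a small batch, then tell the browser to reload with the next offset.
$offset = isset($_GET['offset']) ? (int) $_GET['offset'] : 0;
$batch  = 10;

$finished = process_batch($offset, $batch); // hypothetical wrapper around your existing work

if ($finished) {
    echo 'Done.';
} else {
    $next = $offset + $batch;
    echo '<meta http-equiv="refresh" content="1;url=?offset=' . $next . '">';
    echo "Processed up to row $next, continuing...";
}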
Adding to some of the other good options here: you might want to look at http://www.electrictoolbox.com/article/php/process-forking/ and also consider sending some requests to /dev/null if you don't need them to give back feedback.
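As a rough illustration of the forking idea, assuming the pcntl extension is available (typically CLI only):

// Fork a child to do the heavy work so the parent can return straight away.
$pid = pcntl_fork();
if ($pid === -1) {
    die('Could not fork');
} elseif ($pid === 0) {
    // Child process: do the long-running work here, then exit.
    do_long_work(); // hypothetical
    exit(0);
}
// Parent process continues (or exits) without waiting for the child.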
Don't do this using a web interface. Run it from the command line; but look to see if your code can be optimised, or set break points and do it in "chunks".
First of all, put set_time_limit(0); (see http://php.net/manual/en/function.set-time-limit.php) at the beginning of the script.
As for memory, you should take care of that by unsetting any variables, arrays, or references that you no longer need on each iteration.
Better to run the script from the shell (CLI) or as a cron job.
As far as I know MySQL connections do not time out, so you should be safe by setting:
php_value max_execution_time X
in a .htaccess file or placing set_time_limit(X) at the beginning of your script where X is a comfortable value in seconds.
I want to upload and download files to a server. As the server is in safe mode, it does not allow increasing the execution time of the script. Since I can't increase the time limit, the script is timing out. Any help is appreciated.
There is an ugly hack I remember for server environments where you cannot control the timeout. Here goes:
1. When a process like this starts, you set a process ID in the DB.
2. You find out the timeout for the page and reload the page yourself, using JavaScript, before the timeout happens, passing along the process ID stored in the DB on each such load.
3. On page load, you check whether the process ID exists in the DB and restart the script from where it left off (I am not sure how this will work for an upload/download, but you can probably break the file into pieces).
4. On completion of the job, delete the process ID from the DB.
As I mentioned, this is an ugly hack, so please only use it if you are left with no other choice or if no one can suggest a better option. Let me know how it goes or if you need any details.
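A rough sketch of the idea, assuming a PDO connection in $db, a hypothetical jobs table (id, position), and made-up transfer helpers:

$processId = isset($_GET['pid']) ? $_GET['pid'] : null;

if ($processId === null) {
    // First load: register a new job and start from the beginning.
    $processId = uniqid('job_', true);
    $db->prepare('INSERT INTO jobs (id, position) VALUES (?, 0)')->execute([$processId]);
    $position = 0;
} else {
    // Reload: pick up where the previous request left off.
    $stmt = $db->prepare('SELECT position FROM jobs WHERE id = ?');
    $stmt->execute([$processId]);
    $position = (int) $stmt->fetchColumn();
}

$position = transfer_next_piece($position);   // hypothetical: handles one chunk of the file

if (transfer_finished($position)) {           // hypothetical
    $db->prepare('DELETE FROM jobs WHERE id = ?')->execute([$processId]);
    echo 'Done.';
} else {
    $db->prepare('UPDATE jobs SET position = ? WHERE id = ?')->execute([$position, $processId]);
    // Reload just before the server-side timeout fires; adjust the 5000 ms to your limit.
    echo "<script>setTimeout(function () { location.href = '?pid=$processId'; }, 5000);</script>";
}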
Got this from the PHP manual:
You can not change this setting with ini_set() when running in safe mode. The only workaround is to turn off safe mode or by changing the time limit in the php.ini.
So, I think that you have pretty slim options.