I have a PHP page that I am executing from another PHP page like this: exec('c:\php\php.exe processing.php > NUL');
The processing page runs a SQL stored procedure that takes a while to complete. Once the stored procedure finishes, the page creates a .csv from the results and drops it into a directory.
This entire process works fine as long as it does not take too long to execute. With a few thousand records it takes about 3-5 minutes and completes fine. With around 20K records it needs more than 10 minutes, and it gets killed before it's done.
Speed is not the issue; we are fine with it taking a while to run, but somehow the script is getting killed. While it's running I can see the CLI process in the Windows Task Manager, but if it runs for more than about 10 minutes something kills it.
I am just trying to figure out how I can run this without the process getting killed for taking too long.
Any help on this would be greatly appreciated.
Thanks!
Try using set_time_limit(0) (see documentation: http://php.net/manual/en/function.set-time-limit.php).
Setting the limit to 0 removes the time limit entirely.
Also, make sure you don't have a memory issue. If your script tries to allocate more memory than memory_limit allows, it will be killed anyway. You can check your script's memory usage with memory_get_usage() (see documentation: http://php.net/manual/en/function.memory-get-usage.php). A small sketch combining both is below.
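A minimal sketch of what the top of processing.php could look like; the 512M value is only an assumption, size it to what memory_get_peak_usage() actually reports:

<?php
// Top of processing.php (sketch): lift the execution time limit and, if measurements
// show memory is the real bottleneck, raise memory_limit as well.
set_time_limit(0);               // no execution time limit for this run
ini_set('memory_limit', '512M'); // assumed value; adjust to what you actually measure

// ... run the stored procedure and write the .csv here ...

// Log peak memory so you can tell whether memory, not time, is what kills the script.
error_log('Peak memory: ' . memory_get_peak_usage(true) . ' bytes');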
Related
I need help with a problem:
I have a PHP script that runs multiple times with multiple files. It uses a lot of CPU and ends up bringing the server down. I would like to limit the processor usage for this user as much as possible, so that it stops crashing the server and runs until it ends. Even if it runs very slowly, the important thing is that it finishes without the server going down.
Anyone have any idea?
I already tried limiting it via /etc/security/limits.conf:
#user hard core 10000
But it had no effect.
Any idea?
On a Linux system you can use proc_nice() to reduce the priority of the script.
Also, you can add a sleep(1) call somewhere in your code (for example inside a big/infinite loop), as in the sketch below.
Sleeping for 1 second is like a year for a processor :)
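A rough sketch of both ideas together; the file list and the per-file work are placeholders for whatever your script actually does:

<?php
// Lower this process's CPU priority on Linux; 19 is the lowest ("nicest") priority.
if (function_exists('proc_nice')) {
    proc_nice(19);
}

// Placeholder work loop: pause briefly between heavy chunks so other processes get CPU time.
$files = glob('/path/to/input/*.csv'); // hypothetical input set
foreach ($files as $file) {
    // ... do the heavy per-file processing here ...
    sleep(1); // yield the CPU between files
}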
I ran a test to check whether a FOR/WHILE loop fails or not when it loops through a few million records.
My test
I have more than 1,000,000 records in a table. If I fetch them and print them through a for loop and run the script from the command line, it works successfully, but when I try it in a web browser it hangs and my browser closes.
So I want to know: is this a failure of PHP (apparently not, since it worked on the command line) or a failure of the web server?
What should I do if I run into such a situation in a real web application?
Should I switch to a more powerful web server, and if so, which one?
I'm guessing you're running into the max execution time limit, which is unlimited on the command line but defaults to 30 seconds under the web server. Try adding the following line at the beginning of your script to remove the limit (a quick way to verify the effective limit in each environment is sketched after the quote below):
set_time_limit(0);
From the official PHP docs:
max_execution_time
This sets the maximum time in seconds a script is allowed to run before it is terminated by the parser. This helps prevent poorly written scripts from tying up the server. The default setting is 30. When running PHP from the command line the default setting is 0.
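If you want to double-check which limit actually applies in each environment, a small sketch like this prints the SAPI and the effective setting:

<?php
// Print which environment we're in and what limit applies there; the values differ
// between the CLI (0, no limit) and the web SAPI (30 by default), as quoted above.
echo 'SAPI: ', php_sapi_name(), PHP_EOL;
echo 'max_execution_time: ', ini_get('max_execution_time'), PHP_EOL;

set_time_limit(0); // remove the limit for this request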
If you want to print 1,000,000 records to the browser, you're going to have a bad time...
First, what FuzzyTree said: execution time. But since you say it works in the CLI, I guess that's not the problem.
The problem with this amount of data is probably the browser, which simply cannot handle that much! Check your CPU and RAM usage while visiting that page.
If you REALLY need to show the user a million records, use pagination, or load new records via AJAX when the user reaches the bottom of the page. A minimal pagination sketch is below.
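A minimal pagination sketch; the connection details, the table name (records) and the page size are assumptions to adapt to your schema:

<?php
// Fetch and display one page of rows at a time instead of the whole table.
$pdo = new PDO('mysql:host=localhost;dbname=test', 'user', 'pass'); // placeholder credentials

$perPage = 100;
$page    = isset($_GET['page']) ? max(1, (int) $_GET['page']) : 1;
$offset  = ($page - 1) * $perPage;

$stmt = $pdo->prepare('SELECT * FROM records ORDER BY id LIMIT :limit OFFSET :offset');
$stmt->bindValue(':limit', $perPage, PDO::PARAM_INT);
$stmt->bindValue(':offset', $offset, PDO::PARAM_INT);
$stmt->execute();

while ($row = $stmt->fetch(PDO::FETCH_ASSOC)) {
    echo htmlspecialchars(implode(', ', $row)), "<br>\n";
}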
A PHP page of our web application has a rather complex query. On the production website this takes about 20 seconds to complete, and then the results are displayed. I have a test version of the site running on my local desktop. When I run the (Postgres) query directly in PGAdmin, it takes about 3 minutes to complete.
So my desktop is slow; that is not the real problem right now. PHP does, however, stop loading the page and displays no results. Plain text that should appear right after the results is not shown either. Searching for solutions, I found I could set max_execution_time for PHP on the page itself. I did so with the following code on the first line of the page:
ini_set('max_execution_time', 300); //300 seconds = 5 minutes
This doesn't make a difference. The page loading stops after 6 seconds. What can I do to display the results?
Use set_time_limit()
You should read the documentation for this function, where it says: "Warning: This function has no effect when PHP is running in safe mode. There is no workaround other than turning off safe mode or changing the time limit in the php.ini."
So make sure your PHP isn't running in safe mode; otherwise there won't be any way to set or modify the script execution time. A quick check is sketched below.
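A small sketch of that check; the 300-second value mirrors the ini_set() call from the question and is only an example:

<?php
// If safe mode is on (PHP <= 5.3), set_time_limit() / max_execution_time changes are ignored.
if (ini_get('safe_mode')) {
    die('safe_mode is enabled; set_time_limit() will have no effect. Change php.ini instead.');
}

set_time_limit(300); // allow the page up to 5 minutes for the slow query
// ... run the Postgres query and display the results here ...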
I have a script that loads a CSV via cURL. Once I have the CSV, it adds each record to the database and, when finished, displays the total number of registries added.
With fewer than 500 registries it executes just fine. The problem is that whenever the number of registries is too big, execution is interrupted at some point and the browser shows a download dialog for a file named like the last part of my URL, with no extension and containing nothing: no warning, error, or message of any kind. The database shows that some of the registries were added, and if I run the script several times it adds a small amount more each time.
I have tried to find someone with a similar situation but haven't found one yet.
I would appreciate any insight on the matter; I'm not sure if this is a Symfony2 problem, a server configuration problem, or something else.
Thanks in advance.
Your script is probably reaching the maximum PHP execution time, which is 30 seconds by default. You can change it in the controller that does the lengthy operation with PHP's set_time_limit() function. For example:
set_time_limit(300); // 300 seconds = 5 minutes
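In a Symfony2 controller that might look like the sketch below; the namespace, class and action names are placeholders, not your actual code:

<?php
// Hypothetical Symfony2 controller that raises the limits before the long import.
namespace Acme\ImportBundle\Controller;

use Symfony\Bundle\FrameworkBundle\Controller\Controller;
use Symfony\Component\HttpFoundation\Response;

class ImportController extends Controller
{
    public function importAction()
    {
        set_time_limit(300);             // allow this request up to 5 minutes
        ini_set('memory_limit', '512M'); // big CSVs can also exhaust memory (assumed value)

        // ... download the CSV with cURL and insert the registries here ...

        return new Response('Import finished');
    }
}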
That's more a limitation of the web server/environment PHP is running in.
Increase max_execution_time to allow your web server to run the request longer. An alternative would be writing a console command; the CLI environment isn't restricted in most setups. A sketch of such a command follows.
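A sketch of a Symfony2 console command (all names are placeholders); you would run it with php app/console acme:import-csv, where max_execution_time defaults to 0:

<?php
// Hypothetical Symfony2 console command: the same import logic, but run from the CLI
// where max_execution_time defaults to 0 (no limit).
namespace Acme\ImportBundle\Command;

use Symfony\Bundle\FrameworkBundle\Command\ContainerAwareCommand;
use Symfony\Component\Console\Input\InputInterface;
use Symfony\Component\Console\Output\OutputInterface;

class ImportCsvCommand extends ContainerAwareCommand
{
    protected function configure()
    {
        $this->setName('acme:import-csv')
             ->setDescription('Download the CSV and import its registries');
    }

    protected function execute(InputInterface $input, OutputInterface $output)
    {
        // ... download the CSV with cURL and insert the registries here ...
        $output->writeln('Import finished');
    }
}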
I have a PHP script that pulls down a bunch of RSS feeds. To prevent overloading the publishers' servers, I use PHP's sleep() function to slow things down.
The entire script could last for a couple of hours.
If I run this from a cron job on GoDaddy, it will happily work for 5-10 minutes and then return a server error. I checked, and the PHP maximum execution time is 30 seconds, so I'm not sure whether that is the cause of the problem.
On my Mac, my local PHP also has a default maximum execution time of 30 seconds, yet the script does work when I run it from the terminal, and I don't understand why.
How do I loop a script that will exceed 30 seconds without running into unreliability problems?
Help appreciated.
The short answer is to use set_time_limit(0) to allow a long-running script. Your terminal (CLI) PHP probably already has it set to 0. You could also be running out of memory, especially on PHP 5.2 or older. Log all errors to a file and inspect it.
You could also rewrite your program to work on a subset of the data during each run. The benefit of that approach is that you could run it 24/7 or every five minutes, depending on what the PHP environment supports, and you could run multiple instances at a time, each working on its own data. A rough sketch of such a resumable run follows.
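A rough sketch of one way to make each run resumable; feeds.txt, the cursor file, the batch size and the fetch/parse step are all placeholders:

<?php
// Process a fixed batch of feeds per invocation and remember where we stopped,
// so a cron job can call this script every few minutes without hitting time limits.
$feeds      = file('feeds.txt', FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES); // one feed URL per line
$cursorFile = __DIR__ . '/feeds.cursor';
$batchSize  = 20;

$start = is_file($cursorFile) ? (int) file_get_contents($cursorFile) : 0;

foreach (array_slice($feeds, $start, $batchSize) as $url) {
    $xml = @file_get_contents($url); // fetch one feed
    // ... parse and store $xml here ...
    sleep(2);                        // be polite to the publisher's server
}

// Advance the cursor, wrapping around once every feed has been processed.
file_put_contents($cursorFile, ($start + $batchSize) % max(1, count($feeds)));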