About max execution timeout - php

I have a question about the server's max execution timeout.
If I call a server API to run something huge that cannot finish within the time limit set by the max_execution_time setting in the server's php.ini, will the process on the server still continue to run?
- If so, will it run endlessly?
- If not, is the process stopped immediately, or does it cancel the loop iterations one by one and finish all the remaining work?
In my experience, when I receive a max execution timeout on local hosting, the data has already been processed.
So I am not sure whether the request was simply stuck waiting for a response until the timeout, or whether the server kept running after throwing the max execution timeout exception.

It really depends on what your PHP code is like.
Usually the code execution will halt. You can alter this behaviour using ignore_user_abort().
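A minimal sketch of that combination, assuming a web SAPI and an example 300-second limit; the loop body is only a placeholder:
<?php
// Keep running even if the client closes the connection (web SAPI only;
// CLI scripts are not affected by client aborts anyway).
ignore_user_abort(true);
// ignore_user_abort() does not lift the time limit, so raise it as well.
set_time_limit(300); // example value for this sketch
for ($i = 0; $i < 100; $i++) {
    usleep(100000); // placeholder for the real long-running work
}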

The PHP interpreter runs scripts against the php.ini configuration and enforces settings such as max_execution_time = 500 and max_input_time = 500.
PHP does not continue to run the script after max_execution_time is reached; it simply kills the script.
What can also happen is that the script starts a database query; normally the query will keep running on the database server until it finishes, no matter what happens to the script. You may also get a Gateway Timeout from the web server; for Apache, check httpd.conf and look for the Timeout setting.
If you need to run a script that takes a lot more time than the rest of your website, have the web page (the PHP on the server) fork a new process that runs the slow part as a background script, and inform the user via asynchronous status updates or an email when processing has ended. You should not raise max_execution_time for every script just to handle one exception.
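One way that hand-off is often done on a Unix-like host is to launch the slow part as a detached CLI process; worker.php and the log path below are hypothetical, and exec() must be allowed by the hosting environment:
<?php
// Hand the slow part off to a background CLI process so the web request
// can return immediately.
$worker = escapeshellarg(__DIR__ . '/worker.php'); // hypothetical script
$log    = escapeshellarg('/tmp/worker.log');       // hypothetical log file
// nohup plus '&' detaches the process; its output goes to the log file
// instead of blocking the current request.
exec("nohup php $worker > $log 2>&1 &");
echo "Job started; you will be emailed when it finishes.";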

It doesn't continue after the exception is thrown; the script is simply cut off when the time is up.
Anything before the timeout has already been executed, unless the code was specifically designed to prevent this.

The process won't continue; it stops immediately once the time limit set by the max_execution_time setting in the server's php.ini has been reached, and PHP throws a max execution timeout error.
See here (How to increase maximum execution time in php) if you want to increase the maximum execution time in a PHP file.
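For reference, the usual ways to raise the limit look roughly like this (the 300-second value is only an example; 0 means no limit):
<?php
ini_set('max_execution_time', 300); // per-script override, in seconds
set_time_limit(300);                // equivalent call, restarts the timer
// Alternatively (not PHP code): set "max_execution_time = 300" in php.ini
// and restart the web server / PHP-FPM.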

Related

How to trace a script running successfully or timeout in PHP

I have a PHP script that runs a crawling job and will probably take more than 5 minutes to complete.
My questions are as follows:
If I execute the script via a browser request, I will probably hit a request timeout after 30 seconds, but does the script keep running on the server until completion?
If I execute the script via a cron job, how do I trace its running status? How do I know whether the script is still running or has already been killed by the server?
Is it possible to increase the maximum execution time via PHP code without touching the php.ini file?
I appreciate any replies.
If I execute the script via a browser request, I will probably hit a request timeout after 30 seconds, but does the script keep running on the server until completion?
Your script also stops processing on the server.
If I execute the script via a cron job, how do I trace its running status? How do I know whether the script is still running or has already been killed by the server?
You can track it by writing to a log file at the beginning of your script and again at the end.
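A minimal sketch of that logging approach; the log path is only an example:
<?php
// Log when the cron run starts and when it finishes; a "started" line with
// no matching "finished" line tells you the script was killed part-way.
$log = '/tmp/crawler.log'; // example path
file_put_contents($log, date('c') . " started\n", FILE_APPEND);
// ... the actual crawling work goes here ...
file_put_contents($log, date('c') . " finished\n", FILE_APPEND);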
Is it possible to increase the maximum execution time via PHP code without touching the php.ini file?
You can increase the maximum execution time via PHP code with
ini_set('max_execution_time', 300);
but it will only work if your HTTP_CONNECTION variable is set to keep-alive on the server.

Very long script keeps failing

I have a script that updates my database with listings from eBay. The number of sellers it grabs items from is always different, and some sellers have over 30,000 listings. I need to be able to grab all of these listings in one go.
I already have all the data pulling/storing working since I've created the client side app for this. Now I need an automated way to go through each seller in the DB and pull their listings.
My idea was to use CRON to execute the PHP script which will then populate the database.
I keep getting Internal Server Error pages when I'm trying to execute a script that takes a very long time to execute.
I've already set
ini_set('memory_limit', '2G');
set_time_limit(0);
error_reporting(E_ALL);
ini_set('display_errors', true);
in the script but it still keeps failing at about the 45 second mark. I've checked ini_get_all() and the settings are sticking.
Are there any other settings I need to adjust so that the script can run for as long as it needs to?
Note the warnings from the set_time_limit function:
This function has no effect when PHP is running in safe mode. There is no workaround other than turning off safe mode or changing the time limit in the php.ini.
Are you running in safe mode? Try turning it off.
This is the bigger one:
The set_time_limit() function and the configuration directive max_execution_time only affect the execution time of the script itself. Any time spent on activity that happens outside the execution of the script such as system calls using system(), stream operations, database queries, etc. is not included when determining the maximum time that the script has been running. This is not true on Windows where the measured time is real.
Are you using external system calls to make the requests to eBay, or long-running calls to the database?
Profile your PHP script to find particularly long operations (> 45 seconds), then try to break those operations into smaller chunks.
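One crude way to find those operations is to time each phase yourself; the phase names and bodies below are placeholders:
<?php
// Rough manual profiling: time each phase and log anything suspiciously
// long, so you know which part to break into smaller chunks.
function timed(string $label, callable $work): void
{
    $start = microtime(true);
    $work();
    error_log(sprintf('%s took %.1f s', $label, microtime(true) - $start));
}
timed('fetch listings from eBay', function () { usleep(200000); });
timed('store listings in the DB', function () { usleep(100000); });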
Well, as it turns out, I had overlooked the fact that I was testing the script through the browser, which means Apache was handling the PHP process via mod_fcgid, which had a timeout of exactly 45 seconds.
Executing the script directly from the shell or from cron works just fine.

long running php script called via ajax (Don't want to wait for response)

I have a long-running script that can run for a while (it sends an email every 5 seconds to many users). This script is triggered via an AJAX request.
If the response is never received, for example because the browser is closed, will the script continue to run? (It appears it does, but are there any conditions under which it won't?)
Does sleep count towards the max execution time? (It appears this is false as well.)
1. The short answer is: it depends.
It can be configured both in PHP and in the web server you use, and it depends on the mode you run PHP in (module, CGI, or whatever).
You can sometimes configure it, though. There is an option in php.ini:
; If enabled, the request will be allowed to complete even if the user aborts
; the request. Consider enabling it if executing long requests, which may end up
; being interrupted by the user or a browser timing out. PHP's default behavior
; is to disable this feature.
; http://php.net/ignore-user-abort
;ignore_user_abort = On
2. Sleep almost always does count. There are conditions under which it does not, but in those cases it is CPU time, not execution time, that is measured. IIS counts CPU usage per app pool; I'm not sure how that applies to PHP scripts.
It is true that PHP does not kill a script while it is sleeping, which means the script will be killed once the sleep is over (easy test: add sleep(15); to your script and set the max execution time to 10; you will get an error about the time limit, but after 15 seconds, not after 10).
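That test would look roughly like this; keep in mind that time spent in sleep() is not always counted towards the limit (on Linux it typically is not), so the outcome can vary by platform:
<?php
set_time_limit(10); // 10-second limit
sleep(15);          // sleep longer than the limit
echo "reached the end\n"; // whether this line is reached depends on the platform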
In general, you cannot rely on freely using sleeps in a script, nor should you rely on a script that runs without a user (browser) inside a web server. You are apparently solving the problem with the wrong methods: you should really consider using cron jobs or separate background tasks.
This depends on the server. Some servers could terminate the script when the socket is closed (which will probably happen when the browser is closed), others could let the script execute until the timeout is reached.
Again, this would depend on the server. I can easily see an implementation looking at how long the script puts load on the CPU; then again, just measuring how long ago the script was started is an equally good approach. It all depends on what the people making the server software were after.
If you want definite answers, I would suggest sifting through the documentation for the web server and PHP implementation your script is running on.

What happens when the server is in an infinite loop and the client stops?

I am trying to figure out how the "talking" between the server and the client is done.
So, when the server is running an infinite loop, echoing "hello<br />" for example, what happens when the client stops, or hits 'back'?
How does the server know the loop has ended, or does it end up with an endless process on its side?
Is there anywhere I can read about it just to get the big picture?
The client (browser) has a TCP/IP session established with your server, waiting for the HTTP response of your website. When the user hits back/cancel/close, this TCP connection is closed immediately by the client.
The webserver (i.e. apache) will inform the PHP interpreter of the TCP connection close.
Unless the php.ini directive ignore_user_abort is set to 1 (on the server side; 0 is PHP's default), the PHP interpreter will then abort script execution when the current atomic operation finishes (in your example: echo()).
However, even if you set ignore_user_abort explicitly to 1, you will still hit PHP's max_execution_time or the Apache TimeOut (both are also configurable on the server side).
Also see ignore_user_abort() and set_time_limit().
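Putting those two together, a sketch of a loop that survives the disconnect but still notices it (via connection_aborted(), which generally only reports the abort after the script tries to send output) might look like this; the values are examples:
<?php
ignore_user_abort(true); // keep running after the client disconnects
set_time_limit(0);       // example: no PHP limit (web server timeouts still apply)
while (true) {
    echo "hello<br />\n";
    flush(); // attempt to deliver output; a failed send flags the abort
    if (connection_aborted()) {
        break; // client hit back or closed the tab; exit on our own terms
    }
    sleep(1);
}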
Even if your PHP script has an infinite loop, php.ini has a max_execution_time setting that will kill the process once the time limit is exceeded.
I am not sure how it will work when the client closes a connection. Apache might kill the process but I don't think PHP will be notified of the client's connection closing.
If you do set_time_limit(0); in the script (so the PHP interpreter lets it run forever), then the script will probably run until the web server kills it after however long the TimeOut variable is set to (defaults to 300 seconds I think, and as far as I know is only an Apache setting).
See Apache's docs for the TimeOut directive.

Does running a PHP script from SSH bypass max execution time?

Basically I have an issue. I am posting to my users' Facebook statuses using a cron job, but when I run the cron from the browser I get an error after about 30 seconds. I have edited the .ini file to allow a longer max execution time, but it doesn't seem to work.
It updates the statuses of the first 700 or so users, but after that it stops.
Can I run it from the terminal, or is there anything I can check/do to get around this?
When running PHP scripts from the command line, the default max execution time is 0, that is, unlimited. In an HTTP context there are other settings that can shut down your script, including the Apache Timeout directive. This is definitely a job I'd run through the PHP CLI.
I would enable error logging which would describe what limits your script is running into. There's a lot of possibilities - you may be hitting the memory limit, the execution time may be too low, the Facebook API may be rate-limiting your requests, etc.
Make sure that you'll see errors by doing:
error_reporting(E_ALL);
ini_set('display_errors',1);
at the top of your script.
You could be running into a max_execution_time ceiling, or you could be running out of memory, etc. Error messages will help with determining that.
As Frank Farmer implies in his comment, you can use set_time_limit(0); in your script to allow it to run indefinitely.
If you're having memory limit issues, you can raise the memory limit in your script (ini_set('memory_limit',...);), but you should really consider fixing your code so it doesn't keep consuming memory.
