I have some PHP scripts that process data.
Unfortunately, I can only run 6 scripts simultaneously.
It seems that there is some sort of limitation in PHP or Apache that makes the 7th script wait until another script ends.
The scripts only process data; there isn't any kind of connection to a database or any web request.
How can I increase this limit?
#Leo - How are you running these scripts simultaneously? Is a web browser calling them (you mentioned Apache in the question)? Browsers have a simultaneous connection limit. Or is it some other scenario?
Place this function at the top of each page:
set_time_limit(0);
More details can be found at
http://php.net/manual/en/function.set-time-limit.php
I have a PHP script which runs a crawling job and will probably require more than 5 minutes to complete.
My questions are as follows:
If I execute the script via a browser request, I will probably experience a request timeout after 30 seconds, but does the script keep running on the server until completion?
If I execute the script via a cron job, how do I trace its running status? How do I know if the script is still running or has already been killed by the server?
Is it possible to increase the maximum execution time via PHP code without touching the php.ini file?
I appreciate any replies.
If I execute the script via a browser request, I will probably experience a request timeout after 30 seconds, but does the script keep running on the server until completion?
No, your script also stops processing on the server.
If I execute the script via a cron job, how do I trace its running status? How do I know if the script is still running or has already been killed by the server?
You can track it by writing a log entry to a file at the beginning of your script and at the end of your script.
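For illustration, a minimal sketch of that logging idea (the log path is just an example):
<?php
// Log the start of the run; if "finished" never appears afterwards,
// the run was killed part-way through.
file_put_contents('/var/log/crawler.log', date('c') . " started\n", FILE_APPEND);

// ... the long-running crawling work ...

file_put_contents('/var/log/crawler.log', date('c') . " finished\n", FILE_APPEND);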
Is it possible to increase the maximum execution time via PHP code without touching the php.ini file?
You can increase the maximum execution time via PHP code with
ini_set('max_execution_time', 300);
but it will only work if your HTTP_CONNECTION variable is set to keep-alive on the server.
I have a PHP script on my Apache web server which starts another PHP script that runs for several hours. Right after the long-running script is started, no other PHP script requests are handled; the browser just hangs eternally.
The background script crawls other sites and gathers data from them, so it takes quite a long time.
At the same time, static pages are served without problems. Also, any PHP script started locally on the server from bash executes without problems.
CPU and RAM usage are low. In fact it's a test server, and my requests are the only ones being handled.
I tried to decrease the number of Apache processes in order to be able to trace all of them and see where the requests were hanging, but when I decreased the number of processes to 2 the problem went away.
I found no errors in either syslog or apache/error.log.
What else can I check?
Though I didn't find the reason for Apache hanging, I solved the task in a different way.
I've set up a schedule to run a script every 5 minutes. The web script just creates a file with the necessary parameters. The scheduled script checks for the file's existence; if the file exists, the script reads its content and then deletes it, to prevent the job from being started again on the next scheduled run.
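A rough sketch of that handoff, with a file path and parameter format made up for illustration:
<?php
// Web-facing script: just record the request parameters and return immediately.
file_put_contents('/tmp/crawl-job.json', json_encode($_POST));

<?php
// Scheduled script (cron, every 5 minutes): pick up the job file if present.
$jobFile = '/tmp/crawl-job.json';
if (file_exists($jobFile)) {
    $params = json_decode(file_get_contents($jobFile), true);
    unlink($jobFile); // delete first so the next run doesn't repeat the job
    // ... hours-long crawling work using $params ...
}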
I've got the following scenario: multiple users on a local network access a web application coded in PHP and hosted on an IIS server. At every page load, 3 AJAX calls to 3 separate PHP scripts are performed, and these calls repeat every X minutes (timed with jQuery). For every AJAX call of every connected user, a php-cgi process is opened on the server, quickly adding up to 20 or so processes. The problem is that after the AJAX call these processes remain open, using a large amount of memory on the server, with consequent performance problems (at times a total block).
All the PHP scripts are called via the jQuery $.post function, perform one or more queries on an MSSQL database, and end by echoing a JSON-encoded object or array. Is there a way to make these processes close after the execution of the PHP script? I would like to avoid the option of making serial calls instead of parallel ones.
Any help is greatly appreciated.
Thanks
Don't know if you have already got a solution, but anyway:
Try adding
die();
?>
at the end of the PHP scripts that are called. That would kill the called scripts after they have completed execution. As each call creates its own process, there would be no issues even if things are done in parallel.
When you overwrite a PHP file on a server (say, via SFTP) while it is being processed somewhere (perhaps it's a script that takes several seconds to complete), does it cancel the currently running script, or does that finish out even after the overwrite occurs? I suppose I'm asking: does Apache load a PHP script into memory before executing it (and does it hold on to that in memory for the duration of execution)?
Does Apache load a PHP script into memory before executing it (and does it hold on to that in memory for the duration of execution)?
Yes.
Nothing at all happens to the currently running script. It has already been loaded into memory in its compiled state; no matter how much time it takes, the web server won't load the new file unless you refresh the page.
I have some PHP code that executes for a very long time.
I need to implement the following scheme:
A user enters some page (page 1).
This page starts execution of my large PHP script in the background. (Every change is written to the database.)
Every N seconds we send a query to the database to get the current status of execution.
I don't want to use the exec command, because 1000 users would make 1000 PHP processes. That's not the way for me...
So you basically want a queue (possibly stored in a database) and a command-line script, run by cron, that processes queued items.
Clarification: I'm not sure what's unclear about my answer, but it complies with the two requirements imposed by the question:
The script cannot be aborted by the client
You share a single process between 1,000 clients
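Something like this sketch, where the jobs table and its columns are hypothetical names for illustration:
<?php
// worker.php - run from cron, e.g. every minute.
$db = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');

// Claim one pending job inside a transaction so concurrent workers don't collide.
$db->beginTransaction();
$stmt = $db->query("SELECT id, payload FROM jobs WHERE status = 'pending' ORDER BY id LIMIT 1 FOR UPDATE");
$job = $stmt->fetch(PDO::FETCH_ASSOC);
if ($job) {
    $db->prepare("UPDATE jobs SET status = 'running' WHERE id = ?")->execute([$job['id']]);
}
$db->commit();

if ($job) {
    // ... do the heavy work, periodically updating a progress column
    //     so page 1 can poll the current status ...
    $db->prepare("UPDATE jobs SET status = 'done' WHERE id = ?")->execute([$job['id']]);
}

The page the user enters then only inserts a row:
$db->prepare("INSERT INTO jobs (payload, status) VALUES (?, 'pending')")->execute([$payload]);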
Use HTTP requests to the local HTTP server from within your script, in combination with PHP's ignore_user_abort() function.
That way you keep the load inside the HTTP server's worker processes, you have a natural limit, and queuing of requests comes for free.
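An untested sketch of that idea (the script names, port, and endpoint are assumptions):
<?php
// front.php: fire-and-forget request to the worker on the local server,
// then return to the client immediately.
$fp = fsockopen('127.0.0.1', 80, $errno, $errstr, 1);
if ($fp) {
    fwrite($fp, "GET /worker.php HTTP/1.1\r\nHost: localhost\r\nConnection: Close\r\n\r\n");
    fclose($fp); // don't wait for the worker's response
}

<?php
// worker.php: keep running even though front.php disconnected right away.
ignore_user_abort(true);
set_time_limit(0);
// ... long-running work, writing status to the database ...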
You can use the CLI to execute multiple PHP scripts,
or
you can try Easy Parallel Processing in PHP
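For the CLI route, a minimal sketch that starts a few workers in parallel and waits for them all (worker.php is a placeholder name):
<?php
$procs = [];
for ($i = 0; $i < 4; $i++) {
    // Empty descriptor spec: the children inherit this script's stdio.
    $procs[] = proc_open('php worker.php ' . $i, [], $pipes);
}
foreach ($procs as $p) {
    proc_close($p); // blocks until that worker exits
}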