Long-lasting script prevents handling of new requests - PHP

I have a PHP script on my Apache web server which starts another PHP script that runs for several hours. Right after the long-lasting script is started, no other PHP requests are handled; the browser just hangs indefinitely.
The background script crawls other sites and gathers data from them, which is why it takes so long.
At the same time, static pages are served without problems, and any PHP script started locally on the server from bash executes without problems.
CPU and RAM usage are low. In fact, it's a test server and my requests are the only ones being handled.
I tried to decrease the number of Apache processes so I could trace all of them and see where the requests were hanging, but when I reduced the count to 2 the problem went away.
I found no errors in either syslog or apache/error.log.
What else can I check?

Though I didn't find the reason for Apache hanging, I solved the task a different way.
I set up a schedule to run a script every 5 minutes. The web script just creates a file with the necessary parameters. The scheduled script checks for the file's existence; if it exists, it reads the content and deletes the file to prevent a further scheduled start.
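A minimal sketch of that pattern (the file path, parameter names, and the crawl itself are placeholders, not the asker's actual code):

<?php
// request.php -- the web-facing script only drops a parameter file and returns.
file_put_contents('/var/spool/myapp/crawl-request.json',
    json_encode(['start_url' => $_POST['start_url']]));
echo 'Job queued.';

<?php
// worker.php -- run by cron every 5 minutes: */5 * * * * php /path/to/worker.php
$request = '/var/spool/myapp/crawl-request.json';
if (!file_exists($request)) {
    exit; // nothing queued
}
$params = json_decode(file_get_contents($request), true);
unlink($request); // delete first so the next scheduled run doesn't restart the job
// ... the hours-long crawl runs here, using $params ...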

Related

Server-side execution of PHP script via webpage

First of all, sorry to post a question that seems to have been flogged to death on SO before. However, none of the questions I reviewed helped me solve my specific problem.
I have built a web application that runs an extensive data processing routine in PHP (i.e. MySQL queries, calculations, etc.).
Depending on the amount of data fed to the app this processing can take quite a long time so the script needs to run server-side and independently from the web front-end.
There is a problem, however. It seems I cannot control the script execution time limit as long as the script is invoked via CGI.
When I run the script via SSH and the command line it works fine for however long it takes to process the data.
But if I use the exec() command in a PHP script called via the web server, I always end up with the error "End of script output before headers" after approximately 45 seconds.
Rather than having to fiddle with server settings (a nightmare in terms of portability) I would like to find a solution that kicks off the script independently from cgi.
Any suggestions?
Don't execute the long script directly from the website (that is, directly from Apache) because, as you've mentioned, it will block until it finishes and potentially time out. Instead, use the website to schedule a job (an execution of the long script) to be run immediately.
Here is a basic outline of how you can potentially do this:
Create a new, small database to store job requests, with fields such as job_id, processing_status, run_start_time, and other relevant fields
Create some Ajax that hits your server and writes a "job request" to this jobs database, set to execute immediately.
Add a crontab script or bot that periodically watches for new jobs (see the sketch after this list). If it finds a job that has not yet been processed but whose run_start_time has passed, run it using exec() or some other command executor. The command won't time out because it is run by the cron daemon, not by Apache.
When the command finishes, update the jobs database saying that processing is finished.
From your website, write a frontend that allows the user to see if the requested job is finished yet. Once it finishes, it displays some kind of "Done" indicator or something similar.
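A sketch of the watcher from the third step (the table name, column names, and DSN are assumptions):

<?php
// watcher.php -- run by cron, e.g. every minute: * * * * * php /path/to/watcher.php
$db = new PDO('mysql:host=localhost;dbname=jobs', 'user', 'pass');
$jobs = $db->query(
    "SELECT job_id FROM job_requests
     WHERE processing_status = 'pending' AND run_start_time <= NOW()"
);
foreach ($jobs as $job) {
    $db->prepare("UPDATE job_requests SET processing_status = 'running' WHERE job_id = ?")
       ->execute([$job['job_id']]);
    // The long work happens here, outside Apache, so no request timeout applies.
    exec('php /path/to/long-script.php ' . escapeshellarg($job['job_id']));
    $db->prepare("UPDATE job_requests SET processing_status = 'done' WHERE job_id = ?")
       ->execute([$job['job_id']]);
}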

Long-running PHP script called via Ajax (don't want to wait for response)

I have a long-running script (it sends an email every 5 seconds to many users). This script is triggered via an Ajax request.
If the response is never received, e.g. because the browser is closed, will the script continue to run? (It appears it does, but are there any conditions under which it won't?)
Does sleep() count towards the max execution time? (It appears it doesn't, either.)
1.
The short answer is: it depends.
In fact, it can be configured both in PHP and in the web server you use, and it depends on the mode you run PHP in (module, CGI, or otherwise). There is an option in php.ini:
; If enabled, the request will be allowed to complete even if the user aborts
; the request. Consider enabling it if executing long requests, which may end up
; being interrupted by the user or a browser timing out. PHP's default behavior
; is to disable this feature.
; http://php.net/ignore-user-abort
;ignore_user_abort = On
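The same behaviour can also be enabled from inside the script; a minimal sketch:

<?php
ignore_user_abort(true); // keep running even if the user aborts the request
set_time_limit(0);       // optionally remove the time limit for long jobs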
2. Almost always, sleep does count. There are conditions where it does not, but in those cases what is measured is not wall-clock execution time but CPU time. IIS counts CPU usage per app pool; I'm not sure how that applies to PHP scripts.
It is true that PHP does not kill a script while it is sleeping; that means the script will be killed once the sleep is over (an easy test: add sleep(15); to your script and set the max execution time to 10. You will get a time-limit error, but after 15 seconds, not 10).
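The test described above, as a sketch (note that on Linux, time spent in sleep() is not counted toward max_execution_time, so the outcome can differ by platform):

<?php
set_time_limit(10); // allow 10 seconds of execution time
sleep(15);          // sleep past the limit
echo "Reached the end"; // on some platforms the time-limit error fires instead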
In general, you cannot rely on freely using sleeps in a script, and you should not rely on a script run without a user (browser) within a web server, either. You are apparently solving the problem with the wrong methods: you should really consider using cron jobs / separate tasks.
This depends on the server. Some servers may terminate the script when the socket is closed (which will probably happen when the browser is closed); others may let the script execute until the timeout is reached.
Again, this depends on the server. I can well imagine an implementation that looks at how long the script has put load on a CPU, but just measuring how long ago the script was started is an equally good approach. It all depends on what the people writing the server software were after.
If you want definite answers, I would suggest sifting through the documentation for the web server and the PHP implementation your script runs on.

PHP program on shared server terminates in a different location each time - fails ~3% of the time

I've written a PHP script that scrapes one site and parses the data as input for my website.
The script is run periodically by a cronjob, and everything is hosted on a shared web server.
The problem: the script terminates several times a day with no error message, and at a random location in the code each time.
The script is long, performing 2 HTTP GETs and 4 HTTP POSTs to a website in another country, with each HTTP request taking ~3 seconds to complete; it also writes to files and reads from/writes to a MySQL database.
I'm stuck on it after trying the following things:
1) Talking with my hosting support (IxWebHosting) - they just wasted my time, denied responsibility, and advised me to limit the cronjob frequency to at most once every 5 minutes (it ran at a 3-minute interval before; the change made no difference.)
2) Instead of running in the cronjob context, I switched to the following method:
a. The cronjob calls a "loader" PHP script every 5 minutes.
b. The loader script calls the real PHP script with an HTTP GET and terminates without waiting for an answer.
c. The real PHP script performs its ~20-second job (this is where the program terminates at random locations).
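A fire-and-forget request like the one in step b might look like this (the host and path are placeholders):

<?php
// loader.php -- trigger the real script over HTTP and return immediately.
$fp = fsockopen('www.example.com', 80, $errno, $errstr, 5);
if ($fp) {
    fwrite($fp, "GET /real-script.php HTTP/1.1\r\n"
              . "Host: www.example.com\r\n"
              . "Connection: Close\r\n\r\n");
    fclose($fp); // close without reading the response
}

The real script then typically needs ignore_user_abort(true), since the connection is dropped deliberately.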
3) Writing timestamps to a log file at many points in the code to see where the program terminated on each run - this showed that it terminates all over the code.
4) To prove it isn't my code's fault, I performed the following test:
a. The cronjob calls another loader PHP script.
b. That script performs an HTTP request to a separate, testing-purpose PHP script and terminates without waiting for a response.
c. The second PHP script performs a dummy 20-second task: sleep for a second and write a timestamp into a log file, 20 times over.
Result: the test succeeded! The second program never failed. That means the problem involves my code together with the web server I'm running on; however, since it fails in a different place each time, and only ~10 times a day (out of 288 daily runs), I can't tell where (and PHP gives no error message).
Thanks in advance, and sorry for the long description - I'll be happy to provide more details upon request.
Are you logging the actual process, rather than writing logs from within the process? E.g. does your cron job look like:
* * * * * /home/user/myTroublesomeJob.php >/tmp/crash.log 2>&1
This will catch the stdout/stderr of the process itself. It may also be worth invoking your script from a parent shell script that can catch the PHP process exiting and dump out the exit code (which would indicate a core dump, a signal being caught, etc.).
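A parent wrapper along those lines might look like this (paths are placeholders):

#!/bin/sh
# run-job.sh -- record the PHP exit status alongside its output.
php /home/user/myTroublesomeJob.php >/tmp/job.log 2>&1
echo "$(date): exited with code $?" >> /tmp/job-exit.log

A non-zero exit code (e.g. 139 for a segfault, i.e. 128 + signal 11) points at the PHP process itself dying rather than at the script logic.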
Try setting the timeout at the start of the script.
set_time_limit(1800);
I found recently that if I ran a script manually, it went fine, but if it was run by Cron, it would throw timeout errors. Putting this limit in helped.
If you are running the script on a shared server, the host will not let you run long-running scripts.
If your script takes a long time, consider a dedicated server: on a shared server many users share the same resources, so the server automatically kills any script that consumes too much of them.
I'd also suggest looking at the Amazon EC2 free tier; there you will be able to run long-running scripts.

PHP script modified during parsing/execution

I have a PHP script running on a cron that can take a long time to process fully (e.g. 10 minutes). What would happen if this script were modified while it is being parsed? Or during execution? The reason I ask is that I have a number of scripts across servers that I want to place under version control, and I'm concerned about what might happen if a script gets updated while it is processing. If it is an issue, I could place some kind of lock on the file while it is running.
Nothing should happen to the running script, because by the time it starts running, PHP would have already parsed it and the opcodes are already in memory, so there's no more disk access.

PHP Script Crashes and Max Execution Time

I currently have a website that has twice been suspended by my hosting provider for "overusing system resources". In each case, there were 300 - 400 crashed copies of one of my PHP scripts left running on the server.
The scripts themselves pull an image from a web camera at home and copy it to the server. They use file locks to ensure only one can write at a time. The scripts are called every 3 seconds by any client viewing the page.
Initially I was confused, as I had understood that a PHP script either completes (returning the result), or crashes (returning the internal server error page). I am, however, informed that "defunct scripts" are a very common occurrence.
Would anyone be able to educate me? I have Googled this to death but I cannot see how a script can end up in a crashed state. Would it not time out when it reaches the max execution time?
My hosting provider runs PHP as CGI on a Linux platform. I believe I have actually identified the problem with my script: I did not realise that flock() is a blocking function (and I am not using the LOCK_NB flag). I assume that hundreds of copies of my script somehow end up blocked waiting for the lock to become available, and this leads to the crashes. Does this sound plausible? I am reluctant to re-enable the site for fear of it being suspended again.
Any insights greatly appreciated.
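For reference, a non-blocking lock attempt of the kind the asker describes (using the LOCK_NB flag; the lock-file path is a placeholder) might look like:

<?php
$fp = fopen('/tmp/webcam.lock', 'c'); // create the lock file if it doesn't exist
if (!flock($fp, LOCK_EX | LOCK_NB)) {
    exit; // another copy holds the lock: give up instead of piling up
}
// ... fetch and write the camera image ...
flock($fp, LOCK_UN);
fclose($fp);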
Probably the approach I would recommend is to use tempnam() first and write the contents to the temporary file (which may take a while). Once done, do the file locking, renaming, etc.
Not sure if this happens when a PUT request is being done; typically PHP will handle file uploads first before handing over the execution to your script.
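A sketch of that approach ($imageData and the paths are placeholders); creating the temp file in the target directory keeps rename() an atomic move on the same filesystem:

<?php
// $imageData is assumed to hold the fetched camera image.
$tmp = tempnam('/var/www/html', 'cam');   // temp file in the target directory
file_put_contents($tmp, $imageData);      // the slow write happens here
rename($tmp, '/var/www/html/webcam.jpg'); // readers never see a partial file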
The script could crash on either of these two limits:
max_execution_time
memory_limit
especially while working with resources. Also check the script for other errors, including notices.
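If the host permits raising these limits per script, a hedged example (values are illustrative; shared hosts often ignore or cap them):

<?php
set_time_limit(300);             // seconds of execution time; 0 removes the limit
ini_set('memory_limit', '256M'); // raise if large resources are processed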
