I have a PHP script that is looping and will continue to do so for about another hour. How do I stop it? The script explicitly overrides the timeout and the memory limit. It's on a shared hosting server with cPanel installed. The entire website is down until the script completes.
I had added a usleep(100000) statement, but it doesn't appear to work.
If the script is already running, you can't do anything except kill the process/webserver/host and restart it.
It is not possible unless you ask the server admin to kill the routine. Moreover, please avoid using the sleep() and usleep() functions in your PHP; PHP has limitations on executing those functions when it runs under Apache on Linux.
If you want to build delays in PHP, compare timestamps instead; see the code below.
$strCpTime = time();
$difference = $strCpTime - $_SESSION['submittime']; // seconds between the last submit time and now
if ($difference > 10) { // 10 seconds
    // ... submit form ...
}
$_SESSION['submittime'] = $strCpTime;
First, contact your host and ask them to kill the process. This is the best and simplest way.
I think the moral is: never override the limits outright, just raise them to a higher level.
Related
I have written a script that runs on a Windows machine doing various tasks in a loop. It works great, except it exits, for no apparent reason, every two hours.
I have used:
set_time_limit(0);
to ensure the script should keep running forever. The loop is actually caused by a class method calling itself - maybe there's a program counter limit or something?
I have written a bat file which automatically restarts the process if it dies, but I'd really rather it didn't die in the first place.
Does anyone have any suggestions?
I'm not 100% certain, but I am fairly sure that I did indeed hit some kind of recursion limit in PHP. Using a while() loop, the script has been running for at least 4 hours so far without a problem.
I'm not sure what the recursion limit is (or why an error wasn't displayed) but this has fixed the issue.
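For anyone comparing the two shapes, here is a minimal sketch (the class and method names are made up, not from my actual script): the self-calling version keeps every pending call's stack frame and memory alive, while the while() version keeps the stack flat no matter how long it runs. Stock PHP has no explicit recursion limit, but debuggers like Xdebug do enforce a nesting limit, and plain PHP will eventually exhaust memory or the stack.
class Worker
{
    // Self-calling version: every pass adds another stack frame that is never released.
    public function runRecursive()
    {
        $this->doTasks();
        sleep(60);
        $this->runRecursive();
    }

    // while() version: same behaviour, but the call stack never grows.
    public function runLoop()
    {
        while (true) {
            $this->doTasks();
            sleep(60);
        }
    }

    private function doTasks()
    {
        // the actual work goes here
    }
}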
I'm getting into WebSockets now and have been successfully using the hosted WebSocket services Pusher (didn't like it) and Scribble (amazing, but downtime is too frequent since it's just one person running it).
I've followed this tutorial http://www.flynsarmy.com/2012/02/php-websocket-chat-application-2-0/ on my localhost and it works great!
What I wanted to ask is: how do I set up the server.php from the above file to run as a WebSocket server on an online webhost/shared server?
Or do I need to get a VPS (and if so, which one do you recommend, and how can I set up the WebSocket server there, as I've never really used a VPS before)?
Thank you very much for reading my question and answering. I've read all other question/answers here regarding sockets but haven't been able to find the answer to my above questions yet. Hopefully I find it here!
This is tricky.
You need to execute the server.php script, and it needs to never exit. If you have SSH access to your shared server, you could execute it just like they do in the screenshot and make it run as a background task using something like nohup:
$ nohup php server.php
nohup: ignoring input and appending output to `nohup.out'
After invoking this (using the SSH connection), you may exit and the process will continue running. Everything the script prints will be stored into nohup.out, which you can read at any time.
If you don't have SSH access, and the only way to actually execute a PHP script is through Apache as the result of a page request, then you could simply open that page in a browser and never close it. But there will be a timeout sooner or later, the connection between you and Apache will close, and that will effectively stop the server.php script.
In either case, a lot of shared hosts will not permit a script to run indefinitely. You will notice there's this line in server.php:
set_time_limit(0);
This tells PHP that there's no time limit. If the host made PHP run in safe mode (which a lot of them do), then you cannot use set_time_limit and the time limit is probably 30 seconds or even less.
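If you want to see what your host actually allows, a quick check at the top of server.php could look roughly like this (my own sketch, not part of the tutorial, and it assumes an older PHP where safe mode still exists - it was removed in PHP 5.4):
if (ini_get('safe_mode')) {
    // In safe mode, set_time_limit() is ignored, so the script will still be killed
    echo "safe_mode is enabled - set_time_limit() will have no effect\n";
} else {
    set_time_limit(0);
}
echo 'max_execution_time is currently: ' . ini_get('max_execution_time') . "\n";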
So yes, a VPS is probably your best bet. Now, I don't own one myself, and I don't know what a good or bad price is, but I'd say HostGator seems fine.
(Our server is Linux based)
I'm an experienced PHP developer, but this is the first time I'll develop a bot that runs continuously and fetches data.
I'll explain my application with a simple (sample) scenario. I have about 2000 website URLs, and my application will visit these URLs and record the contents of each web page. This application will work 24 hours a day, 7 days a week. When it finishes all 2000 websites, it will start again from the beginning.
But I need some suggestions for my server. As you can see, my application will run indefinitely until I shut down the server. I can build this infinite loop like this:
while (true) {
    // APPLICATION CODE HERE
}
But I think this will be evil for the server :) Is it possible to do something like this on the server side?
I also thought about using cron jobs, but that doesn't work for my scenario, because my script has to start working again as soon as it finishes. I need "start again when you finish your work", not "start every 30 minutes", because I don't know whether fetching all 2000 websites will take more or less than 30 minutes.
I hope I explained it well.
I'm also worried about memory usage. As you know, the garbage collector cleans up memory after every PHP script stops. But as I said, my app won't stop for days (maybe weeks), so the garbage collector won't be triggered. I'm manually unsetting (with unset()) all used variables at the end of each pass. Is that enough?
I need some suggestions from server administrators :)
PS: I'm developing it as a console application, not a web application, so I can execute it from the command line.
Batch processing: store all the sites in a CSV or something, mark them after completion, then work on all the unmarked ones, then on all the marked ones, and so on. Only do, say, 1 or 5 at a time, and have cron initiate the batch script every minute.
Don't even try to work on all of them at once; if anything errors out, you won't know what happened.
You could even store the jobs in a database and keep processing stats there; that allows for fine-tuning and better reporting. A rough sketch of the idea is below.
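Something like this, for instance (the table and column names are invented purely for illustration, and it assumes PDO with MySQL): cron runs it every minute, it grabs a handful of unprocessed URLs, fetches them, marks them done, and exits; once everything is marked, it resets the flags so the next run starts over.
<?php
// Hypothetical schema: sites(id, url, processed, processed_at)
$db = new PDO('mysql:host=localhost;dbname=crawler', 'user', 'pass');

// Take a small batch of sites that have not been processed yet
$rows = $db->query('SELECT id, url FROM sites WHERE processed = 0 LIMIT 5')->fetchAll();

if (count($rows) === 0) {
    // The whole list is done - reset the flags so the next cron run starts over
    $db->exec('UPDATE sites SET processed = 0');
    exit;
}

$mark = $db->prepare('UPDATE sites SET processed = 1, processed_at = NOW() WHERE id = ?');
foreach ($rows as $row) {
    $content = file_get_contents($row['url']); // record/process the page content here
    $mark->execute(array($row['id']));
}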
You will probably hit time limits trying to run infinite PHP scripts, even from the command line, and your server admin will hate you. You will probably also run into memory limits if you don't release resources properly, which is far too easily done with PHP.
Read: http://www.ibm.com/developerworks/opensource/library/os-php-batch/
Your script could just run through the list once and quit. That way, whatever resources PHP is holding can be freed.
Then have a shell script that calls the php script in an infinite loop.
As PHP is not designed for long-running tasks, I am not sure the garbage collector is up to the job. Quitting after every run will force it to release everything.
I have made a PHP script which would probably take about 3 hours to complete. I run it from the browser, and after about 45 minutes it stops doing anything. I know this because it polls certain web addresses and then saves some data to the database, and it simply stops putting any data into the database, which leads me to the conclusion that it has stopped. The browser still looks as if it's loading the page, but it never finishes.
There aren't any errors, so it probably is some kind of timeout... but where it occurs is a mystery, and how can I prevent it from happening? In my case I can't use the CLI; I must initiate the script from the browser.
I have tried to put
set_time_limit(0);
But it had no apparent effect. Any suggestions on what could cause the timeout, and how to fix it?
Try this:
set_time_limit(0);                 // remove PHP's execution time limit for this script
ignore_user_abort(true);           // keep running even if the browser disconnects
ini_set('max_execution_time', 0);  // same effect as set_time_limit(0), set via the ini layer
Most webhosts kill processes that run for a certain length of time. This is intended as a failsafe against infinite loops.
Ask your host about this and see if there's any way it can be disabled for this particular script of yours. In some cases, the killer doesn't apply to Cron tasks, or processes run by SSH. However, this varies from host to host.
It might be the browser that's timing out; I'm not sure whether browsers do that or not, but then I've also never had a page take this much time.
Suggestion: I'm assuming you are running a loop. Load a page, then run each iteration of the loop as an AJAX call to another page, not firing the next iteration until the previous one returns.
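The PHP side of that could be as simple as the sketch below (the parameter names and the doWorkForChunk() helper are placeholders for your existing loop body); the page you load first would keep firing AJAX requests at it, passing back the returned offset, until done comes back true.
<?php
// process_chunk.php - handles one small slice of the work per request
set_time_limit(60); // each request only has to survive a single chunk
$offset    = isset($_GET['offset']) ? (int) $_GET['offset'] : 0;
$chunkSize = 25;

$processed = doWorkForChunk($offset, $chunkSize); // placeholder: run your loop body for at most $chunkSize items

header('Content-Type: application/json');
echo json_encode(array(
    'done'   => $processed < $chunkSize, // fewer items than requested means we reached the end
    'offset' => $offset + $processed,
));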
There's a setting in PHP to kill processes after some time. This is especially important on shared servers (you don't want one process slowing down the whole server).
You need to ask your host whether you can make modifications to php.ini (or through .htaccess). In particular, look at the max_execution_time setting.
If you are using sessions, then you would also need to look at session.cookie_lifetime, not just set_time_limit. If you are accumulating results in an array, the array might also fill up memory.
Without more info on how your script handles the task, it would be difficult to identify.
I have a PHP script that grabs a chunk of data from a database, processes it, and then looks to see if there is more data. This processes runs indefinitely and I run several of these at a time on a single server.
It looks something like:
<?php
while($shouldStillRun)
{
// do stuff
}
logThatWeExitedLoop();
?>
The problem is, after some time, something causes the process to stop running and I haven't been able to debug it and determine the cause.
Here is what I'm using to get information so far:
error_log - Logging all errors, but no errors are shown in the error log.
register_shutdown_function - Registered a custom shutdown function. This does get called so I know the process isn't being killed by the server, it's being allowed to finish. (or at least I assume that is the case with this being called?)
debug_backtrace - Logged a debug_backtrace() in my custom shutdown function. This shows only one call and it's my custom shutdown function.
Log if reaches the end of script - Outside of the loop, I have a function that logs that the script exited the loop (and therefore would be reaching the end of the source file normally). When the script dies randomly, it's not logging this, so whatever kills it, kills it while it's in the middle of processing.
What other debugging methods would you suggest for finding the culprit?
Note: I should add that this is not an issue with max_execution_time, which is disabled for these scripts. The time before being killed is inconsistent. It could run for 10 seconds or 12 hours before it dies.
Update/Solution: Thank you all for your suggestions. By logging the output, I discovered that when a MySQL query failed, the script was set to die(). D'oh. I updated it to log the MySQL errors and then terminate. It's working now like a charm!
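In case it helps anyone else, the shape of the change was roughly this (a sketch only, since the real query code isn't shown above; it assumes the mysqli extension, and $db/$sql are placeholders):
<?php
while ($shouldStillRun) {
    $result = mysqli_query($db, $sql);
    if ($result === false) {
        // Log why we are stopping instead of a bare die(), so the exit is no longer a mystery
        error_log('MySQL error: ' . mysqli_error($db) . ' -- query was: ' . $sql);
        exit(1);
    }
    // ... process $result ...
}
logThatWeExitedLoop();
?>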
I'd log the memory usage of your script. Maybe it acquires too much memory, hits the memory limit and dies?
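For example, something like this inside the loop would show whether memory creeps up over time (memory_get_usage() and memory_get_peak_usage() are built-in; the log path is just an example):
// Append current and peak memory usage to a log file once per loop iteration
$line = sprintf(
    "[%s] mem: %.1f MB, peak: %.1f MB\n",
    date('Y-m-d H:i:s'),
    memory_get_usage(true) / 1048576,
    memory_get_peak_usage(true) / 1048576
);
file_put_contents('/tmp/worker-memory.log', $line, FILE_APPEND);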
Remember, PHP has a setting in the ini file that says how long a script may run: max_execution_time.
Make sure that you are not going over this, or use set_time_limit() to increase the execution time. Is this program running through a web server or via the CLI?
Adding: my bad experiences with PHP, from looking through some background scripts I wrote earlier this year. Sorry, but PHP is a terrible scripting language for doing anything for long lengths of time. I see that newer PHP (which we haven't upgraded to) adds the ability to force the GC to run. The problem I've been having is using too much memory, because the GC almost never runs to clean up after itself. If you use things that reference themselves circularly, they will also never be freed.
Creating an array of 100,000 items allocates memory, but then setting the array to an empty array or splicing it all out does NOT free it immediately, and doesn't mark it as unused (i.e., making a new 100,000-element array increases memory further).
My personal solution was to write a Perl script that ran forever and called system("php my_php.php"); when needed, so that the interpreter would be freed completely. I'm currently supporting 5.1.6; this might be fixed in 5.3+, or at the very least they now have GC commands that you can use to force the GC to clean up.
Simple script
#!/usr/bin/perl -w
use strict;

while (1) {
    if ( system("php /to/php/script.php") != 0 ) {
        sleep(30);
    }
}
Then in your PHP script:
<?php
// do a single processing block
if ($moreblockstodo) {
    exit(0); // more work left: the wrapper loops again immediately
} else {
    // no? then let's sleep for a bit until we get more
    exit(1); // non-zero exit makes the wrapper sleep(30) before retrying
}
?>
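If you can get onto PHP 5.3 or newer, those GC commands can also be driven by hand from inside the loop; a sketch only (doOneBlock() and $bigStructure are placeholders, and these functions don't exist on the 5.1.6 I'm stuck on):
<?php
gc_enable(); // enable the circular-reference collector (PHP 5.3+)
do {
    $moreblockstodo = doOneBlock(); // placeholder for a single processing block
    unset($bigStructure);           // drop whatever large structures the block built up (placeholder name)
    gc_collect_cycles();            // force collection of reference cycles right now
} while ($moreblockstodo);
?>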
I'd log the state of the function to a file in a few different places in each loop.
You can get the contents of most variables as a string with var_export, using the var_export($varname,true) form.
You could just log this to a certain file, and keep an eye on it. The latest state of the function before the log ends should provide some clues.
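For example (the file path and $currentItem are placeholders for whatever your loop is working on):
// Dump the current state of an interesting variable to a log file on each pass
file_put_contents(
    '/tmp/script-state.log',
    date('c') . ' ' . var_export($currentItem, true) . "\n",
    FILE_APPEND
);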
Sounds like whatever is happening is not a standard PHP error. You should be able to throw your own exceptions inside a try ... catch block and log them when caught. I don't have more details than that because I'm on my phone, away from a PC.
I've encountered this before on one of our projects at work. We have a similar setup: a PHP script checks the DB to see if there are tasks to be done (such as sending out an email, updating records, or processing some data). The PHP script has a while loop inside, which is set to
while (true) {
    // do something
}
After a while, the script will also be killed somehow. I've already tried most of what has been said here, like setting max_execution_time, using var_export to log all output, placing a try ... catch, redirecting the script output (php ... > output.txt), etc., and we've never been able to find out what the problem is.
I think PHP just isn't built to do background tasks by itself. I know it's not answering your question (how to debug this), but the way we worked around this is that we used a cronjob to call the PHP file every 5 minutes. This is similar to Jeremy's answer of using a Perl script: it ensures that the interpreter is freed after the execution is done.
If this is on Linux, try to look into system logs - the process could be killed by the OOM (out-of-memory) killer (unlikely, you'd also see other problems if this was happening), or a segmentation fault (some versions of PHP don't like some versions of extensions, resulting in weird crashes).