On my website I had a process that needs to create a file (executed via a cronjob). However, I removed both the script and the cronjob, but the file is still being created and the process from the cronjob still executes. I know this sounds really unbelievable, but is there any way a process can get stuck in the server's memory and keep looping? Can you think of any other reason that could cause this problem?
I had the same issue. The process was executed infinitely, and code changes or even code removal did not help, since the already-running process had effectively "cached" your code.
What I have done:
Log in to the server via SSH, use the top command to find the PID of the PHP process, then use kill to terminate that process.
To prevent this:
I created a file on the server and, inside every loop and just before the starting function (I had a recursive function), I check whether that file exists (or contains valid content). If it is not found, the process does not execute.
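A minimal sketch of that safeguard (the flag file name and path are my own illustration; the original setup just used "some file" on the server):

<?php
// Kill-switch sketch: the worker refuses to run once the flag file is gone.
$flagFile = '/var/www/allow_run.flag'; // assumed path, adjust to taste

while (true) {
    // Check just before each unit of work (or before each recursive call).
    if (!file_exists($flagFile)) {
        exit(0); // flag removed by an operator: stop the runaway process
    }

    // ... one unit of work ...

    sleep(1); // illustrative pacing
}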
Related
Just wanted to know: how will PHP behave in the following case?
Suppose I have a cron script which runs every minute,
or there is an infinite-loop script which is processing the queue table.
Now suppose I update a related class file which is used by the infinite-loop script.
Does that generate an error or stop the infinite-loop script?
And what good practices should be followed in such a situation?
Nothing will happen to already running scripts when you change any source code.
The source code is read from the file once at the start of the script, is parsed into bytecode, and then stays in memory until the end of the script. Your program is not actually "running from source" all the time or any such thing and it will not notice any changes to the source code files until it needs to load the file again.
An infinite loop program will only reflect changes when you stop and restart it.
A cron job will pick up any change the next time it runs.
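One hedged illustration of a good practice here (my own sketch, not from the answer above): have the long-running worker exit periodically, so that a supervisor or cron job restarts it and PHP reloads the updated source files.

<?php
// Recycle the worker process so deployed code changes eventually take effect.
$startedAt = time();

while (true) {
    // ... process one item from the queue table ...

    if (time() - $startedAt > 3600) {
        exit(0); // cron/supervisor starts a fresh process with the new code
    }
    sleep(1); // avoid a busy loop while the queue is empty
}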
I am trying to do what I think is a pretty complicated task.
I would like to execute a PHP script, but only if a file named import_products.csv exists in /var/import. The reason is that the script attempts to import the aforementioned CSV file, so I don't want it to throw an error when cron runs it every minute and the file isn't there.
On the other hand, I DO want the script to run every minute (or even 2 or 4 times per minute) if there IS a file. So what I am envisioning is something like this.
I'd create the cron job to run every minute, which does the following:
Check if /var/import/import_products.csv exists
If it does not exist, exit the script now.
Execute product_import.php (this would be stored in the same folder as my cronjob file)
After execution completes, delete /var/import/import_products.csv
I will be using a separate daemon to generate these CSV files, all that is imperative here is that it only runs the PHP script if the file exists.
If there is a way to do this without using cron, that would work too. Basically I just want it to run that PHP script every time a new import_products.csv is written and delete the file afterwards. Cron jobs are the only way I know for sure to do this.
There are a couple of ways I could envision you doing this. The first is, if possible, the easiest, which would be to add in checks to the PHP script itself to see whether or not the file is present. If your product imports will take longer than one minute you'll also have to consider what happens if the file is still there while another import is happening already.
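A hedged sketch of that first approach, with a simple flock() guard against overlapping imports (the lock file path is my own assumption; the CSV path comes from the question):

<?php
$csv  = '/var/import/import_products.csv';
$lock = fopen('/tmp/product_import.lock', 'c'); // 'c': create if missing, don't truncate

if (!file_exists($csv)) {
    exit(0); // nothing to import this minute
}
if (!flock($lock, LOCK_EX | LOCK_NB)) {
    exit(0); // a previous import is still running
}

// ... import the CSV here (the work product_import.php would do) ...

unlink($csv);          // delete the file so the next run is a no-op
flock($lock, LOCK_UN);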
The second way would be to create a bash or shell script of some kind that will check for the existence of the file and then run the command to execute the PHP script if so, then add the shell script to your cron instead of the PHP script itself.
You can also include the file check within the PHP script itself via exception handling, with a little PHP-side overhead.
I'm making an encoding server and need a little help.
I have a directory called uploaded and a PHP file called videoConverter.php; when it runs, a loop converts all the video files in the uploaded directory and FTPs each completed file to a streaming server.
There is no problem with the script itself; the problem is that once the loop finishes, it won't convert any new files uploaded afterwards. I need to find a way to execute the PHP file or script when files are uploaded to the uploaded directory.
I was thinking about using a cron job and running the PHP file every second, but wouldn't that keep starting the script while it's already running?
Does anyone know a good way to approach this issue?
I wouldn't stick with PHP for such a task, it would just be an inefficient layer between the encoding tools and the files themselves - you need an event-driven tool, not a scripting language.
I'd rather look at inotify for local changes in the upload folder, then process new files on the go.
My first take would be using an inotify wrapper in a Node.js environment:
whenever a new file appears, encode(newFile) is called;
when the encoding process is finished, a pushToFTP(encodedFile) callback is fired and the transfer begins.
Bonus: smaller files don't have to wait until the whole job finishes, thanks to the asynchronous JS architecture.
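For what it's worth, if you do end up staying in PHP, the same mechanism is exposed by the PECL inotify extension; a rough sketch, assuming that extension is installed:

<?php
// Block until files finish being written into the watched folder.
$fd = inotify_init();
inotify_add_watch($fd, '/path/to/uploaded', IN_CLOSE_WRITE);

while (true) {
    foreach (inotify_read($fd) as $event) { // blocks until events arrive
        $file = '/path/to/uploaded/' . $event['name'];
        // ... encode($file), then push the result to FTP ...
    }
}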
You could do:
Script wakes up from cron and checks for a 'pid' file.
If it finds a pid file, it shuts down.
If it does not find a pid file, it writes its PID to a file in the current directory.
It then begins to process the video files.
When complete, or on any failure where it must exit, it removes/unlinks the pid file.
Now, this could create an issue: if the process fails and never unlinks the file, no other jobs would run.
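A minimal sketch of that PID-file flow, plus a staleness probe so a crashed run doesn't block future jobs forever (the posix_kill check is my addition beyond the steps above, and needs the posix extension):

<?php
$pidFile = __DIR__ . '/converter.pid';

if (file_exists($pidFile)) {
    $oldPid = (int) file_get_contents($pidFile);
    // Signal 0 doesn't kill anything; it only tests whether the process exists.
    if ($oldPid > 0 && posix_kill($oldPid, 0)) {
        exit(0); // a previous run is still converting: shut down
    }
    unlink($pidFile); // stale PID file left over from a crashed run
}

file_put_contents($pidFile, getmypid());

// ... convert the video files here ...

unlink($pidFile); // release the lock when done or on a handled failure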
The other thing you could do is have a bootstrap script execute exec('ps aux | grep VIDEO_SCRIPTNAME'); (or something similar), and if you see the script is already running, not execute the video conversion script. In that example you'd have two files: the bootstrap that gets run by cron and checks whether the video conversion is currently happening, and the second file, the one that actually does the work.
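A hedged sketch of that bootstrap (the script name is illustrative; the [v] in the grep pattern keeps grep from matching its own process):

<?php
exec("ps aux | grep '[v]ideoConverter.php'", $lines);

if (count($lines) === 0) {
    // Not running yet: launch the worker detached from this bootstrap.
    exec('php /path/to/videoConverter.php > /dev/null 2>&1 &');
}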
Is it possible to launch a PHP script in the background on the webserver with JS, let it keep running even if you change pages or don't visit the site at all, and then get its current status by calling the PHP script again at a later moment?
This PHP script will process data for hours, sleeping for X seconds/minutes in each loop. If what I asked is possible, how can I even get "echoes" from it, given that PHP only produces output when the script ends?
Maybe this is not a job for PHP?
thank you
EDIT: on a windows machine with apache
It certainly is possible - I have several scripts that run 24/7 written in PHP. Check out Creating Daemons in PHP. It has good info on how to 'daemonize' a php script so that it will run like a service, and it also covers signal handling.
To get debugging output you would redirect to a log file. Do a search on "unix redirect output" as there is a lot of info available.
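For illustration, the core daemonizing step from that kind of article usually looks like the sketch below (UNIX only; it requires the pcntl and posix extensions, so it won't apply to the Windows setup mentioned in the edit):

<?php
$pid = pcntl_fork();
if ($pid === -1) { exit(1); } // fork failed
if ($pid > 0)    { exit(0); } // parent exits; the child lives on
posix_setsid();               // detach the child from the controlling terminal

while (true) {
    // ... service loop ...
    sleep(5);
}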
In Windows it's not much different from UNIX.
First of all, you need to create a PHP script with a run loop. For example, take a look at this: http://code.google.com/p/php-apns/ . This is a PHP "daemon": the main script, PushMonitor.php, runs forever, because it has an infinite loop. It polls a queue at regular intervals, then executes the actions, and then waits. Really simple, actually!
The problem, in your case, is that you want to launch the "daemon" from a PHP script.
You may want to look at this: http://robert.accettura.com/blog/2006/09/14/asynchronous-processing-with-php/ (first example code) . You will execute something like launchBackgroundProcess('php myscript.php') .
Note that in the code there's the "start /b" command (and the "&" at the end of the command for UNIX). That is important, because otherwise your process would be killed when the PHP script of the web page terminates (child processes die after the parent dies!).
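A sketch in the spirit of the linked example (not a verbatim copy of its code):

<?php
// Launch a CLI PHP process detached from the current web request.
function launchBackgroundProcess($cmd)
{
    if (strtoupper(substr(PHP_OS, 0, 3)) === 'WIN') {
        pclose(popen('start /b ' . $cmd, 'r')); // "start /b" detaches on Windows
    } else {
        pclose(popen($cmd . ' > /dev/null 2>&1 &', 'r')); // "&" detaches on UNIX
    }
}

launchBackgroundProcess('php myscript.php');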
Also, remember that the "php" executable (cli) must be in your path (so you can execute "php" from the command line).
Since the PHP script of the page launching the background process is going to terminate, you can't directly catch the "echoes" in a simple way. My suggestion is to write all output to a file (or a database etc), and then read the contents from that source when necessary.
So, instead of "echo", you will use file_put_contents() etc.
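Illustrative only (the paths are my own assumptions): the worker appends progress lines to a file, and a separate status page reads them back on demand.

<?php
// In the long-running script, instead of echo:
file_put_contents('C:/temp/job_status.log',
    date('c') . " processed another batch\n", FILE_APPEND);

// In a separate status.php the browser can request at any time:
echo nl2br(htmlspecialchars(file_get_contents('C:/temp/job_status.log')));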
I have a PHP script that grabs a chunk of data from a database, processes it, and then looks to see if there is more data. This processes runs indefinitely and I run several of these at a time on a single server.
It looks something like:
<?php
while ($shouldStillRun) {
    // do stuff
}
logThatWeExitedLoop();
?>
The problem is, after some time, something causes the process to stop running and I haven't been able to debug it and determine the cause.
Here is what I'm using to get information so far:
error_log - Logging all errors, but no errors are shown in the error log.
register_shutdown_function - Registered a custom shutdown function. This does get called so I know the process isn't being killed by the server, it's being allowed to finish. (or at least I assume that is the case with this being called?)
debug_backtrace - Logged a debug_backtrace() in my custom shutdown function. This shows only one call and it's my custom shutdown function.
Log if reaches the end of script - Outside of the loop, I have a function that logs that the script exited the loop (and therefore would be reaching the end of the source file normally). When the script dies randomly, it's not logging this, so whatever kills it, kills it while it's in the middle of processing.
What other debugging methods would you suggest for finding the culprit?
Note: I should add that this is not an issue with max_execution_time, which is disabled for these scripts. The time before being killed is inconsistent. It could run for 10 seconds or 12 hours before it dies.
Update/Solution: Thank you all for your suggestions. By logging the output, I discovered that when a MySQL query failed, the script was set to die(). D'oh. I updated it to log the MySQL errors and then terminate. It's working now like a charm!
I'd log the memory usage of your script. Maybe it acquires too much memory, hits the memory limit, and dies?
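A quick sketch of that idea (log path assumed): record memory usage on every pass so a steady climb toward memory_limit shows up before the script dies.

<?php
$shouldStillRun = true;

while ($shouldStillRun) {
    // ... do stuff ...

    file_put_contents('/tmp/worker_memory.log',
        date('c') . ' ' . memory_get_usage(true) . " bytes\n",
        FILE_APPEND);
    sleep(1);
}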
Remember, PHP has a directive in the ini file that says how long a script may run: max_execution_time.
Make sure that you are not going over this, or use set_time_limit() to increase the execution time. Is this program running through a web server or via CLI?
Adding: my bad experiences with PHP. Looking through some background scripts I wrote earlier this year: sorry, but PHP is a terrible scripting language for doing anything over long lengths of time. I see that newer PHP (which we haven't upgraded to) adds the ability to force the GC to run. The problem I've been having is using too much memory, because the GC almost never runs to clean things up. Anything that recursively references itself will also never be freed.
Creating an array of 100,000 items allocates memory, but then setting the array to an empty array or splicing it all out does NOT free it immediately, and doesn't mark it as unused (i.e., making a new 100,000-element array increases memory).
My personal solution was to write a Perl script that ran forever and called system("php my_php.php"); when needed, so that the interpreter would be freed completely. I'm currently supporting 5.1.6; this might be fixed in 5.3+, or at the very least they now have GC commands you can use to force a cleanup.
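For reference, the forced-GC call that newer PHP (5.3+) provides is a one-liner:

<?php
$cycles = gc_collect_cycles(); // collect circular-reference garbage now
echo "Freed {$cycles} cycles\n";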
Simple script
#!/usr/bin/perl -w
use strict;

while (1) {
    if ( system("php /to/php/script.php") != 0 ) {
        sleep(30);
    }
}
then in your php script
<?php
// do a single processing block
if ($moreblockstodo) {
    exit(0);
} else {
    // no? then let's sleep for a bit until we get more
    exit(1);
}
?>
I'd log the state of the function to a file in a few different places in each loop.
You can get the contents of most variables as a string with var_export, using the var_export($varname,true) form.
You could just log this to a certain file, and keep an eye on it. The latest state of the function before the log ends should provide some clues.
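A small sketch of that logging (path and labels are illustrative):

<?php
function logState($label, $var)
{
    file_put_contents('/tmp/worker_state.log',
        date('c') . " $label: " . var_export($var, true) . "\n",
        FILE_APPEND);
}

// inside the loop, at a few different places:
// logState('current row', $row);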
Sounds like whatever is happening is not a standard PHP error. You should be able to throw your own errors inside a try ... catch statement and log them when caught. I don't have more details than that because I'm on my phone, away from a PC.
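A sketch of that suggestion: wrap the loop body so any thrown exception gets logged instead of silently ending the process (log path assumed).

<?php
while (true) {
    try {
        // ... do stuff; throw your own exceptions on suspicious states ...
    } catch (Exception $e) {
        file_put_contents('/tmp/worker_errors.log',
            date('c') . ' ' . $e->getMessage() . "\n", FILE_APPEND);
    }
}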
I've encountered this before on one of our projects at work. We have a similar setup: a PHP script checks the DB for tasks to be done (such as sending out an email, updating records, or processing some data). The PHP script has a while loop inside, which is set to
while (true) {
    // do something
}
After a while, the script would also get killed somehow. I had already tried most of what has been said here, like setting max_execution_time, using var_export to log all output, placing a try/catch, redirecting the script output (php ... > output.txt), etc., and we were never able to find out what the problem was.
I think PHP just isn't built to do background tasks by itself. I know it's not answering your question (how to debug this), but the way we worked around it is that we used a cronjob to call the PHP file every 5 minutes. This is similar to Jeremy's answer of using a Perl script: it ensures that the interpreter is freed after each execution is done.
If this is on Linux, try looking into the system logs: the process could be killed by the OOM (out-of-memory) killer (unlikely; you'd also see other problems if this was happening), or by a segmentation fault (some versions of PHP don't like some versions of extensions, resulting in weird crashes).