I have a CentOS VPS with WHM/cPanel installed. I want to run a PHP script from the command line for an unlimited time.
My script looks like this:
<?php
set_time_limit(0); // remove PHP's execution time limit

while (true) {
    // code to send me email
    sleep(600); // wait ten minutes between iterations
}
I know this script should run indefinitely.
I have used these commands:
php myfile.php &
nohup php myfile.php &
I found these commands on Stack Overflow, and they run fine. But after about an hour, the script stops automatically.
I think I am doing this correctly, but I do not know what is killing the process.
If I am not doing it correctly, I want to know how to run this script for an unlimited time.
What you are doing is correct. It should run. I have PHP scripts that run for much longer than an hour. They run for days on end. I also have programs that are not PHP that should run forever, but don't. It is because they die due to a bug in the program. For example, xscreensaver dies on me about once a week. To keep it running, I use this shell script (which you can use to keep your PHP running):
while :
do
    xscreensaver &
    wait
done
Now, running that shell script will start the program again if it ever dies for any reason.
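Adapted for the PHP script above (assuming it is saved as myfile.php), the same watchdog loop would be:

while :
do
    php myfile.php &
    wait
done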
If you have console access, try using a cron job to run this file.
See: https://www.centos.org/docs/5/html/Deployment_Guide-en-US/ch-autotasks.html
Also: http://alvinalexander.com/linux/linux-crontab-file-format-example
Make sure you use the php command in the cron job.
Note: you'll need admin rights to edit/work with cron jobs.
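For example, a crontab entry that runs the script every ten minutes and logs its output might look like this (the paths here are assumptions; adjust them for your setup):

*/10 * * * * /usr/bin/php /home/user/myfile.php >> /home/user/myfile.log 2>&1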
Related
I have a file named /root/folder/myfile.php that will handle incoming packets from a specific port by a GPS device.
When I run php /root/folder/myfile.php from the shell prompt ([root@main ~]#), everything works fine.
I need this file to run every second to listen.
I researched for a while and figured out that using the PHP CLI is a solution, so I tried the above command. But the file only executes as long as the shell is open (I'm using PuTTY); when I close the shell, the process is killed.
How can I (and where can I) add a command that will run this file every second, or maybe in real time?
I'm using Linux CentOS 6.5.
Thanks in advance
nohup php myscript.php &
The & puts your process in the background.
The solution from Run php script as daemon process
To kill it:
1) Display all running processes with ps aux | less or the top command.
2) Find the PID (process ID) and kill the process with: kill PID
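Putting that together for the script in the question, the whole sequence might look like this (the PID is just a placeholder; use the one ps reports):

nohup php /root/folder/myfile.php &
ps aux | grep myfile.php   # note the PID in the second column
kill 12345                 # replace 12345 with the PID from ps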
You would want to use the cron functionality of your server.
Similar to this, maybe:
running a script from cron every second
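cron's finest granularity is one minute, so the usual trick from that link is a per-minute cron entry that runs a wrapper script looping sixty times with a one-second sleep. A rough sketch (the wrapper path is an assumption):

* * * * * /root/folder/every-second.sh

And every-second.sh:

#!/bin/sh
# invoke the PHP script once per second for one minute
for i in $(seq 1 60); do
    php /root/folder/myfile.php
    sleep 1
done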
I basically have a cron job calling one script every minute. The script stops immediately if the previous instance is still running (it checks the previous instance's activity time).
So I introduced a bug, and the script went into an infinite loop (I know it was called by cron at least once). I created a fix and uploaded it to the server, but I'm still wondering:
How long will the bugged script run?
How can I know if it is still running?
What does terminate a script and why?
The script just echoes out the same text over and over again.
P.S. PHP's max execution time within the script is set to 0 (infinite), and I don't have direct access to the server, only FTP.
How can I know if it is still running?
Just set up a new cron job, but have the cron command be something that helps you debug.
A useful one would be:
ps -af | grep php > /some/path/to/mylogfile.txt
The ps command lists info on running processes. With those flags, part of the output is the original Linux command that started each process, so we can grep for php, because the original command was probably something like:
php myscript.php
The output is redirected to mylogfile.txt for you to read manually after the cron job runs.
The process ID should be part of the output. You can then use the kill command on that process ID, again by just entering the command as a fake cron job.
Until the script runs into a timeout (max_execution_time, defined in the php.ini file or via the set_time_limit function).
Have a look at the running processes, then send a kill command to the script or wait until a timeout occurs.
PS: you have two php.ini files, one for the command line and one for Apache. Be sure to change max_execution_time in the command-line ini file.
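To check which ini file the command-line PHP actually loads, you can run:

php --ini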
My project calls for 3 PHP scripts that are run based on if-else conditions. The first script is loaded on the index page of the site to check whether a condition is set, and if it is, it calls the second script. The second script checks whether other conditions are set, and it finally calls the last script if everything is good.
Now I could do this by just including the scripts in the if statement, but since the final result is a resource-hogging MySQL dump, I need it to run independently of the original trigger page.
Also, those scripts should continue doing their thing once triggered, regardless of the user's actions on the index page.
One last thing: it should be able to run on Windows and *nix.
How would you do this?
Does the following code make any sense?
if ($blah != $blah_size) {
    shell_exec('php first-script.php > /dev/null 2>/dev/null &');
}
// If the size matches, die
else {
}
Thanks a million in advance.
UPDATE: just in case someone else is going through the same ordeal.
There seems to be a bug in PHP when running scripts as CGI, but the command line works with all the versions I've tested.
See the bug https://bugs.php.net/bug.php?id=11430
So instead I call the script like this:
exec("php-cli mybigfile.php > /dev/null 2>/dev/null &");
Or you could call it via the shell. It works on *nix systems, but my local Windows is hopeless, so if anyone runs it on Windows and it works, please update this.
I would not do this by shell exec because you'd have no control over how many of these resource-hogging processes would be running at any one time. Thus, a user could go click-click-click-click and essentially halt your machine.
Instead, I'd build a work queue. Instead of running the dump directly, the script would submit a record to some sort of FIFO queue (could be a database table or a text file in a dir somewhere) and then immediately return. Next you'd have a cron script that runs at regular intervals and checks the queue to see if there's any work to do. If so, it picks the oldest thing, and runs it. This way, you're assured that you're only ever running one dump at a time.
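A minimal sketch of that queue as two small files, assuming a hypothetical jobs table with id, payload, and status columns (the table name, credentials, and file names are illustrative, not from the original answer):

<?php
// enqueue.php - called by the trigger page; returns immediately
$pdo = new PDO('mysql:host=localhost;dbname=mydb', 'user', 'pass');
$pdo->prepare("INSERT INTO jobs (payload, status) VALUES (?, 'pending')")
    ->execute(['mysql-dump']);

<?php
// worker.php - run from cron at regular intervals; one job at a time
$pdo = new PDO('mysql:host=localhost;dbname=mydb', 'user', 'pass');
$job = $pdo->query("SELECT id, payload FROM jobs WHERE status = 'pending' ORDER BY id LIMIT 1")->fetch();
if ($job) {
    $pdo->prepare("UPDATE jobs SET status = 'running' WHERE id = ?")->execute([$job['id']]);
    // ... do the resource-hogging work here (e.g. the MySQL dump) ...
    $pdo->prepare("UPDATE jobs SET status = 'done' WHERE id = ?")->execute([$job['id']]);
}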
The easiest way I can think of is to do
exec("screen -d -m php long-running-script.php");
and then it will return immediately and run in the background. screen will allow you to connect to it and see what's happening.
You can also do what you're doing with 'nohup php long-running-script.php', or by writing a simple C app that does daemonize() and then execs your script.
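To check on the detached screen session later:

screen -ls   # list running sessions
screen -r    # reattach (add the session name if more than one is listed)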
I have a script which crawls info using an API. What I want is to run this script continuously to grab data and store it on my local machine. I configured the machine as localhost and installed phpMyAdmin and MySQL.
You probably will want to run it from the command line. Put your code in some kind of loop to always keep it running, or schedule it to run as a cron job / scheduled task.
Wrap your main code (except for functions, classes, etc.) in this:
while (true) {
and this:
sleep(2);
}
Then run it from the command line, using:
$ php myscript.php
If you're using Windows, you have to manually add php to your PATH. I don't know Windows (I've used a Mac all my life), but I guess you can do this via My Computer's Properties. :)
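Put together, a minimal sketch of the wrapped script (fetch_and_store() is a hypothetical stand-in for your crawling code):

<?php
set_time_limit(0); // no execution time limit

function fetch_and_store()
{
    // hypothetical placeholder: call the API and write the results to MySQL
}

while (true) {
    fetch_and_store();
    sleep(2); // pause between crawls
}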
Read about set_time_limit, or modify the max_execution_time value in your php.ini file.
Instead of a PHP loop, use a batch file loop.
This gets around any set_time_limit, maximum memory, etc. problems. If PHP crashes, no problem: a new instance will be started immediately.
@echo off
:loop
php myScrapeScript.php
goto loop
I have a PHP website, and I would like to execute a very long Python script in the background (300 MB of memory, about 100 seconds). Process communication is done via the database: when the Python script finishes its job, it updates a field in the database, and then the website renders some graphics based on the results of the Python script.
I can execute the Python script "manually" from bash (from any current directory) and it works. I would like to integrate it into PHP, and I tried the function shell_exec:
shell_exec("python /full/path/to/my/script") but it's not working (I don't see any output).
Do you have any ideas or suggestions? It's worth mentioning that the Python script is a wrapper over other polyglot tools (Java mixed with C++).
Thanks!
shell_exec returns a string; if you run it alone, it won't produce any output. So you can write:
$output = shell_exec(...);
print $output;
First off, set_time_limit(0); will make your script run forever, so a timeout shouldn't be the issue. Second, any *exec call in PHP does NOT use the PATH by default (this might depend on configuration), so your script will exit without giving any info on the problem; quite often it turns out that it simply can't find the program, in this case python. So change it to:
shell_exec("/full/path/to/python /full/path/to/my/script");
If your Python script runs on its own without problems, then it's very likely this is the problem. As for memory, I'm pretty sure PHP won't use the same memory Python is using. If Python uses 300 MB, PHP should stay at its default (say 1 MB) and just wait for shell_exec to finish.
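To see the actual error, you can also redirect stderr into the returned string (a small debugging sketch; the interpreter path is an assumption, and running which python will show the real one):

$output = shell_exec("/usr/bin/python /full/path/to/my/script 2>&1");
print $output;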
A problem could be that your script takes longer than the waiting time the server defines for a request (this can be set in php.ini or httpd.conf).
Another issue could be that the server's account does not have the rights to execute or access code or files needed for your script to run.
I found this a while ago, and it helped me solve my background-execution problem:
function background_exec($command)
{
    if (substr(php_uname(), 0, 7) == 'Windows') {
        // Windows: "start" launches the command in its own background process
        pclose(popen('start "background_exec" ' . $command, 'r'));
    }
    else {
        // *nix: discard output and detach with &
        exec($command . ' > /dev/null &');
    }
}
Source:
http://www.warpturn.com/execute-a-background-process-on-windows-and-linux-with-php/
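Usage is then a one-liner (the script name here is just an example):

background_exec('php /full/path/to/long-running-script.php');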
Thanks for your answers, but none of them worked :(. I decided to implement it in a dirty way, using busy waiting, instead of triggering an event when a record is inserted.
I wrote a backup process that runs forever and at each iteration checks whether there is something new in the database. When it finds a record, it executes the script and everything is fine. The idea is that I launch the backup process from the shell.
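A minimal sketch of such a polling process (the table and column names, credentials, and interpreter path are assumptions, not from the original post):

<?php
// watcher.php - launched from the shell with: php watcher.php &
set_time_limit(0);
$pdo = new PDO('mysql:host=localhost;dbname=mydb', 'user', 'pass');

while (true) {
    $row = $pdo->query("SELECT id FROM tasks WHERE processed = 0 LIMIT 1")->fetch();
    if ($row) {
        shell_exec("/usr/bin/python /full/path/to/my/script");
        $pdo->prepare("UPDATE tasks SET processed = 1 WHERE id = ?")->execute([$row['id']]);
    }
    sleep(5); // busy-wait interval
}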
When I tried this, I found that the issue was simply that I had not compiled the source on the server I was running it on. If you compile on your local machine and then upload to your server, the binary can end up broken in some way. shell_exec() should work if you compile the source you are trying to run on the same server that runs the script.