I want a program (written in PHP) to run on a remote server even after I log out. The program is very simple: it does nothing but sleep for 10s (just for a test), like the following:
<?php
function index()
{
    while (true)
    {
        sleep(10);
    }
}

index();
So I connect to the remote server via SSH and then launch the program like this:
nohup php -f index.php &
I want it to run in the background on the server after I log out. But I find that every time I close the terminal, the program only runs for around 10 more minutes and then stops, though it does not stop immediately after the terminal is closed. If I do not close the terminal, it keeps running forever (as expected). Could anyone tell me what the reason is, and how to solve the problem? I have also tried using "disown" as suggested in this post, but got the same problem: How to make a programme continue to run after log out from ssh?
By the way, I am using a shared remote host, so it could be due to some server setting, but it's strange because it works fine with the terminal open.
You can try using disown right after your nohup command.
If it still doesn't work, consider using screen instead. It is a useful utility that allows "virtual terminals" to keep running even after logout.
1. Create a screen using screen -dmS someName (e.g. screen -dmS myPhpScript).
2. Enter your screen using screen -r; your window will be cleared.
3. Execute your command (php -f index.php, no &!).
4. Exit the screen by pressing [Ctrl]+[A] (which seems to do nothing) and then [D]. The screen will stay in the background, and you'll come back to the previous prompt (just before step 2) with a message indicating [detached from XXXXX.someName].
5. You can get back to the screen using screen -r or screen -x someName.
It turns out to be a server issue. A2hosting does not allow processes to run in the background on their shared hosting, so the process gets killed some time (not immediately) after logout.
Related
I need to run a PHP script that scrapes a website for data, and I need to run it on my Linux VPS. I want to run it as a task so that I can log out of my VPS and the script keeps running in the background.
I've read about cron jobs, but those are more for scheduling repetitive tasks; I need to run the PHP script only once.
Is there a way in PHP to do that? Please help, I'm just a newbie to this.
Thanks in advance! :)
I've tried it as a CRON job, but it doesn't seem to serve my exact purpose.
So I run my script like this from the terminal:
php scrapethewebsite.php
and then it show this
Started scraping at 10:03:00 20-03-2019
and I can't log out or close my VPS/SSH connection.
What I'm looking for is:
php scrapethewebsite.php
Started scraping at 10:03:00 20-03-2019
and then I should be able to logout or close my connection. And then I should be able to shutdown my PC and go for a walk..
Yes, you can do this with screen. screen is usually already installed on a Linux VPS, but if it isn't, you can get it with this command:
apt-get install screen
It gives you the ability to have multiple screens on the VPS, where you can run multiple tasks at the same time.
Get a screen with the command:
screen -S sessionname
where sessionname will be your screen's name. Run your script inside that screen, and then you can detach from it with:
CTRL + A, followed by D.
Then you can close PuTTY, or whatever tool you use to access your VPS.
For more information, you can follow this link:
Screen in Linux
Run this in your SSH session; then you can click X to close it and the script will still run:
nohup php scrapethewebsite.php >/dev/null 2>&1 &
To check if your file is running, type this command:
top
You should see the php process up there; press the spacebar to update the list.
_____________________________________________________________________
If your PHP script errors out from time to time or just runs out of execution time, and you want to re-run it whenever it stops, you have to create a .sh file with a while loop inside it and run nohup on that, so it will re-run the PHP file after it exits.
To use nohup on a PHP file that needs to be re-run like this, do the following:
$ echo 'while true ; do php scrapethewebsite.php ; done > /dev/null' > ~/php_run_loop.sh
$ chmod a+x ~/php_run_loop.sh
$ nohup ~/php_run_loop.sh &
NOTE: If you have an error or other problem in your php script, this will cause it to run over and over again without any limit, forever.
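If you would rather keep the restart loop in PHP itself, a rough equivalent is sketched below. The wrapper file name run_loop.php is just an example, and it assumes scrapethewebsite.php sits in the same directory:
<?php
// run_loop.php - example wrapper: re-runs the scraper whenever it exits.
// Launch it the same way as above: nohup php run_loop.php > /dev/null 2>&1 &
while (true) {
    // passthru() blocks until the child PHP process finishes or crashes.
    passthru('php scrapethewebsite.php', $exitCode);
    // Short pause so a script that dies instantly cannot spin the CPU.
    sleep(5);
}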
So, to start: I was building a NodeJS application that works alongside the front-end of a website. It is built, and what I want is to launch it via PHP, so you can click save, which then starts it if needed.
The application is a websocket server, and will listen on a websocket and process data/commands from a client (the front-end of my website).
Now the code I am using to launch it is as follows:
$command = '/usr/local/bin/node main.js & echo $!';
$processid = shell_exec($command);
But for some reason, when I click save on the front-end, it just hangs and keeps loading, showing nothing on the page. The process does get created, as I can see by running:
lsof -i tcp:8000
and the page only stops loading when I kill the process that node is running...
I am using:
kill -9 <pid>
to kill it. So I am a little lost as to why it's not running in the background and letting the PHP script finish... it just hangs on it.
Any help would be appreciated.
One note: it all runs fine from the command line, so the file is a separate launcher class, and if I run it via php and add some code to do tasks it works; it's only when I call it from something else that I have the issue.
If it helps; the framework that the website uses is Joomla 3.x, thanks in advance.
I am on a MacBook Air running El Capitan; it will also need to work on Linux (CentOS).
Use this. It will suppress both STDOUT and STDERR and return immediately without waiting for the command to finish:
$command = '/usr/local/bin/node main.js > /dev/null 2>&1 &';
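For reference, here is a minimal sketch of how the launcher could look with that change, keeping the echo $! from the question so the PHP side still receives the process id (the main.js path is the one from the question):
<?php
// Sketch: redirect the child's output so shell_exec() is not left waiting on
// an open pipe, background it with &, and echo its PID back to PHP.
$command   = '/usr/local/bin/node main.js > /dev/null 2>&1 & echo $!';
$processid = (int) trim(shell_exec($command));

echo "Started node websocket server with PID {$processid}\n";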
I wrote a shell script in Linux to check if one of my programs (say programA) is running; if it has stopped, the script restarts it.
OK, I also have a PHP script which has start & stop buttons to start and stop the same program from the server side. If the program has already been started by the shell script, clicking the start button will NOT run multiple copies of the same program.
THE PROBLEM IS: the PHP script works fine by itself, but it cannot close the program if it was started by the shell script. Is this a permission issue or something I haven't been aware of? (I already did chmod 777 programA, btw...)
UPDATE:
In my PHP script, it calls exec("kill -9 PID_of_programA") to kill the program.
I tried changing it to $r = shell_exec("kill -9 PID_of_programA"), and echo $r gives me nothing...
You are probably running your program as a user that has no privilege to kill other users' processes... Have you tried running it as the superuser?
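One way to narrow that down is to capture the kill command's error output and exit code instead of discarding them. A small diagnostic sketch (the PID value is just a stand-in for the real PID of programA):
<?php
// Diagnostic sketch: keep both the output and the exit status of kill,
// so an "Operation not permitted" message is not silently lost.
$pid = 12345; // stand-in for the real PID of programA
exec('kill -9 ' . (int) $pid . ' 2>&1', $output, $exitCode);

echo "exit code: {$exitCode}\n";
echo implode("\n", $output), "\n";
// A non-zero exit code with "Operation not permitted" usually means the web
// server user (e.g. www-data) does not own the process started by the shell script.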
I know there have been similar questions, but they don't solve my problem...
After checking out the folders from the repo (which works fine), a method is called from jQuery to execute the following in PHP:
exec ('svn cleanup '.$checkout_dir);
session_write_close(); //Some suggestion that was supposed to help but doesn't
exec ('svn commit -m "SAVE DITAMAP" '.$file);
These end up executing the following commands:
svn cleanup USER_WORKSPACE/0A8288
svn commit -m "SAVE DITAMAP" USER_WORKSPACE/0A8288/map.ditamap
1) The first line (exec ('svn cleanup' ...)) executes fine.
2) As soon as I call svn commit, my server hangs and everything goes to hell.
The Apache error logs show this:
[notice] Child 3424: Waiting 240 more seconds for 4 worker threads to finish.
I'm not using the php_svn module because I couldn't get it to compile on Windows.
Does anyone know what is going on here? I can execute the exact same command from the terminal window and it runs just fine.
Since I cannot find any documentation on a jQuery exec(), I assume this is calling PHP's exec(). I copied this from the documentation page:
When calling exec() from within an apache php script, make sure to take care of stdout, stderr and stdin (as in the example below). If you forget this and your shell command produces output the sh and apache deamons may never return (they will normally time out after a few minutes). From the calling web page the script may seem to not return any data.
If you want to start a php process that continues to run independently from apache (with a different parent pid), use nohup. Example:
exec('nohup php process.php > process.out 2> process.err < /dev/null &');
Hope it helps.
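Applied to the commit from the question, that advice would look roughly like this. This is a sketch only; the paths are the example values from the question, and the .out/.err file names are arbitrary:
<?php
// Sketch: redirect stdout/stderr to files and close stdin so Apache is not
// left waiting on the svn child processes.
$checkout_dir = 'USER_WORKSPACE/0A8288';          // example value from the question
$file         = $checkout_dir . '/map.ditamap';   // example value from the question

exec('svn cleanup ' . escapeshellarg($checkout_dir) . ' > svn_cleanup.out 2>&1');
exec('svn commit -m "SAVE DITAMAP" ' . escapeshellarg($file)
    . ' > svn_commit.out 2> svn_commit.err < /dev/null');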
Okay, I've found the problem.
It actually didn't have anything to do with exec running in the background, especially because a one-file commit doesn't take a lot of time.
The problem was that the commit was expecting a --username and --password that never showed up, which just caused Apache to hang.
To solve this, I changed the svnserve.conf in the folder where I installed svn to allow non-authenticated users write access.
I don't think you'd normally want to do this, but my site already authenticates the username and password upon login.
Alternatively you could
I am trying to run a php script on my remote Virtual Private Server through the command line. The process I follow is:
Log into the server using PuTTY
At the command prompt, type: php myScript.php
The script runs just fine. BUT THE PROBLEM is that the script stops running as soon as I close the PuTTY console window.
I need the script to keep on running endlessly. How can I do that? I am running Debian on the server.
Thanks in advance.
I believe that Ben has the correct answer, namely to use the nohup command. nohup stands for "no hangup" and means that your program should ignore the hangup signal (SIGHUP) generated when your PuTTY session is disconnected, either by you logging out or because you have been timed out.
You need to be aware that the output of your command will be appended to a file in the current directory named nohup.out (or $HOME/nohup.out if permissions prevent you from creating nohup.out in the current directory). If your program generates a lot of output, this file can get very large. Alternatively, you can use shell redirection to send the output of the script to another file:
nohup php myscript.php >myscript.output 2>&1 &
This command will run your script and send all output (both standard and error) to the file myscript.output which will be created anew each time you run the program.
The final & causes the script to run in the background, so you can do other things whilst it is running, or log out.
An easy way is to run it through nohup:
nohup php myScript.php &
If you run the php command in a screen and then detach the screen, it won't terminate when you close your console.
Screen is a terminal multiplexer that allows you to manage many processes through one physical terminal. Each process gets its own virtual window, and you can bounce between virtual windows interacting with each process. The processes managed by screen continue to run when their window is not active.