To start, I am building a Node.js application that works alongside the front-end of a website. It is now built, and what I want is to launch it via PHP, so that you can click save, which then starts it if needed.
The application is a websocket server, and will listen on a websocket and process data/commands from a client (the front-end of my website).
Now the code I am using to launch it is as follows:
$command = '/usr/local/bin/node main.js & echo $!';
$processid = shell_exec($command);
But for some reason, when I click save on the front-end, it just hangs and keeps loading, showing nothing on the page. It does create the process, as I can see by running:
lsof -i tcp:8000
and the page only stops loading when I kill the Node process.
I am using:
kill -9 <pid>
to kill it. So I am a little lost as to why it's not running in the background and letting the PHP script finish; it just hangs on it.
Any help would be appreciated.
One note: it all runs fine from the command line, so the file is a separate launcher class, and if I run it via PHP and add some code to do tasks it works; it's only when I call it from something else that I have the issue.
If it helps, the framework the website uses is Joomla 3.x. Thanks in advance.
I am on a MacBook Air running El Capitan; it will also need to work on Linux (CentOS).
Use this instead. It suppresses both STDOUT and STDERR and backgrounds the process, so shell_exec() returns immediately instead of waiting for node to exit:
$command = '/usr/local/bin/node main.js > /dev/null 2>&1 &';
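If you still need the PID on the PHP side, a minimal sketch (reusing the echo $! idea from the question) could look like this:
<?php
// Redirect output so PHP is not left holding node's pipe, background the
// process, and echo its PID so shell_exec() has something to return.
$command   = '/usr/local/bin/node main.js > /dev/null 2>&1 & echo $!';
$processid = (int) trim(shell_exec($command));
echo "Started node with PID {$processid}\n";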
Related
I want a program (written in PHP) to run on a remote server even after I log out. The program is very simple: it does nothing but sleep for 10s (just for a test), like the following:
function index()
{
    while (true)
    {
        sleep(10);
    }
}
So I connect to the remote server via SSH. And then launch the program like this:
nohup php -f index.php &
I want it to run in the background on the server after I log out. But I find that every time I close the terminal, the program can only run for around 10 minutes and then stops, though it does not stop immediately after the terminal is closed. If I do not close the terminal, it can keep running forever (as expected). Could anyone tell me what the reason is? And how to solve the problem? I have also tried using "disown" as suggested in this post but got the same problem: How to make a programme continue to run after log out from ssh?
By the way, I am using a shared remote host, so it could be due to some server setting, but it's strange because it works fine with the terminal open.
You can try using disown right after your nohup command.
If it still doesn't work, consider using screen instead. It is a useful utility allowing "virtual terminals" to run, even after logout.
1) Create a screen using screen -dmS someName (e.g. screen -dmS myPhpScript).
2) Enter your screen using screen -r; your window will be cleared.
3) Execute your command (php -f index.php, no &!).
4) Exit the screen by pressing [Ctrl]+[A] (which seems to do nothing) and then [D]. The screen will stay in the background, and you'll come back to the previous prompt (just before step 2) with a message indicating [detached from XXXXX.someName].
5) You can get back to the screen using screen -r or screen -x someName.
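For example, a full session might look like this (using the myPhpScript name from step 1):
$ screen -dmS myPhpScript     # step 1: create a detached screen
$ screen -r myPhpScript       # step 2: attach to it
$ php -f index.php            # step 3: run the script (no &)
# press Ctrl+A, then D, to detach (step 4); the script keeps running
$ screen -r myPhpScript       # step 5: reattach later to check on it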
It turns out to be a server issue. A2hosting does not allow a process to run in the background for their shared hosting. So the process will be killed some time (not immediately) after logout.
I need to run a PHP script which is scraping a website for data, and I need to run it on my VPS which has linux. I want to run it as a task so that I should be able to logout of my VPS and the script should keep running in background.
I've read about CRON job but it's more for like scheduling and repetitive tasks; but I need the PHP script only once.
Is there a way in PHP to do that? Please help, I'm just a newbie to this.
Thanks in advance! :)
I've tried it as a CRON job, but it doesn't seem to serve my exact purpose.
So I run my script like this from the terminal:
php scrapethewebsite.php
and then it shows this:
Started scraping at 10:03:00 20-03-2019
and I can't log out or close my VPS/SSH connection.
What I'm looking for is this:
php scrapethewebsite.php
Started scraping at 10:03:00 20-03-2019
and then I should be able to log out or close my connection, shut down my PC, and go for a walk.
Yes, you can do this with screen. Screen is usually already installed on a Linux VPS, but if it isn't you can get it with:
apt-get install screen
It gives you the ability to have multiple screens on the VPS, where you can run multiple tasks at the same time, like the one you have.
Start a screen with:
screen -S sessionname
sessionname will be your screen's name.
You can then detach from it with:
CTRL + A, followed by D.
Then you can close PuTTY, or whatever tool you use to access your VPS, and the script keeps running.
For more information, you can follow this link:
Screen in Linux
Run this in your SSH session; then you can click the X to close the window and the script will still run:
nohup php scrapethewebsite.php >/dev/null 2>&1 &
To check if your file is running, type this command:
top
You should see the PHP process in the list; press the spacebar to refresh it.
_____________________________________________________________________
If your PHP script errors out from time to time, or just runs out of execution time, and you want to re-run it when it closes down, you have to create a .sh file with a while loop inside it and run nohup on that, so it will re-run the PHP file after it exits.
If you want to use nohup on a PHP file that needs to re-run from time to time, you should do the following:
$ echo 'while true ; do php scrapethewebsite.php ; done > /dev/null' > ~/php_run_loop.sh
$ chmod a+x ~/php_run_loop.sh
$ nohup ~/php_run_loop.sh &
NOTE: If you have an error or other problem in your php script, this will cause it to run over and over again without any limit, forever.
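If the script dies quickly each time, the loop above will hammer the server with restarts. One hedged variation is to add a short pause between runs (the 5-second delay is an arbitrary choice):
$ echo 'while true ; do php scrapethewebsite.php ; sleep 5 ; done > /dev/null' > ~/php_run_loop.sh
$ chmod a+x ~/php_run_loop.sh
$ nohup ~/php_run_loop.sh &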
I have 2 websites, hosted on 2 different servers. They are kind of interlinked: sometimes I do stuff on Website-1 and run a script on Website-2. For example, I edit something on Website-1 and then want to run a script on Website-2 to update things accordingly on its server.
Until now I have been using the following code on Website-1:
$file = file_get_contents('Website-2/update.php');
But the problem with this is that my Website-1 server script stops running and waits for the file to return some data. And I don't want to do anything with that data; I just want to run the script.
Is there a better way to do this, or a way to tell PHP to move on to the next line of code?
If you want to call the second site without making your user wait for a response,
I would recommend using a message queue.
The Site 1 request would put a message on the queue.
A cron job would check the queue and run the update on Site 2 when a message exists.
Common queue apps to look at:
https://aws.amazon.com/sqs/?nc2=h_m1
https://beanstalkd.github.io/
https://www.iron.io/mq
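As a minimal illustration of the pattern, not tied to any of the products above: Site 1 could hit a tiny enqueue endpoint on Site 2 that only writes a job file and returns, while a cron job on Site 2 drains that spool directory. The paths and field names here are just placeholders.
<?php
// enqueue.php on Site 2 - called by Site 1 instead of update.php; returns immediately.
$job = json_encode(['action' => 'update', 'requested_at' => time()]);
file_put_contents('/var/spool/site2-jobs/' . uniqid('job_', true) . '.json', $job);
echo 'queued';

<?php
// worker.php on Site 2 - run from cron (e.g. every minute); does the real work.
foreach (glob('/var/spool/site2-jobs/*.json') as $file) {
    $job = json_decode(file_get_contents($file), true);
    // ... run the update logic that currently lives in update.php ...
    unlink($file); // remove the message once it has been handled
}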
What you're trying to achieve is called a web hook and should be implemented with proper authentication, so that not anybody can execute your scripts at any time and overload your server.
On server 2 you need to execute your script asynchronously via workers, threads, message queues or similar.
You can also run the asynchronous command on server 1. There are many ways to achieve this; here are some links with more on this:
Async curl request in PHP
https://segment.com/blog/how-to-make-async-requests-in-php/
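One classic low-level trick (a sketch, with a hypothetical host name) is to open a socket to server 2, send the request headers, and close the connection without reading the response:
<?php
// Fire-and-forget request from server 1; we never wait for update.php to finish.
$fp = fsockopen('website-2.example.com', 80, $errno, $errstr, 5);
if ($fp) {
    fwrite($fp, "GET /update.php HTTP/1.1\r\n");
    fwrite($fp, "Host: website-2.example.com\r\n");
    fwrite($fp, "Connection: Close\r\n\r\n");
    fclose($fp);
}
Note that update.php on server 2 may need ignore_user_abort(true) near the top so it keeps running after the connection is dropped.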
Call your remote server as normal. But in the PHP script you normally call, take all the functionality and put it in a third script. Then, from the old script, call the new one with (on Linux):
exec('php -f "{path to new script}.php" $args > /dev/null &');
The & at the end makes this a background or non-blocking call. Because you call it from the remote server, you don't have to change anything on the calling server. The php -f runs a PHP file. The > /dev/null sends the output from that file to the garbage.
On Windows you can use COM and WScript.Shell to do the same thing:
$WshShell = new \COM('WScript.Shell');
$oExec = $WshShell->Run('cmd /C php {path to new script}.php', 0, false);
You may want to use escapeshellarg on the filename and any arguments supplied.
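For example (a sketch; the path and the value being passed along are made up for illustration):
<?php
// Quote the script path and each argument before building the command line.
$userSuppliedValue = $_GET['value'] ?? '';            // hypothetical argument
$script = escapeshellarg('/var/www/jobs/update.php'); // hypothetical path
$arg    = escapeshellarg($userSuppliedValue);
exec("php -f {$script} {$arg} > /dev/null 2>&1 &");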
So it will look like this
Server1 calls Server2
Script that was called (on Server2) runs exec and kicks off a background job (Server2) then exits
Server1 continues as normal
Server2 continues the background process
So using your example instead of calling:
file_get_contents('Website-2/update.php');
You will call
file_get_contents('Website-2/update_kickstart.php');
In update_kickstart.php put this code
<?php
exec('php -f "{path}update.php" > /dev/null &');
This will run update.php as a separate background (non-blocking) call. Because it's non-blocking, update_kickstart.php will finish and return to server1, which can go about its business, and update.php will run on server2 independently.
Simple...
One last note: file_get_contents is a poor choice. I would use SSH, and probably phpseclib 2.0, to connect to server2 and run the exec command directly with a user that has access only to that file (chroot it or something similar). As it is, anyone can call that file and run it. Behind an SSH login it's protected, and with it chrooted that "special" user can only run that one file.
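A rough sketch of that SSH approach with phpseclib 2.0 (the host, user, key path and script path are placeholders, and the library would be installed via Composer):
<?php
require 'vendor/autoload.php';

use phpseclib\Net\SSH2;
use phpseclib\Crypt\RSA;

$key = new RSA();
$key->loadKey(file_get_contents('/path/to/private_key'));

$ssh = new SSH2('server2.example.com');
if (!$ssh->login('updateuser', $key)) {
    exit('SSH login failed');
}
// Kick off update.php on server2 in the background and return immediately.
$ssh->exec('nohup php -f /path/to/update.php > /dev/null 2>&1 &');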
I have a file named /root/folder/myfile.php that will handle incoming packets from a specific port by a GPS device.
When I use [root@main ~]# php /root/folder/myfile.php, everything works fine.
I need this file to run every second so it can listen.
I researched for a while and figured out that using the PHP CLI is a solution, so I tried the above command, but the file only keeps executing as long as the shell is open (I'm using PuTTY); when I close the shell, the process is killed.
How (and where) can I add a command that will run this file every second, or maybe in real time?
I'm using Linux CentOS 6.5.
Thanks in advance
nohup php myscript.php &
The & puts your process in the background.
(This is the solution from Run php script as daemon process.)
To kill it:
1) Display all running processes with ps aux | less or the top command.
2) Find the PID (process ID) and kill the process with kill pid.
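If you expect to have to kill it later, it can be convenient to save the PID when you start the script (a small sketch building on the nohup command above):
# start the script in the background and remember its PID
nohup php myscript.php > /dev/null 2>&1 &
echo $! > myscript.pid

# later, stop it without hunting through ps or top
kill "$(cat myscript.pid)"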
You would want to use the cron functionality of your server.
Similar to this maybe:
running a script from cron every second
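cron itself only resolves to whole minutes, so the usual trick from that link (sketched here; the wrapper name is made up, and it assumes myfile.php handles one batch of packets and then exits) is to have cron start a one-minute wrapper that loops once per second:
# crontab entry
* * * * * /root/folder/listen_wrapper.sh

# /root/folder/listen_wrapper.sh
#!/bin/sh
i=0
while [ $i -lt 60 ]; do
    php /root/folder/myfile.php
    sleep 1
    i=$((i + 1))
done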
I am trying to run a php script on my remote Virtual Private Server through the command line. The process I follow is:
Log into the server using PuTTY
At the command line prompt, type: php myScript.php
The script runs just fine. But the problem is that the script stops running as soon as I close the PuTTY console window.
I need the script to keep on running endlessly. How can I do that? I am running Debian on the server.
Thanks in advance.
I believe that Ben has the correct answer, namely to use the nohup command. nohup stands for "no hangup" and means that your program should ignore the hangup signal, which is generated when your PuTTY session is disconnected, either because you logged out or because you were timed out.
You need to be aware that the output of your command will be appended to a file in the current directory named nohup.out (or $HOME/nohup.out if permissions prevent you from creating nohup.out in the current directory). If your program generates a lot of output then this file can get very large; alternatively, you can use shell redirection to send the output of the script to another file.
nohup php myscript.php >myscript.output 2>&1 &
This command will run your script and send all output (both standard and error) to the file myscript.output which will be created anew each time you run the program.
The final & causes the script to run in the background so you can do other things whilst it is running or logout.
An easy way is to run it through nohup:
nohup php myScript.php &
If you run the php command in a screen and detach the screen, it won't terminate when you close your console.
Screen is a terminal multiplexer that allows you to manage many processes through one physical terminal. Each process gets its own virtual window, and you can bounce between virtual windows interacting with each process. The processes managed by screen continue to run when their window is not active.
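A quick sketch of that workflow (the session name is just an example):
screen -S php-worker          # start a new named screen
php myScript.php              # run the script inside it
# press Ctrl+A, then D, to detach; the script keeps running
screen -r php-worker          # reattach later from a new PuTTY session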