PHP system() doesn't let the rest of the code execute

I have a PHP script like this:
<?php
system("firefox http://run.imacros.net/?m=the_macro.iim 2>&1");
// CODE//
?>
When I run this from a terminal, it opens Firefox normally but never runs the rest of the script!
If I close Firefox manually, the script then runs the remaining code.
I want to execute my script without getting stuck on the system() call.

The 2>&1 redirects stderr to stdout, but stdout itself hasn't been redirected, so PHP still waits on any output written to either handle. To discard the output from both handles, use
system("firefox http://run.imacros.net/?m=the_macro.iim >/dev/null 2>&1");
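The difference between the two redirections can be checked from a plain shell (a sketch; `sh -c` stands in for any command that writes to both handles):

```shell
# stdout goes to a file, then stderr is sent to the same place:
sh -c 'echo out; echo err >&2' > both.txt 2>&1
cat both.txt                                      # both "out" and "err"

# discard everything from both handles:
sh -c 'echo out; echo err >&2' > /dev/null 2>&1   # prints nothing
```

Note that order matters: writing `2>&1 >/dev/null` instead would redirect stderr to the terminal (where stdout pointed at that moment) and only then silence stdout.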

Related

AJAX script doesn't respond until a background program run with exec() ends

I run a background PHP program with exec() like this:
exec('/usr/bin/php bgScript.php "arg1" "arg2" > /dev/null 2>&1 &');
It works and the program does run in background.
Problem
I have Output Buffering Enabled and would like to keep it that way.
My whole script is this:
exec('/usr/bin/php bgScript.php "arg1" "arg2" > /dev/null 2>&1 &');
echo json_encode(array(
"status" => "started"
));
When an AJAX request is made to the file above, the process is started in the background. I assume this because further requests to the server return data without waiting for the previous AJAX request to finish.
But, the problem is that the JSON data is not outputted until the background process is completed.
Since the program is made to run in the background, shouldn't the JSON data be output without waiting for exec() to end? I don't know how to put this technically (forgive me): why does the output buffer hold until exec() ends?
How can I make the script output the JSON data right after the program has started in the background, and then close the connection between the AJAX script and the browser?
The command does not run in the background if its stdio is not redirected. From the official documentation:
Note:
If a program is started with this function, in order for it to continue running in the background, the output of the program must be redirected to a file or another output stream. Failing to do so will cause PHP to hang until the execution of the program ends.
There are several ways to get this behavior. On Unix-like systems you can run the command as command >/dev/null 2>&1 &; on Windows you can use the start command.
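The quoted note can be reproduced in a plain shell: command substitution, like PHP's exec(), reads the child's stdout until end-of-file, so a backgrounded child that still inherits stdout keeps the caller waiting (a minimal sketch):

```shell
# A backgrounded child that inherits stdout holds the pipe open,
# so the reader waits for it despite the &:
t0=$(date +%s)
out=$(sh -c 'sleep 2 & echo started')              # blocks ~2s
t1=$(date +%s)
echo "waited $((t1 - t0))s"

# Redirecting the child's output releases the pipe immediately:
t0=$(date +%s)
out=$(sh -c 'sleep 2 >/dev/null 2>&1 & echo started')
t1=$(date +%s)
echo "waited $((t1 - t0))s"
```

This is why the & alone is not enough and the >/dev/null 2>&1 redirection matters.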

Redirecting STDERR and STDOUT to a log file when a script is executed via PHP

I have a scenario where a PHP file runs a Perl script using the exec function. I want STDOUT and STDERR redirected to a log file, which can then be read via an Ajax call to another PHP file.
Problem: I'm unable to redirect STDOUT and STDERR when the script is called via PHP.
PHP code:
$cmd=getcwd().'/scheduledscrape.pl --path='.$spath.' --size='.$filesize;
$outputlastline = exec($cmd, $output);
print $outputlastline;
I tried to modify the above existing code by changing it to:
$cmd=getcwd().'/scheduledscrape.pl --path='.$spath.' --size='.$filesize.' >log.txt 2>&1';
but nothing happened. Reading the comments on the PHP documentation, it was suggested that to redirect the output one needs to invoke the command through another shell script. Hence I tried this:
$cmd=getcwd().'/runscript.sh '.getcwd().'/scheduledscrape.pl --path='.$spath.' --size='.$filesize;
$outputlastline = exec($cmd, $output);
print $outputlastline;
and the shell script contained:
echo "$#" > running.txt
"$#" >log.txt 2>&1
But again nothing happened, i.e. the Perl script runs but the log file isn't created. What am I doing wrong?
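A likely culprit, judging from the fragment of the wrapper shown: `$#` expands to the *number* of arguments, not the arguments themselves, so `"$#" >log.txt 2>&1` tries to run a command named after that count. `"$@"` is what re-executes the passed command line. A corrected sketch of the wrapper:

```shell
#!/bin/sh
# runscript.sh -- log the command line, then run it with both
# output streams captured ("$@" expands to the original arguments).
echo "$@" > running.txt
"$@" > log.txt 2>&1
```

Note also that `>log.txt` creates the file in the *current working directory* of the PHP process, which for a web request is often not the script's directory; an absolute path rules out the other common cause of "nothing happened".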

External program in PHP

I'm trying to make a simple PHP script that downloads a video from YouTube. At first I tried some classes I found on the web, without success, so I decided to use the youtube-dl program and call it from my script.
The big problem is: apparently the process is killed when the page loads in the browser, and the download is interrupted.
The most curious thing is that if I execute the script as php page.php, it works nicely, but from the browser it doesn't.
I noticed the same thing with the wget command; that process is also killed.
The code is something like:
<?php
exec("youtube-dl -o /var/www/YT/video.flv https://youtube....");
?>
and
<?php
exec("wget http://link");
?>
Both youtube-dl and wget are in the same directory as the script. I also tried redirecting the output to /dev/null and forking the process, but neither worked.
I would try executing it in the background:
<?php
exec("youtube-dl -o /var/www/YT/video.flv https://youtube.... > /dev/null 2>&1 &");
?>
If that works, then what's happening is that your PHP script ends before youtube-dl does.
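To confirm the job really detached, a common trick (a sketch, not part of the original answer) is to append `echo $!` so the caller gets the background process's PID back immediately:

```shell
# Start a long-running job fully detached and echo its PID, so the
# caller gets immediate confirmation that it launched:
pid=$(sh -c 'sleep 30 > /dev/null 2>&1 & echo $!')
echo "started background job $pid"
kill "$pid"    # clean up the demo job
```

From PHP the same idea would look like `$pid = exec("youtube-dl ... > /dev/null 2>&1 & echo $!");`, after which the script can return while the download continues.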

PHP - Parallel processing

I'm working on a web application that needs to send a lot of HTTP requests and update a table, which blocks PHP from executing anything else. So I thought I might have to write a separate PHP script and run it from my main application. I tried exec(), but the program still waits until the script has finished.
exec('php do_job.php');
I even tried redirecting the output to a file, as PHP.net suggests:
Note: If a program is started with this function, in order for it to
continue running in the background, the output of the program must be
redirected to a file or another output stream. Failing to do so will
cause PHP to hang until the execution of the program ends.
$result = exec('php do_job.php > output.txt &',$output);
But still no success... Further down the same page I came across this:
$command = 'php do_job.php';
$shell = new COM("WScript.Shell");
$shell->run($command, 0, false);
Still no success... Lastly I tried:
pclose(popen("start /B ". $command, "r"));
What am I doing wrong here?
I'm developing my app on localhost (XAMPP on Windows); later I'll release it on a Linux host. My last resort would be to run the script via cron jobs. Is that the only way?
Just adding & at the end of the command lets you run it in the background. Here I'm redirecting the output to /dev/null to avoid the hang (on Linux):
exec('php do_job.php > /dev/null &');
If you want to see the output of the command, you can redirect it to a file:
exec('php do_job.php > full_path_to_file &');

Capturing and displaying exec/shell_exec periodic output?

You can easily use exec() or shell_exec() to execute a system command, such as ls -l /var/www/mysite, and then output the result.
How would one execute and display the result of a command that periodically prints information out to the console?
Say you have a simple Python script. If you run it from the console, it prints some information every 10 seconds, forever, until you force-quit it.
How could PHP execute this Python script and somehow capture and stream its output to the browser in real time? Is something like this possible? I'm thinking some sort of Ajax would be involved, but I'm not sure.
I tried doing something like this:
python myscript.py > output.txt
And then I was planning on using Ajax to periodically tail or cat the contents of output.txt and display them in the browser. But output.txt doesn't appear to receive any content until after the script has been force-quit.
You don't see any output in output.txt because it's being buffered. Python has an option to force unbuffered output. From the manpage:
-u     Force the binary I/O layers of stdin, stdout and stderr to be unbuffered. The text I/O layer will still be line-buffered.
So your command would then become:
python -u myscript.py > output.txt
On the PHP side, the flush function should help you.
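The tail-the-file pattern itself can be sketched in the shell with a writer whose output is not block-buffered (the shell's echo writes each line immediately, which is the behavior python -u restores):

```shell
# A writer that emits a line per second; because each line is flushed
# as it is written, the file can be read while the writer still runs:
( for i in 1 2 3; do echo "tick $i"; sleep 1; done ) > output.txt &
writer=$!
sleep 2
tail -n 1 output.txt    # shows the progress so far, mid-run
wait "$writer"
```

An Ajax endpoint would do the equivalent of the tail step on each poll; with a block-buffered writer, output.txt would stay empty until the process exits, which is exactly the symptom described above.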
