PHP shell_exec takes longer than CLI execution?

I have a PHP script that executes casperjs on my CentOS 6 box as follows:
set_time_limit(0);
echo shell_exec("casperjs /home/senrioro/senioro.js --url='".$_GET['url']."'");
When I run the script from the CLI with the php myfile.php command, it takes less than 3 seconds, but when I open the file in my browser it takes more than 30 seconds, and sometimes the browser shows me a 504 error.
My question is: is there any configuration I need to change to make browser-based PHP execution as fast as CLI execution? Any help would be appreciated.
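As an aside, the URL from $_GET is passed straight into the shell in the snippet above. A minimal sketch of a safer version (script path taken from the question; the fallback URL is only there so the sketch runs outside a web request) would escape the argument first:

```php
<?php
// Sketch, not the asker's exact setup: escape the user-supplied URL
// before handing it to the shell. The script path is from the question.
set_time_limit(0);
$url = $_GET['url'] ?? 'http://example.com'; // fallback for CLI testing
$cmd = "casperjs /home/senrioro/senioro.js --url=" . escapeshellarg($url);
echo $cmd, PHP_EOL;
// echo shell_exec($cmd); // run it once the command looks right
```

This does not change the timing behaviour, but it prevents a crafted url parameter from injecting extra shell commands.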

Related

Apache hangs after 200 seconds not receiving output

I have a PHP script that hangs in the browser when it runs for over 200 seconds. This PHP script calls shell_exec on a bash script and waits for output. Then, echoes that output to the browser.
Everything works fine when:
script runs < 200 seconds
run script from command line as localuser NOT apache
Does not work when:
run from the browser (apache)
run for > 200 seconds
The operations of the bash script execute fine in all cases, and I can see the output when I write it to the console, but it will just hang in the browser (no output) even after the script has "finished".
I understand that set_time_limit will not have an effect here because system commands do not count towards script execution time.
REF: Max Execution Time and System Calls
Below is a screenshot of my Apache timeout settings. To be honest, I am unsure if that would contribute to this issue, if using shell_exec (thus not outputting / echoing anything for some time) would have an effect on that, or what would result if an Apache timeout were to occur.
OTHER NOTES
I think it could be worth noting that after some time of leaving the script "hanging", it runs again without prompting.
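One way to keep Apache (or a proxy in front of it) from timing out while waiting is to stream the command's output as it is produced, instead of collecting everything with shell_exec() and echoing at the end. A minimal sketch, using a printf command as a stand-in for the asker's bash script:

```php
<?php
// Stream the child's output line by line so the connection sees
// activity; the printf command is a placeholder for the real script.
$proc = popen("printf 'a\\nb\\nc\\n'", 'r');
$lines = [];
while (($line = fgets($proc)) !== false) {
    echo $line;  // send to the browser as it arrives...
    flush();     // ...and push the bytes out of PHP's buffer
    $lines[] = rtrim($line, "\n");
}
pclose($proc);
```

Depending on the setup, ob_flush() may also be needed, and output buffering or mod_deflate in Apache can still delay delivery.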

Hangs when executing a PowerShell script using PHP

When trying to run a PowerShell script using
$out = shell_exec('powershell.exe -command C:\xampp\htdocs\web\ping.ps1 < NUL');
echo $out;
it hangs and nothing happens; the page just keeps loading.
This is my simple script:
ping 8.8.8.8 -t
I first ran this command in PowerShell to allow script execution:
Set-ExecutionPolicy RemoteSigned -Scope CurrentUser
but nothing changed.
How can I execute PowerShell scripts using PHP?
This is happening because the -t flag tells ping to keep pinging until it is interrupted.
The constant loading is PHP executing the PowerShell script and waiting for it to finish before continuing. Because your PowerShell script never stops, the page will keep loading until either PHP's memory is maxed out and it fails, or the user navigates away from the page, which halts execution. Please review
http://www.microsoft.com/resources/documentation/windows/xp/all/proddocs/en-us/sag_tcpip_pro_pingconnect.mspx?mfr=true
I would recommend using a count (ping's -n flag) and then capturing the output after execution.
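As a hypothetical sketch of that advice: change ping.ps1 to use a bounded count (ping 8.8.8.8 -n 4 on Windows), and invoke the script with PowerShell's -File switch; since the machine running this sketch may not be Windows, it only builds and prints the command rather than executing it:

```php
<?php
// Hypothetical sketch: build the PowerShell invocation (path from the
// question). -ExecutionPolicy Bypass sidesteps the policy for this
// one process, so no persistent Set-ExecutionPolicy change is needed.
$script = 'C:\\xampp\\htdocs\\web\\ping.ps1';
$cmd = 'powershell.exe -ExecutionPolicy Bypass -File ' . $script;
echo $cmd, PHP_EOL;
// $out = shell_exec($cmd); // on the Windows host
// echo $out;
```

With a bounded ping count inside ping.ps1, shell_exec() returns once the four echo requests complete instead of loading forever.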

PHP hosted on IIS gives 500 on every alternate call

I have a PHP script that executes a batch (.bat) file using the passthru() function. The output of the batch file is printed via an echo statement.
This PHP script works absolutely fine when hosted on an Apache web server; however, the same script produces a 500.0 error on every alternate call when hosted on IIS 7.5.
I did some research and found that if a PHP script takes a long time to execute, the browser becomes unresponsive.
Hence, I edited the PHP script to write lines like "Before executing batch file" and "After executing batch file" to a file.
Even while the 500.0 error was displayed, the file was still being updated with those lines, which shows that the script keeps executing while the browser displays the 500.0 error.
Are there any settings that can be tweaked in IIS?
This problem occurs only on IIS 7.5. When I use Apache it works like a charm.
I've had the exact same problem as you; executing a batch file via exec(), shell_exec(), etc, would result in an internal 500 server error every other time I refreshed the page.
I resolved this by removing all PAUSE commands from the batch file.
Make sure you don't have any breaks in the flow of the batch file. That is, if user input is required at any point during execution of the batch script, PHP will hang and the server will time out.
Hope this helps!
(I'd comment but I don't have 50 reputation)
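A related hedged sketch: even with PAUSE removed, redirecting an empty stream into the batch file's stdin guards against any remaining prompt blocking the run. This is cmd.exe syntax, so the example (with a made-up path) only assembles the command string rather than running it:

```php
<?php
// Hypothetical batch path; "< NUL" feeds EOF to anything awaiting input,
// so a stray PAUSE or SET /P cannot hang the request.
$bat = 'C:\\scripts\\job.bat';
$cmd = 'cmd /c "' . $bat . '" < NUL';
echo $cmd, PHP_EOL;
// passthru($cmd, $status); // on the IIS host
```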

When does a PHP script stop executing when called from CLI?

I basically have a cron job calling one script every minute. The script immediately stops if the previous run is still active (it checks the previous run's activity time).
I introduced a bug, and the script went into an infinite loop (I know it was called by cron at least once). I uploaded a fix to the server, but I'm still wondering:
How long will the bugged script run?
How can I know if it is still running?
What does terminate a script and why?
The script just echoes out the same text over and over again.
P.S. PHP's max execution time within the script is set to 0 (infinite), and I don't have direct access to the server, only FTP.
How can I know if it is still running?
Just set up a new cron job, but make the cron command something that helps you debug. A useful one would be:
ps -af | grep php > /some/path/to/mylogfile.txt
The ps command lists info on running processes. With those flags, part of the output is the original command that started each process, so we can grep the lines for php, because the original command was probably something like:
php myscript.php
The output is redirected to mylogfile.txt for you to read manually after the cron job runs.
The process ID should be part of the output. You can then use the kill command on that process ID, again by entering it as a temporary cron job.
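The same check can be scripted from PHP itself; a sketch (the [p]hp bracket trick keeps the grep process out of its own results):

```php
<?php
// List running php processes; [p]hp stops grep matching itself.
// shell_exec() returns null/false when there is no output or ps fails.
$out = shell_exec("ps -af | grep '[p]hp'");
if (!$out) {
    echo "no php processes found\n";
} else {
    echo $out;
}
```

Each matching line includes the PID, which can then be passed to kill.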
The script will run until it hits a timeout (max_execution_time as defined in the php.ini file, or set via set_time_limit()).
Have a look at the running processes.
Send a kill signal to the script, or wait until a timeout occurs.
PS: you have two php.ini files, one for the command line and one for Apache; be sure to change max_execution_time in the command-line ini file.

Shell command works on command line but not in PHP exec

I have a command that when run direct on the command line works as expected. It runs for over 30 seconds and does not throw any errors. When the same command is called through a PHP script through the php function exec() (which is contained in a script called by a cron) it throws the following error:
Maximum execution time of 30 seconds exceeded
We have a number of servers, and I have run this command on a very similar server with the exact same dataset without any issues, so I'm happy there is no script-level issue. I'm becoming more inclined to think this is related to something at the server level, either in the PHP setup or the server setup, but I'm really not sure where to look. For those that are interested, both servers have a max execution time of 30 seconds.
The command itself is called like this.
from command line as:
root#server>php -q /path/to/file.php
this works...
and via cron within a PHP file as:
exec("php -q /path/to/file.php");
This throws the max execution time error. It was always my understanding that there was no execution time limit when PHP is run from the command line.
I should point out that the script that is called calls a number of other scripts, and it is one of these that is erroring. Looking at my logs, the max execution time error actually occurs before 30 seconds have even elapsed! So, less than 30 seconds after being called, a script started by a cron script that appears to be running as CLI is throwing a max execution error.
To check that the script is running as I expected (as CLI, with no max execution time), I performed the following check:
A PHP script containing this code:
// test.php
echo exec("php test2.php");
where test2.php contains:
echo ini_get('max_execution_time');
and this script is run like this:
root#server> php test.php
// returns 0
This proves a script called in this way runs under CLI with a max execution time of 0, which confirms my thinking; I really cannot see why this script is failing on max execution time!
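For reference, the two-file check above can be collapsed into one runnable sketch; the CLI SAPI hardcodes max_execution_time to 0, so the child process should report exactly that:

```php
<?php
// Ask a child CLI php process for its own max_execution_time.
$limit = trim(shell_exec('php -r "echo ini_get(\'max_execution_time\');"'));
echo $limit, PHP_EOL; // the CLI SAPI hardcodes this to 0
```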
It seems that your script takes too long to execute. Try setting a time limit:
http://php.net/manual/en/function.set-time-limit.php
Or check this post:
Asynchronous shell exec in PHP
Does the command take over 30 seconds on the command line? Have you tried increasing the execution timeout in php.ini?
You can temporarily set the timeout by including this at the top of the script. This will not work when running in safe mode, as noted in the documentation for setting max_execution_time with ini_set().
<?php
ini_set('max_execution_time', 60); // set longer than 60 seconds if needed
// Rest of script...
?>
One thing of note in the docs is this:
When running PHP from the command line the default setting is 0.
What does php -v | grep cli, run both from the shell and via the exec command in the cron-loaded PHP file, show?
Does explicitly typing /usr/bin/php (modify as appropriate) make any difference?
I've actually found what the issue is (kind of). It seems it may be a bug where PHP reports max_execution_time as exceeded when the limit actually being hit is max_input_time, as described here.
I tried changing the exec call to php -d max_execution_time=0 -q /path/to/file.php and got the error "Maximum execution time of 0 seconds exceeded", which makes no sense. I then changed it to php -d max_input_time=0 -q /path/to/file.php and the code ran without erroring. Unfortunately, it was still running 10 minutes later. At least this proves that the issue is with max_input_time.
I'm surprised that no one above has actually timed the completed exec call. The problem is that exec(x) takes much longer than running x on the command line. I have a very complex Perl script (with 8 levels of internal recursion) that takes about 40 seconds to execute from the command line. Using exec inside a PHP script to call the same Perl program takes about 300 seconds, i.e., a factor of about 7x longer. This is such an unexpected effect that people aren't increasing their max execution time enough to see their programs complete; as a result, they are mystified by the timeout. (BTW, I am running WAMP on a fast machine with nominally 8 CPUs, and the rest of my PHP program is essentially trivial, so the time difference must be entirely in the exec.)
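Timing the call is easy to sketch; the child command here is just a stand-in that sleeps for 0.2 seconds, where the real measurement would use the actual Perl program:

```php
<?php
// Measure the wall-clock time of an exec() call.
$start = microtime(true);
exec('php -r "usleep(200000);"', $output, $status); // stand-in workload
$elapsed = microtime(true) - $start;
printf("exec took %.2f s (exit code %d)\n", $elapsed, $status);
```

Comparing this number against the command-line timing of the same program shows directly how much overhead, if any, exec() is adding.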
Create a wrapper.sh file as below:
export DISPLAY=:0
xhost + 2>>/var/www/err.log
/usr/bin/php "/var/www/read_sms1.php" 2>>/var/www/err.log
and put it in cron as below:
bash /var/www/wrapper.sh
My read_sms1.php contains:
$ping_ex = exec("/usr/local/bin/gnokii --getsms SM 1 end ", $exec_result, $pr);
The above solution worked fine for me on Ubuntu 12.04.
