I am making a simple website for personal use. It sends a POST request via AJAX, and the PHP code is something like this:
$cmd = "some command";   // built from the value sent in the AJAX request
$r = shell_exec($cmd);   // run the command and capture its output
echo $r;
The first time I send the request it works, but if I send more requests without refreshing the whole page, it returns nothing. The PHP script does execute; it is just shell_exec() that returns nothing. I have no idea what is causing this.
(Debian, Apache2, PHP7.0)
AJAX code:
$.post("exec.php", {command: val}).done(function(data){
    self.outp.append(data);
});
I have confirmed that val is correct; if in PHP I append something to $r (for example $r .= "test"), that does get returned.
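For reference, exec.php boils down to something like this (a minimal sketch: the $_POST['command'] parameter name matches the AJAX call above, but the escapeshellcmd() call and the header are assumptions rather than my exact code):
<?php
// exec.php - minimal sketch of the server side
header('Content-Type: text/plain');

$val = isset($_POST['command']) ? $_POST['command'] : '';
$cmd = escapeshellcmd($val);   // never pass raw user input straight to the shell

$r = shell_exec($cmd);         // returns NULL on error or when there is no output
echo $r;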
Edit: I've found that using GET instead of POST makes the problem disappear. It does not really resolve the problem, but it's something.
Related
Can someone show a simple example of how to use the fastcgi_finish_request() function?
I googled, but only found general mentions of it; some people say they use it successfully, but I could not find a single example with code.
For example, I have a PHP object. To send a response to the browser I generate HTML, return it via getResult(), and then echo the result.
Like this:
$obj = new controller();
echo $obj->getResult();
Let's say I want to take advantage of this optimization technique to send the result to the browser and then finish up some potentially long process, like connecting to an API such as the Facebook API.
How would I go about doing this? I understand that basically I can call fastcgi_finish_request(); and then continue executing the PHP script.
I just need to see example code, I'm not smart enough to figure it out by myself.
I understand that basically I can call fastcgi_finish_request(); and then continue executing PHP script.
Yes, that's all you have to do.
$obj = new controller();
echo $obj->getResult();
fastcgi_finish_request();
do_facebook_thing();
To convince yourself it is working, do this:
echo "Test";
fastcgi_finish_request();
sleep(10);
If you remove the second line, then you will see that the browser has to wait 10 seconds.
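Putting it together, a fuller sketch of the pattern might look like this (the ignore_user_abort() and set_time_limit() calls are additions I would suggest, not something fastcgi_finish_request() requires; note that fastcgi_finish_request() only exists when PHP runs under FastCGI/PHP-FPM):
<?php
// Send the full response to the browser first.
$obj = new controller();
echo $obj->getResult();

// Flush the response and close the connection (PHP-FPM / FastCGI only).
fastcgi_finish_request();

// The client has its response; keep the worker alive for the slow part.
ignore_user_abort(true);   // don't stop if the client disconnects
set_time_limit(0);         // allow the long-running work to finish

do_facebook_thing();       // e.g. the slow API call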
I am creating an API that receives a POST request and then returns data. This data needs to be returned immediately, but then I need to execute a long-running curl operation afterwards.
<?php
// $secrethash and $curlfunction are defined elsewhere in the application.
if (!empty($_POST)) {
    $numbers = isset($_POST['numbers']) ? $_POST['numbers'] : 0;
    $numbers = $numbers * $secrethash;

    $result = array(
        "error"   => false,
        "numbers" => $numbers
    );
    echo json_encode($result);

    // Long-running curl operation; the response above should not wait for this.
    $curlfunction($numbers);
    exit;
}
Unfortunately, the data is not returned until the curl function completes, which causes my application to time out. Is it possible to execute the curl in the background so my data can be returned immediately?
You could put the function in a separate file and execute it along the lines shown below; this would cause a separate process to be created. Maybe write the result to a file and read it later. Set up your PHP script to take your variable as a parameter.
exec("/usr/bin/php /path/to/script.php > /dev/null &");
It might be worth a shot. :-)
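As a rough sketch of that idea (the paths are placeholders, and escapeshellarg() is just one way you might pass the value along safely):
<?php
// Caller: fire off the worker in the background and return immediately.
$arg = escapeshellarg($numbers);
exec("/usr/bin/php /path/to/script.php $arg > /dev/null 2>&1 &");

// /path/to/script.php (worker): pick the value up from the command line.
// $numbers = isset($argv[1]) ? $argv[1] : 0;
// ... long-running curl operation, optionally writing its result to a file ...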
There are several options for you.
You can install a CRON job after outputting the JSON content via the exec() function.
Or make an AJAX request from your page after the JSON data has been returned.
I use phpseclib to SSH into my remote server from a web browser and execute a PHP file. Below is the code I use:
$ssh->exec('cd myfolder/; php main.php ' . $file, 'packet_handler');

function packet_handler() {
    echo "Completed";
    header("Location: exec_completed.php");
}
The main.php file gets executed without any issue. The problem is with returning the data after execution. I have the following questions:
I do a lot of processing in the main.php file and I need to show real-time progress of what the script does. When I execute the file through exec(), only the first echo in main.php is printed and the execution stops. Is there any way to get real-time data from the executing script?
I followed the callback example from phpseclib below, although my callback function packet_handler doesn't run after exec() is executed. I want to redirect to another page once the main.php I execute through SSH has completed. If I redirect to that page now, I get only partial results because main.php has not finished executing. I tried using sleep(10), but main.php can take longer than that at times, so it didn't work. Please suggest any ideas.
Callback example from phpseclib:
<?php
include('Net/SSH2.php');

$ssh = new Net_SSH2('www.domain.tld');
if (!$ssh->login('username', 'password')) {
    exit('Login Failed');
}

function packet_handler($str)
{
    echo $str;
}

$ssh->exec('ping 127.0.0.1', 'packet_handler');
?>
For #1... doing $ssh->exec() without the callback should work. If it doesn't, I'd need to see the logs, which you can get by doing define('NET_SSH2_LOGGING', 2) at the top and then echo $ssh->getLog() afterwards. Posting the log at pastebin.com and then posting a link would be good. That said, this won't get you real-time output either.
For #2... the callback function is mainly intended for real-time updates, and odds are high that what you get with each call of the callback will be incomplete output. So having your callback output "Completed" and redirect the user to another location is, in all likelihood, incorrect.
Another approach that may work for you: use the interactive shell. Example:
http://phpseclib.sourceforge.net/ssh/examples.html#sudo
I don't know what your output is like. Maybe you could read() until you reach certain parts of the output that are guaranteed to be printed. Or maybe you could use $ssh->setTimeout(5) and grab updated output every five seconds or so.
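As a very rough sketch of that interactive-shell idea (the "DONE" marker is an assumption: main.php would have to print it when it finishes, and the loop condition is only a guess at what your output looks like):
<?php
include('Net/SSH2.php');

$ssh = new Net_SSH2('www.domain.tld');
if (!$ssh->login('username', 'password')) {
    exit('Login Failed');
}

// Return from read() after at most five seconds even if more output is coming.
$ssh->setTimeout(5);

// Run the command on the interactive shell instead of exec().
$ssh->write("cd myfolder/; php main.php " . $file . "\n");

// Poll for output; each call returns whatever has arrived so far (or times out).
while (true) {
    $chunk = $ssh->read();
    if ($chunk === '' || $chunk === false) {
        break;                  // nothing more came in within the timeout
    }
    echo $chunk;                // show progress as it arrives
    flush();
    if (strpos($chunk, 'DONE') !== false) {
        break;                  // main.php is assumed to print DONE when it finishes
    }
}

echo "Completed";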
Why doesn't PHP exec() work on the first page load?
I'm executing a Python script via PHP using the following line:
exec("python suggester.py " . $query_plus . " " . $location, $output);
Most of the time this works fine, but on the initial load of my page (suggester.promediacorp.com) the POST request sits in waiting/pending for almost a minute until it finally returns a response. If the page is refreshed, or another query runs after, it works perfectly.
I'm almost 100% sure the issue is related to exec(), because when I remove that code I get my response immediately. Additionally, the issue persists even if the python file has no contents.
You do not have the arguments escaped properly. See http://php.net/manual/en/function.escapeshellarg.php
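For example, something along these lines (same variable names as in the question):
<?php
// Escape each user-supplied argument individually before building the command.
$cmd = "python suggester.py " . escapeshellarg($query_plus) . " " . escapeshellarg($location);
exec($cmd, $output);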
I figured it out. It was an issue with PHP's error logging, which was echoed before my script finished running.
I have a PHP script which calls exec(). I've been having trouble all day: some code calling the same script works and some doesn't (exec() returns a 127 error code).
I have finally worked out that the code that is not working is the code that is being called from jQuery on my web page:
$('#next_button').click(function(event) {
    $.get('download_forms.php', function(data) {
        alert(data);
    });
});
However, if I type the URL for download_forms.php into the address bar of my browser, then exec() executes properly. I have tried running other scripts that call exec() from jQuery to test this, and they all fail, but they work if typed into the address bar.
I don't see why this would be an issue. Whether I type the URL into Firefox's address bar or press the button on my webpage, an HTTP request is made either way.
Does anybody know what the difference could possibly be?
Note: I have tried different commands in exec() and they all fail when called from my jQuery (all the rest of the PHP code runs fine), but they work when the script address is typed directly into the address bar.
Many thanks
Update
This is my download_forms.php code. The initial exec() was just to see if exec() worked at all. As above, it only executes properly if typed directly into the address bar.
include('inc/session.inc.php');
require_once('Downloader.php');

// Quick check that exec() works at all; exit code 127 means the command was not found.
exec('id', $output, $r);
var_dump($output);
echo $r;

try {
    $downloader = new Downloader();
    $saveMessages = $downloader->saveToDatabase();
    // exec() in the combineAndDownloadForms() method
    $downloadMessages = $downloader->combineAndDownloadForms();
} catch (Exception $e) {
    echo $e->getMessage();
}
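A debugging step worth trying here is to merge stderr into the captured output, which should show why a command exits with 127 (127 usually means the shell could not find the command). A quick sketch of that check:
<?php
// Merge stderr into stdout so the reason for the failure is captured too.
exec('id 2>&1', $output, $r);
var_dump($output);   // e.g. "sh: id: command not found" would explain a 127
echo "exit code: " . $r;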
Further Update
I made a plain hyperlink from my webpage to the download_forms.php page (i.e. an <a> tag), but exec() still doesn't execute. At least I know it's nothing to do with AJAX.
For what it's worth, I fixed this issue and all of the above was a red herring.
I hadn't noticed that the page I was on was using the secure HTTPS protocol, so when the download_forms.php script was called it was also accessed over HTTPS, and it seems that the exec() and passthru() functions won't execute commands on the server in those circumstances, which makes sense.
I changed the script so that the download_forms.php URL uses plain HTTP instead, and it's now functioning fine.
HTH