Sending POST requests without waiting for a response? - PHP

I am writing a simple REST service in PHP, which responds to requests from clients.
My concern is that when my server responds to a request, it could end up tying up resources if the client side is too slow in sending back its "ok" response.
How do I send a POST request via libcurl, set up so that it does not wait for any response, but rather quits immediately after the POST data has been sent?
Is this even possible? Thank you!

You cannot just send data without receiving an answer with HTTP. HTTP always goes request -> response. Even if the response is very short (like a simple 200 with no body), there has to be a response, and every HTTP socket will wait for that response.
If you don't care about the response, you could add a process to the server that makes the requests for you, and just push your request data to it (for example, a service running in the background that checks a request database and starts a request whenever a new entry is added). That way you would make the request asynchronously and could quit as soon as you have added the request to the queue.
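A minimal sketch of that idea, assuming a request_queue table and a background worker you would create yourself (all names here are made up for illustration):
// in the request handler: enqueue and return immediately
$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass'); // placeholder DSN
$stmt = $pdo->prepare('INSERT INTO request_queue (url, payload) VALUES (?, ?)');
$stmt->execute(array('http://example.com/endpoint', json_encode(array('foo' => 'bar'))));
// a background worker polls request_queue and performs the actual POSTs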
Also, as meouw said, the client is not part of any communication you are doing with PHP. PHP is a server-side language: when the client requests a web page (the PHP file), the server executes that file (performing all the requests the PHP file makes) and then returns the result to the client.

This solution sends the caller only a minimal response package and then lets the script continue working. If you don't care about the response at all and have access to exec, then use exec and call the script in the background instead. First, the receiver file:
receiver.php
ignore_user_abort(true); // continue the script even if the connection is closed by the web browser (client)
ob_end_clean();
// the next three headers tell the client to close the connection, just in case
header("Connection: close"); // note: header values must not contain "\r\n"
header("Content-Encoding: none");
header("Content-Length: 1");
fastcgi_finish_request(); // close the nginx/Apache connection to PHP-FPM (PHP keeps working, but the web server stops communicating with it)
// continue scripting
// ... DO HERE WHAT YOU WANT ...
// check with your MongoDB or MySQL to be sure PHP still keeps its connection to the DB
FOREGROUND, by PHP HTTP request:
This solution is better than the background one, and you only need to wait 1 ms.
sender.php:
curl_setopt($curl, CURLOPT_TIMEOUT_MS, 1); // HERE IS THE MAGIC: we wait only 1 ms for the connection. The script stops waiting after 1 ms and continues, while the already-dispatched packet travels on to its destination. It is like an apple on a tree: we cut it and go; the apple still falls, but we don't care what happens when it lands. :)
curl_setopt($curl, CURLOPT_NOSIGNAL, 1); // needed for CURLOPT_TIMEOUT_MS values below one second to work; see the manual
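Pieced together, a complete sender.php might look like this (the URL and payload are placeholders, and with a 1 ms timeout there is no guarantee the request fully left the machine):
$curl = curl_init('http://example.com/receiver.php'); // placeholder URL
curl_setopt($curl, CURLOPT_POST, true);
curl_setopt($curl, CURLOPT_POSTFIELDS, http_build_query(array('foo' => 'bar')));
curl_setopt($curl, CURLOPT_RETURNTRANSFER, true);
curl_setopt($curl, CURLOPT_TIMEOUT_MS, 1); // give up after 1 ms; we don't wait for the answer
curl_setopt($curl, CURLOPT_NOSIGNAL, 1);   // required for sub-second timeouts
curl_exec($curl); // "fails" with a timeout error, which we simply ignore
curl_close($curl);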
--------- See the next answer for the complete solution ------------
BACKGROUND, by server request: this will execute $cmd in the background (no cmd window) without PHP waiting for it to finish, on both Windows and Unix. Source: https://www.php.net/manual/en/function.exec.php
<?php
function execInBackground($cmd) {
    if (substr(php_uname(), 0, 7) == "Windows") {
        // "start /B" launches the command in the background without opening a window
        pclose(popen("start /B " . $cmd, "r"));
    } else {
        // discard output and append "&" so exec() returns immediately
        exec($cmd . " > /dev/null &");
    }
}
?>
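Usage might look like this (the worker script path is hypothetical):
execInBackground('php /path/to/worker.php'); // returns immediately; worker.php runs on its own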

If you really don't care about the response, you're probably best off exec-ing a shell command such as wget or curl. This is mentioned in passing in some of the other answers, but here's a simple function for sending a POST payload via this approach (which is asynchronous and takes 1-2 ms):
function wget_request($url, $post_array, $check_ssl = true) {
    $cmd = "curl -X POST -H 'Content-Type: application/json'";
    $cmd .= " -d '" . json_encode($post_array) . "' '" . $url . "'";
    if (!$check_ssl) {
        $cmd .= " --insecure"; // this can speed things up, though it's not secure
    }
    $cmd .= " > /dev/null 2>&1 &"; // just dismiss the response
    exec($cmd, $output, $exit);
    return $exit == 0;
}
Credits: Function was adapted from
https://segment.com/blog/how-to-make-async-requests-in-php/

http://curl.haxx.se/mail/lib-2002-05/0090.html
libcurl has no asynchronous interface. You can do that yourself either by using threads or by using the non-blocking "multi interface" that libcurl offers. Read up on the multi interface here:
http://curl.haxx.se/libcurl/c/libcurl-multi.html
A PHP example of the multi interface is here:
http://www.phpied.com/simultaneuos-http-requests-in-php-with-curl/
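In PHP the multi interface looks roughly like this. It is not truly fire-and-forget, but it lets you drive the transfer in small non-blocking steps while doing other work (the URL is a placeholder):
$ch = curl_init('http://example.com/endpoint'); // placeholder
curl_setopt($ch, CURLOPT_POST, true);
curl_setopt($ch, CURLOPT_POSTFIELDS, 'foo=bar');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);

$mh = curl_multi_init();
curl_multi_add_handle($mh, $ch);

// each curl_multi_exec() call performs a little work and returns immediately
do {
    curl_multi_exec($mh, $stillRunning);
    // ... do other work here between calls ...
} while ($stillRunning > 0);

curl_multi_remove_handle($mh, $ch);
curl_multi_close($mh);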

I have never tried this, but setting CURLOPT_TIMEOUT to a very low value might do the trick. Note that it takes whole seconds, so for sub-second values use CURLOPT_TIMEOUT_MS.
However, I don't know how cURL and the remote end will behave with this: whether the connection will be actively cancelled once it has been established and the timeout is reached. You would have to try it out. If you're calling PHP scripts, maybe ignore_user_abort() can make sure your scripts run through either way.

Suppose you have two PHP servers communicating with each other: server 1 wants to send JSON data to server 2, and server 2 does some heavy work but terminates the connection right after it receives the data, so server 1 doesn't have to wait for the result. You can do it like this:
Server 1 (the client creating the POST request with JSON data):
Use cURL; don't use file_get_contents(), because in my experience file_get_contents() doesn't handle the Connection: close HTTP header correctly and doesn't terminate the connection as it should.
$curl = curl_init('http://server2.com/');
curl_setopt($curl, CURLOPT_HEADER, false);
curl_setopt($curl, CURLOPT_RETURNTRANSFER, true);
curl_setopt($curl, CURLOPT_HTTPHEADER, ["Content-type: application/json"]);
curl_setopt($curl, CURLOPT_POST, true);
curl_setopt($curl, CURLOPT_POSTFIELDS, json_encode(['some data']));
$response = curl_exec($curl);
$status = curl_getinfo($curl, CURLINFO_HTTP_CODE);
if ($status !== 200) {
    exit("Failed with status {$status}, response {$response}, curl_error " . curl_error($curl) . ", curl_errno " . curl_errno($curl));
}
curl_close($curl);
echo $response;
Server 2:
I used modified code from bubba-h57.
// Cause we are clever and don't want the rest of the script to be bound by a timeout.
// Set to zero so no time limit is imposed from here on out.
set_time_limit(0);
// Client disconnect should NOT abort our script execution
ignore_user_abort(true);
// Clean (erase) the output buffer and turn off output buffering
// in case there was anything up in there to begin with.
ob_end_clean();
// Turn on output buffering, because ... we just turned it off ...
// if it was on.
ob_start();
echo 'I received the data, closing connection now, bye.';
// Return the length of the output buffer
$size = ob_get_length();
// Send headers to tell the browser to close the connection
// Remember, the headers must be called prior to any actual
// input being sent via our flush(es) below.
header("Connection: close");
// Hack to turn off mod_deflate in Apache (gzip compression).
header("Content-Encoding: none");
header("Content-Length: {$size}");
// Set the HTTP response code
http_response_code(200);
// Flush (send) the output buffer and turn off output buffering
ob_end_flush();
// Flush (send) the output buffer
// This looks like overkill, but trust me. I know, you really don't need this
// unless you do need it, in which case, you will be glad you had it!
#ob_flush();
// Flush system output buffer
// I know, more over kill looking stuff, but this
// Flushes the system write buffers of PHP and whatever backend PHP is using
// (CGI, a web server, etc). This attempts to push current output all the way
// to the browser with a few caveats.
flush();
// Close current session.
session_write_close();
// Here, you can proceed with some heavy work.
echo "This won't be sent, the connection should be already closed";

In Laravel
use Illuminate\Support\Facades\Http;
...Some Code here
$prom = Http::timeout(1)->async()->post($URL_STRING, $ARRAY_DATA)->wait();
... Some more important code here
return "Request sent"; //OR whatever you need to return
This works for me, as I don't always need to know the response.

As other people have said, when you make an HTTP request you have to wait for the response.
In PHP, what you can do is make the request using the exec function.
Check this link: php exec command (or similar) to not wait for result

Send response and continue executing script

I'm creating a high-load Telegram bot, which means many requests come in, and they take time to handle.
I use a webhook (Telegram sends updates to, say, handler.php), which requires responding with a correct [] answer and fully closing the connection for Telegram to send the next update. Otherwise Telegram only increases pending_update_count and won't send any new updates until the previous one is handled.
So what I'm trying to do is respond correctly and close the connection before the code is executed.
Stack Overflow suggests some solutions, but none of them work for me, because they only close the connection if there was no output, and I need to respond with [].
How do I close a connection early?
Send response and continue executing script - PHP
The only workaround I came up with is to call shell_exec('php sendMessage.php >/dev/null 2>/dev/null &') from handler.php. It works perfectly well, but it's not what I need. Any other suggestions for responding and closing the connection so the code can keep executing in the background?
I found a simple solution that works for Telegram Bot API webhooks. It allows me to call the handling script from within the script itself, without using exec() or shell_exec().
Time-consuming code:
$bool = true; // some condition in your regular code
if ($bool) {
    sleep(10);
}
This will cause a huge delay for other bot users, since Telegram doesn't get a response and won't send you any other updates.
Solution
if (isset($_GET['action'])) {
    // the heavy part; it runs only in the self-issued request below
    sleep(10);
    exit; // don't fall through, or the script would keep re-calling itself
}
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, 'http://localhost/handler.php?action');
curl_setopt($ch, CURLOPT_FRESH_CONNECT, true);
curl_setopt($ch, CURLOPT_TIMEOUT_MS, 50); // don't wait for the self-call to finish
curl_exec($ch);
curl_close($ch);
This way the script calls itself (or any other script), Telegram gets a correct and quick response, and the heavy part is executed in a separate request.
It is strongly recommended not to go below 50 ms for CURLOPT_TIMEOUT_MS; with lower values the request sometimes simply doesn't go out when handling updates from Telegram.

How to execute a function after the response is sent in PHP

I am trying to execute a callback function after the response is sent in PHP.
In Java, for example, I would do that with threads, but in PHP the process ends once the response has been sent. I also tried to use pthreads, but it is too complicated.
In my code:
if (isset($_REQUEST['x']) && $_REQUEST['x'] == "x") {
    header('Content-type: application/json');
    $data = json_decode(file_get_contents('php://input'), TRUE);
    if (!empty($data)) {
        $request = new XRequest($data['params']);
        $customParams = unserialize(file_get_contents('customParams'));
        $customParams->callCallback($request); // calling into another PHP class
        echo(json_encode(array('status' => 'OK')));
    }
}
The request comes from a different server. I want to send the PHP echo response first, and only once the response has been sent call $customParams->callCallback($request).
How can I do that? Any ideas?
I solved my problem in PHP using the code below. Pay attention to fastcgi_finish_request: without it, my server could not finish the first response and start the callback.
Thanks.
ob_start();
// Send your response.
echo json_encode(array('status' => 'ok')) ;
// Get the size of the output.
$size = ob_get_length();
// Disable compression (in case content length is compressed).
header("Content-Encoding: none");
header($_SERVER["SERVER_PROTOCOL"] . " 202 Accepted");
header("Status: 202 Accepted");
// Set the content length of the response.
header("Content-Length: {$size}");
// Close the connection.
header("Connection: close");
ignore_user_abort(true);
set_time_limit(0);
// Flush all output.
ob_end_flush();
ob_flush();
flush();
session_write_close();
fastcgi_finish_request();
// Do processing here
sleep(5);
callBackAfterResponse();
PHP's concurrency model is simple and based around the fact that multiple PHP scripts can be executed simultaneously by a Web server. So typically, the way you'd implement this is by
Placing the body of your callback function in its own, separate script; and
Invoking it from the parent script through an outgoing Web request (using cURL or similar).
That is, have the first PHP script request the second at a URL on (presumably) the same Web server, just as though a user had opened the two URLs sequentially in their Web browser. This way, the second script can continue to run after the first has completed its response and terminated.
More sophisticated approaches are possible, involving message queues or remote-procedure call mechanisms like XML-RPC and Apache Thrift, if the second PHP script is made to run separately and continuously in its own process. But I think this will be enough for what you're trying to do.

How to run a PHP PDO SQL query and not wait for the response [duplicate]

Possible Duplicate:
how can I achieve a task that should be done in thread in php
Normally when running a PDO query in PHP, it waits for the database result. Is there a way to avoid this?
I need my script to reply really fast, but the SQL it runs may take some time to complete.
Example:
<?php
$pdo = new PDO('bla bla');
// This takes up to 1 second, but I need the script to reply within a few ms
$pdo->exec('UPDATE large_table SET foo = \'bar\' WHERE id = 42');
die('ok');
Is this doable?
For INSERT queries, you can use INSERT DELAYED (see the MySQL manual). These queries are placed into a work queue and return instantly. The downside is that you don't get any feedback on whether the query was executed successfully.
For obscure reasons, however, there is no UPDATE DELAYED...
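For illustration (access_log is a made-up table; note that DELAYED only ever worked for MyISAM-style engines and has been removed from recent MySQL versions):
$pdo->exec("INSERT DELAYED INTO access_log (user_id, created_at) VALUES (42, NOW())"); // returns without waiting for the write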
The common way would be to render the output first, then flush it to the client using flush(), and then do the time-consuming query. You should also know about ignore_user_abort(). This function keeps PHP running even though the connection to the client has ended (e.g. the user closed the browser).
I've prepared two scripts that illustrate this. The first is slow.php, which flushes output early and then starts a time-consuming task. The second is get.php, which uses libcurl to receive the page. If you test them, get.php will return almost immediately while slow.php is still running. I have also tested slow.php with a current Mozilla browser.
slow.php:
// The example will not work unless ob_end_clean() is called
// at the top. Strange behaviour! I would like to know the reason.
ob_end_clean();
// disable all content encoding as we won't
// be able to calculate the content-length if its enabled
#apache_setenv('no-gzip', 1);
#ini_set('zlib.output_compression', 0);
#ini_set('implicit_flush', 1);
header("Content-Encoding: none");
// Tell client that he should close the connection
header("Connection: close");
// keep the script running even if the CLIENT closes the connection
ignore_user_abort();
// using the ob* functions it is easy to compute the Content-Length later
ob_start();
// do your output
echo 'hello world', PHP_EOL;
// get the content length
$size = ob_get_length();
header("Content-Length: $size");
// clear ob* buffers
// flush and close all ob* buffers; note that ob_end_flush() lowers the buffer
// level, so counting up to ob_get_level() in a for loop would skip buffers
while (ob_get_level() > 0) {
    ob_end_flush();
}
flush(); // flush PHP's internal output buffer to the client
// start a time consuming task
sleep(3);
get.php
// simplest curl example
$url = 'http://localhost/slow.php';
$ch = curl_init($url);
$fp = fopen("example_homepage.txt", "w");
curl_setopt($ch, CURLOPT_FILE, $fp);
curl_setopt($ch, CURLOPT_HEADER, 0);
curl_exec($ch);
curl_close($ch);
fclose($fp);
Call your update script via AJAX, showing a loader to the user while the update runs.
If you don't really need the results of the query in the output at page-load time, it's the only way.

Continue PHP execution after sending HTTP response

How can I have PHP 5.2 (running as apache mod_php) send a complete HTTP response to the client, and then keep executing operations for one more minute?
The long story:
I have a PHP script that has to execute a few long database requests and send e-mail, which takes 45 to 60 seconds to run. This script is called by an application that I have no control over. I need the application to report any error messages received from the PHP script (mostly invalid parameter errors).
The application has a timeout delay shorter than 45 seconds (I do not know the exact value) and therefore registers every execution of the PHP script as an error. Therefore, I need PHP to send the complete HTTP response to the client as fast as possible (ideally, as soon as the input parameters have been validated), and then run the database and e-mail processing.
I'm running mod_php, so pcntl_fork is not available. I could work around this by saving the data to be processed in the database and running the actual process from cron, but I'm looking for a shorter solution.
I had this snippet in my "special scripts" toolbox, but it got lost (clouds were not common back then), so I went searching for it and came across this question. I was surprised to see it missing here, so I searched some more and came back to post it:
<?php
ob_end_clean();
header("Connection: close");
ignore_user_abort(); // optional
ob_start();
echo ('Text the user will see');
$size = ob_get_length();
header("Content-Length: $size");
ob_end_flush(); // Strange behaviour, will not work
flush(); // Unless both are called !
session_write_close(); // Added a line suggested in the comment
// Do processing here
sleep(30);
echo('Text user will never see');
?>
I actually use it in a few places, and it totally makes sense there: a banklink returns the request of a successful payment, and I have to call a lot of services and process a lot of data when that happens. That sometimes takes more than 10 seconds, yet the banklink has a fixed timeout period. So I acknowledge the banklink and show it the way out, and do my stuff when it is already gone.
Have the script that handles the initial request create an entry in a processing queue, and then immediately return. Then, create a separate process (via cron maybe) that regularly runs whatever jobs are pending in the queue.
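A minimal sketch of the worker side, assuming jobs are rows in a job_queue table with made-up columns (run it from cron, e.g. every minute):
// worker.php
$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass'); // placeholder DSN
$jobs = $pdo->query('SELECT id, payload FROM job_queue WHERE done = 0')->fetchAll(PDO::FETCH_ASSOC);
foreach ($jobs as $job) {
    // ... perform the long-running work described by $job['payload'] ...
    $pdo->prepare('UPDATE job_queue SET done = 1 WHERE id = ?')->execute([$job['id']]);
}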
One can use an "HTTP fork" to oneself or to any other script. I mean something like this:
// parent script, called by a user request from the browser
// create a socket for calling the child script
$socketToChild = fsockopen("localhost", 80);
// build the HTTP packet; headers first
$msgToChild = "POST /script.php?param=value&<more params> HTTP/1.0\r\n";
$msgToChild .= "Host: localhost\r\n";
$postData = "Any data for the child as a POST query";
$msgToChild .= "Content-Type: application/x-www-form-urlencoded\r\n";
$msgToChild .= "Content-Length: " . strlen($postData) . "\r\n\r\n";
// headers done, append the data
$msgToChild .= $postData;
// send the packet to our own www-server - a new process will be created to handle the query
fwrite($socketToChild, $msgToChild);
// wait for and read the answer from the child
$dataSize = 8192; // how much of the child's answer to read
$data = fread($socketToChild, $dataSize);
// close the connection to the child
fclose($socketToChild);
...
Now the child script:
// parse HTTP-query somewhere and somehow before this point
// "disable partial output" or
// "enable buffering" to give out all at once later
ob_start();
// the client (the parent script in this case) may disconnect
// before the child ends - we don't need to care about it
ignore_user_abort(1);
// we will work forever
set_time_limit(0);
// we need to say something to parent to stop its waiting
// it could be something useful like client ID or just "OK"
...
echo $reply;
// push buffer to parent
ob_flush();
// parent gets our answer and disconnects
// but we can work "in background" :)
...
The main idea is:
the parent script is called by a user request;
the parent calls the child script (the same script or another one) on the same server (or any other server) and passes the request data to it;
the parent says OK to the user and exits;
the child does the work.
If you need to interact with the child, you can use the DB as a "communication medium": the parent may read the child's status and write commands, and the child may read commands and write its status. If you need that for several child scripts, keep a child ID on the user side to tell them apart, and send that ID to the parent each time you want to check the status of the respective child.
I found it here: http://linuxportal.ru/forums/index.php/t/22951/
What about calling a script on the file server to execute as if it had been triggered at the command line? You can do this with PHP's exec.
You can use the PHP function register_shutdown_function, which will execute something after the script has completed its dialog with the browser.
See also ignore_user_abort - but you shouldn't need that function if you use register_shutdown_function. On the same page, set_time_limit(0) will prevent your script from timing out.
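A minimal sketch of that combination; note this is only a sketch, and on some server setups the shutdown function still runs before the connection is fully released unless the output has been flushed (or fastcgi_finish_request() is called under PHP-FPM):
ignore_user_abort(true);
set_time_limit(0);
register_shutdown_function(function () {
    // runs after the main script body has finished
    sleep(30); // stand-in for the long-running work
});
echo 'response for the client';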
Using a queue, exec or cron would be overkill for this simple task.
There is no reason not to stay within the same script.
This combination worked great for me:
ignore_user_abort(true);
$response = "some response";
header("Connection: close");
header("Content-Length: " . strlen($response)); // byte count: use strlen(), not mb_strlen()
echo $response;
flush(); // releasing the browser from waiting
// continue the script with the slow processing here...
read more in:
How to continue process after responding to ajax request in PHP?
It is possible to use cURL for that, with a very short timeout. This would be your main file:
<?php
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, "http://example.com/processor.php");
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
curl_setopt($ch, CURLOPT_TIMEOUT_MS, 10); //just some very short timeout
curl_exec($ch);
curl_close($ch);
?>
And this your processor file:
<?php
ignore_user_abort(true); //very important!
for ($x = 0; $x < 10; $x++) { // do some very time-consuming task
    sleep(10);
}
?>
As you can see, the upper script will time out after a short while (10 milliseconds in this case). It is possible that CURLOPT_TIMEOUT_MS will not work like this; in that case it would be equivalent to curl_setopt($ch, CURLOPT_TIMEOUT, 1).
So once the processor file has been accessed, it will finish its tasks even if the user (i.e. the calling file) aborts the connection.
Of course you can also pass GET or POST parameters between the pages.
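For example, the main file above could send POST fields by adding these options before the curl_exec() call (the field names are placeholders):
curl_setopt($ch, CURLOPT_POST, true);
curl_setopt($ch, CURLOPT_POSTFIELDS, http_build_query(array('task' => 'send_mail', 'user_id' => 42)));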
You can create an HTTP request between servers (no browser is needed).
The secret to creating a background HTTP request is setting a very small timeout, so the response is ignored.
This is a working function that I have used for that purpose:
PHP asynchronous background request
Another way to create an asynchronous request in PHP (simulating background mode).
/**
 * Another way to make an asynchronous request with PHP.
 * With this you can simulate a fork in PHP - nothing to envy Java or C++ for.
 * This time using fsockopen.
 * @author PHPepe
 * @param string $url
 * @param array  $params
 */
function phpepe_async($url, $params = array()) {
    $post_params = array();
    foreach ($params as $key => &$val) {
        if (is_array($val)) $val = implode(',', $val);
        $post_params[] = $key . '=' . urlencode($val);
    }
    $post_string = implode('&', $post_params);

    $parts = parse_url($url);
    $fp = fsockopen($parts['host'],
        isset($parts['port']) ? $parts['port'] : 80,
        $errno, $errstr, 30);

    $out = "POST " . $parts['path'] . " HTTP/1.1\r\n";
    $out .= "Host: " . $parts['host'] . "\r\n";
    $out .= "Content-Type: application/x-www-form-urlencoded\r\n";
    $out .= "Content-Length: " . strlen($post_string) . "\r\n";
    $out .= "Connection: Close\r\n\r\n";
    if (isset($post_string)) $out .= $post_string;

    fwrite($fp, $out);
    // close immediately without reading the answer: fire and forget
    fclose($fp);
}
// Usage:
phpepe_async("http://192.168.1.110/pepe/feng_scripts/phprequest/fork2.php");
For more info you can take a look at
http://www.phpepe.com/2011/05/php-asynchronous-background-request.html
You can split this into three scripts, as sketched after this list.
1. The first initiates the process and calls the second via exec or a shell command; this could also be done via an HTTP call.
2. The second one does the database processing and, at the end, starts the last one.
3. The last one sends the e-mail.
Bah, I misunderstood your requirements. It looks like they're actually:
The script receives input from an external source you do not control.
The script processes and validates the input, lets the external app know whether it is good or not, and terminates the session.
The script kicks off a long-running process.
In this case, then yes, using an outside job queue and/or cron would work. After the input is validated, insert the job details into the queue, and exit. Another script can then run, pick up the job details from the queue, and kick off the longer process. Alex Howansky has the right idea.
Sorry, I admit I skimmed a bit the first time around.
I would recommend spawning a new async request at the end, rather than continuing the process with the user.
You can spawn the other request using the answer here:
Asynchronous PHP calls?
In your php.ini, make sure that output buffering is disabled:
output_buffering = off

Sending a non-blocking HTTP POST request

I have two websites, one in PHP and one in Python.
When a user sends a request to the server, I need the PHP/Python code to send an HTTP POST request to a remote server. I want to reply to the user immediately, without waiting for a response from the remote server.
Is it possible to continue running a PHP/Python script after sending the response to the user? In that case I would first reply to the user and only then send the HTTP POST request to the remote server.
If not, is it possible to create a non-blocking HTTP client in PHP/Python that doesn't handle the response at all?
A solution with the same logic in PHP and Python is preferable for me.
Thanks
In PHP you can close the connection by sending this header (this is HTTP-related and works in Python too, although I don't know the proper syntax to use there):
// Send the response to the client
header('Connection: Close');
// Do the background job: just don't output anything!
Addendum: I forgot to mention that you probably also have to set the Content-Length header. Also, check out this comment for tips and a real test case.
Example:
<?php
ob_end_clean();
header('Connection: close');
ob_start();
echo 'Your stuff goes here...';
header('Content-Length: ' . ob_get_length());
ob_end_flush();
flush();
// Now we are in background mode
sleep(10);
echo 'This text should not be visible';
?>
You can spawn another process to handle the POST to the other server. In PHP you would spawn the process and "disconnect" so you don't wait for the response.
exec("nohup /path/to/script/post_content.php > /dev/null 2>&1 &");
You can then use curl in that script to perform the POST. If you want to pass parameters to the PHP script, you can use the getopt() function to read them. I'm not sure whether you would do something similar in Python.
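For instance, parameters could go on the command line and be read with getopt() (the script name and option names are made up):
// invocation from the web request:
// exec("nohup php post_content.php --url=http://example.com --body=hello > /dev/null 2>&1 &");

// inside post_content.php:
$opts = getopt('', array('url:', 'body:')); // long options; the trailing ':' means a value is required
// ... then POST $opts['body'] to $opts['url'] with curl ...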
What you need to do is have the PHP script execute another script that does the server call, and then send the user the response.
You have to set up a middleman. So on your own server you would have:
A web form;
A submit handler (a PHP or Python script that handles the form submission).
Your handler creates a new file and fills it with the submission data. You can, for instance, format the data as JSON.
So your handler has a single job: save the submitted data in a file and respond to the user, nothing else. This should be fast; see the sketch after this list.
Create a filesystem-event-driven cron (not a time-driven cron). See these questions (for a Windows and an Ubuntu server, respectively).
Set your cron to execute a PHP or Python script which will then re-post the data to the remote server.
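The handler side might look like this (the spool directory is a placeholder):
// handler.php - save the submission and reply immediately
$file = '/var/spool/submissions/' . uniqid('sub', true) . '.json';
file_put_contents($file, json_encode($_POST));
echo 'ok'; // the event-driven cron picks the file up and re-posts it to the remote server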
You have to use fsockopen, and don't listen for the result:
<?php
$fp = fsockopen('example.com', 80);
$vars = array(
    'hello' => 'world'
);
$content = http_build_query($vars);
fwrite($fp, "POST /reposter.php HTTP/1.1\r\n");
fwrite($fp, "Host: example.com\r\n");
fwrite($fp, "Content-Type: application/x-www-form-urlencoded\r\n");
fwrite($fp, "Content-Length: " . strlen($content) . "\r\n");
fwrite($fp, "Connection: close\r\n");
fwrite($fp, "\r\n");
fwrite($fp, $content);
fclose($fp); // close without reading the response
Hookah is designed to solve your problem.
