How to keep a PHP script running even after the user leaves?

I have a PHP script that checks the last time a SQLite database has been updated (30 minutes timespan) when a user visits the page. If it has been longer than 30 minutes, then the script will pull new information into the database. However, I'm worried that the user might leave while the database is updating, therefore neglecting to update some of the entries. What can I do to keep the script executing even after the user leaves?
I've looked at some of the similar questions here and found people suggesting ignore_user_abort(), but there seem to be issues with that approach when data cannot be sent back to the client. Any other suggestions would be greatly appreciated. Thanks!

Asynchronous PHP call
One option would be to make an asynchronous PHP call (request). See Asynchronous PHP calls? for more information. But keep in mind that when doing this a lot, you are spawning a lot of background processes, which could kill your server.
P.S.: When you are on shared hosting, doing this kind of thing is generally not appreciated.
Message Queue
A much better way to do this is to use a message queue (MQ). You could use Redis or Beanstalkd, to name two popular MQs. Redis To Go (http://redistogo.com/) even provides free Redis instances. From the client/producer side (the user visiting your page) you add a message to the queue using RPUSH. The consumer - a PHP process running endlessly in the background (CLI) - retrieves messages from the queue using BLPOP and updates the SQLite database. Spawning processes is expensive, and a message queue avoids it.
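For illustration, here is a minimal sketch of that producer/consumer pair, assuming the phpredis extension and a Redis server on localhost; the queue name 'update_jobs' is made up for the example:
// producer (runs during the page request): enqueue and move on
$redis = new Redis();
$redis->connect('127.0.0.1', 6379);
$redis->rPush('update_jobs', json_encode(array('requested_at' => time())));

// consumer (a long-running CLI script, e.g. php consumer.php)
$redis = new Redis();
$redis->connect('127.0.0.1', 6379);
while (true) {
    // blPop blocks until a message arrives (timeout 0 = wait forever)
    $job = $redis->blPop(array('update_jobs'), 0);
    $payload = json_decode($job[1], true);
    // refresh the SQLite database here, free of any request time limit
}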

You can't send data back to the client once they have exited/stopped visiting your website. You can, however, open a socket, which avoids slowing down the client's page load:
$host = 'example.com'; // your own host
$fp = fsockopen($host, 80, $errno, $errstr, 10);
if (!$fp) {
    echo "$errstr ($errno)\n";
} else {
    $header  = "GET /cron.php HTTP/1.1\r\n";
    $header .= "Host: $host\r\n";
    $header .= "Connection: close\r\n\r\n";
    fputs($fp, $header);
    fclose($fp);
}
// do other stuff
This will signal /cron.php to do the work you want, and it also does not slow down the user's browsing experience ;)

Related

PHP Asynchronous process to call facebook SDK

I've been messing with this for far too long and would like a bit more help.
Basically a user logs in and I get their Facebook data (email, id, etc).
This alone takes around 25 seconds.
I've worked hard to reduce it to that.
Also I need to pull the friend list and avatar image.
Initially this used to take 4 minutes.
But after a lot of research I got it down to around 30 seconds.
So combined it takes around 50 seconds.
So I started to separate the friend list and avatar image processing to run asynchronously.
I tried using
Ajax call: Causes the user to wait for the process to end, i.e. when attempting to move to another page it hangs until the process is finished.
exec: Does not keep the Facebook login data, so it cannot call the Facebook API.
fsocket: Does not seem to run asynchronously; it still takes the same time.
So I'm not sure on my options.
I used this method to run the socket call, but I'm still getting logins at around 50 seconds. If I remove the call to this process it reduces to just over 20.
What am I doing wrong?
Does anyone have a solution for this?
For multiple reasons, ranging from internal features to mobile optimization, I need to gather the friend list and download the images.
The Ajax call is made via
$.ajax({
    type: "POST",
    url: url,
    data: data,
    cache: false,
    context: context
});
And the fsockopen call is made via
$host = getHost();
$sock = fsockopen($host, 80);
fwrite($sock, "GET " . getRoot() . "/updatefb.php HTTP/1.1\r\n");
fwrite($sock, "Host: $host\r\n");
fwrite($sock, "Cookie: PHPSESSID=" . $_COOKIE['PHPSESSID'] . "\r\n");
fwrite($sock, "Connection: close\r\n");
fwrite($sock, "\r\n");
fflush($sock);
fclose($sock);

Send HTTP request from PHP without waiting for response?

I want to have an HTTP GET request sent from PHP. Example:
http://tracker.example.com?product_number=5230&price=123.52
The idea is to do server-side web-analytics: Instead of sending tracking
information from JavaScript to a server, the server sends tracking
information directly to another server.
Requirements:
The request should take as little time as possible, in order to not
noticeably delay processing of the PHP page.
The response from the tracker.example.com does not need to be
checked. As examples, some possible responses from
tracker.example.com:
200: That's fine, but no need to check that.
404: Bad luck, but - again - no need to check that.
301: Although a redirect would be appropriate, it would delay
processing of the PHP page, so don't do that.
In short: All responses can be discarded.
Ideas for solutions:
In a now deleted answer, someone suggested calling command line
curl from PHP in a shell process. This seems like a good idea,
only that I don't know if forking a lot of shell processes under
heavy load is a wise thing to do.
I found php-ga, a package for doing server-side Google
Analytics from PHP. On the project's page, it is
mentioned: "Can be configured to [...] use non-blocking requests."
So far I haven't found the time to investigate what method php-ga
uses internally, but this method could be it!
In a nutshell: what is the best solution to do generic server-side tracking/analytics from PHP?
Unfortunately, PHP is blocking by definition. While this holds true for the majority of functions and operations you will normally be handling, the current scenario is different.
The process, which I like to call an HTTP ping, only requires you to touch a specific URI, forcing the specific server to bootstrap its internal logic. Some functions allow you to achieve something very similar to this HTTP ping by not waiting for a response.
Take note that pinging a URL is a two-step process:
Resolving the DNS
Making the request
While making the request should be rather fast once the DNS is resolved and the connection is made, there aren't many ways of making the DNS resolve faster.
Some ways of doing an HTTP ping are:
cURL, by setting CURLOPT_CONNECTTIMEOUT to a low value
fsockopen by closing immediately after writing
stream_socket_client (same as fsockopen) and also adding STREAM_CLIENT_ASYNC_CONNECT
Both cURL and fsockopen block while the DNS is being resolved; I have noticed that fsockopen is significantly faster, even in worst-case scenarios.
stream_socket_client on the other hand should fix the problem regarding DNS resolving and should be the optimal solution in this scenario, but I have not managed to get it to work.
One final solution is to start another thread/process that does this for you, either via a system call or by forking the current process. Unfortunately, neither is really safe in applications where you can't control the environment in which PHP is running: system calls are more often than not blocked, and pcntl is not enabled by default.
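For completeness, here is roughly what the stream_socket_client attempt mentioned above would look like; since I never got it working reliably, treat this as a sketch rather than a proven solution:
// fire-and-forget HTTP ping; STREAM_CLIENT_ASYNC_CONNECT asks the connect
// call to return immediately instead of waiting for the TCP handshake
$fp = stream_socket_client(
    'tcp://tracker.example.com:80',
    $errno,
    $errstr,
    1, // connect timeout in seconds
    STREAM_CLIENT_CONNECT | STREAM_CLIENT_ASYNC_CONNECT
);
if ($fp) {
    stream_set_blocking($fp, false);
    fwrite($fp, "GET /?product_number=5230&price=123.52 HTTP/1.1\r\n"
              . "Host: tracker.example.com\r\n"
              . "Connection: Close\r\n\r\n");
    fclose($fp);
}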
I would call tracker.example.com this way:
get_headers('http://tracker.example.com?product_number=5230&price=123.52');
and in the tracker script:
ob_end_clean();
ignore_user_abort(true);
ob_start();
header("Connection: close");
header("Content-Length: " . ob_get_length());
ob_end_flush();
flush();
// from here the response has been sent. you can now wait as long as you want and do some tracking stuff
sleep(5); //wait 5 seconds
do_some_stuff();
exit;
I implemented a function for a fast GET request to a URL without waiting for the response:
function fast_request($url)
{
    $parts = parse_url($url);
    $fp = fsockopen($parts['host'], isset($parts['port']) ? $parts['port'] : 80, $errno, $errstr, 30);
    if (!$fp) {
        return; // connection failed; nothing to do
    }
    $path = isset($parts['path']) ? $parts['path'] : '/';
    $out  = "GET " . $path . " HTTP/1.1\r\n";
    $out .= "Host: " . $parts['host'] . "\r\n";
    $out .= "Content-Length: 0\r\n";
    $out .= "Connection: Close\r\n\r\n";
    fwrite($fp, $out);
    fclose($fp);
}
We were using the fsockopen and fwrite combo, then one day it just stopped working, or worked only intermittently. After a little research and testing, and provided you have fopen wrappers enabled, I ended up using file_get_contents and stream_context_create with a timeout set to a hundredth of a second. The timeout parameter accepts floating-point values (https://www.php.net/manual/en/context.http.php). I wrapped it so it fails silently; note that on failure file_get_contents emits a warning rather than throwing an exception, so the @ suppression below does the actual silencing (the catch only fires if a custom error handler converts warnings to exceptions, and you can do logging there if needed). It works beautifully for our purposes. The timeout is the key if you don't want the function to block the runtime.
function fetchWithoutResponseURL($url)
{
    $context = stream_context_create([
        "http" => [
            "method"  => "GET",
            "timeout" => 0.01 // give up almost immediately; we don't need the response
        ]
    ]);
    try {
        // @ suppresses the warning emitted when the request times out or fails
        @file_get_contents($url, false, $context);
    } catch (Exception $e) {
        // Fail silently
    }
}
For those of you working with WordPress as a backend,
it is as simple as:
wp_remote_get( $url, array( 'blocking' => false ) );
Came here whilst researching a similar problem. If you have a database connection handy, another possibility is to quickly stuff the request details into a table, and then have a separate cron-based process that periodically scans that table for new records and makes the tracking request, freeing your web application from having to make the HTTP request itself.
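A minimal sketch of that idea, assuming a MySQL table named tracking_queue (the table, column, and credential names are invented for the example):
// during the web request: just record what should be sent, then return
$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');
$pdo->prepare('INSERT INTO tracking_queue (url, created_at) VALUES (?, NOW())')
    ->execute(array('http://tracker.example.com?product_number=5230&price=123.52'));

// in a script run from cron every minute: drain the queue
$rows = $pdo->query('SELECT id, url FROM tracking_queue')->fetchAll();
foreach ($rows as $row) {
    @file_get_contents($row['url']); // the response can be discarded
    $pdo->prepare('DELETE FROM tracking_queue WHERE id = ?')->execute(array($row['id']));
}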
You can use shell_exec and command-line curl.
For an example, see this question
You can actually do this using cURL directly.
I have implemented it both with a very short timeout (CURLOPT_TIMEOUT_MS) and with curl_multi_exec.
Be advised: eventually I quit this method because not every request was made correctly. This could have been caused by my own server, though I haven't been able to rule out cURL failing.
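For reference, a rough sketch of the curl_multi variant: register the request, give the multi handle a single pump with curl_multi_exec, and move on without waiting for completion (with the same caveat that delivery is not guaranteed):
$ch = curl_init('http://tracker.example.com?product_number=5230&price=123.52');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);

$mh = curl_multi_init();
curl_multi_add_handle($mh, $ch);

// start the transfer, but do not loop until it finishes
$running = 0;
curl_multi_exec($mh, $running);

// ... render the rest of the page here ...

curl_multi_remove_handle($mh, $ch);
curl_multi_close($mh);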
I needed to do something similar: just ping a URL and discard all responses. I used proc_open, which lets you close the process handle right away with proc_close. I'm assuming you have lynx installed on your server:
<?php
function ping($url) {
    // escapeshellarg() guards against shell injection via the URL
    $proc = proc_open("lynx " . escapeshellarg($url), [], $pipes);
    proc_close($proc);
}
?>
<?php
// Create a stream context
$opts = array(
    'http' => array(
        'method' => "GET",
        'header' => "Accept-language: en"
    )
);
$context = stream_context_create($opts);
// Fetch the URL using the HTTP headers set above
$file = file_get_contents('http://tracker.example.com?product_number=5230&price=123.52', false, $context);
?>

CodeIgniter RESTful, async / background process

I'm using the CodeIgniter RESTful API (https://github.com/philsturgeon/codeigniter-restserver), which returns information (JSON format) to my Android/iPhone app.
There is an operation where I send some values; if everything is OK, I return a 200 code as the response.
Now I want to add a new operation to the same method: sending notifications about these modifications with APNS (Apple Push Notification Service) and GCM (Google Cloud Messaging).
It works well when I have to send no more than 3-5 notifications. The problem is APNS, because I have to send these messages one by one and it takes a long time, so my app receives a timeout exception (all the notifications are sent, but the user gets the connection error...).
Can I send the 200 code response and then continue sending the notifications? (Something like this...)
function my_update_method_post() {
    // ... get my POST values
    update($data);
    $this->response(array('result' => 1), 200);
    // Send notifications
    ...
}
Thanks in advance...
I found a solution that works perfectly for me, because I don't expect any result value. If a notification can't be sent, I log it in my database.
This is the function I use to send an "async" request (yes, it is not truly asynchronous, but it works the way I'm looking for):
function curl_post_async($url, $params)
{
    $post_string = http_build_query($params);
    $parts = parse_url($url);
    $fp = fsockopen($parts['host'],
        isset($parts['port']) ? $parts['port'] : 80,
        $errno, $errstr, 30);
    if (!$fp) {
        // Perform whatever logging you want to have happen b/c this call failed!
        return;
    }
    $out  = "POST " . $parts['path'] . " HTTP/1.1\r\n";
    $out .= "Host: " . $parts['host'] . "\r\n";
    $out .= "Content-Type: application/x-www-form-urlencoded\r\n";
    $out .= "Content-Length: " . strlen($post_string) . "\r\n";
    $out .= "Connection: Close\r\n\r\n";
    $out .= $post_string;
    fwrite($fp, $out);
    fclose($fp);
}
Yes, this is possible.
You should look at PHP's exec() and this link. You can set up a function in your controller to be called from the command line, and then pass in an array of the GCM/APNS data to be used.
This solution is not ideal because you won't be able to tell the client whether all messages were sent successfully. You will send back 200 to say the request was received OK, and that is all.
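A sketch of that approach - the controller/method name and payload below are made up, but the key detail is real: redirecting output and appending & makes exec() return immediately instead of waiting for the command to finish:
// inside the REST controller, after sending the 200 response data:
$payload = escapeshellarg(json_encode($notification_data)); // hypothetical data
exec('php index.php notifications send ' . $payload . ' > /dev/null 2>&1 &');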
Since PHP doesn't natively support threads or asynchronous function calls, you will have to use a somewhat hacky solution.
Have a look at my question here: PHP file_get_contents() follow Content-length header
The solution is to send Connection: close and Content-Length headers, then make the client aware of these headers (see the link above). With curl, for example, the connection will be closed as soon as Content-Length is reached, but your PHP script still runs "in the background", so you can start time-consuming operations then.
Kind regards,
Stefan
P.S. If the script takes really long to execute, make sure that the PHP max execution time doesn't get in your way.
Take a look at this article. I like this solution much more than one where you have the client tell the server to hang up immediately; there are multiple benefits if you build this solution on the server side.
You know the server will continue processing once the client has disconnected
The client can still receive a response from the server
EDIT
I'd not realized OP doesn't have access to the service here. In this case, the article I've mentioned is of little value. The problem here is the server is taking a long time to respond and hanging the client up. For this I suggest curl_multi_init. This allows you to make a number of requests simultaneously.

Long delay for pdf generation & sending e-mail. Put an image while server parses scripts?

I'm working on a form in which you can order a courier service. The main idea is to generate a PDF file containing the validated data and then attach it to an e-mail to the client, plus a CC to the courier company.
The thing is, PDF generation (TCPDF) and e-mail sending (Swift Mailer) take noticeably long.
I would like to prevent an impatient user from clicking 'confirm' over and over. The ideal solution would be to show some 'loading' GIF or so. I've looked into jQuery's .load() function and it looks like a good fit for this problem, but what about users without JS?
Can you point me in the right direction?
I would use Ajax in this case, but if lack of JS is a genuine concern for you, there are a couple of ways to do this without JS.
First, you can display something to the user before starting to work on the PDF, e.g.
<?php
echo "<html><head></head><body> Please Wait..";
flush();
// process pdf here
?>
The flush() function forces the web server to send the data to the browser, so it is displayed while the rest of the page loads. However, some browsers buffer this internally. There are ways to overcome that (check out the comments for the flush() function on php.net), but it can get messy.
Another way is to process the PDF asynchronously. In this case the target script just displays a message that the email will be sent shortly, and fires the script which is actually going to send it, e.g.:
<?php
echo "<html><head></head><body> Email will be sent out shortly </body></html>";
// Rebuild the POST body from the original request parameters
$params = array();
foreach ($_POST as $name => $value) {
    $params[] = urlencode($name) . '=' . urlencode($value);
}
$post_string = implode('&', $params);
// Open a socket to this same server and fire the background request
$host = $_SERVER['HTTP_HOST'];
$fp = fsockopen($host, 80, $errno, $errstr, 30);
if ($fp) {
    $out  = "POST /process_pdf.php HTTP/1.1\r\n";
    $out .= "Host: " . $host . "\r\n";
    $out .= "Content-Type: application/x-www-form-urlencoded\r\n";
    $out .= "Content-Length: " . strlen($post_string) . "\r\n";
    $out .= "Connection: Close\r\n\r\n";
    $out .= $post_string;
    fwrite($fp, $out);
    fclose($fp);
}
?>
In this example process_pdf.php gets all the same POST parameters as the original script, but is executed in the background without interrupting the original page.
More than 98% of users have JavaScript enabled. The remaining 2% are made up of:
Utterly nerdy people who know how to disable it and want to do it: they probably won't click on the submit button as if it were going out of style, since they'll notice the default browser loading indicators;
People with sight problems, who use screen readers and won't benefit much from the image anyway.
JS is mostly taken for granted nowadays, and it's a web standard. I'd say don't worry and go for it - there is no other way to do it. Still, for the blind, it would be nice to have a <noscript> tag with information such as "The operation might take some seconds".
Another approach would be to do the processing (creating the PDF and sending the email) in a background job. When your user clicks 'Confirm', the background job is started with the input from the form, and the user is shown a message that the job is being processed and that he or she will receive a confirmation by email shortly. This way, you avoid relying on JavaScript altogether.
You could use something like Beanstalkd for this.
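For example, with Beanstalkd and the Pheanstalk client library (the tube name is made up, and the constructor differs between Pheanstalk versions), queuing and consuming the job could look roughly like this:
// on form submit: enqueue the job and show the confirmation page immediately
$pheanstalk = new Pheanstalk\Pheanstalk('127.0.0.1');
$pheanstalk->useTube('pdf-emails')->put(json_encode($_POST));

// in a long-running worker process:
$pheanstalk = new Pheanstalk\Pheanstalk('127.0.0.1');
while ($job = $pheanstalk->watch('pdf-emails')->reserve()) {
    $data = json_decode($job->getData(), true);
    // generate the PDF with TCPDF and send the email with Swift Mailer here
    $pheanstalk->delete($job);
}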

Continue PHP execution after sending HTTP response

How can I have PHP 5.2 (running as apache mod_php) send a complete HTTP response to the client, and then keep executing operations for one more minute?
The long story:
I have a PHP script that has to execute a few long database requests and send e-mail, which takes 45 to 60 seconds to run. This script is called by an application that I have no control over. I need the application to report any error messages received from the PHP script (mostly invalid parameter errors).
The application has a timeout delay shorter than 45 seconds (I do not know the exact value) and therefore registers every execution of the PHP script as an error. Therefore, I need PHP to send the complete HTTP response to the client as fast as possible (ideally, as soon as the input parameters have been validated), and then run the database and e-mail processing.
I'm running mod_php, so pcntl_fork is not available. I could work my way around this by saving the data to be processed to the database and run the actual process from cron, but I'm looking for a shorter solution.
I had this snippet in my "special scripts" toolbox, but it got lost (clouds were not common back then). While searching for it I came upon this question and was surprised to see it missing, so I searched some more and came back here to post it:
<?php
ob_end_clean();
header("Connection: close");
ignore_user_abort(); // optional
ob_start();
echo ('Text the user will see');
$size = ob_get_length();
header("Content-Length: $size");
ob_end_flush(); // Strange behaviour, will not work
flush(); // Unless both are called !
session_write_close(); // Added a line suggested in the comment
// Do processing here
sleep(30);
echo('Text user will never see');
?>
I actually use it in a few places, and it totally makes sense there: a banklink returns the request of a successful payment, and I have to call a lot of services and process a lot of data when that happens. That sometimes takes more than 10 seconds, yet the banklink has a fixed timeout period. So I acknowledge the banklink and show him the way out, and do my stuff when he is already gone.
Have the script that handles the initial request create an entry in a processing queue, and then immediately return. Then, create a separate process (via cron maybe) that regularly runs whatever jobs are pending in the queue.
What you need is this kind of setup
One can use an "HTTP fork" to oneself or to any other script. I mean something like this:
// parent script, called by a user request from the browser
// create socket for calling the child script
$socketToChild = fsockopen("localhost", 80);
// HTTP-packet building; header first
$msgToChild = "POST /script.php?param=value&<more params> HTTP/1.0\r\n";
$msgToChild .= "Host: localhost\r\n";
$postData = "Any data for child as POST-query";
$msgToChild .= "Content-Length: " . strlen($postData) . "\r\n\r\n";
// header done, glue with data
$msgToChild .= $postData;
// send the packet to our own www-server - a new process will be created to handle the query
fwrite($socketToChild, $msgToChild);
// wait and read the answer from the child
$data = fread($socketToChild, 8192); // read up to 8 KB of the reply
// close connection to child
fclose($socketToChild);
...
Now the child script:
// parse HTTP-query somewhere and somehow before this point
// "disable partial output" or
// "enable buffering" to give out all at once later
ob_start();
// "say hello" to client (parent script in this case) disconnection
// before child ends - we need not care about it
ignore_user_abort(1);
// we will work forever
set_time_limit(0);
// we need to say something to parent to stop its waiting
// it could be something useful like client ID or just "OK"
...
echo $reply;
// push buffer to parent
ob_flush();
// parent gets our answer and disconnects
// but we can work "in background" :)
...
The main idea is:
the parent script is called by the user request;
the parent calls the child script (the same as the parent, or another one) on the same server (or any other server) and passes the request data to it;
the parent says OK to the user and ends;
the child does the work.
If you need to interact with the child, you can use the DB as a "communication medium": the parent may read the child's status and write commands, and the child may read commands and write status. If you need this for several child scripts, keep a child id on the user side to discriminate between them, and send that id to the parent each time you want to check the status of the respective child.
I've found that here - http://linuxportal.ru/forums/index.php/t/22951/
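As a sketch of the DB-based handshake described above (the table and column names are invented; $childId is whatever id you generated when spawning the child):
// parent: poll the child's status
$pdo = new PDO('sqlite:jobs.db');
$stmt = $pdo->prepare('SELECT status FROM child_jobs WHERE id = ?');
$stmt->execute(array($childId));
$status = $stmt->fetchColumn();

// child: report progress as it works
$pdo->prepare('UPDATE child_jobs SET status = ? WHERE id = ?')
    ->execute(array('working', $myId));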
What about calling a script on the server to execute as if it had been triggered at the command line? You can do this with PHP's exec.
You can use the PHP function register_shutdown_function, which will execute something after the script has completed its dialog with the browser.
See also ignore_user_abort - but you shouldn't need that function if you use register_shutdown_function. On the same page, set_time_limit(0) will prevent your script from timing out.
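A rough sketch of that combination; note it targets the asker's PHP 5.2, so a named callback is used instead of a closure, and whether the browser is released before the shutdown function finishes still depends on the SAPI and output buffering:
function heavy_processing() {
    // runs after the script has completed its dialog with the browser:
    // do the slow database work and send the e-mails here
}
ignore_user_abort(true);
set_time_limit(0);
register_shutdown_function('heavy_processing');
echo 'Request accepted';
// the script ends here; heavy_processing() then fires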
Using a queue, exec or cron would be overkill for this simple task.
There is no reason not to stay within the same script.
This combination worked great for me:
ignore_user_abort(true);
$response = "some response";
header("Connection: close");
header("Content-Length: " . mb_strlen($response));
echo $response;
flush(); // releasing the browser from waiting
// continue the script with the slow processing here...
read more in:
How to continue process after responding to ajax request in PHP?
It is possible to use cURL for that, with a very short timeout. This would be your main file:
<?php
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, "http://example.com/processor.php");
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
curl_setopt($ch, CURLOPT_TIMEOUT_MS, 10); //just some very short timeout
curl_exec($ch);
curl_close($ch);
?>
And this your processor file:
<?php
ignore_user_abort(true); //very important!
for ($x = 0; $x < 10; $x++) { // do some very time-consuming task
    sleep(10);
}
?>
As you can see, the upper script will time out after a short while (10 milliseconds in this case). It is possible that CURLOPT_TIMEOUT_MS will not work like this; in that case, the equivalent would be curl_setopt($ch, CURLOPT_TIMEOUT, 1).
So when the processor file has been accessed, it will do its tasks no matter that the user (i.e. the calling file) aborts the connection.
Of course you can also pass GET or POST parameters between the pages.
You can create an HTTP request from server to server (no browser is needed).
The secret to creating a background HTTP request is setting a very small timeout, so the response is ignored.
This is a working function that I have used for that purpose:
PHP asynchronous background request
Another way to create an asynchronous request in PHP (simulating background mode).
/**
 * Another way to make an asynchronous request with PHP.
 * With this you can simulate a fork in PHP - nothing to envy Java or C++ for.
 * This time using fsockopen.
 * @author PHPepe
 * @param string $url
 * @param array  $params
 */
function phpepe_async($url, $params = array()) {
    $post_params = array();
    foreach ($params as $key => &$val) {
        if (is_array($val)) $val = implode(',', $val);
        $post_params[] = $key . '=' . urlencode($val);
    }
    $post_string = implode('&', $post_params);

    $parts = parse_url($url);
    $fp = fsockopen($parts['host'],
        isset($parts['port']) ? $parts['port'] : 80,
        $errno, $errstr, 30);

    $out  = "POST " . $parts['path'] . " HTTP/1.1\r\n";
    $out .= "Host: " . $parts['host'] . "\r\n";
    $out .= "Content-Type: application/x-www-form-urlencoded\r\n";
    $out .= "Content-Length: " . strlen($post_string) . "\r\n";
    $out .= "Connection: Close\r\n\r\n";
    $out .= $post_string;
    fwrite($fp, $out);
    fclose($fp);
}
// Usage:
phpepe_async("http://192.168.1.110/pepe/feng_scripts/phprequest/fork2.php");
For more info you can take a look at
http://www.phpepe.com/2011/05/php-asynchronous-background-request.html
You can split these functions into three scripts.
1. The first initiates the process and calls the second via exec or a command; this can also be run via an HTTP call.
2. The second runs the database processing and, at the end, starts the last one.
3. The last one sends the email.
Bah, I misunderstood your requirements. Looks like they're actually:
Script receives input from an external source you do not control
Script processes and validates the input, lets the external app know whether it is good or not, and terminates the session.
Script kicks off a long-running process.
In this case, then yes, using an outside job queue and/or cron would work. After the input is validated, insert the job details into the queue and exit. Another script can then run, pick up the job details from the queue, and kick off the longer process. Alex Howansky has the right idea.
Sorry, I admit I skimmed a bit the first time around.
I would recommend spawning a new async request at the end, rather than continuing the process with the user.
You can spawn the other request using the answer here:
Asynchronous PHP calls?
In your Apache php.ini config file, make sure that output buffering is disabled:
output_buffering = off
