How can I run PHP code asynchronously without waiting? I have a long-running (almost infinite) task that should start when the server starts and should process asynchronously without waiting.
The possible options I guess are:
Running the code in a web page and keeping it open so it performs the task
Calling the script from some command-line utility (I am not sure how) so it processes in the background.
I am running the PHP scripts on my local server which will send emails when certain events occur, e.g. birthday reminders.
Please suggest how I can achieve this without opening the page in a browser.
If you wanted to run it from the browser (perhaps you're not familiar with the command line), you could still do it. I researched many solutions for this a few months ago, and the most reliable and simplest to implement was the following, from How to post an asynchronous HTTP request in PHP:
<?php
$params['my_param'] = $a_value;
post_async('http://localhost/batch/myjob.php', $params);

/*
 * Executes a PHP page asynchronously so the current page
 * does not have to wait for it to finish running.
 */
function post_async($url, array $params)
{
    $post_params = array();
    foreach ($params as $key => &$val) {
        if (is_array($val)) $val = implode(',', $val);
        $post_params[] = $key.'='.urlencode($val);
    }
    $post_string = implode('&', $post_params);

    $parts = parse_url($url);
    $fp = fsockopen($parts['host'],
        isset($parts['port']) ? $parts['port'] : 80,
        $errno, $errstr, 30);
    if (!$fp) return; // could not open the socket

    $out  = "POST ".$parts['path']." HTTP/1.1\r\n";
    $out .= "Host: ".$parts['host']."\r\n";
    $out .= "Content-Type: application/x-www-form-urlencoded\r\n";
    $out .= "Content-Length: ".strlen($post_string)."\r\n";
    $out .= "Connection: Close\r\n\r\n";
    $out .= $post_string;
    fwrite($fp, $out);
    fclose($fp);
}
Let's say the file above is in your web root directory (/var/www) for example and is called runjobs.php. By visiting http://localhost/runjobs.php your myjob.php file would start to run. You'd probably want to add some output to the browser to let you know it was submitted successfully and it wouldn't hurt to add some security if your web server is open to the rest of the world. One nice thing about this solution if you add some security is that you can start the job anywhere you can find a browser.
Definitely sounds like a job for a cron task. You can set up a php script to do your task once and have the cron run as often as you like. Here's a good writeup on how to have a php script run as a cron task; it's very easy to do.
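For instance, assuming the reminder script lives at /var/www/birthday-reminders.php (a hypothetical path), a crontab entry that runs it once a day at 07:00 could look like this:

```shell
# crontab -e
# min hour day month weekday  command
0 7 * * * /usr/bin/php /var/www/birthday-reminders.php >> /var/log/reminders.log 2>&1
```

The redirection keeps a log of each run; adjust the schedule fields to taste.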
This isn't really what PHP is designed for. You would have to use the PECL threading library to spin off threads that run asynchronously, and I don't recommend it. The new hotness in the async department is node.js - I recommend you look into that and see if you can utilize it. It's designed for lightweight, asynchronous network operations, and it can be used to fire PHP scripts.
How can I run PHP code asynchronously without waiting? I have a long run (almost infinite) that should run while the server starts and should process asynchronously without waiting.
Assuming a typical LAMP system, you can start a PHP daemon from the command line with
root# php yourscript.php &
where yourscript.php contains something similar to
<?php
$log = fopen('/var/log/yourscript.log', 'a+');

// ### check if we are already running omitted

while (true) {
    // do interesting stuff and log it.

    // don't be a CPU hog
    sleep(1);
}
?>
Embellishments:
To make your script directly executable: chmod +x yourscript.php
and add #!/usr/bin/php to the beginning of yourscript
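If you also want the daemon to survive your terminal session ending, a sketch of starting it detached might look like this (the paths and log location are examples, not from the original answer):

```shell
# make the daemon script directly executable
chmod +x yourscript.php
# start it detached from the terminal, appending output to a log
nohup ./yourscript.php >> /var/log/yourscript.log 2>&1 &
# remember the PID so the apache stop hook can kill it later
echo $! > /var/run/yourscript.pid
```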
To start with apache, you should add that command to your apache startup script (usually apachectl) and be sure to add code to kill it when apache stops.
The check for whether you are already running involves writing a file containing your PID to /var/lock/
and something like system('/bin/ps '.$thePID); to see whether that process still exists. It also makes the kill instruction easier to write.
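A minimal sketch of that already-running check using a PID file. The temp-dir path and the use of posix_kill are my assumptions; the original only hints at a file in /var/lock/ and a ps call:

```php
<?php
// Hypothetical PID-file guard; the path is an example, not from the original.
$pidFile = sys_get_temp_dir().'/yourscript.pid';

if (file_exists($pidFile)) {
    $oldPid = (int) file_get_contents($pidFile);
    // Signal 0 sends nothing; it only tests whether the process exists.
    if ($oldPid > 0 && function_exists('posix_kill') && posix_kill($oldPid, 0)) {
        fwrite(STDERR, "Already running as PID $oldPid\n");
        exit(1);
    }
}

file_put_contents($pidFile, getmypid()); // claim the lock

// Remove the PID file when the daemon exits, so restarts are clean.
register_shutdown_function(function () use ($pidFile) {
    @unlink($pidFile);
});
```

Having the PID on disk also makes the kill instruction in the Apache stop hook a one-liner.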
Thanks Todd Chaffee, but it was not working for me, so I edited your code. I hope you don't mind; maybe it will also help others with this technique.
cornjobpage.php //mainpage
<?php
// If you want to call the page multiple times w.r.t. an array,
// uncomment the loop start & end.

//foreach ($inputkeywordsArr as $singleKeyword) {
$url = "http://localhost/projectname/testpage.php";
$params['Keywordname'] = "testValue"; //$singleKeyword
post_async($url, $params);
//} //end foreach ($inputkeywordsArr)
?>
<?php
/*
* Executes a PHP page asynchronously so the current page does not have to wait for it to finish running.
*
*/
function post_async($url, array $params)
{
    $post_params = array();
    foreach ($params as $key => &$val) {
        if (is_array($val)) $val = implode(',', $val);
        $post_params[] = $key.'='.urlencode($val);
    }
    $post_string = implode('&', $post_params);

    $parts = parse_url($url);
    $fp = fsockopen($parts['host'],
        isset($parts['port']) ? $parts['port'] : 80,
        $errno, $errstr, 30);
    if (!$fp) return; // could not open the socket

    $out  = "GET ".$parts['path']."?$post_string"." HTTP/1.1\r\n"; // you can use POST instead of GET if you like
    $out .= "Host: ".$parts['host']."\r\n";
    $out .= "Content-Type: application/x-www-form-urlencoded\r\n";
    $out .= "Content-Length: ".strlen($post_string)."\r\n";
    $out .= "Connection: Close\r\n\r\n";
    fwrite($fp, $out);
    fclose($fp);
}
?>
testpage.php
<?php
echo $_REQUEST["Keywordname"]; // Output > testValue
?>
Related
I have a web application I'm working on where the user clicks a button to initiate an AJAX request to a PHP file. The PHP file takes a very long time to run, which forces the user to wait until it has finished before being notified that the request has completed.
Here is a sample of the code I'm using:
jQuery caller:
$('#button').click(function(){
    $.ajax({
        type: "POST",
        url: 'index-process.php',
        success: function (data) {
            console.log(data);
            alert('finished');
        }
    });
});
index-process.php
<?php
/// placeholder for a long script
sleep(60);
echo "finished processing";
?>
I'm looking for some kind of workaround that will notify the user that the request was submitted, then allow the code to finish running in the background. I don't necessarily need my PHP script to return any values to the user; it just needs to execute.
So far I've tried something like this, using a cURL request with two different PHP files, but it still forces the user to wait until both files have finished running before the AJAX request completes:
index-process.php
<?php
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, 'index-process2.php');
curl_setopt($ch, CURLOPT_WRITEFUNCTION, 'do_nothing');
curl_exec($ch);
curl_close($ch);
echo "finished processing";
function do_nothing($curl, $input) {
    return 0; // aborts the transfer with an error
}
?>
index-process2.php
<?php
ob_end_clean();
ignore_user_abort(true);
ob_start();
header("Connection: close");
header("Content-Length: " . ob_get_length());
ob_end_flush();
flush();
/// placeholder for a long script
sleep(60);
?>
If there isn't a good solution using PHP, is there a possible solution using jQuery? If so, could somebody show me the proper way of coding this?
fsockopen is the best answer I could come up with on this one. Hope this helps somebody down the road. The following code allows me to call file1.php via AJAX, which will send data to file2.php. Once file1.php sends data to file2.php using fsockopen, it does not wait for a response. Additional code can run in file1.php while file2.php is doing its thing, and file1.php can echo a response to the AJAX request right away, even while file2.php is still running.
This answer is only useful for someone who needs to run a long-executing script that takes data in but returns none.
file1.php called via ajax request:
$vars = array('hello' => 'world');
$post_data = http_build_query($vars);

/// ssl:// and port 443 are used for https; drop the prefix and use port 80 for plain http
/// only the main website domain is needed here,
/// do not put the full path to the file you need to call
$fp = fsockopen("ssl://www.main-website-domain.com", 443, $errno, $errstr, 1);
if (!$fp) {
    echo "$errstr ($errno)<br />\n";
} else {
    //// this is where the full path to the file you want to reach goes
    //// format is (method) (path not including the domain) (HTTP version)
    $out  = "POST /Full/Path/to-test-file2.php HTTP/1.1\r\n";
    $out .= "Host: www.main-website-domain.com\r\n";
    $out .= "Content-Type: application/x-www-form-urlencoded\r\n";
    $out .= "Content-length: " . strlen($post_data) . "\r\n";
    $out .= "User-Agent: What-ever-you-want\r\n";
    $out .= "Connection: close\r\n\r\n";
    $out .= $post_data . "\r\n\r\n";
    fwrite($fp, $out);
    fclose($fp);
}
file2.php
$data = $_POST['hello'];
ignore_user_abort(true);
sleep(60);///example of a long running script
// since no data will be returned it's best to store any results in a database
// echo, print_r, var_dump, or any other display mechanism will not work in this file unless directly accessed
For people who use php-fpm (meaning nginx, or Apache via mod_fcgi), you can use fastcgi_finish_request to achieve the described behavior.
What it does
fastcgi_finish_request will send the output to the client (browser) but it'll execute all code specified after the function.
Example
echo "Request accepted";
fastcgi_finish_request(); // sends "Request accepted" back to the web server, but executes all code below
// Do your long running task
sleep(60);
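Since fastcgi_finish_request() only exists under PHP-FPM, a hedged guard for code that may also run under CLI or mod_php could look like this (the flush fallback is my assumption, not part of the original answer):

```php
<?php
// Finish the response early if the SAPI supports it; otherwise best-effort flush.
function finish_response_early()
{
    if (function_exists('fastcgi_finish_request')) {
        fastcgi_finish_request(); // PHP-FPM: client receives the response now
    } else {
        // Non-FPM fallback: flush whatever output we have so far.
        @ob_end_flush();
        flush();
    }
}

echo "Request accepted";
finish_response_early();
sleep(1); // the long-running task continues after the client has its response
```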
I have a mail address on my server which receives emails and pipes them to a PHP script, piper.php. This script then parses the email and correctly retrieves the from, to, subject, and message parts. Once that is complete, I'd like this script to post the information to a remote CakePHP website to be stored in a MySQL database. I have all of this working except for the posting part.
To post, I use the code given at this link using cURL, but I have two questions (the code is below for reference only):
function curl_post_async($url, $params)
{
    $post_params = array();
    foreach ($params as $key => &$val) {
        if (is_array($val)) $val = implode(',', $val);
        $post_params[] = $key.'='.urlencode($val);
    }
    $post_string = implode('&', $post_params);

    $parts = parse_url($url);
    $fp = fsockopen($parts['host'],
        isset($parts['port']) ? $parts['port'] : 80,
        $errno, $errstr, 30);
    pete_assert(($fp != 0), "Couldn't open a socket to ".$url." (".$errstr.")");

    $out  = "POST ".$parts['path']." HTTP/1.1\r\n";
    $out .= "Host: ".$parts['host']."\r\n";
    $out .= "Content-Type: application/x-www-form-urlencoded\r\n";
    $out .= "Content-Length: ".strlen($post_string)."\r\n";
    $out .= "Connection: Close\r\n\r\n";
    $out .= $post_string;
    fwrite($fp, $out);
    fclose($fp);
}
The code uses URL encoding. I am concerned that this may not work if, for example, my email message exceeds 1 MB of text. Aren't there limits on the size of data that can be sent via URL encoding?
Is there a better way of doing this? Should I set up a Perl script to connect directly to the remote MySQL database and write to it? Suggestions are welcome.
It's not cURL.
RFC 2616 does not specify any limit on POST length.
The server has its own POST restrictions (Apache, for example: http://httpd.apache.org/docs/2.2/mod/core.html#LimitRequestBody), and your PHP instance may have its own limits: http://php.net/manual/en/ini.core.php#ini.post-max-size.
You could write a PHP (or Perl, or Python, or Bash; it does not matter) script to connect to MySQL directly, but you would have to allow connections to your database from outside. Remember that MySQL has a restriction too: http://dev.mysql.com/doc/refman/5.5/en/server-system-variables.html#sysvar_max_allowed_packet.
It's up to you to decide what to do. Both ways seem legitimate.
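If you go the direct-database route, a minimal PDO sketch might look like this. The host, credentials, and the emails table with its column names are all assumptions for illustration, not from the original post:

```php
<?php
// Hypothetical direct insert of the parsed mail into the remote MySQL database.
function store_email(PDO $db, $from, $to, $subject, $message)
{
    $stmt = $db->prepare(
        'INSERT INTO emails (mail_from, mail_to, subject, message)
         VALUES (?, ?, ?, ?)'
    );
    // Prepared statements avoid any URL-encoding size concerns entirely
    // (subject only to max_allowed_packet on the MySQL side).
    return $stmt->execute(array($from, $to, $subject, $message));
}

// Usage sketch (connection details are placeholders):
// $db = new PDO('mysql:host=remote.example.com;dbname=cakeapp', 'user', 'pass');
// store_email($db, $from, $to, $subject, $message);
```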
We use REST for logging - we need more info than a simple text log can provide. So we implemented a REST service that accepts some basic information about the event and then asynchronously fetches more detail from the event's source website. So it's practically a ping with very little additional info. The logger machine then verifies the API key, connects to the DB, and writes the basic info along with pointers to the more detailed information.
We have a problem though, that logging may slow down the app a lot, because the connection and waiting for answer takes quite some time (it's fast but when you log 10+ events in one request it's a problem).
The question is:
Is there a way in plain PHP (5.3) to make a request to a URL and NOT wait for any answer, or just wait for the HTTP 200 header to be sure?
I think I can make the logging server send the HTTP 200 header as soon as it gets the request. But it then needs to do some more work ;)
What you want is called an asynchronous request.
A solution could look like this (quoted):
function curl_post_async($url, $params)
{
    $post_params = array();
    foreach ($params as $key => &$val) {
        if (is_array($val)) $val = implode(',', $val);
        $post_params[] = $key.'='.urlencode($val);
    }
    $post_string = implode('&', $post_params);

    $parts = parse_url($url);
    $fp = fsockopen($parts['host'],
        isset($parts['port']) ? $parts['port'] : 80,
        $errno, $errstr, 30);
    if (!$fp) return; // could not open the socket

    $out  = "POST ".$parts['path']." HTTP/1.1\r\n";
    $out .= "Host: ".$parts['host']."\r\n";
    $out .= "Content-Type: application/x-www-form-urlencoded\r\n";
    $out .= "Content-Length: ".strlen($post_string)."\r\n";
    $out .= "Connection: Close\r\n\r\n";
    $out .= $post_string;
    fwrite($fp, $out);
    fclose($fp);
}
If you need more information, take a look at:
http://petewarden.typepad.com/searchbrowser/2008/06/how-to-post-an.html
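An alternative sketch uses cURL with a very short timeout: fire the request, then give up waiting after ~100 ms. The URL here is a placeholder, and note the caveat that the request body may be cut off if the timeout fires before it is fully sent, so this is less reliable than the raw-socket approach above:

```php
<?php
// Hypothetical fire-and-forget POST via cURL; the caller ignores the response.
function curl_fire_and_forget($url, array $params)
{
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_POST, true);
    curl_setopt($ch, CURLOPT_POSTFIELDS, http_build_query($params));
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_setopt($ch, CURLOPT_NOSIGNAL, 1);      // needed for sub-second timeouts
    curl_setopt($ch, CURLOPT_TIMEOUT_MS, 100);  // don't wait for the response
    curl_exec($ch); // "fails" with a timeout error, which we deliberately ignore
    curl_close($ch);
}

// Usage sketch (placeholder URL):
// curl_fire_and_forget('http://logger.example.com/log.php', array('event' => 'x'));
```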
Here are two tricks that may be helpful.
I. Continue executing after the HTTP connection is closed. This means you can close the connection immediately once the main function has finished, then continue with your log processing.
<?php
ignore_user_abort(true); // keep PHP running even if the client disconnects
ob_start(); // start output buffering

echo "show something to user"; // do something you need -- your main function

session_write_close(); // close the session file on the server side if needed
header("Content-Encoding: none"); // prevent the browser treating the content as gzip
header("Content-Length: ".ob_get_length()); // send the length header
header("Connection: close"); // or redirect to some url
ob_end_flush(); flush(); // really send the content; the order matters: 1. ob buffer to normal buffer, 2. normal buffer to output

// continue doing something on the server side
ob_start();
sleep(5); // the user won't wait for these 5 seconds
echo 'for log'; // the user can't see this
file_put_contents('/tmp/process.log', ob_get_contents());
// or call a remote server like http://your.log.server/log.php?xxx=yyy&aaa=bbb
ob_end_clean();
?>
II. Using the function apache_note to write logs is a much lighter-weight choice than inserting into a DB, because Apache opens the log file once and keeps the handle for as long as it runs. It's stable and really fast.
Apache configuration:
<VirtualHost *:80>
DocumentRoot /path/to/your/web
ServerName your.domain.com
ErrorLog /path/to/your/log/error_log
<Directory /path/to/your/web>
AllowOverride All
Order allow,deny
Allow from all
</Directory>
SetEnvIf Request_URI "/log\.php" mylog
LogFormat "%{mylog}n" log1
CustomLog "|/usr/sbin/cronolog /path/to/logs/mylog/%Y%m%d/mysite.%Y%m%d%H.log" log1 env=mylog
</VirtualHost>
PHP code:
<?php
apache_note('mylog', session_id()); //you can log any data you need, see http://www.php.net/manual/en/function.apache-note.php
Then you can combine tricks I and II: call the URL http://your.log.server/log.php?xxx=yyy&aaa=bbb to log your detailed data after the connection of the main page is closed. No extra time cost at all.
Please see the edits at the bottom for additional information!
I have two servers. Both should be able to call each other with a GET request.
To make the request (it's more firing an event than making a request, actually) I am using this code:
function URLCallAsync($url, $params, $type = 'POST')
{
    $post_params = array();
    foreach ($params as $key => &$val) {
        if (is_array($val)) $val = implode(',', $val);
        $post_params[] = $key.'='.urlencode($val);
    }
    $post_string = implode('&', $post_params);

    $parts = parse_url($url);
    $fp = fsockopen($parts['host'],
        isset($parts['port']) ? $parts['port'] : 80,
        $errno, $errstr, 30);

    // Data goes in the path for a GET request
    if ('GET' == $type) $parts['path'] .= '?'.$post_string;

    $out  = "$type ".$parts['path']." HTTP/1.1\r\n";
    $out .= "Host: ".$parts['host']."\r\n";
    $out .= "Content-Type: application/x-www-form-urlencoded\r\n";
    $out .= "Content-Length: ".strlen($post_string)."\r\n";
    $out .= "Connection: Close\r\n\r\n";

    // Data goes in the request body for a POST request
    if ('POST' == $type && isset($post_string)) $out .= $post_string;

    fwrite($fp, $out);
    fclose($fp);
}
I feed the function the exact same data (except the URL) on both servers (I copied the calling file to test it!), but it only works in one direction!
I write the calls to that function in a log file so I can investigate if something is going wrong.
Server A -> Server B works exactly as it should; the logfile at server A contains the correct URL.
Server B -> Server A only prints the correct information in the logfile of server B, but server A never receives the request.
What could be the reason for something like this?
edit:
Could it be the different kinds of server?
Server A is nginx, Server B is Apache.
Server A also has a '~' symbol in its URL; maybe that's the problem?
The parameters of the GET request are encoded with PHP's urlencode; maybe that creates problems?
I tried around a bit, but the problem is still that the request isn't coming through to Server A. From a browser, however, it somehow works perfectly (assuming I enter the correct URL with the parameters).
edit2:
If I exchange "URLCallAsync" with "file_get_contents" it works like it should. But the problem is that file_get_contents is blocking!
So it can only be the function itself. But strangely it works in the opposite direction :(
edit3:
The function URLCallAsync runs through without an error, notice, or anything else.
The request just isn't received by the other server.
What exactly is file_get_contents doing so differently?
I got it working.
After a lot of fiddling with Wireshark, I found that file_get_contents is even simpler than my async function!
It simply omits the Content-Length field completely! It just provides "GET ..." and "Host".
It also uses HTTP/1.0 instead of 1.1, but that didn't change anything.
So the solution is: also sending the Content-Length header (which had the value 0, since I used GET) will somehow make the server reject the request. I don't know for sure whether it was the server that rejected the request or something else, like a firewall that detected a "malformed" request, but at least the problem is solved.
So next time you send requests, don't provide the Content-Length header if you don't need it :)
I would like to set up a cron job in cPanel to run these different pages. I thought it would be easier to put them in one file. I know how to set these up to run individually, but written as below, it won't run.
What do I need to change to get it to run smoothly?
<?php
ini_set('max_execution_time', 18000);
exec('/usr/bin/php -q /home2/sample/public_html/linktest/myapp-page1.php');
sleep (120);
exec('/usr/bin/php -q /home2/sample/public_html/linktest/myapp-page2.php');
sleep (120);
exec('/usr/bin/php -q /home2/sample/public_html/linktest/myapp-page3.php');
sleep (120);
exec('/usr/bin/php -q /home2/sample/public_html/linktest/myapp-page4.php');
sleep (120);
exec('/usr/bin/php -q /home2/sample/public_html/linktest/myapp-page5.php');
sleep (120);
echo 'Cron ran successfully';
?>
Thanks!
Or you could use wget and load a set of URLs from a file
wget -i CronScripts.txt
These would have to be accessible from the outside world, though.
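For example, CronScripts.txt would simply list one URL per line (the domain here is a placeholder), and the cron job would invoke wget against it:

```shell
# create the URL list (placeholder URLs, one per line)
cat > CronScripts.txt <<'EOF'
http://example.com/linktest/myapp-page1.php
http://example.com/linktest/myapp-page2.php
EOF
# fetch every URL in the file quietly, discarding the downloaded output
wget -i CronScripts.txt -q -O /dev/null
```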
Is it possible you are running in safe_mode and not allowed to modify max_execution_time?
You can use this technique to call as many pages as you like. All pages will run at once, independently, without waiting for each page's response - asynchronously.
cornjobpage.php //mainpage
<?php
post_async("http://localhost/projectname/testpage.php", "Keywordname=testValue");
//post_async("http://localhost/projectname/testpage.php", "Keywordname=testValue2");
//post_async("http://localhost/projectname/otherpage.php", "Keywordname=anyValue");
//call as many as pages you like all pages will run at once independently without waiting for each page response as asynchronous.
?>
<?php
/*
* Executes a PHP page asynchronously so the current page does not have to wait for it to finish running.
*
*/
function post_async($url, $params)
{
    $post_string = $params;

    $parts = parse_url($url);
    $fp = fsockopen($parts['host'],
        isset($parts['port']) ? $parts['port'] : 80,
        $errno, $errstr, 30);
    if (!$fp) return; // could not open the socket

    $out  = "GET ".$parts['path']."?$post_string"." HTTP/1.1\r\n"; // you can use POST instead of GET if you like
    $out .= "Host: ".$parts['host']."\r\n";
    $out .= "Content-Type: application/x-www-form-urlencoded\r\n";
    $out .= "Content-Length: ".strlen($post_string)."\r\n";
    $out .= "Connection: Close\r\n\r\n";
    fwrite($fp, $out);
    fclose($fp);
}
?>
testpage.php
<?php
echo $_REQUEST["Keywordname"]; // case1 Output > testValue
?>
PS:if you want to send url parameters as loop then follow this answer :https://stackoverflow.com/a/41225209/6295712
You need to make sure that the exec() function is allowed in your PHP configuration; on shared hosting it is often listed in php.ini's disable_functions directive.
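A quick sketch to check this from PHP itself: exec must both exist and be absent from the disable_functions list.

```php
<?php
// Check whether exec() is usable in the current PHP configuration.
$disabled = array_map('trim', explode(',', (string) ini_get('disable_functions')));
$execAllowed = function_exists('exec') && !in_array('exec', $disabled, true);
var_dump($execAllowed); // true if exec() can be called
```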