PHP - How to make php.net's sample code not hang?

The Simple TCP/IP client example will hang if no data is being received.
To see what I mean, copy the client code and run it, but remove one of the \r\n's from the end of the request to make it invalid, so it looks like this:
$in = "HEAD / HTTP/1.1\r\n";
$in .= "Host: www.example.com\r\n";
$in .= "Connection: Close\r\n";
The server won't respond, which is normal, but I want to be able to have a timeout or a way to keep it from hanging.
I've tried setting the timeout using socket_set_option but I think that only applies to socket_connect.
Anyone have a solution?
If I use socket_recv with a MSG_DONTWAIT flag, what is the proper way to wait for all data, connection close, or a timeout if neither of those occur?

You can use socket_select() to determine if a socket has data to be read.
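For example, a minimal sketch, assuming $socket was created and connected as in the manual's example:
// Wait up to 5 seconds for readable data instead of blocking in socket_read()
$read   = array($socket);
$write  = null;
$except = null;
$ready  = socket_select($read, $write, $except, 5);
if ($ready === false) {
    // select failed: socket_strerror(socket_last_error()) has the details
} elseif ($ready === 0) {
    // timed out: the server never sent anything
} else {
    $out = socket_read($socket, 2048);
}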

Related

Connecting to .net webservice from php - return buffer not ended

I have a PHP-based website which connects to a third-party SOAP web service.
I am creating and parsing the request and response envelope XML successfully. The only issue is with the actual communication. I am using sockets, as I saw this code as an example and was able to modify it to work for our case.
$socket = fsockopen("ssl://***", 443);
if ($socket !== false) {
    // $request on the right-hand side is the SOAP envelope built earlier
    $request = "POST /***.asmx HTTP/1.1\n" .
               "Host: ***\n" .
               "Content-Type: text/xml; charset=utf-8\n" .
               "Content-Length: " . strlen($request) . "\n" .
               "SOAPAction: \"http://***\"\n\n" .
               $request . "\n";
    fputs($socket, $request);
}
However, reading the response always hung until the connection timed out. This is because fgets() requires an end-of-line or end-of-file marker, which the web service never sends. As a result I am forced to read the response one byte at a time, check whether the SOAP envelope is closed, and then stop manually. This seems somewhat inefficient to me.
while (!feof($socket) && substr($response, -16) != "</soap:Envelope>") {
    $response .= fgets($socket, 2); // force read one byte at a time
}
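A chunked variant of that loop, as a sketch (fread() on a socket returns whatever is currently available, up to the given length, so it does not wait for a line ending):
$response = '';
while (!feof($socket) && strpos($response, '</soap:Envelope>') === false) {
    $chunk = fread($socket, 4096);
    if ($chunk === false || $chunk === '') break; // read error or connection closed
    $response .= $chunk;
}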
So, my question(s) are:
Is this a known phenomenon for .NET web services?
Is there a better way to perform this communication, or specifically the read (other than fgets)?
And on a slightly unrelated note, why does the fgets function require me to specify 2 to read 1 byte? (Or is this related to the charset, i.e. if the response is in UTF-8, is 1 character represented by 2 bytes?)

Send HTTP request from PHP without waiting for response?

I want to have an HTTP GET request sent from PHP. Example:
http://tracker.example.com?product_number=5230&price=123.52
The idea is to do server-side web-analytics: Instead of sending tracking
information from JavaScript to a server, the server sends tracking
information directly to another server.
Requirements:
The request should take as little time as possible, in order to not
noticeably delay processing of the PHP page.
The response from the tracker.example.com does not need to be
checked. As examples, some possible responses from
tracker.example.com:
200: That's fine, but no need to check that.
404: Bad luck, but - again - no need to check that.
301: Although a redirect would be appropriate, it would delay
processing of the PHP page, so don't do that.
In short: All responses can be discarded.
Ideas for solutions:
In a now deleted answer, someone suggested calling command line
curl from PHP in a shell process. This seems like a good idea,
only that I don't know if forking a lot of shell processes under
heavy load is a wise thing to do.
I found php-ga, a package for doing server-side Google
Analytics from PHP. On the project's page, it is
mentioned: "Can be configured to [...] use non-blocking requests."
So far I haven't found the time to investigate what method php-ga
uses internally, but this method could be it!
In a nutshell: what is the best solution to do generic server-side
tracking/analytics from PHP?
Unfortunately, PHP is blocking by default. While this holds true for the majority of functions and operations you will normally be handling, the current scenario is different.
The process, which I like to call HTTP-ping, requires that you only touch a specific URI, forcing the specific server to bootstrap its internal logic. Some functions allow you to achieve something very similar to this HTTP-ping by not waiting for a response.
Take note that pinging a URL is a two-step process:
Resolve the DNS
Make the request
While making the request should be rather fast once the DNS is resolved and the connection is made, there aren't many ways of making the DNS resolve faster.
Some ways of doing an HTTP-ping are:
cURL, by setting CURLOPT_CONNECTTIMEOUT (or CURLOPT_TIMEOUT_MS) to a low value
fsockopen by closing immediately after writing
stream_socket_client (same as fsockopen) and also adding STREAM_CLIENT_ASYNC_CONNECT
Both cURL and fsockopen block while the DNS is being resolved, but I have noticed that fsockopen is significantly faster, even in worst-case scenarios.
stream_socket_client on the other hand should fix the problem regarding DNS resolving and should be the optimal solution in this scenario, but I have not managed to get it to work.
One final solution is to start another thread/process that does this for you. Making a system call for this should work, and forking the current process should do it as well. Unfortunately, neither is really safe in applications where you can't control the environment in which PHP is running.
System calls are more often than not blocked, and pcntl is not enabled by default.
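A minimal sketch of the stream_socket_client variant (untested, as noted above; the tracker URL is the one from the question):
// Async connect so DNS/TCP setup doesn't block; write the request and leave.
// Whether the bytes actually go out before fclose() is not guaranteed.
$fp = stream_socket_client('tcp://tracker.example.com:80', $errno, $errstr, 1,
                           STREAM_CLIENT_CONNECT | STREAM_CLIENT_ASYNC_CONNECT);
if ($fp) {
    stream_set_blocking($fp, false);
    fwrite($fp, "GET /?product_number=5230&price=123.52 HTTP/1.1\r\n"
              . "Host: tracker.example.com\r\n"
              . "Connection: Close\r\n\r\n");
    fclose($fp);
}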
I would call tracker.example.com this way:
get_headers('http://tracker.example.com?product_number=5230&price=123.52');
and in the tracker script:
ob_end_clean();
ignore_user_abort(true);
ob_start();
header("Connection: close");
header("Content-Length: " . ob_get_length());
ob_end_flush();
flush();
// from here the response has been sent. you can now wait as long as you want and do some tracking stuff
sleep(5); //wait 5 seconds
do_some_stuff();
exit;
I implemented a function for a fast GET request to a URL without waiting for the response:
function fast_request($url)
{
    $parts = parse_url($url);
    $fp = fsockopen($parts['host'], isset($parts['port']) ? $parts['port'] : 80, $errno, $errstr, 30);
    if (!$fp) return; // connection failed; nothing to do
    // default to "/" and keep the query string, which parse_url() splits off
    $path = (isset($parts['path']) ? $parts['path'] : '/')
          . (isset($parts['query']) ? '?' . $parts['query'] : '');
    $out  = "GET " . $path . " HTTP/1.1\r\n";
    $out .= "Host: " . $parts['host'] . "\r\n";
    $out .= "Content-Length: 0\r\n";
    $out .= "Connection: Close\r\n\r\n";
    fwrite($fp, $out);
    fclose($fp);
}
We were using the fsockopen and fwrite combo, then it just up and stopped working one day, or at least it became intermittent. After a little research and testing, and given that you have fopen wrappers enabled, I ended up using file_get_contents and stream_context_create with a timeout set to a hundredth of a second. The timeout parameter accepts floating-point values (https://www.php.net/manual/en/context.http.php). I wrapped it in a try...catch block so it would fail silently. It works beautifully for our purposes. You can do logging in the catch if needed. The timeout is the key if you don't want the function to block the runtime.
function fetchWithoutResponseURL($url)
{
    $context = stream_context_create([
        "http" => [
            "method"  => "GET",
            "timeout" => .01, // seconds; floats are allowed
        ],
    ]);
    try {
        // @ suppresses the timeout warning; failures are ignored on purpose
        @file_get_contents($url, false, $context);
    } catch (Exception $e) {
        // Fail silently
    }
}
For those of you working with WordPress as a backend -
it is as simple as:
wp_remote_get( $url, array( 'blocking' => false ) );
Came here whilst researching a similar problem. If you have a database connection handy, one other possibility is to quickly stuff the request details into a table, and then have a separate cron-based process that periodically scans that table for new records to process and makes the tracking request, freeing your web application from having to make the HTTP request itself.
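A minimal sketch of that pattern, assuming a PDO connection and a hypothetical tracking_queue table:
// Web request side: queue the hit instead of making it
$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass'); // credentials are placeholders
$pdo->prepare('INSERT INTO tracking_queue (url, processed) VALUES (?, 0)')
    ->execute(['http://tracker.example.com?product_number=5230&price=123.52']);

// Cron side (e.g. every minute): drain the queue and fire the requests
foreach ($pdo->query('SELECT id, url FROM tracking_queue WHERE processed = 0') as $row) {
    @file_get_contents($row['url']); // response is discarded
    $pdo->prepare('UPDATE tracking_queue SET processed = 1 WHERE id = ?')
        ->execute([$row['id']]);
}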
You can use shell_exec and command-line curl.
For an example, see this question
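For instance, a fire-and-forget sketch (the trailing & backgrounds the curl process, so shell_exec() returns without waiting for the response):
// escapeshellarg() keeps the query string from being mangled by the shell
$url = escapeshellarg('http://tracker.example.com?product_number=5230&price=123.52');
shell_exec("curl --max-time 5 --silent $url > /dev/null 2>&1 &");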
You can actually do this using cURL directly.
I have implemented it both with a very short timeout (CURLOPT_TIMEOUT_MS) and with curl_multi_exec.
Be advised: eventually I quit this method because not every request was made correctly. This could have been caused by my own server, though I haven't been able to rule out cURL failing.
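A sketch of the short-timeout variant (with the caveat above that requests may be cut off before completing):
$ch = curl_init('http://tracker.example.com?product_number=5230&price=123.52');
curl_setopt_array($ch, array(
    CURLOPT_RETURNTRANSFER => true,
    CURLOPT_TIMEOUT_MS     => 100,  // give up quickly; the response is discarded
    CURLOPT_NOSIGNAL       => true, // needed for sub-second timeouts on some builds
));
curl_exec($ch);
curl_close($ch);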
I needed to do something similar: just ping a URL and discard all responses. I used proc_open, which lets you end the process right away with proc_close. I'm assuming you have lynx installed on your server:
<?php
function ping($url) {
    // escapeshellarg() keeps shell metacharacters in the URL from breaking the command
    $proc = proc_open("lynx " . escapeshellarg($url), [], $pipes);
    proc_close($proc);
}
?>
<?php
// Create a stream context
$opts = array(
    'http' => array(
        'method' => "GET",
        'header' => "Accept-language: en"
    )
);
$context = stream_context_create($opts);

// Open the file using the HTTP headers set above
$file = file_get_contents('http://tracker.example.com?product_number=5230&price=123.52', false, $context);
?>

CodeIgniter RESTful, async / background process

I'm using the CodeIgniter RESTful API (https://github.com/philsturgeon/codeigniter-restserver), which returns information (JSON format) to my Android/iPhone app.
There is an operation where I send some values; if everything is OK, I return a 200 code as the response.
Now I want to add a new operation to the same method: send notifications about these modifications with APNS (Apple Push Notification Service) and GCM (Google Cloud Messaging).
It works well when I have to send no more than 3-5 notifications. The problem is APNS, because I have to send these messages one by one and it takes a long time, so my apps receive a timeout exception (all the notifications are sent, but the user gets the connection error...).
Can I send the 200 code response and then continue sending the notifications? (Something like this...)
function my_update_method_post() {
    //....GET my POST values
    update($data);
    $this->response(array('result' => 1), 200);
    //Send Notifications
    //....
}
Thanks in advance...
I found a solution that works perfectly for me because I don't expect any result value. If a notification can't be sent, I log it in my database.
This is the function I use to send an "async" request (yes, this is not a truly asynchronous request, but it works the way I'm looking for):
function curl_post_async($url, $params)
{
    $post_string = http_build_query($params);
    $parts = parse_url($url);
    $fp = fsockopen($parts['host'],
                    isset($parts['port']) ? $parts['port'] : 80,
                    $errno, $errstr, 30);
    if (!$fp) {
        // Perform whatever logging you want to have happen b/c this call failed!
        return;
    }
    $out  = "POST " . $parts['path'] . " HTTP/1.1\r\n";
    $out .= "Host: " . $parts['host'] . "\r\n";
    $out .= "Content-Type: application/x-www-form-urlencoded\r\n";
    $out .= "Content-Length: " . strlen($post_string) . "\r\n";
    $out .= "Connection: Close\r\n\r\n";
    if (isset($post_string)) $out .= $post_string;
    fwrite($fp, $out);
    fclose($fp);
}
Yes, this is possible.
You should look at PHP's exec() and this link. You should set up a function in your controller to be called from the command line; you will then pass in an array of the GCM/APNS data to be used.
This solution is not ideal because you won't be able to tell the client that all messages were sent successfully. You will send back 200 to say the request was received OK, and that is all.
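A sketch of what that could look like; the controller route and paths here are hypothetical, and the trailing & plus redirected output makes exec() return immediately:
// Hand the notification payload to a detached CLI process and return 200 now
$payload = escapeshellarg(json_encode($notification_data));
exec("php /var/www/index.php notifications send_all $payload > /dev/null 2>&1 &");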
Since PHP doesn't natively support threads or asynchronous function calls, you will have to use a kind of hacky solution.
Have a look at my question here: PHP file_get_contents() follow Content-length header
The solution is to send a Connection: close and a Content-Length header, then make the client aware of these headers (see the link above). In the case of curl, for example, the connection will be closed as soon as the Content-Length is reached, but your PHP script keeps running "in the background", so you can start time-consuming operations then.
Kind regards,
Stefan
P.S. If the script takes really long to execute, make sure that the PHP max execution time doesn't get in your way.
Take a look at this article. I like this solution much more than one where you have the client tell the server to hang up immediately; there are multiple benefits if you build this solution on the server side.
You know the server will continue processing once the client has disconnected
The client can still receive a response from the server
EDIT
I'd not realized the OP doesn't have access to the service here. In this case, the article I've mentioned is of little value. The problem here is that the server is taking a long time to respond, and that hangs the client up. For this I suggest curl_multi_init. This allows you to make a number of requests simultaneously.
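A minimal curl_multi sketch, assuming $urls holds the APNS/GCM endpoints to hit in parallel:
$mh = curl_multi_init();
$handles = array();
foreach ($urls as $url) {
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_multi_add_handle($mh, $ch);
    $handles[] = $ch;
}
// Drive all transfers at once instead of one after another
do {
    curl_multi_exec($mh, $running);
    curl_multi_select($mh); // sleep until there is activity, avoiding a busy loop
} while ($running > 0);
foreach ($handles as $ch) {
    curl_multi_remove_handle($mh, $ch);
    curl_close($ch);
}
curl_multi_close($mh);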

PHP Async GET request works on one server, but doesn't on the other

Please see the edits at the bottom for additional information!
I have two servers. Both should be able to call each other with a GET request.
To make the request (it's more firing an event than making a request, actually) I am using this code:
function URLCallAsync($url, $params, $type='POST')
{
    $post_params = array();
    foreach ($params as $key => &$val) {
        if (is_array($val)) $val = implode(',', $val);
        $post_params[] = $key . '=' . urlencode($val);
    }
    $post_string = implode('&', $post_params);
    $parts = parse_url($url);
    $fp = fsockopen($parts['host'],
                    isset($parts['port']) ? $parts['port'] : 80,
                    $errno, $errstr, 30);
    // Data goes in the path for a GET request
    if ('GET' == $type) $parts['path'] .= '?' . $post_string;
    $out  = "$type " . $parts['path'] . " HTTP/1.1\r\n";
    $out .= "Host: " . $parts['host'] . "\r\n";
    $out .= "Content-Type: application/x-www-form-urlencoded\r\n";
    $out .= "Content-Length: " . strlen($post_string) . "\r\n";
    $out .= "Connection: Close\r\n\r\n";
    // Data goes in the request body for a POST request
    if ('POST' == $type && isset($post_string)) $out .= $post_string;
    fwrite($fp, $out);
    fclose($fp);
}
I feed the function the exact same data (except the URL) on both servers (I copied the calling file to test it!!), but it only works in one direction!
I write the calls to that function to a log file so I can investigate whether something is going wrong.
Server A -> Server B works exactly as it should; the log file at Server A contains the correct URL.
Server B -> Server A only prints the correct information in the log file of Server B, but Server A never receives the request.
What could be the reason for something like this?
edit:
Could it be the different kinds of server?
Server A is nginx, Server B is Apache.
Server A also has a '~' symbol in its URL; maybe that's the problem?
The parameters of the GET request are encoded with PHP's urlencode; maybe that creates problems?
I tried around a bit, but the problem is still that the request isn't coming through to Server A. From a browser it works perfectly somehow (assuming I enter the correct URL with the parameters).
edit2:
If I replace URLCallAsync with file_get_contents, it works like it should. But the problem is that file_get_contents is blocking!
So it can only be the function itself. But strangely it works in the opposite direction :(
edit3:
The function "URLCallAsync" runs trough without error, notice or anything else.
It just isn't received by the other server.
What exactly is file_get_contents doing so different???
I got it working.
After a lot of fiddling with Wireshark I found that file_get_contents is even simpler than my async function!
It simply omits the Content-Length field completely! It just provides the "GET ..." and "Host" headers.
It also uses HTTP/1.0 instead of 1.1, but that didn't change anything.
So the solution is: sending the Content-Length header (which had the value 0, since I used GET) somehow makes the server reject the request. I don't know for sure whether it was the server that rejected the request or something else, like a firewall that detected a "malformed" request, but at least the problem is solved.
So next time you send requests, don't provide the Content-Length header if you don't need it :)
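For reference, a sketch of the request writer with that fix applied:
function URLCallAsyncGet($url)
{
    $parts = parse_url($url);
    $fp = fsockopen($parts['host'], isset($parts['port']) ? $parts['port'] : 80,
                    $errno, $errstr, 30);
    if (!$fp) return;
    $path = (isset($parts['path']) ? $parts['path'] : '/')
          . (isset($parts['query']) ? '?' . $parts['query'] : '');
    $out  = "GET $path HTTP/1.1\r\n";
    $out .= "Host: " . $parts['host'] . "\r\n";
    $out .= "Connection: Close\r\n\r\n"; // no Content-Length for a body-less GET
    fwrite($fp, $out);
    fclose($fp);
}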

How to keep a PHP script running even after the user leaves?

I have a PHP script that checks the last time a SQLite database has been updated (30 minutes timespan) when a user visits the page. If it has been longer than 30 minutes, then the script will pull new information into the database. However, I'm worried that the user might leave while the database is updating, therefore neglecting to update some of the entries. What can I do to keep the script executing even after the user leaves?
I've looked at some of the similar questions here and found people suggesting ignore_user_abort(); however, there seem to be issues with that when data cannot be sent back to the client. Any other suggestions would be greatly appreciated. Thanks!
Asynchronous PHP call
One option would be to do an asynchronous PHP call (request). See Asynchronous PHP calls? for more information. But keep in mind that when doing this a lot, you are spawning a lot of background processes, which could kill your server.
P.S.: When you are using shared hosting, doing this kind of thing is generally not much appreciated.
Message Queue
A much better way to do this would be to use a message queue (MQ). You could use Redis or Beanstalkd, just to name two popular MQs. You can get a free Redis instance thanks to http://redistogo.com/. From the client/producer (the user visiting your page) you would add a message to the queue using RPUSH. From the consumer (the SQLite updater), which is a PHP process running endlessly in the background (CLI), you would retrieve messages put on the queue using BLPOP. Spawning processes is expensive and is avoided when using a message queue.
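A minimal sketch of that setup, assuming the phpredis extension and a hypothetical db_update_jobs list:
// Producer (inside the web request): enqueue an update job and return at once
$redis = new Redis();
$redis->connect('127.0.0.1', 6379);
$redis->rPush('db_update_jobs', json_encode(array('requested_at' => time())));

// Consumer (separate CLI process running endlessly): block until a job arrives
$redis = new Redis();
$redis->connect('127.0.0.1', 6379);
while (true) {
    $job = $redis->blPop(array('db_update_jobs'), 0); // timeout 0 = wait forever
    // ...pull fresh data into the SQLite database here...
}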
You can't send data back to the client once they exit / stop visiting your website. But you can open a socket, which prevents any slowdown on the client's side when loading the page:
$host = 'example.com'; // the server that hosts cron.php (placeholder)
$fp = fsockopen($host, 80, $errno, $errstr, 10);
if (!$fp) {
    echo "$errstr ($errno)\n";
} else {
    $header  = "GET /cron.php HTTP/1.1\r\n";
    $header .= "Host: $host\r\n";
    $header .= "Connection: close\r\n\r\n";
    fputs($fp, $header);
    fclose($fp);
}
// do other stuff here
This will signal /cron.php to do the stuff you want and ALSO does not slow down the user's browsing experience ;)
