Stream response from CURL request without waiting for it to finish - php

I have a PHP script on my server that is making a request to another server for an image.
The script is accessed just like a regular image source like this:
<img src="http://example.com/imagecontroller.php?id=1234" />
Browser -> Script -> External Server
The script is doing a CURL request to the external server.
Is it possible to "stream" the CURL response directly back to the client (browser) as it is received on the server?
Assume my script is on a slow shared hosting server and the external server is blazing fast (a CDN). Is there a way to serve the response directly back to the client without my script being a bottleneck? It would be great if my server didn't have to wait for the entire image to be loaded into memory before beginning the response to the client.

Pass the -N/--no-buffer flag to curl. It does the following:
Disables the buffering of the output stream. In normal work
situations, curl will use a standard buffered output stream that will
have the effect that it will output the data in chunks, not
necessarily exactly when the data arrives. Using this option will
disable that buffering.
Note that this is the negated option name documented. You can thus use
--buffer to enforce the buffering.

Check out Pascal Martin's answer to an unrelated question, in which he discusses using CURLOPT_FILE for streaming cURL responses. His explanation for handling "Manipulate a string that is 30 million characters long" should work in your case.
Hope this helps!

Yes, you can use the CURLOPT_WRITEFUNCTION option:
curl_setopt($ch, CURLOPT_WRITEFUNCTION, $callback);
where $ch is the cURL handle and $callback is the callback function. cURL will then hand the response data to the callback in chunks as it arrives from the remote site. The callback can look something like:
$result = '';
$callback = function ($ch, $str) use (&$result) {
    // $str holds the chunk of data that has just been streamed back.
    // Process the stream here, working with either $result or $str.
    $result .= $str;
    return strlen($str); // must return the number of bytes handled
};
If not interrupted, $result will contain the entire response from the remote site at the end.
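For the original question here (relaying the image to the browser as it arrives), a variation of this idea is to echo and flush each chunk inside the callback instead of accumulating it. This is only a sketch; the URL and Content-Type are placeholders for your actual CDN image:
$ch = curl_init("http://cdn.example.com/images/1234.jpg"); // placeholder URL
header("Content-Type: image/jpeg"); // assumption: the image type is known in advance

curl_setopt($ch, CURLOPT_WRITEFUNCTION, function ($ch, $str) {
    echo $str;  // forward this chunk straight to the client
    flush();    // push it out instead of letting PHP hold it
    return strlen($str);
});

curl_exec($ch);
curl_close($ch);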

Not with cURL; you could use fsockopen() to do the streaming.
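For completeness, a rough sketch of the fsockopen() approach (the host, path and image type are placeholders; real code should inspect and forward the upstream headers instead of hard-coding a Content-Type):
$host = "cdn.example.com"; // placeholder host
$fp = fsockopen($host, 80, $errno, $errstr, 30);
if (!$fp) {
    die("Connection failed: $errstr ($errno)");
}

// Send a plain HTTP/1.0 request so the connection closes after the body.
fwrite($fp, "GET /images/1234.jpg HTTP/1.0\r\nHost: $host\r\nConnection: close\r\n\r\n");

// Skip the response headers (everything up to the first blank line).
while (($line = fgets($fp)) !== false && rtrim($line) !== '') {
    // discarded here; a real relay would forward Content-Type, Content-Length, etc.
}

header("Content-Type: image/jpeg"); // assumption: we know the image type

// Relay the body to the browser as it arrives.
while (!feof($fp)) {
    echo fread($fp, 8192);
    flush();
}
fclose($fp);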


Why does this code so negatively affect my server's performance?

I have a Silverstripe site that deals with very big data. I made an API that returns a very large dump, and I call that API from the front end via an AJAX GET.
When the AJAX call hits the API, it takes about 10 minutes for the data to return (it's very long JSON data, and the customer has accepted that).
While they are waiting for the data to return, they open the same site in another tab to do other things, but the site is very slow until the previous AJAX request has finished.
Is there anything I can do to avoid everything going unresponsive while waiting for the big JSON data?
Here's the code and an explanation of what it does:
I created a method named geteverything that resides on the web server, shown below. It accesses another server (the data server) to get data via a streaming API sitting on that data server. There's a lot of data, and the data server is slow; my customer doesn't mind the request taking long, they mind how slow everything else becomes. Sessions are used to determine the particulars of the request.
protected function geteverything($http, $id) {
    if (($System = DataObject::get_by_id('ESM_System', $id))) {
        if (isset($_GET['AAA']) && isset($_GET['BBB']) && isset($_GET['CCC']) && isset($_GET['DDD'])) {
            /**
             * some condition checks and data formatting for AAA, BBB, CCC and DDD go here
             **/
            $request = "http://dataserver/streaming?method=xxx";
            set_time_limit(120);
            $jsonstring = file_get_contents($request);
            echo($jsonstring);
        }
    }
}
How can I fix this, or what else would you need to know in order to help?
The reason it's taking so long is that you're downloading the entirety of the JSON to your server and THEN sending it all to the user. There's no need to wait until you have the whole file before you start sending it.
Rather than using file_get_contents, make the connection with cURL and write the output directly to php://output.
For example, this script will copy http://example.com/ exactly as is:
<?php
// Initialise cURL. You can specify the URL in curl_setopt instead if you prefer
$ch = curl_init("http://example.com/");
// Open a file handler to PHP's output stream
$fp = fopen('php://output', 'w');
// Turn off headers, we don't care about them
curl_setopt($ch, CURLOPT_HEADER, 0);
// Tell curl to write the response to the stream
curl_setopt($ch, CURLOPT_FILE, $fp);
// Make the request
curl_exec($ch);
// close resources
curl_close($ch);
fclose($fp);
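One caveat worth adding (an assumption about typical setups, not something stated above): if PHP output buffering is enabled, the data cURL writes to php://output may still sit in a buffer until the script ends. Flushing any open buffers before curl_exec() helps the chunks reach the browser as they arrive:
// Flush and disable any active output buffers so relayed data is not held back.
while (ob_get_level() > 0) {
    ob_end_flush();
}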

Posting FLAC to Google Voice Recognition API from PHP

I am quite experienced in PHP, but I've always had trouble with server-to-server connections like POST requests. I have a FLAC audio file that I need to POST to Google's Speech Recognition API server, and I don't know how to "listen" to its response either. I would like a script like the one below, assuming that this kind of function exists:
<?php
$fileId = $_GET['fileId'];
$filepath = $fileId . ".flac";

recognize($filepath);

function recognize($pathToFile) {
    // It's the following function that I'm looking for
    $response = $pathToFile->post("http://www.google.com/speech-api/v1/.....&client=chromium");
    // The $response would be the short JSON that Google feeds back.
    echo $response;
}
?>
EDIT
I've followed a tutorial to create a shell script that posts my FLAC file using wget --post-file. I would like to post like this, but in PHP. Also, at the end of the command there is a > answer.ret redirection, so that Google's answer is written to that file. I was wondering whether there is an equivalent for that in PHP.
Here's the command line :
wget -q -U "Mozilla/5.0" --post-file audio1.flac --header="Content-Type: audio/x-flac; rate=16000" -O - "http://www.google.com/speech-api/v1/recognize?lang=fr-fr&client=chromium" > trancription1.ret
EDIT 2
I figured out how to do it with hakre's answer and baked up a little Gist for curious people. Here it is: https://gist.github.com/chlkbumper/4969389. Don't forget that the FLAC file must be sampled at 16 kHz (the rate=16000 in the Content-Type header).
A POST request is just a standard HTTP request with the POST method specified. The rest of the HTTP request and HTTP response is pretty much the same.
You get the response of a request in the form of an HTTP response, by the way. This is all normatively defined in RFC 2616; just refer to that document and it explains everything.
A PHP function for sending HTTP requests is file_get_contents(); it returns the request's response. This works via the HTTP stream wrapper, which offers the options you need to send a POST request (the default is GET). See the HTTP context options.
Another popular option for sending HTTP requests from PHP is the cURL extension.
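A minimal sketch of the stream-wrapper approach described above, using the URL and Content-Type from the wget command in the question (the local file name audio1.flac is also taken from there; error handling is omitted):
<?php
$flac = file_get_contents("audio1.flac"); // the raw FLAC bytes to post

$context = stream_context_create(array(
    'http' => array(
        'method'  => 'POST',
        'header'  => "Content-Type: audio/x-flac; rate=16000\r\n" .
                     "User-Agent: Mozilla/5.0\r\n",
        'content' => $flac,
    ),
));

$response = file_get_contents(
    "http://www.google.com/speech-api/v1/recognize?lang=fr-fr&client=chromium",
    false,
    $context
);

echo $response; // the short JSON answer, same as what wget wrote to the .ret file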

Run a PHP file from another file

I have a script on my server that sends email and shows a response of 0 or 1.
Here is the URL:
http://examplewebsite.com/emailsender.php?to=$to&subject=$subject&message=$message
I am putting data in $to, $subject, and $message, and it's sending the email.
I need to get the response of the page too.
How can I do that?
Use file_get_contents() or cURL to get the output:
$output = file_get_contents("http://smwebtech.com/webservices/emailsender.php?to=$to&subject=$subject&message=$message");
The URL can be called with file_get_contents() or cURL; both will give you the resulting HTML.
You should implement some sort of security to prevent people abusing your email script, such as an IP whitelist.
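As a sketch of that whitelist idea (the allowed address is just an example), emailsender.php could start with something like:
$allowed = array('203.0.113.10'); // example: your own server's IP
if (!in_array($_SERVER['REMOTE_ADDR'], $allowed, true)) {
    header('HTTP/1.1 403 Forbidden');
    exit('Forbidden');
}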
In PHP there are a number of ways. The easiest is file_get_contents() (which supports URL wrappers), or if you want a bit more power but more setup you can use CURL.
<?php
$response = file_get_contents("http://smwebtech.com/webservices/emailsender.php?to=$to&subject=$subject&message=$message"); // double quotes so the variables are interpolated
var_dump($response);
?>
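If you want the cURL route instead, a sketch might look like this (assuming $to, $subject and $message are already set; http_build_query() also takes care of URL-encoding the parameters):
$query = http_build_query(array(
    'to'      => $to,
    'subject' => $subject,
    'message' => $message,
));

$ch = curl_init("http://smwebtech.com/webservices/emailsender.php?" . $query);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true); // return the 0/1 response as a string
$response = curl_exec($ch);
curl_close($ch);

var_dump($response);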

file_get_contents() GET request not showing up on my webserver log

I've got a simple PHP script to ping some of my domains using file_get_contents(); however, I have checked my logs and they are not recording any GET requests.
I have
$result = file_get_contents($url);
echo $url . ' pinged ok\n';
where $url for each of the domains is just a simple string of the form http://mydomain.com/ (the echo verifies this). Manual requests made by myself do show up.
Why would the GET requests not be showing in my logs?
Actually, I've got it to register the hit when I send $result to the browser. I guess this means the web server only records browser requests? Is there any way to mimic that in PHP?
OK, I tried cURL in PHP:
// create curl resource
$ch = curl_init();
// set url
curl_setopt($ch, CURLOPT_URL, "getcorporate.co.nr");
//return the transfer as a string
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
// $output contains the output string
$output = curl_exec($ch);
// close curl resource to free up system resources
curl_close($ch);
Same effect though: no hit registered in the logs. So far it only registers when I feed the HTTP response from my script back to the browser. Obviously that will only work for a single request, not for a batch, which is the purpose of my script.
If something else is going wrong, what debugging output can I look at?
Edit: D'oh! See comments below accepted answer for explanation of my erroneous thinking.
If the request is actually being made, it would be in the logs.
Your example code could be failing silently.
What happens if you do:
<?php
if ($result = file_get_contents($url)) {
    echo "Success";
} else {
    echo "Epic Fail!";
}
If that's failing, you'll want to turn on some error reporting or logging and try to figure out why.
Note: if you're in safe mode, or otherwise have fopen url wrappers disabled, file_get_contents() will not grab a remote page. This is the most likely reason things would be failing (assuming there's not a typo in the contents of $url).
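A small sketch of that debugging step (turning on error reporting and inspecting the last error when the call fails):
error_reporting(E_ALL);
ini_set('display_errors', '1');

$result = file_get_contents($url);
if ($result === false) {
    $error = error_get_last();
    echo 'Request failed: ' . ($error ? $error['message'] : 'unknown error') . "\n";
}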
Use curl instead?
That's odd. Maybe there is some caching afoot? Have you tried changing the URL dynamically ($url = $url."?timestamp=".time() for example)?

How do I check for valid (not dead) links programmatically using PHP?

Given a list of urls, I would like to check that each url:
Returns a 200 OK status code
Returns a response within X amount of time
The end goal is a system that is capable of flagging urls as potentially broken so that an administrator can review them.
The script will be written in PHP and will most likely run on a daily basis via cron.
The script will be processing approximately 1000 urls at a go.
Question has two parts:
Are there any bigtime gotchas with an operation like this, what issues have you run into?
What is the best method for checking the status of a url in PHP considering both accuracy and performance?
Use the PHP cURL extension. Unlike fopen(), it can also make HTTP HEAD requests, which are sufficient to check the availability of a URL and save you a ton of bandwidth, since you don't have to download the entire body of the page just to check.
As a starting point you could use some function like this:
function is_available($url, $timeout = 30) {
    $ch = curl_init(); // get cURL handle

    // set cURL options
    $opts = array(
        CURLOPT_RETURNTRANSFER => true,     // do not output to browser
        CURLOPT_URL            => $url,     // set URL
        CURLOPT_NOBODY         => true,     // do a HEAD request only
        CURLOPT_TIMEOUT        => $timeout, // set timeout
    );
    curl_setopt_array($ch, $opts);

    curl_exec($ch); // do it!
    $retval = curl_getinfo($ch, CURLINFO_HTTP_CODE) == 200; // check if HTTP OK
    curl_close($ch); // close handle

    return $retval;
}
However, there's a ton of possible optimizations: You might want to re-use the cURL instance and, if checking more than one URL per host, even re-use the connection.
Oh, and this code does check strictly for HTTP response code 200. It does not follow redirects (302) -- but there also is a cURL-option for that.
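A sketch of those two tweaks combined: one cURL handle reused for all URLs, with CURLOPT_FOLLOWLOCATION set so 301/302 links are not flagged as broken (the function name is just illustrative):
function check_urls(array $urls, $timeout = 30) {
    $ch = curl_init();
    curl_setopt_array($ch, array(
        CURLOPT_RETURNTRANSFER => true,  // do not output to browser
        CURLOPT_NOBODY         => true,  // HEAD requests only
        CURLOPT_FOLLOWLOCATION => true,  // follow 301/302 redirects
        CURLOPT_MAXREDIRS      => 5,
        CURLOPT_TIMEOUT        => $timeout,
    ));

    $results = array();
    foreach ($urls as $url) {
        curl_setopt($ch, CURLOPT_URL, $url);
        curl_exec($ch);
        $results[$url] = (curl_getinfo($ch, CURLINFO_HTTP_CODE) == 200);
    }

    curl_close($ch);
    return $results;
}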
Look into cURL. There's a library for PHP.
There's also an executable version of cURL so you could even write the script in bash.
I actually wrote something in PHP that does this over a database of 5k+ URLs. I used the PEAR class HTTP_Request, which has a method called getResponseCode(). I just iterate over the URLs, pass each one to getResponseCode, and evaluate the response.
However, it doesn't work for FTP addresses, URLs that don't begin with http or https (unconfirmed, but I believe that's the case), or sites with invalid security certificates (it just returns a 0). A 0 is also returned for server-not-found (there's no status code for that).
And it's probably easier than cURL, since you include a few files and use a single function to get an integer code back.
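For reference, the loop described above might look roughly like this, assuming the classic PEAR HTTP_Request API with sendRequest() and getResponseCode():
require_once 'HTTP/Request.php';

$codes = array();
foreach ($urls as $url) {
    $req = new HTTP_Request($url);
    $req->sendRequest();                    // failures surface as a 0 code below
    $codes[$url] = $req->getResponseCode(); // 200 means the link looks alive
}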
fopen() supports http URI.
If you need more flexibility (such as timeout), look into the cURL extension.
Seems like it might be a job for curl.
If you're not stuck on PHP, Perl's LWP might be an answer too.
You should also be aware of URLs returning 301 or 302 HTTP responses which redirect to another page. Generally this doesn't mean the link is invalid. For example, http://amazon.com returns 301 and redirects to http://www.amazon.com/.
Just returning a 200 response is not enough; many once-valid links will keep returning "200" after they turn into porn or gambling portals when the former owner fails to renew the domain.
Domain squatters typically ensure that every URL in their domains returns 200.
One potential problem you will undoubtedly run into is when the box this script is running on loses access to the Internet... you'll get 1000 false positives.
It would probably be better for your script to keep some type of history and only report a failure after 5 days of failure.
Also, the script should be self-checking in some way (like checking a known good web site [google?]) before continuing with the standard checks.
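A sketch of that self-check, reusing the is_available() function from the earlier answer and Google as the known-good site:
// Abort the run (rather than flagging every URL) if we can't even reach
// a site that is known to be up.
if (!is_available('http://www.google.com/', 10)) {
    exit("Connectivity self-check failed; skipping this run.\n");
}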
You only need a bash script to do this. Please check my answer on a similar post here. It is a one-liner that reuses HTTP connections to dramatically improve speed, retries n times for temporary errors and follows redirects.
