How would you show a file with cURL? - php

There is an epic lack of PHP cURL love on the Internet for beginners like me. I was wondering how to use cURL to download & display an ICS file (They're plain text to me...) in my PHP code. Unless fopen() is 1,000 times easier, I'd like to stick with cURL for this one.

If your webserver allows it, file_get_contents() is even easier.
echo file_get_contents('http://www.example.com/path/to/your/file.ics');
If you cannot open URLs with file_get_contents(), check whether allow_url_fopen is enabled in your php.ini; there is plenty of material about that on Stack Overflow, which should be fine for a beginner.

If remote file_get_contents is not enabled, cURL can indeed do this.
$curl = curl_init();
curl_setopt($curl, CURLOPT_URL, 'http://example.com/file.ics');
// this is the key option - makes curl_exec return the response body
// as a string instead of printing it straight to the output
curl_setopt($curl, CURLOPT_RETURNTRANSFER, TRUE);
$file_contents = curl_exec($curl);
curl_close($curl);
echo $file_contents;
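If the request can fail, a variant with error handling (a small sketch, not part of the original answer): with CURLOPT_RETURNTRANSFER set, curl_exec() returns FALSE on error, and curl_error() must be read before the handle is closed:
$file_contents = curl_exec($curl);
if ($file_contents === FALSE) {
    echo 'cURL error: ' . curl_error($curl); // read the error before curl_close()
} else {
    echo $file_contents;
}
curl_close($curl);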

Related

Tunnelling link data through PHP?

I want to be able to go to mydoma.in/tunnel.php?file=http://otherdoma.in/music.mp3, and then get the data of http://otherdoma.in/music.mp3 streamed to the client.
I tried doing this via header(), but it redirects instead of "tunnelling" the data.
How can I do this?
Use cURL for streaming:
<?php
$url = $_GET["file"];
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, $url);
// don't prepend the response headers to the output
curl_setopt($ch, CURLOPT_HEADER, 0);
// read in small chunks so data is flushed to the client as it arrives
curl_setopt($ch, CURLOPT_BUFFERSIZE, 256);
// without CURLOPT_RETURNTRANSFER, curl_exec() writes the body
// straight to the output, which is exactly what streams it
curl_exec($ch);
curl_close($ch);
?>
If they are small, you might be able to use file_get_contents(). Otherwise, you should probably use cURL: fetch the URL from the GET variable "file", save it to a temporary location on the server with PHP, then use header() to redirect the user to that local file. Deleting the temporary file is the only real issue, since there is no reliable way to tell when the client has finished downloading it. You could sleep or delay the removal, but a better option is usually a cron job that cleans up all of the temporary files later; a sketch of this approach follows.
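A minimal sketch of that temp-file approach (the tmp/ directory, the file naming, and the redirect path are assumptions for illustration):
<?php
$url = $_GET['file'];
$tmpName = 'tmp/' . md5($url) . '.dat'; // assumed temporary location

// write the remote body straight to the temp file
$fp = fopen($tmpName, 'w');
$ch = curl_init($url);
curl_setopt($ch, CURLOPT_FILE, $fp);
curl_exec($ch);
curl_close($ch);
fclose($fp);

// send the client to the local copy; a cron job cleans tmp/ up later
header('Location: /' . $tmpName);
?>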
Have your PHP script pull the remote content:
$data = file_get_contents($remote_url);
And then just spit it out:
echo $data;
Or simply:
echo file_get_contents($remote_url);
You might have to add some headers to indicate the content type.
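For example (assuming an MP3, as in the URL above):
header('Content-Type: audio/mpeg');
echo $data;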
Alternatively, you could configure a proxy with something like nginx -- this will allow you to rewrite particular URLs to a remote site and then serve them as local, no coding required.

PHP Redirect a file from another server to end user

I want to allow the user to specify, via a URL variable, which file they would like to download from a remote server, e.g. /download.php?url=fvr_anim_foxintro_V4_01.jpg
<?php
$url = $_GET['url'];
header("Location: http://fvr.homestead.com/files/animation/" . $url);
?>
The above is purely an example I grabbed from Google Images. The problem is that I do not want the end user to see where the file originally comes from, so the server would need to fetch the file itself and pass it along to the end user. Is there a method of doing this?
I find many examples for files hosted on the server, but no examples for serving files hosted on a remote server. In other words, I would be passing them along. The files would be quite large (up to 100 MB).
Thanks in advance!
You can use cURL for this:
<?php
$url = "http://share.meebo.com/content/katy_perry/wallpapers/3.jpg";
$ch = curl_init();
$timeout = 0;
curl_setopt($ch, CURLOPT_URL, $url);
curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, $timeout);
// Getting binary data
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
curl_setopt($ch, CURLOPT_BINARYTRANSFER, 1);
$image = curl_exec($ch);
curl_close($ch);
// output to browser
header("Content-Type: image/jpeg");
echo $image;
?>
Source: http://forums.phpfreaks.com/topic/120308-solved-curl-get-image/
Of course, this example is just for an image (as you've suggested) but you can use cURL for all kinds of remote data retrieval via HTTP GET, POST, PUT, DELETE, etc. Search around the web for "php curl" and you'll find an endless supply of information.
The ideal solution would be to use PHP's cURL library, but if you're using shared hosting, keep in mind this library may be disabled.
Assuming you can use cURL, you simply send a Content-Type header with the appropriate MIME type and echo the results from curl_exec().
To get a basic idea of how to use the cURL library, look at the example under the curl_init() function.
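Since the question mentions files up to 100 MB, note that the example above buffers the whole file in memory before echoing it. A sketch of a streaming variant using CURLOPT_WRITEFUNCTION, which hands each chunk to the client as it arrives (the generic Content-Type is an assumption; pick the right one for your files):
<?php
$url = $_GET['url']; // validate/whitelist this in real code
header('Content-Type: application/octet-stream');

$ch = curl_init($url);
curl_setopt($ch, CURLOPT_WRITEFUNCTION, function ($ch, $chunk) {
    echo $chunk;
    flush();
    return strlen($chunk); // tell cURL the whole chunk was handled
});
curl_exec($ch);
curl_close($ch);
?>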

Running file_put_contents in parallel?

I was searching Stack Overflow for a solution, but couldn't find anything even close to what I am trying to achieve. Perhaps I am just blissfully unaware of some magic PHP sauce everyone is using to tackle this problem... ;)
Basically I have an array with, give or take, a few hundred URLs, pointing to different XML files on a remote server. I'm doing some magic file-checking to see if the content of the XML files has changed, and if it has, I'll download the newer XMLs to my server.
PHP code:
$urls = array(
    'http://stackoverflow.com/a-really-nice-file.xml',
    'http://stackoverflow.com/another-cool-file2.xml'
);
foreach ($urls as $url) {
    set_time_limit(0);
    $ch = curl_init();
    curl_setopt($ch, CURLOPT_URL, $url);
    curl_setopt($ch, CURLOPT_FAILONERROR, true);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
    curl_setopt($ch, CURLOPT_BINARYTRANSFER, false);
    $contents = curl_exec($ch);
    curl_close($ch);
    file_put_contents($filename, $contents);
}
Now, $filename is set somewhere else and gives each XML its own ID based on my logic.
So far this script runs OK and does what it should, but it is terribly slow. I know my server can handle a lot more, and I suspect my foreach is slowing down the process.
Is there any way I can speed up the foreach? Currently I am thinking of handling 10 or 20 downloads concurrently in each loop iteration, basically cutting my execution time 10- or 20-fold, but I can't work out the best and most performant way to approach this. Any help or pointers on how to proceed?
Your bottleneck is (most likely) the curl requests; you can only write to a file after each request is done, and there is no way (in a single script) to speed that process up.
I don't know how it all works, but you can execute curl requests in parallel: http://php.net/manual/en/function.curl-multi-exec.php.
Maybe you can fetch the data (if memory is available to store it) and then write out the files as the requests complete.
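A minimal sketch of that with curl_multi (the $urls array and the per-file $filename logic are carried over from the question):
$mh = curl_multi_init();
$handles = array();
foreach ($urls as $url) {
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
    curl_multi_add_handle($mh, $ch);
    $handles[$url] = $ch;
}

// drive all transfers at once until every one has finished
do {
    curl_multi_exec($mh, $running);
    curl_multi_select($mh); // block until there is activity
} while ($running > 0);

foreach ($handles as $url => $ch) {
    $contents = curl_multi_getcontent($ch); // needs CURLOPT_RETURNTRANSFER
    file_put_contents($filename, $contents); // $filename per your own logic
    curl_multi_remove_handle($mh, $ch);
    curl_close($ch);
}
curl_multi_close($mh);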
Just run more scripts. Each script will download some of the URLs.
You can get more information about this pattern here: http://en.wikipedia.org/wiki/Thread_pool_pattern
The more scripts you run, the more parallelism you get.
For parallel requests I use a Guzzle pool ;) (you can send X parallel requests):
http://docs.guzzlephp.org/en/stable/quickstart.html
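A minimal sketch with Guzzle's Pool (assumes Guzzle is installed via Composer; $urls is the same array as in the question, and the concurrency of 10 and the file naming are arbitrary choices):
<?php
require 'vendor/autoload.php';

use GuzzleHttp\Client;
use GuzzleHttp\Pool;
use GuzzleHttp\Psr7\Request;

$client = new Client();
$requests = function (array $urls) {
    foreach ($urls as $url) {
        yield new Request('GET', $url);
    }
};

$pool = new Pool($client, $requests($urls), array(
    'concurrency' => 10, // how many requests run in parallel
    'fulfilled' => function ($response, $index) {
        // save the body using your own filename logic
        file_put_contents("feed-$index.xml", (string) $response->getBody());
    },
    'rejected' => function ($reason, $index) {
        // log or retry failed downloads here
    },
));
$pool->promise()->wait(); // block until every request has completed
?>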

PHP Curl Slowness

For some reason my curl call is very slow. Here is the code I used.
$postData = "test"
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, $url);
curl_setopt($ch, CURLOPT_POST, true);
curl_setopt($ch, CURLOPT_POSTFIELDS, $postData);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_HEADER, false);
$result = curl_exec($ch);
Executing this code takes on average 250ms to finish.
However, when I just open the URL in a browser, Firebug says it only takes about 80ms.
Is there something I am doing wrong? Or is this the overhead associated with PHP cURL?
It's the call to curl_exec that is taking up all the time.
UPDATE:
So I figured out right after I posted this that if I set the curl option
curl_setopt($ch, CURLOPT_POSTFIELDS, $postData);
it significantly slows down curl_exec. The post data could be anything and it will still slow it down. Even if I set
curl_setopt($ch, CURLOPT_POST, false);
it's still slow.
I'll try to work around it by just adding the parameters to the URI as a query string.
SECOND UPDATE:
Confirmed that if I just call the URI using GET and pass the parameters as a query string, it is much faster than using POST and putting the parameters in the body.
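For what it's worth, a classic cause of exactly this POST-only slowness (offered as a likely explanation, not something confirmed in this thread) is that libcurl adds an Expect: 100-continue header to POST requests and waits for the server's interim response before sending the body; badly behaved servers let that wait run into a timeout. Suppressing the header often removes the delay:
curl_setopt($ch, CURLOPT_HTTPHEADER, array('Expect:'));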
cURL sometimes has problems with DNS lookups. Try using the IP address instead of the domain name.
Curl can tell you exactly how long each phase took and where the slowness is (name lookup, connect, transfer time). Use curl_getinfo (http://www.php.net/manual/en/function.curl-getinfo.php) after you run curl_exec.
If curl is slow, it is generally not the PHP code; it's almost always network related.
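A small sketch of that timing breakdown (all values are in seconds):
$result = curl_exec($ch);
printf(
    "dns: %.3fs  connect: %.3fs  first byte: %.3fs  total: %.3fs\n",
    curl_getinfo($ch, CURLINFO_NAMELOOKUP_TIME),
    curl_getinfo($ch, CURLINFO_CONNECT_TIME),
    curl_getinfo($ch, CURLINFO_STARTTRANSFER_TIME),
    curl_getinfo($ch, CURLINFO_TOTAL_TIME)
);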
Try this:
curl_setopt($ch, CURLOPT_IPRESOLVE, CURL_IPRESOLVE_V4);
Adding "curl_setopt($ch, CURLOPT_POSTREDIR, CURL_REDIR_POST_ALL);" solved here. Any problem with this solution?
I just resolved this exact problem by removing the following two options:
curl_setopt($ch, CURLOPT_POST, true);
curl_setopt($ch, CURLOPT_POSTFIELDS, $postData);
Somehow, on the site I was fetching, the POST request took over ten full seconds. With GET, it's less than a second.
So... in my wrapper function that does the cURL requests, it now only sets those two options when there is something in $postData.
I just experienced a massive speed-up through compression. By setting the Accept-Encoding header to "gzip, deflate", or to all formats which Curl supports, my ~200MB download took 6s instead of 20s:
curl_setopt($ch, CURLOPT_ENCODING, '');
Notes:
If an empty string, "", is set, a header containing all supported encoding types is sent.
You do not even have to care about decompression after the download, as this is done by Curl internally.
CURLOPT_ENCODING requires Curl 7.10+
The curl functions in PHP use libcurl, the same library that powers the curl command line tool. Performance therefore really depends on network speed, since curl itself is generally much faster than a web browser: by default it does not load any additional data like the images and stylesheets included in a website.
It might also be that the network performance of the server on which you were testing your PHP script is far worse than that of the local computer where you were testing with the browser. In that case the two measurements are not really comparable.
Generally, that's acceptable when you are loading content from, or posting to, a slower part of the world. cURL call times are directly proportional to your network speed and the throughput of the remote webserver.

Printing RSS using PHP?

I am trying to use an RSS feed from a domain that does not have a crossdomain file. Because of that, I am going to use a web service in the middle, where I will just fetch the RSS feed from a URL (let's say: www.example.com/feed) and print it to a page.
The service would work like www.mywebservice.com/feed.php?word=something, which would just print the RSS feed from www.example.com/feed?q=word.
I used:
<?php
$word = $_GET["word"];
$ch = curl_init("http://example.com/feed.php?word=" . $word);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_HEADER, 0);
$data = curl_exec($ch);
curl_close($ch);
print $data;
?>
But this did not work; it gives me "SYSTEM ERROR: we're sorry but a serious error has occurred in the system". I am on shared hosting.
Any help?
The readfile function reads a file and writes it to the output buffer.
readfile('http://example.com/feed.rss');
A URL can be used as a filename with this function if the fopen wrappers have been enabled. See fopen() for more details on how to specify the filename. See the List of Supported Protocols/Wrappers for links to information about what abilities the various wrappers have, notes on their usage, and information on any predefined variables they may provide.
If you need to do anything with the XML, use one of PHP's many XML libraries, preferably DOM, but there is also SimpleXml or XMLReader. As an alternative, you could use Zend_Feed from the Zend Framework as a standalone component to work with the RSS feed.
If you cannot enable allow_url_fopen on your server, try cURL like Matchu suggested or go with Artefacto's suggestion.
Consider doing this with mod_rewrite (using the P flag) or setting up a reverse proxy with ProxyPass.
Since you say you can't do the fancy URL-file-opening shortcuts due to server restrictions, you will need to use PHP's cURL module to send an HTTP request.
If you want to also parse the XML and process it further, be sure to look into SimpleXML. It lets you parse and manipulate the feed.
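For instance, a minimal sketch that prints the item titles out of the fetched feed with SimpleXML ($data is the string returned by curl_exec above; the channel/item layout is the standard RSS 2.0 structure):
$feed = simplexml_load_string($data);
foreach ($feed->channel->item as $item) {
    echo $item->title, "\n";
}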
So at the end I ended up doing this:
$word = $_GET["word"];
$url = "http://www.example.com/feed.php?q=".$word;
$curl = #curl_init ($url);
#curl_setopt ($curl, CURLOPT_HEADER, FALSE);
#curl_setopt ($curl, CURLOPT_RETURNTRANSFER, TRUE);
#curl_setopt ($curl, CURLOPT_FOLLOWLOCATION, TRUE);
#curl_setopt ($curl, CURLOPT_USERAGENT, $_SERVER['HTTP_USER_AGENT']);
$source = #curl_exec ($curl);
#curl_close ($curl);
print $source;
I hope this is considered an answer, not an edit (if it is an edit, please tell me so I can delete this answer and edit the post).
