I would like to know why it is better to use cURL instead of other methods like
$ret = file($url) in PHP.
This is actually used to access an HTTP API for an SMS gateway.
Someone dropped a "I would recommend using cURL for HTTP connections" on me, but I don't know why.
I just read that it is necessary for PayPal payments, so that does sound interesting.
I did a Google search for "why use libcurl", but I haven't found an answer yet.
Could someone explain, please?
I think the FAQ on the curl site says it best:
1.2 What is libcurl?

libcurl is a reliable and portable library which provides you with an easy interface to a range of common Internet protocols.

You can use libcurl for free in your application, be it open source, commercial or closed-source.

libcurl is most probably the most portable, most powerful and most often used C-based multi-platform file transfer library on this planet - be it open source or commercial.
Also, cURL can download data from multiple sources simultaneously, which is more efficient than, say, file_get_contents() or file().
Well, I don't know much about the other methods of doing HTTP calls in PHP, so I'm not sure if they can do this or not, but cURL can mimic a web browser in pretty much every way by setting headers, even the User-Agent header, so that the web server just thinks it's a browser. That can be important, as some sites will try to block access from anything that isn't a traditional browser.
The cURL extension has a lot of options that you can set, for example the connection timeout. You can also add POST variables or access the site with a specific referer. I also recommend using cURL.
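For instance, a minimal sketch combining those options (the endpoint URL and POST field names are made-up placeholders):

```php
<?php
// Hypothetical SMS-gateway endpoint, for illustration only.
$ch = curl_init('https://sms.example.com/api/send');

curl_setopt_array($ch, [
    CURLOPT_RETURNTRANSFER => true,          // return the response instead of printing it
    CURLOPT_CONNECTTIMEOUT => 5,             // give up connecting after 5 seconds
    CURLOPT_TIMEOUT        => 10,            // overall request timeout
    CURLOPT_POST           => true,
    CURLOPT_POSTFIELDS     => http_build_query(['to' => '+15555550100', 'msg' => 'Hi']),
    CURLOPT_REFERER        => 'https://example.com/',
    CURLOPT_USERAGENT      => 'Mozilla/5.0', // present ourselves as a browser
]);

$response = curl_exec($ch);
if ($response === false) {
    echo 'cURL error: ' . curl_error($ch) . "\n";
}
curl_close($ch);
```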
Related
I'm building a PHP script that checks if a URL is reachable. This is currently achieved with cURL. This script is part of a bigger application that calls this cURL script very often.
So I am wondering if there is a faster/better way than cURL. So far I couldn't find anything except this article from this question: What is faster than curl?
This doesn't help me since I have the following requirements for the check:
Must follow redirects
Must return an HTTP status code (like 200 or 404)
Must be able to authenticate (htaccess)
EDIT:
Must work in PHP safemode
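A sketch of such a check with cURL, covering those requirements (the function name is made up; note that CURLOPT_FOLLOWLOCATION cannot be set when safe mode or open_basedir is active, so under safe mode redirects would have to be followed manually):

```php
<?php
// Sketch: return the final HTTP status code for $url, or false on failure.
// $user/$pass are optional HTTP Basic auth credentials (htaccess).
function checkUrl($url, $user = '', $pass = '')
{
    $ch = curl_init($url);
    curl_setopt_array($ch, [
        CURLOPT_NOBODY         => true,  // HEAD request: we only need the status code
        CURLOPT_RETURNTRANSFER => true,
        CURLOPT_FOLLOWLOCATION => true,  // follow redirects (not allowed in safe mode!)
        CURLOPT_MAXREDIRS      => 10,
        CURLOPT_TIMEOUT        => 10,
    ]);
    if ($user !== '') {
        curl_setopt($ch, CURLOPT_USERPWD, "$user:$pass");
    }
    if (curl_exec($ch) === false) {
        curl_close($ch);
        return false;
    }
    $code = curl_getinfo($ch, CURLINFO_HTTP_CODE); // e.g. 200 or 404
    curl_close($ch);
    return $code;
}
```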
As per the comment, the cause of slowness isn't the cURL library. There's a whole system behind HTTP: DNS, the web server(s), the connectivity to those servers, and ultimately every site/service uses some sort of server-side language to perform work, which can be slow as well. You need to find out what exactly in this chain isn't responding fast enough - is it DNS? Is it the connection from your server to the service you're connecting to? Is the service itself slow?
You can be 100% sure that it's not cURL.
What's the best method for posting some data from a server-side script to a PHP web app on another server?
I have control over both ends, but I need to keep it as clean as possible.
I'm hoping people don't mistake this as a request for code, I'm not after anything like that, just a suitable method, even the name of a technology is good enough for me. (FYI the recipient web app will be built in Yii which supports REST if that matters).
Use cURL: http://curl.haxx.se
If you're calling from a PHP script, you can use PHP's cURL extension: https://php.net/curl
Probably best to do it over SSL, if you want to keep the info safe.
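For example, a minimal sketch of posting a JSON payload over HTTPS with cURL (the endpoint URL and payload are made up):

```php
<?php
// Sketch: POST a JSON payload from one server to another over HTTPS.
$payload = json_encode(['event' => 'signup', 'user_id' => 42]);

$ch = curl_init('https://app.example.com/api/receive'); // placeholder endpoint
curl_setopt_array($ch, [
    CURLOPT_RETURNTRANSFER => true,
    CURLOPT_POST           => true,
    CURLOPT_POSTFIELDS     => $payload,
    CURLOPT_HTTPHEADER     => ['Content-Type: application/json'],
]);
$response = curl_exec($ch);
if ($response === false) {
    echo 'cURL error: ' . curl_error($ch) . "\n";
}
curl_close($ch);
```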
Most of the answers here mention cURL, which is fine for smaller use-cases. However if you have more complex and/or growing needs, or plan to open up access to other servers in the future, you might want to consider creating and consuming a web service.
This article makes a somewhat compelling argument for RESTful web services over SOAP-based, but depending on who will be consuming the service, a SOAP-based web service can be both simple to consume (How to easily consume a web service from PHP) and set up (php web service example). Consuming a RESTful web service is easily done via cURL (Call a REST API in PHP).
The choice really comes down to scope and your consuming audience.
You can access your REST API with PHP's cURL extension.
You will find examples here.
If you use a framework, some have alternatives to cURL that are easier to handle (like the Zend HTTP client).
Or for very simple purposes (and if your PHP settings allow it), you could use file_get_contents().
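For example (the URL is a placeholder, and this needs allow_url_fopen to be enabled):

```php
<?php
// Simplest possible HTTP GET without cURL (requires allow_url_fopen=On).
// @ suppresses the PHP warning if the placeholder host is unreachable.
$body = @file_get_contents('https://api.example.com/status');

if ($body === false) {
    echo "Request failed\n";
} else {
    echo $body;
}
```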
I have created many websites that get all their data from an API
(located on the same server in most cases).
The websites are operating really slowly because of all the cURL requests.
I first thought it was our MySQL server (a separate server), but now that we have implemented caching it's still slow.
Is there a good way to find out why the cURL requests take so long?
And what would be a good way to go?
You could use a browser REST client to point at your API and use a profiling tool (Xdebug/XHProf) to find the source of the bottleneck.
Also make sure the API calls resolve locally and don't go all the way out to the internet before coming back in (though that may not shave off much time).
I would recommend starting with the API's code.
The problem is not the cURL request.
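One quick way to confirm that is cURL's own timing breakdown via curl_getinfo() (the URL below is a placeholder):

```php
<?php
// Sketch: break down where the time goes in a single cURL request.
$ch = curl_init('https://api.example.com/endpoint'); // placeholder URL
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_exec($ch);

$info = curl_getinfo($ch);
curl_close($ch);

printf("DNS lookup:  %.3fs\n", $info['namelookup_time']);
printf("TCP connect: %.3fs\n", $info['connect_time']);
printf("First byte:  %.3fs\n", $info['starttransfer_time']);
printf("Total:       %.3fs\n", $info['total_time']);
// If total_time dwarfs connect_time, the API itself is slow,
// not cURL or the network.
```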
I'm currently in the process of building/implementing a logging system for a website I'm working on that's in PHP. The way the logging system works is that I send a JSON request to localhost and that JSON gets logged (basically, anyway).
My question is:
what's the fastest way I can make a quick fire and forget call with a JSON POST? Is there a way to fire and forget with cURL?
There are multiple ways to do it:

You could use the curl_multi functionality of the php_curl extension, which allows you to send asynchronous HTTP requests using cURL, but this requires that extension.

GuzzlePHP provides a large object-oriented wrapper around much of cURL's functionality, including the curl_multi features.

PHP's sockets also support asynchronous communication; a library which implements this for the HTTP protocol is available here (the client is written in "pure" PHP, has no dependency on cURL, supports asynchronous requests, and fully complies with the HTTP 1.1 spec).
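A common near-fire-and-forget trick with plain cURL is to set a very short timeout and ignore the resulting error (the URL is a placeholder; note the request may be cut off if the payload is large):

```php
<?php
// Sketch: send a logging POST and don't wait for the response.
$ch = curl_init('http://localhost/log.php'); // placeholder logging endpoint
curl_setopt_array($ch, [
    CURLOPT_POST           => true,
    CURLOPT_POSTFIELDS     => json_encode(['event' => 'pageview']),
    CURLOPT_HTTPHEADER     => ['Content-Type: application/json'],
    CURLOPT_RETURNTRANSFER => true,
    CURLOPT_TIMEOUT_MS     => 100,  // bail out almost immediately
    CURLOPT_NOSIGNAL       => true, // needed for sub-second timeouts on some builds
]);
curl_exec($ch); // usually "fails" with a timeout -- that's expected here
curl_close($ch);
```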
If you are looking for a fire and forget logging solution you might want to look at something that uses UDP protocol like Graylog.
You could use a small image that hits a PHP script. The php script logs the hit and returns a tiny 1x1 transparent GIF. Then the logging will happen after the page loads.
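The logging script behind that image could look roughly like this (the log path is a placeholder):

```php
<?php
// Sketch of the logging pixel: record the hit, then emit a 1x1 transparent GIF.
file_put_contents('/tmp/hits.log', ($_SERVER['REQUEST_URI'] ?? '/') . "\n", FILE_APPEND);

header('Content-Type: image/gif');
// A 43-byte 1x1 transparent GIF, base64-encoded.
echo base64_decode('R0lGODlhAQABAIAAAAAAAP///yH5BAEAAAAALAAAAAABAAEAAAIBRAA7');
```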
I currently use pretty much exclusively the PHP stream context functionality (see http://us2.php.net/manual/en/function.stream-context-create.php) to access HTTP resources, and I've been able to successfully use it to do PUTs, DELETEs, POSTs, manage cookies and do just about everything I've needed to do. I originally started using it because I had SSL issues with earlier Debian PHP cURL builds (there was an OpenSSL double-initialization issue within the Apache process that errored out when trying to access SSL URLs): those are probably fixed now, but I've not had occasion to go back.
In discussions with a friend he contended that the cURL api is faster/better so I wanted to ask: is there any definite experience/knowledge about which option is superior, and in what ways?
Streams are pretty neat in my experience. You probably know it already, but here's a post on streams with a twist in case not:
http://fabien.potencier.org/article/44/php-iterators-and-streams-are-awesome
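For comparison, here's a sketch of a PUT via a stream context, no cURL involved (the URL and payload are made up):

```php
<?php
// Sketch: a PUT request using PHP's stream context.
$context = stream_context_create([
    'http' => [
        'method'        => 'PUT',
        'header'        => "Content-Type: application/json\r\n",
        'content'       => json_encode(['name' => 'example']),
        'ignore_errors' => true, // fetch the body even on 4xx/5xx responses
    ],
]);

// @ suppresses the warning if the placeholder host is unreachable.
$body = @file_get_contents('https://api.example.com/items/1', false, $context);
// On success, $http_response_header holds the response headers,
// including the status line.
```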
Curl is nice and fast, and simple; but I honestly wouldn't prefer one or the other for performance reasons. I've never measured but I doubt it makes that much of a difference in comparison with the overhead of doing a remote request in the first place.
In regards to performance, cURL wins by a lot consistently. I won't deny that it's harder to use and it might not matter for general use, but the difference was pretty dramatic and I thought it was worth pointing out.