I'm currently in the process of building a logging system for a website I'm working on that's in PHP. The way the logging system works is that I send a JSON request to localhost and that JSON gets logged (basically, anyway).
My question is:
What's the fastest way I can make a quick fire-and-forget call with a JSON POST? Is there a way to fire and forget with cURL?
There are multiple ways to do it. You could use the curl_multi functionality of the php_curl extension, which lets you send asynchronous HTTP requests through cURL, but it requires that extension. GuzzlePHP provides an object-oriented wrapper around much of cURL's functionality, including the curl_multi features, if you are looking for that approach. PHP's sockets also support asynchronous communication; a library which implements this for the HTTP protocol is available here (the client is written in "pure" PHP, has no dependency on cURL, supports asynchronous requests and fully complies with the HTTP 1.1 spec).
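A minimal sketch of the curl_multi route, assuming a hypothetical local endpoint at http://localhost/log.php (the URL, payload and timeout are made up):

```php
<?php
// curl_multi sketch: start the POST and poll it without blocking the rest of
// the script on the response. Endpoint and payload are placeholders.
$payload = json_encode(['event' => 'page_view', 'time' => time()]);

$ch = curl_init('http://localhost/log.php');
curl_setopt_array($ch, [
    CURLOPT_POST           => true,
    CURLOPT_POSTFIELDS     => $payload,
    CURLOPT_HTTPHEADER     => ['Content-Type: application/json'],
    CURLOPT_RETURNTRANSFER => true,   // keep the response out of the output buffer
    CURLOPT_TIMEOUT_MS     => 200,    // don't let logging hold the page hostage
]);

$mh = curl_multi_init();
curl_multi_add_handle($mh, $ch);

// The first exec call starts the transfer; you can keep doing other work and
// come back to the loop later, or drive it to completion here.
$running = 0;
do {
    curl_multi_exec($mh, $running);
    curl_multi_select($mh, 0.01);
} while ($running > 0);

curl_multi_remove_handle($mh, $ch);
curl_multi_close($mh);
```

With the very short CURLOPT_TIMEOUT_MS, the request is effectively abandoned if the logger is slow, which is about as close to fire-and-forget as plain HTTP gets.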
If you are looking for a fire-and-forget logging solution, you might want to look at something that uses the UDP protocol, like Graylog.
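A UDP write really is fire-and-forget; a sketch, assuming a collector listening on 127.0.0.1:12201 (the port Graylog typically uses for GELF over UDP; adjust to your setup):

```php
<?php
// UDP is connectionless, so this never waits for a response. Host, port and
// message fields are assumptions; adapt them to your collector.
$message = json_encode(['short_message' => 'user signed up', 'level' => 6]);

$sock = @fsockopen('udp://127.0.0.1', 12201, $errno, $errstr, 1);
if ($sock !== false) {
    fwrite($sock, $message);
    fclose($sock);
}
// If the collector is down, the datagram is simply lost -- that's the trade-off.
```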
You could use a small image that hits a PHP script. The PHP script logs the hit and returns a tiny 1x1 transparent GIF, so the logging happens after the page loads.
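A rough sketch of that tracking-pixel script (the file name and log destination are invented):

```php
<?php
// log.php -- hypothetical tracking-pixel endpoint.
// Record whatever you need; error_log() with a file target is the simplest thing that works.
error_log(json_encode([
    'ts'  => time(),
    'ip'  => $_SERVER['REMOTE_ADDR'] ?? null,
    'ref' => $_SERVER['HTTP_REFERER'] ?? null,
]) . PHP_EOL, 3, __DIR__ . '/hits.log');

// Return a 1x1 transparent GIF so the <img> tag has something to render.
header('Content-Type: image/gif');
echo base64_decode('R0lGODlhAQABAIAAAAAAAP///yH5BAEAAAAALAAAAAABAAEAAAIBRAA7');
```

The page would then embed it with something like `<img src="/log.php" width="1" height="1" alt="">`.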
Related
I'm writing a gateway script in PHP which connects to a remote server, obtains some information and returns it for JSON usage (no JSONP possibility).
This gateway is being requested every second, so it's very important for curl to use keep-alive. From what I've learned, curl will do it automatically if we use the same handle across multiple requests.
The question is: how do I store the handle between two reloads? It's not possible to store the handle resource in the session, and it can't be serialized either.
Or maybe there's other way to ensure keep-alive in curl?
Generally speaking, every request exists independent of every other request. Connections and other resources are not pooled between requests.
There are some possible solutions, though:
Use a proxy with content adaptation (Squid and Greasyspoon would work here); this takes some work to set up, but you will be able to write scripts in Java, JavaScript or Ruby to adapt your content.
Run your PHP script as a daemon, sort of like a webserver. This would take a bit of engineering, but it can be done with PHP; you would be getting into sockets and threading (see the sketch after the link below).
You might be able to use this as a starting point: http://nanoweb.si.kz/
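To illustrate why the daemon route helps: inside a single long-lived PHP process, reusing one cURL handle is all libcurl needs to keep the connection alive between requests. A sketch, with a hypothetical URL and a trivialized main loop:

```php
<?php
// Inside a long-running worker/daemon, one handle reused across calls lets
// libcurl keep the TCP connection open between requests (keep-alive).
$ch = curl_init();
curl_setopt_array($ch, [
    CURLOPT_URL            => 'http://remote.example.com/info',  // placeholder
    CURLOPT_RETURNTRANSFER => true,
]);

while (true) {
    $body = curl_exec($ch);   // same handle, so the connection is reused when possible
    // ... hand $body off to whoever asked the gateway for it ...
    sleep(1);                 // the gateway is polled every second
}
// curl_close($ch) when the daemon shuts down.
```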
Does anyone have any solutions for accomplishing asynchronous cross-domain GET requests? I am looking to make a site that checks the available names of other sites. The faster the better.
I'd like it to use my server if possible, as it's most likely faster than the client. I would most likely send it a huge array (300-10,000) of requests.
Examples, links, anything will work.
You would have to make a same-domain get request to your server, and have your PHP script do the checking (maybe using CURL) before responding to the request.
http://www.php.net/manual/en/curl.examples-basic.php
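A sketch of that server-side check, using curl_multi to query several candidate names in parallel (the endpoint name, input format and URL pattern are all placeholders):

```php
<?php
// check.php -- hypothetical same-domain endpoint the browser calls.
// Fans out to the remote sites in parallel and returns JSON with the HTTP status codes.
$names = $_GET['names'] ?? ['example', 'foobar'];   // placeholder input

$mh = curl_multi_init();
$handles = [];
foreach ($names as $name) {
    $ch = curl_init('http://' . $name . '.com/');
    curl_setopt_array($ch, [
        CURLOPT_NOBODY         => true,   // HEAD-style request; we only need the status
        CURLOPT_RETURNTRANSFER => true,
        CURLOPT_TIMEOUT        => 5,
    ]);
    curl_multi_add_handle($mh, $ch);
    $handles[$name] = $ch;
}

$running = 0;
do {
    curl_multi_exec($mh, $running);
    curl_multi_select($mh);
} while ($running > 0);

$result = [];
foreach ($handles as $name => $ch) {
    $result[$name] = curl_getinfo($ch, CURLINFO_HTTP_CODE);
    curl_multi_remove_handle($mh, $ch);
    curl_close($ch);
}
curl_multi_close($mh);

header('Content-Type: application/json');
echo json_encode($result);
```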
Do you want to perform the Cross-Domain Check using JavaScript or using PHP?
If using JavaScript you will probably be restricted by the Same-Origin Policy, though some pages may allow your browser to access them using Ajax.
If using PHP there is no way to perform an asynchronous request, because PHP is synchronous all over.
Maybe a good variant would be to send a request to a Node.JS server from your JavaScript and then let Node.JS get the page without blocking a process?
Check out curl http://us.php.net/curl
Is it possible to download a single file with multiple parallel connections through php?
As far as I know not easily, no.
You would have to build a PHP implementation of whatever protocols or additions there are to the HTTP protocol to allow the download of files through multiple connections.
While that may not be impossible, I've never heard of such an implementation, and even if one exists, it stands to reason that it would be a horrible drain on resources on the server side to do this in PHP.
I recommend you look for solutions on the server side for this (i.e. Apache modules and settings for example).
Not through PHP alone. You may be able to do this using some client-side components (maybe even JavaScript), but I suspect it would be a lot of work. Many companies that distribute software, instead of having the user download installers via HTTP, deliver a small executable with a user interface that then downloads the file with all the optimizations possible. Maybe you could go this way?
You can use the PHP http extension. It has a "range" option for requests, and can wait for incoming request data in the background (open multiple socket connections, non-blocking function calls). I've even seen callbacks mentioned somewhere in there.
You'll also want to look into HttpRequestPool. Don't ask me for examples, though; if it's that important for your case, you'll have to write the processing logic yourself.
Or, from a quick Google search:
Parallel HTTP requests in PHP using PECL HTTP classes [Answer: HttpRequestPool class]
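HttpRequestPool belongs to the old PECL http extension; the same idea can be sketched with plain cURL, issuing Range requests in parallel and stitching the pieces together afterwards. The URL, file size and chunk split below are made up, and the server must honour Range headers:

```php
<?php
// Sketch: download one file over two parallel ranged requests with curl_multi.
$url    = 'http://example.com/big.iso';     // placeholder
$size   = 10 * 1024 * 1024;                 // pretend we already know the length
$half   = (int) ($size / 2);
$ranges = ['0-' . ($half - 1), $half . '-' . ($size - 1)];

$mh = curl_multi_init();
$handles = [];
foreach ($ranges as $range) {
    $ch = curl_init($url);
    curl_setopt_array($ch, [
        CURLOPT_RETURNTRANSFER => true,
        CURLOPT_RANGE          => $range,   // e.g. "0-5242879"
    ]);
    curl_multi_add_handle($mh, $ch);
    $handles[] = $ch;
}

$running = 0;
do {
    curl_multi_exec($mh, $running);
    curl_multi_select($mh);
} while ($running > 0);

// Reassemble in order; each handle returns its byte range.
$file = '';
foreach ($handles as $ch) {
    $file .= curl_multi_getcontent($ch);
    curl_multi_remove_handle($mh, $ch);
    curl_close($ch);
}
curl_multi_close($mh);
file_put_contents('big.iso', $file);
```

Whether this actually helps depends on the remote server and the network path; often a single connection already saturates the link.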
I would like to know why it is better to use curl instead of other methods like $ret = file($url) in PHP.
This is actually used to access an HTTP API for an SMS gateway.
Someone dropped a "I would recommend using curl for http connections", but I don't know why.
I just read that it is necessary for Paypal payments, so that does sound interesting.
I did a Google search for "why use libcurl", but I haven't found an answer yet.
Could someone explain please?
I think the FAQ on the curl site says it best:
1.2 What is libcurl?
libcurl is a reliable and portable library which provides you with an easy interface to a range of common Internet protocols.
You can use libcurl for free in your application, be it open source, commercial or closed-source.
libcurl is most probably the most portable, most powerful and most often used C-based multi-platform file transfer library on this planet - be it open source or commercial.
Also, curl facilitates downloading data from multiple sources simultaneously with better efficiency than, say, file_get_contents() or file().
Well, I don't know much about the other methods of doing HTTP calls in PHP, so I'm not sure whether they can do this or not, but cURL can mimic a web browser in pretty much every way: by setting headers, even the User-Agent header, and so on, so that the web server just thinks it's a browser. That can be important, as some sites will try to block access from anything that isn't a traditional browser.
The cURL extension has a lot of options you can set, for example the connection timeout. You can also add POST variables or access the site with a specific referer. I also recommend you use cURL.
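For illustration, a hedged sketch of the kind of per-request control cURL gives you; the gateway URL, headers and field names are arbitrary examples, not a real API:

```php
<?php
// Arbitrary example values -- the point is how much you can tune per request,
// and how much error detail you get back compared with file() or file_get_contents().
$ch = curl_init('http://api.example-sms-gateway.com/send');
curl_setopt_array($ch, [
    CURLOPT_RETURNTRANSFER => true,
    CURLOPT_CONNECTTIMEOUT => 5,                                   // connection timeout
    CURLOPT_TIMEOUT        => 15,                                  // total timeout
    CURLOPT_USERAGENT      => 'Mozilla/5.0 (compatible; MyApp/1.0)',
    CURLOPT_REFERER        => 'http://www.example.com/',
    CURLOPT_FOLLOWLOCATION => true,
    CURLOPT_POST           => true,
    CURLOPT_POSTFIELDS     => ['to' => '+15551234567', 'msg' => 'hello'],
]);
$response = curl_exec($ch);
if ($response === false) {
    $error = curl_error($ch);   // exactly what went wrong, not just a warning
}
curl_close($ch);
```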
I want to have a PHP script send a XML formatted string to another PHP script that resides on a different server in a different part of town.
Is there any nice, clean way of doing this?
(PHP5 and all the latest software available)
Check out cURL for posting data between pages.
If it were me, I would just POST the xml data to the other script. You could use a socket from PHP, or use CURL. I think that's the cleanest solution, although SOAP is also viable if you don't mind the overhead of the SOAP request, as well as using a library.
I strongly suggest rolling your own RESTful API and avoiding the complexity of SOAP altogether. All you need is the curl extension to handle the HTTP request/response and SimpleXML to build/process the XML. If your data is in a reasonable format, it should be easy to push it into an XML string and submit it as a POST to the other server. That server responds to the request by reading the XML string from the POST var back into an object, and voila! It shouldn't take you all day to whip this out.
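As a rough sketch of that flow (the file names, endpoint and XML elements are all invented): the sender puts the XML into a POST variable with cURL, and the receiver parses it back out with SimpleXML.

```php
<?php
// --- sender.php (hypothetical) ---
$xml = new SimpleXMLElement('<order/>');
$xml->addChild('id', '42');
$xml->addChild('status', 'shipped');

$ch = curl_init('http://other-server.example.com/receiver.php');
curl_setopt_array($ch, [
    CURLOPT_POST           => true,
    CURLOPT_POSTFIELDS     => ['xml' => $xml->asXML()],  // XML travels in a POST var
    CURLOPT_RETURNTRANSFER => true,
]);
$reply = curl_exec($ch);
curl_close($ch);

// --- receiver.php (hypothetical) ---
// $order = simplexml_load_string($_POST['xml']);
// echo $order->id;   // "42"
```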
XML-RPC or SOAP or just a RESTful API
You can use cURL (complex API), the http extension (cleaner), or, if you need to do more complex stuff, you can even use the Scriptable Browser from SimpleTest.