cURL API Extremely Slow - PHP

I have created many websites that get all their data from an API
(located on the same server in most cases).
The websites run really slowly because of all the cURL requests.
I first thought it was our MySQL server (a separate server), but now that we have implemented caching it's still slow.
Is there a good way to find out why the cURL requests take so long?
And what would be a good way to go?

You could use a browser REST client to point at your API and a profiling tool (Xdebug/XHProf) to find the source of the bottleneck.
Also make sure the API calls resolve locally and don't go all the way out to the internet before coming back in (though that may not shave off much time).
I would recommend starting with the API's code.
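If you want to see where the time goes inside each request, cURL itself can report per-phase timings. A minimal sketch (the endpoint URL is just a placeholder):

<?php
// Minimal sketch: time the phases of a single cURL request to see where
// the delay comes from. The URL is a placeholder for your API endpoint.
$ch = curl_init('http://localhost/api/endpoint');
curl_setopt_array($ch, [
    CURLOPT_RETURNTRANSFER => true,
    CURLOPT_FOLLOWLOCATION => true,
]);
curl_exec($ch);
$info = curl_getinfo($ch);
curl_close($ch);

// All values are in seconds.
printf("DNS lookup:         %.4f\n", $info['namelookup_time']);
printf("TCP connect:        %.4f\n", $info['connect_time']);
printf("Time to first byte: %.4f\n", $info['starttransfer_time']);
printf("Total:              %.4f\n", $info['total_time']);

If "time to first byte" dominates, the slowness is in the API's own code rather than in cURL or the network.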

The problem is not the cURL request.

Related

PHP - cross-application communication

I have two PHP applications on my server. One of them has a REST API which I would like to consume and render in the second application. What is a better way than curling the API? Can I somehow ask php-fpm for the data directly, or something like that?
Doing curl and making requests through the webserver seems wrong.
All this happens on a single server - I know it probably doesn't scale well, but it's a small project.
Why use REST if you can access the functions directly?
If everything is on the same server then there is no need for REST, since it makes a somewhat pointless round trip through the webserver.
But if it is already there and you don't care about the overhead (which is fine if there's not much traffic), then use file_get_contents instead of curl; it is easier to use, though I doubt it is noticeably faster or slower - both are fine. A minimal sketch follows below.
You could also run a second webserver (a second virtual host) on a different port for internal use. That way things are nicely separated.
(If everything is on different servers on a local network, then using sockets would be fastest.)
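For completeness, here is roughly what the file_get_contents approach looks like against an internal-only virtual host; the host, port and path are placeholder assumptions:

<?php
// Calling the internal API without cURL: file_get_contents plus a stream
// context. Host, port and path are placeholders for an internal-only vhost.
$context = stream_context_create([
    'http' => [
        'method'  => 'GET',
        'header'  => "Accept: application/json\r\n",
        'timeout' => 2,   // fail fast on an internal call
    ],
]);

$json = file_get_contents('http://127.0.0.1:8081/api/items', false, $context);
if ($json === false) {
    die('Internal API call failed');
}
$items = json_decode($json, true);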
"Doing curl and making requests through the webserver seems wrong." - I disagree with that. You can still achieve what you want using PHP cURL, even if it's on the same server.
I had the same problem, but I solved it by using MySQL to "queue" tasks; a worker could then use any polling method, or PHP could spawn a new server-side worker.
Since the results were stored in the same database, the PHP pages could load the results or the status at any time.
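Roughly, that queue idea looks like this; the table and column names here are invented for illustration, not taken from the original setup:

<?php
// Rough sketch of a MySQL-backed queue. Table and column names
// (tasks, status, payload, result) are made up for illustration.
$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');

// Web request: enqueue the work instead of calling the API inline.
$stmt = $pdo->prepare("INSERT INTO tasks (payload, status) VALUES (?, 'pending')");
$stmt->execute([json_encode(['url' => 'http://example.com/api/report'])]);

// Worker (cron job or long-running CLI script): claim and process one task.
$task = $pdo->query("SELECT * FROM tasks WHERE status = 'pending' ORDER BY id LIMIT 1")->fetch();
if ($task) {
    $pdo->prepare("UPDATE tasks SET status = 'running' WHERE id = ?")->execute([$task['id']]);
    $payload = json_decode($task['payload'], true);
    $result  = file_get_contents($payload['url']);   // the slow call happens here
    $pdo->prepare("UPDATE tasks SET status = 'done', result = ? WHERE id = ?")
        ->execute([$result, $task['id']]);
}
// Any page can now read the status/result back out of the tasks table.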

Is there a faster/better way than cURL to check if a URL is reachable?

I'm building a PHP script that checks whether a URL is reachable. This is currently done with cURL. The script is part of a bigger application that calls it very often.
So I am wondering if there is a faster/better way than cURL. So far I couldn't find anything except the article from this question: What is faster than curl?
That doesn't help me, since I have the following requirements for the check:
Must follow redirects
Must return an HTTP status code (like 200 or 404)
Must be able to authenticate (htaccess)
EDIT:
Must work in PHP safe mode
As per the comments, the cause of slowness isn't the cURL library. There's a whole system behind HTTP - DNS, the web server(s), the connectivity to those servers, and ultimately every site/service uses some sort of server-side language to do its work, which can be slow as well. You need to find out what exactly in this chain isn't responding fast enough - is it DNS? Is it the connection from your server to the service you're calling? Is the service itself slow?
You can be 100% sure that it's not cURL.
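For what it's worth, cURL can already cover most of the listed requirements with a cheap HEAD request. This is only a sketch; note that CURLOPT_FOLLOWLOCATION is restricted under safe_mode, so in that environment redirects would have to be followed manually:

<?php
// Minimal reachability check with cURL: HEAD request, follows redirects,
// returns the final HTTP status code. Note: CURLOPT_FOLLOWLOCATION is
// disabled under safe_mode/open_basedir, so redirects would need manual
// handling there.
function urlStatus($url, $user = null, $pass = null) {
    $ch = curl_init($url);
    curl_setopt_array($ch, [
        CURLOPT_NOBODY         => true,   // HEAD request, skip the body
        CURLOPT_FOLLOWLOCATION => true,
        CURLOPT_MAXREDIRS      => 5,
        CURLOPT_RETURNTRANSFER => true,
        CURLOPT_TIMEOUT        => 5,
    ]);
    if ($user !== null) {
        curl_setopt($ch, CURLOPT_USERPWD, "$user:$pass");  // htaccess (basic) auth
    }
    curl_exec($ch);
    $code = curl_getinfo($ch, CURLINFO_HTTP_CODE);
    curl_close($ch);
    return $code;   // 0 means the host was not reachable at all
}

var_dump(urlStatus('http://example.com/'));   // e.g. int(200)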

What's the best way to handle failed web requests?

I have web requests that fail every now and then, but my application really needs the data that the service provides.
What is the best pattern for retrying the request?
I know there would be issues with cascading failures if I just kept retrying immediately and indefinitely.
I am using the cURL library in PHP.
Google uses an algorithm that retries after 2^retrycount seconds (exponential backoff). I think that is a good algorithm, but if you need the information right now, cache the answer and use the cache until the resource is available again. If it's possible to wait that long, I'd recommend the Google algorithm.
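A sketch of that backoff wrapped around a cURL call; the attempt count and timeout values are arbitrary choices:

<?php
// Sketch of the 2^retrycount backoff described above, wrapped around a
// cURL GET. Numbers (5 attempts, 10 s timeout) are arbitrary choices.
function fetchWithBackoff($url, $maxAttempts = 5) {
    for ($attempt = 0; $attempt < $maxAttempts; $attempt++) {
        $ch = curl_init($url);
        curl_setopt_array($ch, [
            CURLOPT_RETURNTRANSFER => true,
            CURLOPT_TIMEOUT        => 10,
        ]);
        $body = curl_exec($ch);
        $code = curl_getinfo($ch, CURLINFO_HTTP_CODE);
        curl_close($ch);

        if ($body !== false && $code >= 200 && $code < 300) {
            return $body;                  // success
        }
        sleep(pow(2, $attempt));           // wait 1, 2, 4, 8, ... seconds
    }
    return false;                          // fall back to cached data here
}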

What technology should I use for a webpage that constantly requests data from the server

We need to create a web-based frontend for displaying some data. The problem is that the data needs to be updated about once a second.
For me as a web developer, the obvious solution is AJAX.
Unfortunately, one of the purposes of this web frontend is to be displayed inside an embedded browser window which is expected to run constantly for months or even years. That is, months of work with no restart/refresh.
During testing we ran a proof-of-concept interface (which requested a simple set of data every 1.5 s) in Safari for over a month. During this period, Safari's memory usage rose from ~30 MB to over 100 MB.
So we're worried about the stability of such a solution.
I'm wondering if you could recommend any other technique for this task, possibly with less overhead (when requesting simple sets of data, as in our case, I'm afraid the HTTP headers make up a very significant part of the traffic).
I would suggest looking into node.js and the now.js plugin, which allows for realtime updates via WebSockets. It even has support for older browsers: if the browser does not support WebSockets, it will fall back to either a Comet server implementation, AJAX, or an iframe.
It's extremely easy to set up in a Linux environment, and there's ample documentation to get you started.
It works with JavaScript and runs on the Google V8 JavaScript engine, so if you've ever worked with OOP JavaScript, you should be able to pick it up relatively easily.
LINKS:
http://nodejs.org/
http://nowjs.com/
How about Adobe AIR as a front-end? You can use Flash/Flex inside, which have decent garbage collectors, so long running shouldn't be a problem. AIR also allows you to write in XHTML and JavaScript, so it could be a good option if you're only familiar with those technologies.
PHP is not a good choice for this kind of request. Comet seems to be a good way to receive data from the server. You can use, for example, the excellent Tornado (Python) as a backend.
ActionScript allows you to use TCP sockets, so you can write your own protocol for even better performance and use Boost.Asio (C++) or Netty (Java) as a scalable backend.
Maybe WebSockets? Instead of making an AJAX request every X seconds, the server pushes new data as it comes in.
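If a full WebSocket stack is too much, a related push technique is Server-Sent Events, which even a plain PHP script can emit; this is only a sketch (the payload is a placeholder) and each connected client ties up one PHP worker, so it is not a drop-in for high traffic:

<?php
// Server-Sent Events endpoint: the browser opens it once with
// new EventSource('/stream.php') and the server pushes data every second,
// avoiding the per-request header overhead mentioned in the question.
header('Content-Type: text/event-stream');
header('Cache-Control: no-cache');
set_time_limit(0);

while (true) {
    $data = json_encode(['ts' => time()]);   // placeholder payload
    echo "data: {$data}\n\n";
    @ob_flush();
    flush();
    sleep(1);
}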
My personal favorite is PHP 4+, MySQL, and the Apache or lighttpd webserver.
Though I also suggest Python.
I specialize in what you are mentioning. With that said, will you actually be looking at the screen? If not, you should request the page using an HTTP socket or via a wget cronjob on a Linux box.
Yes, the HTTP headers are very important; if you try to strip them out, the webserver will issue a "Bad Request" error.
Let me know what you decide, I have a lot to share :)
I suspect that the problem is not AJAX per se, but using a browser as such: I don't think any of them were made with constant running in mind, and I'm assuming all the (re)loading will end up as some form of extra memory over time.
I think you would be best off consuming your data through something simple you design yourself. You can obviously produce it in the same place (on the server, requestable via HTTP or whatever you like most), but you do not need a complete web browser if your primary goal is "a couple of years of uptime".

Is the PHP cURL API cleaner/faster/better than using streams for HTTP/HTTPS access?

I currently use almost exclusively the PHP stream context functionality (see http://us2.php.net/manual/en/function.stream-context-create.php) to access HTTP resources, and I've been able to use it successfully to do PUTs, DELETEs, POSTs, manage cookies, and do just about everything I've needed to. I originally started using it because I had SSL issues with earlier Debian PHP cURL builds (there was an OpenSSL double-initialization issue within the Apache process that errored out when trying to access SSL URLs); those are probably fixed now, but I've not had occasion to go back.
In discussions with a friend he contended that the cURL API is faster/better, so I wanted to ask: is there any definite experience/knowledge about which option is superior, and in what ways?
Streams are pretty neat in my experience. You probably know it already, but here's a post on streams with a twist in case not:
http://fabien.potencier.org/article/44/php-iterators-and-streams-are-awesome
cURL is nice, fast and simple, but I honestly wouldn't prefer one over the other for performance reasons. I've never measured it, but I doubt it makes much of a difference compared with the overhead of making a remote request in the first place.
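For reference, a POST via a stream context looks roughly like this; the URL and payload are placeholders:

<?php
// Illustration of the stream-context approach the question describes:
// a JSON POST over HTTPS with no cURL involved.
$payload = json_encode(['name' => 'example']);

$context = stream_context_create([
    'http' => [
        'method'        => 'POST',
        'header'        => "Content-Type: application/json\r\n" .
                           "Content-Length: " . strlen($payload) . "\r\n",
        'content'       => $payload,
        'ignore_errors' => true,   // return the body even on 4xx/5xx
    ],
]);

$body   = file_get_contents('https://example.com/api/things', false, $context);
$status = $http_response_header[0];   // e.g. "HTTP/1.1 201 Created"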
Regarding performance, cURL consistently wins by a lot. I won't deny that it's harder to use and that it might not matter for general use, but the difference was pretty dramatic and I thought it was worth pointing out.
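If in doubt, it's easy enough to measure on your own setup; a crude timing sketch (placeholder URL), keeping in mind that reusing the cURL handle gives it connection reuse that file_get_contents does not get:

<?php
// Rough way to settle the streams-vs-cURL question for your own setup:
// time N requests with each approach against the same URL.
$url = 'http://localhost/api/ping';   // placeholder
$n   = 100;

$start = microtime(true);
for ($i = 0; $i < $n; $i++) {
    file_get_contents($url);
}
printf("streams: %.3fs\n", microtime(true) - $start);

$start = microtime(true);
$ch = curl_init($url);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
for ($i = 0; $i < $n; $i++) {
    curl_exec($ch);   // reusing the handle keeps the connection alive
}
curl_close($ch);
printf("curl:    %.3fs\n", microtime(true) - $start);

Much of cURL's advantage in a loop like this comes from that connection reuse, so the gap depends heavily on how your application makes its requests.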
