file_get_contents/curl blocks other clients - php

I use file_get_contents/curl in my PHP script to access an API on another server. This API isn't fast and can take up to 10 seconds to respond.
When I open 2 pages on my web site at the same time, both of which use this API, they load one by one, i.e. I have to wait for the 1st to finish loading before the server starts serving the request for the 2nd page.
I use Apache2 and PHP under Linux.
How can I avoid this behaviour? I don't want to block other clients while one of them accesses this API. Need help!
Thanks.

Yes.
There is this PHP library: http://code.google.com/p/multirequest/ (it's a multithreaded CURL lib).
As another solution, you could write a script that does this in a language that supports threading, like Ruby or Python, and then just call that script from PHP. Seems rather simple.
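If you want to stay in pure PHP, the built-in curl_multi_* functions can also fetch several URLs in parallel without an extra library; a minimal sketch (the API URLs are placeholders):

<?php
// Fire both API requests at once instead of one after the other.
$urls = ['http://api.example.com/a', 'http://api.example.com/b']; // placeholders
$mh = curl_multi_init();
$handles = [];
foreach ($urls as $url) {
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_setopt($ch, CURLOPT_TIMEOUT, 15); // don't wait forever on the slow API
    curl_multi_add_handle($mh, $ch);
    $handles[] = $ch;
}
// Drive all transfers until every one has finished.
do {
    $status = curl_multi_exec($mh, $running);
    if ($running) {
        curl_multi_select($mh); // wait for activity instead of busy-looping
    }
} while ($running && $status === CURLM_OK);
foreach ($handles as $ch) {
    echo curl_multi_getcontent($ch);
    curl_multi_remove_handle($mh, $ch);
    curl_close($ch);
}
curl_multi_close($mh);

With this, the total wait is roughly the slowest single response rather than the sum of all of them.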

Related

PHP - cross-application communication

I have two PHP applications on my server. One of them has a REST API which I would like to consume and render in the second application. Is there a better way than curling the API? Can I somehow ask php-fpm for the data directly, or something like that?
Doing curl and making a request through the webserver seems wrong.
All this happens on a single server. I know it probably doesn't scale well, but it's a small project.
Why use REST if you can access the functions directly?
If everything is on the same server then there is no need for REST, since it makes a somewhat pointless round trip through the webserver.
But if it is already there and you don't care about the overhead (which is fine if there's not much traffic going on), then use file_get_contents instead of curl; it is easier to use, and I doubt one is meaningfully faster or slower than the other; both are fine.
You could also run a second webserver (a second virtualhost) on a different port for internal use. That way things are nicely separated.
(If everything is on different servers on a local network, then using sockets would be fastest.)
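If you go the file_get_contents route, a stream context lets you set a timeout so the consuming application fails fast instead of hanging on the other app; a minimal sketch (the URL and timeout are placeholders):

<?php
// Call the other application's REST API over HTTP with a hard timeout.
$context = stream_context_create([
    'http' => [
        'method'  => 'GET',
        'timeout' => 5, // seconds; fail fast rather than hanging the page
    ],
]);
$json = file_get_contents('http://localhost/other-app/api/items', false, $context);
if ($json !== false) {
    $items = json_decode($json, true); // render $items in the second application
}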
"Doing curl and making a request through the webserver seems wrong" - I disagree with that. You can still achieve what you want using PHP cURL, even if it's on the same server.
I had the same problem, but I solved it by using MySQL to "queue" tasks; a worker could then use any polling method, or PHP could spawn a new server-side worker.
Since the results were stored in the same database, the PHP pages could load the results, or the status, at any time.
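A minimal sketch of that queue idea, assuming a tasks table with id, payload, status and result columns (all names here are made up, and do_slow_api_call stands in for whatever the worker actually does):

<?php
$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');

// Producer: the page enqueues a task instead of doing the slow work itself.
$pdo->prepare("INSERT INTO tasks (payload, status) VALUES (?, 'pending')")
    ->execute([json_encode(['what' => 'fetch-report'])]);

// Worker (separate CLI process): poll for pending tasks and store results.
while (true) {
    $task = $pdo->query("SELECT id, payload FROM tasks WHERE status = 'pending' LIMIT 1")
                ->fetch(PDO::FETCH_ASSOC);
    if ($task) {
        $result = do_slow_api_call(json_decode($task['payload'], true)); // hypothetical helper
        $pdo->prepare("UPDATE tasks SET status = 'done', result = ? WHERE id = ?")
            ->execute([$result, $task['id']]);
    } else {
        sleep(1); // nothing queued; poll again shortly
    }
}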

detecting calls to outside PHP (curl, file get contents ...)

Is there a simple way to detect all outside calls from PHP?
In an open-sourced project I have a lot of 3rd-party scripts. With the help of New Relic I was able to trace long execution times back to some of these scripts making calls home to their own servers.
I don't mind this, but I want to know what data these scripts are sending, and most of all I don't want a slow site when a 3rd-party script's server is down or not accessible.
Is there an easy way to log all curl, file_get_contents, etc. requests?
Thanks!
You are searching for a packet sniffer. Usually you'd use tcpdump and/or Wireshark.
http://wiki.ubuntuusers.de/tcpdump
https://www.wireshark.org/
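For example, to watch the data leaving the server over plain HTTP, an invocation along these lines works (the interface and port are examples; HTTPS payloads are encrypted, so for port 443 you would only see connection metadata):
sudo tcpdump -i any -A 'tcp port 80'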
There are many solutions, but my preferred one is:
Build your own proxy: for example, a dedicated Apache server (it can run on the same IP but a different port) that handles this type of operation. After that, change all of the old URLs to pass through your proxy.
Imagine that you have this in your code: curl_init('www.google.com'); you would change it to: curl_init('http://localhost:8090/CALLS_OUTSIDE_PHP_CONTROLLER.php?url_to_redirect=www.google.com');
The PHP controller running on port 8090 can then do many things: blacklist/whitelist some URLs, do regular URL checks in the background... lots of cool stuff.
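A bare-bones version of that CALLS_OUTSIDE_PHP_CONTROLLER.php might look like this (the log path and timeout are just examples):

<?php
// Log every outgoing call, then forward it and return the upstream response.
$url = $_GET['url_to_redirect'] ?? '';
file_put_contents('/var/log/php_outgoing.log', date('c') . ' ' . $url . "\n", FILE_APPEND);

// This is also the place to enforce a whitelist/blacklist before forwarding.
$ch = curl_init($url);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_TIMEOUT, 10); // keep the site fast when the 3rd party is down
$body = curl_exec($ch);
$code = curl_getinfo($ch, CURLINFO_HTTP_CODE);
curl_close($ch);

http_response_code($code ?: 502);
echo $body;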

How can I make a PHP script run without redirecting to it?

I'm making a simple web-based control panel, and I figured the easiest way to accomplish it would be to have PHP on the 2 machines (one being the web-facing machine, the other being behind a VPN). Basically, I need it so that when I press a button on the site on the externally facing IP of machine 1, it sends a request to the internally facing IP (e.g. 192.168.100.1) of machine 2 and runs the PHP file (test.php plus some $_GET data) there, without actually redirecting the end user to 192.168.100.1, because obviously that would time out since they have no access to it.
If all you want is to make certain internal PHP pages accessible on the external server, you should consider setting up a reverse proxy instead of manually proxying requests with PHP.
See the Apache documentation for an example: http://httpd.apache.org/docs/2.2/mod/mod_proxy.html
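A minimal mod_proxy configuration on the external server might look like the following (the /internal/ path is made up, the internal IP follows the question, and mod_proxy plus mod_proxy_http must be enabled):

# Forward /internal/ on machine 1 to the webserver on machine 2
ProxyPass        /internal/ http://192.168.100.1/
ProxyPassReverse /internal/ http://192.168.100.1/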
Of course this won't work if you do your authentication on the external server and/or need to execute additional PHP code on the external server before/after the internal PHP code. In that case refer to Mihai's or Louis's answer.
You can use cURL to send or forward HTTP requests from machine 1 to machine 2, receive the responses machine 2 gives you, and (if needed) process those responses before showing them to the user.
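For instance, a handler on machine 1 could relay the request roughly like this (the internal URL and test.php come from the question; error handling is kept minimal):

<?php
// Relay the browser's GET parameters to the internal machine and
// print whatever test.php on machine 2 returns.
$query = http_build_query($_GET);
$ch = curl_init('http://192.168.100.1/test.php?' . $query);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_TIMEOUT, 10);
$response = curl_exec($ch);
curl_close($ch);
echo $response !== false ? $response : 'internal request failed';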
You could also use (XML-/JSON-)RPC or SOAP, which would be a bit more elegant and extensible (more commonplace than plain cURL), but it has a steeper learning curve and more setup time/work.
You should also be able to use file_get_contents (which normally supports the http protocol) or http_get, a function designed for simple HTTP GET requests.
It might not be the ideal way, but it should be fairly easy to do.

PHP CLI + Ajax for web terminal

I'm trying to determine the best approach to providing an Ajax-based terminal using PHP. I haven't made an attempt at writing it yet, but having rolled the idea around, the only way I could see it working would be with 2 scripts:
Script 1 handles Ajax communication between the server and the client browser. When a request is made to use the terminal, it connects to (or starts as a service and then connects to) Script 2 via a socket.
Script 2 performs the system calls, passing output back to the Ajax script via the socket.
There are multiple holes I can see in this though, and I'm wondering if anyone has created/seen a set of scripts that can perform these tasks? Any insight would be greatly appreciated!
Thanks :)
Edit: I think I was unclear about a few things. I've found a few scripts that imitate terminals, providing nearly the functionality I'm looking for, such as AjaxPHPTerm (http://sourceforge.net/projects/ajaxphpterm/).
The problem is that I'm trying to find a method that permits interaction with shell scripts. If a script prompts "Press any key to continue" or "Select option [x]", AjaxPHPTerm just hangs or drops out of the shell script.
That's why I started thinking sockets, or streams; some way of forming a direct I/O stream to the system calls.
HTTP is stateless, and AJAX, sockets or any other technology based on pages generated by the server will not magically change that. Whatever tricks you use, it will not be efficient and is simply not worth the effort (in my opinion, at least).
The problem seems to be that AjaxPHPTerm is actually closer to a shell than a terminal (glancing at the code, it seems to do its own CWD handling, and has a simple read-eval-print loop).
Assuming a Posix-compatible OS on the server, the proper way to implement this would probably be to use the pseudo-terminal facility, so that your web terminal appears like a virtual terminal on the system, that running programs can interactively access.
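PHP has no portable pseudo-terminal API, but a sketch with proc_open shows the general direction for interactive I/O with a child process (a real web terminal would additionally have to keep this process alive across Ajax requests, which is the hard part; some PHP builds also accept 'pty' instead of 'pipe' descriptors, but that is not portable):

<?php
// One round of interactive I/O with a shell child process.
$descriptors = [
    0 => ['pipe', 'r'], // child's stdin (we write to it)
    1 => ['pipe', 'w'], // child's stdout (we read from it)
    2 => ['pipe', 'w'], // child's stderr
];
$proc = proc_open('/bin/sh', $descriptors, $pipes);
if (is_resource($proc)) {
    fwrite($pipes[0], "echo hello\n"); // send input, as a terminal would
    fclose($pipes[0]);
    echo stream_get_contents($pipes[1]); // read the child's output
    fclose($pipes[1]);
    fclose($pipes[2]);
    proc_close($proc);
}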

Call PHP code from Python

I'm trying to integrate an old PHP ad management system into a (Django) Python-based web application. The PHP and the Python code are both installed on the same hosts; PHP is executed by mod_php5 and Python through mod_wsgi, usually.
Now I wonder what the best way is to call this PHP ad management code from within my Python code as efficiently as possible (the ad management code has to be called multiple times for each page).
The solutions I came up with so far, are the following:
Write a SOAP interface in PHP for the ad management code and write a SOAP client in Python which then calls the appropriate functions.
The problem I see is that this will slow down the execution of the Python code considerably, since for each page served, multiple SOAP client requests are necessary in the background.
Call the PHP code through os.execvp() or subprocess.Popen() using the PHP command line interface (see the sketch after this list).
The problem here is that the PHP code makes use of the Apache environment ($_SERVER vars and other superglobals). I'm not sure whether this can be simulated correctly.
Rewrite the ad management code in Python.
This will probably be the last resort. This ad management code just runs and runs, and no one who wrote any piece of it is still around :) I'd be quite afraid to do this ;)
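For the subprocess option, a rough sketch (using Python 3's subprocess.run; the script path and the faked $_SERVER keys are pure assumptions, and depending on php.ini's variables_order the PHP script may need to read them via getenv() rather than $_SERVER):

import os
import subprocess

# Fake the parts of the Apache environment the ad code reads.
env = os.environ.copy()
env.update({
    "REQUEST_URI": "/ads/banner",      # illustrative values only
    "HTTP_HOST": "www.example.com",
    "REMOTE_ADDR": "127.0.0.1",
})
result = subprocess.run(
    ["php", "/path/to/admanager/show_ad.php"],  # hypothetical path
    env=env, capture_output=True, text=True, check=True,
)
print(result.stdout)  # the rendered ad markup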
Any other ideas or hints how this can be done?
Thanks.
How about using AJAX from the browser to load the ads?
For instance (using JQuery):
$(document).ready(function() { $("#apageelement").load("/phpapp/getads.php"); })
This allows you to keep your app almost completely separate from the PHP app.
The best solution is to use server-side includes. Most webservers support this.
For example this is how it would be done in nginx:
<!--# include virtual="http://localhost:8080/phpapp/getads.php" -->
Your webserver would then dynamically make a request to your PHP backend and insert the response into the page that goes to the client. No JavaScript necessary, and it's entirely transparent.
You could also use a borderless <iframe>
I've done this in the past by serving the PHP portions directly via Apache. You could either put them in with your media files (/site_media/php/), or, if you prefer to use something more lightweight for your media server (like lighttpd), you can set up another portion of the site that goes through Apache with PHP enabled.
From there, you can either take the ajax route in your templates, or you can load the PHP from your views using urllib(2) or httplib(2). Better yet, wrap the urllib2 call in a templatetag, and call that in your templates.
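A sketch of that template tag idea, assuming Django and Python 3's urllib (the module path, tag name and PHP URL are all made up):

# yourapp/templatetags/ads.py -- hypothetical module path
from urllib.request import urlopen

from django import template
from django.utils.safestring import mark_safe

register = template.Library()

@register.simple_tag
def phpapp_ads(slot):
    # Fetch the rendered ad markup from the PHP app served by Apache.
    try:
        with urlopen("http://localhost/site_media/php/getads.php?slot=" + slot,
                     timeout=2) as resp:
            return mark_safe(resp.read().decode("utf-8"))
    except OSError:
        return ""  # fail quietly rather than breaking the page

In a template you would then do {% load ads %} and place {% phpapp_ads "sidebar" %} wherever an ad should appear.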
