Is there a simple way to detect all outside calls from PHP?
In an open-source project I have a lot of 3rd-party scripts. Using New Relic I was able to trace long execution times back to some of these scripts making calls back to their own servers.
I don't mind this, but I want to know what data these scripts are sending, and above all I don't want the site to slow down when a 3rd-party script's server is down or unreachable.
Is there an easy way to log all curl, file_get_contents, etc. requests?
Thanks!
You are looking for a packet sniffer. Usually you'd use tcpdump and/or Wireshark.
http://wiki.ubuntuusers.de/tcpdump
https://www.wireshark.org/
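For example, to watch the HTTP traffic leaving your server (a sketch: it assumes plain HTTP on port 80 and that eth0 is the right interface on your system):

# -A prints packet payloads as ASCII, -s 0 captures full packets without truncation
tcpdump -i eth0 -A -s 0 'tcp port 80'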
There are many solutions, but my preferred one is:
Build your own proxy, for example a dedicated Apache server (it can run on the same IP but on a different port) that will handle this type of operation. After that, change all of your old URLs to pass through your proxy.
Imagine you have this in your code: curl_init('www.google.com'); you would change it to: curl_init('http://localhost:8090/CALLS_OUTSIDE_PHP_CONTROLLER.php?url_to_redirect=www.google.com');
The PHP controller running on port 8090 can then do many things: blacklist/whitelist certain URLs, log every request, run regular URL checks in the background... lots of useful stuff.
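A rough sketch of what such a controller could look like (the file name, the log path, and the url_to_redirect parameter are just the assumptions from the example above; this version only handles GET requests):

<?php
// CALLS_OUTSIDE_PHP_CONTROLLER.php -- hypothetical logging proxy
$url = isset($_GET['url_to_redirect']) ? $_GET['url_to_redirect'] : '';
if (!preg_match('#^https?://#', $url)) {
    $url = 'http://' . $url;   // tolerate scheme-less URLs like the example above
}

// log who calls what, and when
file_put_contents('/var/log/php_outgoing.log', date('c') . ' ' . $url . "\n", FILE_APPEND);

$ch = curl_init($url);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 2);   // fail fast if the 3rd-party server is down
curl_setopt($ch, CURLOPT_TIMEOUT, 5);          // never let a slow service stall the site
$body = curl_exec($ch);
if ($body === false) {
    header('HTTP/1.0 502 Bad Gateway');
    file_put_contents('/var/log/php_outgoing.log', date('c') . ' FAILED: ' . curl_error($ch) . "\n", FILE_APPEND);
} else {
    echo $body;
}
curl_close($ch);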
Related
I want to separate out the API calls my site makes into another install, as the site has become big and buggy with everything together. I would like to know what ways there are, if any, to make two sites communicate when they are on the same server.
I originally thought I could have the client-facing site just include the models from the API site through a custom loader for CodeIgniter, but I am currently leaning towards the API site using Laravel, which would obviously rule out loading them directly.
Currently some calls use cURL to POST requests; is this the only way? I was hoping to drop the HTTP calls in favour of something more direct.
As I said in my comments to the question, I'm definitely no expert on this kind of stuff, but my original thinking was that IPC-style communication could be done, maybe using named pipes.
PHP does allow for this in its POSIX and process control functions. Simply use posix_mkfifo to create a named pipe, and then you should be able to use fopen, fread, etc. (along with the stream_* functions if you need them) to write to and read from the pipe. However, I'm not sure how well that works with a single writer and multiple readers, and it's also probably quite a large change to your code to replace the HTTP stuff you currently have.
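A minimal sketch of the named-pipe idea (the pipe path and the JSON message format are invented for illustration; requires the POSIX extension):

<?php
// writer.php (client-facing site): create the pipe once, then push a message
$fifo = '/tmp/api_ipc_fifo';               // hypothetical location
if (!file_exists($fifo)) {
    posix_mkfifo($fifo, 0600);
}
$fp = fopen($fifo, 'w');                   // blocks until a reader has the pipe open
fwrite($fp, json_encode(array('action' => 'search', 'q' => 'foo')) . "\n");
fclose($fp);

<?php
// reader.php (API site): block on the pipe and handle messages line by line
$fp = fopen('/tmp/api_ipc_fifo', 'r');
while (($line = fgets($fp)) !== false) {
    $msg = json_decode($line, true);
    // ... dispatch $msg['action'] to the right model here ...
}
fclose($fp);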
So the next possibility is that, if you want to stick with HTTP (and don't mind the small overhead of forming HTTP requests and the headers, etc.), you should ensure that your server is using local sockets to cut down on transport costs. If your web site's domain name is the same as the server's hostname this should already be happening (as your /etc/hosts file will have an entry pointing the hostname to 127.0.0.1). However, if not, all you need to do is add such an entry and, as far as I'm aware, it'll work. At the very worst you could hardcode 127.0.0.1 in your code (and ensure your webserver responds correctly to these requests), of course.
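For example, assuming your site's domain is www.example.com (substitute your own), the /etc/hosts entry would look like:

127.0.0.1    www.example.com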
I'm making a simple web-based control panel, and I figured the easiest way to accomplish it would be to have PHP on the two machines (one being the web-facing machine, the other being behind a VPN). Basically, when I press a button on the site on the externally facing IP of machine 1, it should send a request to the internally facing IP (e.g. 192.168.100.1) of machine 2 and run the PHP file (test.php plus some $_GET data) without actually redirecting the end user to 192.168.100.1, because that would obviously time out since the user has no access to it.
If all you want is to make certain internal PHP pages accessible on the external server, you should consider setting up a reverse proxy instead of manually proxying requests with PHP.
See the Apache documentation for an example: http://httpd.apache.org/docs/2.2/mod/mod_proxy.html
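A minimal example of such a rule, assuming the internal machine is 192.168.100.1 and you want to expose it under /internal/ (both are illustrative):

# on the external server, with mod_proxy and mod_proxy_http enabled
ProxyPass        /internal/ http://192.168.100.1/
ProxyPassReverse /internal/ http://192.168.100.1/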
Of course this won't work if you do your authentication on the external server and/or need to execute additional PHP code on the external server before/after the internal PHP code. In that case refer to Mihai's or Louis's answer.
You can use cURL to send or forward HTTP requests from machine 1 to machine 2, receive the responses machine 2 gives you, and (if needed) process those responses to show them to the user.
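A minimal sketch of that forwarding, reusing the internal address and script name from the question (192.168.100.1, test.php):

<?php
// on machine 1: forward the button press to machine 2 and relay the answer
$qs = http_build_query($_GET);                      // pass the $_GET data along
$ch = curl_init('http://192.168.100.1/test.php?' . $qs);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_TIMEOUT, 10);              // don't hang the panel if machine 2 is down
$response = curl_exec($ch);
curl_close($ch);
echo $response === false ? 'Machine 2 did not answer' : $response;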
You could also use (XML-/JSON-)RPC or SOAP, which would be a bit more elegant and extensible (more commonplace than plain cURL), but it has a steeper learning curve and more setup time/work.
You should also be able to use file_get_contents (the http:// protocol is normally supported) or http_get (from the pecl_http extension), a function designed for simple HTTP GET requests.
It might not be the most ideal way, but it should be fairly easy to do.
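For example (a sketch; the URL and query string are made up to match the question):

<?php
// needs allow_url_fopen enabled; the timeout stops a dead machine 2 from hanging machine 1
$context  = stream_context_create(array('http' => array('timeout' => 10)));
$response = file_get_contents('http://192.168.100.1/test.php?action=status', false, $context);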
I use file_get_contents/cURL to access an API on another server from my PHP script. This API isn't fast and can take up to 10 seconds to respond.
When I try to open 2 pages on my web site at the same time, both of which use this API, they load one by one, i.e. the server doesn't start serving the request for the 2nd page until the 1st has finished loading.
I use Apache 2 and PHP on Linux.
How can I avoid this behaviour? I don't want to block other clients while one of them accesses this API. Need help!
Thanks.
Yes.
There is this PHP library: http://code.google.com/p/multirequest/ (it's a multi-threaded cURL lib).
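If you'd rather not add a dependency, PHP's built-in curl_multi functions can run several requests in parallel within one script; a rough sketch (the endpoints are placeholders):

<?php
// fire both slow API calls at once instead of one after the other
$urls = array('http://api.example.com/a', 'http://api.example.com/b'); // hypothetical endpoints
$mh = curl_multi_init();
$handles = array();
foreach ($urls as $url) {
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_multi_add_handle($mh, $ch);
    $handles[] = $ch;
}
do {
    curl_multi_exec($mh, $running);  // drive all transfers
    curl_multi_select($mh);          // sleep until there is activity
} while ($running > 0);
$results = array();
foreach ($handles as $ch) {
    $results[] = curl_multi_getcontent($ch);
    curl_multi_remove_handle($mh, $ch);
    curl_close($ch);
}
curl_multi_close($mh);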
As another solution, you could write a script that does this in a language that supports threading, like Ruby or Python, and then just call that script from PHP. Seems rather simple.
I have a dedicated webserver running PHP software which needs to automatically collect and update the IPs of a couple of Windows and possibly Linux machines of mine which have dynamic IPs (roughly the same idea as the no-ip.com client). The simplest thing to do, I think, is to run a service on each machine which simply pulls a unique URL from the webserver, which can then look up the client IP and match it with the URL, etc.
$_SERVER["HTTP_CLIENT_IP"]
What's the best language/library/environment for building a client service that can make a URL request, with easy access to the machine's external IP (to check for changes to the dynamic IP so as not to flood the webserver)? It needn't be anything fancy: it doesn't even need to read anything from the server, it just has to make the URL request.
Besides web programming, I have some experience with Python and C and a few other languages. Any pointers or resources I can read up on would be appreciated. Also, am I over-thinking this? Thanks
You can write a shell script with wget calls followed by a sleep in a loop.
wget performs HTTP requests; it's available for Windows and comes preinstalled on most Unix machines.
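A minimal sketch of such a loop (the URL and interval are placeholders):

#!/bin/sh
# pull a unique URL on the webserver every 10 minutes
while true; do
    wget -q -O /dev/null "http://example.com/checkin.php?host=machine1"
    sleep 600
done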
Mechanize is way overkill for making URL requests in Python. It's really easy:
from urllib2 import urlopen
urlopen("http://example.com").read()
The first line may vary depending on your version of Python (for instance, it's urllib.request in Python 3.)
There are hundreds of "What's my IP" services out there; or you could even write your own! Plug one of those into the URL read to get the IP. You'll have to poll to work out when it changes, since you can't run code on the router.
If you already have some kind of method for making a URL request, make one every 10 minutes to this page:
http://checkip.dyndns.org/Current%20IP%20Check.htm
which will tell you "Current IP Address: XXX.XXX.XXX.XXX", so you only have to extract a bit of string data.
(That way you only flood the whatsmyip webserver, and not your own anymore!)
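Since the webserver side is PHP anyway, here is a rough CLI sketch of such a poller (the update URL and token are made up; the regex matches the page's "Current IP Address: ..." text):

<?php
// ip_poller.php -- run from the CLI or as a daemon; reports the external IP only when it changes
$lastIp = '';
while (true) {
    $page = @file_get_contents('http://checkip.dyndns.org/');
    if ($page !== false && preg_match('/Current IP Address: ([0-9.]+)/', $page, $m)) {
        if ($m[1] !== $lastIp) {
            $lastIp = $m[1];
            // hypothetical update endpoint on your own webserver
            @file_get_contents('http://example.com/update_ip.php?token=SECRET&ip=' . $lastIp);
        }
    }
    sleep(600); // 10 minutes
}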
Glad to Help
EGOrecords
I have an n-tier system where a frontend templating layer makes calls out to a backend application server. For instance, I need to retrieve some search results, and the frontend calls the backend to get the results.
Both the templating engine and the appserver are written in PHP. I currently use PhpED to initiate debug sessions against the templating engine; however, when the HTTP request goes out to the remote service my debugger just sits and waits for the I/O to complete.
What I would like to do is emulate the HTTP call but stay inside my PHP process: do a giant push of the environment onto some kind of stack, have my appserver environment load and process the call, then pop the environment and get the result of the HTTP call in a variable (for instance, via an output buffer). I can run both services on the same server. Does anyone have any ideas, or know of libraries that already do this?
Can you not run a debugger and set a breakpoint in the appserver too? Two different debug sessions - one to trap the templating engine call and one to trap the call in the appserver.
You should be able to trace the output from the appserver in the templating engine debugging session.
If it is not possible to run two debug sessions then create some test inputs for the appserver by capturing outputs from the templating engine and use a single debugger with your test appserver inputs.
This is embarrassingly crude, and quite free of any study of how the debugger works, but have you tried adding
debugBreak();
at the entry points to your called routine? (Assuming both processes are running on the same machine.)
I have used this technique to break back into a process called via AMFPHP. I had a PHP file loading a Flash file into the browser, which then called back to PHP using AMFPHP, all on the same server. When I hit the debugBreak() line, PhpED regains control.
Why don't you use an HTTP sniffer? Something like tcpflow.
Alternatively, you could just log the complete XML to a file for each request & response.
Unfortunately it's not clear from your question what you're trying to achieve so these are just guesses. You should probably state more clearly exactly what problem you're trying to solve.
You could possibly refactor the code that calls out to the remote service and use dependency injection and mocks. That would allow you to isolate development of the front end from the back end by supplying "mocked" but valid data.
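A minimal sketch of that idea (the interface and class names are invented for illustration):

<?php
// the templating layer depends on this interface, not on HTTP directly
interface SearchService {
    public function search($query);
}

// real implementation: calls the backend appserver over HTTP
class HttpSearchService implements SearchService {
    public function search($query) {
        return json_decode(file_get_contents(
            'http://appserver.local/search?q=' . urlencode($query)), true);
    }
}

// mock used while debugging the front end: no I/O, instant results
class MockSearchService implements SearchService {
    public function search($query) {
        return array('results' => array('fake hit 1', 'fake hit 2'));
    }
}

// inject whichever one you need
function renderSearchPage(SearchService $service, $query) {
    $data = $service->search($query);
    // ... template rendering here ...
}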
Hope that helps.
Can I assume you're talking about the lack of threads in PHP, so the service call stops the flow of your program and halts the debugger? There are ways around it, but they are hard, cumbersome and hackish.
For example, if you use a framework like Zend for the HTTP traffic, you can hack the HTTP class to use primitive sockets for the service reading/writing instead of the built-in stuff, and create a small task switcher (a loop :) to keep track of what's going on.
You could of course use fopen('http://...') and fread in chunks in a loop as well; that could do the trick, but you need the http:// stream wrapper (allow_url_fopen) turned on.
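A sketch of that chunked-read loop (the URL is a placeholder), which gives your code a chance to do other work between chunks:

<?php
// read the remote response in small chunks so the script can do other work in between
$fp = fopen('http://appserver.local/search?q=foo', 'r');  // hypothetical URL; needs allow_url_fopen
if ($fp === false) {
    die('could not reach the appserver');
}
$body = '';
while (!feof($fp)) {
    $body .= fread($fp, 4096);   // returns whatever has arrived, up to 4 KB
    // ... yield to the "task switcher" loop mentioned above here ...
}
fclose($fp);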
I don't know much about PHP debugging, and I'm not sure I follow 'push of the environment onto some kind of stack', but I wonder if netcat plus some shell scripting could be useful here for troubleshooting?
You can use netcat to:
Spoof an HTTP Request
Act like a webserver (listen on a port - pick a port, any port!)
http://www.plenz.com/netcat-tips
You could use it to stub out a fake webservice on the one end:
echo "<xml .. <node>hello php!</node>" | netcat -lp 80 ... etc
... and you can certainly use it listening on a port to very clearly see what the incoming requests to the webservice look like.
Could you use a shell script with netcat as a middle man that acts like your webservice, immediately returns something generic to keep your PHP happy, then passes the request on to your actual appserver and logs the results?
Super simple.
(Screenshot of netcat acting as a webserver: http://img240.imageshack.us/img240/791/netcat.jpg)
This is not open source, but check out Charles. It works as a proxy and is the best debugging proxy I've seen to date. It runs on Linux, OS X and Windows.
Pretty much any HTTP library will allow you to specify a proxy.
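For example, with PHP's cURL you could route requests through Charles like this (a sketch; the URL is a placeholder, and 8888 is Charles's default listen port as far as I recall):

<?php
// route an outgoing request through a local debugging proxy
$ch = curl_init('http://appserver.local/search?q=foo');  // hypothetical URL
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_PROXY, '127.0.0.1:8888');       // Charles's default port (assumption)
$response = curl_exec($ch);
curl_close($ch);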
PhpED supports parallel debug sessions, meaning you can start debugging the code that issues the request to the initial server and then the inter-server requests too. All you need is to set breakpoints in the corresponding projects and pass the debugger request between the servers. Normally you can do this by re-transmitting the DBGSESSID variable (the debugger request) with its value to the 2nd server. The variable can be found among $_COOKIE and/or $_GET (depending on how you start debugging: from the IDE or using the Debugger Toolbar). To re-transmit the variable to the secondary server(s) you can add it to the POST variables, as a URL parameter, or as a cookie. If you can't do that, e.g. if your server filters everything out of GET/POST/cookies, try to embed a DebugBreak() call.
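For example, the inter-server request could forward the variable like this (a sketch; the inter-server URL is made up):

<?php
// forward PhpED's debugger request variable to the second server
$dbgsessid = isset($_GET['DBGSESSID']) ? $_GET['DBGSESSID']
           : (isset($_COOKIE['DBGSESSID']) ? $_COOKIE['DBGSESSID'] : null);
$url = 'http://appserver.local/api.php?foo=bar';          // hypothetical inter-server call
if ($dbgsessid !== null) {
    $url .= '&DBGSESSID=' . urlencode($dbgsessid);        // lets the IDE open the 2nd session
}
$result = file_get_contents($url);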
Make sure that all your servers can find the IDE by the IP address in the request and are allowed to connect back to the IDE, e.g. that you have the necessary rules in the firewall and in SELinux (by default this security layer is enabled in all modern Linux distributions these days). It took me a day to figure out why my server couldn't connect.
In case a connection from the server to the IDE is not possible (if the workstation with the IDE is in a different network, for example at your home), you can use SSH tunnels. In that case the IDE address is localhost, of course.