I have an n-tier system where a frontend templating layer makes calls out to a backend application server. For instance, I need to retrieve some search results, and the frontend calls the backend to get the results.
Both the templating engine and the appserver are written in PHP. I currently use PhpED to initiate debug sessions against the templating engine; however, when the HTTP request goes out to the remote service, my debugger just sits and waits for the I/O to complete.
What I would like to do is emulate the HTTP call but really just stay inside my PHP process, do a giant push of the environment onto some kind of stack, then have my appserver environment load and process the call. After the call is done, I do an env pop, and get the results of the http call in a var (for instance, via an output buffer). I can run both services on the same server. Does anyone have any ideas or libraries that already do this?
Can you not run a debugger and set a breakpoint in the appserver too? Two different debug sessions - one to trap the templating engine call and one to trap the call in the appserver.
You should be able to trace the output from the appserver in the templating engine debugging session.
If it is not possible to run two debug sessions then create some test inputs for the appserver by capturing outputs from the templating engine and use a single debugger with your test appserver inputs.
This is embarrassingly crude, and quite free of any study of how the debugger works, but have you tried adding
debugBreak();
at the entry points to your called routine? (Assuming both processes are running on the same machine.)
I have used this technique to break back into a process called via AMFPHP. I had a PHP file load a Flash file into the browser, which then called back to PHP using AMFPHP, all on the same server. When I hit the debugBreak() line, PhpED regained control.
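For reference, a minimal sketch of what I mean (the appserver entry file name is just an example):

// at the top of the appserver's entry script (e.g. index.php -- hypothetical name)
if (function_exists('DebugBreak')) {
    DebugBreak();   // PhpED's DBG module should regain control here
}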
Why don't you use an HTTP sniffer? Something like tcpflow.
Alternatively, you could just log the complete XML to a file for each request & response.
Unfortunately it's not clear from your question what you're trying to achieve so these are just guesses. You should probably state more clearly exactly what problem you're trying to solve.
You could possibly refactor the code that calls out to the remote service and use dependency injection and mocks. That would allow you to isolate development of the front end from the back end by supplying "mocked" but valid data.
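A rough sketch of that idea (class and method names are invented for illustration):

interface SearchService {
    public function search($query);             // returns an array of result rows
}

class HttpSearchService implements SearchService {
    public function search($query) {
        // the real implementation: HTTP call out to the appserver
        $json = file_get_contents('http://appserver.local/search?q=' . urlencode($query));
        return json_decode($json, true);
    }
}

class MockSearchService implements SearchService {
    public function search($query) {
        // canned data captured from a real response -- no network, no blocked debugger
        return array(array('title' => 'Example result', 'url' => 'http://example.com'));
    }
}

// the templating layer only depends on the interface:
// $results = $service->search('foo');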
Hope that helps.
Can I assume you're talking about the lack of threads in PHP, so the service stops the flow of your program and halts the debugger? There's ways around it, but they are hard, cumbersome and hackish.
For example, if you use a framework like Zend for the HTTP traffic, you can hack the HTTP class to use primitive sockets for the service reading/writing instead of the built-in stuff, and create a small task switcher (loop :) to track what's going on.
You could of course use fopen('http://...') and fread in chunks in a loop as well; that could do the trick, but you need the HTTP stream wrapper (allow_url_fopen) turned on.
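Something along these lines (untested sketch; the URL is hypothetical):

$fp = fopen('http://appserver.local/search?q=foo', 'r');
if ($fp) {
    stream_set_blocking($fp, false);            // don't let reads block the whole process
    $response = '';
    while (!feof($fp)) {
        $chunk = fread($fp, 8192);
        if ($chunk === '' || $chunk === false) {
            // nothing available yet -- this is where a small task switcher could do other work
            usleep(10000);
            continue;
        }
        $response .= $chunk;
    }
    fclose($fp);
}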
I don't know much about PHP debugging, and I'm not sure I follow 'push of the environment onto some kind of stack', but I wonder if netcat + some shell scripting could be useful here for troubleshooting ?
You can use netcat to:
Spoof an HTTP Request
Act like a webserver (listen on a port - pick a port, any port!)
http://www.plenz.com/netcat-tips
You could use it to stub out a fake webservice on the one end:
echo "<xml .. <node>hello php!</node>" | netcat -lp 80 ... etc
... and you can certainly use it listening on a port to very clearly see what the incoming requests to the webservice look like.
Could you use a shell script with netcat as middle man that acts like your webservice, immediately returns something generic to make your PHP happy, then passes the request on to your actual appserver and logs the results?
Super simple.
(Screenshot of netcat acting as a webserver: http://img240.imageshack.us/img240/791/netcat.jpg)
This is not open source, but check out Charles. It works as a proxy, and is the best debugging proxy I've seen to date. It works on Linux, OS X and Windows.
Pretty much any HTTP library will allow you to specify a proxy.
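For example, with PHP's cURL extension (Charles listens on port 8888 by default; the backend URL is hypothetical):

$ch = curl_init('http://appserver.local/search?q=foo');
curl_setopt($ch, CURLOPT_PROXY, '127.0.0.1:8888');   // route the request through Charles
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
$response = curl_exec($ch);
curl_close($ch);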
PhpED supports parallel debug sessions, meaning you can debug the code that handles the initial request and the inter-server requests at the same time. All you need is to set breakpoints in the corresponding projects and pass the debugger request between the servers. Normally you can do this by re-transmitting the DBGSESSID variable (the debugger request) with its value to the 2nd server. The variable can be found in $_COOKIE and/or $_GET (depending on whether you start debugging from the IDE or using the Debugger Toolbar). To re-transmit the variable to the secondary server(s) you can add it as a POST variable, a URL parameter or a cookie. If you can't do that, e.g. because your server filters everything out of GET/POST/cookies, try embedding a DebugBreak() call.
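As an illustration, the forwarding could look roughly like this (untested sketch; the backend URL is made up):

// in the templating engine: pass the debugger request on to the appserver
$dbgsessid = isset($_GET['DBGSESSID']) ? $_GET['DBGSESSID']
           : (isset($_COOKIE['DBGSESSID']) ? $_COOKIE['DBGSESSID'] : null);

$url = 'http://appserver.local/search.php?q=foo';
if ($dbgsessid !== null) {
    $url .= '&DBGSESSID=' . urlencode($dbgsessid);   // a second debug session starts on the appserver
}
$response = file_get_contents($url);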
Make sure that all your servers can find the IDE by the IP address in the request and are allowed to connect back to it -- i.e. that you have the necessary rules in your firewall and in SELinux (this SE layer is enabled by default in all modern Linux distributions these days). It took me a day to figure out why my server couldn't connect.
If a connection from the server to the IDE is not possible (for example, if the workstation with the IDE is in a different network, such as at your home), you can use SSH tunnels. In that case the IDE address is localhost, of course.
Related
I have two PHP applications on my server. One of them has a REST API which I would like to consume and render in the second application. Is there a better way than curling the API? Can I somehow ask php-fpm for the data directly, or something like that?
Doing curl and making the request through the webserver seems wrong.
All this happens on a single server - I know it probably doesn't scale well, but it's a small project.
Why use REST if you can access the functions directly?
If everything is on the same server then there is no need for REST, since it makes a somewhat pointless round trip through the webserver.
But if it is already there and you don't care about the overhead (if there's not much traffic going on, that's fine), then use file_get_contents instead of curl; it is easier to use, and I doubt there is any real speed difference either way.
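For example (the endpoint URL is hypothetical):

// simple GET against the other application's REST API on the same box
$json  = file_get_contents('http://localhost/otherapp/api/items');
$items = json_decode($json, true);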
You could also use a second webserver (a second virtualhost) on a different port for internal use. That way things are nicely separated.
(If everything is on different servers but within a local network, then using sockets would be fastest.)
"Doing curl and making the request through the webserver seems wrong." - I disagree with that. You can still achieve what you want using PHP cURL, even if it's on the same server.
I had the same problem, but I solved it using MySQL to "queue" tasks; a worker could then use any polling method, or PHP could launch a new server-side worker.
Since the results were stored in the same database, the PHP pages could load the results, or the status, at any time.
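Roughly like this (table and column names are invented for illustration):

// first application: enqueue a task
$pdo  = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');
$stmt = $pdo->prepare('INSERT INTO tasks (payload, status) VALUES (?, "pending")');
$stmt->execute(array(json_encode(array('action' => 'render', 'id' => 42))));

// worker (cron job or long-running script): poll for pending tasks
$task = $pdo->query('SELECT * FROM tasks WHERE status = "pending" LIMIT 1')->fetch(PDO::FETCH_ASSOC);
if ($task) {
    // ... do the work, then store the result and mark the task done
    $pdo->prepare('UPDATE tasks SET status = "done", result = ? WHERE id = ?')
        ->execute(array('some result', $task['id']));
}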
Is there a simple way to detect all outside calls from PHP?
In an open-source project I have a lot of 3rd party scripts. Using New Relic I was able to trace long execution times back to some of these scripts making calls home to their own servers.
I don't mind this, but I want to know what data these scripts are sending, and above all I don't want the site to be slow when a 3rd party script's server is down or unreachable.
Is there an easy way to log all curl, file_get_contents etc. requests?
Thanks!
You are searching for a packet sniffer. Usually you'd use tcpdump and/or Wireshark.
http://wiki.ubuntuusers.de/tcpdump
https://www.wireshark.org/
There are many solutions, but my preferred one is this:
Build your own proxy: for example, a dedicated Apache server (it can run on the same IP but on a different port) that will handle this type of operation. After that, change all of your old URLs to go through your proxy.
Imagine you have this in your code: curl_init('www.google.com'); you would change it to: curl_init('http://localhost:8090/CALLS_OUTSIDE_PHP_CONTROLLER.php?url_to_redirect=www.google.com');
The PHP controller running on port 8090 can then do many things: blacklist/whitelist some URLs, do regular URL checks in the background... lots of cool stuff.
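A rough sketch of what that controller might do (simplified, no error handling; the whitelist and log path are examples):

// CALLS_OUTSIDE_PHP_CONTROLLER.php (served by the proxy vhost on port 8090)
$target = isset($_GET['url_to_redirect']) ? $_GET['url_to_redirect'] : '';

$whitelist = array('www.google.com', 'api.example.com');
if (!in_array(parse_url('http://' . $target, PHP_URL_HOST), $whitelist)) {
    header('HTTP/1.1 403 Forbidden');
    exit('URL not allowed');
}

// log every outgoing call so you can see who talks to whom
error_log(date('c') . ' outgoing call to ' . $target . "\n", 3, '/var/log/outgoing_calls.log');

// forward the request and return the response to the caller
$ch = curl_init('http://' . $target);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_TIMEOUT, 5);   // fail fast if the 3rd party server is down
echo curl_exec($ch);
curl_close($ch);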
I'm making a simple web-based control panel, and I figured the easiest way to accomplish it would be to run PHP on both machines (one being the web-facing machine, the other sitting behind a VPN). Basically, when I press a button on the site on the externally facing IP of machine 1, it should send a request to the internal IP (e.g. 192.168.100.1) of machine 2 and run a PHP file there (test.php plus some $_GET data) without actually redirecting the end user to 192.168.100.1, because that would obviously time out since they have no access to it.
If all you want is to make certain internal PHP pages accessible on the external server, you should consider setting up a reverse proxy instead of manually proxying requests with PHP.
See the Apache documentation for an example: http://httpd.apache.org/docs/2.2/mod/mod_proxy.html
Of course this won't work if you do your authentication on the external server and/or need to execute additional PHP code on the external server before/after the internal PHP code. In that case refer to Mihai's or Louis's answer.
You can use cURL to send or forward HTTP requests from machine 1 to machine 2, receive the responses machine 2 gives you, and (if needed) process those responses to show them to the user (see the rough sketch below).
You could also use (XML-/JSON-)RPC or SOAP which would be a bit more elegant and extensible (more commonplace than using cURL) but it would have a higher learning curve with a bigger setup time/work.
You should also be able to use file_get_contents (normally supporting the http protocol) or http_get, a function designed for simple http get requests.
It might not be the most ideal way, but should be fairly easy to do.
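A bare-bones sketch of the forwarding script on machine 1 (the IP and file name are taken from the question; everything else is illustrative):

// machine 1: forward the button's GET data to machine 2 over the VPN
$url = 'http://192.168.100.1/test.php?' . http_build_query($_GET);

$ch = curl_init($url);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_TIMEOUT, 10);   // don't hang the panel if machine 2 is unreachable
$response = curl_exec($ch);
curl_close($ch);

echo $response;   // show (or further process) machine 2's answer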
I'm trying to determine the best approach to providing an Ajax based terminal using PHP. I haven't made an attempt at writing it yet but having rolled the idea around, the only way I could see it possible, would be 2 scripts:
Script 1 handles the Ajax communication between the server and the client browser. When a request is made to use the terminal, it connects to (or starts as a service and then connects to) Script 2 via a socket.
Script 2 performs the system calls, passing the output back to the Ajax script via the socket.
There are multiple holes I can see in this though, and I'm wondering if anyone has created/seen a set of scripts that can perform these tasks? Any insight would be greatly appreciated!
Thanks :)
Edit: I think I was unclear about a few things. I've found a few scripts that imitate terminals, providing nearly the functionality that I'm looking for, such as AjaxPHPTerm (http://sourceforge.net/projects/ajaxphpterm/)
The problem is that I'm trying to find a method that permits interaction with shell scripts. If a script prompts "Press any key to continue" or "Select option [x]", AjaxPHPTerm just hangs or drops out of the shell script.
That's why I started thinking sockets, or streams; some way of forming a direct I/O stream to the system calls.
HTTP is stateless, and AJAX, sockets or any other technology based on pages generated by the server will not magically change that. Whatever tricks you use, it will not be efficient and is simply not worth the effort (in my opinion, at least).
The problem seems to be that AjaxPHPTerm is actually closer to a shell than a terminal (glancing at the code, it seems to do its own CWD handling, and has a simple read-eval-print loop).
Assuming a Posix-compatible OS on the server, the proper way to implement this would probably be to use the pseudo-terminal facility, so that your web terminal appears like a virtual terminal on the system, that running programs can interactively access.
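PHP itself does not make pseudo-terminal handling easy. As a rough approximation you can at least talk to a child process over pipes with proc_open() and stream_select(); a sketch under that assumption (the script path is hypothetical, and pipes are not a real pty, so fully interactive programs may still misbehave):

$descriptors = array(
    0 => array('pipe', 'r'),   // child's stdin
    1 => array('pipe', 'w'),   // child's stdout
    2 => array('pipe', 'w'),   // child's stderr
);
$proc = proc_open('/bin/sh /path/to/script.sh', $descriptors, $pipes);

if (is_resource($proc)) {
    // read whatever the script prints first (e.g. "Select option [x]")
    $read = array($pipes[1]);
    $write = $except = null;
    if (stream_select($read, $write, $except, 1)) {
        $prompt = fread($pipes[1], 4096);
    }

    // send the user's answer back to the script
    fwrite($pipes[0], "x\n");

    fclose($pipes[0]);
    fclose($pipes[1]);
    fclose($pipes[2]);
    proc_close($proc);
}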
I've got a nice question here :)
I need to debug my web service written in PHP. Its client is written in C#.
After a couple of days of searching I realized this is not an easy task. At least it seems nobody knows the right solution.
What is the problem, actually?
There are two popular PHP debugging tools: the PHP debugger (DBG) from NuSphere and the Xdebug extension.
The problem is that both are controlled via the URL query string or with the help of cookies. For example, to enable debugging with the NuSphere debugger you need to add a ?DBGSESSID=xxx parameter to your URL or to have a DBGSESSID cookie.
But when your web service is called from an external client, the client doesn't have that cookie and doesn't add a DBGSESSID URL parameter. So how can we debug in this situation?
PS. I don't want to write to log files, see request and response headers/data or something like this. I want normal step-by-step debugging and breakpoints.
Anyone?
Well, I am answering to myself.
If we use PhpED and DBG, we can use the magic function DebugBreak().
Make sure PhpED and the PHP DBG listener are running, write
DebugBreak('1#127.0.0.1');
anywhere in your web service's code, make a call from the client, and voila! You are in PhpED on that line in debugging mode!
You could set xdebug.remote_autostart to 1 to always debug (no request parameter needed). This could be limited to certain URLs with the <Location> or <Files> directive.
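With Xdebug 2 that would be something like the following in php.ini (Xdebug 3 renamed these settings to xdebug.mode and xdebug.start_with_request; the host IP is an example):

xdebug.remote_enable = 1
xdebug.remote_autostart = 1        ; start a debug session for every request, no GET/COOKIE trigger needed
xdebug.remote_host = 192.168.0.10  ; address of the machine running your IDE
xdebug.remote_port = 9000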
Or just log some debug information (using Zend_Log or PEAR Log if you want a generic library) using var_export.
A quick and dirty way is:
file_put_contents('/tmp/log1.txt',
var_export(array($_REQUEST, $something), true));
You could write data to a log file (meh).
Or output debugging information in response headers (if the client can view them). But as far as using breakpoints, you may be out of luck.
You could also look into connection hijacking on your local computer (something similar to the Firefox add-on Tamper Data), where you can interrupt the request and add the URL parameter.
Try SoapUI to issue requests manually and get the detailed responses. Not sure if you can fake the cookie, but you can control the endpoints, and therefore the URL to an extent.
I seem to remember that you can configure NuSphere's product to automatically attempt to connect to the debug listener with or without the DBGSESSID parameter (in query string or cookie). I'm not positive if that's the case, though. However, you can get the effect you're looking for by doing the following. It may be a little more manually intensive than you're hoping for.
Set up some sort of HTTP request/response listener.
Perform the desired access against the web service from the client.
Manually re-issue those requests, appending the appropriate DBGSESSID parameter.
For a little more initial setup, but lower friction debugging later:
Configure your client to access an alternate URL.
Set up a proxy to listen on that URL (for debugging, I've seen Privoxy recommended, though I have no experience with it personally).
Configure the proxy to forward all requests to the real web service, appending an appropriate DBGSESSID parameter or including the cookie.
I use the Poster plugin to help debug my PHP web service.
Edit:
Found a better tool for debugging web services: the Advanced REST Client application.
It's a Chrome plugin and works great for testing all kinds of web services that use REST.