How do I inspect cURL requests?
My PHP scripts are hosted on IIS and I'm looking for a debugging tool for cURL.
Could you suggest something in fiddler-style?
(Or maybe there is a way to use Fiddler itself; I failed to do so because when I make cURL tunnel through the proxy 127.0.0.1, it issues CONNECT requests instead of GET.)
Wireshark only works for plain HTTP, not HTTPS.
Can you change your cURL script to use plain HTTP?
Use curl -v for verbose mode.
From man curl
-v/--verbose
Makes the fetching more verbose/talkative. Mostly useful for debugging. A line starting
with '>' means "header data" sent by curl, '<' means "header data" received by curl that
is hidden in normal cases, and a line starting with '*' means additional info provided by
curl.
Note that if you only want HTTP headers in the output, -i/--include might be the option
you're looking for.
If you think this option still doesn't give you enough details, consider using --trace or
--trace-ascii instead.
This option overrides previous uses of --trace-ascii or --trace.
Related
I want to do some m2m-communication, where my server is to emulate a human operating a webpage.
So I'm trying to SEND an XMLHttpRequest from PHP TO another server.
Everything I've searched for only explains how PHP can ACCEPT an XMLHttpRequest.
I have debugged the browser, and Chrome webdeveloper tools have given me a cURL cmd which works.
The curl cmd ends in
--data-binary '[{"productNumber":"12345678","quantity":1}]'
I'm using snoopy to send the requests, and have emulated every cookie and header, but the server still responds with 400 Invalid Request.
I think the problem lies in that snoopy usually is used like this:
$submit_vars['email'] = "johndoe@example.com";
$submit_vars['password'] = 'secret';
$snoopy->submit($submit_url, $submit_vars);
i.e. Snoopy expects an array of form variables, not a string.
Is there a way to make snoopy send the equivalent of curl --data-binary ?
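For comparison, in plain PHP cURL the equivalent of the command line's --data-binary is passing a raw string (rather than an array) to CURLOPT_POSTFIELDS. A minimal sketch; the endpoint URL below is a placeholder, not the real server:

```php
<?php
// Equivalent of: curl --data-binary '[{"productNumber":"12345678","quantity":1}]' <url>
$payload = json_encode(array(array('productNumber' => '12345678', 'quantity' => 1)));

$ch = curl_init('https://example.com/cart/add');  // placeholder URL
curl_setopt($ch, CURLOPT_POST, true);
// Passing a STRING sends it verbatim as the request body;
// passing an array would instead produce multipart/form-data.
curl_setopt($ch, CURLOPT_POSTFIELDS, $payload);
curl_setopt($ch, CURLOPT_HTTPHEADER, array('Content-Type: application/json'));
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
// $response = curl_exec($ch);  // uncomment to actually send the request
```

Whether Snoopy itself supports raw bodies depends on its version; falling back to the cURL extension as above sidesteps the question entirely.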
Inside a Slave site I have a script that performs a cURL request against a server of mine (the Master).
Locally, I have installed both sites and I'd like to debug what happens on the Master when the Slave connects to it.
Ideally, the perfect solution would be to attach my own request to PHPStorm debugger, so I can actually see what's going on.
I tried to start debugging, but PHPStorm attaches to the calling script, not to the receiving site.
Do you have any suggestions on how can I actually debug it, without the need to rely on the good old var_dump();die();?
Well, at the end of the day, PHPStorm relies on a cookie to attach to the incoming request.
By default, such cookie has the following value: XDEBUG_SESSION=PHPSTORM.
This means that you simply have to add the following line to your code:
curl_setopt($ch, CURLOPT_HTTPHEADER, array("Cookie: XDEBUG_SESSION=PHPSTORM"));
and PHPStorm will "see" the incoming request, allowing you to debug it.
Additional Tip
The previous trick works everywhere!
If you are trying to debug a cURL request from command line, once again you simply have to pass the cookie param and PHPStorm will attach to the request:
--cookie "XDEBUG_SESSION=PHPSTORM"
How can I check in my PHP script that it was executed from the crontab via the wget utility, and not from a web browser by some user?
There is no reliable solution. Anything wget can do, your browser can do too.
Your best shot is something like pointing wget at http://example.com/myscript.php?accesskey=some+really+obscure+password+that+only+you+should+know and checking for that access key in your script. Of course, if anyone gets the password, this kind of protection is useless, but it's a far more consistent point of failure than blindly relying on User-Agent sniffing.
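A sketch of that check; the key is obviously a placeholder, and hash_equals() (PHP 5.6+) is used so the comparison doesn't leak information through timing:

```php
<?php
// Reject the request unless the caller supplied the shared secret.
define('CRON_ACCESS_KEY', 'some-really-obscure-password');  // placeholder -- use a long random value

function is_authorized_cron_call(array $get)
{
    $supplied = isset($get['accesskey']) ? (string) $get['accesskey'] : '';
    // Constant-time comparison; a plain === would also work but can leak timing.
    return hash_equals(CRON_ACCESS_KEY, $supplied);
}

if (PHP_SAPI !== 'cli' && !is_authorized_cron_call($_GET)) {
    // header('HTTP/1.0 403 Forbidden'); exit;  // uncomment in the real script
}
```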
A possibility would be to use $argv. Check if $argv[1] is a certain value and call the script from crontab with the value as argument, like so: script.php argument1.
Your question is a bit difficult to understand, but I guess you want to make sure a PHP script is requested by wget (which gets initiated by cron).
Although it might be more efficient to have cron call the PHP script directly, in this case you could check the server's logs and search for a user agent matching something like wget.
An insecure solution would be to check the headers for the User-Agent:
wget -d http://www.google.com/
---request begin---
GET / HTTP/1.0
User-Agent: Wget/1.12 (linux-gnu)
Accept: */*
Host: www.google.com
Connection: Keep-Alive
---request end---
So you could do:
<?php
$userAgent = isset($_SERVER['HTTP_USER_AGENT']) ? $_SERVER['HTTP_USER_AGENT'] : '';
if (strstr($userAgent, 'Wget')) {
    // request was (probably) made by wget -- but the User-Agent is trivially spoofed
}
You can pass some arguments in crontab for your script http://php.net/manual/reserved.variables.argv.php
Then checking for these args you'll know if your script is used from command line or web.
EDIT :
Seeing answers, let's make this clear.
Calls made with wget, cURL, or any other HTTP GET request WON'T PASS ARGS!
ARGS are only passed on a local call (like: php script.php arg1 arg2).
Please don't assert otherwise if you haven't tried it; test it yourself on your server if you aren't sure.
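The distinction above can be sketched like this (the argument name "runcron" is made up for illustration):

```php
<?php
// Distinguish a local cron invocation (php script.php runcron)
// from a web request (wget/cURL/browser), where $argv never arrives.
function invoked_from_cli()
{
    return PHP_SAPI === 'cli';
}

if (invoked_from_cli()) {
    $mode = isset($argv[1]) ? $argv[1] : null;  // e.g. "runcron"
} else {
    $mode = null;  // HTTP request: command-line args do not exist here
}
```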
I use chromium --incognito www.mysite.com/page.php?msg=mymessage to open my website and pass it a msg.
I'd like to know how to pass the same msg parameter via POST instead of GET, from the command line.
Do you do anything with the site in Chromium after opening it? Otherwise you could use a more capable command line http client like curl(1) which would make this very easy.
See this example:
curl --data "param1=value1&param2=value2" http://example.com/resource.cgi
From the console? I don't know, but you can try this extension: Advanced REST Client.
The web developers helper program to create and test custom HTTP requests.
Here is the link : https://chrome.google.com/webstore/detail/advanced-rest-client/hgmloofddffdnphfgcellkdfbfbjeloo
I am trying to track down an issue with a cURL call in PHP. It works fine in our test environment, but not in our production environment. When I try to execute the cURL function, it just hangs and never ever responds. I have tried making a cURL connection from the command line and the same thing happens.
I'm wondering if cURL logs what is happening somewhere, because I can't figure out what is happening during the time the command is churning and churning. Does anyone know if there is a log that tracks what is happening there?
I think it is connectivity issues, but our IT guy insists I should be able to access it without a problem. Any ideas? I'm running CentOS and PHP 5.1.
Updates: Using verbose mode, I've gotten an error 28 "Connect() Timed Out". I tried extending the timeout to 100 seconds, and limiting the max-redirs to 5, no change. I tried pinging the box, and also got a timeout. So I'm going to present this back to IT and see if they will look at it again. Thanks for all the help, hopefully I'll be back in a half-hour with news that it was their problem.
Update 2: Turns out my box was resolving the server name with the external IP address. When IT gave me the internal IP address and I replaced it in the cURL call, everything worked great. Thanks for all the help everybody.
In your php, you can set the CURLOPT_VERBOSE variable:
curl_setopt($curl, CURLOPT_VERBOSE, TRUE);
This then logs to STDERR, or to the file specified using CURLOPT_STDERR (which takes a file pointer):
curl_setopt($curl, CURLOPT_STDERR, $fp);
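Combined, a sketch that captures the verbose transfer log to a file (the URL and log path are placeholders):

```php
<?php
// Write cURL's verbose transfer log to a file instead of STDERR.
$logPath = sys_get_temp_dir() . '/curl_debug.log';
$fp = fopen($logPath, 'w');

$ch = curl_init('http://example.com/');  // placeholder URL
curl_setopt($ch, CURLOPT_VERBOSE, true);
curl_setopt($ch, CURLOPT_STDERR, $fp);   // verbose output goes here
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
// curl_exec($ch);   // run the transfer...
// fclose($fp);      // ...then flush and inspect the log
// echo file_get_contents($logPath);
```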
From the command line, you can use the following switches:
--verbose to report more info to the command line
--trace <file> or --trace-ascii <file> to trace to a file
You can use --trace-time to prepend time stamps to verbose/file outputs
You can also use curl_getinfo() to get information about your specific transfer.
http://in.php.net/manual/en/function.curl-getinfo.php
Have you tried setting CURLOPT_MAXREDIRS? I've found that sometimes there will be an 'infinite' redirect loop for some websites that a normal browser user doesn't see.
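To rule that out, a sketch capping redirects (the URL is a placeholder and the limits are arbitrary):

```php
<?php
// Follow redirects, but bail out after a handful instead of looping forever.
$ch = curl_init('http://example.com/');  // placeholder URL
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
curl_setopt($ch, CURLOPT_MAXREDIRS, 5);        // abort after 5 redirects
curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 10);  // and don't hang forever on connect
// $response = curl_exec($ch);
// if ($response === false) { echo curl_error($ch); }  // reports redirect/timeout errors
```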
If at all possible, try sudo-ing as the user PHP runs under (possibly the user Apache runs under).
The curl problem could have various reasons that require a user input, for example an untrusted certificate that is stored in the trusted certificates cache of the root user, but not the PHP one. In that case, the command would be waiting for an input that never happens.
Update: This only applies if you run curl externally using exec(), so it may not apply to your case.