I'm developing a gateway script that needs to send info to another provider's server, and I need to debug the code.
Is there a way, on my own Linux + Apache + PHP server to capture the CURL / XML data from this script?
I know with PHP, that I could see for example the $_POST, $_GET or $_REQUEST data in a script, but with CURL I don't actually get to the http://intranet/capture.php script in my browser - so this doesn't work.
Is there any other way, with a script on the server to capture everything that's passed to the server, and dump it to a database / flat file?
I even tried monitoring /var/log/httpd/access_log on the Linux server, but it didn't reveal much.
So, how can I see what the CURL script does, exactly, as the server sees it?
What you can try is this:
echo htmlentities(file_get_contents('http://intranet/capture.php'));
I'm not sure if this is what you mean, but it does the same as cURL (sort of).
You want to see the output of cURL:
$ch = curl_init();                           // initialize curl handle
curl_setopt($ch, CURLOPT_URL, $url);         // set url to post to ($url must be defined first)
curl_setopt($ch, CURLOPT_FAILONERROR, 1);    // fail on HTTP errors (status >= 400)
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, 1); // allow redirects
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1); // return into a variable
curl_setopt($ch, CURLOPT_TIMEOUT, 3);        // times out after 3s
curl_setopt($ch, CURLOPT_POST, 1);           // set POST method
curl_setopt($ch, CURLOPT_POSTFIELDS, $postData); // POST body ($postData is assumed to be defined)
$result = curl_exec($ch);                    // run the whole process
curl_close($ch);
echo htmlentities($result);
I hope this is what you mean.
In this case, you are the client and the provider's server is the server.
Assuming you are running the curl command from the client, all you can get to is what Robert Cabri said.
If you are attempting to look at what's being received by the server, you need appropriate access, and you also need to know what application stack the server is running to serve your request.
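If you do control the intranet server, a minimal capture script is usually enough. A sketch, with an illustrative log path; getallheaders() assumes Apache's PHP module:
<?php
// capture.php -- dump whatever the gateway script sends to a flat file.
$logFile = '/tmp/capture.log'; // illustrative; must be writable by the web server user

$entry  = "==== " . date('c') . " ====\n";
$entry .= $_SERVER['REQUEST_METHOD'] . ' ' . $_SERVER['REQUEST_URI'] . "\n";

// request headers
foreach (getallheaders() as $name => $value) {
    $entry .= "$name: $value\n";
}

// raw request body -- this is where POSTed XML shows up
$entry .= "\n" . file_get_contents('php://input') . "\n\n";

file_put_contents($logFile, $entry, FILE_APPEND | LOCK_EX);
echo "OK"; // so the calling script gets a response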
I've tried to retrieve the image data of my Facebook profile picture, using both file_get_contents and cURL.
The problem occurs on my Google Compute Engine instance, while on any other server (localhost with MAMP, AWS) the script works fine.
An example of one of the scripts I was using:
<?php
var_dump(
    json_decode(
        file_get_contents("https://graph.facebook.com/_FACEBOOK_ID_/picture?width=720&height=720")
    )
);
Please keep in mind that I've tried using the parameter redirect=false, and accessing the image URL I got in my JSON response returned false as well.
Also, I've tried using wget over SSH to fetch the image's URL, which returned (403) Forbidden.
My assumption is that I need to configure something differently on my server (not in PHP), but since I'm able to retrieve any other image with the same script, I'm not sure what.
I've already experienced this exact problem; ignoring SSL verification while using cURL did the trick for me.
$ch = curl_init();
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
curl_setopt($ch, CURLOPT_SSL_VERIFYPEER, false); // Ignore SSL verification
curl_setopt($ch, CURLOPT_URL, "https://graph.facebook.com/_FACEBOOK_ID_/picture?width=720&height=720");
$data = curl_exec($ch);
curl_close($ch);
var_dump($data);
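That said, disabling verification is a debugging workaround rather than a fix. A safer hedged alternative is to keep verification on and point cURL at a CA bundle explicitly (the bundle path below is illustrative):
// Keep SSL verification enabled and supply a CA bundle explicitly.
// The path is illustrative; use a bundle that exists on your system,
// e.g. one downloaded from https://curl.se/docs/caextract.html.
curl_setopt($ch, CURLOPT_SSL_VERIFYPEER, true);
curl_setopt($ch, CURLOPT_CAINFO, '/etc/ssl/certs/cacert.pem');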
So I am encountering a strange situation with a PHP cURL request.
It works perfectly when the script is called from the same local network as the server running it, but when somebody calls it from outside it no longer works.
This is my code:
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, ROOT_DIR."directory/someScript.php");
curl_setopt($ch, CURLOPT_HEADER, 0);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_TIMEOUT, 1);
$output = curl_exec($ch);
curl_close($ch);
When I run someScript.php manually, even from outside the LAN, it works just fine. The issue occurs only when I try to run it by calling it with cURL from another PHP script.
Does anybody have any idea? The cURL library is enabled, and both scripts are on the same server.
I am using port forwarding on port 80 to make the server visible on the Internet.
I decided to post the solution in case somebody else has the same problem.
In the end I found the cause. The problem was that I was calling the script using the external IP instead of the internal one. Furthermore, because of this, XAMPP redirected the call to its default page, and because I didn't use the output I did not notice that.
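A hedged sketch of the kind of check that would have surfaced this earlier (it assumes an open handle $ch, as in the snippet above):
// Inspect the result instead of discarding it, so a silent redirect
// (like XAMPP's default page) or an outright failure becomes visible.
$output = curl_exec($ch);
if ($output === false) {
    error_log('cURL error: ' . curl_error($ch));
} else {
    $code = curl_getinfo($ch, CURLINFO_HTTP_CODE);
    error_log("HTTP $code, first 200 bytes: " . substr($output, 0, 200));
}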
Hello
I am working with a legacy system where an ASP.NET application posts an XML file to a server via curl.exe (the URL it posts to is configurable in a .config file).
Now, due to legacy-system limitations, I need curl to post this XML to my Ubuntu server by changing the said .config file, modify the received XML as I need, and finally curl-post it to the real server.
How can this be done? My guess is a PHP or Python script running under an Apache2 server, listening for POSTs. Once the XML file is received, it makes the required modifications and posts it to the real server.
Via PHP or Python, how can this be done?
Since the ASP.NET application is posting XML, you simply need to handle a normal POST request, modify the XML to match your requirements, and post it with cURL to the real server. In PHP, it would look something like this (more or less meta code; error checking and additional logic are needed):
$xml = $_POST['xml'];
// do something with posted XML
.....
// post it to the "real" cURL server
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, $url);
curl_setopt($ch, CURLOPT_POST, 1);
curl_setopt($ch, CURLOPT_POSTFIELDS, array('xml' => $xml));
$result = curl_exec($ch);
curl_close($ch);
That's about it. Check the cURL documentation and use what is necessary for the POST to work with your server, and you are all good.
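One caveat worth hedging: if curl.exe sends the XML as a raw request body (for example with --data @file.xml) rather than as an xml form field, $_POST['xml'] will be empty and the body has to be read from php://input instead. A sketch under that assumption; the target URL is a placeholder:
<?php
// relay.php -- hedged sketch: accept XML posted as a raw body,
// adjust it, and forward it to the real endpoint.
$realServerUrl = 'https://example.com/receive'; // placeholder; use the configured target

$xml = file_get_contents('php://input'); // raw POST body
// ... modify $xml as required ...

$ch = curl_init($realServerUrl);
curl_setopt($ch, CURLOPT_POST, 1);
curl_setopt($ch, CURLOPT_POSTFIELDS, $xml); // forward the raw XML
curl_setopt($ch, CURLOPT_HTTPHEADER, array('Content-Type: text/xml'));
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
$result = curl_exec($ch);
curl_close($ch);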
We've written a script that pulls data from an external server. If the server goes down we don't want our server waiting for the data since we process a lot of data and we don't want it bogged down. To address this, we're trying to timeout our curl calls if they take more than a couple hundred milliseconds.
I found some documentation saying that CURLOPT_TIMEOUT_MS and CURLOPT_CONNECTTIMEOUT_MS should be available in my versions of PHP and libcurl, but the request does not seem to time out, even if I set the timeout to 1 ms.
$url = "http://www.cnn.com;
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, $url);
curl_setopt($ch, CURLOPT_HEADER,0); //Change this to a 1 to return headers
curl_setopt($ch, CURLOPT_USERAGENT, $_SERVER['HTTP_USER_AGENT']);
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, 1);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
curl_setopt($ch, CURLOPT_TIMEOUT_MS, 1);
curl_setopt($ch, CURLOPT_CONNECTTIMEOUT_MS, 1);
$data = curl_exec($ch);
curl_close($ch);
Does anyone know what we're doing wrong or another way to do this?
I saw this in "unresponsive dns server and curl multi timeouts not working":

"...We have had some times where a site that we pull information from has had its dns server become unresponsive. When this happens the timeouts set in curl (php bindings) do not work as expected. It times out after 1 min 14 sec with "Could not resolve host: www.yahoo.com (Domain name not found)". To make this happen in a test env we modify /etc/resolv.conf to have a nameserver that does not exist (nameserver 1.1.1.1). No matter what they are set at (CURLOPT_CONNECTTIMEOUT, CURLOPT_CONNECTTIMEOUT_MS, CURLOPT_TIMEOUT, CURLOPT_TIMEOUT_MS) they don't time out when we can't get to the DNS server. I use curl_multi because we have multiple sources that we pull info from at the same time. The example below makes one call for simplicity. And as a side note, curl_errno does not return an error code even though there was an error. Not sure why..."
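For what it's worth, a commonly cited workaround for sub-second timeouts being ignored on Unix-like builds is CURLOPT_NOSIGNAL; whether it helps depends on how libcurl was built (a c-ares-enabled build handles DNS timeouts properly). A hedged sketch, with millisecond values matching the question's goal:
// On many Unix builds of libcurl the default resolver uses SIGALRM with
// whole-second granularity, so sub-second timeouts are ignored unless
// CURLOPT_NOSIGNAL is set.
curl_setopt($ch, CURLOPT_NOSIGNAL, 1);
curl_setopt($ch, CURLOPT_CONNECTTIMEOUT_MS, 200);
curl_setopt($ch, CURLOPT_TIMEOUT_MS, 400);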
I am doing an HTTP POST using cURL:
$url = "http://localhost:8080/~demo/cgi-bin/execute/100";
//open connection
$ch = curl_init();
//set the url, number of POST vars, POST data
curl_setopt($ch,CURLOPT_URL,$url);
curl_setopt($ch,CURLOPT_POST,count($data));
curl_setopt($ch,CURLOPT_POSTFIELDS,$data);
//execute post
$result = curl_exec($ch);
echo("$result");
//close connection
curl_close($ch);
The post gets executed, but the response is shown with the error:
The requested URL /~demo/100 was not found on this server.

The above URL obviously does not exist on the server, because (somehow) cURL has changed the URL. It should have been /~demo/cgi-bin/execute/100; that URL works in the browser.
Please tell me why it does this, and how I can stop it so the request goes to the URL I want.
Install Fiddler.
Enable debugging.
Visit the site in the browser.
Execute php cURL code.
Fiddler will tell you exactly what the web server is receiving and sending. Since you are running locally, you can see exactly what PHP is sending as well. Compare the two, and that will tell you the problem.
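If installing Fiddler is not an option, cURL itself can log the conversation; a hedged sketch (it assumes an existing handle $ch, and the log path is illustrative):
// Ask cURL to write its request/response trace to a file.
$trace = fopen('/tmp/curl_trace.log', 'a'); // illustrative path
curl_setopt($ch, CURLOPT_VERBOSE, true);    // emit the full conversation
curl_setopt($ch, CURLOPT_STDERR, $trace);   // ...into this stream instead of STDERR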
Maybe cURL tries to access the default HTTP port 80? Try using
curl_setopt($ch, CURLOPT_PORT, 8080);
It may not be cURL that is changing the URL; rather, the web server may be sending a redirect header to cURL, pointing at a different location. Perhaps the following would help:
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, 0);
Where is curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);?
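For completeness, a hedged sketch that inspects the redirect rather than following it (it assumes an existing handle $ch, and it does include the CURLOPT_RETURNTRANSFER line the comment above asks about):
// Don't follow redirects; capture the status line, headers and body so
// the server's Location header (if any) becomes visible.
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, 0);
curl_setopt($ch, CURLOPT_HEADER, 1);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
$raw  = curl_exec($ch);
$code = curl_getinfo($ch, CURLINFO_HTTP_CODE); // 301/302 means the server redirected
echo "HTTP $code\n" . htmlentities($raw);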