This is a fairly simple example of using cURL in a PHP script to compare the results from two servers. All of a sudden my code broke and I have no idea what happened. Here is the scenario: we have two (or so I thought) identical servers set up with WHM/cPanel and two identical sets of cloned git repositories. One is our staging server, the other our production box.
My problem is that one server (staging) returns the expected results for the simple scripts below, while our production box just returns nulls. I checked the configuration on both servers with phpinfo(), and cURL is installed correctly on both.
My question is: has anyone out there had this problem before? I would really like to figure this out, since it will probably fix the application we so desperately need.
Thanks in advance for any responses. Note that the code below is only meant to show cURL working, not to validate responses or errors that may have occurred; however, we will display any that are present.
We tested the same code on two servers: stagingpinnaclemedplus.com works, pmpcustomer.com returns null values.
<?php
// pageCurl.php: echoes the POSTed fields back as JSON
$data['key']        = $_POST['key'];
$data['pdf']        = $_POST['pdf'];
$data['session_id'] = $_POST['session_id'];
echo json_encode($data);
<?php
// pagetestcurl.php
session_start();

$url = 'http://stagingpinnaclemedplus.com/pageCurl.php';
$postData['key']        = 'LABEL_PATH';
$postData['pdf']        = 'off'; // signals the PHP page to create a PDF file rather than browser output
$postData['print_mode'] = 'c';
$postData['session_id'] = session_id();

$ch = curl_init();
curl_setopt_array($ch, array(
    CURLOPT_URL            => $url,
    CURLOPT_POST           => true,
    CURLOPT_RETURNTRANSFER => 1,
    CURLOPT_POSTFIELDS     => $postData,
    CURLOPT_FOLLOWLOCATION => true
));
$result = curl_exec($ch);
$error  = curl_error($ch);
curl_close($ch);

echo "<br>";
// Note: print_r() without a second argument prints $result directly and
// returns true, which is why "Result:1" appears in the output below.
echo "Result:" . print_r($result) . '<br>';
if ($error)
    echo "Error:" . var_dump($error);
When I run the code on the staging server, I get what I expect:
{"key":"LABEL_PATH","pdf":"off","session_id":"r2jkkmbhd73maj9e8o72mdvqq3"} Result:1\
When I run it on my production box (note the URL host name in pagetestcurl.php is changed to pmpcustomer.com), I get this result:
[Result: {"key":null,"pdf":null,"session_id":null}
string(0) "" Error:]
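For reference, here is a fuller version of the same test with explicit error checks (a sketch along the lines of the code above), in case it helps pin down the difference between the two boxes:

$ch = curl_init();
curl_setopt_array($ch, array(
    CURLOPT_URL            => $url,
    CURLOPT_POST           => true,
    CURLOPT_RETURNTRANSFER => true,
    CURLOPT_POSTFIELDS     => $postData,
    CURLOPT_FOLLOWLOCATION => true
));
$result = curl_exec($ch);
if ($result === false) {
    // Transport-level failure: DNS, connect, timeout, etc.
    echo 'cURL error (' . curl_errno($ch) . '): ' . curl_error($ch);
} else {
    echo 'HTTP ' . curl_getinfo($ch, CURLINFO_HTTP_CODE)
        . ' from ' . curl_getinfo($ch, CURLINFO_EFFECTIVE_URL)
        . '<br>' . htmlspecialchars($result);
}
curl_close($ch);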
We've gotten permission to periodically copy a webcam image from another site. We use cURL functions elsewhere in our code, but when trying to access this image, we are unable to retrieve it.
I'm not sure what is going on. The code we use for many other cURL functions is like so:
$image = 'http://island-alpaca.selfip.com:10202/SnapShotJPEG?Resolution=640x480&Quality=Standard'

$options = array(
    CURLOPT_URL            => $image,
    CURLOPT_RETURNTRANSFER => true,
    CURLOPT_FOLLOWLOCATION => true,
    CURLOPT_CONNECTTIMEOUT => 120,
    CURLOPT_TIMEOUT        => 120,
    CURLOPT_MAXREDIRS      => 10
);

$ch = curl_init();
curl_setopt_array($ch, $options);
$cURL_source = curl_exec($ch);
curl_close($ch);
This code doesn't work for the following URL (webcam image), which is accessible in a browser from our location: http://island-alpaca.selfip.com:10202/SnapShotJPEG?Resolution=640x480&Quality=Standard
When I run a test cURL, it just seems to hang for the length of the timeout. $cURL_source never has any data.
I've tried some other cURL examples online, but to no avail. I'm assuming there's a way to build the cURL request to get this to work, but nothing I've tried seems to get me anywhere.
Any help would be greatly appreciated.
Thanks
I don't see any problems with your code. Errors like this can sometimes be caused by transient network problems, so you can retry the request in a loop to increase the chances of success.
Something like:
$image = 'http://island-alpaca.selfip.com:10202/SnapShotJPEG?Resolution=640x480&Quality=Standard';
$tries = 3;       // max tries to get a good response
$retry_after = 5; // seconds to wait before a new try

$options = array(
    CURLOPT_URL            => $image,
    CURLOPT_RETURNTRANSFER => true,
    CURLOPT_FOLLOWLOCATION => true,
    CURLOPT_CONNECTTIMEOUT => 10,
    CURLOPT_TIMEOUT        => 10,
    CURLOPT_MAXREDIRS      => 10
);

while ($tries > 0) {
    $ch = curl_init();
    curl_setopt_array($ch, $options);
    $cURL_source = curl_exec($ch);
    curl_close($ch);

    if ($cURL_source !== false) {
        break;
    } else {
        $tries--;
        sleep($retry_after);
    }
}
Can you fetch the URL from the server where this code is running? Perhaps it has firewall rules in place. You are fetching from a non-standard port, 10202, which must be allowed by your firewall.
I, like the others, found it easy to fetch the image with curl/php.
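If the port is the issue, a quick reachability check from the server itself might look like this (a sketch; host and port are taken from the URL in the question):

// Try opening a raw TCP connection to the camera's host and port.
$fp = @fsockopen('island-alpaca.selfip.com', 10202, $errno, $errstr, 10);
if ($fp) {
    echo 'Port 10202 is reachable';
    fclose($fp);
} else {
    echo "Blocked or unreachable: $errstr ($errno)";
}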
As was said before, I can't see any problem with the code either. However, you might consider setting a longer timeout for cURL, to be sure that this slow-loading picture finally gets loaded. So, as a possibility, try increasing CURLOPT_TIMEOUT to a much larger number, along with the corresponding timeout for PHP script execution. It may help.
Perhaps the best option is to combine the previous author's retry loop with longer timeouts, as sketched below.
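For example (a sketch; the exact values are guesses and would need tuning):

set_time_limit(300);                    // let the PHP script itself run long enough
$options[CURLOPT_CONNECTTIMEOUT] = 60;  // generous connect window
$options[CURLOPT_TIMEOUT]        = 240; // long overall transfer timeout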
I tried wget on the image URL and it downloads the image and then seems to hang - perhaps the server isn't correctly closing the connection.
However I got file_get_contents to work rather than curl, if that helps:
<?php
$image = 'http://island-alpaca.selfip.com:10202/SnapShotJPEG?Resolution=640x480&Quality=Standard';
$imageData = base64_encode(file_get_contents($image));
$src = 'data: '.mime_content_type($image).';base64,'.$imageData;
echo '<img src="',$src,'">';
Are you sure it's not working? Your code is working fine for me (after adding the missing semicolon after $image = ...).
The reason it might be giving you trouble is that it's not actually an image, it's an MJPEG stream. It uses an HTTP session that's kept open, with multipart content (similar to what you see in MIME email), and the server pushes a new JPEG frame to replace the last one at an interval. cURL seems to be happy just giving you the first frame, though.
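If a single still is all you need, one approach (a sketch, assuming the stream carries plain JPEG frames; fetch_first_frame is a hypothetical helper, not part of the camera's API) is to abort the transfer once the first complete frame has arrived:

function fetch_first_frame($url)
{
    $buffer = '';
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 10);
    curl_setopt($ch, CURLOPT_WRITEFUNCTION, function ($ch, $chunk) use (&$buffer) {
        $buffer .= $chunk;
        // Stop once a complete JPEG (SOI ... EOI marker) has arrived.
        if (strpos($buffer, "\xFF\xD8") !== false && strpos($buffer, "\xFF\xD9") !== false) {
            return 0; // returning less than strlen($chunk) aborts the transfer
        }
        return strlen($chunk);
    });
    curl_exec($ch); // reports an abort error when we cut the stream off; that is expected
    curl_close($ch);

    // Trim any multipart headers around the JPEG itself.
    $start = strpos($buffer, "\xFF\xD8");
    $end   = strpos($buffer, "\xFF\xD9");
    return ($start !== false && $end !== false)
        ? substr($buffer, $start, $end - $start + 2)
        : false;
}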
I have a cURL PUT request that works fine on my localhost, but on the live server it throws back a 500 error. Here is my code:
public static function send( $xml )
{
    $xml = str_replace( "\n", "", $xml );

    //Write to temporary file
    $put_data = tmpfile();
    fwrite( $put_data, $xml );
    fseek( $put_data, 0 );

    $options = array(
        CURLOPT_URL            => 'http://*****************/cgi-bin/commctrl.pl?SessionId=' . Xml_helper::generate_session_id() . '&SystemId=live',
        CURLOPT_RETURNTRANSFER => 1,
        CURLOPT_HTTPHEADER     => array( 'Content-type: text/xml' ),
        CURLOPT_PUT            => TRUE,
        CURLOPT_INFILE         => $put_data,
        CURLOPT_INFILESIZE     => strlen( $xml )
    );

    $curl = curl_init();
    curl_setopt_array( $curl, $options );
    $result = curl_exec( $curl );
    curl_close( $curl );

    return $result;
}
I do have curl enabled on the server!
Does anyone have any ideas why it is not working on the server? I am on shared hosting if that helps.
I have also enabled error reporting at the top of the file, but no errors show after the cURL call completes. I just get the generic 500 error page.
Thanks
UPDATE:
I have been in contact with the client and they have confirmed that the information that is sent is received by them and inserted into their back-office system. So it must be something to do with the response. It is a small block of XML that is supposed to be returned.
ANOTHER UPDATE
I have tried the same script on a different server and heroku and I still get the same result.
ANOTHER UPDATE
I think I may have found the root of the issue. The script seems to be timing out because of a timeout on FastCGI, and because I am on shared hosting I cannot change it. Can anyone confirm this?
FINAL UPDATE
I got in contact with my hosting provider and they confirmed that the script was timing out due to a timeout value on the server, not one I can change with any PHP function or ini_set().
If the error is, as you think, down to a script timeout and you do not have access to the php.ini file, there is an easy fix: simply call set_time_limit(INT), where INT is the number of seconds, at the beginning of your script to override the settings in the php.ini file.
Setting set_time_limit(128) should solve your problems and is generally accepted as a reasonable upper limit.
More info can be found here http://php.net/manual/en/function.set-time-limit.php
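For example (a sketch):

<?php
// Override max_execution_time for this script only. Note this does not
// change web-server or FastCGI timeouts, which are enforced separately.
set_time_limit(128);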
Here are a few things to try:
Remove the variability in the script: for testing, hardcode the session id, so that the cURL call is the same each time. You cannot reliably test something if it changes each time you run it.
Try using curl directly from the command line, via something like curl 'http://*****************/cgi-bin/commctrl.pl?SessionId=12345&SystemId=live' (quote the URL so the shell does not swallow the &). This will show you whether the problem is really due to the computer itself, or something to do with PHP.
Check the logs on your server, probably something like /var/log/apache/error.log depending on what OS your server uses. Also look at the access log, so that you can see whether you are actually receiving the same request.
Finally, if you really run out of ideas, you can use a program like Wireshark or tcpdump/WinDump to monitor the connection, so that you can compare the packets being sent from each computer. This will give you an idea of how they differ: are they being mangled by a firewall? Is PHP adding extra headers to one of them? Are different cURL defaults causing different data to be included?
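In the same spirit, cURL's own verbose trace can be captured from PHP (a sketch; $curl is the handle from the code in the question):

// Log cURL's request/response trace to a temp stream for inspection.
$trace = fopen('php://temp', 'w+');
curl_setopt($curl, CURLOPT_VERBOSE, true);
curl_setopt($curl, CURLOPT_STDERR, $trace);
$result = curl_exec($curl);
rewind($trace);
echo '<pre>' . htmlspecialchars(stream_get_contents($trace)) . '</pre>';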
I suspect your server does not support tmpfile(). Just to verify:
public static function send( $xml ) {
    $xml = str_replace( "\n", "", $xml );

    //Write to temporary file
    $put_data = tmpfile();
    if (!$put_data) die('tmpfile failed');
    ...
If you are on a GoDaddy server, check this out: https://stackoverflow.com/questions/9957397/tmpfile-returns-false-on-godaddy-server
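If tmpfile() does turn out to be the problem, a memory-backed stream can stand in for it (a sketch):

// php://temp behaves like a seekable file handle but needs no temp directory,
// so it still works where tmpfile() is unavailable.
$put_data = fopen('php://temp', 'w+');
fwrite($put_data, $xml);
rewind($put_data);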
Which server is actually showing the 500? From your code it seems to be the local server rather than the remote one.
change
public static function send( $xml )
{
to
public static function send( $xml )
{
    error_reporting(E_ALL);
    if (!function_exists('curl_exec')) {
        var_dump("NO CURL");
        return false;
    }
Does that work?
This is almost certainly NOT the PHP timeout setting.
If you are using FastCGI, as you have stated, then you need to edit this file:
/etc/httpd/conf.d/fcgid.conf
And change:
FcgidIOTimeout 3600
Then do:
service httpd restart
This was driving me insane for three days. The top-voted answer here is wrong!
I am trying to run a script on a remote server and have the results of that script returned to the calling script. A variable is sent to the remote script, and based on that the remote script is meant to retrieve a list of filenames on the remote server and return those filenames as an array. However, using return in the included file does not return an actual value; it just aborts the script. Other than that, the remote script runs without a problem, and I can have it var_dump the list of filenames for me, but that doesn't do much good for the local script. Both servers are owned by us (us being my company).
I've tried something simple like this just to see if I could get a return value and it didn't work:
Local Script:
$test = include "http://remote_host_address/remote_script.php";
var_dump($test);
Remote Script:
$ret = "Hello World";
return $ret;
This outputs int(1). The remote script's code itself works perfectly (I've tested that), and the variable I send as a GET parameter also goes through without a problem. The only problem is that I am not getting a return value from the remote script.
Also, yes, allow_url_include is on for the local server. However, it is off for the remote server, but that should not make a difference: http://php.net/allow-url-include.
I have looked over some of the other related questions on this topic, and nothing seems to quite describe my problem. Any help would be greatly appreciated, as I have spent a few hours looking this over already and have not made any progress.
Try using file_get_contents() instead of include.
This reads the remote script's output into a variable (the script still executes remotely), but won't run the response locally.
Alternatively, if you have the ability to use cURL, it is safer and probably quicker.
A small snippet that acts as a remote file_get_contents():
function curl_get_contents($url) {
    $c = curl_init($url);
    curl_setopt($c, CURLOPT_RETURNTRANSFER, true);
    $res = curl_exec($c);
    if (curl_getinfo($c, CURLINFO_HTTP_CODE) > 400) $res = false;
    curl_close($c);
    return $res;
}
By way of explanation
RETURNTRANSFER puts the response of the curl request into a variable (instead of printing to screen).
CURLOPT_FOLLOWLOCATION has not been set, so if the page has moved, cURL will not follow it. You can set it, or have it set based on a second argument.
HTTP_CODE, if above 400 (an error code, presumably 404 or 500), will make the function return false instead of the fancy custom 404 page that might be set up.
In my testing, get_headers() is more reliable than CURLINFO_HTTP_CODE, but it requires a second call to the page being included, which can make things go awry.
e.g. if (strpos(get_headers($url)[0], '200') === false) return false;
The script on the other server (http://remote_host_address/remote_script.php) is probably being executed. Rename the file to .txt and then use include on that file.
If the script must be run remotely, then run it and have it return/echo PHP code. Example:
File: http://localhost/test.php
<?php
header('Content-Type: text/plain; charset=utf-8');
$array = array();
// your logic
$array['remote_server_time'] = time();
$array['sub'] = array(1, 2, 3);
// etc.
//output
echo '<?php return ' . var_export($array, true) . ';';
will output:
<?php return array (
'remote_server_time' => 1381401586,
'sub' =>
array (
0 => 1,
1 => 2,
2 => 3,
),
);
File: http://localhost/index.php
<?php
header('Content-Type: text/plain; charset=utf-8');
$array = include('http://localhost/test.php');
print_r($array);
will output:
Array
(
[remote_server_time] => 1381401701
[sub] => Array
(
[0] => 1
[1] => 2
[2] => 3
)
)
I have a PHP file, let's say A.php, that gets some variables by the $_POST method and updates a local database.
Another PHP file, named dataGather.php, gathers the data in the correct form and then tries to send the data to the local database by using the A.php file. Note that both files are in the same directory.
The code where I use the curl functions to do the POST request is the following:
$url = "A.php";
$ch = curl_init();
$curlConfig = array(
CURLOPT_URL => $url,
CURLOPT_POST => true,
CURLOPT_RETURNTRANSFER => true,
CURLOPT_POSTFIELDS => $datatopost
);
curl_setopt_array($ch, $curlConfig);
$result = curl_exec($ch);
curl_close($ch);
echo $result;
$datatopost is an array like the following:
$datatopost = array(
    "value1" => $val1,
    "value2" => $val2,
    // etc.
);
The problem is that when I run my program I get the following result:
Fatal error: Maximum execution time of 30 seconds exceeded in
C:\xampp\htdocs\dataGather.php on line 97
Does anyone know why this is happening? Thanks in advance.
PS: The file A.php is definitely correct; I have tested it by gathering the needed information with JavaScript, and it updates the database the way I want. Also, the array $datatopost has all the information in the correct form.
I suspect you run your PHP script directly, without using a web server, by simply starting the script as an executable. This is suggested by the absolute path in your error message. While it is absolutely fine to run a PHP script like that, you have to ask yourself: what does that cURL call actually do? It does not open and run the PHP file A.php you tried to reference. Why not? Because cURL opens URLs, not files. And without a server that can react to URL requests (like an HTTP server), what do you expect to happen?
The error you get is a timeout, since cURL tries to contact an HTTP server. Since you did not specify a valid URL, it most likely falls back to 'localhost', but there is no server listening there.
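A sketch of the likely fix, assuming A.php is served by a local web server (e.g. the XAMPP install that the error path points at):

// Use a full URL that the web server can route, not a bare filename.
$url = "http://localhost/A.php";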
So I've been finding a lot of posts here and other places on the interwebs regarding PHP, cURL and SSL. I've got a problem that I'm not seeing around.
Obviously, if I set SSL_VERIFYPEER/HOST to blindly accept I can get this to work, but I would like to use my cert to verify the connection.
So here is some code:
$options = array(
    CURLOPT_URL            => $oAuthResult['signed_url'],
    CURLOPT_RETURNTRANSFER => TRUE,
    CURLOPT_HEADER         => 0,
    CURLOPT_SSL_VERIFYPEER => TRUE,
    CURLOPT_SSL_VERIFYHOST => 2,
    CURLOPT_CAINFO         => getcwd() . '\application\third_party\certs\rootCerr.crt'
);

$ch = curl_init(); // assumed: the handle is created here (not shown in the original excerpt)
curl_setopt_array($ch, $options);
try {
    $result = curl_exec($ch);
    $errCode = curl_getinfo($ch, CURLINFO_HTTP_CODE);
    if ($errCode != 200) {
        throw new Exception('<strong>Error trying to ExecuteWebRequest, returned: ' . $errCode . '<br>URL:' . $url . '<br>POST data (if any):</strong><br>');
    }
    curl_close($ch);
} catch (Exception $e) {
    //print the error stuff
}
The error code that is returned is 0, which would mean that everything is A-OK, but since nothing comes back to the screen, I'm pretty sure it's not working.
Anyone?
The $errCode you extract is the HTTP code, which is 200-299 when things are OK. Getting 0 means it was never set, due to a problem or similar.
You should rather use curl_errno() after curl_exec() to figure out if things went fine or not. (You can't check the curl_exec() return code for errors as easily, as you have CURLOPT_RETURNTRANSFER enabled which makes that function instead return the contents of the transfer it is set to get. Of course, getting no contents at all returned should also be a good indicator that something failed.)
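A sketch of that check, using only standard cURL calls:

$result = curl_exec($ch);
if (curl_errno($ch) !== 0) {
    // Transport-level failure: DNS, TLS/certificate verification, timeout, etc.
    echo 'cURL error (' . curl_errno($ch) . '): ' . curl_error($ch);
} elseif (curl_getinfo($ch, CURLINFO_HTTP_CODE) >= 400) {
    echo 'HTTP error: ' . curl_getinfo($ch, CURLINFO_HTTP_CODE);
} else {
    echo $result; // the response body, since CURLOPT_RETURNTRANSFER is enabled
}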
I've implemented libcurl certs by using CURLOPT_CAINFO as you have indicated. However, providing the file name by itself wasn't good enough; it crashed on me too.
For me, the file was referenced by a relative path. Additionally, I had to make sure the cert was in Base64 format. Then everything went through without a hitch.