I have a cURL PUT request that works fine on my localhost, but on the live server it returns a 500 error. Here is my code:
public static function send( $xml )
{
$xml = str_replace( "\n", "", $xml );
//Write to temporary file
$put_data = tmpfile();
fwrite( $put_data, $xml );
fseek( $put_data, 0 );
$options = array(
CURLOPT_URL => 'http://*****************/cgi-bin/commctrl.pl?SessionId=' . Xml_helper::generate_session_id() . '&SystemId=live',
CURLOPT_RETURNTRANSFER => 1,
CURLOPT_HTTPHEADER => array( 'Content-type: text/xml' ),
CURLOPT_PUT => TRUE,
CURLOPT_INFILE => $put_data,
CURLOPT_INFILESIZE => strlen( $xml )
);
$curl = curl_init();
curl_setopt_array( $curl, $options );
$result = curl_exec( $curl );
curl_close( $curl );
return $result;
}
I do have curl enabled on the server!
Does anyone have any ideas why it is not working on the server? I am on shared hosting if that helps.
I also have enabled error reporting at the top of the file but no errors show after the curl has completed. I just get the generic 500 error page.
Thanks
UPDATE:
I have been in contact with the client and they have confirmed that the information sent is received and inserted into their back-office system. So it must be something to do with the response. It is a small block of XML that is supposed to be returned.
ANOTHER UPDATE
I have tried the same script on a different server and heroku and I still get the same result.
ANOTHER UPDATE
I think I may have found the root of the issue. The script seems to be timing out because of a timeout on FastCGI, and because I am on shared hosting I cannot change it. Can anyone confirm this?
FINAL UPDATE
I got in contact with my hosting provider and they confirmed that the script was timing out due to a timeout value on the server itself, not one I can change with any PHP function or ini_set().
If the error is, as you suspect, a script timeout and you do not have access to the php.ini file, there is an easy fix:
simply call set_time_limit(INT), where INT is the number of seconds, at the beginning of your script to override the setting in php.ini.
A limit of set_time_limit(128) should solve your problems and is generally accepted as a reasonable upper bound.
More info can be found here http://php.net/manual/en/function.set-time-limit.php
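As a minimal sketch, the override goes at the very top of the script, before the long-running work starts:

```php
<?php
// Raise this script's execution limit; overrides max_execution_time from php.ini.
// Note: this cannot help when the timeout is enforced outside PHP
// (e.g. a FastCGI or web-server timeout, as turned out to be the case above).
set_time_limit(128);

// ini_get() now reflects the new limit
echo ini_get('max_execution_time'); // "128"
```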
Here are a few things to try:
Remove the variability from the script: for testing, hardcode the session id so that the curl request is identical each time. You cannot reliably test something if it changes on every run.
Try using curl directly from the command line, via something like curl 'http://*****************/cgi-bin/commctrl.pl?SessionId=12345&SystemId=live' (note the quotes, since & is special to the shell). This will show you whether the problem is with the machine itself or with PHP.
Check the logs on your server, probably something like /var/log/apache/error.log depending on what OS your server uses. Also look at the access log, so that you can see whether you are actually receiving the same request.
Finally, if you really run out of ideas, you can use a program like Wireshark or tcpdump/WinDump to monitor the connection, so that you can compare the packets being sent from each computer. This will give you an idea of how they differ: are they being mangled by a firewall? Is PHP adding extra headers to one of them? Are different cURL defaults causing different data to be included?
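On the PHP side, cURL's verbose mode can show exactly what is being sent, which makes that comparison much easier. A sketch, assuming a placeholder URL in place of the masked one:

```php
<?php
$curl = curl_init('http://example.com/cgi-bin/commctrl.pl?SessionId=12345&SystemId=live');

// Route cURL's debug trace (request headers, connection details)
// into a memory stream instead of STDERR.
$log = fopen('php://temp', 'w+');
curl_setopt($curl, CURLOPT_VERBOSE, true);
curl_setopt($curl, CURLOPT_STDERR, $log);
curl_setopt($curl, CURLOPT_RETURNTRANSFER, true);

curl_exec($curl);
curl_close($curl);

rewind($log);
echo stream_get_contents($log); // raw request/response trace for comparison
```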
I suspect your server does not support tmpfile(). Just to verify:
public static function send( $xml ) {
$xml = str_replace( "\n", "", $xml );
//Write to temporary file
$put_data = tmpfile();
if (!$put_data) die('tmpfile failed');
...
If you are on GoDaddy server check this out... https://stackoverflow.com/questions/9957397/tmpfile-returns-false-on-godaddy-server
Which server is actually showing the 500? From your code it seems to be the local server rather than the remote one.
change
public static function send( $xml )
{
to
public static function send( $xml )
{
error_reporting(E_ALL);
if(!function_exists('curl_exec')) {
var_dump("NO CURL");
return false;
}
Does that work?
This is almost certainly NOT the php timeout setting.
If you are using FastCGI as you have stated then you need to edit this file:
/etc/httpd/conf.d/fcgid.conf
And change:
FcgidIOTimeout 3600
Then do:
service httpd restart
This was driving me insane for 3 days. The top voted answer to this is wrong!
Related
I have to send an SMS by making an HTTP request via GET method. The link contains information in the form of GET variables, e.g.
http://www.somelink.com/file.php?from=12345&to=67890&message=hello%20there
After I run the script it has to be as if someone clicked the link and activated the SMS sending process.
I have found some links about GET requests and cURL, but it's all so confusing!
I think the easiest way to make an HTTP request via GET method from PHP is using file_get_contents().
<?php
$response = file_get_contents('http://example.com/send-sms?from=12345&to=67890&message=hello%20there');
echo $response;
Don’t forget to see the notes section for info on PHP configuration required for this to work. You need to set allow_url_fopen to true in your php.ini.
Note that this works for GET requests only and that you will have no access to the headers (request, nor response). Also, enabling allow_url_fopen might not be a good choice for security reasons.
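To avoid hand-encoding the parameters, http_build_query() can assemble the query string. A sketch using the example values from the question (the URL itself is the hypothetical one above):

```php
<?php
$params = array(
    'from'    => '12345',
    'to'      => '67890',
    'message' => 'hello there', // encoded automatically
);
$url = 'http://www.somelink.com/file.php?' . http_build_query($params);

echo $url; // http://www.somelink.com/file.php?from=12345&to=67890&message=hello+there

// $response = file_get_contents($url); // requires allow_url_fopen
```

Note that http_build_query() encodes spaces as + (valid in a query string); pass PHP_QUERY_RFC3986 as the fourth argument if the receiving end insists on %20.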
The easiest way is probably to use cURL. See https://web.archive.org/web/20180819060003/http://codular.com/curl-with-php for some examples.
Let's assume that we want to retrieve http://www.google.com:
$cURL = curl_init();
$setopt_array = array(CURLOPT_URL => "http://www.google.com", CURLOPT_RETURNTRANSFER => true, CURLOPT_HTTPHEADER => array());
curl_setopt_array($cURL, $setopt_array);
$response_data = curl_exec($cURL);
print_r($response_data);
curl_close($cURL);
/*
cURL is preinstalled by GoDaddy and many other PHP hosting providers;
it also ships with WAMP and XAMPP.
Good luck.
*/
I'm working on a bit of PHP code that depends on a remote file which happens to be hosted on pastebin. The server I am working on has all the necessary functions enabled, as running it with FILE_URL set to http://google.com returns the expected results. I've also verified through php.ini for extra measure.
Everything should work, but it doesn't. Calling file() on a URL formed as such, http://pastebin.com/raw.php?i=<paste id here>, returns a 500 server error. Doing the same on the exact same file hosted locally or on google.com returns a reasonable result.
I have verified that the URL is set to the correct value and verified that the remote page is where I think that it is. I'm at a loss.
ini_set("allow_url_fopen", true); // note: allow_url_fopen is PHP_INI_SYSTEM, so ini_set() normally has no effect on it
// Prefer remote (up-to-date) file, fallback to local file
if( ini_get("allow_url_fopen") ){
$file = file( FILE_URL );
}
if(!isset( $file ) || !$file ) {
$file = file( LOCAL_FILE_PATH );
}
I wasn't able to test this, but you should use cURL. Try something like this:
<?php
$url = "http://pastebin.com/2ZdFcEKh";
$ch = curl_init($url);
curl_setopt($ch, CURLOPT_USERAGENT, $_SERVER['HTTP_USER_AGENT']);
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
curl_exec($ch);
curl_close($ch);
Pastebin appears to use a protection system that automatically blocks IP addresses that issue "bot-like" requests.
In the case of your example, you get a 500 server error because the file() call never completes (their protection system never closes the connection) and there is no timeout facility in your call. The script is probably considered "bot-like" because file() does not send all the standard HTTP headers a typical browser would.
To solve this problem, I would recommend investigating cURL, and perhaps setting a browser-style user agent as a starting point to grant your script access. I should also mention that it would be in your interest to check whether this is considered a breach of the Pastebin user agreement. While I cannot see any reference to using scripts in their FAQ (as of 2012/12/29), they installed protection against scripts for a reason.
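A sketch of what that could look like, with a browser-style user agent and timeouts so a stalled connection cannot hang the script indefinitely (the user-agent string and paste id are placeholders):

```php
<?php
$pasteId = 'XXXXXXXX'; // placeholder paste id
$ch = curl_init('http://pastebin.com/raw.php?i=' . $pasteId);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);  // return the body instead of printing it
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);  // follow redirects
curl_setopt($ch, CURLOPT_USERAGENT, 'Mozilla/5.0 (compatible; MyFetcher/1.0)'); // example UA
curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 5);     // seconds to establish the connection
curl_setopt($ch, CURLOPT_TIMEOUT, 10);           // overall cap on the whole transfer
$body = curl_exec($ch);
curl_close($ch);
```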
Recently, with no changes to my code, my PHP page started to hang at a certain area. It generates all of the HTML on the page right before this line:
$tickerJSON = file_get_contents("http://mtgox.com/code/data/ticker.php");
I commented out everything else and this is the cause of the error.
I know that that JSON url is valid and the array names are correct. I'm not sure where the problem is in this case. Any help?
Note: It doesn't display a partial or white page, it'll keep loading forever with no display output.
The problem is that the remote server appears to purposely stall requests that don't send a user-agent string. By default, PHP's user-agent string is blank.
Try adding this line directly above your call:
ini_set('user_agent', 'PHP/' . PHP_VERSION);
I've tested the above using this script and it worked great for me:
<?php
ini_set('user_agent', 'PHP/' . PHP_VERSION);
$tickerJSON = file_get_contents("http://mtgox.com/code/data/ticker.php");
echo $tickerJSON;
Update:
$tickerJSON = shell_exec('wget --no-check-certificate -q -O - https://mtgox.com/code/data/ticker.php');
The remote connection you make takes a very long time. You can work around that by providing a timeout value: if the request takes too long, the function won't return any data, but it also won't prevent the rest of the script from running.
In addition, you need to set the user agent:
// Create a stream
$opts = array(
'http'=>array(
'timeout'=> 3, // 3 second timeout
'user_agent'=> 'hashcash',
'header'=>"Accept-language: en\r\n"
)
);
$context = stream_context_create($opts);
$url = "https://mtgox.com/code/data/ticker.php";
$tickerJSON = file_get_contents($url, FALSE, $context);
I am trying to read an XML file from another server. However, the company that's hosting me seems to have turned off the file_get_contents function for retrieving files from other servers (and their support is not very bright, and it takes forever for them to answer). So I need a workaround of some kind.
This is my current code
$url = urldecode( $object_list_url );
$xmlstr = file_get_contents ( $url );
$obj = new SimpleXMLElement ( $xmlstr, LIBXML_NOCDATA );
You could use cURL (if that's not been disabled).
Something like this:
$c = curl_init($url);
curl_setopt($c, CURLOPT_RETURNTRANSFER, true);
$xmlstr = curl_exec($c);
The ini var you're referring to is allow_url_fopen. To check, run this script:
var_dump(ini_get('allow_url_fopen'));
Ask your host to turn that ini value on (if it's disabled; it's on by default).
Without that setting enabled, you won't be able to access any remote URL through the filesystem functions.
Another idea, if they won't, is to try copying the file to your server. I expect all filesystem functions are covered by that ini setting, but it's always worth a try.
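copy() with a remote source is governed by the same allow_url_fopen setting, so a quick sketch to test both at once (the URL is a placeholder for the real XML location):

```php
<?php
if (ini_get('allow_url_fopen')) {
    // Fetch the remote XML once and keep a local copy to parse.
    $ok = copy('http://example.com/objects.xml', '/tmp/objects.xml');
    var_dump($ok); // false here would point at a network problem rather than the ini setting
} else {
    echo "allow_url_fopen is disabled; fall back to cURL\n";
}
```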
Can you execute the following script and provide the information as a comment?
<?php
phpinfo();
?>
I'm trying to get the contents from another file with file_get_contents (don't ask why).
I have two files: test1.php and test2.php. test1.php returns a string, based on the user that is logged in.
test2.php tries to get the contents of test1.php and is being executed by the browser, thus getting the cookies.
To send the cookies with file_get_contents, I create a streaming context:
$opts = array('http' => array('header'=> 'Cookie: ' . $_SERVER['HTTP_COOKIE'] . "\r\n"));
I'm retrieving the contents with:
$contents = file_get_contents("http://www.example.com/test1.php", false, $opts);
But now I get the error:
Warning: file_get_contents(http://www.example.com/test1.php) [function.file-get-contents]: failed to open stream: HTTP request failed! HTTP/1.1 404 Not Found
Does somebody knows what I'm doing wrong here?
edit:
forgot to mention: without the stream context, the page loads just fine. But without the cookies I don't get the info I need.
First, this is probably just a typo in your question, but the third argument to file_get_contents() needs to be your stream context, NOT the array of options. I ran a quick test with something like this, and everything worked as expected:
$opts = array('http' => array('header'=> 'Cookie: ' . $_SERVER['HTTP_COOKIE']."\r\n"));
$context = stream_context_create($opts);
$contents = file_get_contents('http://example.com/test1.txt', false, $context);
echo $contents;
The error indicates the server is returning a 404. Try fetching the URL from the machine PHP is running on and not from your workstation/desktop/laptop. It may be that your web server is having trouble reaching the site, your local machine has a cached copy, or some other network screwiness.
Be sure you repeat your exact request when running this test, including the cookie you're sending (command line curl is good for this). It's entirely possible that the page in question may load fine in a browser without the cookie, but when you send the cookie the site actually is returning a 404.
Make sure that $_SERVER['HTTP_COOKIE'] has the raw cookie you think it does.
If you're screen scraping, download Firefox and a copy of the LiveHTTPHeaders extension. Perform all the necessary steps to reach whatever page you want in Firefox. Then, using the output from LiveHTTPHeaders, recreate the exact same request sequence. Include every header, not just the cookies.
Finally, PHP Curl exists for a reason. If at all possible, (I'm not asking!) use it instead. :)
Just to share this information.
When you call session_start(), the session file is locked by PHP, so the current script is the only one that can access it. If you try to access it via fsockopen() or file_get_contents(), you can wait a long time, since you are trying to open a file that is still locked.
One way to solve this is to call session_write_close() to unlock the file, then session_start() again afterwards to relock it.
Example:
<?php
$opts = array('http' => array('header'=> 'Cookie: ' . $_SERVER['HTTP_COOKIE']."\r\n"));
$context = stream_context_create($opts);
session_write_close(); // unlock the file
$contents = file_get_contents('http://127.0.0.1/controler.php?c=test_session', false, $context);
session_start(); // Lock the file
echo $contents;
?>
Since file_get_contents() is a blocking function, the two scripts won't try to modify the session file concurrently.
But I'm sure this is not the best way to manipulate a session over an extended connection.
By the way: it's faster than cURL and fsockopen().
Let me know if you find something better.
Just out of curiosity, are you attempting file_get_contents() on a page that has a space in its URL? I remember trying to use fgc on a URL that had a space in the name; while my web browser parsed it just fine, fgc didn't. I ended up having to use str_replace() to replace ' ' with '%20'.
I would think this should be relatively easy to spot, though, as it would report only half of the filename. Also, I noticed in one of these posts someone used \r\n while defining headers. Keep in mind that PHP doesn't interpret these escape sequences in single quotes, but they work fine in double quotes.
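Rather than a manual str_replace(), rawurlencode() handles spaces and any other reserved characters in one go; apply it to just the path segment or filename (the filename here is hypothetical):

```php
<?php
$file = 'my report.pdf';
$url  = 'http://example.com/docs/' . rawurlencode($file);

echo $url; // http://example.com/docs/my%20report.pdf
```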
Make sure that test1.php exists on the server. Try opening it in your own browser to make sure!