Is fopen safe to use in public software? - php

I am creating a web application that I hope to release publicly, for anyone to download and install on their own web server. However, I was just informed that some web hosts disable the use of fopen in PHP due to "security issues", particularly on shared hosts. I use fopen during the installation process of the application. Should I be concerned about this? Is this common practice on shared hosts? If so, is there another way I can write to a file? I have heard of cURL, but wouldn't that require more advanced knowledge on the part of the end user? If so, it obviously can't be expected of them. Thanks very much!

fopen() itself is almost never disabled. The php.ini setting "allow_url_fopen", however, often is. So if you only access local files via fopen(), not http:// URLs, this is not really a concern.
If you need URL support, you could instead include an HTTP request class, such as the one in PEAR. That way you avoid the user-unfriendly dependency on the cURL extension module.
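If avoiding extension dependencies entirely matters, a plain-socket fallback is also possible. A minimal sketch (the function name is ours, not a standard API), assuming plain HTTP/1.0 on port 80:

```php
<?php
// Minimal HTTP GET over a raw socket: no cURL extension needed, and
// fsockopen() is controlled separately from allow_url_fopen.
function socket_http_get($host, $path = '/', $port = 80, $timeout = 5)
{
    $fp = @fsockopen($host, $port, $errno, $errstr, $timeout);
    if (!$fp) {
        return false; // connection failed
    }
    fwrite($fp, "GET {$path} HTTP/1.0\r\nHost: {$host}\r\nConnection: close\r\n\r\n");
    $response = stream_get_contents($fp);
    fclose($fp);
    // Split the headers from the body at the first blank line.
    $parts = explode("\r\n\r\n", $response, 2);
    return isset($parts[1]) ? $parts[1] : '';
}
```

A real request class (like PEAR's) also handles redirects, chunked encoding, and HTTPS, which this sketch does not.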

In my limited experience, fopen() is seldom disabled. Writing to a local file with cURL makes no sense, so that isn't an alternative here. Since all writing to a local file depends on fopen() in some way, the usual route for installable packages is:
Try to write the content to a file at installation time (possibly a file that already ships with the package, containing a sensible default).
On failure, present the user with the content you'd like to write, and offer the option to either copy/paste that content manually, or retry the write (for instance, after the user sets the file permissions correctly, which you of course explain how to do).
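A minimal sketch of that install-time flow (the file path and content here are hypothetical; a real installer would write into its own directory):

```php
<?php
// Hypothetical installer step: try to write a config file; if the write
// fails (e.g. permissions on a shared host), show the content for
// manual copy/paste and suggest fixing permissions before a retry.
$path = sys_get_temp_dir() . '/config.php';
$content = "<?php\n\$db_host = 'localhost';\n";

if (@file_put_contents($path, $content) !== false) {
    echo "Configuration written to {$path}\n";
} else {
    echo "Could not write {$path}.\n";
    echo "Create the file manually with the content below, or fix the\n";
    echo "directory permissions (e.g. chmod 775) and retry:\n\n";
    echo $content;
}
```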

Using cURL:
function GET($url, $header = null, $post = 0, $cookie = null) {
    $handle = curl_init();
    curl_setopt($handle, CURLOPT_URL, $url);
    curl_setopt($handle, CURLOPT_HEADER, $header);
    curl_setopt($handle, CURLOPT_FOLLOWLOCATION, true);
    curl_setopt($handle, CURLOPT_RETURNTRANSFER, true);
    if ($post) {
        curl_setopt($handle, CURLOPT_POST, true);
        curl_setopt($handle, CURLOPT_POSTFIELDS, $post);
    }
    curl_setopt($handle, CURLOPT_COOKIE, $cookie);
    if (preg_match('/^https/', $url)) {
        curl_setopt($handle, CURLOPT_SSL_VERIFYPEER, false);
    }
    return ($buffer = @curl_exec($handle)) ? $buffer : 0;
}
// A basic request:
echo GET('http://google.com', 1);
// POST data:
GET('/test.php', 1, array(
    'Name' => 'Jet',
    'id'   => 12,
    'foo'  => 'abc'
));
// Returns the response body on success, or 0 if the request failed.
// Send cookies:
GET('http://example.com/send.php', 1, array(
    'Name' => 'Jet',
    'id'   => 12,
    'foo'  => 'abc'
), "cookies");
For writing local files, see file_put_contents:
http://php.net/file_put_contents

Related

PHP // Include file from another server if the standard-include server is down

I have lots of websites on which I use a lot of includes. The files I include live on an external include server. My problem is: I want to make those files redundant, so that if the include server goes down, they are served from my second include server.
Doing that manually on each website would take far too long, so I wonder if there is a way to do it on the server side (so if the server is down, requests are forwarded to the other server).
Here is an example of how I usually include my files:
<?php
$url = 'http://myincludeserver.com/folder/fileiwanttoinclude.php';

function get_data($url)
{
    $timeout = 5;
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_POST, 1);
    curl_setopt($ch, CURLOPT_POSTFIELDS, $_REQUEST);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
    curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, $timeout);
    $data = curl_exec($ch);
    curl_close($ch);
    return $data;
}

$returned_content = get_data($url);
if (!empty($returned_content)) {
    echo $returned_content;
} else {
    include('includes/local_error_message.php');
}
?>
Thanks for reading!
Short answer:
You're more than likely going to want to refactor your code.
Longer answer:
If you truly want to do this at the server level then you're looking at implementing a "failover." You can read the wikipedia article, or this howto guide for a more in-depth explanation. To explain it simply, you would basically need 3 web servers:
Your include server
A backup server
A monitoring / primary server
It sounds like you've already got all three, but bullet three would ideally be a service provided by a third party for extra redundancy, handling the DNS (there could still be downtime while DNS updates propagate). Of course, this introduces several gotchas that might make you end up refactoring anyway. For example, you might run into load-balancing challenges; your application now needs to account for resources shared between servers, such as anything written to disk, sessions, or databases. Tools like HAProxy can help.
The simpler option, especially if the domains associated with the includes are hidden from the user, is to refactor and simply replace bullet three with a script similar to your get_data function:
function ping($domain) {
    $ch = curl_init($domain);
    curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 10);
    curl_setopt($ch, CURLOPT_HEADER, true);
    curl_setopt($ch, CURLOPT_NOBODY, true);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    $response = curl_exec($ch);
    curl_close($ch);
    return (bool) $response;
}

$server1 = 'http://example.com';
$server2 = 'http://google.com';

$server = ping($server1) ? $server1 : $server2;
This would require you to update all of your files, but the good news is that you can automate the process by traversing all of your PHP files and replace the code via regex or by using a tokenizer. How you implement this option is entirely dependent on your actual code along with any differences between each site.
The only caveat here is that it could potentially double the hits to your server, so it would probably be better to cache the result, for instance by setting an environment or global variable, and refresh it periodically through cron.
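One way to sketch that caching (the function name, cache file, and TTL here are ours, not from the question; the health check is passed in so it can be the ping() function above or anything else):

```php
<?php
// Pick the first reachable server and remember it for $ttl seconds,
// so only one request per TTL actually pays for the health check.
function active_include_server(array $servers, $cacheFile, callable $isUp, $ttl = 300)
{
    if (is_file($cacheFile) && (time() - filemtime($cacheFile)) < $ttl) {
        return trim(file_get_contents($cacheFile)); // cache still fresh
    }
    foreach ($servers as $server) {
        if ($isUp($server)) {
            file_put_contents($cacheFile, $server); // remember the winner
            return $server;
        }
    }
    return false; // nothing reachable
}
```

A cron job can refresh the cache file out of band so user requests never block on the probe.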
I hope that helps.

How to send a file with curl without using a form?

I am working on a project where I use two separate servers: one for development and one for the visible (public) version. This is how the process works, and where I'm having trouble: every morning, I run some VBA macros that collect data, compile it (mainly .xlsx files), and send it to my development server via FTP. The visible server is supposed to use that data to display some information, but FTP is blocked on that server.
Because of that, I have to copy everything from my development server to the visible server every morning so that the data on the visible server is up to date, and I'd like to automate that.
I tried sending the data from the VBA macros directly to the visible server via HTTP requests (WinHttpRequest, to be exact), but that didn't work.
I searched online and found that cURL can send HTTP requests through PHP, and I'd like to use that solution if possible. Here is my current code:
send.php:
<?php
$request = curl_init('mysite/receive.php');
curl_setopt($request, CURLOPT_POST, true);
curl_setopt($request, CURLOPT_POSTFIELDS, array(
    'file' => '@MyFileRealPath.xlsx;filename=file'
));
curl_setopt($request, CURLOPT_RETURNTRANSFER, true);
echo curl_exec($request);
curl_close($request);
?>
receive.php:
<?php
var_dump($_FILES);
?>
When I run send.php, I get:
array(0) { }
So receive.php is not receiving any file. Does anyone know how to fix that?
If what I'm trying to do is not possible, does anyone know another way I could send the files from the development server to the visible one?
Thanks, and sorry for my imperfect English; I'm not a native speaker.
Have a nice day!
I hit something similar today when our server suddenly stopped sending a file over cURL.
It turned out the PHP version had been upgraded.
PHP 5.5 introduced a new cURL setting, CURLOPT_SAFE_UPLOAD, which disables the @ prefix for file uploads, and "PHP 5.6.0 changes the default value to TRUE".
I think you need to disable that setting with curl_setopt, like:
$request = curl_init('mysite/receive.php');
curl_setopt($request, CURLOPT_POST, true);
curl_setopt($request, CURLOPT_SAFE_UPLOAD, false);
curl_setopt($request, CURLOPT_POSTFIELDS, array(
    'file' => '@MyFileRealPath.xlsx;filename=file'
));
curl_setopt($request, CURLOPT_RETURNTRANSFER, true);
Alternatively, you can use CURLFile (PHP 5.5+):
$request = curl_init('mysite/receive.php');
curl_setopt($request, CURLOPT_POST, true);
curl_setopt($request, CURLOPT_POSTFIELDS, array(
    'file' => new CURLFile('MyFileRealPath.xlsx', 'application/vnd.openxmlformats-officedocument.spreadsheetml.sheet')
));
curl_setopt($request, CURLOPT_RETURNTRANSFER, true);
echo curl_exec($request);
curl_close($request);

Proxy blocking file_get_contents

In my business I have to use Google Maps in my application (to calculate distances).
We currently use a configuration script for the proxy.
In my application I query the Google Maps API with file_get_contents:
$url = 'http://maps.google.com/maps/api/directions/xml?language=fr&origin='.$adresse1.'&destination='.$adresse2.'&sensor=false';
$xml = file_get_contents($url);
$root = simplexml_load_string($xml);
$distance = $root->route->leg->distance->value;
$duree = $root->route->leg->duration->value;
$etapes = $root->route->leg->step;
return array(
    'distanceEnMetres' => $distance,
    'dureeEnSecondes' => $duree,
    'etapes' => $etapes,
    'adresseDepart' => $root->route->leg->start_address,
    'adresseArrivee' => $root->route->leg->end_address
);
But behind the proxy I get an unknown host error. (I tested at home; the code works fine.) I wanted to know if there is a way to use the same proxy that I use when browsing the web?
You can do this with cURL. It's a bit more verbose than a simple call to file_get_contents(), but a lot more configurable:
$url = 'http://maps.google.com/maps/api/directions/xml?language=fr&origin='.$adresse1.'&destination='.$adresse2.'&sensor=false';
$handle = curl_init($url);
curl_setopt($handle, CURLOPT_PROXY, ''); // your proxy address (and optional :port)
curl_setopt($handle, CURLOPT_PROXYUSERPWD, ''); // credentials in username:password format (if required)
curl_setopt($handle, CURLOPT_FOLLOWLOCATION, 1);
curl_setopt($handle, CURLOPT_RETURNTRANSFER, 1);
$xml = curl_exec($handle);
curl_close($handle);
//continue logic using $xml as before
Google restricts the number of API queries per time period, so you should first cache results on your side and add a pause between queries:
https://developers.google.com/maps/documentation/business/articles/usage_limits
Of course, you can also specify a "context" to make file_get_contents() recognize a proxy. Please see my own question, and my answer to it, here.
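For reference, here is roughly what that context looks like. The proxy host and port are placeholders you would replace with your own, and most proxies also need request_fulluri:

```php
<?php
// file_get_contents() through an HTTP proxy using a stream context.
// The proxy address below is a placeholder.
$url = 'http://maps.google.com/maps/api/directions/xml?language=fr&origin=Paris&destination=Lyon&sensor=false';
$context = stream_context_create(array(
    'http' => array(
        'proxy'           => 'tcp://proxy.example.com:8080', // your proxy here
        'request_fulluri' => true, // send the absolute URL, as proxies expect
        'timeout'         => 5,
        // 'header' => 'Proxy-Authorization: Basic ' . base64_encode('user:pass'),
    ),
));
$xml = @file_get_contents($url, false, $context); // false if the proxy is unreachable
```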

Easiest way to grab filesize of remote file in PHP?

I was thinking of doing a head request with cURL, was wondering if this is the way to go?
The best solution, following the KISS principle:
$head = array_change_key_case(get_headers("http://example.com/file.ext", 1));
$filesize = $head['content-length'];
I'm guessing that using cURL to send a HEAD request is a nice possibility; something like this would probably do:
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, 'http://sstatic.net/so/img/logo.png');
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_HEADER, true);
curl_setopt($ch, CURLOPT_NOBODY, true);
curl_exec($ch);
$size = curl_getinfo($ch, CURLINFO_CONTENT_LENGTH_DOWNLOAD);
var_dump($size);
And will get you:
float 3438
This way you use a HEAD request and don't download the whole file; still, you depend on the remote server sending a correct Content-Length header.
Another option you might think of is filesize()... but it will fail. The documentation states (quoting):

As of PHP 5.0.0, this function can also be used with some URL wrappers. Refer to List of Supported Protocols/Wrappers for a listing of which wrappers support stat() family of functionality.

And, unfortunately, the HTTP and HTTPS wrappers do not support stat(). If you try, you'll get an error like this:
Warning: filesize() [function.filesize]: stat failed for http://sstatic.net/so/img/logo.png
Too bad :-(
Yes. Since the file is remote, you're completely dependent on the value of the Content-Length header (unless you want to download the whole file). You'll want to curl_setopt($ch, CURLOPT_NOBODY, true) and curl_setopt($ch, CURLOPT_HEADER, true).
Using a HEAD request and checking for Content-Length is the standard way to do it, but you can't rely on it in general: the Content-Length header is optional, and the server might not even implement the HEAD method. If you know which server you're probing, you can test whether it works, but as a general solution it isn't bulletproof.
If you don't need a bulletproof solution you can just do:
strlen(file_get_contents($url));
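The two approaches above can be combined: trust the Content-Length from a HEAD request when the server provides one, and only download the whole body as a last resort. A sketch (the function name is ours):

```php
<?php
// Try a HEAD request first; fall back to downloading the body and measuring it.
function remote_filesize($url)
{
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_NOBODY, true);          // HEAD request, no body
    curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_setopt($ch, CURLOPT_TIMEOUT, 10);
    curl_exec($ch);
    $length = curl_getinfo($ch, CURLINFO_CONTENT_LENGTH_DOWNLOAD); // -1 if unknown
    curl_close($ch);

    if ($length > 0) {
        return (int) $length; // server sent a usable Content-Length
    }
    // Last resort: fetch the whole body (expensive for large files).
    $body = @file_get_contents($url);
    return $body === false ? false : strlen($body);
}
```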

Any way to keep curl's cookies in memory and not on disk

I'm doing some cURL work in PHP 5.3.0.
I'm wondering if there is any way to tell the curl handle/object to keep the cookies in memory (assuming I'm reusing the same handle for multiple requests), or to somehow return them and let me pass them back when making a new handle.
There's this long-accepted method for getting them in and out of the request:
curl_setopt($ch, CURLOPT_COOKIEJAR, $filename);
curl_setopt($ch, CURLOPT_COOKIEFILE, $filename);
But I'm hitting scenarios where I need to run multiple copies of a script out of the same directory, and they step on each other's cookie files. Yes, I know I could use tempnam() and make sure each run has its own cookie file, but that leads me to my second issue.
There is also the issue of having these cookie files on disk at all. Disk I/O is slow and a bottleneck, I'm sure. I don't want to have to deal with cleaning up the cookie file when the script is finished (if it even exits in a way that lets me clean it up).
Any ideas? Or is this just the way things are?
You can use the CURLOPT_COOKIEJAR option and set the file to "/dev/null" for Linux / Mac OS X or "NULL" for Windows. This prevents the cookies from being written to disk, but keeps them in memory as long as you reuse the handle and don't call curl_close().
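A sketch of that setup (the URLs are placeholders): while the same handle is reused, cookies live in cURL's in-memory engine, and a jar pointing at the null device means nothing is flushed to disk on cleanup.

```php
<?php
// Reuse one handle so cookies persist in memory; the jar is the null
// device, so nothing is written to disk. URLs are placeholders.
$jar = (DIRECTORY_SEPARATOR === '\\') ? 'NUL' : '/dev/null';

$ch = curl_init();
curl_setopt($ch, CURLOPT_COOKIEJAR, $jar); // enables the cookie engine
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_TIMEOUT, 5);

curl_setopt($ch, CURLOPT_URL, 'http://example.com/login');
curl_exec($ch); // any Set-Cookie headers are now held in memory

curl_setopt($ch, CURLOPT_URL, 'http://example.com/account');
curl_exec($ch); // same handle, so the cookies are sent automatically

curl_close($ch);
```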
Unfortunately, I don't think you can use 'php://memory' as the cookie file. The workaround is to parse the headers yourself, which can be done pretty easily. Here is an example of a page making two requests and passing the cookies along yourself.
curl.php:
<?php
$curl = curl_init();
curl_setopt($curl, CURLOPT_URL, 'http://localhost/test.php?message=Hello!');
curl_setopt($curl, CURLOPT_FOLLOWLOCATION, false);
curl_setopt($curl, CURLOPT_HEADER, true);
curl_setopt($curl, CURLOPT_RETURNTRANSFER, true);
$data = curl_exec($curl);
curl_close($curl);

preg_match_all('|Set-Cookie: (.*);|U', $data, $matches);
$cookies = implode('; ', $matches[1]);

$curl = curl_init();
curl_setopt($curl, CURLOPT_URL, 'http://localhost/test.php');
curl_setopt($curl, CURLOPT_FOLLOWLOCATION, false);
curl_setopt($curl, CURLOPT_HEADER, true);
curl_setopt($curl, CURLOPT_RETURNTRANSFER, true);
curl_setopt($curl, CURLOPT_COOKIE, $cookies);
$data = curl_exec($curl);
echo $data;
?>
test.php:
<?php
session_start();
if (isset($_SESSION['message'])) {
    echo $_SESSION['message'];
} else {
    echo 'No message in session';
}
if (isset($_GET['message'])) {
    $_SESSION['message'] = $_GET['message'];
}
?>
This will output 'Hello!' on the second request.
Just set CURLOPT_COOKIEFILE to a file that doesn't exist; usually an empty string is the best option. Then DON'T set CURLOPT_COOKIEJAR; this is the trick. It prevents a file from being written, but the cookies stay in memory. I just tested this and it works (my test: send HTTP auth data to a URL that redirects to a login URL that authenticates the request, then redirects back to the original URL with a cookie).
There is but it's completely unintuitive.
curl_setopt($curl, CURLOPT_COOKIEFILE, "");
For more details please see my answer in the comments
If you're using Linux, you could point these at somewhere within /dev/shm; this keeps them in memory, and you can be assured they won't persist across reboots.
I somehow thought that cURL's cleanup handled the unlinking of cookies, but I could be mistaken.
What works for me is using this setting:
curl_setopt($ch, CURLOPT_HEADER, 1);
And then parsing the result. Details in this blog post where I found out how to do this.
And since that is old, here is a gist replacing deprecated functions.
