So I ran into a little problem with a project of mine. We have a bulky server with lots of space, as well as a light static storage server that can only be used to store things. We need to make sure that only authenticated people can access the resources on the static server, so I thought about making a pseudo-proxy out of readfile(), since we can use allow_url_fopen.
So I tried the following code as a test:
<?php
$type = "video/webm";
$loc = "http://a.pomf.se/fzggfj.webm";
header('Content-Type: '.$type);
header('Content-Length: '.filesize($loc));
readfile($loc);
exit;
This always fails; the browser reads the output as corrupted. Interestingly, when you do this:
<?php
$type = "video/webm";
$loc = "../test.webm";
header('Content-Type: '.$type);
header('Content-Length: '.filesize($loc));
readfile($loc);
exit;
it does work, even though the file is exactly the same. Does anyone know why readfile() will not handle the remote case correctly?
EDIT:
I got the error message from it; it was embedded in the output file:
Warning: filesize(): stat failed for http://a.pomf.se/fzggfj.webm in C:\uniform\UniServerZ\www\director.php on line 5
Is filesize() my problem here?
OK, I fixed it. deceze was correct, and filesize() was the issue. Let the record show that filesize() doesn't work on remote resources.
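For anyone hitting the same wall: one way to get the length of a remote file without filesize() is to read the Content-Length response header, e.g. with get_headers(). A minimal sketch, using the URL from the question:
<?php
// Sketch: read the remote file's length from its HTTP headers instead of filesize().
// get_headers() also needs allow_url_fopen; Content-Length may come back as an
// array if the server redirects.
$loc = "http://a.pomf.se/fzggfj.webm";
$headers = get_headers($loc, 1);
if ($headers !== false && isset($headers['Content-Length'])) {
    $length = is_array($headers['Content-Length'])
        ? end($headers['Content-Length'])
        : $headers['Content-Length'];
    header('Content-Type: video/webm');
    header('Content-Length: '.$length);
}
readfile($loc);
exit;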
You need to activate allow_url_fopen by adding allow_url_fopen=1 in your php.ini.
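Note that allow_url_fopen is a PHP_INI_SYSTEM setting, so it cannot be enabled at runtime with ini_set(); it has to go in php.ini or the server configuration. You can at least verify it from a script:
<?php
// Check whether fopen()/readfile() are allowed to open remote URLs.
if (!ini_get('allow_url_fopen')) {
    die('allow_url_fopen is disabled; set "allow_url_fopen = On" in php.ini');
}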
Why don't you download the video to a temporary directory and redirect the user there? (You can clear out the outdated tmp dir later with a cron script.)
try this:
<?php
$loc = "http://a.pomf.se/fzggfj.webm";
$pathToVideos = dirname(__FILE__).'/tmp/';
// Build a cache filename from the URL
$ext = explode('.', $loc);
$ext = end($ext);
$hash = md5($loc);
$filename = $hash.'.'.$ext;
$tmpFile = $pathToVideos.$filename;
// Download only if it isn't cached yet
if (!is_file($tmpFile)) {
    exec('wget -O '.escapeshellarg($tmpFile).' '.escapeshellarg($loc));
}
header('Location: /tmp/'.$filename);
exit(0);
Related
I want to protect a PDF file from being directly linked, but instead have my logged-in users be able to access it. I have a link which currently goes to a JavaScript function which posts a form:
$('nameofdoc').setProperty('value',doc);
document.getElementById('sendme').submit();
where sendme is the name of the form and nameofdoc holds the index of the document I want to display.
This then goes to a PHP file:
$docpath = $holdingArray[0].$holdingArray[1];
$file = $holdingArray[0]; // file name
$filename = $holdingArray[1]; // path to the file
header('Location: '.$docpath);
header('Content-type: application/pdf');
header('Content-Disposition: attachment; filename="'.$filename.'"');
readfile($filename);
This all works fine: it loads up the file and outputs the PDF. What I can't do is protect the directory from direct linking, i.e. www.mydomain.com/pathToPdf/pdfname.pdf
I've thought of using .htaccess to protect the directory, but it's on a shared host, so I'm not sure about the security, and when I've tried it I can't get it to work.
Any help would be great since this is my fourth day of trying to fix this.
thanks
Update
I've had a lot of help, thank you, but I'm not quite there yet.
I've got an .htaccess file that now launches another PHP file when a PDF is requested from the directory:
RewriteEngine on
RewriteRule ^(.*)\.(pdf)$ fileopen.php
When the fileopen.php file launches, it fails to open the PDF:
$path = $_SERVER['REQUEST_URI'];
$paths = explode('/', $path);
$lastIndex = count($paths) - 1;
$fileName = $paths[$lastIndex];
$file = basename($path);
$filepath = $path;
if (file_exists($file)) {
    header('Location: http://www.mydomain.com'.$path);
    header("Content-type: application/pdf");
    header("Content-Disposition: attachment; filename=".$file);
    readfile($filepath);
} else {
    echo "file not found using path ".$path." and file is ".$file;
}
The output is
file not found using path /documents/6/Doc1.pdf and file is Doc1.pdf
but the file does exist and is in that directory. Any ideas?
OK, I'm happy to report that Jaroslav really helped me sort out the issue. His method works well, but it is tricky to get all the directory paths lined up. In the end I spent a few hours playing about with combinations to get it working, but the principle he gave works well. Thanks!
The best way would be to protect that folder with .htaccess, as you have mentioned. So you put all PDFs in the pdf/ folder, and in the same folder you put this .htaccess file:
RewriteEngine on
RewriteRule .* your-php-script.php
Now no files in this folder can be accessed by URL. Every request to every file in this folder will return whatever your-php-script.php returns. In your-php-script.php you do something like this:
// Check if the user has the right to access the file. If not, show "access denied" and exit the script.
$path = $_SERVER['REQUEST_URI'];
$paths = explode('/', $path);
$lastIndex = count($paths) - 1;
$fileName = $paths[$lastIndex]; // Maybe add some code to detect subfolders if you have them
// Check if that file exists; if not, show some error message
// Output headers here
readfile($fileName);
Now if a user opens domain.com/pdf/nsa-secrets.pdf, Apache will run your-php-script.php. The script will have $_SERVER['REQUEST_URI'] set to "/pdf/nsa-secrets.pdf". You take the last part (the filename) and output it to the user (or not).
This will stop anyone from accessing the files directly from the internet by knowing the URL. If someone has direct access to the files on your server, it will not stop them. On the other hand, any shared host should stop users from getting at the files of other clients. The only way around that is to hack the server in some way, but then we are getting very paranoid, and if that may be a concern for you, you shouldn't be using shared hosting in the first place.
If you cannot make .htaccess work, you can try to obfuscate the files so it is difficult for an outsider to spot them. For example, change the file name from mySecretData.pdf to djjsdmdkjeksm.pdf. This may help a little bit.
I want to protect a pdf file from being directly linked but instead have my logged in users be able to access it.
Check to ensure there is an authenticated user before streaming the PDF's content.
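For example, a minimal sketch of such a check using PHP sessions (the session key and the file path are assumptions, not from the question):
<?php
// Sketch: only stream the PDF when a user is logged in.
// Assumes your login code sets $_SESSION['user_id'] on success.
session_start();
if (empty($_SESSION['user_id'])) {
    header('HTTP/1.1 403 Forbidden');
    exit('Access denied');
}
$filepath = '/path/outside/webroot/report.pdf'; // hypothetical location
header('Content-Type: application/pdf');
header('Content-Disposition: attachment; filename="report.pdf"');
header('Content-Length: '.filesize($filepath));
readfile($filepath);
exit;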
This is kinda sloppy, but it could work, assuming you can set up a MySQL DB. It lets you pass the "password" in the URL as an MD5 string or as clear text if you want to. Trying to set up some kind of security without using .htaccess or an existing framework is kinda clunky. This, however, won't even attach the file to the stream until it knows you've been "authenticated". You could make this a little better by setting up a login page that saves a cookie locally; then you wouldn't need to pass the passphrase in the URL.
$file = $_GET['file'];
$pass = $_GET['pass'];
$download_folder = '../Protected';
$file = basename($file); // strip any path components from the requested name
$filepath = "$download_folder/$file";
if (file_exists($filepath)) {
    if (CheckUser($pass)) {
        header("Content-type: application/octet-stream");
        header("Content-Disposition: attachment; filename=$file");
        session_write_close();
        readfile($filepath);
    } else {
        echo 'Not Authenticated!';
    }
} else {
    echo 'No File!';
}
function CheckUser($value){
    $con = mysqli_connect("test.com","test","123456","my_db");
    // Check connection
    if (mysqli_connect_errno()){
        echo "Failed to connect to MySQL: " . mysqli_connect_error();
    }
    // The MD5 hash is a string, so it must be quoted in the SQL
    $result = mysqli_query($con,"SELECT user FROM pass_table WHERE password = '".md5($value)."';");
    while($row = mysqli_fetch_array($result)){
        if($row['user']){
            mysqli_close($con);
            return true;
        }
    }
    mysqli_close($con);
    return false;
}
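As an aside, the query above interpolates the hash straight into the SQL. A parameterized version of the same lookup (a sketch; same hypothetical table and credentials as above):
<?php
// Sketch: the same password lookup as a prepared statement.
function CheckUserSafe($value) {
    $con = mysqli_connect("test.com", "test", "123456", "my_db");
    if (mysqli_connect_errno()) {
        return false;
    }
    $stmt = mysqli_prepare($con, "SELECT user FROM pass_table WHERE password = ?");
    $hash = md5($value);
    mysqli_stmt_bind_param($stmt, 's', $hash);
    mysqli_stmt_execute($stmt);
    mysqli_stmt_bind_result($stmt, $user);
    $found = (mysqli_stmt_fetch($stmt) === true) && !empty($user);
    mysqli_stmt_close($stmt);
    mysqli_close($con);
    return $found;
}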
I wrote a PHP function to copy an internet image to a local folder. Sometimes it works well, but sometimes it just generates an invalid file with a size of 1257 bytes.
function copyImageToLocal($url, $id)
{
    $ext = strrchr($url, ".");
    $filename = 'images/' . $id . $ext;
    // Capture the remote file through the output buffer
    ob_start();
    readfile($url);
    $img = ob_get_contents();
    ob_end_clean();
    $fp = @fopen($filename, "a");
    fwrite($fp, $img);
    fclose($fp);
}
Note: the $url passed in is valid. Sometimes this function fails the first time but succeeds on the second or third try. It's really strange...
Does this require some special PHP settings?
Please help me!
I have found the real reason: the test image URL rejects programmatic access, even though it can be opened in a browser.
I tried some other image URLs and the function works well.
So it seems I need to find a way to handle this kind of case.
Thanks guys!
Why don't you just open the file and write it to the disk like so:
file_put_contents($filename, fopen($url, 'r'));
This will even do the buffering for you, so you shouldn't run into memory problems (your version stores the whole image in memory before writing it to a file).
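If the host is rejecting non-browser requests, as the asker's edit suggests, a cURL download that sends a browser-like User-Agent may get through (a sketch; whether the host actually checks the User-Agent is an assumption):
<?php
// Sketch: fetch an image with cURL, sending a User-Agent header, since some
// hosts refuse requests that don't look like a browser.
function fetchImage($url, $filename) {
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
    curl_setopt($ch, CURLOPT_USERAGENT, 'Mozilla/5.0 (compatible; ImageFetcher/1.0)');
    $data = curl_exec($ch);
    $status = curl_getinfo($ch, CURLINFO_HTTP_CODE);
    curl_close($ch);
    if ($data === false || $status !== 200) {
        return false; // blocked or failed
    }
    return file_put_contents($filename, $data) !== false;
}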
I'm writing a function in PHP. On the client side I have a canvas image, which I serialize with toDataURL() and send to the server along with a file name so the image can be saved there. Here's the code:
<?php
$imageData=$GLOBALS['HTTP_RAW_POST_DATA'];
$data = json_decode($imageData, true);
$file = $data["file"];
$image = $data["data"];
$filteredData=substr($image, strpos($image, ",")+1);
$unencodedData=base64_decode($filteredData);
$fp = fopen( 'image/' . $file , 'wb' );
fwrite( $fp, $unencodedData);
fclose( $fp );
?>
The thing is, this code works. For two out of the three pages I used it on, it works fine. The problem is that when I copied and pasted it a third time, for some reason the file is created on the server but no data gets written into it. I don't think it's a problem on the client side, because I put a debug alert in the JavaScript and a debug echo in the PHP, and both print out the data fine. I made this short debug file:
<?php
$fp = fopen('data.txt', 'wb');
if (is_writable('data.txt')) {
    echo "file is writable<br>";
}
if (fwrite($fp, 'test') == FALSE) {
    echo "failed to write data<br>";
}
fclose($fp);
?>
And the output is
file is writable
failed to write data
I've tried using chmod and setting everything (the folder, and the text file before I write to it) to 0777, and I still get the same result: the file is made, but no data is written into it. Is there anything I'm missing, or any other approach that might help? I haven't found anything on Google and am still baffled as to why the same code worked exactly as expected twice before suddenly stopping for no apparent reason.
Thanks in advance.
I know this is an old post, but I had a very similar problem and found a solution (for me at least)! I ran out of disk space on my server, so it could create a 0-byte file but wouldn't write to it. After I cleared out some space (deleted a 13 GB error.log file), everything started working again as expected.
If fopen works but fwrite mysteriously doesn't, check your disk space. df -h is the command to check disk space on a Linux server.
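You can also check from within PHP using disk_free_space(), e.g. as a guard before writing (the 1 MB threshold is arbitrary):
<?php
// Sketch: bail out early when the partition this script writes to is nearly full.
$free = disk_free_space(__DIR__);
if ($free !== false && $free < 1024 * 1024) {
    die('Low disk space: only '.round($free / 1024).' KB free');
}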
Instead of $fp = fopen('data.txt', 'wb');, use $fp = fopen('data.txt', 'w'); and try again.
That is, change "wb" to "w".
When you write $fp = fopen('data.txt', 'w'); for your domain website.com with its root at /var/www/website/, and the PHP file is located at /var/www/website/php/server/file/admin.php or somewhere similar, it will actually create the file at /var/www/website/data.txt, because a relative path is resolved against the current working directory rather than the script's own directory.
Try giving an absolute path, or a path relative to your domain root, when creating files, e.g.:
$fp = fopen('php/server/file/data.txt', 'w');
Try the find command to see if the file was created somewhere else in the directory tree; on Ubuntu:
find /var/www/website/ -name 'data.txt'
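Alternatively, anchor the path to the script's own directory so it no longer depends on the working directory, e.g.:
<?php
// __DIR__ is the directory of the current script, so data.txt is created
// next to the script no matter where PHP was started from.
$fp = fopen(__DIR__.'/data.txt', 'w');
if ($fp !== false) {
    fwrite($fp, 'test');
    fclose($fp);
}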
I had this issue; the above may help if you run into something similar.
I have a directory containing data that should not be world-accessible until a certain date.
The directory, naturally, should not be directly world-readable with a web browser. I currently solve this with .htpasswd and .htaccess.
However, there is a world-readable .php file one directory level up. The PHP file, based on the date, conditionally generates basic HTML tags (e.g., <img .../>) that read from the protected directory.
Unfortunately, in my tests, the .php file requires authentication to load the data. My question is whether I'm trying to do something fundamentally impossible, or whether I can tweak it to make it work. Also, if it is possible, are there any additional issues (security or otherwise) that I should know about?
Additional information:
If possible, I would prefer not to use JavaScript.
PHP 5.3 is available.
Any other ideas for a solution (I already thought of a cron-job, which I might yet do)?
I'm guessing the problem you might have is this: even if you output <img src="protected.jpg" /> from an unprotected PHP file, you'll be able to show the HTML, but NOT the image file itself.
If I understand correctly what you're trying to do, you need either:
to write some kind of proxy script in PHP, in order to control access to each file (this is a bit tedious and requires generating the right headers and MIME types), or
to control access directly from .htaccess using time/date conditions, which might be your best option; see http://www.askapache.com/htaccess/time_hour-rewritecond-time.html
Edit: proxy example. I can't seem to find one online, so this is a function I often use when I want to control access to a file from PHP (for instance, sensitive data whose access needs to be verified against $_SESSION or DB values):
function send_binary_data($path, $mimetype, $filename = null){
    #ob_clean();
    if($filename === null) $filename = basename($path);
    $size = filesize($path);
    // no-cache
    header('Cache-Control: no-cache, must-revalidate, public');
    header('Pragma: no-cache');
    // binary file
    header('Content-Transfer-Encoding: binary');
    // mimetype
    header('Content-Type: ' . $mimetype);
    header('Content-Length: ' . $size);
    header('Content-Disposition: inline; filename=' . $filename);
    header('Content-Description: ' . $filename);
    // stream the file in 1 MB chunks to keep memory usage low
    $chunksize = 1 * (1024 * 1024);
    $handle = fopen($path, 'rb');
    if ($handle === false) {
        return false;
    }
    while (!feof($handle)) {
        $buffer = fread($handle, $chunksize);
        print $buffer;
    }
    fclose($handle);
    die();
}
Of course you still need to restrict direct access from .htaccess, but in the case of a proxy you'll redirect all requests to your unprotected proxy script, like this:
RewriteEngine ON
RewriteRule ^(.*)$ /proxy.php?file=$1 [NC,QSA]
And proxy.php would contain something like:
if(!isset($_GET['file'])) die('file not set');
$file = $_GET['file'];
// Perform all your custom checking, including security checks on the data retrieved from $_GET (see the sketch after this block); if access is granted:
$path = 'yourpath/'.$file;
$mimetype = 'defineregardingyour/ownrules';
send_binary_data($path, $mimetype);
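As a sketch of that custom checking, here are the two checks this question needs most: stripping path components from the request and refusing access before the release date (the date itself is a made-up placeholder):
<?php
// Sketch: sanitise the requested name and enforce a release date.
$file = basename($_GET['file']);        // drops any ../ traversal attempts
$releaseDate = strtotime('2014-06-01'); // hypothetical release date
if (time() < $releaseDate) {
    header('HTTP/1.1 403 Forbidden');
    die('Not available yet');
}
$path = 'yourpath/'.$file;
if (!is_file($path)) die('file not found');
send_binary_data($path, 'application/pdf');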
.htaccess only serves as an access control directly from the Internet to the directory.
PHP access is controlled by the chmod permissions. Try chmodding it to 755. You can still put password-protection or any other kind of protection on it with the .htaccess-file.
Considering the added comments, I assume you're trying to include images in your output that are inside the protected directory. Naturally, an unauthenticated user cannot access them... Why else would you have protected them?
You can add the files that need to be world-accessible to your .htaccess file.
I am working on a function for a PostgreSQL database where, when a client requests a database dump, the dump is offered as a download. This snapshot could later be used to restore the database. However, I can't seem to figure out how to do it. When the user presses the button, an AJAX call is made to the server, which executes the following code:
if($_POST['command'] == 'dump'){
    $dump = $table->Dump();
    header("Content-type: application/octet-stream");
    header('Content-Disposition: attachment; filename="'.$dump.'"');
}
Where $table->Dump() looks like this:
public function Dump(){
    $filename = dirname(__FILE__)."/db_Dump.out";
    exec("pg_dump ".$this->name." > $filename");
    return $filename;
}
The dump isn't made, though. Any tips on this? I thought that setting the headers would be enough to trigger a download, but apparently I was wrong. So what would be the correct way of creating a download?
Edit 1, #stevevls:
if($_POST['command'] == 'dump'){
    $dump = $table->Dump();
    $fh = fopen($dump, 'r') or die("Can't open file");
    header("Content-type: application/octet-stream");
    header('Content-Disposition: attachment; filename="'.$dump.'"');
    $dumpData = fread($fh, filesize($dump)); // filesize() takes a path, not a file handle
    fclose($fh);
    echo $dumpData;
}
I still don't get anything as a download though.
Edit 2, #myself
I have been able to get a return value; it seemed that the check for whether the given command was 'dump' was never reached. I fixed that, and now I get an error on the pg_dump command:
sh: cannot create ../database/db_Dump.sql: Permission denied
I bet this is due to PHP not being allowed to run pg_dump, but how can I get the system to allow it?
Edit 3, #myself
After resolving the issue with pg_dump (I added www-data, Apache's user on my system, to the sudoers list, which resolved it; setting the correct permissions on the directory being written to helps as well), I now get db_Dump.sql served as plain text instead of a save-as dialog. Any ideas on that?
First of all, check whether the dump file was created on disk.
Second, check that your PHP script has not hit its time limit, because making a dump can take a long time.
Third, do you really want to read the whole dump into memory? You can easily hit the memory limit, so do it part by part. On php.net there is an example in the fread manual:
$handle = fopen("http://www.example.com/", "rb");
$contents = '';
while (!feof($handle)) {
    $contents .= fread($handle, 8192);
}
fclose($handle);
It turns out it was all down to how I requested the download. It seems to be impossible to trigger a save-as download when you request the file via AJAX, as the returned content is handed to the success callback instead. After changing this to a direct link to the file, I was able to get a download.
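For completeness, a minimal sketch of a directly-linked download script (the database name and dump path are assumptions, not from the question):
<?php
// Sketch: dump.php, opened via a plain link so the browser shows a
// save-as dialog instead of handing the bytes to an Ajax callback.
$filename = '/tmp/db_Dump.sql'; // hypothetical writable location
exec('pg_dump mydb > '.escapeshellarg($filename)); // 'mydb' is assumed
header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="db_Dump.sql"');
header('Content-Length: '.filesize($filename));
readfile($filename);
unlink($filename); // remove the temporary dump afterwards
exit;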