I'm trying to create a shop where users can buy video/audio files. The files will be placed on another, remote server (Debian). I can't figure out how to allow downloading for a particular user only. I could calculate a checksum from the IP somehow, so the link would look something like this:
http://100.000.000.000/files/video.avi?hash=87a686d86d8868a6868a
But how do I check this on the remote server? I don't know whether it's a good idea to read the whole movie file with a PHP script.
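One way to make such a hash verifiable on the remote file server, without the file server having to call back to the shop, is an HMAC over the file path, IP, and an expiry time, using a secret shared by both servers. This is only a sketch; the secret, host, and parameter names are invented:

```php
<?php
// Shared secret known to both the shop server and the file server.
$secret = 'replace-with-a-long-random-secret';
$ip     = '203.0.113.7';      // the buyer's IP (example)
$file   = '/files/video.avi';
$expire = time() + 3600;      // link valid for one hour

// The shop server signs the link:
$hash = hash_hmac('sha256', $file . '|' . $ip . '|' . $expire, $secret);
$url  = "http://example.com{$file}?expires={$expire}&hash={$hash}";

// The file server recomputes the HMAC from the same pieces and compares,
// so it never needs to contact the shop:
$check = hash_hmac('sha256', $file . '|' . $ip . '|' . $expire, $secret);
$valid = hash_equals($check, $hash) && time() < $expire;
```

`hash_equals()` does a constant-time comparison, which avoids timing attacks on the hash check.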
Basically two methods are possible.
File system:
You could use the file system: create a password-protected folder for each user and copy all their files into it, or better, if you use Linux, use symbolic links (ln -s).
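A minimal sketch of the symlink approach from PHP. The demo uses a temp directory so it runs anywhere; on the real server these would be your media directory and the user's password-protected folder:

```php
<?php
// Demo paths; swap in your real media dir and per-user folder.
$base    = sys_get_temp_dir() . '/shop_' . uniqid();
$storage = $base . '/files';        // the real files, out of web reach
$userDir = $base . '/users/alice';  // one protected folder per user

mkdir($storage, 0750, true);
mkdir($userDir, 0750, true);
file_put_contents($storage . '/video.avi', 'raw video bytes');

// Link the purchased file into the user's folder instead of copying it.
$link = $userDir . '/video.avi';
symlink($storage . '/video.avi', $link); // what `ln -s` does, from PHP
```

The link costs no extra disk space, and revoking access is just `unlink($link)`.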
PHP:
Or you could stream files through PHP while it checks access. I don't think performance is a real problem: PHP doesn't need to do much if it just pushes through raw data.
$total = filesize($filepath);
$blocksize = (2 << 20); // 2 MB chunks
$sent = 0;
$handle = fopen($filepath, "rb");
// Push headers that tell what kind of file is coming down the pike
header('Content-Type: '.$content_type);
header('Content-Disposition: attachment; filename='.$filename);
header('Content-Length: '.$total);
// Now we need to loop through the file and echo out chunks of file data
// Dumping the whole file in one go fails at > 30 MB!
while($sent < $total) {
echo fread($handle, $blocksize);
$sent += $blocksize;
}
(The code is abbreviated: no error checks, no password check, no fclose(), etc.)
It does depend on what kind of password system you have, and what you're allowed to do on your server.
Related
I want to download different feeds from some publishers. The trouble is that they are, first of all, zipped as .gz, and second, not in the right format. You can download one of the feeds and check it out. They don't have any file extension, so I'm forced to add the .csv myself.
My question now is: how can I unzip those files from the different URLs?
I know how to rename them, but how do I unzip them?
I already searched for it and found this one:
//This input should be from somewhere else, hard-coded in this example
$file_name = '2013-07-16.dump.gz';
// Raising this value may increase performance
$buffer_size = 4096; // read 4kb at a time
$out_file_name = str_replace('.gz', '', $file_name);
// Open our files (in binary mode)
$file = gzopen($file_name, 'rb');
$out_file = fopen($out_file_name, 'wb');
// Keep repeating until the end of the input file
while (!gzeof($file)) {
// Read buffer-size bytes
// Both fwrite and gzread are binary-safe
fwrite($out_file, gzread($file, $buffer_size));
}
// Files are done, close files
fclose($out_file);
gzclose($file);
But with those feeds it doesn't work...
Here are two example files: file one | file two
Do you have an idea? - Would be very grateful!
Greetings!
This works on Windows 10 + PHP 7.1.4. The following code has the same effect:
ob_start();
readgzfile($file_name);
file_put_contents($output_filename, ob_get_contents());
ob_end_clean();
Or you can try using the gzip command to decompress the file, and then use it. See:
Program execution Functions
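If the feeds are small enough to fit in memory, gzdecode() (PHP 5.4+) avoids the streaming loop entirely. The sample data here is made up for the demo:

```php
<?php
// Round-trip demo: build a sample .gz feed, then decompress it in one call.
$csv    = "id;price\n1;9.99\n";
$gzFile = tempnam(sys_get_temp_dir(), 'feed');
file_put_contents($gzFile, gzencode($csv));

// In the real script $gzFile would be the downloaded feed.
$decoded = gzdecode(file_get_contents($gzFile));
file_put_contents($gzFile . '.csv', $decoded); // add the missing .csv extension
```

For feeds of hundreds of megabytes, stick with the gzopen()/gzread() loop above to keep memory flat.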
I am using PHP to download a file, and I want the file to be deleted automatically from the server after the download completes successfully. I am using this code:
$fullPath = 'folder_name/download.txt';
if ($fd = fopen ($fullPath, "r")) {
$fsize = filesize($fullPath);
$path_parts = pathinfo($fullPath);
$ext = strtolower($path_parts["extension"]);
header("Content-type: application/octet-stream");
header("Content-Disposition: filename=\"".$path_parts["basename"]."\"");
header("Content-length: $fsize");
header("Cache-control: private"); //use this to open files directly
while(!feof($fd)) {
$buffer = fread($fd, 2048);
echo $buffer;
}
fclose ($fd);
}
unlink($fullPath);
You can see in the code that I unlink the file after the download. But if I do so, a corrupted file gets downloaded, because sometimes the file is deleted before it has been fully downloaded. Is there any way in PHP to know that the client downloaded the file successfully, so that I can delete it then? Any idea will be highly appreciated.
As far as I'm aware, you cannot use server-side PHP to detect whether the download has finished on the client. It seems ignore_user_abort() is the answer to your question (see below); otherwise you may just delete the file after a set amount of time.
ignore_user_abort(true);
if (connection_aborted()) {
unlink($f);
}
Related/Duplicate on Stackoverflow:
deleting a file after user download it
check if download is completed
If you really are downloading (instead of uploading, as the code in your post suggests), you might be interested in the tmpfile() function, which is specifically designed to create files that are removed as soon as their descriptor is closed.
There is no way to know with PHP when the user has finished downloading a file; I'd use a queue system to delete the file n seconds after the request:
How to Build a PHP Queue System
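Short of a full queue, a cron-run cleanup script that deletes files older than n seconds gets you most of the way. The directory and threshold here are invented for the demo:

```php
<?php
// Remove every file in the downloads dir that is older than $maxAge seconds.
$dir    = sys_get_temp_dir() . '/downloads_' . uniqid();
$maxAge = 60;

mkdir($dir);
file_put_contents($dir . '/old.bin', 'x');
touch($dir . '/old.bin', time() - 120);  // pretend it was served two minutes ago
file_put_contents($dir . '/new.bin', 'x');

foreach (glob($dir . '/*') as $file) {
    if (time() - filemtime($file) > $maxAge) {
        unlink($file);
    }
}
```

Pick $maxAge generously; a client on a slow connection may still be mid-download when the cleanup runs.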
Check the hash of the file on the server and on the client side.
You could compute the hash with JavaScript (How to calculate md5 hash of a file using javascript), send it to the server, and then check whether it matches the one computed on the server.
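The server-side half of that comparison is short. The request field the client hash arrives in is hypothetical:

```php
<?php
// Demo file standing in for the file the user just downloaded.
$path = tempnam(sys_get_temp_dir(), 'dl');
file_put_contents($path, 'file contents');

$serverHash = md5_file($path);
// In production this would come from the client, e.g. $_POST['hash'].
$clientHash = md5('file contents');

// hash_equals() compares the two digests in constant time.
$complete = hash_equals($serverHash, $clientHash);
```

Note this only proves the client *computed* the hash, not that it kept the file; a determined cheater could still fake it.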
Check the request: if the Range HTTP header is set, the client is downloading the file in pieces and wants only a small part of the data at once (for example: Range: bytes=500-999). Normally this is handled automatically by the web server, but in this case you have to handle it yourself and send back only the requested range. Store the progress in the session and credit the download only once the client has fetched all of the pieces.
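A minimal sketch of the parsing step (only the simple bytes=start-end form; open-ended and multi-range requests need extra handling):

```php
<?php
// Example value; a real script would read $_SERVER['HTTP_RANGE'].
$rangeHeader = 'bytes=500-999';

$start = $end = null;
if (preg_match('/^bytes=(\d+)-(\d+)$/', $rangeHeader, $m)) {
    $start = (int) $m[1];
    $end   = (int) $m[2];
}
// You would then fseek() to $start, answer with "206 Partial Content",
// send $end - $start + 1 bytes, and record the range in the session.
```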
This may be a little buggy for large files, but for small ones on a fast connection I use it with no problems.
<?php
### Check the CREATE FILE has been set and the file exists
if(isset($_POST['create_file']) && file_exists($file_name)):
### Download Speed (in KB)
$dls = 50;
### Delay time (in seconds) added to download time
$delay = 5;
## calculates estimated download time
$timer = round(filesize($file_name) / 1024 / $dls + $delay);
###Calculates file size in kb divides by download speed + 5 ?>
<iframe width="0" height="0" frameborder="0" src="<?php echo $file_name;?>"></iframe>
<h2>Please Wait, Your Download will complete in: <span id="logout-timer"></span></h2>
Redirects to SELF with File Value ?f=$file_name
<script>setTimeout(function(){ window.location = "<?php echo $_SERVER['PHP_SELF']?>?f=<?php echo $file_name;?>";},<?php echo $timer;?>000);</script>
Deletes the file
<?php
endif;
if (isset($_GET['f'])):
unlink(basename($_GET['f'])); // basename() guards against path traversal
### removes GET value and returns to page's original url
echo "<script> location.replace('".$_SERVER['PHP_SELF']."')</script>";
endif;?>
Download timer set for each file in var seconds
<script>
var seconds = <?php echo $timer;?>;
function secondPassed() {
    var minutes = Math.round((seconds - 30) / 60);
    var remaining = seconds % 60;
    if (remaining < 10) { remaining = "0" + remaining; }
    document.getElementById('logout-timer').innerHTML = minutes + ":" + remaining;
    if (seconds == 0) {
        clearInterval(countdownTimer);
        document.getElementById('logout-timer').innerHTML = "Timer";
    } else {
        seconds--;
    }
}
var countdownTimer = setInterval(secondPassed, 1000);
</script>
Not sure it will work in all cases, but try sleep(10) or something similar to delay the deletion of the file for a specific amount of time.
I'm making a website for a band, and they are giving out their EP for free, but they want the user to have to enter their email address before downloading. How would this be done in PHP?
The downloadable file should be placed out of reach of web users but within reach of your PHP script. Then, once the user is done filling in the form, you can force-download the file contents by opening it locally with, say, fopen().
Update (Adding Sample Code):
Suppose the file is "txt.txt", which should be within your script's reach. You open it, read it, and then output the contents after calling header() to say it's an attachment (force download):
$done = true;
if($done == true){
$filename = "txt.txt";
$conn = fopen($filename,"r");
$contents = fread($conn, filesize($filename));
fclose($conn);
header('Content-type: text/plain');
header('Content-Disposition: attachment; filename="downloaded.txt"');
echo $contents;
}
If you're storing the emails somehow, then simply set a $_SESSION value when they submit their email, and when writing the page, if that $_SESSION value has been set, provide a link to the media.
To start a session & set the value:
session_start();
$_SESSION['hasEmail'] = true;
And in the page:
session_start();
if (!empty($_SESSION['hasEmail'])) {
//Provide link
}
You could then also have the media link take you to a PHP script which uses fopen() or similar to get the file from another location on your filesystem out of reach of the user, such as one under .htaccess blocking, and it'll only serve the media if the $_SESSION value is set.
I'm finding it difficult to phrase this question correctly, let me try to explain our problem...
We have an intranet running on an Ubuntu box with Apache2/PHP 5.2.4. We have a bit of PHP code that reads a file from a directory that is not publicly accessible and outputs it to the screen (code below):
$file_path = '/home/path/to/filename.gif';
if(file_exists($file_path)){
$output = FALSE;
//File Information
$path_parts = pathinfo($file_path);
$file_size = filesize($file_path);
$file_ext = (isset($path_parts['extension'])) ? strtolower($path_parts['extension']) : null;
$file_name = $path_parts['basename'];
//Sets up the headers
if($file_size > 0){
header('Content-Length: ' .$file_size);
}
header('Content-Disposition: attachment; filename="'.$file_name.'"');
header('Content-Type: application/octet-stream');
//Reads the File
if($file_size > 0){
$handle = fopen($file_path, "r");
$output = fread($handle, $file_size);
fclose($handle);
}
//Outputs the File
echo $output;
}
Inside our network when, browsing to the page that uses this code, the file is downloaded perfectly and quickly...
However, when accessing this page via our Cisco ASA/Proxy/VPN (not sure what to call it) this code locks up the browser, but does eventually download the file...
After a bit of experimenting: when I take out the headers and just echo the contents of the file to the browser, it prints with no problem. However, as soon as I add the header lines back into the code, it causes the hanging again, but only when accessed via this box.
Anybody come across this problem before or have any idea what we can try to move forward?
Thanks for any advice...
Have you tried eliminating the Content-Length header entirely? The proxy may be taking that as a firm promise, and if the data you're sending ends up being a different size, the proxy may wait for those last few "missing" bytes to show up.
Just as an aside, you should use readfile() instead of the fopen()/fread()/echo construct you have now.
As it stands now, you're slurping the entire file into memory and then echoing it out. For large files and multiple requests, you'll kill the server through memory exhaustion. readfile() automatically streams the file in smaller chunks so that memory usage stays minimal.
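The readfile() version of the snippet, with the same headers (the demo file is created on the fly; in the real script this is your path outside the web root):

```php
<?php
// Demo file standing in for '/home/path/to/filename.gif'.
$file_path = tempnam(sys_get_temp_dir(), 'dl');
file_put_contents($file_path, 'gif bytes');

header('Content-Length: ' . filesize($file_path));
header('Content-Disposition: attachment; filename="' . basename($file_path) . '"');
header('Content-Type: application/octet-stream');

// readfile() streams the file in internal chunks, so memory use stays flat.
// It returns the number of bytes that were sent.
$bytes = readfile($file_path);
```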
Your proxy apparently has problems with Content-Type: application/octet-stream. Try setting it to the real MIME type of each file. You can use the Fileinfo module to find out which MIME type a given file has, like this:
//You may need to specify the location of your system's magic file
//See http://php.net/finfo_open for more info
$finfo = new finfo(FILEINFO_MIME);
$mimetype = $finfo->file($file_path);
I have a file-hosting site and users earn a reward for downloads. So I wanted to know: is there a way I can track whether the visitor downloaded the whole file, so that there are no fake partial downloads made just for the rewards?
Thank You.
If you could monitor the HTTP response codes returned by your web server and tie them back to the sessions that generated them, you would be in business.
A response code of 206 shows that the system has delivered some of the information but not all of it. When the final chunk of the file goes out, it should not have a response code of 206.
If you can tie this to user sessions by putting the session code inside the URL, then you could give points based on a simple log aggregation.
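A sketch of that aggregation step. The log lines below are hypothetical (Apache common log format), with the session id carried in the URL as ?sid=...:

```php
<?php
// Two made-up access-log lines: a partial (206) response and the final full one.
$log = [
    '1.2.3.4 - - [10/Oct/2023:13:55:36 +0000] "GET /dl/file.zip?sid=abc HTTP/1.1" 206 1048576',
    '1.2.3.4 - - [10/Oct/2023:13:55:40 +0000] "GET /dl/file.zip?sid=abc HTTP/1.1" 200 4194304',
];

$finished = [];
foreach ($log as $line) {
    // Grab the session id and the status code from each line.
    if (preg_match('/sid=(\w+) HTTP\/[\d.]+" (\d{3})/', $line, $m) && $m[2] !== '206') {
        $finished[$m[1]] = true; // a non-partial response: credit this session
    }
}
```

In practice you would also compare the byte count in the log against the file size before crediting.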
I implemented a similar solution on a file hosting website.
What you want to do is use the register_shutdown_function callback that allows you to detect the end of execution of a php script regardless of the outcome.
Then you want to have your file in a non-web-accessible location on your server(s) and have your downloads go through PHP, the idea being that you want to be able to track how many bytes have been passed to the client.
Here's a basic way of implementing it:
<?php
register_shutdown_function('shutdown');
$file_name = 'file.ext';
$path_to_file = '/path/to/file';
$stat = #stat($path_to_file);
//Set headers
header('Content-Type: application/octet-stream');
header('Content-Length: '.$stat['size']);
header('Connection: close');
header('Content-disposition: attachment; filename='.$file_name);
//Get pointer to file
$fp = fopen('/path/to/file', 'rb');
fpassthru($fp);
function shutdown() {
$status = connection_status();
//A connection status of 0 indicates no premature end of script
if($status == 0){
//Do whatever to increment the counter for the file.
}
}
?>
There are obviously ways to improve, so if you need more details or another behaviour, please let me know!