PHP download resume with hidden URL link for large files

I'm sorry to bother you with my issues, but I'm facing a problem that I'm having trouble fixing.
I have a website with a login-restricted area.
Once the user is logged in, he can access my company's files (large files).
To keep the links from being spread all over the internet, when a user wants to download a file (hosted at an external URL), he clicks on a URL containing an MD5 hash of the file name, which redirects to a PHP script that generates the download headers and streams the file using fsockopen.
However, this does not support resuming downloads, which is not very practical when downloading files of 2 or 3 GB or when using a download manager.
How can I enable resume?
I have seen some PHP scripts using the fread method, but I don't think it would be a good idea in my case, because for big files it could bog down the server: if you do a progressive fread on a 2 GB file, good luck when 30 people are downloading it at the same time.

If you use fopen() and fseek(), you're essentially doing the same thing any web server does to answer HTTP Range requests.
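As an illustration, here is a minimal sketch of honoring a Range request that way. It assumes the file sits on the local filesystem outside the web root and that $_SESSION['user_id'] marks a logged-in user; the path and file name are placeholders for your own setup.

```php
<?php
// Minimal sketch of answering an HTTP Range request with fopen()/fseek().
session_start();
if (empty($_SESSION['user_id'])) {
    header('HTTP/1.1 403 Forbidden');
    exit;
}

$path  = '/var/private/bigfile.zip';   // placeholder path outside the web root
$size  = filesize($path);
$start = 0;
$end   = $size - 1;

if (isset($_SERVER['HTTP_RANGE']) &&
    preg_match('/bytes=(\d+)-(\d*)/', $_SERVER['HTTP_RANGE'], $m)) {
    $start = (int) $m[1];
    if ($m[2] !== '') {
        $end = min((int) $m[2], $size - 1);
    }
    header('HTTP/1.1 206 Partial Content');
    header("Content-Range: bytes $start-$end/$size");
}

header('Accept-Ranges: bytes');
header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="bigfile.zip"');
header('Content-Length: ' . ($end - $start + 1));

// Seek to the requested offset and push the file out in small chunks,
// so memory use stays constant no matter how big the file is.
$fp = fopen($path, 'rb');
fseek($fp, $start);
$remaining = $end - $start + 1;
while ($remaining > 0 && !feof($fp)) {
    $chunk = fread($fp, min(8192, $remaining));
    echo $chunk;
    $remaining -= strlen($chunk);
    flush();
}
fclose($fp);
```

A download manager that wants to resume simply sends a Range: bytes=N- header, and the script answers with a 206 response starting at byte N.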

You could make it so that the file can't be downloaded unless the user is logged in.
So instead of providing them with a direct link to the file, they get a link like foo.com/download.php?myfile.ext
And download.php would check the session before providing the user with the file download.
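A rough sketch of what such a download.php could look like, assuming the files are stored in /var/private outside the web root and the link puts the file name in the raw query string (both assumptions):

```php
<?php
// download.php -- hypothetical sketch: check the session, then hand out the file.
session_start();
if (empty($_SESSION['user_id'])) {
    header('HTTP/1.1 403 Forbidden');
    exit('You must be logged in to download files.');
}

$file = basename($_SERVER['QUERY_STRING']);   // strip any path components
$path = '/var/private/' . $file;
if ($file === '' || !is_file($path)) {
    header('HTTP/1.1 404 Not Found');
    exit;
}

header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="' . $file . '"');
header('Content-Length: ' . filesize($path));
readfile($path);
```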

Related

php - how to use readfile to not force a download

Is there a way to use PHP's readfile function to allow browsers to stream media files, such as an mp3 file?
I currently have a download system where readfile is used because the files are not stored in a public directory; however, I would like my users to be able to stream these files with their browser. At the moment, it forces them to download.
I have used the example from php.net.
thanks,
josh.
Depends on what the files are, and what you mean by "stream". The proper definition, in this context, is to send out enough of a large audio or video file that it can be played client-side without having to receive the entire file. Also, most streaming services allow the user to 'seek' to any given position in the video and have the stream restart at that point, again so that the user does not have to download the whole file.
The above requires specialized streaming server software.
However, I get the feeling that you may simply want the browser to open the file instead of prompting the user to save it. This requires you to send the client a Content-Type: header with the proper MIME type via the header() function.
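For instance, a hedged sketch for an mp3 served inline rather than as an attachment (the path is a placeholder):

```php
<?php
// Hypothetical sketch: serve an mp3 inline so the browser tries to play it
// instead of forcing a download.
$path = '/var/private/media/song.mp3';

header('Content-Type: audio/mpeg');                            // correct MIME type for mp3
header('Content-Disposition: inline; filename="song.mp3"');    // "inline" rather than "attachment"
header('Content-Length: ' . filesize($path));
readfile($path);
```

For the seek-to-any-position behavior described above, browsers and players also send Range requests, which you would need to honor in addition to these headers.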

Processing a file in the user's directory

I am writing a script that processes a .csv file. Currently the user has to upload the csv file to the server in order to process it, and then download the processed file, which is a lot of work for the user.
My question is, is there a way to process files from the user's directory path without the user having to upload the file first? So the user would just browse to the file to be processed, and the file would be saved and processed in that path.
Thanks,
Sbo
Then the only option you have is to do it client-side, which means using a client-side technology like Flash or JavaScript. The latter is probably the better choice. The following URL explains how you can do a client-side file upload: http://igstan.ro/posts/2009-01-11-ajax-file-upload-with-pure-javascript.html
You want to get access to the user's computer? Forget it.
The only way to achieve that is to use Java applets with special permissions. In PHP you need to upload the file; it can go to a temporary directory, but you still need to upload it.
Java applets need to be signed and have a certificate to be accepted by the user. There is no other way I know of to get access to the user's files.
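For completeness, a sketch of the usual round trip the answers describe: the browser uploads the CSV, PHP reads it straight from the temporary upload, and the processed result is streamed back as a download. The form field name "csv" and the per-row transformation are assumptions.

```php
<?php
// process.php -- hypothetical sketch of the upload -> process -> download round trip.
if (!isset($_FILES['csv']) || $_FILES['csv']['error'] !== UPLOAD_ERR_OK) {
    exit('Upload failed.');
}

$in = fopen($_FILES['csv']['tmp_name'], 'r');   // PHP's temp copy of the upload

header('Content-Type: text/csv');
header('Content-Disposition: attachment; filename="processed.csv"');

$out = fopen('php://output', 'w');
while (($row = fgetcsv($in)) !== false) {
    // ... transform $row here ...
    fputcsv($out, $row);
}
fclose($in);
fclose($out);
```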

Is downloading a file through PHP or through a direct link faster?

I need to let the user download a file (for example, a PDF). Which will take longer:
sending the file via PHP (with specific headers),
or putting it in a public HTTP folder and giving the user the public link to download it (without PHP's help)?
In the first case the original file could stay in a private zone.
But I'm thinking it will take some time to send the file via PHP.
So how can I measure the time PHP spends sending the file and how much memory it consumes?
P.S. In the first case, when PHP sends the headers and the browser (if a PDF plugin is installed) tries to open the file inside the browser, is PHP still working, or does it push out the whole file immediately after the headers are sent? And if the plugin is not installed and the browser shows a "save as" dialog, is PHP still working?
There will be very little in it if you are worried about download speeds.
I guess it comes down to how big your files are, how many downloads you expect to have, whether your documents should be publicly accessible, and the download speed of the client.
Your main issue with PHP is the memory it consumes: each download ties up a PHP process, which might be 8 MB to 20 MB depending on what your script does, whether you use a framework, etc.
Out of interest, I wrote a symfony application to offer downloads, and to do things like concurrency limiting, bandwidth limiting etc. It's here if you're interested in taking a look at the code. (I've not licensed it per se, but I'm happy to make it GPL3 if you like).
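As for measuring it: a rough sketch that streams the file in chunks (so memory stays low) and logs the elapsed time and peak memory afterwards. The path and the use of error_log() are placeholders.

```php
<?php
// Rough sketch: chunked output plus a timing/memory log entry at the end.
$path  = '/var/private/report.pdf';
$start = microtime(true);

header('Content-Type: application/pdf');
header('Content-Length: ' . filesize($path));

$fp = fopen($path, 'rb');
while (!feof($fp)) {
    echo fread($fp, 8192);   // 8 KB at a time instead of loading the whole file
    flush();
}
fclose($fp);

// PHP keeps running until the last chunk has been handed to the web server,
// so these figures cover the whole transfer from PHP's point of view.
error_log(sprintf(
    'sent %s in %.2f s, peak memory %.1f MB',
    basename($path),
    microtime(true) - $start,
    memory_get_peak_usage(true) / 1048576
));
```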

Allow logged in users to view and download files (some 250+ MB) that would normally be 403 access denied

I'm building a web server out of a spare computer in my house (with Ubuntu Server 11.04), with the goal of using it as a file sharing drive that can also be accessed over the internet. Obviously, I don't want just anyone being able to download some of these files, especially since some would be in the 250-750MB range (video files, archives, etc.). So I'd be implementing a user login system with PHP and MySQL.
I've done some research on here and other sites and I understand that a good method would be to store these files outside the public directory (e.g. /var/private vs. /var/www). Then, when the file is requested by a logged in user, the appropriate headers are given (likely application/octet-stream for automatic downloading), the buffer flushed, and the file is loaded via readfile.
However, while I imagine this would be a piece of cake for smaller files like documents, images, and music files, would this be feasible for the larger files I mentioned?
If there's an alternate method I missed, I'm all ears. I tried setting a folder's permissions to 750 and similar, but I could still view the file over normal HTTP in my browser, as if I were considered part of the group (and when I set the permissions so that I can't access the file, neither can PHP).
Crap, while I'm at it, any tips for allowing people to upload large files via PHP? Or would that have to be done via FTP?
You want the X-Sendfile header. It will instruct your web server to serve up a specific file from your file system.
Read about it here: Using X-Sendfile with Apache/PHP
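A hedged sketch of what the PHP side might look like with mod_xsendfile enabled and XSendFilePath pointing at the private directory; the paths and the session check are assumptions.

```php
<?php
// Hypothetical sketch: PHP only checks the session and names the file;
// Apache (via mod_xsendfile) streams the file itself.
session_start();
if (empty($_SESSION['user_id'])) {
    header('HTTP/1.1 403 Forbidden');
    exit;
}

$path = '/var/private/video.mkv';   // placeholder path outside the web root
header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="video.mkv"');
header('X-Sendfile: ' . $path);
// No readfile() here: Apache reads the file, so PHP's memory use stays tiny.
```

Because Apache serves the file itself, it also handles Range requests, so resuming large downloads works without extra PHP code.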
That could indeed become an issue with large files.
Isn't it possible to just use FTP for this?
HTTP isn't really meant for large files but FTP is.
The solution you mentioned is the best possible one when the account system is handled via PHP and MySQL. If you want to keep it away from PHP and let the server do the job, you can password-protect the directory via an .htaccess file. That way the files won't go through PHP, but honestly there's nothing you should be worried about. I recommend you go with your method.

How to store a processed file on the user's desktop?

I have created a website with an image-scaling option. Now I want to store the scaled image on the user's desktop, but it is being saved in the folder where the code lives.
Please help me with a PHP script to store that file on the desktop.
If your website is going to actually live on the web instead of on people's computers locally, then you can't directly save files to their computers. You can serve the image to them with PHP as a file download by setting the proper MIME type and content headers (using header()) followed by the file itself, or better yet offer the user a download link so they can choose to download it themselves.
If your website is going to be used locally (a little odd, but it could happen), then you can use fopen(), fwrite() and fclose() in PHP to work with local files.
I don't think it is possible to do this without asking for user intervention on where to save the processed file. If you think about it, it would be a significant security flaw if a web server could arbitrarily save files to user desktops!
The best you could do is set the content header of the generated file to have a content disposition of attachment and then the browser will prompt the user where to save the file.
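For example, a minimal sketch along those lines; $scaledPath is a hypothetical placeholder for wherever your scaling script wrote its output.

```php
<?php
// Minimal sketch: send the scaled image back with a Content-Disposition of
// "attachment" so the browser prompts the user for a save location
// (their desktop, if they choose it).
$scaledPath = '/tmp/scaled.jpg';

header('Content-Type: image/jpeg');
header('Content-Disposition: attachment; filename="scaled.jpg"');
header('Content-Length: ' . filesize($scaledPath));
readfile($scaledPath);
```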
