Is there a way to use the PHP readfile function to allow browsers to stream media files, such as an MP3 file?
I currently have a download system where readfile is used because the downloadable files are not stored in a public directory. However, I would like my users to be able to stream these files in their browser; at the moment, it forces them to download.
I have used the example from php.net.
thanks,
josh.
Depends on what the files are, and what you mean by "stream". The proper definition, in this context, is to send out enough of a large audio or video file that it can be played client-side without the client having to receive the entire file first. Also, most streaming services allow the user to 'seek' to any given position in the video and have the stream restart at that point, again so that the user does not have to download the whole file.
The above requires specialized streaming server software.
However, I get the feeling that you may simply want the browser to open the file instead of prompting the user to save it. This requires you to set the Content-Type header to the proper MIME type via the header() function.
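A minimal sketch of that idea: pick a Content-Type from the file extension so the browser plays the file inline instead of downloading it. The MIME map and the private file path in the usage comment are assumptions for illustration, not a complete list.

```php
<?php
// Map a file name to a MIME type the browser can render inline.
// The map below is a small assumed subset; extend it as needed.
function mimeTypeFor(string $file): string
{
    $map = [
        'mp3' => 'audio/mpeg',
        'mp4' => 'video/mp4',
        'ogg' => 'audio/ogg',
        'pdf' => 'application/pdf',
    ];
    $ext = strtolower(pathinfo($file, PATHINFO_EXTENSION));
    return $map[$ext] ?? 'application/octet-stream';
}

// Usage (web context) - the browser plays the file if it can:
// $file = '/var/private/media/song.mp3';   // outside the web root (assumed path)
// header('Content-Type: ' . mimeTypeFor($file));
// header('Content-Length: ' . filesize($file));
// readfile($file);
```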
Related
Say I store PDF files in the database (not important). The users of the application visit a page periodically to grab a stored PDF and print it - for adhesive labels, btw.
I am annoyed at the thought of their downloads directory filling up with duplicates of the same document over time since they will download the PDF every time they need to print the labels.
Is there a way to instruct the browser to cache this file? Or any method of relative linking to the users file system possibly? All users will be on Chrome/Firefox using Windows 7 Pro.
ETags will help you do this. If the file hasn't been updated since the client last downloaded it, the server will send a 304 "Not Modified" response instead of the file.
If your files are dynamically generated, you will need to manually implement etag generation in PHP rather than relying on the web server.
http://www.php.net/manual/en/function.http-cache-etag.php
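A hedged sketch of manual ETag handling for a dynamically generated PDF: hash the generated bytes, compare against the client's If-None-Match header, and answer 304 when they match. `$pdfData` stands in for whatever bytes your script actually produces.

```php
<?php
// Derive an ETag from the generated content; identical bytes
// always yield the same tag, so the client can revalidate cheaply.
function makeEtag(string $data): string
{
    return '"' . md5($data) . '"';
}

// True when the client's cached copy is still current.
function isFresh(?string $ifNoneMatch, string $etag): bool
{
    return $ifNoneMatch !== null && trim($ifNoneMatch) === $etag;
}

// Usage (web context):
// $etag = makeEtag($pdfData);
// header('ETag: ' . $etag);
// if (isFresh($_SERVER['HTTP_IF_NONE_MATCH'] ?? null, $etag)) {
//     http_response_code(304);   // "Not Modified", body omitted
//     exit;
// }
// header('Content-Type: application/pdf');
// echo $pdfData;
```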
I've found a useful solution to my problem.
From the comments on my question, we concluded it would work best to utilize the browser's built in PDF/DOC renderer and download anything else that isn't recognized.
I read this standard: https://www.rfc-editor.org/rfc/rfc6266
This is the solution (header):
Content-Disposition: inline; filename=something.pdf
Instead of attachment, I've used "inline" in order to utilize the browser when necessary.
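The decision can be sketched as a small helper that picks `inline` for types the browser is likely to render itself and `attachment` for everything else. The list of viewable types below is an assumption; adjust it for your users' browsers.

```php
<?php
// Choose a Content-Disposition value: inline for browser-renderable
// types (assumed list), attachment for everything else.
function dispositionFor(string $mime, string $filename): string
{
    $viewable = ['application/pdf', 'image/png', 'image/jpeg'];
    $mode = in_array($mime, $viewable, true) ? 'inline' : 'attachment';
    return $mode . '; filename=' . $filename;
}

// Usage (web context):
// header('Content-Type: application/pdf');
// header('Content-Disposition: ' . dispositionFor('application/pdf', 'something.pdf'));
```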
Most browsers will do this automatically based on the URL. If the URL for a particular PDF blob is constant, the browser will not re-download it unless the server responds that it has changed (by way of HTTP fields).
You should therefore design your site to have "permalinks" for each resource. This could be achieved by having a resource-ID of some sort in the URL string.
As others have said in comments, a server cannot guarantee that a client does ANYTHING in particular; all you can offer are suggestions that you hope most browsers will treat similarly.
I have a PHP-based website and would like the browser to cache the images for 30 days. I am using a shared hosting solution where I do not have access to the Apache config to enable mod_headers or other modules, so I cannot use .htaccess mechanisms for this.
My site is a regular PHP app with both HTML content and images. I would like the browser to cache images only. I've seen PHP's header() function, but couldn't find a way to force caching of only the images. How do I go about it?
Thanks
As far as I know, if you can't get access to Apache to set the headers, your only other option is to serve images from a PHP script so you can use PHP's header() function to set the headers.
In this case, you'd need to write a PHP image handler and replace all your image tags with calls to this handler (e.g. http://mysite.com/imagehandler.php?image=logo.png). You would then have your imagehandler.php script retrieve the image from the file system, set the MIME type and cache-control headers, and stream the image back to the client.
You could write your own, or if you Google, you will find image-handler PHP scripts. Either way, make sure you focus on security - don't allow the client to retrieve arbitrary files from your web server, because that would be a fairly major security hole.
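A sketch of what imagehandler.php might look like. basename() stops directory traversal (e.g. ?image=../../etc/passwd); the image directory and the 30-day max-age are assumptions.

```php
<?php
// Resolve a requested image name to a path inside the image directory.
// basename() strips any path components the client sends, so the
// handler can never be tricked into reading outside $baseDir.
function safeImagePath(string $baseDir, string $requested): string
{
    return $baseDir . '/' . basename($requested);
}

// Usage (web context):
// $path = safeImagePath('/srv/site/images', $_GET['image'] ?? '');
// if (!is_file($path)) { http_response_code(404); exit; }
// header('Content-Type: ' . mime_content_type($path));
// header('Cache-Control: public, max-age=' . (30 * 24 * 3600));  // 30 days
// readfile($path);
```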
I'm sorry to bother you with my issues, but I'm facing a problem that I'm having trouble fixing.
I have a website with a login restricted area.
Once a user is logged in, they can access my company's files (big files).
But to avoid the link being spread all over the internet, when a user wants to download a file (located at an external URL), they click a URL containing the file name hashed with MD5, which redirects to a PHP script that generates the download headers and serves the file using fsockopen.
However, this does not support resuming downloads, which is not very practical when downloading files of 2 or 3 GB, or when using a download manager.
How can I enable resuming?
I have seen some PHP scripts using the fread method; however, I don't think that would be a good idea in my case, because for big files it could make the server lag. When you do a progressive fread on a 2 GB file, good luck for the process when 30 people are downloading the file at the same time.
If you use fopen() and fseek() like this, you're essentially doing the same thing any web server does to answer HTTP Range requests.
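A sketch of answering a byte-range request so download managers can resume. Only the single-range forms (`bytes=start-end`, `bytes=start-`, `bytes=-suffix`) are handled; a production script needs more validation and should also send `Accept-Ranges: bytes`.

```php
<?php
// Parse a Range header against a known file size.
// Returns [startByte, endByte] or null when the header is unusable.
function parseRange(string $header, int $size): ?array
{
    if (!preg_match('/^bytes=(\d*)-(\d*)$/', $header, $m)) {
        return null;
    }
    $start = $m[1] === '' ? null : (int) $m[1];
    $end   = $m[2] === '' ? $size - 1 : (int) $m[2];
    if ($start === null) {            // "bytes=-500" means the final 500 bytes
        $start = $size - (int) $m[2];
        $end   = $size - 1;
    }
    return ($start >= 0 && $start <= $end && $end < $size) ? [$start, $end] : null;
}

// Usage (web context):
// $size  = filesize($file);
// $range = isset($_SERVER['HTTP_RANGE']) ? parseRange($_SERVER['HTTP_RANGE'], $size) : null;
// if ($range !== null) {
//     [$start, $end] = $range;
//     http_response_code(206);                       // Partial Content
//     header("Content-Range: bytes $start-$end/$size");
//     header('Content-Length: ' . ($end - $start + 1));
//     $fp = fopen($file, 'rb');
//     fseek($fp, $start);                            // jump to the resume point
//     echo fread($fp, $end - $start + 1);
//     fclose($fp);
// }
```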
You could make it so that it doesn't allow the file to be downloaded unless they are logged in.
So instead of providing them with a direct link to the file, they get a link like foo.com/download.php?myfile.ext
And the download.php would check the session before providing the user with the file download.
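A minimal sketch of that download.php gate. The session key `user_id` and the private file path are assumptions; substitute whatever your login system actually stores.

```php
<?php
// True only for logged-in users; the 'user_id' key is an assumed
// convention of the login system.
function canDownload(array $session): bool
{
    return !empty($session['user_id']);
}

// Send the headers and bytes for a forced download of a private file.
function sendDownload(string $path, string $name): void
{
    header('Content-Type: application/octet-stream');
    header('Content-Disposition: attachment; filename="' . $name . '"');
    header('Content-Length: ' . (string) filesize($path));
    readfile($path);
}

// Usage (web context):
// session_start();
// if (!canDownload($_SESSION)) { http_response_code(403); exit; }
// sendDownload('/srv/private/files/myfile.ext', 'myfile.ext');
```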
Folks
I have an image on some server (SOURCE)
i.e. http://stagging-school-images.s3.amazonaws.com/2274928daf974332ed4e69fddc7a342e.jpg
Now I want to upload it to somewhere else (DESTINATION)
i.e. example.mysite.com/receiveImage.php
First, I copy the image from the source to my local server and then upload it to the destination.
It works perfectly, but it takes too much time because it copies the image and then uploads it...
I want to make it simpler and more optimized by uploading the image directly from the source URL to the destination URL.
Is there a way to handle this ?
I am using php/cURL to handle my current functionality.
Any help would be very much appreciated.
Cheers !!
If example.mysite.com/receiveImage.php is your own service, then you may:
- pass the SOURCE URL to your PHP script as a GET or POST parameter
- in the PHP script, use the file_get_contents() function to obtain the image by URL, and save it to your storage
Otherwise it's impossible by means of HTTP.
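If it is your own service, the receiving side might be sketched like this. The host allow-list is an assumption added for safety: never fetch arbitrary user-supplied URLs, or your server becomes an open proxy.

```php
<?php
// Accept only sources from a known host (assumed allow-list) so the
// endpoint cannot be abused to fetch arbitrary URLs.
function isAllowedSource(string $url): bool
{
    $host = parse_url($url, PHP_URL_HOST);
    return $host === 'stagging-school-images.s3.amazonaws.com';
}

// Usage (web context, receiveImage.php):
// $url = $_POST['source'] ?? '';
// if (!isAllowedSource($url)) { http_response_code(400); exit; }
// $data = file_get_contents($url);            // needs allow_url_fopen
// if ($data !== false) {
//     $name = basename(parse_url($url, PHP_URL_PATH));
//     file_put_contents('/srv/uploads/' . $name, $data);   // assumed storage dir
// }
```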
However, there are some ways to increase files uploading speed a little:
If the files are huge, you may use two threads: one for downloading (it will store all downloaded data in a buffer) and one for uploading (it will take whatever data is available from the buffer and upload it to the site). As far as I know, this can't be done easily with PHP, because multi-threading is not supported.
If there are too many files, you may use many threads / processes, which will do download/upload simultaneously.
By the way, these means do not eliminate double traffic for your intermediate service.
One of the services may have a form somewhere that will allow you to specify a URL to receive from/send to, but there is no generic HTTP mechanism for doing so.
The only way to transfer the image directly from source to destination is to initiate the transfer from either the source or the destination. You can't magically beam the data between these two locations without them talking to each other directly. If you can SSH login to your mysite.com server you could download the image directly from there. You could also write a script that runs on mysite.com and directly downloads the image from the source.
If that's not possible, the best alternative may be to play around with fread/fwrite instead of curl. This should allow you to read a little bit from the source, then directly upload that bit to the destination so download and upload can work in parallel. For huge files this should make a real difference, for small files on a decent connection it probably won't.
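The fread/fwrite idea can be sketched as a chunked copy: read a small block from the source stream and write it straight to the destination, so memory use stays flat even for multi-gigabyte files. The chunk size is an arbitrary assumption.

```php
<?php
// Copy a stream in fixed-size chunks instead of loading it whole.
// $src can be a URL (with allow_url_fopen enabled) and $dst a local
// path, or $dst can be an already-opened upload stream.
function streamCopy(string $src, string $dst, int $chunk = 8192): int
{
    $in    = fopen($src, 'rb');
    $out   = fopen($dst, 'wb');
    $total = 0;
    while (!feof($in)) {
        $buf = fread($in, $chunk);
        if ($buf === false || $buf === '') {
            break;                      // read error or nothing left
        }
        $total += fwrite($out, $buf);   // pass the chunk straight through
    }
    fclose($in);
    fclose($out);
    return $total;                      // bytes written to the destination
}
```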
Create two text fields: one for the URL, the other for the filename.
In PHP, use:
copy($url, $uploadDir.'/'.$fileName);
where $uploadDir is the path to your file directory ;)
I have created a website. It includes an option to scale an image. Now I want to store that scaled image on the user's desktop, but it is being saved in the folder where the code lives.
Please help me with a PHP script to store that file on the user's desktop.
If your website is going to actually live on the web instead of locally on people's computers, then you can't directly save files to their computer. You can serve the image to them with PHP as a file download by setting the proper MIME type and content headers (using header()) followed by the file itself, or better yet offer the user a download link so they can choose to download it themselves.
If your website is going to be used locally (a little odd, but it could happen), then you can use fopen(), fwrite() and fclose() in PHP to work with local files.
I don't think it is possible to do this without asking for user intervention on where to save the processed file. If you think about it, it would be a significant security flaw if a web server could arbitrarily save files to user desktops!
The best you could do is set the content header of the generated file to have a content disposition of attachment and then the browser will prompt the user where to save the file.
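That attachment disposition can be sketched as a small helper that builds the headers for the generated file; the filename is an assumption. The browser then prompts the user for a save location (often the Downloads folder, depending on browser settings).

```php
<?php
// Build the headers that make the browser prompt the user to save
// the generated file instead of rendering it.
function attachmentHeaders(string $filename, int $length): array
{
    return [
        'Content-Type'        => 'application/octet-stream',
        'Content-Disposition' => 'attachment; filename="' . $filename . '"',
        'Content-Length'      => (string) $length,
    ];
}

// Usage (web context), after generating the scaled image at $path:
// foreach (attachmentHeaders('scaled.png', filesize($path)) as $k => $v) {
//     header("$k: $v");
// }
// readfile($path);
```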