I have a website where users can upload MP3 files (Uploadify), stream them using an HTML5 player (jPlayer), and download them using a PHP script (www.zubrag.com/scripts/).
When a user uploads a song, the path to the audio file is saved in the database, and I'm using that data to play the song and to show a download link for it.
The problem I'm experiencing is that, according to my host, this method uses a lot of memory on the server, which is a dedicated machine.
Link to script: http://pastebin.com/Vus8SRa7
How should I handle the script properly? And what would be the best way to track down the problem? Any ideas on cleaning up the code?
Any help much appreciated.
I would recommend storing your files on disk (named something random [check for collisions!] or sequential, without a file extension, and outside of the doc root), and storing only the file information in your DB. It's much easier to stream a file from disk this way than out of a database result.
When you retrieve an entire file's contents out of a database result, that data has to be in memory. readfile() doesn't have this issue. Use headers to return the original file name when sending the file back to the client, if you wish.
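A minimal sketch of that approach (the table, column names, storage path, and MIME type are assumptions, not taken from the linked script):

<?php
// download.php -- look up the real path in the DB, then stream from disk.
$pdo = new PDO('mysql:host=localhost;dbname=music', 'user', 'pass'); // placeholder credentials

$stmt = $pdo->prepare('SELECT stored_name, original_name FROM songs WHERE id = ?');
$stmt->execute([(int) $_GET['id']]);
$song = $stmt->fetch(PDO::FETCH_ASSOC);

// The file lives outside the doc root under a random name; only the DB knows the original name.
$path = '/var/uploads/audio/' . $song['stored_name'];

header('Content-Type: audio/mpeg');
header('Content-Length: ' . filesize($path));
header('Content-Disposition: attachment; filename="' . $song['original_name'] . '"');

readfile($path); // streams straight from disk, no need to hold the whole file in memory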
I would suggest not buffering the content when you write the MP3's binary data to your HTTP output. That way you'll save a lot of physical and virtual memory.
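For example (a sketch; the file path is a placeholder), any active output buffers can be dropped and the file pushed out in small chunks:

<?php
// Turn off any active output buffers so each chunk goes straight to the client.
while (ob_get_level() > 0) {
    ob_end_clean();
}

$path = '/var/uploads/audio/song.mp3'; // placeholder path
header('Content-Type: audio/mpeg');
header('Content-Length: ' . filesize($path));

$fh = fopen($path, 'rb');
while (!feof($fh)) {
    echo fread($fh, 8192); // 8 KB at a time
    flush();               // hand the chunk to the web server right away
}
fclose($fh);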
I'm trying to convert a website to use S3 storage instead of local (expensive) disk storage. I solved the download problem using a stream wrapper interface on the S3Client. The upload problem is harder.
It seems to me that when I POST to a PHP endpoint, $_FILES is already populated and the upload already written to /tmp/ before I can even intercept it!
On top of that, the S3Client->upload() expects a file on the disk already!
Seems like a double-whammy against what I'm trying to do, and most advice I've found uses NodeJS or Java streaming so I don't know how to translate.
It would be better if I could intercept the code that populates $_FILES and then send up 5MB chunks from memory with the S3\ObjectUploader, but how do you crack open the PHP multipart handler?
Thoughts?
EDIT: It is a very low quantity of files, 0-20 per day, mostly 1-5 MB, sometimes hitting 40-70 MB. Periodically (once every few weeks) a 1-2 GB file will be uploaded. Hence the desire to move off an EC2 instance and onto a Heroku/Beanstalk-type PaaS where I won't have much /tmp/ space.
It's hard to comment on your specific situation without knowing the performance requirements of the application and the volume of users accessing it, so I'll answer assuming a basic web app uploading profile avatars.
There are good reasons for this behaviour: the file is streamed to disk for several purposes, one of which is to conserve memory. If your file is not on disk then it is in memory (think disk usage is expensive? Bump up your memory usage and see how expensive that gets), which is fine for a single user uploading a small file, but not so great for a bunch of users uploading small files, or worse, large files. You'll likely see the best performance if you use the defaults of these libraries and let them stream to and from the disk.
But again I don't know your use case and you may actually need to avoid the disk at all costs for some unknown reason.
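As a rough illustration of the default, disk-backed path (a sketch assuming the AWS SDK for PHP v3; the bucket, region, and form field name are placeholders), the uploaded temp file can be handed to the SDK as a stream so only small pieces sit in memory at any one time:

<?php
require 'vendor/autoload.php';

use Aws\S3\S3Client;
use Aws\S3\ObjectUploader;

$s3 = new S3Client(['region' => 'us-east-1', 'version' => 'latest']); // placeholder region

// PHP has already streamed the upload to a temp file on disk ($_FILES['upload'] is a
// placeholder field name), so open that file as a stream instead of reading it whole.
$source = fopen($_FILES['upload']['tmp_name'], 'rb');

$uploader = new ObjectUploader($s3, 'my-bucket', $_FILES['upload']['name'], $source);
$result   = $uploader->upload(); // switches to multipart upload for large bodies, fed from disk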
So I am simply seeking to upload files to the server for later upload to Azure Blob Storage.
I am aware you can POST data using an HTML form; however, that isn't a solution here.
I would like to be able to upload files directly via POST, such as through cURL or other means. As far as I'm aware, PHP also requires the Content-Type header to be multipart/form-data for this.
I have tried to use file handles, but it appears I am only able to interact with locally stored files. I have also played around with move_uploaded_file(), but I am unable to POST files directly.
Streaming is important here: files will likely be 5 MB+, and buffering them wouldn't work for multiple concurrent uploads, as PHP would quickly saturate its allocated memory. While I can change PHP settings on the staging server, I have no idea what the memory allocation will be like on the Azure VM production server, and it wouldn't make sense to keep changing PHP settings as load increases.
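Roughly, what I imagine is something like the following untested sketch, reading the raw request body in chunks, though that only applies if the client POSTs the bare file (e.g. with curl --data-binary) rather than multipart/form-data:

<?php
// Untested sketch: read the raw request body in chunks so memory use stays
// around the chunk size. The target path is a placeholder.
$in  = fopen('php://input', 'rb');
$out = fopen('/path/to/storage/' . bin2hex(random_bytes(16)), 'wb');

while (!feof($in)) {
    fwrite($out, fread($in, 1048576)); // 1 MB at a time
}

fclose($in);
fclose($out);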
However, if PHP does not offer such a solution out of the box, I'm open to alternatives.
To summarise, how do I stream data (e.g. videos) to a PHP file?
Thank you in advance.
We have files that are hosted on RapidShare which we would like to serve through our own website. Basically, when a user requests http://site.com/download.php?file=whatever.txt, the script should stream the file from RapidShare to the user.
The only thing I'm having trouble getting my head around is how to properly stream it. I'd like to use cURL, but I'm not sure if I can read the download from RapidShare in chunks and then echo them to the user. The best way I've thought of so far is a combination of fopen, fread, echoing each chunk of the file to the user, flushing, and repeating that process until the entire file is transferred.
I'm aware of the PHP readfile() function as well, but would that be the best option? Bear in mind that these files can be several GB in size, and although we have servers with 16 GB of RAM, I want to keep the memory usage as low as possible.
Thank you for any advice.
HTTP has a header called "Range" which basically allows you to fetch any chunk of a file (provided you already know the file size), but since PHP isn't multi-threaded, I don't see any benefit in using it here.
AFAIK, if you don't want to consume all your RAM, the only way to go is a two-step approach.
First, stream the remote file using fopen()/fread() (or any PHP functions that let you work with streams), reading it in small chunks (2048 bytes may be enough) and appending each chunk to a tempfile(); then echo it back to your user by reading the temporary file.
That way, even a 2 TB file would, basically, consume only about 2048 bytes of memory at a time, since only the current chunk and the file handle are held in memory.
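A rough sketch of that two-step flow (the URL, filename, and chunk size are just examples):

<?php
// Step 1: copy the remote file to a temporary local file in small chunks.
$remote = fopen('http://rapidshare.example/whatever.txt', 'rb'); // placeholder URL
$tmp    = tmpfile();

while (!feof($remote)) {
    fwrite($tmp, fread($remote, 2048));
}
fclose($remote);

// Step 2: send the temporary copy back to the user, again chunk by chunk.
header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="whatever.txt"');

rewind($tmp);
while (!feof($tmp)) {
    echo fread($tmp, 2048);
    flush(); // push each chunk to the client immediately
}
fclose($tmp);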
You may also write some kind of proxy manager to cache and keep already downloaded files to avoid the remote reading process if a file is heavily downloaded (and keep it locally for a given time).
Folks
I have an image at some server (SOURCE)
i.e. http://stagging-school-images.s3.amazonaws.com/2274928daf974332ed4e69fddc7a342e.jpg
Now I want to upload it to somewhere else (DESTINATION)
i.e. example.mysite.com/receiveImage.php
Currently, I copy the image from the source to my local server and then upload it to the destination.
This works perfectly, but it takes too much time since it copies the image first and then uploads it...
I want to make this simpler and more optimized by uploading the image directly from the source URL to the destination URL.
Is there a way to handle this ?
I am using php/cURL to handle my current functionality.
Any help would be very much appreciated.
Cheers !!
If example.mysite.com/receiveImage.php is your own service, then you may
pass the SOURCE URL to your PHP script as a GET or POST parameter
and, in the PHP script, use the file_get_contents() function to obtain the image by URL and save it to your storage.
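A minimal sketch of such a receiveImage.php (the "url" parameter name and the uploads directory are assumptions):

<?php
// receiveImage.php -- fetch the image from the given URL and store it locally.
$sourceUrl = $_POST['url'] ?? $_GET['url'];          // parameter name is hypothetical

$imageData = file_get_contents($sourceUrl);          // download the image
$fileName  = basename(parse_url($sourceUrl, PHP_URL_PATH));

file_put_contents(__DIR__ . '/uploads/' . $fileName, $imageData); // placeholder directory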
Otherwise it's impossible by means of HTTP.
However, there are some ways to increase the upload speed a little:
If files are huge, you may use two threads: one for downloading (it stores the downloaded data in a buffer) and one for uploading (it takes whatever data is available from the buffer and uploads it to the site). As far as I know, this can't be done easily in PHP, because multi-threading is not natively supported.
If there are many files, you may use multiple threads/processes that download and upload simultaneously.
By the way, these approaches do not eliminate the double traffic through your intermediate server.
One of the services may have a form somewhere that will allow you to specify a URL to receive from/send to, but there is no generic HTTP mechanism for doing so.
The only way to transfer the image directly from the source to the destination is to initiate the transfer from either the source or the destination. You can't magically beam the data between these two locations without them talking to each other directly. If you can log in to your mysite.com server over SSH, you could download the image directly from there. You could also write a script that runs on mysite.com and downloads the image directly from the source.
If that's not possible, the best alternative may be to play around with fread/fwrite instead of curl. This should allow you to read a little bit from the source, then directly upload that bit to the destination, so the download and upload can work in parallel. For huge files this should make a real difference; for small files on a decent connection it probably won't.
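A very rough sketch of that streaming idea (untested; it uses cURL's read callback to pull chunks from the source while the upload is in progress, and it assumes receiveImage.php can accept a raw request body rather than a multipart form):

<?php
// Open the remote source as a read stream (URLs are the ones from the question).
$source = fopen('http://stagging-school-images.s3.amazonaws.com/2274928daf974332ed4e69fddc7a342e.jpg', 'rb');

$ch = curl_init('http://example.mysite.com/receiveImage.php');
curl_setopt_array($ch, [
    CURLOPT_UPLOAD         => true,     // stream the request body to the destination
    CURLOPT_INFILE         => $source,
    CURLOPT_RETURNTRANSFER => true,
    CURLOPT_READFUNCTION   => function ($ch, $fh, $length) {
        // cURL calls this whenever it wants more data; hand it the next small
        // piece from the source stream so download and upload overlap.
        return (string) fread($fh, $length);
    },
]);

$response = curl_exec($ch);
curl_close($ch);
fclose($source);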
Create two text fields, one for the URL and one for the filename.
Then, in PHP, use:
copy($url, $uploadDir.'/'.$fileName);
where $uploadDir is the path to your upload directory ;)
I have created a website with an option to scale an image. Now I want to store that scaled image on the user's desktop, but it is being saved in the folder where the code lives.
Please help me with a PHP script to store that file on the desktop.
If your website is going to actually live on the web instead of locally on people's computers, then you can't directly save the file to their machine. You can serve it to them with PHP as a file download by setting the proper MIME type and content headers (using header()) followed by the file itself, or better yet, offer the user a download link so they can choose to download it themselves.
If your website is going to be used locally (a little odd, but it could happen), then you can use fopen(), fwrite() and fclose() in PHP to work with local files.
I don't think it is possible to do this without asking for user intervention on where to save the processed file. If you think about it, it would be a significant security flaw if a web server could arbitrarily save files to user desktops!
The best you can do is send the generated file with a Content-Disposition: attachment header; the browser will then prompt the user where to save the file.
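For instance (a sketch assuming GD is used for the scaling; the file names and target width are placeholders):

<?php
// Scale the uploaded image (GD assumed; names and width are placeholders).
$src = imagecreatefromjpeg(__DIR__ . '/uploads/original.jpg');
$dst = imagescale($src, 400); // scale to 400px wide, height kept in proportion

// Send it with an attachment disposition so the browser asks where to save it.
header('Content-Type: image/jpeg');
header('Content-Disposition: attachment; filename="scaled.jpg"');

imagejpeg($dst);
imagedestroy($src);
imagedestroy($dst);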