ffmpeg real-time encoding while uploading a file in chunks - php

Now the process is:
File upload
Encode the file with ffmpeg once it has been uploaded
Can both be done at the same time? Obviously yes, but I don't know how.
The file is uploaded in 8 MB chunks stored on the server, so temporarily I have videofile.ext.tmp, which keeps growing until it becomes the final uploaded file.
I read about the ffmpeg -stream_loop or -loop parameters, but I don't know whether the upload process is suitable for real-time encoding or how to manage it.
Any help will be appreciated.
EDIT: I tried this approach and it works well when the internet connection is faster than ffmpeg's encoding speed and when the uploaded file is big enough.
I personally ruled out this procedure because the internet speed may change or the connection may cut out.

It depends on the file format. Some formats like TS, MKV, and FLV will work this way. MP4 may or may not work depending on how the file was created: the moov atom must come before the media data for ffmpeg to read the file while it is still growing.
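A minimal sketch of that idea in PHP, assuming a stream-friendly input container and that the uploader renames videofile.ext.tmp to videofile.ext when the upload finishes (paths, codecs, and the rename convention are all illustrative):

    <?php
    // Feed the growing upload into ffmpeg's stdin and encode as data arrives.
    $tmp   = 'uploads/videofile.ext.tmp';
    $final = 'uploads/videofile.ext';

    $proc = proc_open(
        'ffmpeg -i pipe:0 -c:v libx264 -c:a aac -f flv uploads/videofile.flv',
        [0 => ['pipe', 'r']],   // we write, ffmpeg reads
        $pipes
    );

    $in = fopen($tmp, 'rb');   // rename() keeps the inode, so this handle stays valid
    $offset = 0;
    while (true) {
        clearstatcache();
        $done = !file_exists($tmp);              // .tmp gone => upload finished
        $size = filesize($done ? $final : $tmp);
        if ($size > $offset) {
            fseek($in, $offset);
            $chunk = fread($in, $size - $offset);
            fwrite($pipes[0], $chunk);
            $offset += strlen($chunk);
        } elseif ($done) {
            break;
        } else {
            usleep(200000);                      // wait for the next chunk
        }
    }
    fclose($pipes[0]);   // EOF tells ffmpeg the input is complete
    proc_close($proc);

If the connection stalls, ffmpeg simply waits for more input, which matches the asker's observation that this only works well when the upload outpaces the encode.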

Related

Large file upload through Browser (100 GB)

Is there any way to upload large files (more than 80 GB) through a web browser? Previously I have been uploading files (img, png, jpg) using plupload, but it does not seem to work for larger files. I would also like to know how to implement a web page where users can upload the way Mega.co.nz or Drive.google.com do.
If it is impossible using web development tools, can anyone guide me on how to split a file and upload it in segments?
Thanks.
You can use the JavaScript Blob object to slice large files into smaller chunks and transfer these to the server to be merged together. This has the added benefit of being able to pause/resume uploads and indicate progress.
If you don't fancy doing it yourself there are existing solutions that use this approach. One example is HTML5 Uploader by Filkor.
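On the server side, merging the chunks is just an append in arrival order; a minimal PHP sketch, assuming the client POSTs each slice as a multipart field named chunk together with its index, the target file name, and a flag on the final slice (all names are illustrative):

    <?php
    // Receive one Blob slice and append it to the growing temporary file.
    $name  = basename($_POST['name']);   // basename() guards against path traversal
    $index = (int) $_POST['index'];
    $tmp   = "uploads/{$name}.part";

    $out = fopen($tmp, $index === 0 ? 'wb' : 'ab');   // truncate on the first chunk
    fwrite($out, file_get_contents($_FILES['chunk']['tmp_name']));
    fclose($out);

    if (isset($_POST['last'])) {
        rename($tmp, "uploads/{$name}");   // upload complete
    }

Resuming is then a matter of asking the server for the current size of the .part file and slicing the Blob from that offset.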
If I were you, I would use something like FTP to accomplish this. If you can use ASP.NET, good libraries for file transfer already exist.
Here is a post that shows an example of uploading a file: Upload file to ftp using c#
The catch is that you will need an FTP server; I suggest FileZilla Server. https://filezilla-project.org/
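The same transfer can also be scripted from PHP with the ftp extension; a rough sketch with placeholder host, credentials, and paths:

    <?php
    // Upload a local file to an FTP server in binary mode.
    $conn = ftp_connect('ftp.example.com');
    ftp_login($conn, 'user', 'secret');
    ftp_pasv($conn, true);   // passive mode plays nicer with NAT/firewalls

    // FTP_BINARY prevents newline translation from corrupting the file.
    if (ftp_put($conn, '/remote/video.mp4', '/local/video.mp4', FTP_BINARY)) {
        echo "Upload complete\n";
    }
    ftp_close($conn);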

Accessing file data before upload complete in php

I am currently uploading a file to my server via HTTP. All is well with the upload, but I wanted to know if there is a way to access the data of an upload in the $_FILES global before the upload has completed, so I can open another stream to a server and push the data over as it comes in.
Previously, in Java, I have had read and write streams open at once rather than reading and then writing (when parsing flat files), and this dramatically increased the speed, so I guess I'm searching for a PHP equivalent for file uploads. Any suggestions?
Maybe this will give you a path to explore.
https://developer.mozilla.org/en-US/docs/Web/API/FileReader
Check the FileReader.readAsDataURL() section especially. The FileReader object is mostly used when an image upload needs a preview before it is sent. Search for it and you may get an idea.
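On the server side, the closest PHP equivalent to the simultaneous read/write streams is reading the request body directly; a sketch, assuming the client sends the file as the raw request body rather than multipart/form-data, and with backend.example.com:9000 as a placeholder for the second server:

    <?php
    // Forward the upload to another server while it is still arriving.
    $in  = fopen('php://input', 'rb');
    $out = stream_socket_client('tcp://backend.example.com:9000', $errno, $errstr);

    while (!feof($in)) {
        // Push each chunk as soon as it is read instead of waiting
        // for the whole upload to land in $_FILES.
        fwrite($out, fread($in, 8192));
    }
    fclose($in);
    fclose($out);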

Automatic MP3 Compression in WordPress Media Upload

Currently the MP3 files exceed the limit set in WordPress. Although I am going to raise that limit with some information I found on the topic, the MP3s are still rather large. If the site were for me, I'd simply compress them. So I need to find a method to compress them on the server. I assume it needs to save that large file, transcode it to a smaller file, and then delete the old one. Any ideas?
I assume it needs to save that large file, transcode it to a smaller file, and then delete the old one
This is correct. You always have to store the content first. Encoding should not be done in PHP itself, though; there are more effective tools for that. For example, you can run the LAME MP3 encoder on the command line via a system() call in PHP. Be aware that encoding might fail and that it takes quite a long time, so you should delete the big files with a cleaner script run via cron instead of trying to delete them in the upload script.
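A minimal sketch of that system() call, with illustrative paths and bitrate (lame must be installed on the server):

    <?php
    // Re-encode an uploaded MP3 at a lower bitrate with the lame CLI.
    $src = '/var/www/uploads/original.mp3';
    $dst = '/var/www/uploads/compressed.mp3';

    // -b 96 targets 96 kbps CBR; escapeshellarg() guards the file names.
    system('lame -b 96 ' . escapeshellarg($src) . ' ' . escapeshellarg($dst), $status);

    if ($status === 0) {
        unlink($src);          // drop the large original only on success
        rename($dst, $src);
    }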

Mp3 streaming/downloading website - apache server memory issue

I have a website on which users can upload MP3 files (Uploadify), stream them using an HTML5 player (jPlayer), and download them using a PHP script (www.zubrag.com/scripts/).
When a user uploads a song, the path to the audio file is saved in the database, and I'm using that data to play the song and show a download link for it.
The problem I'm experiencing is that, according to my host, this method is using a lot of memory on the server, which is a dedicated machine.
Link to script: http://pastebin.com/Vus8SRa7
How should I handle the script properly? And what would be the best way to track down the problem? Any ideas on cleaning up the code?
Any help much appreciated.
I would recommend storing your files on disk (named something random [check for collisions!] or sequential, without a file extension, and outside of the doc root), and keeping only the metadata in your DB. It's much easier to stream a file from disk this way than out of a database result.
When you retrieve an entire file's contents from a database result, all of that data has to sit in memory. readfile() doesn't have this issue. Use headers to return the original file name when sending the file back to the client, if you wish.
I would suggest not buffering the content when you write the MP3's binary data to the HTTP output. That way you save a lot of physical and virtual memory.
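Putting both answers together, a minimal download sketch; the on-disk path and original name stand in for the values you would fetch from the database:

    <?php
    // Stream a stored MP3 from disk without loading it into memory.
    $path = '/var/media/a1b2c3d4';   // random on-disk name, no extension
    $name = 'song.mp3';              // original name from the database

    header('Content-Type: audio/mpeg');
    header('Content-Length: ' . filesize($path));
    header('Content-Disposition: attachment; filename="' . $name . '"');

    // Make sure no output buffer holds the file contents in memory.
    while (ob_get_level()) {
        ob_end_flush();
    }
    readfile($path);   // streams the file to the client in small chunks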

Automatic ffmpeg conversion of only completed, fully uploaded files

Maybe it would be best to start by describing the scenario.
We have a Debian server with ffmpeg that we use to convert various video files into FLV.
The files are supplied by a number of different people via FTP and are kept in the "uploads" folder.
I need to write a PHP script that would go through all the files in the uploads folder, select the ones which are complete (i.e. not currently being uploaded or without any uploading errors) and then convert them to FLV using ffmpeg.
I can do the conversion and everything else, but how do I determine whether a file is complete and fully uploaded?
Many thanks!
AFAIK you can't directly tell whether a file is still being uploaded. You could run a cron job every minute that records the file sizes in a database or file. When the cron job runs the next time and a file's size is unchanged, convert it; if not, wait another minute and then try again.
I don't believe a file carries any metadata recording the size it should be once the upload is done.
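A sketch of that polling approach, meant to be run from cron once a minute; the directory, state-file path, and .done convention are illustrative:

    <?php
    // Convert a file once its size has stopped changing between two polls.
    $dir  = '/srv/uploads';
    $prev = json_decode(@file_get_contents('/tmp/upload-sizes.json'), true) ?: [];
    $curr = [];

    foreach (glob("$dir/*") as $file) {
        if (!is_file($file) || str_ends_with($file, '.done')) {
            continue;   // skip directories and already-converted sources
        }
        clearstatcache(true, $file);
        $curr[$file] = filesize($file);
        if (isset($prev[$file]) && $prev[$file] === $curr[$file]) {
            // Size unchanged for a whole minute: assume the upload finished.
            $flv = preg_replace('/\.[^.]+$/', '.flv', $file);
            exec('ffmpeg -i ' . escapeshellarg($file) . ' ' . escapeshellarg($flv));
            rename($file, $file . '.done');   // keep it out of the next scan
            unset($curr[$file]);
        }
    }
    file_put_contents('/tmp/upload-sizes.json', json_encode($curr));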
There is another way to go about this, which we have done for years.
Most FTP servers (proftpd does) can write a log that tells you when a file upload has completed successfully. You can direct this logging to a Unix named pipe / FIFO and have a daemonized script read it to determine which files to process. This works great and only processes files after they have been uploaded completely and successfully.
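A rough sketch of such a daemon in PHP, assuming proftpd's standard xferlog format (whitespace-separated, file name in the ninth field, completion status last) and an illustrative FIFO path; file names containing spaces would need a sturdier parser:

    <?php
    // Read completed-transfer records from a named pipe and convert each file.
    $fifo = fopen('/var/log/proftpd/xferlog.fifo', 'r');   // blocks until data arrives

    while (($line = fgets($fifo)) !== false) {
        $fields = preg_split('/\s+/', trim($line));
        $status = end($fields);        // "c" means the transfer completed
        $path   = $fields[8] ?? '';    // filename field of the xferlog format
        if ($status === 'c' && $path !== '') {
            $flv = preg_replace('/\.[^.]+$/', '.flv', $path);
            exec('ffmpeg -i ' . escapeshellarg($path) . ' -f flv ' . escapeshellarg($flv));
        }
    }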
