I have a video converter. Here's how it works: you give it the URL of a video, it downloads the video to the server, then converts it to MP3. It works, but the problem is that anything over 10 MB (which is only about 30 seconds) crashes the server. I need to know how to download it in parts so it doesn't crash the server.
file_put_contents($dest,file_get_contents($url));
The best approach is to download the content in chunks. A nice method for doing so can be found in an answer here; a sketch of such a helper follows the snippet below. In the $callback parameter you can pass a function that converts and writes the bytes as they are read.
file_get_contents_chunked($url, 4096, function ($chunk, &$handle, $iteration) use ($dest) {
    // Append each chunk to the destination file instead of holding the whole download in memory.
    file_put_contents($dest, $chunk, FILE_APPEND);
});
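For reference, here is a minimal sketch of what such a file_get_contents_chunked() helper might look like (the implementation in the linked answer may differ in its details):
function file_get_contents_chunked($url, $chunk_size, $callback) {
    // Open the remote resource and feed it to $callback one chunk at a time,
    // so the full file is never held in memory at once.
    $handle = fopen($url, 'rb');
    if ($handle === false) {
        return false;
    }
    $iteration = 0;
    while (!feof($handle)) {
        $chunk = fread($handle, $chunk_size);
        $callback($chunk, $handle, $iteration);
        $iteration++;
    }
    fclose($handle);
    return true;
}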
I have some videos that are hosted on S3 (.mp4 and .mov) some of which are rather large (1.2GB+).
I want to get the first frame from each video using the PHP wrapper for FFmpeg but I don't want to have to download the full file first.
What I really want to do is download a certain percentage of the file, something like 2%, so that I can guarantee that I will get the first frame.
I found a way to download 1mb of the file here: https://code.i-harness.com/en/q/c09357
However, the following chunk of that code is what I don't really understand: how is it only downloading 1 MB?
function myfunction($ch, $data) {
    $length = fwrite($this->fh, $data);
    $size = &$this->size;
    if ($length === FALSE) {
        return 0;
    } else {
        $size += $length;
    }
    // Downloads 1MB.
    return $size < 1024 * 1024 * 1 ? $length : 0;
}
To me that says: set the size to the size of the file, and then, if the size is less than 1 MB, return the length, else return 0.
Now, I know it does work because I have run it, but I don't understand how it works well enough to convert it into getting a percentage of the file.
Downloading 1 or 2 MB of the file is fine for the smaller files and the mp4 files; however, the .mov files fail to get the first frame if less than about 20 MB is downloaded, and some frames throw a division-by-zero error when getting the frame, I guess from the above function returning 0.
Could anyone shed some light on how all of this is working please, or even better if you could suggest an improvement?
myfunction is almost certainly set as the CURLOPT_WRITEFUNCTION callback for curl_exec(). If that callback returns 0 (or any number other than the size of $data), curl terminates the transfer and curl_exec() returns the CURLE_ABORTED_BY_CALLBACK error code. Thus, after you've downloaded >= 1 mebibyte, curl_exec() stops with the CURLE_ABORTED_BY_CALLBACK error.
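To turn the fixed 1 MiB cap into a percentage, you first need the total size. A rough sketch (assuming the server answers a HEAD request with a Content-Length header; the function name is made up for illustration):
// Sketch: cap a curl download at a percentage of the remote file size.
function downloadPercentage(string $url, string $destPath, float $percent) {
    // 1. Ask for the total size with a HEAD request.
    $head = curl_init($url);
    curl_setopt($head, CURLOPT_NOBODY, true);
    curl_setopt($head, CURLOPT_RETURNTRANSFER, true);
    curl_exec($head);
    $total = curl_getinfo($head, CURLINFO_CONTENT_LENGTH_DOWNLOAD);
    curl_close($head);
    if ($total <= 0) {
        return false; // size unknown, cannot compute a percentage
    }
    $limit = (int)($total * $percent / 100);

    // 2. Download, aborting from the write callback once the limit is reached.
    $fh = fopen($destPath, 'wb');
    $written = 0;
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_WRITEFUNCTION, function ($ch, $data) use ($fh, &$written, $limit) {
        $len = fwrite($fh, $data);
        if ($len === false) {
            return 0; // abort on write error
        }
        $written += $len;
        // Returning anything other than the length of $data makes curl abort
        // with CURLE_ABORTED_BY_CALLBACK, which is the expected way to stop early.
        return $written < $limit ? $len : 0;
    });
    curl_exec($ch);
    curl_close($ch);
    fclose($fh);
    return true;
}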
"What I really want to do is download a certain percentage of the file, something like 2%, so that I can guarantee that I will get the first frame." - Depending on the movie's encoding, the first mebibyte may not be enough. With some container formats you need a few bytes from the end of the file before you can render the first frame. MP4/QuickTime files (.mp4 and .mov) are a specific example: they store their index in a MOOV atom, and when the MOOV atom is at the end of the file, you need data from the end of the file to render the first frame. Streaming-optimized files put the MOOV atom at the beginning, and for those your first-mebibyte scheme would work, but if it is at the end your scheme won't work unless the whole movie is < 1 mebibyte.
A much better approach is to just let ffmpeg deal with it. ffmpeg knows how much data it needs and will try to download only the required parts instead of the whole movie, and you'd need a program like ffmpeg to extract the first frame later anyway.
Try:
function getFirstFrameAsJpg(string $url): string {
    if (file_exists("/dev/null")) {
        $ret = shell_exec("ffmpeg -i " . escapeshellarg($url) . " -f image2pipe -frames 1 -r 1 -c:v mjpeg - 2>/dev/null");
    } else {
        // Windows, probably, where /dev/null isn't supported but NUL works the same way.
        $ret = shell_exec("ffmpeg -i " . escapeshellarg($url) . " -f image2pipe -frames 1 -r 1 -c:v mjpeg - 2>NUL");
    }
    return $ret;
}
It will return the first frame of the video at the URL as the binary contents of a .jpg image, meaning you can do file_put_contents('image.jpg', getFirstFrameAsJpg($url));. Interestingly, if ffmpeg is not installed, $ret will be NULL, which means that with strict_types=1 you will get an exception, otherwise you will get an empty string.
PS: before you allow potential attackers to specify the URL for this function, make sure to validate that it really is an HTTP URL, as I did not consider the security implications of letting someone run getFirstFrameAsJpg("/etc/passwd") or similar.
If you need to download with a bunch of custom headers, consider setting up a proxy scheme where ffmpeg is told to download from a unique proxy URL instead, and still let ffmpeg decide which parts of the movie to download. Make sure such a proxy implements the HTTP Range header, as ffmpeg will need it when extracting the first frame from a movie whose last part is required to decode that frame.
(Thanks to c_14 @ freenode #ffmpeg for the image2pipe command.)
Maybe I'm asking the impossible, but I want to clone a stream multiple times - a sort of multicast emulation. The idea is to write a 1300-byte buffer every 0.002 seconds into a .sock file (instead of IP:port, to avoid overhead) and then to read the same .sock file from several other scripts.
Doing it through a regular file is not doable. It only works within the same script that generates the buffer file and then echoes it; the other scripts misread it badly.
This works perfectly with the script that generates the chunks:
$handle = @fopen($url, 'rb');
$buffer = 1300;
while (1) {
    $chunck = fread($handle, $buffer);
    $handle2 = fopen('/var/tmp/stream_chunck.tmp', 'w');
    fwrite($handle2, $chunck);
    fclose($handle2);
    readfile('/var/tmp/stream_chunck.tmp');
}
BUT the output of another script that reads the chunks:
while (1) {
    readfile('/var/tmp/stream_chunck.tmp');
}
is messy. I don't know how to synchronize the reading of the chunks, and I thought that sockets might work a miracle.
It only works within the same script that generates the buffer file and then echoes it; the other scripts misread it badly
Using a single file without any sort of flow control shouldn't be a problem - tail -F does just that. The disadvantage is that the data will just accumulate indefinitely on the filesystem as long as a single client has an open file handle (even if you truncate the file).
But if you're writing chunks, then write each chunk to a different file (using an atomic write mechanism) and everyone can read it by polling for available files, as in the reader loop below (a writer sketch follows it)....
do {
    while (!file_exists("$dir/$prefix.$current_chunk")) {
        clearstatcache();
        usleep(1000);
    }
    process(file_get_contents("$dir/$prefix.$current_chunk"));
    $current_chunk++;
} while (!$finished);
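On the writing side, a minimal sketch of the atomic write mechanism mentioned above ($dir, $prefix and $current_chunk mirror the reader loop; the function name is made up):
// Write the chunk to a temporary file, then rename it into place.
// rename() is atomic on the same filesystem, so a polling reader either
// sees a complete chunk file or no file at all - never a half-written one.
function write_chunk($dir, $prefix, $current_chunk, $data) {
    $tmp   = "$dir/$prefix.$current_chunk.tmp";
    $final = "$dir/$prefix.$current_chunk";
    file_put_contents($tmp, $data);
    rename($tmp, $final);
}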
Equally, you could do this with a database, which should have slightly lower overhead for the polling and simplifies the garbage collection of old chunks.
But this is all about how to make your solution workable - it doesn't really address the problem you are trying to solve. If we knew what you were trying to achieve then we might be able to advise on a more appropriate solution - e.g. if it's a chat application, video broadcast, something else....
I suspect a more appropriate solution would be a multi-processing, single-memory-model server - and when we're talking about PHP (which doesn't really do threading very well) that means an event-based/asynchronous server. There's a bit more involved than simply calling socket_select(), but there are some good scripts available which do most of the complicated stuff for you.
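For a rough idea of the shape such a server takes, here is a minimal sketch of a socket_select() loop that accepts readers on a Unix socket and broadcasts each chunk to all of them (the socket path and get_next_chunk() are made up, error handling and partial writes are glossed over; in practice a ready-made event-loop library is the better choice):
// Minimal sketch of an event-based broadcast loop using socket_select() on a Unix domain socket.
$path = '/var/tmp/stream.sock';   // hypothetical socket path
@unlink($path);
$server = socket_create(AF_UNIX, SOCK_STREAM, 0);
socket_bind($server, $path);
socket_listen($server);

$clients = [];
while (true) {
    $read = array_merge([$server], $clients);
    $write = $except = null;
    // Wait up to 2 ms for activity so new chunks can still be pushed regularly.
    socket_select($read, $write, $except, 0, 2000);

    if (in_array($server, $read, true)) {
        $clients[] = socket_accept($server);    // a new reader attached
    }

    $chunk = get_next_chunk();                  // hypothetical: returns the next 1300-byte buffer
    if ($chunk !== '') {
        foreach ($clients as $i => $client) {
            if (@socket_write($client, $chunk) === false) {
                socket_close($client);          // drop readers that disconnected
                unset($clients[$i]);
            }
        }
    }
}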
I've written a C++ program for an embedded system. My program processes images taken by a camera (350 fps or even more) and applies some machine-vision algorithms to them. I write the result of my program, which is a JPG image, to a specific location on my hard drive (e.g. localhost/output/result.jpg).
Now I want to write PHP code to display this image every 30 ms or faster (it depends on the output of my C++ program... sometimes it is 30 fps and sometimes 10 or 40, but never more than 40!). The video frames are not related to each other, so I cannot use a video stream, since streaming assumes the frames are sequential...
It is possible that the PHP code reads a corrupted image (since one program wants to write while the other one wants to read).
I thought of using a mutex concept here: create a flag (a text file) that is accessible to both programs. Whenever my C++ program wants to write to that location, it sets the flag (writes something into the text file) and clears the flag when it has finished writing, so when the PHP code sees the flag it waits until the image has been written to the hard drive and then displays it.
I'm familiar with C++ but completely new to PHP. I can display images in PHP, but I don't know how to use timers in a way that solves the above problem.
What should I do? I have code that works (below), but I don't want to use it, since it only updates this part of the webpage (because of the while(1)). Is there an alternative solution? If I'm not able to display more than 20 frames per second, what frame rate is possible in this scripting language, and what factors play a role in that?
Thanks
My Code:
<?php
while (1) {
    $handle = @fopen("/var/www/CTX_ITX/Flag_File.txt", "r");
    if ($handle) {
        if (($buffer = fgets($handle, 10)) !== false) {
            if ($buffer == "Yes") {
                echo "<img src='result.jpg' alt='test' />";
            }
        }
        if (!feof($handle)) {
            echo "Error: unexpected fgets() fail\n";
        }
        fclose($handle);
    }
    usleep(10000); // 10 ms pause before checking again; sleep() only accepts whole seconds
}
?>
PHP is not suitable for your task. It responds to a request; your JS would have to load the image every 30 ms.
I would suggest you take a look at Node.js and HTML5 WebSockets. That technology should, in theory, let you push pictures fast enough without reconnects etc. (but only if your connection is fast enough to handle the traffic).
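If you do stay with plain PHP and polling, the request/response model boils down to a small endpoint that returns the current frame on each request while respecting the flag file. A minimal sketch (the endpoint name is hypothetical, the paths are taken from the question, and it assumes the flag file contains "Yes" when the image is ready, as in the loop above):
<?php
// frame.php - hypothetical endpoint polled by the browser (e.g. every 30 ms).
$flag  = "/var/www/CTX_ITX/Flag_File.txt";
$image = "/var/www/CTX_ITX/result.jpg";

// Wait briefly until the flag says the image is ready, so we never serve a half-written file.
for ($i = 0; $i < 100 && trim((string)@file_get_contents($flag)) !== "Yes"; $i++) {
    usleep(1000); // 1 ms
}

header("Content-Type: image/jpeg");
header("Cache-Control: no-store"); // force the browser to fetch a fresh frame every time
readfile($image);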
I have an interesting problem. I need to show a progress bar for an asynchronous file download in PHP. I thought the best way to do it is this: before the download starts, the script creates a txt file which contains the file name and the original file size.
Now we have an AJAX function calling a PHP script that is intended to check the local file size. I have two main problems:
The files are bigger than 2 GB, so the filesize() function is out of business.
I tried to find a different way to determine the local file size, like this:
function getSize($filename) {
    $a = fopen($filename, 'r');
    fseek($a, 0, SEEK_END);
    $filesize = ftell($a);
    fclose($a);
    return $filesize;
}
Unfortunately, the second way gives me tons of errors; it seems I cannot open a file which is currently being downloaded.
Is there any way to check the size of a file which is currently downloading, when that size will be bigger than 2 GB?
Any help is greatly appreciated.
I found the solution by using the exec() function:
exec("ls -s -k /path/to/your/file/".$file_name,$out);
Just switch your OS and PHP to 64-bit and you can still use filesize().
From the filesize() manual:
Return Values
Returns the size of the file in bytes, or FALSE (and generates an error of level E_WARNING) in case of an error.
Note: Because PHP's integer type is signed and many platforms use 32bit integers, some filesystem functions may return unexpected results for files which are larger than 2GB.
Is file_get_contents() enough for downloading remote movie files located on a server?
I just think that perhaps storing large movie files in a string is harmful, according to the PHP docs.
Or do I need to use cURL? I don't know cURL.
UPDATE: these are big movie files, around 200 MB each.
file_get_contents() is a problem because it's going to load the entire file into memory in one go. If you have enough memory to support the operation (taking into account that if this is a web server, you may have multiple hits that generate this behaviour simultaneously, and each therefore needs that much memory), then file_get_contents() should be fine. However, it's not the right way to do it - you should use a library specifically intended for this sort of operation. As mentioned by others, cURL or wget will do the trick. You might also have good luck using fopen('http://someurl', 'r'), reading blocks from the file, and then dumping them straight to a local file that's been opened for writing.
As @mopoke suggested, it could depend on the size of the file. For a small movie it may suffice. In general, though, I think cURL would be a better fit. You have much more flexibility with it than with file_get_contents().
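For example, here is a minimal sketch of streaming the download straight to disk with cURL so the whole movie never sits in memory (the URL and destination are placeholders):
// Sketch: stream a large remote file directly to disk with cURL.
// CURLOPT_FILE makes curl write the response body to the given handle.
$src = "http://somewhere/test.avi";   // placeholder URL
$dst = "test.avi";                    // placeholder destination

$fp = fopen($dst, 'wb');
$ch = curl_init($src);
curl_setopt($ch, CURLOPT_FILE, $fp);
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
curl_setopt($ch, CURLOPT_FAILONERROR, true);
$ok = curl_exec($ch);
curl_close($ch);
fclose($fp);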
For the best performance you may find it makes sense to just use a standard Unix utility like wget. You should be able to call it with system("wget ...") or exec().
http://www.php.net/manual/en/function.system.php
You can read a few bytes at a time using fread().
$src="http://somewhere/test.avi";
$dst="test.avi";
$f = fopen($src, 'rb');
$o = fopen($dst, 'wb');
while (!feof($f)) {
if (fwrite($o, fread($f, 2048)) === FALSE) {
return 1;
}
}
fclose($f);
fclose($o);