Pushing file to web-browser while being written - php

I have a backgrounded ffmpeg process that is writing an audio file, and I want to push this file to the user's web browser while ffmpeg continues to write to it. I tried the code below, but it sends a 0-byte file.
// open the file in a binary mode
$fp = fopen($fname, 'rb');
// send the right headers
header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename='.basename($fname));
ob_end_clean();
fpassthru($fp);
exit;
Note: in my setup, ffmpeg cannot be launched from PHP/Python with its output captured directly.

The fpassthru function won't do the job here. In addition, there is going to be an issue of knowing when the file is complete.
File reading functions stop when the end of the file is reached. If a concurrent writer is still extending the file, it's indeterminate how far along a reader will get before seeing EOF. In addition, there's no clear way, through file operations alone, to know whether the writer is done.
It may be feasible to attempt to read with timeouts, using a loop like this (pseudocode):
LOOP
    READ bytes
    IF count read == 0 THEN
        SLEEP briefly
        INCREMENT idle_count
    ELSE
        SET idle_count = 0
        WRITE
    END IF
UNTIL ( idle_count == 10 )
I can put that into PHP code if it helps.
Here is PHP code that does this with two files, in.dat and out.dat:
<?php
$in_fp = fopen("./in.dat", "r");
$out_fp = fopen("./out.dat", "w");
$idle_count = 0;
while ($idle_count < 10)
{
    if ($idle_count > 0)
        sleep(1);
    $val = fread($in_fp, 4096);
    if (!$val)
    {
        $idle_count++;
    }
    else
    {
        $idle_count = 0;
        $rc = fwrite($out_fp, $val);
        if ($rc != strlen($val))
            die("error on writing of the output file\n");
    }
}
fclose($in_fp);
fclose($out_fp);
Note that the odd placement of the sleep prevents sleeping for an extra second after the final read attempt.
I recommend setting the idle timeout limit higher than 10 for this purpose.
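Adapting that loop to the original question, here is a minimal sketch (not from the original answer) that streams the growing file straight to the browser instead of into a second file. It assumes $fname is the path to the file ffmpeg is writing and uses a higher idle limit, as recommended:
<?php
// Sketch: push a file to the browser while another process is still writing it.
// Assumes $fname is set to the path of the file ffmpeg is appending to.
$fp = fopen($fname, 'rb');
if ($fp === false)
    die("cannot open $fname");

header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename=' . basename($fname));
ob_end_clean();

$idle_count = 0;
while ($idle_count < 30) {       // tolerate up to ~30 seconds of writer silence
    if ($idle_count > 0)
        sleep(1);
    $chunk = fread($fp, 4096);
    if ($chunk === false || strlen($chunk) === 0) {
        $idle_count++;           // at EOF for now; the writer may still catch up
    } else {
        $idle_count = 0;
        echo $chunk;
        flush();                 // push the bytes out to the client immediately
    }
}
fclose($fp);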

Related

PHP Piping Data with f_open

I have a piece of code, but the file "data" is over 8 GB, so this is very memory intensive. I want to reduce the usage of RAM and saw that fopen would be ideal; however, how could I explode this data?
This is my current code:
$data = file_get_contents("data");
$data = explode("|", $data);
foreach ($data as $d) { // rest of code
Theoretically, I need to open a pipe, stream it, and close the pipe. How would I go about this?
I've tried using fopen rather than file_get_contents, but errors started popping up, so I'm doing something wrong and would really like to learn.
You can use stream_get_line to read your data chunk by chunk, with | as the delimiter character: PHP stream_get_line()
$fh = fopen('data', 'r'); // open the file in read-only mode
while (($d = stream_get_line($fh, 1000, '|')) !== false) // read up to the next | (a plain truthiness test would stop early on a block like "0")
{
    echo $d . PHP_EOL; // display one block of data
}
fclose($fh); // close the file

PHP File Handling (Download Counter) Reading file data as a number, writing it as that plus 1

I'm trying to make a download counter for a video game website in PHP, but for some reason, instead of incrementing the contents of the downloadcount.txt file by 1, it takes the number, increments it, and appends it to the end of the file. How could I make it replace the file contents instead of appending?
Here's the source:
<?php
ob_start();
$newURL = 'versions/v1.0.0aplha/Dungeon1UP.zip';
//header('Location: '.$newURL);
//increment download counter
$file = fopen("downloadcount.txt", "w+") or die("Unable to open file!");
$content = fread($file,filesize("downloadcount.txt"));
echo $content;
$output = (int) $content + 1;
//$output = 'test';
fwrite($file, $output);
fclose($file);
ob_end_flush();
?>
The number in the file is supposed to increase by one every time, but instead, it gives me numbers like this: 101110121011101310111012101110149.2233720368548E+189.2233720368548E+189.2233720368548E+18
As correctly pointed out in one of the comments, for your specific case you can call fseek($file, 0) right before writing, such as:
fseek ( $file, 0 );
fwrite($file, $output);
Or, even simpler, you can rewind($file) before writing; this ensures that the next write happens at byte 0, i.e. the start of the file.
The reason the write lands after the old data is that fwrite writes at the current file pointer, which sits at the end of whatever fread just read. Note also that opening with "w+" truncates the file, destroying the stored count. Open it in read/write mode, "r+", to keep the contents, and seek back to the start before writing, such as:
fopen("downloadcount.txt", "r+")
Just make sure the file exists before writing!
Please see fopen modes here:
https://www.php.net/manual/en/function.fopen.php
And working code here:
https://bpaste.net/show/iasj
It will be much simpler to use file_get_contents/file_put_contents:
// update with more precise path to file:
$content = file_get_contents(__DIR__ . "/downloadcount.txt");
echo $content;
$output = (int) $content + 1;
// by default `file_put_contents` overwrites file content
file_put_contents(__DIR__ . "/downloadcount.txt", $output);
That appending is really just a file-pointer/typecasting problem, but I would not encourage you to handle counts the file way. In order to count the number of downloads for a file, it's better to update a row in a database, using transactions to handle concurrency properly; doing it the file way could compromise accuracy.
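For example, a minimal sketch of that database approach using PDO; the DSN, credentials, and the downloads(file, count) table are assumptions for illustration:
<?php
// Sketch: atomic download counter in a database.
// Assumed schema: downloads(file VARCHAR PRIMARY KEY, count INT).
$pdo = new PDO('mysql:host=localhost;dbname=mydb', 'user', 'pass');
$stmt = $pdo->prepare('UPDATE downloads SET count = count + 1 WHERE file = ?');
$stmt->execute(['Dungeon1UP.zip']);
A single UPDATE like this is atomic on its own, so no read-modify-write race can lose a count.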
You can get the content and check whether the file has data; if not, initialise the counter to 0. Then just replace the content:
$fileContent = file_get_contents("downloadcount.txt");
$content = (!empty($fileContent) ? $fileContent : 0);
$content++;
file_put_contents('downloadcount.txt', $content);
You can check $fileContent, or look at the contents of the file directly, to verify.
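If the counter must stay in a text file, wrapping the read-increment-write cycle in flock() at least serializes concurrent requests. A minimal sketch, not from the original answers:
<?php
// Sketch: file-based counter guarded by an exclusive lock.
$fp = fopen('downloadcount.txt', 'c+'); // create if missing, don't truncate
if ($fp !== false && flock($fp, LOCK_EX)) {
    $count = (int) stream_get_contents($fp) + 1;
    rewind($fp);
    ftruncate($fp, 0);                  // clear the old, possibly longer value
    fwrite($fp, (string) $count);
    fflush($fp);
    flock($fp, LOCK_UN);
    fclose($fp);
}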

PHP SSH read file in chunks by pattern

I have a PHP script that takes a user-supplied string, then SSHs out to a remote server, reads the file into an array, then parses out the request/response blocks containing the string to return to the user.
This implementation does not work with large log files, because PHP runs out of memory trying to store the whole file in an array.
Example data:
*** REQUEST
request line 1
request line 2
request line 3
[...]
*** RESPONSE
response line 1
response line 2
response line 3
[...]
[blank line]
The length of the requests and responses vary, so I can never be sure how many lines there will be.
How can I read a file in chunks without storing the whole file in memory, while still ensuring I'll always be able to process a full request/response block of data from the log without truncating it?
I feel like I'm just being exceptionally dense about this, since my experience is usually working with whole files or arrays.
Here's my current code (with $search representing the user-supplied string we're looking for in the log), which is putting the whole file into an array first:
$stream = ssh2_exec($ssh, $command);
stream_set_blocking($stream, true);
$data = '';
while ($buffer = fread($stream, 4096)) {
    $data .= $buffer;
}
fclose($stream);
$rawlog = $data;
$logline = explode("\n", $rawlog);
reset($logline);
$block = '';
foreach ($logline as $k => $v) {
    if (preg_match("/\*\*\* REQUEST/", $v) && $block != '') {
        if (preg_match("/$search/i", $block)) {
            $results[] = $block;
        }
        $block = $v . "\n";
    } else {
        $block .= $v . "\n";
    }
}
if (preg_match("/$search/i", $block)) {
    $results[] = $block;
}
Any suggestions?
Hard to say if this would work for you, but if the logs are in files you could use phpseclib's SFTP implementation (latest Git version).
E.g.:
If you do $sftp->get('filename.ext', false, 0, 1000) it'll download bytes 0-1000 from filename.ext and return a string with those bytes. If you do $sftp->get('filename.ext', false, 1000, 1000) it'll download bytes 1000-2000.
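Putting that into a loop gives a sketch like the following; it assumes $sftp is an already-authenticated phpseclib SFTP object and that get() accepts an offset and length, as in recent versions:
<?php
// Sketch: walk a remote file in 1 MB chunks via phpseclib's SFTP::get().
$chunkSize = 1024 * 1024;
$offset = 0;
while (true) {
    $chunk = $sftp->get('filename.ext', false, $offset, $chunkSize);
    if ($chunk === false || $chunk === '')
        break;                     // past the end of the file
    // ... scan $chunk for matching request/response blocks here ...
    $offset += strlen($chunk);
}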
You can use a command like tail to fetch lines 0 to 99, then 100 to 199, and so on.
This will require more SSH commands, but will not require you to store the whole result in memory.
Or, you can first store all the output into a local file, and parse it after that.
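Along the same lines, here is a sketch (not from the original answers) that parses the stream line by line, so only the current request/response block is ever held in memory. It assumes the same $ssh, $command, and $search variables as the question's code:
<?php
// Sketch: scan the log stream block by block without buffering the whole file.
$stream = ssh2_exec($ssh, $command);
stream_set_blocking($stream, true);
$results = array();
$block = '';
while (($line = fgets($stream)) !== false) {
    // A "*** REQUEST" line starts a new block; test and discard the previous one.
    if (preg_match('/^\*\*\* REQUEST/', $line) && $block !== '') {
        if (preg_match("/$search/i", $block)) {
            $results[] = $block;
        }
        $block = '';
    }
    $block .= $line;
}
if ($block !== '' && preg_match("/$search/i", $block)) {
    $results[] = $block; // don't lose the final block
}
fclose($stream);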

Split big files using PHP

I want to split huge files (to be specific, tar.gz files) into multiple parts from PHP code. The main reason to do this is PHP's 2 GB limit on 32-bit systems.
So I want to split big files into multiple parts and process each part separately.
Is this possible? If yes, how?
My comment was voted up twice, so maybe my guess was onto something :P
If on a unix environment, try this...
exec('split -d -b 2048m file.tar.gz pieces');
split
Your pieces should be pieces00, pieces01, and so on (with -d, split uses numeric suffixes).
You could get the number of resulting pieces easily by using stat() in PHP to get the file size and then doing the simple math: (int) ceil($stat['size'] / (2048*1024*1024)) (I think).
A simple method (if using a Linux-based server) is to use exec() and run the split command:
exec('split Large.tar.gz -b 4096k SmallParts'); // 4 MB parts
/*
    split        - the program
    Large.tar.gz - the source file
    -b 4096k     - the split size
    SmallParts   - the output filename prefix
*/
See here for more details: http://www.computerhope.com/unix/usplit.htm
Or you can use: http://www.computerhope.com/unix/ucsplit.htm
exec('csplit -k -s -f part_ -n 3 LargeFile.tar.gz');
PHP runs in a single thread, and the only way to get more is to fork child processes, which is not resource friendly. What I would suggest is to look into a language that can do this fast and effectively; I would suggest using node.js.
Just install node on the server and then create a small script, called node_split for instance, that can do the job on its own for you.
But I do strongly advise that you do not use PHP for this job; use exec to allow the host operating system to do it.
HJSPLIT
http://www.hjsplit.org/php/
PHP itself might not be able to...
If you can figure out how to do this from your computer's command line, you should be able to execute those commands using exec().
function split_file($source, $targetpath = '/split/', $lines = 1000) {
    $i = 0;
    $j = 1;
    $date = date("m-d-y");
    $buffer = '';
    $handle = fopen($_SERVER['DOCUMENT_ROOT'] . $source, "r");
    while (!feof($handle)) {
        $buffer .= fgets($handle, 4096);
        $i++;
        if ($i >= $lines) {
            $fname = $_SERVER['DOCUMENT_ROOT'] . $targetpath . "part_" . $date . $j . ".txt";
            $fhandle = fopen($fname, "w");
            if (!$fhandle) {
                echo "Cannot open file ($fname)";
                //exit;
            }
            if (!fwrite($fhandle, $buffer)) {
                echo "Cannot write to file ($fname)";
                //exit;
            }
            fclose($fhandle);
            $j++;
            $buffer = '';
            $i = 0;
            $lines += 10; // add 10 to $lines after each iteration. Modify this line as required
        }
    }
    fclose($handle);
}
$handle = fopen('source/file/path', 'r');
$f = 1; // new file number
while (!feof($handle))
{
    $newfile = fopen('newfile/path/' . $f . '.txt', 'w'); // create a new, numbered file to write to
    for ($i = 1; $i <= 5000; $i++) // for 5000 lines
    {
        $import = fgets($handle);
        //print_r($import);
        fwrite($newfile, $import);
        if (feof($handle))
        {
            break; // if the source file ends, break the loop
        }
    }
    fclose($newfile);
    $f++; // increment new file number
}
fclose($handle);
If you want to split files which are already on the server, you can do it: simply use the file functions fopen, fread, fwrite, and fseek to read/write parts of the file. If you want to split files which are uploaded from the client, I am afraid you cannot.
This would probably be possible in PHP, but PHP was built for web development, and trying to do this whole operation in one request will result in the request timing out.
You could, however, use another language like Java or C# and build a background process that you can notify from PHP to perform the operation, or even run it from PHP, depending on the security settings on the host.
The splits are named filename.part0, filename.part1, and so on:
<?php
function fsplit($file, $buffer = 1024) {
    // open file to read
    $file_handle = fopen($file, 'r');
    // get file size
    $file_size = filesize($file);
    // number of parts to split into
    $parts = $file_size / $buffer;
    // store all the file names
    $file_parts = array();
    // path to write the final files
    $store_path = "splits/";
    // name of input file
    $file_name = basename($file);
    for ($i = 0; $i < $parts; $i++) {
        // read a buffer-sized amount from the file
        $file_part = fread($file_handle, $buffer);
        // the filename of the part
        $file_part_path = $store_path . $file_name . ".part$i";
        // open (create) the new file to write
        $file_new = fopen($file_part_path, 'w+');
        // write the part of the file
        fwrite($file_new, $file_part);
        // add the name of the file to the part list [optional]
        array_push($file_parts, $file_part_path);
        // close the part file handle
        fclose($file_new);
    }
    // close the main file handle
    fclose($file_handle);
    return $file_parts;
}
?>
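For example, splitting a large archive into 100 MB parts (assuming a writable splits/ directory exists next to the script):
<?php
$parts = fsplit('Large.tar.gz', 100 * 1024 * 1024);
print_r($parts); // e.g. splits/Large.tar.gz.part0, splits/Large.tar.gz.part1, ...
?>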

Limit download count with text file in PHP

if ($_SERVER['REQUEST_METHOD'] == 'GET' && $_GET['download'] === '1')
{
    $handle = fopen('lastdownload.txt', 'rw');
    $date = @fread($handle, filesize('lastdownload.txt'));
    if (time() - 30 * 60 > $date)
    {
        fwrite($handle, time());
        header('Content-type: application/zip');
        header('Content-Disposition: attachment; filename="dwnld_' . date('d_m_Y_H_i', filemtime('download.zip')) . '.zip"');
        readfile('download.zip');
    }
    exit;
}
Hi everyone, I have a problem with limiting the download count.
I want to limit my download count. If someone requests the file with ?download=1, the script checks the current time against the time stored inside the file. If 30 minutes have passed since the last download, it lets you download again; otherwise it just exits.
Any help please?
Thank you.
Unless you're still using PHP 4, I would just use file_put_contents() and file_get_contents().
"rw" is not a valid mode for fopen. You should use "r+" or "x+" and rewind the file pointer after reading:
$handle = fopen('lastdownload.txt','r+');
$date = @fread($handle, filesize('lastdownload.txt'));
rewind($handle);
Alternatively, you can compare the file's last-access time against the current time:
if (time() - fileatime("lastdownload.txt") >= 30 * 60)
{
    // access OR file download code here
}
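Putting the pieces together, here is a sketch of the whole gate using file_get_contents/file_put_contents as suggested above; it assumes lastdownload.txt already exists and is writable by the web server:
<?php
// Sketch: allow one download per 30 minutes, tracked in a timestamp file.
if ($_SERVER['REQUEST_METHOD'] === 'GET' && ($_GET['download'] ?? '') === '1') {
    $last = (int) file_get_contents('lastdownload.txt');
    if (time() - $last > 30 * 60) {        // 30 minutes since the last download?
        file_put_contents('lastdownload.txt', time());
        header('Content-Type: application/zip');
        header('Content-Disposition: attachment; filename="download.zip"');
        readfile('download.zip');
    }
    exit;
}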
