How to process a big text file in PHP?
In Python one can use generators to read a file line by line without loading the whole file into memory. Is there something like generators in PHP?
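Yes: PHP has had generators (the yield keyword) since 5.5, so you can read line by line much like in Python. A minimal sketch, where readLines is an illustrative helper name and the filename is the one from the question:

```php
<?php
// A generator yields one line at a time, so only the current
// line is held in memory (generators require PHP 5.5+).
function readLines($path)
{
    if (!is_readable($path)) {
        return; // empty generator if the file can't be opened
    }
    $handle = fopen($path, 'r');
    while (($line = fgets($handle)) !== false) {
        yield rtrim($line, "\r\n");
    }
    fclose($handle);
}

foreach (readLines('the big file.txt') as $line) {
    echo $line, "\n";
}
```

Each iteration of the foreach resumes the generator just long enough to produce the next line, so memory use stays constant regardless of file size.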
You can run your PHP script from a batch file:
make a new file (run.bat)
right-click and edit it
and put this in the file:
c:/PHP/php.exe -f .\yourscript.php
Change both paths so they point to where php.exe and yourscript.php are located.
Then loop through the file with this:
<?php
$data = file("the big file.txt", FILE_IGNORE_NEW_LINES); // note: still loads the whole file into memory
$lines = count($data);
$x = 0;
while ($x < $lines) {
    print $data[$x] . "\n";
    sleep(1); // print one line per second
    $x++;
}
I run a PHP script from the console; it echoes multiple times while processing, and I redirect the script's output to a file.
I want to overwrite the previous echo's output after each echo.
command: php script.php > output.json
after the first echo output.json contains
{"property" : "firstValue" }
after the second echo output.json contains
{"property" : "firstValue" }{"property" : "secondValue" }
which is no longer valid JSON.
After the second echo I want output.json to contain
{"property" : "secondValue" }
When you want each new line of output to overwrite the last one, read 1 line at a time:
while IFS= read -r line; do
  printf "%s\n" "${line}" > output.json
done < <(php script.php)
STDOUT doesn't work that way. If you're generating multiple lines within a single script, and you only want the last one, you could maybe pipe the output through tail:
php script.php | tail -1 > output.json
Or you could handle the file writing yourself, within the script. Something like:
...
file_put_contents('/path/to/file.json', $someOutput);
...
file_put_contents('/path/to/file.json', $someNewOutput);
...
file_put_contents('/path/to/file.json', $someEvenNewerOutput);
Actually, I've finished writing my program. Because it is only a plugin and runs on an external server, I still want to see whether I get any errors or other messages in the console.
I wrote every console output with echo ...;. My question now is: is it possible to capture the text of the console?
Then I could easily save it in a .txt file and access it from the web :) - Or is there another way to get the console text?
I could probably just use fwrite(...) instead of echo ...;, but that would take a lot of time to change...
Greetings and thank you!
An alternative that could be useful on Windows is to save the whole output buffer to a .txt file. First check your PHP configuration: for a console app, implicit_flush must be off. Then:
<?php
ob_start(); // before any echo
/** YOUR CODE HERE **/
$output = ob_get_contents(); // this variable has all the echoes
file_put_contents('c:\whatever.txt', $output);
ob_flush(); // shows the echoes on the console
?>
If your goal is to create a text file to access, then you should create a text file directly.
(do this instead of echoing to console)
$output = $consoleData . "\n";
$output .= $moreConsoleData . "\n";
(Once you've completed that, just create the file:)
$file = fopen('output.txt', 'a');
fwrite($file, $output);
fclose($file);
Of course, this is sparse - you should also check that fopen() succeeded, handle write errors, etc.
For the console (command-line interface) you can redirect the output of your script:
php yourscript.php > path-of-your-file.txt
If you don't have access to a command-line interface or can't edit the cronjob line, you can duplicate the standard output at the beginning of the script:
$fdout = fopen('path-to-your-script.txt', 'wb');
eio_dup2($fdout, STDOUT);
eio_event_loop();
fclose($fdout);
(eio is a PECL extension)
If you are running the script using the console (i.e. php yourscript.php), you can easily save the output by modifying your command to:
php yourscript.php > path/to/log.txt
The above command will capture all output by the script and save it to log.txt. Change the paths for your script / log as required.
I am uploading a video, which is supposed to generate three screenshot thumbnails. I have the same upload code running in both the admin and the front end, but for some odd reason the thumbnails are only generated when I upload from the front end, not from the backend...
My directory structure
root/convert.php (this is the file running through exec call)
(the following two files are the upload files running in user-end and admin-end respectively)
root/upload.php
root/siteadmin/modules/videos/edit.php
I believe convert.php is not being run from admin-side for some reason. The command is something like:
$cmd = $cgi . $config['phppath'] . ' ' . $config['BASE_DIR'] . '/convert.php ' . $vdoname . ' ' . $vid . ' ' . $ff;
echo $cmd; die;
exec($cmd . ' >/dev/null &');
And echoing out the exec $cmd, I get this:
/usr/bin/php /home/testsite/public_html/dev/convert.php 1272.mp4 1272 /home/testsite/public_html/dev/video/1272.mp4
How do I make sure convert.php is being run?
EDIT: OK, now I am sure it is not being executed from admin-side, any ideas why?
http://php.net/manual/en/function.exec.php
"return_var" - If the return_var argument is present along with the output argument, then the return status of the executed command will be written to this variable.
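As a sketch of that, you can capture both the output and the exit status; the command below is just the one echoed in the question:

```php
<?php
// exec() fills $output with the command's stdout lines and
// $returnVar with its exit status (0 usually means success).
$cmd = '/usr/bin/php /home/testsite/public_html/dev/convert.php 1272.mp4 1272 /home/testsite/public_html/dev/video/1272.mp4';
exec($cmd . ' 2>&1', $output, $returnVar);

if ($returnVar !== 0) {
    echo "convert.php failed (exit code $returnVar):\n";
    echo implode("\n", $output), "\n";
}
// Note: with '>/dev/null &' appended, the command is backgrounded
// and the exit status no longer reflects convert.php itself.
```

Because the question's code appends `>/dev/null &`, any error message and the real exit status are currently being thrown away; dropping that suffix while debugging is the quickest way to see why the admin-side call fails.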
Another way to determine whether exec actually runs the convert.php file: add some debugging info in convert.php (e.g. write something to a file when the convert.php script starts).
Just an idea:
you could print "TRUE" in the convert script when it runs successfully.
don't add >/dev/null &
check the return value of exec:
$value = exec($cmd);
if ($value == 'TRUE') {
    // it ran successfully
}
chmod 755 convert.php
Also make sure the first line of convert.php is:
#!/usr/bin/php
Check the full path of the PHP CLI executable.
Also make sure convert.php has Unix line endings ("\n").
I am processing big .gz files using PHP (transferring data from gz to MySQL);
it takes about 10 minutes per .gz file.
I have a lot of .gz files to be processed.
After PHP finishes with one file, I have to manually change the PHP script to select another .gz file and then run the script again manually.
I want the next job to run automatically and process the next file.
The gz files are named 1, 2, 3, 4, 5 ...
I can simply make a loop like this (process files 1-5):
for ($i = 1; $i <= 5; $i++)
{
    $file = gzfile($i.'.gz');
    ...gz content processing...
}
However, since the gz files are really big, I cannot do that: with this loop, PHP would process multiple big gz files in a single script job, which takes a lot of memory.
What I want to do is after PHP is finished with one job I want a new job to process the next file.
maybe its going to be something like this:
$file = gzfile($_GET['filename'].'.gz')
...gz content processing...
Thank You
If you clean up after processing and free all memory using unset(), you could simply wrap the whole script in a foreach (glob(...) as $filename) loop. Like this:
<?php
foreach (glob(...) as $filename) {
// your script code here
unset($thisVar, $thatVar, ...);
}
?>
What you should do is
Schedule a cronjob to run your php script every x minutes
When the script runs, check whether a lock file is in place: if not, create one and start processing the next unprocessed gz file; if one exists, abort
Wait for the queue to get cleared
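A minimal sketch of the lock-file steps, assuming the numbered file names from the question and an illustrative ".done" marker scheme to track which files are finished:

```php
<?php
$lockFile = '/tmp/gzimport.lock'; // illustrative path

// Abort if a previous cron run is still processing.
if (file_exists($lockFile)) {
    exit;
}
touch($lockFile);

// Process the next unprocessed gz file, one per run.
foreach (glob('*.gz') as $gz) {
    if (!file_exists($gz . '.done')) {
        // ...gz content processing into MySQL...
        touch($gz . '.done'); // mark as processed
        break;
    }
}

unlink($lockFile);
```

Each cron invocation is a fresh PHP process, so memory from the previous file is fully released between jobs; the lock file just prevents two imports from overlapping when one run takes longer than the cron interval.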
You should call the PHP script with an argument, from a shell script. Here's the doc on how to use command-line parameters in PHP: http://php.net/manual/en/features.commandline.php
Or (I can't try it now) you may give unset($file) a chance after processing each gzip:
for ($i = 1; $i <= 5; $i++)
{
    $file = gzfile($i.'.gz');
    ...gz content processing...
    unset($file);
}
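The command-line-argument approach from the linked manual page can be sketched like this; processFile and process.php are assumed names:

```php
<?php
// Run as: php process.php 1
// processFile() reads <number>.gz; the number comes from $argv[1].
function processFile($number)
{
    $file = gzfile($number . '.gz'); // one array element per line
    // ...gz content processing...
    return $file;
}

if (isset($argv[1])) {
    processFile($argv[1]);
}
```

A shell loop such as `for i in 1 2 3 4 5; do php process.php $i; done` then runs the jobs one after another, and because each file is handled in its own PHP process, all memory is released when that process exits.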
This is my code:
$zplHandle = fopen($target_file,'w');
fwrite($zplHandle, $zplBlock01);
fwrite($zplHandle, $zplBlock02);
fwrite($zplHandle, $zplBlock03);
fclose($zplHandle);
When will the file be saved? Is it immediately after writing to it or after closing it?
I am asking this because I have Printfil listening to files in a folder and prints any file that is newly created. If PHP commits a save immediately after fwrite, I may run into issues of Printfil not capturing the subsequent writes.
Thank you for the help.
PHP may or may not write the content immediately; there is a buffering layer in between. You can force a write using fflush(), but you can't force PHP to wait unless you use only one fwrite().
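A sketch of forcing the buffered content out mid-stream; output.txt is an illustrative name:

```php
<?php
$handle = fopen('output.txt', 'w');

fwrite($handle, "first block\n");
fflush($handle); // push PHP's write buffer to the OS right now

// ...later writes may again sit in the buffer for a while...
fwrite($handle, "second block\n");

fclose($handle); // closing flushes whatever is left
```

For the Printfil scenario, the safer pattern is the reverse: build the whole payload in a string (or write to a temporary name) and only create the watched file in one step at the end, so the watcher never sees a half-written file.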
I made a tiny piece of code to test it, and it seems that after fwrite the new content is detected immediately, not after fclose.
Here's my test on Linux.
#!/usr/bin/env php
<?php
$f = fopen("file.txt", "a+");
for ($i = 0; $i < 10; $i++)
{
    sleep(1);
    fwrite($f, "something\n");
    echo $i, " write ...\n";
}
fclose($f);
echo "stop write";
?>
After running the PHP script, I used tail -f file.txt to watch for new content, and it showed the new content at the same time as the PHP script's own output.
The file will be saved on fclose(). If you want to push the content to the file before that, use fflush().
Assuming you're working in PHP 5.x, try file_put_contents() instead, as it wraps the open/write/close into one call.
http://us3.php.net/manual/en/function.file-put-contents.php