PHP Script using CronJob

I have been working on a small personal project that involves a PHP script using FTP to connect to a game server, reading its log files, and recording kills and deaths in a database that is then displayed on my website.
At the moment, my plan is to have the cron job run each day at midnight (a new log file is created at midnight), and the script then loops as long as the current day matches the date when it was started. It opens the FTP file stream, reads lines, and records its position when it reaches the end of the file. It then sleeps for one minute and checks whether the file has grown past the previously recorded length; if it has, it reads the new lines.
I have set_time_limit(0) as well.
Here is a snippet of the code that I think is causing the problem:
$lastpos = 1;
$length = 0;
$filename = "ftp........./logs_03_27_16.txt"; // FTP URL shortened here

while ($daycount == (date("d") + 0)) {
    sleep(60);
    clearstatcache(false, $filename);    // drop the cached filesize
    $length = filesize($filename);
    if ($length < $lastpos) {
        // File shrank (e.g. rotated): start over from its new end.
        $lastpos = $length;
    } elseif ($length > $lastpos) {
        $file = fopen($filename, "rb");
        while ($file === false) {        // retry until the stream opens
            sleep(60);
            $file = fopen($filename, "rb");
        }
        fseek($file, $lastpos);
        while (!feof($file)) {
            //Do Operations on file
        }
        $lastpos = ftell($file);
        fclose($file);
        flush();
    }
}
fclose($file);
Correction: I now get a 504 Gateway Time-out when attempting to run it in my browser.
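Reading the snippet, one likely culprit: the inner loop never calls a read function, so feof() can never return true and the loop spins forever, which by itself would explain a gateway timeout when the script runs through a browser. A minimal sketch of that inner loop with an explicit read (the line handling is hypothetical):

$file = fopen($filename, "rb");
fseek($file, $lastpos);
// fgets() advances the file pointer, so end-of-file is actually reached.
while (($line = fgets($file)) !== false) {
    // parse $line for kills/deaths and write them to the database ...
}
$lastpos = ftell($file);
fclose($file);

Running the script from cron (CLI) instead of over HTTP would also sidestep the gateway's timeout entirely.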

Related

Correct fread() application?

Say I'm uploading a chunked file and I have to recompose it. I know the total number of chunks and get the data from every iteration.
I found code like this:
for ($i = 1; $i <= $num_chunks; $i++) {
    $file = fopen($target_file.$i, 'rb');
    $buff = fread($file, 2097152);
    fclose($file);
    $final = fopen($target_file, 'ab');
    $write = fwrite($final, $buff);
    fclose($final);
    unlink($target_file.$i);
}
Apparently, the 2097152 value has no meaning, at least to me. I read the PHP docs but couldn't understand much. Could anyone explain how I should choose that second parameter of fread(), and how the whole thing works?
The second parameter is the maximum amount of data to read. Since you're reading each chunk in a single call, you have to be sure it is large enough to cover any chunk. The value you've set is 2 MB, which may be enough, but you could change the code so that it reads in smaller pieces and loops until the input is fully consumed.
I've also changed it to open the output file once and just write the contents as you go along...
$final = fopen($target_file, 'wb'); // Open for write and start from beginning of file
for ($i = 1; $i <= $num_chunks; $i++) {
    $file = fopen($target_file.$i, 'rb');
    while ($buff = fread($file, 4096)) {
        fwrite($final, $buff);
    }
    fclose($file);
    unlink($target_file.$i);
}
fclose($final);
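For completeness, a variant sketch (not from the thread) that sidesteps choosing a read size altogether by letting stream_copy_to_stream() loop internally:

$final = fopen($target_file, 'wb');
for ($i = 1; $i <= $num_chunks; $i++) {
    $file = fopen($target_file.$i, 'rb');
    stream_copy_to_stream($file, $final); // copies until EOF in internal chunks
    fclose($file);
    unlink($target_file.$i);
}
fclose($final);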

fread() download aborts after one gigabyte

I'm currently having the problem that my download always stops at almost exactly one gigabyte. I have created a PHP download script which checks the user's permissions before downloading and prevents the direct URL from being copied; for this I use fread() to pass the data through PHP. However, I have no idea why the download always stops at that point. The file is about twice as large.
Could someone take a look at this?
// ... Permission check, get download url, etc. ...
try
{
    define('CHUNK_SIZE', 8 * 1024);

    function readfile_chunked($filename, $retbytes = true)
    {
        $buffer = '';
        $cnt = 0;
        $handle = fopen($filename, 'rb');
        if ($handle === false)
        {
            return false;
        }
        while (!feof($handle))
        {
            $buffer = fread($handle, CHUNK_SIZE);
            echo $buffer;
            if (ob_get_level() > 0)
            {
                ob_flush();
                flush();
            }
            if ($retbytes)
            {
                $cnt += strlen($buffer);
            }
        }
        $status = fclose($handle);
        if ($retbytes && $status)
        {
            return $cnt;
        }
        return $status;
    }

    $mimetype = 'mime/type';
    header('Content-Type: application/octet-stream');
    header('Content-Disposition: attachment; filename=' . $download->url);
    readfile_chunked($download->url);
}
catch (Exception $e)
{
    echo "Error while reading the file";
    exit;
}
Update:
I have now tested the script on another server. Here the download sometimes terminated after 300 MB and sometimes after 1.3 GB. I then defined the following settings:
ini_set('display_errors', 0);
ini_set('error_reporting', E_ALL);
ini_set("log_errors", 1);
ini_set("error_log", "error.log");
ini_set('memory_limit', '-1');
ini_set('max_execution_time', 7200);
On the other server, the script has now run through several times without any problems. The download was complete and the file was error-free. However, I copied these settings to the original system and the download still stops. I downloaded the file three times in a row. The file size varies slightly:
1.057.983 KB
1.056.192 KB
1.056.776 KB
I'm running out of ideas. What else could that be? They are both Apache web servers.
There are also no noticeable entries in the log.
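This question has no answer here, but given that the truncation point varies, two layers seem worth ruling out: output buffering/compression between PHP and the client, and the missing Content-Length header (without it, the client cannot distinguish a truncated stream from a complete one). A sketch of those changes, reusing $download->url from above:

set_time_limit(0);                  // rule out max_execution_time entirely
while (ob_get_level() > 0) {
    ob_end_clean();                 // remove every output-buffering layer
}
if (function_exists('apache_setenv')) {
    @apache_setenv('no-gzip', '1'); // ask mod_deflate not to compress
}
header('Content-Type: application/octet-stream');
// Assumes $download->url is a local path readable by filesize().
header('Content-Length: ' . filesize($download->url));
header('Content-Disposition: attachment; filename="' . basename($download->url) . '"');
readfile_chunked($download->url);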

Unzipping a file with PHP, but not all files are extracted

I'm working on extracting a zip archive with PHP. The structure of the archive is seven folders, each of which contains on the order of 10,000 files, each around 1 kB.
My code is pretty simple and uses the ZipArchive class:
$zip = new ZipArchive();
$result = $zip->open($filename);
if ($result === true) {
    $zip->extractTo($tmpdir);
    $zip->close();
}
The problem I'm having, though, is that the extraction seems to halt. The first folder is fully extracted, but only about half of the second one is. None of the other five are extracted at all.
I also tried using this code, which breaks it into chunks of 10 kB at a time, but got the exact same result:
$archive = zip_open($filename);
while ($entry = zip_read($archive)) {
    $size = zip_entry_filesize($entry);
    $name = zip_entry_name($entry);
    if (substr($name, -1) == '/') {
        if (!file_exists($tmpdir . $name)) mkdir($tmpdir . $name);
    } else {
        $unzipped = fopen($tmpdir . $name, 'wb');
        while ($size > 0) {
            $chunkSize = ($size > 10240) ? 10240 : $size;
            $size -= $chunkSize;
            $chunk = zip_entry_read($entry, $chunkSize);
            if ($chunk !== false) fwrite($unzipped, $chunk);
        }
        fclose($unzipped);
    }
}
I've also tried increasing the memory limit in PHP from 512 MB to 1024 MB, but again got the same result. Unzipped everything is around 100 MB, so I wouldn't anticipate it being a memory issue anyway.
It's probably your max execution time. Disable the limit completely by setting it to 0, or set a sensible value:
ini_set('max_execution_time', 10000);
(Don't set it to 0 in production, though.)
If you don't have access to ini_set() because of the disable_functions directive, you may have to edit the value in your php.ini directly.
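If raising the limit is not an option, one workaround sketch (not from the thread) is to extract one entry at a time and reset the timer as you go:

$zip = new ZipArchive();
if ($zip->open($filename) === true) {
    for ($i = 0; $i < $zip->numFiles; $i++) {
        set_time_limit(30); // restart the execution clock for each entry
        $zip->extractTo($tmpdir, $zip->getNameIndex($i));
    }
    $zip->close();
}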

PHP Stop Reading a Remote File after it is Fully Downloaded

I'm getting a 30-second timeout error because the code keeps checking whether the file is over 5 MB when it's under. The code is designed to reject files over 5 MB, but I need it to also stop executing when the file is under 5 MB. Is there a way to check whether the transferred chunk is empty? I'm currently using this example by DaveRandom:
PHP Stop Remote File Download if it Exceeds 5mb
Code by DaveRandom:
$url = 'http://www.spacetelescope.org/static/archives/images/large/heic0601a.jpg';
$file = '../temp/test.jpg';
$limit = 5 * 1024 * 1024; // 5MB

if (!$rfp = fopen($url, 'r')) {
    // error, could not open remote file
}
if (!$lfp = fopen($file, 'w')) {
    // error, could not open local file
}

// Check the content-length for exceeding the limit
foreach ($http_response_header as $header) {
    if (preg_match('/^\s*content-length\s*:\s*(\d+)\s*$/', $header, $matches)) {
        if ($matches[1] > $limit) {
            // error, file too large
        }
    }
}

$downloaded = 0;
while ($downloaded < $limit) {
    $chunk = fread($rfp, 8192);
    fwrite($lfp, $chunk);
    $downloaded += strlen($chunk);
}

if ($downloaded > $limit) {
    // error, file too large
    unlink($file); // delete local data
} else {
    // success
}
You should check if you have reached the end of the file:
while (!feof($rfp) && $downloaded < $limit) {
    $chunk = fread($rfp, 8192);
    fwrite($lfp, $chunk);
    $downloaded += strlen($chunk);
}
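The extra feof() test ends the loop as soon as the remote server closes the stream: for a file under the limit, fread() would otherwise keep returning empty strings at EOF, $downloaded would never reach $limit, and the script would spin until the 30-second timeout.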

Split a large zip file in small chunks using php script

I am using the script below for splitting a large zip file into small chunks.
$filename = "pro.zip";
$targetfolder = '/tmp';
// File size in Mb per piece/split.
// For a 200Mb file if piecesize=10 it will create twenty 10Mb files
$piecesize = 10; // splitted file size in MB
$buffer = 1024;
$piece = 1048576*$piecesize;
$current = 0;
$splitnum = 1;
if(!file_exists($targetfolder)) {
if(mkdir($targetfolder)) {
echo "Created target folder $targetfolder".br();
}
}
if(!$handle = fopen($filename, "rb")) {
die("Unable to open $filename for read! Make sure you edited filesplit.php correctly!".br());
}
$base_filename = basename($filename);
$piece_name = $targetfolder.'/'.$base_filename.'.'.str_pad($splitnum, 3, "0", STR_PAD_LEFT);
if(!$fw = fopen($piece_name,"w")) {
die("Unable to open $piece_name for write. Make sure target folder is writeable.".br());
}
echo "Splitting $base_filename into $piecesize Mb files ".br()."(last piece may be smaller in size)".br();
echo "Writing $piece_name...".br();
while (!feof($handle) and $splitnum < 999) {
if($current < $piece) {
if($content = fread($handle, $buffer)) {
if(fwrite($fw, $content)) {
$current += $buffer;
} else {
die("filesplit.php is unable to write to target folder");
}
}
} else {
fclose($fw);
$current = 0;
$splitnum++;
$piece_name = $targetfolder.'/'.$base_filename.'.'.str_pad($splitnum, 3, "0", STR_PAD_LEFT);
echo "Writing $piece_name...".br();
$fw = fopen($piece_name,"w");
}
}
fclose($fw);
fclose($handle);
echo "Done! ".br();
exit;
function br() {
return (!empty($_SERVER['SERVER_SOFTWARE']))?'<br>':"\n";
}
?>
But this script is not creating the small files in the target temp folder after the split. The script runs without reporting any error.
Please help me find out what the issue is here. Or, if you have another working script with similar functionality, please share it.
As indicated in the comments above, you can use split to split a file into smaller pieces, and can then use cat to join them back together.
split -b50m filename x
and to put them back
cat xaa xab xac > filename
If you are looking to split the zip file into a spanning-type archive, so that you do not need to rejoin the pieces, take a look at zipsplit:
zipsplit -n (size) filename
so you can just call zipsplit from your exec script, and most standard unzip utilities should be able to put it back together. See man zipsplit for more options, including setting the output path, etc.
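A minimal sketch of invoking it from PHP (assuming zipsplit is installed on the server; -n takes the maximum piece size in bytes):

$size = 10 * 1024 * 1024; // 10 MB pieces
$cmd  = sprintf('zipsplit -n %d %s', $size, escapeshellarg('pro.zip'));
exec($cmd, $output, $status);
if ($status !== 0) {
    die("zipsplit failed:\n" . implode("\n", $output));
}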
