PHP gzuncompress with file read and write errors

I have a function that keeps track of events that happen throughout the script. In an effort to use my resources effectively, I decided to compress the data that it generates. However, I keep getting this error:
Unknown error type: [2] gzuncompress() [function.gzuncompress]: data error
Here's the function:
function eventlog($type, $message){
    // Types: account, run, queue, system
    // Set up file name/location
    $eventfile = '/myprivatedirectory/'.date('Ymd').$type.'.log';
    if(file_exists($eventfile)){
        while(!is_writable($eventfile)){clearstatcache();}
        $fh_log = fopen($eventfile,'r+');
        flock($fh_log, LOCK_EX);
        $logcontents = gzuncompress(fread($fh_log,filesize($eventfile)));
        rewind($fh_log);
        ftruncate($fh_log, 0);
        $logcompressed = gzcompress($logcontents.$message."\n");
        fwrite($fh_log,$logcompressed);
        flock($fh_log, LOCK_UN);
        fclose($fh_log);
    } else {
        $fh_log = fopen($eventfile,'w');
        flock($fh_log, LOCK_EX);
        $logcompressed = gzcompress($message."\n");
        fwrite($fh_log,$logcompressed);
        flock($fh_log, LOCK_UN);
        fclose($fh_log);
    }
}
So every day at midnight a new log file is created as any of the above events occur (account, run, queue, system); otherwise each new event is appended to the respective log file.
I would love to keep the compression, but I cannot keep having these errors. Can anyone please help? Thanks in advance.

I think the implementation is all wrong; I would not advise you to gzcompress($message."\n") for every message.
What you should do instead is compress the whole log file at the end of the day, which is more efficient.
During the day, save your information as plain text using file_put_contents.
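For example, a minimal sketch of the daytime append (reusing $eventfile, $type and $message from the question; FILE_APPEND and LOCK_EX are standard file_put_contents() flags):
<?php
// Sketch: append each event as one plain-text line during the day.
$eventfile = '/myprivatedirectory/'.date('Ymd').$type.'.log';
file_put_contents($eventfile, $message."\n", FILE_APPEND | LOCK_EX);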
At the end of the day:
$eventfile = '/myprivatedirectory/'.date('Ymd').$type.'.log';
$eventfileCompressed = '/myprivatedirectory/'.date('Ymd').$type.'.gz';

$gz = gzopen($eventfileCompressed, "w9");
gzwrite($gz, file_get_contents($eventfile));
gzclose($gz);
To read the file:
$zd = gzopen($eventfileCompressed, "r");
$zr = '';
while (!gzeof($zd)) {
    $zr .= gzread($zd, 4096); // read the decompressed data in chunks
}
gzclose($zd);
This approach will save you processing power.
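Alternatively (a sketch, not from the original answer), gzfile() decompresses the whole file in one call and returns it as an array of lines, which suits a line-oriented log:
// Sketch: read the compressed log in one call.
$lines = gzfile($eventfileCompressed); // array of decompressed lines
$contents = implode('', $lines);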

Related

DOMDocument::load() error: read of 8192 bytes failed with errno=13 Permission denied

I am at an internship right now and was given this old code from 2012. I have tried my best to debug it, but I can only get this error. I have already tried to increase my PHP limits in php.ini, and that did nothing. It also isn't the XML file, because I can print the entire thing to the website with no problem. It might be a problem with the file locking, since this is the first time I have used it.
This is the PHP:
<?php
// First clear any status messages from CMS
// Find filepath
$dir = getcwd();
$pieces = explode("/", $dir);
$processname = $pieces[count($pieces) - 1];
$recmsgfilepath = "savedmsg.default.xml";
// saved for later use in $recmsgfilepath
// tmp/webpage/".$processname."/msgfromcms.xml

// Lock file
$newmsgfile = fopen($recmsgfilepath, "r+");
while(!flock($newmsgfile, LOCK_EX))
{
    // Continue to try to lock file, webserver will eventually terminate PHP script
}
$newmsgdoc = new DOMDocument();
if((filesize($recmsgfilepath) > 0) && $newmsgdoc->load($recmsgfilepath))
{
    // Clear rec message file
    ftruncate($newmsgfile, 0);
}
flock($newmsgfile, LOCK_UN); // release the lock
fclose($newmsgfile);

// Check if any buttons pressed
if(isset($_POST['clearmessages']))
{
    // Send message to delete all messages in web app
    SendMessage("", "WEBVIEW", "4", "", "", "", "", "");
}
The XML:
<SAVEDMESSAGES last_changed="16449122918046">
  <TEXTMESSAGE TIMESTAMP="16449122918046" TIMESTAMP_MSG="1644912290" PRIO="0" ALARMID="-">
    <TIME>09:04:50</TIME>
    <DATE>2022-02-15</DATE>
    <ADDRESS></ADDRESS>
    <PRIO>0</PRIO>
    <TIMEOUT>60</TIMEOUT>
    <TEXT>TESTMESSAGE</TEXT>
    <ALERTTYPE>0</ALERTTYPE>
    <CALLBACK></CALLBACK>
    <SENDERID></SENDERID>
    <ALARMINITIATOR>0</ALARMINITIATOR>
    <ACKDATA></ACKDATA>
    <POSINFO1>0</POSINFO1>
    <POSINFO2>0</POSINFO2>
    <RFP0>0</RFP0>
    <STATUS>ORIGINAL</STATUS>
    <ORIGIN>CONFIG</ORIGIN>
    <MESSAGEID>338599</MESSAGEID>
  </TEXTMESSAGE>
</SAVEDMESSAGES>
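For what it's worth, errno 13 is EACCES (permission denied), so the failure points at file permissions rather than PHP limits. A quick diagnostic sketch (not part of the original code) would be:
<?php
// Sketch: check what the current PHP process can actually do with the file.
$recmsgfilepath = "savedmsg.default.xml";
var_dump(
    is_readable($recmsgfilepath),  // false here would explain the EACCES on load()
    is_writable($recmsgfilepath),  // fopen(..., "r+") also needs write access
    substr(sprintf('%o', fileperms($recmsgfilepath)), -4) // e.g. "0644"
);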

Safe PHP JSON logging

I am using JSON as a data store, and over time I foresee that several parties might want to write to my JSON file, like a chat log, within a short space of time.
<?php
$foo = json_decode(file_get_contents("foo.json"), true);
if (!is_array($foo["bar"])) { $foo["bar"] = array(); }
array_push($foo["bar"], array("time" => time(), "who" => $_SERVER['REMOTE_ADDR'], "msg" => $_GET['m']));
file_put_contents("foo.json", json_encode($foo, JSON_PRETTY_PRINT));
?>
So the above code works, but I am worried about what happens if the file is read before it's fully written out, or if two requests write at the same time, leading to data loss.
What's a better or safer design, preferably using flat file storage (i.e. not databases)?
As a bonus, I really don't want to return an error to the client who made the request just because there was some lock. Ideally the request is made to wait until it's safe to proceed.
You can use the flock() function for this. It locks the file for all processes except the one currently holding the lock.
http://php.net/manual/en/function.flock.php
Basic usage:
<?php
$fp = fopen('path/to/data.json', 'r+');
if (flock($fp, LOCK_EX)) // locks the file
{
    // write to the file
    flock($fp, LOCK_UN); // remove the lock
}
fclose($fp);
flock() is blocking by default. That means a process waits until it gets permission to access the lock. Have a look at the docs for how to implement a nonblocking version.
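Applied to the JSON log from the question, a minimal sketch might look like this (foo.json and the record fields come from the question; the rest is illustrative):
<?php
// Sketch: read-modify-write of foo.json under an exclusive lock.
// Assumes the file already exists; create it with '{}' beforehand if not.
$fp = fopen('foo.json', 'r+');
if ($fp && flock($fp, LOCK_EX)) { // blocks until the lock is granted
    $foo = json_decode(stream_get_contents($fp), true);
    if (!is_array($foo)) { $foo = array(); }
    if (!isset($foo['bar']) || !is_array($foo['bar'])) { $foo['bar'] = array(); }
    $foo['bar'][] = array('time' => time(), 'who' => $_SERVER['REMOTE_ADDR'], 'msg' => $_GET['m']);

    // Rewrite the whole file while still holding the lock.
    rewind($fp);
    ftruncate($fp, 0);
    fwrite($fp, json_encode($foo, JSON_PRETTY_PRINT));
    fflush($fp);
    flock($fp, LOCK_UN);
}
fclose($fp);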
How about creating an individual JSON document for each record, so you do not need to load the whole JSON file every time? You just append each record to the file, one JSON object per line.
When you load it, you read each line of the text file and then convert it to JSON within PHP.
I recommend this because file_put_contents() (http://php.net/manual/en/function.file-put-contents.php) gives a better solution for the file read/write problem.
JSON is just a format for text, so I believe this is a good solution.
To write:
<?php
$foo = array("time" => time(), "who" => $_SERVER['REMOTE_ADDR'], "msg" => $_GET['m']);
// http://php.net/manual/en/function.file-put-contents.php
// One JSON object per line; FILE_APPEND appends, LOCK_EX serializes concurrent writers.
file_put_contents("foo.json", json_encode($foo)."\n", FILE_APPEND | LOCK_EX);
?>
As you mention, it is a log, so you may not need to load it all the time. So we will focus on writing; reading can afford to be the slower operation.
To read:
<?php
$log_array = array();
$handle = @fopen("foo.json", "r"); // '@' suppresses the warning if the file is missing
if ($handle) {
    while (($buffer = fgets($handle, 4096)) !== false) {
        $foo = json_decode($buffer, true);
        $log_array[] = $foo;
    }
    if (!feof($handle)) {
        echo "Error: unexpected fgets() fail\n";
    }
    fclose($handle);
}
print_r($log_array);
?>

Editing a common temp file in PHP and preventing conflicts

I want to have a temp file that gets updated from time to time.
What I was thinking of doing is:
// get the contents
$s = file_get_contents( ... );

// does it need updating?
if( needs_update() )
{
    $s = 'some new content';
    file_put_contents( ... );
}
The issue that I could see happening is that whatever condition causes needs_update() to return true could cause more than one process to update the same file at (almost) the same time.
In an ideal situation, I would have one single process updating the file and prevent all other processes from reading the file until I am done with it.
So as soon as needs_update() returns true, I would prevent other processes from reading the file.
// wait here if anybody is busy writing to the file.
wait_if_another_process_is_busy_with_the_file();

// get the contents
$s = file_get_contents( ... );

// does it need updating?
if( needs_update() )
{
    // prevent read/write access to the file for a moment
    prevent_read_write_to_file_and_wait();

    // rebuild the new content
    $s = 'some new content';
    file_put_contents( ... );
}
That way, only one process could possibly update the file and the files would all be getting the latest values.
Any suggestions on how I could prevent such a conflict?
Thanks
FFMG
You are looking for the flock() function. flock() will work as long as every process that accesses the file uses it. Example from the PHP manual:
$fp = fopen("/tmp/lock.txt", "r+");
if (flock($fp, LOCK_EX)) { // acquire an exclusive lock
    ftruncate($fp, 0); // truncate file
    fwrite($fp, "Write something here\n");
    fflush($fp); // flush output before releasing the lock
    flock($fp, LOCK_UN); // release the lock
} else {
    echo "Couldn't get the lock!";
}
fclose($fp);
Manual: http://php.net/manual/en/function.flock.php
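Applied to the pattern in the question, a rough sketch could look like this (the file path and needs_update() are placeholders carried over from the question, not a real API):
<?php
// Sketch: one exclusive lock around the whole read-check-write cycle,
// so only one process can ever be updating the file at a time.
$fp = fopen('/tmp/shared.txt', 'c+'); // 'c+' opens read/write without truncating
if ($fp && flock($fp, LOCK_EX)) {     // blocks until no other process holds the lock
    $s = stream_get_contents($fp);
    if (needs_update()) {
        // rebuild the new content
        $s = 'some new content';
        rewind($fp);
        ftruncate($fp, 0);
        fwrite($fp, $s);
        fflush($fp); // flush output before releasing the lock
    }
    flock($fp, LOCK_UN);
}
fclose($fp);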

PHP: Missing records when writing to file

My telecom vendor is sending me a report each time a message goes out. I have written a very simple PHP script that receives values via HTTP GET. Using fwrite, I write the query parameters to a CSV file. The filename is Report.csv with the current date as a prefix.
Here is the code:
<?php
error_reporting(E_ALL ^ E_NOTICE);
date_default_timezone_set('America/New_York');

// Setting up the CSV file
$fileDate = date("m-d-Y");
$filename = $fileDate."_Report.csv";
$directory = "./csv_archive/";

// Creating handle
$handle = fopen($filename, "a");

// These are the main data fields
$item1 = $_GET['item1'];
$item2 = $_GET['item2'];
$item3 = $_GET['item3'];
$mydate = date("Y-m-d H:i:s");
$pass = $_GET['pass'];

// Testing the pass
if (isset($_GET['pass']) AND $_GET['pass'] == "password")
{
    echo 'Login successful';
    // just making sure the function could write to it
    if (!$handle = fopen($directory.$filename, 'a')){
        echo "Cannot open file ($filename)";
        exit;
    }
    // writing the data I receive through the query string
    if (fwrite($handle, "$item1,$item2,$item3,$mydate \n") === FALSE) {
        echo "Cannot write to file ($filename)";
        exit;
    }
    fclose($handle);
}
else{
    echo 'Login Failure please add the right pass to URL';
}
?>
The script does what I want; the only problem is inconsistency, meaning that a good portion of the records are missing (about half the report). When I log in to my vendor account I can see the complete report.
I have no clue what I need to do to fix this, please advise.
I have a couple of suggestions for this script.
To address Andrew Rhyne's suggestion, change the code that reads each $_GET variable to:
$item1 = (isset($_GET['item1']) && $_GET['item1']) ? $_GET['item1'] : 'empty';
This will tell you if all your fields are being populated.
I suspect your problem is something else. It sounds like you are getting a separate request for each record that you want to save. Perhaps some of these requests are happening too close together and are interfering with each other's ability to open and write to the file. To check whether this is happening, you might try using the following code to verify that you opened the file correctly. (Note that the first use of fopen() in your script does nothing, because you overwrite $handle with your second fopen(); the first one is also opening the wrong file...)
if (!$handle = fopen($directory.$filename, 'a')){
    $handle = fopen($directory.date("Y-m-d H:i:s:u").'_Record_Error.txt', 'a');
    exit;
}
This will make sure that you don't ever lose data because of concurrent write attempts. If you find that this is indeed your issue, you can delay subsequent write attempts until the file is not busy.
$tries = 0;
while ($tries < 50 && !$handle = fopen($directory.$filename, 'a')){
    usleep(500000); // wait half a second (sleep() only accepts whole seconds)
    $tries++;
}
if($handle){
    flock($handle, LOCK_EX); // lock the file to prevent other requests from writing until you are done
} else {
    $handle = fopen($directory.date("Y-m-d H:i:s:u").'_Record_Error.txt', 'a'); // the 'u' is for microseconds
    exit;
}
This will spend up to 25 seconds trying to open the file once every half second, and will still output your record to a unique file each time you are unable to open the main file. You can then safely fwrite() and fclose() $handle as before.
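A simpler variant (a sketch, not from the original answer) is to let a blocking flock() do the waiting instead of a retry loop; fopen() in append mode rarely fails outright, it is the interleaved writes that lose data:
<?php
// Sketch: append one CSV row under an exclusive lock.
// $directory, $filename and the row variables are the same as in the script above.
$handle = fopen($directory.$filename, 'a');
if ($handle && flock($handle, LOCK_EX)) { // blocks until other writers finish
    fwrite($handle, "$item1,$item2,$item3,$mydate\n");
    fflush($handle);
    flock($handle, LOCK_UN);
}
fclose($handle);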

Trigger action when file download actually completes

Nowadays there are a lot of file-hosting (upload) websites that count, for example, one point per complete download of a certain file.
My question
I want to understand the idea they are using.
How do they only count a complete download of the file?
I mean, if I cancel the download after it has started, it won't count a point.
How do they know? Is there any PHP function that can tell whether I canceled downloading a certain file or not?
This question has been on my mind for a long time, but I can't work out how it works or the idea behind it. ~ thanks
This can be done by using my other answer as a base (How can I give download access to files outside public_html directory?) and replacing readfile( $filename ) with readfileWhileConnected( $filename ):
Read file until EOF or disconnect:
/** Read $filename until EOF or disconnect;
 * if disconnected, error_log() the count of bytes already sent.
 */
function readfileWhileConnected( $filename ) {
    // Save and set ini values:
    $user_abort = ignore_user_abort();
    ignore_user_abort(true); // keep running after a client disconnect so we can detect and log it

    // Get file size and set bytes_sent to zero:
    $fsize = filesize($filename);
    $bytes_sent = 0;

    // Open file:
    $f = fopen($filename, 'r');

    // Read file:
    while($chunk = fread($f, 1024)) {
        // Check if connection is still open:
        if(!connection_aborted()) {
            // Send $chunk to buffer (if any), then flush() buffers:
            echo $chunk;
            flush();
            // Add $chunk length to $bytes_sent:
            $bytes_sent += strlen($chunk);
        } else {
            // Close file:
            fclose($f);
            error_log("Connection closed at $bytes_sent/$fsize");
            exit();
        }
    }

    // Close file:
    fclose($f);
    // Reset ini values:
    ignore_user_abort($user_abort);
    return $bytes_sent;
}
After you have your new shiny class myNewSuperDownloadHandlerClass { ... } ready, make sure you only serve downloads through the filedownload.php described there, or if you have done myNewSuperDownloadHandlerClass() well, use that; just make sure that readfileWhileConnected() is used for every download that requires connection status polling.
You can easily add a callback to be triggered if the user closes the connection; there are only 2 exit points here. (I have seen many functions that have return false; return true; return null; return false; return true; scattered all over.)
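A hypothetical usage sketch (the path and headers are illustrative, not from the linked answer): because the function exit()s on disconnect, any code after the call only runs for completed downloads, which is where the point would be counted:
<?php
// Sketch: serve a file and count a point only when the client received all of it.
$filename = '/path/outside/public_html/bigfile.zip'; // placeholder path
header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="' . basename($filename) . '"');
header('Content-Length: ' . filesize($filename));

$sent = readfileWhileConnected($filename);
// Reaching this line means the whole file was sent; record the completed download here.
error_log("Download completed: $sent bytes");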
