Delete log files automatically in PHP [duplicate] - php

This question already has answers here (closed 10 years ago).
Possible duplicate of: Automatically delete files from webserver
I'm logging some XML content to a text file using file_put_contents() in case I need it for debugging.
How do I delete this file after a certain amount of time?
I've got a restrictive .htaccess in the log folder, but I'd rather not leave the info (it will contain customers' addresses, etc.) up on the web for long.

Well, while agreeing with everyone else that you're using the wrong tool for the job, your question is pretty straightforward, so here you go:
write a PHP script that will run from the command line
use a tool like cron or Windows Scheduled Tasks
schedule it to run every minute/five minutes/etc.
Your script would be pretty straightforward:
<?php
$dh = opendir(PATH_TO_DIRECTORY);
while( ($file = readdir($dh)) !== false ) {
    if( !preg_match('/^[.]/', $file) ) { // skip dotfiles; adjust the filter to taste
        $path = PATH_TO_DIRECTORY . DIRECTORY_SEPARATOR . $file;
        if( filemtime($path) < strtotime('-1 hour') ) { // check how long it's been around
            unlink($path); // remove it
        }
    }
}
closedir($dh); // release the directory handle when done
You could also use find if you're working on Linux, but I see that @Rawkode posted that while I was writing this, so I'll leave you with his elegant answer for that solution.

You should be using the PHP error_log() function, which respects the php.ini settings.
error_log('Send to PHP error log');
error_log('Sent to my custom log', 3, "/var/tmp/my-errors.log");
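For completeness, a sketch of pointing the default error-log destination at your own file at runtime rather than in php.ini (the log path here is an example, not something the answer above prescribes):

```php
<?php
// Direct PHP's default error log to a custom file for the rest of this request.
// "/var/tmp/my-errors.log" is an example path; use a directory the web server can write to.
ini_set('log_errors', '1');
ini_set('error_log', '/var/tmp/my-errors.log');

error_log('Goes to /var/tmp/my-errors.log via the default destination');
```

With this in place, plain one-argument error_log() calls land in your custom file without repeating the path at every call site.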

You can use a built-in function – filectime – to check the change time of your log files (note that on Unix this is the inode change time, not a true creation date; on Windows it is the creation time) and then delete those that are old enough.
This code searches through the logs directory and deletes logs that are two weeks old.
$logs = opendir('logs');
while (($log = readdir($logs)) !== false)
{
    if ($log == '.' || $log == '..')
        continue;
    if (filectime('logs/'.$log) <= time() - 14 * 24 * 60 * 60)
    {
        unlink('logs/'.$log);
    }
}
closedir($logs);

You should be handling your logging better, but to answer your question I'd use the *nix find command.
find /path/to/files -type f -mtime +5 -delete
This will delete all regular files under /path/to/files that haven't been modified in five days. Adjust to your own needs.
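If you'd rather express the same policy in PHP (for example, on a host without shell access), a minimal sketch of the equivalent sweep; the function name, directory path, and five-day cutoff are my own examples:

```php
<?php
// Delete regular files in $dir whose modification time is older than $seconds.
// Rough PHP equivalent of: find /path/to/files -type f -mtime +5 -delete
function delete_older_than(string $dir, int $seconds): int
{
    $cutoff = time() - $seconds;
    $removed = 0;
    foreach (glob($dir . '/*') ?: [] as $path) {
        if (is_file($path) && filemtime($path) < $cutoff) {
            unlink($path);
            $removed++;
        }
    }
    return $removed; // how many files were deleted
}

// Example: delete_older_than('/path/to/files', 5 * 86400);
```

Note that glob() skips dotfiles by default, which conveniently leaves a restrictive .htaccess in the folder untouched.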

There are two ways you can work it out:
If you write the log to only one file, you can empty the file with something like this:
<?php file_put_contents($logpath, "");
If you generate many files, you can write a cleanup function like this:
<?php
$logpath = "/tmp/path/to/log/";
$dirh = opendir($logpath);
while (($file = readdir($dirh)) !== false) {
    if (in_array($file, array('.', '..'))) continue;
    unlink($logpath . $file);
}
closedir($dirh);

You can set up a server scheduled task (better known as a cron job on *nix systems) to run on a regular basis and delete the leftover log files. The task would execute a PHP script that does the job.
See the Cron Job Overview.
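A minimal skeleton for such a cron-driven script might look like this; the filename, schedule, log path, and one-hour cutoff are all examples of my own, not anything the answer prescribes:

```php
<?php
// cleanup.php -- intended to be run from cron, e.g. (example schedule):
//   */5 * * * * php /path/to/cleanup.php
// Refuse to run via the web SAPI so a stray HTTP request can't trigger a cleanup.
if (PHP_SAPI !== 'cli') {
    exit("cleanup.php must be run from the command line\n");
}

// Delete logs older than an hour; glob() may return false on error, hence the ?: [].
foreach (glob('/path/to/logs/*.log') ?: [] as $path) {
    if (filemtime($path) < strtotime('-1 hour')) {
        unlink($path);
    }
}
```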

Related

Temporary files that hang around after script execution

Can I create temporary files with PHP that expire and get deleted after a predefined time set at file creation, rather than when the script finishes execution? My host doesn't allow cron jobs, so PHP only, if possible.
Without access to cron you only have one option -- manage file cleanup on your own.
This is, in fact, what the PHP session handler does -- it creates a file of session data. Then, when a session starts, there is a small chance that PHP will go through and remove expired files. (By default it is a 1/100 chance, per session.gc_probability and session.gc_divisor.)
Your best bet is to create a directory to store your temp files in and then use a similar process.
This might give you some ideas: Session Configuration
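The session-style lottery described above can be sketched like this; the function name and default odds are mine (the 1-in-100 default mirrors session.gc_probability / session.gc_divisor), and you would call it from a commonly requested page:

```php
<?php
// On each request, run the cleanup with probability $probability / $divisor,
// the way PHP's session garbage collector does (1/100 by default).
function maybe_cleanup(string $dir, int $maxAge, int $probability = 1, int $divisor = 100): bool
{
    if (mt_rand(1, $divisor) > $probability) {
        return false; // most requests skip the cleanup entirely
    }
    foreach (glob($dir . '/*') ?: [] as $path) {
        if (is_file($path) && filemtime($path) < time() - $maxAge) {
            unlink($path); // expired temp file
        }
    }
    return true;
}
```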
$days = 7; // example retention period
$deletetime = time() - $days * 86400; // do this calculation only once
$dir = '/path/to/dir';
if ($handle = opendir($dir)) {
    while (false !== ($file = readdir($handle))) {
        if ((filetype("$dir/$file") == 'file') && (filemtime("$dir/$file") < $deletetime)) {
            unlink("$dir/$file");
        }
    }
    closedir($handle);
}
Reference: webhostingtalk
Also, here is a similar question asked before: Auto delete files after x-time
Is this what you are looking for?
<?php
$fh = fopen('test.html', 'a');
fwrite($fh, '<h1>Hello world!</h1>');
fclose($fh);
unlink('test.html');
?>
http://www.php.net/manual/en/function.unlink.php

Automatically delete files based on date of creation

I am having an issue with automatically deleting files from a specific folder on my server.
I need to run an automatic delete every 31 minutes on a folder which stores incoming documents. These files will always be in *.pdf format.
I have found a similar issue on this site:
How to delete files from directory based on creation date in php?
However, my issue is with *.pdf files and I have never used PHP before; ideally I was looking for a .bat file, but if that's not possible it's no problem.
<?php
$dir = '/path/to/files';
if ($handle = opendir($dir)) {
    while (false !== ($file = readdir($handle))) {
        $path = $dir . '/' . $file; // filemtime()/unlink() need the full path
        $filelastmodified = filemtime($path);
        if ((time() - $filelastmodified) > 24*3600 && strtolower(substr($file, -4)) == ".pdf")
        {
            unlink($path);
        }
    }
    closedir($handle);
}
?>
The added condition checks whether the filename ends with ".pdf". You could run this as a cron job.
You might as well use shell commands instead: find with -name, -mtime and -exec (or -delete) does the same and saves the need for a PHP parser.

How to delete files from directory on specific time, daily?

My data.txt is generated each time someone visits my site. I would like to delete this file at a specific time, let's say 1:00 AM daily.
I found this script but I'm struggling to update the code :/
Any advice?
<?php
$path = dirname(__FILE__).'/files';
if ($handle = opendir($path)) {
    while (false !== ($file = readdir($handle))) {
        if ((time() - filectime($path.'/'.$file)) > 86400) { // 86400 = 60*60*24, i.e. older than a day
            if (preg_match('/\.txt$/i', $file)) {
                unlink($path.'/'.$file);
            }
        }
    }
    closedir($handle);
}
?>
The script you posted deletes .txt files in the given folder if they are older than a day, but the problem is that this test happens only once -- when you run the script.
What you need to do is run the script periodically. If you are running this on Linux you should probably add a cron job that executes it periodically, say once an hour or once daily.
If you're running this on Windows, there is a Task Scheduler that you could use to accomplish the same thing.
Use a task scheduler, such as cron, for this purpose. You can remove your file simply via a shell command:
rm /path/to/data.txt
So there's no need to write a PHP script for that.

Delete all files in directory automatically?

How does one, in PHP, delete all files in a directory every 24 hours without deleting the directory itself? It has to be in PHP, it cannot be a Cron Job.
Can you please provide an explanation behind your code as well? I am still learning PHP.
Thank you!
There is no way to do this in PHP without using PHP. Sorry.
Joking, but if you wanted to do this you would need some sort of task scheduler (like cron).
That is to say that you could program your personal computer to send the request to the server every 24 hours, but you would either have to do it manually or schedule the task locally.
My point being, you need cron, but it doesn't need to be running on the same host as the PHP files.
Without cron you'd have to add code like this to a commonly-requested page:
$scheduledclean = strtotime("midnight"); // 00:00:00 of the current day
$lastcleanfile = '/path/to/my/app/lastclean';
$lastcleantime = (file_exists($lastcleanfile)) ? filemtime($lastcleanfile) : 0;
$curtime = time();
if( ($curtime > $scheduledclean) && ($lastcleantime < $scheduledclean) ) {
touch($lastcleanfile); //touch first to prevent multiple executions
// file cleanup code here
}
On the first request to the page after midnight the cleanup will fire off, but the unlucky person that made the request will likely have a delay for their page to be served that is as long as the cleanup takes. You could mitigate this by running the cleanup as a backgrounded shell command like shell_exec('rm -rf /path/to/dir/* &');
I did something similar to this a long time ago. It's a terrible idea, but you can have a file which stores the last time your directory was cleared. Each time a user visits a relevant page, check this file in the PHP script (you could also check the modified time). If it is far enough into the past, update the file and run your delete script.
Downsides:
Not guaranteed to run every 24 hours (maybe you get no visitors one day)
Gives one user per day a longer wait
Ugly
As for deleting the files,
function delete_contents( $dirpath ) {
    $dir = opendir( $dirpath );
    if( $dir ) {
        while( ($entry = readdir( $dir )) !== false ) {
            if( $entry == '.' || $entry == '..' ) {
                continue; // skip self/parent entries or we recurse forever
            }
            $path = $dirpath . DIRECTORY_SEPARATOR . $entry; // readdir() returns bare names
            if( is_dir( $path ) ) {
                delete_contents( $path );
                rmdir( $path );
            } else {
                unlink( $path );
            }
        }
        closedir( $dir );
    }
}
BE VERY CAREFUL with that. On a crude server setup, delete_contents('/') will delete every file it can reach.
Make a PHP script that removes all files in the directory; look at the functions readdir() and unlink() to remove the files.
Set up a cron job to run the script automatically every 24 hours. How exactly you do this depends on your host. There are also web services you can use for this: http://www.google.nl/search?q=cronjobs+webservice
Good luck!
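The readdir()/unlink() loop suggested above can be sketched like this; the function name and return value are my own choices:

```php
<?php
// Remove every regular file in $dir, leaving the directory itself
// (and any subdirectories) in place. Returns how many files were removed.
function clear_directory(string $dir): int
{
    $removed = 0;
    if ($handle = opendir($dir)) {
        while (($file = readdir($handle)) !== false) {
            $path = $dir . DIRECTORY_SEPARATOR . $file; // readdir() returns bare names
            if (is_file($path)) {
                unlink($path);
                $removed++;
            }
        }
        closedir($handle);
    }
    return $removed;
}
```

The is_file() check also skips the '.' and '..' entries that readdir() always returns, so no explicit filter is needed.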

PHP script that sends an email listing file changes that have happened in a directory/subdirectories

I have a directory with a number of subdirectories that users add files to via FTP. I'm trying to develop a PHP script (which I will run as a cron job) that will check the directory and its subdirectories for any changes in the files, file sizes or modification dates. I've searched long and hard and have so far found only one script that works, which I've tried to modify (original located here). However, it only seems to send the first email notification showing me what is listed in the directories. It also creates a text file of the directory and subdirectory contents, but when the script runs a second time it seems to fall over, and I get an email with no contents.
Does anyone out there know a simple way of doing this in PHP? The script I found is pretty complex and I've tried for hours to debug it, with no success.
Thanks in advance!
Here you go:
$log = '/path/to/your/log.js';
$path = '/path/to/your/dir/with/files/';

$files = new RecursiveIteratorIterator(new RecursiveDirectoryIterator($path), RecursiveIteratorIterator::SELF_FIRST);
$result = array();

foreach ($files as $file)
{
    if (is_file($file = strval($file)) === true)
    {
        $result[$file] = sprintf('%u|%u', filesize($file), filemtime($file));
    }
}

if (is_file($log) !== true)
{
    file_put_contents($log, json_encode($result), LOCK_EX);
}

// are there any differences?
if (count($diff = array_diff($result, json_decode(file_get_contents($log), true))) > 0)
{
    // send email with mail(), SwiftMailer, PHPMailer, ...
    $email = 'The following files have changed:' . "\n" . implode("\n", array_keys($diff));

    // update the log file with the new file info
    file_put_contents($log, json_encode($result), LOCK_EX);
}
I am assuming you know how to send an e-mail. Also, please keep in mind that the $log file should be kept outside the $path you want to monitor, for obvious reasons of course.
After reading your question a second time, I noticed you mentioned you want to check whether the files change. I'm only doing this check with the size and modification date; if you really want to check whether the file contents are different, I suggest you use a hash of the file, so this:
$result[$file] = sprintf('%u|%u', filesize($file), filemtime($file));
Becomes this:
$result[$file] = sprintf('%u|%u|%s', filesize($file), filemtime($file), md5_file($file));
// or
$result[$file] = sprintf('%u|%u|%s', filesize($file), filemtime($file), sha1_file($file));
But bear in mind that this will be much more expensive, since the hash functions have to open and read the entire contents of your 1-5 MB CSV files.
I like sfFinder so much that I wrote my own adaptation:
http://www.symfony-project.org/cookbook/1_0/en/finder
https://github.com/homer6/altumo/blob/master/source/php/Utils/Finder.php
Simple to use, works well.
However, for your use, depending on the size of the files, I'd put everything in a git repository. It's easy to track then.
HTH
