Automatically delete files based on date of creation - PHP

I am having an issue with automatically deleting files from a specific folder on my server.
I need to run an automatic delete every 31 minutes on a folder which stores incoming documents. These files will always be in *.pdf format.
I have found a similar issue on this site:
How to delete files from directory based on creation date in php?
However, my issue is with *.pdf files and I have never used PHP before. Ideally I was looking for a .bat file, but if that's not possible it's no problem.

<?php
$path = '/path/to/files';
if ($handle = opendir($path)) {
    while (false !== ($file = readdir($handle))) {
        // filemtime() and unlink() need the full path, not just the filename
        $filepath = $path . '/' . $file;
        $filelastmodified = filemtime($filepath);
        // older than 24 hours and ending in ".pdf"
        if ((time() - $filelastmodified) > 24 * 3600 && strtolower(substr($file, -4)) == ".pdf") {
            unlink($filepath);
        }
    }
    closedir($handle);
}
?>
This added condition checks whether the filename ends with ".pdf". You could run this as a cron job.
You might as well use shell commands instead: find with -name, -mtime and -exec (or -delete) does the same and saves the need for a PHP parser.
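That find approach, sketched below (GNU find and GNU touch assumed; the demo directory stands in for the real upload folder, and the crontab line is a hypothetical schedule):

```shell
# Demo: delete *.pdf files last modified more than 24 hours ago
dir=$(mktemp -d)                      # stand-in for the real upload folder
touch -d "2 days ago" "$dir/old.pdf"  # stale PDF: should be removed
touch "$dir/new.pdf"                  # fresh PDF: should survive
find "$dir" -type f -name '*.pdf' -mtime +0 -delete   # -mtime +0 = at least one full day old
ls "$dir"

# In a crontab, the same cleanup every 31 minutes (hypothetical path):
# */31 * * * * find /path/to/files -type f -name '*.pdf' -mtime +0 -delete
```

Note that -delete is safer than -exec rm because it never passes filenames through a shell.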

Related

PHP, delete files in a specific directory with a specific extension as long as it's over (10) minutes old

I'm trying to create a PHP script that looks through the files of a directory for files with a certain file extension (they all have uniquely generated file names), gets the time each of those files was last modified, and then deletes the ones older than 10 minutes.
I have something that kind of works: it checks through a directory, and if a file is older than x amount of time (20 seconds, to test if it works) it deletes the file. The issue is that it deletes files like .htaccess which I need.
This is what I have currently:
<?php
$path = './test/';
if ($handle = opendir($path)) {
    while (false !== ($file = readdir($handle))) {
        $filelastmodified = filemtime($path . $file);
        if ((time() - $filelastmodified) > 20) // 20 seconds
        {
            unlink($path . $file);
        }
    }
    closedir($handle);
}
?>
That basically just deletes everything in that directory as long as it's older than 20 seconds, but is there any way to filter by file extension so that files like .htaccess don't get deleted along with the files meant to be deleted?
Technically filemtime() checks when the file was last modified, not created, so you can actually use find to duplicate that exact behavior:
$path = "/path/to/dir";
$extension = "whatever";
$ret = shell_exec("find {$path} -type f -name '*.{$extension}' -mmin +10 -exec rm -f {} \;");
// Equal to
// $ret = shell_exec("find {$path} -type f -name '*.{$extension}' -mmin +10 -delete");
Unfortunately find doesn't have the ability to check exactly when a file was created, but then again you're not looking at that now anyway.
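If you'd rather stay in PHP, glob() can do the extension filtering for you; a sketch, where the path, extension and age are all placeholders:

```php
<?php
// Sketch: delete files with a given extension once they are older than $maxAge seconds.
// Files like .htaccess are never touched, because glob() only matches *.$extension.
function delete_old_by_extension(string $path, string $extension, int $maxAge): void
{
    $cutoff = time() - $maxAge;
    foreach (glob(rtrim($path, '/') . '/*.' . $extension) as $file) {
        if (is_file($file) && filemtime($file) < $cutoff) {
            unlink($file);
        }
    }
}
```

The function name here is made up; the point is that glob()'s pattern replaces the manual substr() check on every directory entry.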

Temporary files that hang around after script execution

Can I create temporary files with php that expire and get deleted after a predefined time set during file creation, and not when the script finishes execution? My host doesn't allow cron jobs, so php only, if possible.
Without access to cron you only have one option: manage file cleanup on your own.
This is, in fact, what the PHP session handler does: it creates a file of session data. Then, when PHP starts, there is a small chance that it will go through and remove expired files. (IIRC, there is a 1/100 chance that it will.)
Your best bet is to create a directory to store your temp files in and then use a similar process.
This might give you some ideas: Session Configuration
$days = 7; # example retention period
$deletetime = time() - $days * 86400; # only do this calculation once
$dir = '/path/to/dir';
if ($handle = opendir($dir)) {
    while (false !== ($file = readdir($handle))) {
        if ((filetype("$dir/$file") == 'file') && (filemtime("$dir/$file") < $deletetime)) {
            unlink("$dir/$file");
        }
    }
    closedir($handle);
}
reference - webhostingtalk
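The session-style 1-in-100 trigger described above might be sketched like this (the directory, age and odds are all assumptions):

```php
<?php
// Sketch: on each request, run the cleanup with roughly 1/$odds probability,
// mimicking how PHP's session garbage collector is triggered.
function maybe_cleanup(string $dir, int $maxAgeSeconds, int $odds = 100): void
{
    if (random_int(1, $odds) !== 1) {
        return; // most requests skip the cleanup entirely
    }
    $cutoff = time() - $maxAgeSeconds;
    foreach (glob($dir . '/*') as $path) {
        if (is_file($path) && filemtime($path) < $cutoff) {
            unlink($path);
        }
    }
}
```

Calling maybe_cleanup() at the top of each page keeps the average cost per request low while still expiring files without cron.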
Also, here is a similar question asked before: Auto delete files after x-time
Is this what you are looking for?
<?php
$fh = fopen('test.html', 'a');
fwrite($fh, '<h1>Hello world!</h1>');
fclose($fh);
unlink('test.html');
?>
http://www.php.net/manual/en/function.unlink.php

PHP unlink Delay

I'm having some issue with the unlink function.
I have a page that, when refreshed, searches a directory for newly added files. The user may choose to manage the files and can also delete any file. However, when the user deletes a file, there is almost a 5 second delay before the actual file is deleted from the server directory. In the meantime, if the user refreshes the browser, the file that was supposed to be deleted re-appears as a new file. The issue with this is that if the user deletes this file again, the delete fails because, thanks to that initial delay, the file no longer exists...
Any thoughts on this? It's driving me crazy and not sure how to remedy this situation...
One solution could be to create a new file when you call unlink(), and name the new file $original_filename."_deleted". Then when you list the files, exclude any ending with "_deleted". Then you just need to worry about cleaning up all the "_deleted" files every so often with a cron job.
function my_unlink($filename) {
    touch($filename . '_deleted');
    unlink($filename);
}

function list_files() {
    if ($handle = opendir('.')) {
        while (false !== ($entry = readdir($handle))) {
            if ($entry != "." && $entry != ".." && !preg_match('/_deleted$/', $entry)) {
                echo "$entry\n";
            }
        }
        closedir($handle);
    }
}
clearstatcache(true, $file) still might be worth a try, though I am pessimistic after reading the documentation on unlink().
Maybe there are too many files in the directory, and using several directories might help (using directories with the first two chars of the file name).
My hope, however, is that the listing overview page is cached. Using
header("Cache-Control: no-cache, must-revalidate");
or so might help.

Delete log files automatically in PHP [duplicate]

This question already has answers here:
Closed 10 years ago.
Possible Duplicate:
Automatically delete files from webserver
I'm logging some XML content to a text file using file_put_contents() in case I need it for debugging.
How do I delete this file after a certain amount of time?
I've got a restrictive .htaccess in the log folder, but I'd rather not leave the info (will have customer's addresses, etc) up on the web for long.
Well, while agreeing with everyone else that you're using the wrong tool for the job, your question is pretty straightforward. So here you go:
write a PHP script that will run from the command line
use a tool like cron or windows scheduled task
invoke the cron every minute/five minutes/etc
Your script would be pretty straightforward:
<?php
$dh = opendir(PATH_TO_DIRECTORY);
while (($file = readdir($dh)) !== false) {
    if (!preg_match('/^[.]/', $file)) { // do some sort of filtering on files
        $path = PATH_TO_DIRECTORY . DIRECTORY_SEPARATOR . $file;
        if (filemtime($path) < strtotime('-1 hour')) { // check how long it's been around
            unlink($path); // remove it
        }
    }
}
You could also use find if you're working in Linux, but I see that @Rawkode posted that while I was writing this, so I'll leave you with his elegant answer for that solution.
You should be using the PHP error_log() function which will respect the php.ini settings.
error_log('Send to PHP error log');
error_log('Sent to my custom log', 3, "/var/tmp/my-errors.log");
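For error_log() to land where you expect, the relevant php.ini directives look like this (the values here are illustrative, not defaults):

```ini
; php.ini (illustrative values)
log_errors = On
error_log = /var/log/php_errors.log
```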
You can use a built-in function, filectime, to track the change time of your log files (note that on Unix this is the inode change time, not the creation time; only on Windows does it return the creation time) and then delete those that are old enough to be deleted.
This code searches through the logs directory and deletes logs that are 2 weeks old.
$logs = opendir('logs');
while (($log = readdir($logs)) !== false)
{
    if ($log == '.' || $log == '..')
        continue;

    if (filectime('logs/' . $log) <= time() - 14 * 24 * 60 * 60)
    {
        unlink('logs/' . $log);
    }
}
closedir($logs);
You should be handling your logging better, but to answer your question I'd use the *nix find command.
find /path/to/files* -mtime +5 -delete
This will delete all files that haven't been modified in five days. Adjust to your own needs.
There are two ways you can work it out:
if you write the log to only one file, you can empty the file using something like this:
<?php file_put_contents($logpath, "");
if you generate many files, you can write cleanup function like this:
<?php
$logpath = "/tmp/path/to/log/";
$dirh = opendir($logpath);
while (($file = readdir($dirh)) !== false) {
    if (in_array($file, array('.', '..'))) continue;
    unlink($logpath . $file);
}
closedir($dirh);
You can set up a server scheduled task (better known as a cron job under *nix systems) to run on a regular basis and delete the leftover log files. The task would execute a PHP script that does the job.
See Cron Job Overview

How to generate sitemap for HTML or PHP that are unlinked and have no index file?

Say I have a directory with 100 files in it. Some of the files are PHP and the others are HTML. None of them are linked together, and there is no index file. It's a shared hosting cPanel environment. My question: is there a way, via PHP or otherwise, to automatically detect these files and generate a sitemap in HTML, XML or another format? Thanks very much for your help on this one.
Untested, but here are a couple of scripts which I think may solve your issue:
http://apptools.com/phptools/dynamicsitemap.php
http://yoast.com/xml-sitemap-php-script/
If you want a proper sitemap (how the files link to one another) then there are some libraries available for that mentioned by others. If you just want to list them, then just use the opendir and readdir functions:
$directory = 'your directory';
$array_items = array();
if ($handle = opendir($directory)) {
    while (false !== ($file = readdir($handle))) {
        if ($file != "." && $file != "..") {
            if (is_dir($directory . '/' . $file)) {
                continue;
            }
            $array_items[] = $file;
        }
    }
    closedir($handle);
}
You can then loop through $array_items and output XML or HTML. You can also make this recursive by turning this into a function and, in the
if (is_dir($directory . '/' . $file)) {
    continue;
}
section, calling the function on the subdirectory instead of skipping it.
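A recursive variant of that listing, as suggested, might look like this sketch (it uses the SPL iterators instead of recursing by hand; the function name is made up):

```php
<?php
// Sketch: collect every file under $directory, descending into subdirectories
// and skipping the "." and ".." entries automatically.
function list_files_recursive(string $directory): array
{
    $items = array();
    $iterator = new RecursiveIteratorIterator(
        new RecursiveDirectoryIterator($directory, FilesystemIterator::SKIP_DOTS)
    );
    foreach ($iterator as $fileInfo) {
        if ($fileInfo->isFile()) {
            $items[] = $fileInfo->getPathname();
        }
    }
    return $items;
}
```

From the returned array you can emit the sitemap in whatever format you like, HTML list or XML urlset.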
