How does one, in PHP, delete all files in a directory every 24 hours without deleting the directory itself? It has to be in PHP, it cannot be a Cron Job.
Can you please provide an explanation behind your code as well? I am still learning PHP.
Thank you!
There is no way to do this in PHP without using PHP. Sorry.
Joking, but if you wanted to do this you would need some sort of task scheduler (like cron).
That is to say, you could program your personal computer to send the request to the server every 24 hours, but you would either have to trigger it manually or schedule the task locally.
My point being, you need cron, but it doesn't need to be running on the same host as the PHP files.
Without cron you'd have to add code like this to a commonly-requested page:
$scheduledclean = strtotime("midnight"); // 00:00:00 of the current day
$lastcleanfile = '/path/to/my/app/lastclean';
$lastcleantime = (file_exists($lastcleanfile)) ? filemtime($lastcleanfile) : 0;
$curtime = time();
if( ($curtime > $scheduledclean) && ($lastcleantime < $scheduledclean) ) {
    touch($lastcleanfile); // touch first to prevent multiple executions
    // file cleanup code here
}
On the first request to the page after midnight the cleanup will fire, but the unlucky person who made that request will likely see their page delayed for as long as the cleanup takes. You could mitigate this by running the cleanup as a backgrounded shell command, e.g. shell_exec('rm -rf /path/to/dir/* &');
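For instance, the backgrounded variant might look like this (a sketch; the directory path is an example, and it assumes a Unix host where shell_exec() is enabled):

```php
<?php
// Sketch: run the cleanup detached so the visitor's request isn't blocked.
// The directory path is an example, not from the original post.
$dirToClean = '/path/to/my/app/cache';

// escapeshellarg() protects against odd characters in the path; the
// trailing "> /dev/null 2>&1 &" detaches the command so PHP returns at once.
shell_exec('rm -rf ' . escapeshellarg($dirToClean) . '/* > /dev/null 2>&1 &');
```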
I did something similar to this a long time ago. It's a terrible idea, but you can have a file which stores the last time your directory was cleared. Each time a user visits a relevant page, check this file in the PHP script (you could also check the modified time). If it is far enough into the past, update the file and run your delete script.
Downsides:
Not guaranteed to run every 24 hours (maybe you get no visitors one day)
Gives one user per day a longer wait
Ugly
As for deleting the files,
function delete_contents( $dirpath ) {
    $dir = opendir( $dirpath );
    if( $dir ) {
        while( ($s = readdir( $dir )) !== false ) {
            if( $s === '.' || $s === '..' ) {
                continue; // skip the special entries or we would recurse forever
            }
            $path = $dirpath . DIRECTORY_SEPARATOR . $s;
            if( is_dir( $path ) ) {
                delete_contents( $path );
                rmdir( $path );
            } else {
                unlink( $path );
            }
        }
        closedir( $dir );
    }
}
BE VERY CAREFUL with that. On a crude server setup, delete_contents('/') will delete every file.
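One way to reduce that risk (just a sketch; the base path is an illustrative assumption) is to resolve the path and refuse anything outside a known base directory before recursing:

```php
<?php
// Sketch: resolve the path and refuse anything outside a known base
// directory before recursing. $base is an illustrative assumption, and
// delete_contents() is the recursive helper shown above.
function safe_delete_contents( $dirpath ) {
    $base = '/path/to/my/app/tmp'; // the only tree we are allowed to touch

    // realpath() returns false for nonexistent paths; also insist the
    // resolved path lives under $base so "/" or "../" can never slip in.
    $real = realpath( $dirpath );
    if ( $real === false || strpos( $real, $base ) !== 0 ) {
        throw new InvalidArgumentException( "Refusing to clean: $dirpath" );
    }

    delete_contents( $real );
}
```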
Make a PHP script that removes all files in the directory; look at the functions readdir() and unlink() to remove the files.
Set up a cron job to run the script automatically every 24 hours. Exactly how you do this depends on your host. There are also websites you can use for this: http://www.google.nl/search?q=cronjobs+webservice
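As a sketch of what that script might look like (the directory path and the crontab line are examples to adapt to your host):

```php
<?php
// cleanup.php - sketch: empty a directory without removing the directory
// itself. The path and the crontab line are examples for illustration.
// A cron entry to run it nightly might look like:
//   0 0 * * * php /path/to/cleanup.php
function clear_directory( $path ) {
    if ( $handle = opendir( $path ) ) {
        while ( false !== ( $file = readdir( $handle ) ) ) {
            if ( $file === '.' || $file === '..' ) {
                continue; // never touch the directory entries themselves
            }
            $full = $path . '/' . $file;
            if ( is_file( $full ) ) {
                unlink( $full ); // remove regular files only
            }
        }
        closedir( $handle );
    }
}

// clear_directory( '/path/to/my/app/cache' ); // example path
```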
Good luck!
I have a PHP script I run every 5 minutes with cron from a folder. The folder contains several images, and I add more as time goes on.
I was wondering how I can make the script check, at the start, whether NEW files exist since the last time it ran. If new files exist the script should go on; if not, it should stop. I tried searching around but I can't find anything regarding PHP.
Does anyone know a quick solution to this problem?
If the new files are also created with a new timestamp, you can use filemtime() to fetch only files that were created/modified in a specified window of time.
Example:
$files = glob("folder/*.jpg");
$files = array_filter($files, function ($file) {
    return filemtime($file) >= time() - 5*60; /* modified in the last 5 minutes */
});
if ($files)
{
    // there are new files! $files is an array with their names
}
To make sure you won't miss any file, you might want to store the time from last run somewhere, so in case cron delays a second or two and new files were created precisely within that window, you won't lose track of them.
Update for comments:
Now, to store the time of the last check: that's up to you to decide. You could use a database, a file, some sort of environment variable, etc., but here is an example of something really simple, storing time() in a file:
$last = (int)file_get_contents('folder/timestamp.txt');
file_put_contents('folder/timestamp.txt', time());
$files = glob("folder/*.jpg");
$files = array_filter($files, function ($file) use ($last) {
    return filemtime($file) > $last;
});
if ($files)
{
    // there are new files! $files is an array with their names
}
Just make sure your PHP script can modify folder/timestamp.txt. With this script it will always process files modified since the last run, no matter how long ago that was.
Method:
store the current time in a file or database whenever the cron job executes.
each time the cron job starts, get its last execution time from that file or database.
count the files created after the last execution time.
if the count is greater than 0, process as usual; otherwise stop.
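The four steps above might be sketched like this (the stamp file and folder names are illustrative assumptions):

```php
<?php
// Sketch of the method above; 'last_run.txt' and the folder name are
// illustrative assumptions, not from the original post.
function count_new_files( $folder, $stampFile ) {
    // steps 1-2: read the previous run time, then record the current one
    $last = file_exists( $stampFile ) ? (int)file_get_contents( $stampFile ) : 0;
    file_put_contents( $stampFile, time() );

    // step 3: count files modified after the last execution time
    // (keep the stamp file outside $folder so it isn't counted itself)
    $new = 0;
    foreach ( glob( $folder . '/*' ) as $file ) {
        if ( is_file( $file ) && filemtime( $file ) > $last ) {
            $new++;
        }
    }
    return $new;
}

// step 4: only do the real work when something is new
// if ( count_new_files( 'uploads', 'last_run.txt' ) > 0 ) { /* process files */ }
```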
You could keep track of the time the script was last run and use filemtime to check if the file was updated or created after your last execution.
http://php.net/manual/en/function.filemtime.php
int filemtime ( string $filename )
Use filemtime() as follows; you will get the modification time formatted as a date.
$file_time = date("F d Y H:i:s", filemtime($filename));
My data.txt is generated each time someone visits my site. I would like to delete this file at a specific time, say 1:00 AM daily.
I found this script but I'm struggling to update the code. :/
Any advice?
<?php
$path = dirname(__FILE__).'/files';
if ($handle = opendir($path)) {
    while (false !== ($file = readdir($handle))) {
        if ((time()-filectime($path.'/'.$file)) < 86400) { // 86400 = 60*60*24
            if (preg_match('/\.txt$/i', $file)) {
                unlink($path.'/'.$file);
            }
        }
    }
}
?>
The script you posted deletes .txt files in the given folder based on their age. Note the comparison, though: with "<" it actually matches files created less than a day ago, so flip it to ">" if you want to delete files older than a day. The other problem is that this test only happens once: when you run the script.
What you need to do is run this script periodically. If you are running this on Linux you should probably add a cron job that executes this script periodically, say once an hour or once daily.
If you're running this on Windows, there is a task scheduler that you could use to accomplish the same thing.
Use a task scheduler, such as cron, for this purpose. You can remove your file with a simple shell command
rm /path/to/data.txt
So there's no need to write a PHP script for that.
Here's the code I am using:
<?php
$interval = 5 * 60;
$filename = "cache/".basename( rtrim( $_SERVER["REQUEST_URI"], '/' ) ).".cache";
if ( file_exists( $filename ) && (time() - $interval) < filemtime( $filename ) ) {
    readfile( $filename );
    exit();
}
ob_start();
include 'dynamicpage.php';
?>
<?php
// More page generation code goes here
$buff = ob_get_contents(); // Retrieve the content from the buffer
// Write the content of the buffer to the cache file
$file = fopen( $filename, "w" );
fwrite( $file, $buff );
fclose( $file );
ob_end_flush(); // Display the generated page.
?>
Currently, if the cached page is over 5 minutes old, this script would generate a new cache file to replace the old one. Is there any way I can have the old cache display first and have the new cached page be generated in the background? My host is weak sauce so it takes forever to wait for the page to complete loading.
I would set a crontab to process the page every 5 minutes, and always serve your users the cached page.
If you cannot set up a crontab, you can output a hidden iframe that loads the dynamic page, so the main page loads quickly while another instance regenerates the cache in the background (not a very neat solution, but it works).
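If your host runs PHP-FPM, a third option (a sketch, not a drop-in replacement for the question's code) is to send the stale cache to the browser first and rebuild the file after the connection has been closed:

```php
<?php
// Sketch: serve the stale cache immediately, then rebuild it after the
// response has gone out. Relies on fastcgi_finish_request(), which is
// only available under PHP-FPM; the file names mirror the question's
// code but are assumptions.
function serve_cached_then_refresh( $filename, $interval, $pageScript ) {
    $served = false;
    if ( file_exists( $filename ) ) {
        readfile( $filename ); // the visitor gets the old copy instantly
        $served = true;
        if ( time() - filemtime( $filename ) < $interval ) {
            return; // cache is still fresh, nothing to rebuild
        }
        if ( function_exists( 'fastcgi_finish_request' ) ) {
            fastcgi_finish_request(); // close the connection, keep running
        }
    }
    // the rebuild happens here, invisible to the visitor
    ob_start();
    include $pageScript;
    $page = ob_get_clean();
    file_put_contents( $filename, $page );
    if ( !$served ) {
        echo $page; // first visit: no cache existed yet, show the fresh page
    }
}
```

You would call it as serve_cached_then_refresh( $filename, 5 * 60, 'dynamicpage.php' ); in place of the cache check at the top of your script.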
Sounds like you need an asynchronous PHP request. Essentially, this triggers another script to run alongside your current one. @John has the right idea, but a crontab is only one way of running the caching process asynchronously. The downside of his solution is that it will run every 5 minutes, whether it's needed or not.
There are a number of libraries and other bits and bobs that will help you set up async PHP processing, but again, as @John says, it gets a little involved.
Here are a few resources to help with this:
php-parallel-processing (PHP Library)
Gearman (Native PHP Library)
Asynchronous PHP calls? (SO Question)
The Smarty Template Engine is a simple and small tool that has a lot of built in cache functionality without rules of a framework. http://www.smarty.net/
Possible Duplicate:
Automatically delete files from webserver
I'm logging some XML content to a text file using file_put_contents() in case I need it for debugging.
How do I delete this file after a certain amount of time?
I've got a restrictive .htaccess in the log folder, but I'd rather not leave the info (will have customer's addresses, etc) up on the web for long.
Well, while agreeing with everyone else that you're using the wrong tool for the job, your question is pretty straightforward. So here you go:
write a PHP script that will run from the command line
use a tool like cron or windows scheduled task
invoke the cron every minute/five minutes/etc
Your script would be pretty straightforward:
<?php
$dh = opendir(PATH_TO_DIRECTORY);
while( ($file = readdir($dh)) !== false ) {
    if( !preg_match('/^[.]/', $file) ) { // do some sort of filtering on files
        $path = PATH_TO_DIRECTORY . DIRECTORY_SEPARATOR . $file;
        if( filemtime($path) < strtotime('-1 hour') ) { // check how long it's been around
            unlink($path); // remove it
        }
    }
}
closedir($dh);
You could also use find if you're working in Linux, but I see that @Rawkode posted that while I was writing this, so I'll leave you with his elegant answer for that solution.
You should be using the PHP error_log() function which will respect the php.ini settings.
error_log('Send to PHP error log');
error_log('Sent to my custom log', 3, "/var/tmp/my-errors.log");
You can use the built-in filectime() function, which returns a file's inode change time on Unix (its creation time on Windows), to track how old your log files are and then delete those that are old enough.
This code searches through the logs directory and deletes logs that are 2 weeks old.
$logs = opendir('logs');
while (($log = readdir($logs)) !== false)
{
    if ($log == '.' || $log == '..')
        continue;

    if (filectime('logs/'.$log) <= time() - 14 * 24 * 60 * 60)
    {
        unlink('logs/'.$log);
    }
}
closedir($logs);
You should be handling your logging better, but to answer your question I'd use the *nix find command.
find /path/to/files* -mtime +5 -delete
This will delete all files that haven't been modified in five days. Adjust to your own needs.
There are two ways you can work it out:
if you write the log to only one file, you can empty the file using something like this:
<?php file_put_contents($logpath, "");
if you generate many files, you can write cleanup function like this:
<?php
$logpath = "/tmp/path/to/log/";
$dirh = opendir($logpath);
while (($file = readdir($dirh)) !== false) {
    if (in_array($file, array('.', '..'))) continue;
    unlink($logpath . $file);
}
closedir($dirh);
You can set up a server scheduled task (better known as a cron job on *nix systems) to run on a regular basis and delete the leftover log files. The task would execute a PHP script that does the job.
See Cron Job Overview
I have a temporary folder generated by my business application and wish for the documents within to be available for only around 30 minutes. I was tempted to build an index to keep track of when each file was created, but that seems a little silly for temporary files. They are not of much importance, but I would like them removed according to the time they were last modified.
What would I need to do this with my Linux server?
The function filemtime() will give you the last-modified time of a file. What you need to do is run your cron job every minute, check whether each file's age exceeds the threshold, and unlink() it as needed.
$time = 30; // in minutes, time until file deletion threshold
foreach (glob("app/temp/*.tmp") as $filename) {
    if (file_exists($filename)) {
        if (time() - filemtime($filename) > $time * 60) {
            unlink($filename);
        }
    }
}
This should be efficient enough for what you requested; change the cron interval to 10 minutes if you can accept less accuracy, in case there are many files.
You'd need nothing more than to call stat on the files and decide whether to unlink them or not based on their mtime.
Call this script every ten minutes or so from cron or anacron.
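A minimal sketch of that idea (the directory path in the usage comment is an example):

```php
<?php
// Sketch: stat() each file and unlink anything whose mtime is older than
// the threshold. The directory path in the usage comment is an example.
function purge_old_files( $dir, $maxAgeSeconds ) {
    foreach ( glob( $dir . '/*' ) as $file ) {
        $info = stat( $file ); // the mtime entry is all we care about
        if ( $info !== false && is_file( $file )
             && time() - $info['mtime'] > $maxAgeSeconds ) {
            unlink( $file );
        }
    }
}

// e.g. from a script that cron or anacron runs every ten minutes:
// purge_old_files( '/tmp/app', 30 * 60 );
```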
Or you could use tmpwatch, a program designed for this purpose.