I've written some code to try to delete the cache, and I got it working on my own computer. However, I've run into a few problems running it on my website.
Here is the code:
<?php
function getFiles($dir, $hours) {
    // read files
    $files = glob($dir . '/*');
    for ($i = 0; $i < count($files); $i++) {
        // get last modified time
        $lastchanged = (time() - filemtime($files[$i])) / 3600;
        echo $files[$i] . " modified: " . round($lastchanged) . " hours ago<br/>";
        // loop into folder if folder
        if (is_dir($files[$i]) == true) {
            $listoffiles = getFiles($files[$i], $hours);
            if (strlen($listoffiles) == 0) {
                rmdir($files[$i]);
            }
        }
        // delete file
        else {
            if ($lastchanged > $hours) {
                unlink($files[$i]);
            }
        }
    }
}
$dir = 'thumbs/cache';
echo getFiles($dir, 7 * 24);
?>
However, on the website it doesn't seem to detect when a folder only contains another folder, so it tries to delete it anyway and then causes an error saying the directory is not empty.
Can anyone see what I've done wrong? Other than this problem it seems to work all right on the site. I did try storing a list of all the folders, reading it backwards, and deleting everything in reverse order so it would get the empty folders first, but that didn't work either.
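For comparison, here is a rough sketch (not your exact code; the cleanDir name is just for illustration) of a common pattern: have the recursive function report how many entries it left behind, and only rmdir() a subfolder when that count is zero.
function cleanDir($dir, $hours) {
    $remaining = 0;
    // glob() can return false on error, so fall back to an empty array
    foreach ((glob($dir . '/*') ?: array()) as $path) {
        if (is_dir($path)) {
            // recurse first; only remove the folder if it ended up empty
            if (cleanDir($path, $hours) === 0) {
                rmdir($path);
            } else {
                $remaining++;
            }
        } elseif ((time() - filemtime($path)) / 3600 > $hours) {
            unlink($path);
        } else {
            $remaining++;
        }
    }
    return $remaining;
}
cleanDir('thumbs/cache', 7 * 24);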
I'm new to using OPcache on PHP 8 and I have some questions. My folder structure looks like this:
https://i.stack.imgur.com/vb93u.png
Within each folder is exactly the same thing: the structure of my website.
Why does OPcache generate multiple folders with the same content?
What is the best way to keep only the most recent folder and delete the others? Is there a check that can be done every so often or a setting that overwrites older files with new ones?
I'm fast approaching the file limit with my hosting and need to clear up some space.
I've read the docs but I don't have a lot of knowledge working with servers so any help is greatly appreciated!
Oh and these are the settings in my php.ini:
zend_extension=opcache.so;
opcache.enable=1;
opcache.memory_consumption=32;
opcache.interned_strings_buffer=8;
opcache.max_accelerated_files=3000;
opcache.revalidate_freq=180;
opcache.fast_shutdown=0;
opcache.enable_cli=0;
opcache.revalidate_path=0;
opcache.validate_timestamps=1;
opcache.max_file_size=0;
opcache.file_cache=/mywebsitepath/.opcache;
opcache.file_cache_only=1;
Just in case anyone else has the same problem, this is what I ended up doing. First I tried to use the WordPress cron manager, but I was having issues getting a simple function to work. Instead, my hosting lets me create a cron job and point it at a PHP file, so I went that route. Here are the contents of the cron-jobs.php file, which I put in my OPcache folder. Basically it sorts the folders by date modified, then deletes the old ones while keeping the freshest one. If anyone has suggestions for improvements, be my guest!
function opcache_clean($dir) {
    $folders = array();
    foreach (scandir($dir) as $file) {
        // we only want the folders, not files (or "." / "..")
        if (strpos($file, '.') === false) {
            $folders[$file] = filemtime($dir . '/' . $file);
        }
    }
    // only delete the old folders if there is more than one
    if (count($folders) > 1) {
        arsort($folders);
        $folders = array_keys($folders);
        // keep the first folder (most recent directory at index 0)
        $deletions = array_slice($folders, 1);
        foreach ($deletions as $delete) {
            echo "deleting $delete <br>";
            // use the full path so the cron job's working directory doesn't matter
            system("rm -rf " . escapeshellarg($dir . '/' . $delete));
        }
    } else {
        echo "No folders to delete!";
    }
}
// clean the current directory
opcache_clean(dirname(__FILE__));
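If exec()/system() happen to be disabled on the host, a pure-PHP removal can replace the system("rm -rf ...") call above. Here is a minimal sketch (the remove_tree name is just for illustration):
function remove_tree($dir) {
    $items = new RecursiveIteratorIterator(
        new RecursiveDirectoryIterator($dir, RecursiveDirectoryIterator::SKIP_DOTS),
        RecursiveIteratorIterator::CHILD_FIRST
    );
    foreach ($items as $item) {
        // delete children first, then their containing directories
        $item->isDir() ? rmdir($item->getPathname()) : unlink($item->getPathname());
    }
    rmdir($dir);
}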
My background is in Node.js and I'm not very used to PHP, which is why I'm asking this question.
The issue is that I have certain files and folders whose names start with specific letters and numbers, as you can see in the screenshot below.
I'm still learning PHP, and I put together some code that I grabbed from internet resources; my attempt is below.
I want to delete all folders whose names contain "exp_" and all files whose names start with numbers like "xx-xx-xx".
I created delete.php. When I call this file in the browser, I want it to delete all the files and folders for the No. 1 case.
All these folders and files are generated every day, which is why I want to clean them up.
<?php
$path = "test";
if(!unlink($path)){
echo "File has not deleted yet.";
} else {
echo "Successfully deleted!";
}
?>
Is there any solution to this issue? Thanks in advance for any ideas and suggestions.
You can do something like this:
$files = new DirectoryIterator(__DIR__);
foreach ($files as $file) {
    $name = $file->getFilename();
    // folders whose names contain "exp_"
    if ($file->isDir() && strpos($name, 'exp_') !== false) {
        rmdir($file->getPathname()); // note: rmdir() only removes empty directories
    }
    // files whose names start with today's date (dd-mm-yyyy)
    elseif ($file->isFile() && strpos($name, date('d-m-Y')) === 0) {
        unlink($file->getPathname());
    }
}
This deletes all folders whose names contain exp_ and all files whose names start with today's date.
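Note that rmdir() only removes empty directories, so if anything is still inside the exp_ folders you will need a recursive delete. A minimal sketch (the removeDir name is just for illustration):
function removeDir($dir) {
    foreach (array_diff(scandir($dir), array('.', '..')) as $entry) {
        $path = $dir . DIRECTORY_SEPARATOR . $entry;
        is_dir($path) ? removeDir($path) : unlink($path);
    }
    rmdir($dir);
}
// example: remove every folder in the script's directory whose name contains "exp_"
foreach (glob(__DIR__ . '/*exp_*', GLOB_ONLYDIR) as $dir) {
    removeDir($dir);
}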
My environment is: Windows, MsSQL and PHP 5.4.
My scenario:
I'm doing a small shell script that creates a full backup from my wanted database to a temp folder and then moves it to a new location.
The backup goes fine and the file is created in my temp folder. Then I rename (move) it to the second folder; sometimes it goes OK, sometimes it cannot find the source file.
Of course at this point I know that I could skip the temporary location altogether, but the actual problem of not finding the file bothers me. Why is it so random, and might it also affect other file functions I've written before this one? Also, I need to be able to control how and when the files move to the destination.
The base code is as simple as it should be (although this is a simplified version of my actual code, since I doubt anyone would be interested in my error handling/logging conditions):
$query = "use test; backup database test to disk '//server01/temp/backups/file.bak', COMPRESSION;";
if($SQLClass->query($query)) {
$source="////server01//temp//backups//file.bak";
$destination="////server02//storage//backups//file.bak";
if(!rename($source , $destination)) {
//handleError is just a class function of mine that logs and outputs errors.
$this->handleError("Moving {$source} to {$destination} failed.");
}
}
else {
die('backup failed');
}
What I have tried is:
I added a file_exists() check before the rename, and it can't find the source file either whenever rename() can't.
As the file can't be found, copy() and unlink() will not work either.
Tried clearstatcache().
Tried sleep(10) after the SQL backup completes.
None of these helped at all. Google and I seem to be out of ideas on what to do or try next. Of course I could do some shell_exec'ing, but that wouldn't remove my worries about the earlier code I've written.
I only noticed this problem when I tried to run the command multiple times in a row. Is there some sort of cache for filenames that clearstatcache() won't touch? It seems to be related to some sort of ghost-file phenomenon, where PHP is late to refresh the file system contents or some such.
I would appreciate any ideas on what to try next and if you read this far thank you :).
You may try calling the system's copy command.
I once had a problem like yours (on a Linux box) when I had to copy files between two NFS shares. It just failed from time to time for no visible reason. After I switched to cp (the analogue of Windows copy), the problem went away.
Surely it is not perfect, but it worked for me.
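On Windows, that could look something like the sketch below; the /Y flag and the UNC paths are assumptions based on the setup described in the question.
$source      = '\\\\server01\\temp\\backups\\file.bak';
$destination = '\\\\server02\\storage\\backups\\file.bak';
// call the OS copy command instead of rename()
exec('copy /Y ' . escapeshellarg($source) . ' ' . escapeshellarg($destination), $output, $status);
if ($status !== 0) {
    // a non-zero exit code means the copy failed; log or retry here
}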
It might be cache-related, or the SQL Server process has not yet released the file.
The server dumps the backup into another temporary file first, and only then moves it to your temp folder.
While the file is being moved, it might be inaccessible to other processes.
First I would glob() all the files inside the temp dir when the error appears; maybe you'll notice the backup still isn't finished.
Also, have you tried implementing something like 10 retry iterations with some delay?
$attempts = 0;
$source = "////server01//temp//backups//file.bak";
$destination = "////server02//storage//backups//file.bak";
while (true) {
    if (rename($source, $destination)) {
        break; // moved successfully
    }
    if (++$attempts >= 10) {
        // handleError is just a class function of mine that logs and outputs errors.
        $this->handleError("Moving {$source} to {$destination} failed.");
        break;
    }
    sleep(20);
}
To bypass the issue:
Don't dump and move
Move then dump :-)
(Of course, your backup store would then be one run behind.)
$source="////server01//temp//backups//file.bak";
$destination="////server02//storage//backups//file.bak";
if(!rename($source , $destination)) {
//handleError is just a class function of mine that logs and outputs errors.
$this->handleError("Moving {$source} to {$destination} failed.");
}
$query = "use test; backup database test to disk '//server01/temp/backups/file.bak', COMPRESSION;";
if($SQLClass->query($query)) {
//done :-)
}
else {
die('backup failed');
}
Try
$source = '\\\\server01\\temp\\backups\\file.bak';
$destination = '\\\\server02\\storage\\backups\\file.bak';
// read the whole file into memory, then write it to the destination
$content = file_get_contents($source);
file_put_contents($destination, $content);
I'm currently building a very simple CMS for a friend's artist web page that will allow her to upload, edit, and delete images, assign categories to them, and post news about shows and so on.
I'm sure there is a very easy solution to this problem of mine, but my inexperience in programming has left me at a loss; so here goes.
The Problem
The problem occurs on a page where the user can delete an image that has been uploaded. Here is the snippet of code where the problem occurs:
// Assign selection to variables in memory...
$img_id = $data["img_name"];
// First, collect the file path to the image being deleted...
$rs = mysql_query("SELECT img_path FROM img_uploads WHERE img_id = '$img_id'") or die(mysql_error());
list($img_path) = mysql_fetch_row($rs);
// Then delete that row from the DB...
mysql_query("DELETE FROM img_uploads WHERE img_id = '$img_id'") or die(mysql_error());
// Now, using the file path collected earlier, delete that file from the server.
unlink($img_path);
// Quickly make sure that the file has been deleted by checking whether it still exists... if it does, return an error.
if (file_exists($img_path)) {
    $err[] = "ERROR - There was an error deleting the file! Please try again.";
    $_SESSION["errors"] = $err;
    header("Location: img_del.php?doDel=failed");
    exit();
}
// Scan the directory now that a file has been deleted to see if the dir is empty. If so, delete it. (No use in having empty folders!)
$file_types = array("gif", "jpg", "png"); // file types to scan for...
$path_parts = pathinfo($img_path);        // get the directory from the file path...
$dir = $path_parts["dirname"] . "/";      // assign it to a new variable...
$handle = opendir($dir);
$scan = scandir($dir);                    // now, scan that directory...
$image_found = FALSE;
for ($i = 0; $i < count($scan); $i++) {
    // only count real image files, not "." or ".."
    if ($scan[$i] != '.' && $scan[$i] != '..' && in_array(pathinfo($scan[$i], PATHINFO_EXTENSION), $file_types)) {
        $image_found = TRUE;
    }
}
closedir($handle);
if (!$image_found) {
    rmdir($dir);
}
I first delete the DB row containing the image info, then delete the file from the server. This works fine; however, I also want to check whether the directory is left empty after deleting that file. I check if the directory is empty using a loop, and if no image file is found, I run rmdir(). For some reason it keeps returning an error saying that the directory is not empty.
I've searched the web and this site for a solution but I've yet to find one. I'm sure it's out there, but I'm having trouble finding it, which is why I came here. What should I do?
Thanks in advance for any help submitted!
NOTE
I have also checked for hidden files and folders but no luck...
Here is a link to an image that pretty much sums up my problem in a nutshell
Are you sure PHP has permission to delete the file? Since you say you've checked for hidden files, this seems to be the only remaining option. CHMOD 0777 when in doubt (I'd never recommend this usually, but if you're deleting it anyway...), and make sure the folder has the proper owner to let php delete it.
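If permissions check out, it can also help to dump exactly what is still left in the folder right before the rmdir() call; a stray file such as Thumbs.db or a .htaccess would explain the "directory not empty" error. A quick debugging sketch, assuming $dir is the directory variable from your code:
$leftovers = array_diff(scandir($dir), array('.', '..'));
if (count($leftovers) === 0) {
    rmdir($dir);
} else {
    // whatever is printed here is what is blocking rmdir()
    print_r($leftovers);
}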
I'm having an extremely weird problem with a PHP script of mine.
I'm uploading a couple of files and having PHP put them all in one folder.
I've had trouble with random files being sent and random ones not being sent. So I debugged it and got a very weird result from the $_FILES[] array.
I tried it with 3 files.
$_FILES["addFile"]["name"] Holds the names of the 3 files.
You'd expect $_FILES["addFile"]["tmp_name"] to hold the 3 temporary names that PHP uses to copy the files, but it doesn't. It holds just one name; the other 2 are empty strings, which generate an error while uploading (which I suppress from being displayed).
This is very odd. I've tried multiple situations and it just keeps happening.
This must be something in my settings or perhaps even my code.
Here's my code:
$i = 0;
if (!empty($_FILES['addFile'])) {
    foreach ($_FILES['addFile'] as $addFile) {
        $fileToCopy = $_FILES["addFile"]["tmp_name"][$i];
        $fileName   = $_FILES["addFile"]["name"][$i];
        $i++;
        if (!empty($fileToCopy)) {
            $copyTo = $baseDir . "/" . $fileName;
            #copy($fileToCopy, $copyTo) or die("cannot copy " . $fileToCopy . " to " . $copyTo);
        }
    }
    exit(0);
}
Since tmp_name is empty, the if condition is false, so it skips the die() call.
Does anybody know what might be causing this?
Further info: I'm using Windows XP, running WAMP server. I've never had this problem before, and I can access all the folders from which I've tried to upload. I don't think Windows security settings are the issue.
I'm sorry, but it seems to me that you are trying to upload all 3 files under the same variable name. Is that right?
Unless the input is submitted as an array (e.g. addFile[]), the files will overwrite each other.
I think a better and cleaner way would be to use something like:
$i = 0;
// assumes the form fields are named addFile0, addFile1, addFile2, ...
while (isset($_FILES['addFile' . $i])) {
    $addFile = $_FILES['addFile' . $i];
    if (!empty($addFile['tmp_name'])) {
        move_uploaded_file($addFile['tmp_name'], 'YOUR DIRECTORY' . '/' . $addFile['name']);
    }
    $i++;
}
Relevant, but probably not going to help: move_uploaded_file() is a (slightly) better way to handle uploaded files than copy().
Are any of the files large? PHP has limits on the filesize and the time it can take to upload them ...
Better to send you here than attempt to write up what it says:
http://uk3.php.net/manual/en/features.file-upload.common-pitfalls.php
Your loop logic is incorrect. You are using a foreach loop on the file input name directly, which stores several properties that are of no interest to you ('type','size', etc).
You should get the file count from the first file and use it as the loop length:
if (!empty($_FILES['addFile']) && is_array($_FILES['addFile']['name'])) {
    $length = count($_FILES['addFile']['name']);
    for ($i = 0; $i < $length; $i++) {
        $result = move_uploaded_file($_FILES['addFile']['tmp_name'][$i], $baseDir . "/" . $_FILES['addFile']['name'][$i]);
        if ($result === false) {
            echo 'File upload failed. The following error has occurred: ' . $_FILES['addFile']['error'][$i];
        }
    }
}
Check the error code if you are still having problems, it should provide all the information you need to debug it.
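For reference, the values in $_FILES['addFile']['error'][$i] correspond to PHP's UPLOAD_ERR_* constants; a small helper like the sketch below (the message wording is just illustrative) makes them easier to read.
function uploadErrorMessage($code) {
    $messages = array(
        UPLOAD_ERR_OK         => 'No error.',
        UPLOAD_ERR_INI_SIZE   => 'File exceeds upload_max_filesize in php.ini.',
        UPLOAD_ERR_FORM_SIZE  => 'File exceeds MAX_FILE_SIZE from the form.',
        UPLOAD_ERR_PARTIAL    => 'File was only partially uploaded.',
        UPLOAD_ERR_NO_FILE    => 'No file was uploaded.',
        UPLOAD_ERR_NO_TMP_DIR => 'Missing a temporary folder.',
        UPLOAD_ERR_CANT_WRITE => 'Failed to write file to disk.',
        UPLOAD_ERR_EXTENSION  => 'A PHP extension stopped the upload.',
    );
    return isset($messages[$code]) ? $messages[$code] : 'Unknown error code ' . $code;
}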