Using rmdir() and RemoveDirectory(), can a deleted file be recovered? - php

If you use code like the below to delete a file that was created by your code, where does that file get deleted to? Can the file be recovered?
function RemoveDirectory($path) {
    foreach (glob("{$path}/*") as $file) {
        if (is_dir($file)) {
            RemoveDirectory($file);
        } else {
            unlink($file);
        }
    }
    rmdir($path);
}
Let's just say I called this function at the wrong time in the code, and I have regrets.

The file gets deleted from your hard drive. It doesn't get moved to a "Recycle Bin"; it gets removed completely. To recover the file after that, you will need some sort of undelete software, which may or may not work, depending on whether you have overwritten those sectors of the hard drive with other files since the delete took place. If you accidentally delete a file, take the drive offline immediately and boot from a different drive to prevent further writes from occurring.
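For next time, a less destructive pattern is to move files aside instead of unlinking them, so a bad call is reversible. A minimal sketch, assuming a writable trash location of your choosing (the $trashDir path below is made up):
// Hypothetical "soft delete": move the file into a trash directory
// instead of unlinking it, so it can be restored later if needed.
function trashFile($file, $trashDir = '/var/tmp/php-trash') {
    if (!is_dir($trashDir) && !mkdir($trashDir, 0700, true)) {
        return false; // could not create the trash directory
    }
    // Prefix with a timestamp to avoid name collisions inside the trash.
    $target = $trashDir . '/' . time() . '-' . basename($file);
    return rename($file, $target);
}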

Related

upload file and register it in a database

If there is any mistake in my writing, please excuse me. I'm not very good at translating from Spanish to English.
I am using pluploadQueue, following the example script (upload.php). I have tested it and it works fine.
In addition to that, I need to record information about the attached files in a database.
I have tried to add the statements that save the information in the database, but it only half works: it generates just one entry in my database table.
I notice that in the "upload" directory it generates a 1024 KB .part file for the first file, but nothing else appears there, although the widget visually tells me that it has uploaded all the files.
I need to save as many entries as there are files selected (many can be uploaded; that is the intent).
Could someone point me to a reference? I have tried to search for something about my problem but haven't been able to find anything.
I am using plupload version 2.3.9.
Correct me if I'm wrong, but my understanding is that the upload.php script is executed (or should be) once for each file to be uploaded, right?
My logic is:
1. Insert the basic "header" information into the DB the first time; in the following operations it is retrieved.
2. plupload upload routine.
3. Insert the file information into the DB.
This forms a master-detail relationship between the 2 tables.
For some reason, it gets stuck at point 2, and this part does not appear to run:
while (($file = readdir($dir)) !== false) {
    $tmpfilePath = $targetDir . DIRECTORY_SEPARATOR . $file;

    // If the temp file is the current file, proceed to the next one
    if ($tmpfilePath == "{$filePath}.part") {
        continue;
    }

    // Remove the temp file if it is older than the max age and is not the current file
    if (preg_match('/\.part$/', $file) && (filemtime($tmpfilePath) < time() - $maxFileAge)) {
        @unlink($tmpfilePath);
    }
}
closedir($dir);
}
Help me?
EDIT:
Okay, I found a typo. I corrected it and now it saves an entry for each file. What I can't figure out right now is why it generated hundreds of .part files in the directory instead of overwriting one, and this led to hundreds of records being stored, one for each generated .part file.
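For reference, the stock plupload upload.php handles chunked uploads by appending each chunk to the .part file, and only the request carrying the last chunk finishes the file. A sketch of gating the database insert on that condition (insertFileRecord() and $headerId are hypothetical placeholders, not part of plupload):
// $chunk / $chunks are sent by plupload with every request of a chunked upload.
$chunk = isset($_REQUEST['chunk']) ? intval($_REQUEST['chunk']) : 0;
$chunks = isset($_REQUEST['chunks']) ? intval($_REQUEST['chunks']) : 0;

// Only the last chunk (or a non-chunked upload) completes the file.
if (!$chunks || $chunk == $chunks - 1) {
    rename("{$filePath}.part", $filePath); // strip the temporary .part suffix
    insertFileRecord($headerId, basename($filePath)); // hypothetical DB helper
}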

Remove "Attachments" folder after PHPMailer email is sent

I've got a web form which has a unique upload folder for each user (using their PHP session_id() as the folder name), which works well. When the form is submitted (after error checking), PHPMailer is used to send the email and the attachments. This is also working well. However, after the email is sent, I would like to remove the uploads from the folder and then the folder itself (a sort of self-cleanup!). The files are removed as expected, but the folder remains (albeit empty). I wonder if the folder is somehow "still in use" so it doesn't get deleted, or something similar? This is the code:
// Empty the contents of the upload folder
if (is_dir($dir)) { // Target directory ($dir) is set above in the photos POST section
    // Check for any files inside the directory
    $files = glob($dir.'/*'); // Get all file names
    foreach ($files as $file) { // Iterate through the files
        if (is_file($file)) { // Check it's a file
            unlink($file); // Delete the file
        }
    }
    // Remove the upload folder
    rmdir($dir); // NOT WORKING? NEEDS SOME TROUBLESHOOTING...
}
Any other ideas on why this folder is remaining?
Ben
I would guess that your folders might contain hidden files (names starting with .) which the default glob pattern won't match, so try this (note that the {,.} brace pattern requires the GLOB_BRACE flag):
$files = glob($dir . '/{,.}*', GLOB_BRACE); // Get all file names, including hidden ones
foreach ($files as $file) { // Iterate through the files
    if (is_file($file)) { // Check it's a file
        unlink($file); // Delete the file
    }
}
Also check the return values of both unlink and rmdir so you can see exactly where it's failing.
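As a sketch, those checks could look like this, with error_get_last() used to surface the underlying reason for a failure:
foreach ($files as $file) {
    if (is_file($file) && !unlink($file)) {
        $err = error_get_last();
        error_log("unlink failed for {$file}: " . ($err ? $err['message'] : 'unknown'));
    }
}
if (!rmdir($dir)) {
    $err = error_get_last();
    error_log("rmdir failed for {$dir}: " . ($err ? $err['message'] : 'unknown'));
}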
Turns out after much testing that the problem was not actually with rmdir at all! My web form uses a Dropzone for photo uploads to a unique folder for each user (named after their PHP session_id()), and this folder is supposed to be created when they add a photo to the Dropzone (if it doesn't already exist). The problem was that I'd put the folder-creation code outside of the actual upload script, so the folder was in fact being deleted but then instantly created again when the form submitted and the page reloaded! Sorry about that, but thanks for all your help. :)

PHP rename() cannot always find source file (code 2) in Windows environment

My environment is: Windows, MsSQL and PHP 5.4.
My scenario:
I'm writing a small shell script that creates a full backup of the desired database to a temp folder and then moves it to a new location.
The backup goes fine and the file is created in my temp folder. Then I rename it to the second folder; sometimes this works, and sometimes it cannot find the source file.
Of course, at this point I know that I could skip the temporary location altogether, but the actual problem of the file not being found bothers me. Why is it so random, and might it also affect other file functions I've written before this one? Also, I need to be able to control how and when the files move to the destination.
The base code is as simple as it should be (although this is a simplified version of my actual code, since I doubt anyone would be interested in my error handling/logging conditions):
$query = "use test; backup database test to disk '//server01/temp/backups/file.bak', COMPRESSION;";
if($SQLClass->query($query)) {
$source="////server01//temp//backups//file.bak";
$destination="////server02//storage//backups//file.bak";
if(!rename($source , $destination)) {
//handleError is just a class function of mine that logs and outputs errors.
$this->handleError("Moving {$source} to {$destination} failed.");
}
}
else {
die('backup failed');
}
What I have tried is:
I added a file_exists() check before it, and when rename() can't find the source file, file_exists() can't either.
Since the file can't be found, copy() and unlink() won't work either.
Tried clearstatcache().
Tried sleep(10) after the SQL backup completes.
None of these helped at all. Google and I seem to be out of ideas on what to do or try next. Of course I could do some shell_exec()ing, but that wouldn't remove my worries about my earlier products.
I only noticed this problem when I tried to run the command multiple times in a row. Is there some sort of cache for filenames that clearstatcache() won't touch? It seems to be some sort of ghost-file phenomenon, where PHP is late to refresh the file system contents or some such.
I would appreciate any ideas on what to try next, and if you read this far, thank you :).
You may try calling the system's copy command.
I once had a problem like yours (on a Linux box) when I had to copy files between two NFS shares. It just failed from time to time for no visible reason. After I switched to cp (the analog of Windows' copy), the problem was gone.
Surely it is not perfect, but it worked for me.
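For illustration, a sketch of what shelling out could look like on the asker's Windows setup (the paths are taken from the question; treat this as a guess, not a guaranteed fix):
$source = '\\\\server01\\temp\\backups\\file.bak';
$destination = '\\\\server02\\storage\\backups\\file.bak';
// Let cmd.exe's copy do the work instead of PHP's rename()/copy().
exec('copy /Y ' . escapeshellarg($source) . ' ' . escapeshellarg($destination), $output, $code);
if ($code === 0) {
    exec('del ' . escapeshellarg($source)); // remove the original only after a successful copy
}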
It might be cache-related, or the database server process has not yet released the file.
The server may dump the backup into another temp file first and only move it to your temp folder at the end.
While the file is being moved, it might be inaccessible to other processes.
First, I would try to glob() all the files inside the temp dir when the error appears. Maybe you will notice it is still not finished.
Also, have you tried implementing something like 10 retry iterations with some delay?
$source = "//server01/temp/backups/file.bak";
$destination = "//server02/storage/backups/file.bak";
$attempts = 0;
while (!rename($source, $destination)) {
    if (++$attempts >= 10) {
        // handleError is just a class function of mine that logs and outputs errors.
        $this->handleError("Moving {$source} to {$destination} failed.");
        break;
    }
    sleep(20);
}
To bypass the issue:
Don't dump and then move.
Move first, then dump :-)
(Of course, your backup store would then be one run behind.)
$source="////server01//temp//backups//file.bak";
$destination="////server02//storage//backups//file.bak";
if(!rename($source , $destination)) {
//handleError is just a class function of mine that logs and outputs errors.
$this->handleError("Moving {$source} to {$destination} failed.");
}
$query = "use test; backup database test to disk '//server01/temp/backups/file.bak', COMPRESSION;";
if($SQLClass->query($query)) {
//done :-)
}
else {
die('backup failed');
}
Try this (note that backslashes must be escaped inside double-quoted PHP strings, and the function is file_get_contents()):
$source = "\\\\server01\\temp\\backups\\file.bak";
$destination = "\\\\server02\\storage\\backups\\file.bak";
$content = file_get_contents($source);
file_put_contents($destination, $content);

Deleting files after adding them to an archive prevents the archive's creation

if ($zip->open($zipFile, ZipArchive::CREATE) !== TRUE) {
    (...)
} else {
    $zip->addFile($tempDir.$fileName.".xls", $fileName.".xls");
    // The array contains the directory structure of the files to add
    foreach ($list_attachments as $dir_name => $attachment_files) {
        if (!empty($attachment_files)) {
            $zip->addEmptyDir($dir_name);
            foreach ($attachment_files as $attachment) {
                $zip->addFile($tempDir.$dir_name."/".$attachment, $dir_name."/".$attachment);
                unlink($tempDir.$dir_name."/".$attachment);
            }
            rmdir($tempDir.$dir_name."/");
        }
    }
    $zip->close();
}
Please don't mind potential typos in the variable names; I rewrote them and the comment in English to make them more readable.
If I run the code as is, it deletes the files and the directories but doesn't create the archive. I checked the return values, and addEmptyDir, addFile, unlink and rmdir all work fine. However, it seems that removing the files prevents the archive from closing properly, and thus the file isn't created.
I worked around it by moving the unlink and rmdir calls after the $zip->close(), so the files are only deleted after the archive is created. However, it forces me to run the loops twice, and from what I've gathered from the documentation and zip-related questions here, there shouldn't be any issue with unlinking the way I did.
Does anyone know for which reason this could happen?
The zip is only actually written to the file AFTER you've called $zip->close(). Until that point everything happens in memory; no 'zipping' is done. That's why you can only delete the source files after $zip->close() has completed successfully.
The documentation even says the following:
When a file is set to be added to the archive, PHP will attempt to lock the file and it is only released once the ZIP operation is done. In short, it means you can first delete an added file after the archive is closed.
However, the locks will not prevent you from deleting the files anyway; they are just "hints". The real problem is that the files still need to be there for processing when close() runs.
So the inner loop should look like this:
foreach ($attachment_files as $attachment) {
    $zip->addFile($tempDir.$dir_name."/".$attachment, $dir_name."/".$attachment);
    $to_be_unlinked[] = $tempDir.$dir_name."/".$attachment;
}
Later on, unlink the files:
...
foreach ($to_be_unlinked as $file) {
    unlink($file);
}
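Putting it together with the question's code, the whole block might look like this sketch, with the rmdir calls deferred in the same way:
$to_be_unlinked = array();
$dirs_to_remove = array();
foreach ($list_attachments as $dir_name => $attachment_files) {
    if (!empty($attachment_files)) {
        $zip->addEmptyDir($dir_name);
        foreach ($attachment_files as $attachment) {
            $zip->addFile($tempDir.$dir_name."/".$attachment, $dir_name."/".$attachment);
            $to_be_unlinked[] = $tempDir.$dir_name."/".$attachment;
        }
        $dirs_to_remove[] = $tempDir.$dir_name;
    }
}
$zip->close(); // the archive is only written to disk here

// The source files are no longer needed or locked, so clean up.
foreach ($to_be_unlinked as $file) {
    unlink($file);
}
foreach ($dirs_to_remove as $dir) {
    rmdir($dir);
}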

PHP: Safe way to remove a directory?

Consider this code:
public static function removeDir($src)
{
    if (is_dir($src)) {
        $dir = @opendir($src);
        if ($dir === false)
            return;
        while (($file = readdir($dir)) !== false) {
            if ($file != '.' && $file != '..') {
                $path = $src . DIRECTORY_SEPARATOR . $file;
                if (is_dir($path)) {
                    self::removeDir($path);
                } else {
                    @unlink($path);
                }
            }
        }
        closedir($dir);
        @rmdir($src);
    }
}
This will remove a directory. But if unlink fails, or opendir fails on any subdirectory, the directory will be left with part of its content.
I want either everything deleted or nothing deleted. I'm thinking of copying the directory before removal and, if anything fails, restoring the copy. But maybe there's a better way, like locking the files or something similar?
In general I would agree with the comment:
"Copy it, delete it, copy back if deleted else throw deleting message fail..." – We0
However, let's take some side considerations into account:
Trying to implement transaction-safe file deletion indicates that you want to allow competing file locks on the same set of files. Transaction handling is usually the most 'expensive' way to ensure consistency. This holds true even if PHP had any kind of test-delete available, because you would need to test-delete everything in a first run and then do a second loop, which costs time (and during which you are in danger that something changes on your file system in the meanwhile). There are other options:
Try to isolate what really needs to be transaction-safe and handle those data accesses in databases. E.g. MySQL/InnoDB supports all the nitty-gritty details of transaction handling.
Define and implement dedicated 'write/lock ownership'. So you have folders A and B with sub-items, and your PHP is allowed to lock files in A while some other process is allowed to lock files in B. Both your PHP and the other process are allowed to read A and B. This gets tricky with files, because a file read causes a lock as well, and that lock lasts longer the bigger the file is. So on a file basis you probably need to enrich this with file size limits, tolerance periods and so on.
Define and implement dedicated access time frames. E.g. all files can be used during the week, but there is a maintenance time frame on Sunday night which can also run deletions and therefore requires a lock-free environment.
Right, let's say my reasoning was not frightening enough :) - and you implement transaction-safe file deletion anyway - your routine can be implemented this way:
1. Back up all files. If the backup fails, you could try a second, third, fourth time (this is an implementation decision).
2. If there is no successful backup, full stop.
3. Run your deletion process. Two implementation options (either way, you need to log the files you deleted successfully):
- always run through fully and document all errors (these can be returned to the user later as a homework task list; however, this potentially runs long)
- run through and stop at the first error
4. If the deletion was successful, all fine/full stop; if not, proceed with rolling back.
5. Copy back only the files that were previously deleted successfully (ONLY THOSE!).
6. Wipe out your backup.
This is then only transaction-safe at the file level. It does NOT handle the case where somebody changes permissions on folders between steps 5 and 6.
Or you could just try to rename/move the directory to something like /tmp/; it either succeeds or it doesn't, but the files are not gone. Even if another process has an open handle, the move should be OK. The files will be gone some time later, when the tmp folder is emptied.
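A sketch of that rename-first idea as a companion to the method above (the staging location is an assumption, and rename() is only atomic when source and target are on the same filesystem):
public static function removeDirSafely($src, $stagingDir = '/tmp')
{
    // Move the directory aside in one step; if this fails, nothing has been deleted.
    $staged = rtrim($stagingDir, '/') . '/' . basename($src) . '.' . uniqid('del-');
    if (!@rename($src, $staged)) {
        return false;
    }
    // Best effort from here on: any leftovers sit in staging, not in the original place.
    self::removeDir($staged);
    return true;
}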
