I am making a form that creates a service with multiple images, and I'm not sure my saving process handles failures correctly.
For example, if I have 5 images and 3 have already been uploaded but the 4th fails, I need to cancel the saving process and delete all of the files saved so far.
$validatedImages = [];
foreach ($images as $key => $image) {
    $imageName = ServiceImage::generateRecordName($image);
    if (!$image->storeAs(ServiceImage::path(), $imageName)) {
        // Roll back all the files stored so far
        foreach ($validatedImages as $validatedImage) {
            Storage::delete(ServiceImage::path() . $validatedImage);
        }
        return redirect()->back()->with(['errorMsg' => 'There was a problem when uploading images']);
    }
    $validatedImages[] = $imageName;
}
And the same applies when storing into the database:
foreach ($validatedImages as $validatedImage) {
    if (!$service->images()->save(new ServiceImage(['name' => $validatedImage]))) {
        // handle failure ..
    }
}
So my question is: what is the best practice to handle this rollback?
Your controller should only be receiving fully-validated data. So those images should be ready for uploading once you’re in the body of your controller method:
$paths = [];
foreach ($request->file('images') as $image) {
    $paths[] = $image->store('images');
}
// Do something with $paths
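For the validation itself, a minimal sketch using $request->validate() (the specific rules here are only examples; a dedicated FormRequest works the same way):
$request->validate([
    'images'   => 'required|array',
    'images.*' => 'image|mimes:jpg,jpeg,png|max:2048', // every entry must be an image
]);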
You probably want to upload images asynchronously. Especially if there are five of them in a request. With this approach, you can upload the files to a folder that is periodically cleaned (say after 24 hours). When an image is uploaded to this folder, return the paths. Submit the paths to your controller action instead of the actual files, and you can then move the files from the temporary folder to a permanent one.
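A rough sketch of that final move, assuming the form submits the temporary paths in an images field and the files live on the default disk (the path names here are illustrative, not part of your code):
use Illuminate\Support\Facades\Storage;

$paths = [];
foreach ($request->input('images') as $tmpPath) {
    // e.g. move "tmp/abc123.jpg" to "images/abc123.jpg"
    $newPath = 'images/' . basename($tmpPath);
    Storage::move($tmpPath, $newPath);
    $paths[] = $newPath;
}
// Persist $paths on the model afterwards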
Your method can be improved. From your code I see that once you have uploaded your files, all you do on failure is delete them, and then in another loop you check the database writes. So when all files upload properly but one cannot be written to the database, you end up deleting all of those files again. This may not matter for small files, but when the files are large and numerous it causes unnecessary performance issues: for example, if you upload 100 files of 10 MB each you have transferred 1000 MB, and if the very first database write then errors, the upload of the remaining 99 files was in vain. Moreover, you are running four loops (two for the upload and two for the database), which includes some unnecessary extra looping. The loops can be reduced from four to two with the code below, and you can stop the script as soon as one file has an error without moving on to the next, saving traffic and improving performance.
The key is to upload each file and write it to the database one at a time, and to stop the script as soon as there is an error in either the upload or the database write.
This is how I suggest you improve the code:
$validatedImages = [];
foreach ($images as $key => $image) {
    $imageName = ServiceImage::generateRecordName($image);
    if (!$image->storeAs(ServiceImage::path(), $imageName)) {
        $this->deleteImage($validatedImages);
        return redirect()->back()->with(['errorMsg' => 'There was a problem when uploading images']);
    }
    if (!$service->images()->save(new ServiceImage(['name' => $imageName]))) {
        // The file was stored but the database write failed, so this file needs to be deleted too
        $this->deleteImage($validatedImages, $imageName);
        return redirect()->back()->with(['errorMsg' => 'There was a problem updating the database']);
    }
    $validatedImages[] = $imageName;
}
function deleteImage($validatedImages, $extraimage = null)
{
    foreach ($validatedImages as $validatedImage) {
        Storage::delete(ServiceImage::path() . $validatedImage);
        // code for reversing the database change
    }
    if ($extraimage) {
        // the image which was uploaded but could not be written to the database
        Storage::delete(ServiceImage::path() . $extraimage);
    }
}
Another method is to store the images in a temporary folder and only move the files to their permanent location once all files have uploaded; if any error occurs, you delete the whole temporary directory instead of deleting the files one by one in a loop.
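A rough sketch of that idea with Laravel's Storage facade (the tmp/ directory name and the final move/save step are illustrative, not taken from your code):
use Illuminate\Support\Facades\Storage;
use Illuminate\Support\Str;

$tmpDir = 'tmp/' . Str::uuid();            // one temporary directory per request
$names  = [];

foreach ($images as $image) {
    $name = ServiceImage::generateRecordName($image);
    if (!$image->storeAs($tmpDir, $name)) {
        Storage::deleteDirectory($tmpDir); // one call removes everything stored so far
        return redirect()->back()->with(['errorMsg' => 'There was a problem when uploading images']);
    }
    $names[] = $name;
}

// Everything uploaded: move the files to their permanent location and save the records
foreach ($names as $name) {
    Storage::move($tmpDir . '/' . $name, ServiceImage::path() . $name);
    $service->images()->save(new ServiceImage(['name' => $name]));
    // a failed save here would still need handling, as in the code above
}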
If there is any mistake in my writing, please excuse me. I'm not very good at translating from Spanish to English.
I am using pluploadQueue, reading the example script (upload.php). I have tested it and it works fine.
In addition to all that, I need to record information about the files that are attached in a database.
I have tried to add the statements that save the information in the database, but it only works halfway. It generates just one entry in my database table.
I notice that in the "upload" directory it generates a 1024 KB .part file for the first file, but it doesn't do anything else there, although visually the widget tells me that it has uploaded all the files.
I need to save as many entries as there are selected files (many can be uploaded; the form was designed with that in mind).
Could someone point me to a reference? I have tried to search for something about my problem but I have not been able to find anything.
I use version 2.3.9 of plupload
Correct me if I'm wrong, but my understanding is that the upload.php script is (or should be) executed once for each file to be uploaded, right?
My logic is such that:
Insert the basic "header" information into the db the first time; in the following operations it is retrieved.
plupload upload routine
Insert the file information into the db
thus forming a master-detail relationship between 2 tables.
For some reason, it stays at point 2. And this part would not be running:
while (($file = readdir($dir)) !== false) {
    $tmpfilePath = $targetDir . DIRECTORY_SEPARATOR . $file;

    // If temp file is current file proceed to the next
    if ($tmpfilePath == "{$filePath}.part") {
        continue;
    }

    // Remove temp file if it is older than the max age and is not the current file
    if (preg_match('/\.part$/', $file) && (filemtime($tmpfilePath) < time() - $maxFileAge)) {
        #unlink($tmpfilePath);
    }
}
closedir($dir);
}
Help me?
EDIT:
Okay, I found a typo. I corrected it and now it saves an entry for each file. What I can't figure out now is why it generated hundreds of .part files in the directory instead of overwriting them, and this led to hundreds of records being stored, one for each generated .part file.
Hi, this is the function that uploads an image to a temp location and saves that location to the session for further use:
function uploadPhoto()
{
    $rawImage = $_FILES['advPhoto'];
    $uploader = new ImageUploader($rawImage);
    $uploader->moveToProjectTempFolder();

    // 1. Save the current image in the session (save the attachment class inside the session)
    $uploader->saveInSession();
    // $temperrary = $uploader->CurrentImgTemperraryLocation();

    // 2. Send the current image location as the response
    AjaxHelper::sendAjaxResponse("images/temp/" . $uploader->CurrentImgTemperraryLocation());

    // Create an image tag and set the image source to the "temp uploaded image path"
    // done
    // When the mail form is submitted:
    //   - loop through the session array
    //   - move the user uploaded/approved images to the permanent folder
    //   - save the image information inside the DB
}
Here is the function that causes the problem. I want to move the picture from the temp folder to its permanent location, but move_uploaded_file() doesn't work in my case. I don't really know what the problem is; please help me if you know. Thanks.
function saveAdv()
{
    $advTitle = $_POST['advTitle'];
    $advContent = $_POST['advContent'];
    if (!empty($advTitle) && !empty($advContent)) {
        if (DataValidation::isOnlyPersianOrEnglish($advTitle) &&
            DataValidation::isOnlyPersianOrEnglish($advContent)) {
            DBconnection::insertRow('ADVERTISEMENT', ['title', 'Advertisement', 'advDate'],
                [$advTitle, $advContent, date('y/m/d h:i:s')]);
            // AjaxHelper::sendAjaxResponse("success");
            $projectTemp = $_SESSION['ADVERTISEMENT']['Img'];
            move_uploaded_file(
                $projectTemp,
                DOC_ROOT . "/images/advertisementImg/"
            );
            AjaxHelper::sendAjaxResponse($projectTemp);
        }
    } else {
        AjaxHelper::sendErrorMessage(AjaxHelper::EMPTY_EMAIL_OR_PASSWORD);
    }
}
I don't get any error. I've debugged it many times, but there are no warnings or errors at all, the folder locations are completely correct, and there are no permission problems.
move_uploaded_file() works fine in the first step, where I move the image from the system temp location to my project's temp folder, but it doesn't work when I want to move the image from the project temp folder to the permanent location.
move_uploaded_file() is only for moving files which have just been uploaded in a POST request and are stored in the system temp location. As the documentation (https://php.net/manual/en/function.move-uploaded-file.php) states, it first checks whether the file is a valid upload file meaning that it was uploaded via PHP's HTTP POST upload mechanism (that's a direct quote from the docs). If it's not valid by that definition, it fails.
So, if you're trying to use move_uploaded_file() to copy files from other locations (not the system temp location) which have not been directly uploaded to that location in the current request, then it won't work. Use PHP's general file manipulation functionality for moving other files around, using the rename() function (see https://www.php.net/manual/en/function.rename.php for details).
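In the saveAdv() case above that would look something like this sketch (reusing $projectTemp and DOC_ROOT from the question; the basename() call is just to give the destination a full file name, since the target must be a path including the file name, not just a directory):
$projectTemp = $_SESSION['ADVERTISEMENT']['Img'];
$destination = DOC_ROOT . "/images/advertisementImg/" . basename($projectTemp);

if (!rename($projectTemp, $destination)) {
    // rename() returns false on failure, so handle/report the error here
}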
I have an upload form in PHP for a user's registration picture.
I use Wamp Server.
I want the uploaded files to be deleted after 10 minutes when the user starts uploading a file and then abandons the upload form (for any reason).
How do I remove temporary files left on the server by these abandoned upload forms?
OR
How do I create a temporary folder for uploaded files and empty it after a period of time?
Maybe after uploading a file the user doesn't continue, so the PHP script never processes it and the file doesn't get deleted, even though it should be. How can I do this without using PHP code?
OR
How can I run PHP code on the server, without a user request, to delete old uploaded files?
You can set your temp folder with the upload_tmp_dir directive in the ini file: http://php.net/manual/en/ini.core.php#ini.upload-tmp-dir (it is a system-level directive, so it cannot be changed at runtime with ini_set()).
You can get the temporary folder location with sys_get_temp_dir(): http://php.net/manual/en/function.sys-get-temp-dir.php
Temporary upload files (i.e. $_FILES['userfile']['tmp_name']) are deleted right after the script is done, according to the question "php:: how long do tmp files stay?" and the PHP documentation.
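So if you want an uploaded file to survive longer than the current request (so that a cleanup routine like the one below can delete it later), you have to move it into your own temporary folder first; a minimal sketch, assuming the form field is named userfile and your_temp_folder/ exists and is writable:
$tmpName = $_FILES['userfile']['tmp_name'];
$target  = 'your_temp_folder/' . basename($_FILES['userfile']['name']);

// Only files uploaded in the current POST request can be moved with move_uploaded_file()
if (is_uploaded_file($tmpName)) {
    move_uploaded_file($tmpName, $target);
}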
If you are talking about files you moved somewhere else, and your server has permission to write/delete in that folder, you could do something like this:
$expire_time = 10; // maximum age in minutes
foreach (glob("your_temp_folder/*") as $Filename) {
    // Read file creation time
    $FileCreationTime = filectime($Filename);
    // Calculate file age in seconds
    $FileAge = time() - $FileCreationTime;
    // Is the file older than the given time span?
    if ($FileAge > ($expire_time * 60)) {
        // Now do something with the older files...
        print "The file $Filename is older than $expire_time minutes\n";
        // For example deleting files:
        //unlink($Filename);
    }
}
Code snippet credit to => http://www.jonasjohn.de/snippets/php/delete-temporary-files.htm
I am working on a project in CakePHP 2.4.4, and I am facing the following problem: my vendor uploading class calls a function newImage which creates a new image. When I upload more than one image, for example five, this function is called five times in a row. The function contains code like:
...
...initializing Uploader class
...
// creating image
$this->Orderimage->create();
$data = array(
    'order_id'  => $order_id,
    'filename'  => $filename,
    'date'      => date('Y-m-d'),
    'extension' => $result['ext'],
);
$this->Orderimage->save($data);
But here is where I run into my problem. When I try to upload more than 4 images, which means this function is called more than 4 times in a row, some images are not uploaded and the previous pictures are uploaded in their place. The reason is that these images get the same filename, because the filename is derived from the last created image + 1. So the bug is that the database does not have enough time to save one image before the next one arrives, and that is why some images overwrite others. How can I fix it?
Try setting the filename as something unique instead of just using +1.
For example:
$filename = CakeText::uuid() . '.jpg'; // or try String::uuid()
That way you don't need to worry about accidentally having the same filename.
https://book.cakephp.org/2.0/en/core-utility-libraries/string.html#CakeText::uuid
Side note: if you're uploading a lot of files into a single directory, it's a good idea to put them in nested directories (3 deep is common). For example something like:
$filename = rand(0,99) . DS . rand(0,99) . DS . rand(0,99) . $file;
If you did it this way, it'd be very unlikely you'll have the same filename+number combination in the same folder. Just store the path as well as the filename, and you're good to go. This will keep a single folder from having so many images that it takes forever to view as well.
NOTE: I just wrote this code off the top of my head - I did not verify it for syntax...etc, but it should give you the idea.
Dave's solution should solve your problem, but if you insist on keeping your filename convention, get the last inserted id and assign the filenames in a loop using $lastInsertedId + $counter before saving them, then write the whole image array to your db afterwards.
NOTE: You should only use this approach if there are no simultaneous write requests!
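A rough, untested sketch of that approach in CakePHP 2 (here $uploads, $order_id and the per-file 'ext' value are placeholders for whatever your Uploader class gives you):
// Get the highest existing id once, before the loop
$lastInsertedId = (int)$this->Orderimage->field('id', null, 'id DESC');

$records = array();
foreach ($uploads as $counter => $upload) {
    $filename = ($lastInsertedId + $counter + 1) . '.' . $upload['ext'];
    // ... move/copy the uploaded file into place under $filename here ...
    $records[] = array(
        'order_id'  => $order_id,
        'filename'  => $filename,
        'date'      => date('Y-m-d'),
        'extension' => $upload['ext'],
    );
}

// Write the whole image array to the db afterwards, in one call
$this->Orderimage->saveMany($records);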
I have searched far and wide on this one, but haven't really found a solution.
Got a client that wants music on their site (yea yea, I know..). The flash player grabs the single file called song.mp3 and plays it.
Well, I am trying to add functionality so the client can upload their own new song if they ever want to change it.
So basically, the script needs to allow them to upload the file, THEN overwrite the old file with the new one. Basically, making sure the filename of song.mp3 stays intact.
I am thinking I will need to use PHP to
1) upload the file
2) delete the original song.mp3
3) rename the new file upload to song.mp3
Does that seem right? Or is there a simpler way of doing this? Thanks in advance!
EDIT: I implemented UPLOADIFY and am able to use
'onAllComplete' : function(event,data) {
alert(data.filesUploaded + ' files uploaded successfully!');
}
I am just not sure how to point THAT to a PHP file....
'onAllComplete' : function() {
'aphpfile.php'
}
???? lol
A standard form will suffice for the upload; just remember to include the MIME type in the form (enctype="multipart/form-data"). Then you can use $_FILES[''] to reference the file.
Then you can check the filename provided and see whether it already exists in the file system using file_exists(), OR, if you don't need to keep the old file, you can simply perform the file move and overwrite the old one with the new file from the temporary directory.
<?php
// this assumes that the upload form calls the form file field "myupload"
$name  = $_FILES['myupload']['name'];
$type  = $_FILES['myupload']['type'];
$size  = $_FILES['myupload']['size'];
$tmp   = $_FILES['myupload']['tmp_name'];
$error = $_FILES['myupload']['error'];

$savepath     = '/yourserverpath/';
$filelocation = $savepath . $name; // $name already includes the extension

// This won't upload if there was an error or if the file already exists, hence the check
if (!file_exists($filelocation) && $error == 0) {
    move_uploaded_file($tmp, $filelocation);
}
// OR just leave out the "file_exists()" check and only test for the error,
// which lets the new upload overwrite an existing file -
// an if statement either way
?>
Try this piece of code to upload and replace the file:
if (file_exists($newfilename)) {
    unlink($newfilename);
}
move_uploaded_file($_FILES["fileToUpload"]["tmp_name"], $newfilename);