Laravel 5.8 multi-upload inputs - php

I want to be able to upload multiple (dynamic) files (pdf).
So I have the following layout:
As you can see, the form has 4 input fields for files, but it also has 2 text fields and for every file upload row, it has a checkbox. The "flow" is the following:
Add title and year
Check the classes (Initiatie, Recreatie, Toerisme, and Sport) you want to enable (and upload a PDF for)
Upload 1 PDF file per class.
The files are PDFs (one per class). I tried the following PHP code to upload them, but I can only upload one, sometimes two, files at a time, depending on how large the files are.
public function postGapersritAddResults(Request $request): RedirectResponse
{
    // Handle upload
    $path = 'documents/gapersrit/'.$request->get('year').'/results';
    foreach (['initiatie', 'recreatie', 'toerisme', 'sport'] as $item) {
        if ($request->hasFile('file_'.$item)) {
            $request->file('file_'.$item)->storeAs($path, $item.'.'.$request->file('file_'.$item)->getClientOriginalExtension(), 'webdav');
        }
    }

    // Handle database
    $result = new SiteGapersritResults();
    $result->title = $request->get('title');
    $result->year = $request->get('year');
    $result->initiatie = filter_var($request->get('active_initiatie'), FILTER_VALIDATE_BOOLEAN);
    $result->recreatie = filter_var($request->get('active_recreatie'), FILTER_VALIDATE_BOOLEAN);
    $result->toerisme = filter_var($request->get('active_toerisme'), FILTER_VALIDATE_BOOLEAN);
    $result->sport = filter_var($request->get('active_sport'), FILTER_VALIDATE_BOOLEAN);
    $result->save();

    toastr()->success('Saved the results for year '.$result->year.'.', 'Success', ['timeOut' => 5000]);

    return redirect()->to('admin/gapersrit/results');
}
If someone has a better idea of how I could do this, please help me out.
Ideally, I want to select all the files and upload them one by one (like in my code), but for some reason this doesn't work and usually throws a "too large" error, even though I assumed I was uploading one file at a time.
Edit
The limit for upload sizes is 100M in php.ini and my Nginx configuration.
Edit 2
I get the following error on my current code:
curl_exec(): CURLOPT_INFILE resource has gone away, resetting to default
full trace: https://pastebin.com/rqUeEhGa

This might be because the upload size is limited by the php.ini config of your server.
If you have access to php.ini or can change the PHP settings, try increasing these values:
upload_max_filesize = 20M
post_max_size = 20M
Edit: also see https://stackoverflow.com/a/23686617/7584725
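If it helps to confirm which limits are actually in effect (php.ini and the web server can disagree, especially behind Nginx), here is a minimal sketch you could drop into any route or script; it only uses the standard ini_get() function and the directive names above:
// Print the upload-related limits PHP is actually running with.
echo 'upload_max_filesize: '.ini_get('upload_max_filesize').PHP_EOL;
echo 'post_max_size: '.ini_get('post_max_size').PHP_EOL;
echo 'memory_limit: '.ini_get('memory_limit').PHP_EOL;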

As you said, the error is curl_exec(): CURLOPT_INFILE resource has gone away, resetting to default
The code snippet you posted does not contain anything related to cURL; the problem is the "webdav" disk in the storeAs call.
Looking at the trace, the problem seems to be inside the League\Flysystem\WebDAV package, possibly one of these issues:
https://github.com/thephpleague/flysystem-webdav/issues/49 or
https://github.com/thephpleague/flysystem-webdav/issues/50
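A possible workaround, sketched here and not a confirmed fix for those issues, is to stage the upload on the local disk first and then push the contents to the webdav disk with a plain put(), so cURL never receives a streamed file resource. The $path, $item and 'webdav' names come from the question's code; the 'tmp' staging folder and the use of the Storage facade are assumptions:
// Sketch: stage the upload locally, then copy the contents to the webdav disk.
$extension = $request->file('file_'.$item)->getClientOriginalExtension();
$localPath = $request->file('file_'.$item)->storeAs('tmp', $item.'.'.$extension, 'local');
Storage::disk('webdav')->put($path.'/'.$item.'.'.$extension, Storage::disk('local')->get($localPath));
Storage::disk('local')->delete($localPath);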

Related

Laravel multiple files (save all or fail all )

I am making a form that creates a service with multiple images, and I'm not sure if I'm handling failures in the saving process correctly.
For example, if I have 5 images and 3 have already been uploaded but the 4th fails, I need to cancel the saving process and delete all the files saved so far.
$validatedImages = [];
foreach ($images as $key => $image) {
    $imageName = ServiceImage::generateRecordName($image);
    if (!$image->storeAs(ServiceImage::path(), $imageName)) {
        // fall back all the stored files
        foreach ($validatedImages as $validatedImage) {
            Storage::delete(ServiceImage::path() . $validatedImage);
        }
        return redirect()->back()->with(['errorMsg' => 'There was a problem when uploading images']);
    }
    $validatedImages[] = $imageName;
}
and also when storing into the database:
foreach ($validatedImages as $validatedImage) {
    if (!$service->images()->save(new ServiceImage(['name' => $validatedImage]))) {
        // handle failure ...
    }
}
So my question is: what is the best practice to handle this rollback?
Your controller should only be receiving fully-validated data. So those images should be ready for uploading once you’re in the body of your controller method:
$paths = [];
foreach ($request->file('images') as $image) {
    $paths[] = $image->store('images');
}
// Do something with $paths
You probably want to upload images asynchronously. Especially if there are five of them in a request. With this approach, you can upload the files to a folder that is periodically cleaned (say after 24 hours). When an image is uploaded to this folder, return the paths. Submit the paths to your controller action instead of the actual files, and you can then move the files from the temporary folder to a permanent one.
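A minimal sketch of that second step, assuming the client has already uploaded the files to a temporary folder and submits their storage paths in an array input named temp_paths (both names are illustrative, not part of the original answer):
// Move previously staged uploads from the temporary folder to permanent storage.
$finalPaths = [];
foreach ($request->input('temp_paths', []) as $tempPath) {
    $finalPath = 'images/'.basename($tempPath);
    Storage::move($tempPath, $finalPath);
    $finalPaths[] = $finalPath;
}
// Persist $finalPaths on your model as needed.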
Your method can be improved. From your code, once the files are uploaded all you do on failure is delete them, and then a second loop checks the database writes. So when every file uploads correctly but the first database write fails, you delete all of those files again. This may not matter for small files, but when the files are large and numerous it causes unnecessary work: if you upload 100 files of 10 MB each, you have transferred 1000 MB, and an error on the first database insert means the upload of the remaining 99 files was wasted. Moreover, you are running four loops (two for the upload and two for the database), which is more than needed. The loops can be reduced from four to two with the code below, and you can stop the script as soon as a file errors, without processing the remaining files, saving traffic and improving performance.
The key is to upload each file and write it to the database in the same iteration, and stop the script as soon as either the upload or the database write fails.
This is how I suggest you improve the code:
$validatedImages = [];
foreach ($images as $key => $image) {
    $imageName = ServiceImage::generateRecordName($image);
    if (!$image->storeAs(ServiceImage::path(), $imageName)) {
        $this->deleteImage($validatedImages);
        return redirect()->back()->with(['errorMsg' => 'There was a problem when uploading images']);
    }
    if (!$service->images()->save(new ServiceImage(['name' => $imageName]))) {
        /* file has been uploaded but a database error was found, so this file needs to be deleted */
        $this->deleteImage($validatedImages, $imageName);
        return redirect()->back()->with(['errorMsg' => 'There was a problem updating the database']);
    }
    $validatedImages[] = $imageName;
}

function deleteImage($validatedImages, $extraimage = null)
{
    foreach ($validatedImages as $validatedImage) {
        Storage::delete(ServiceImage::path() . $validatedImage);
        // code for reversing database change
    }
    if ($extraimage) {
        /* the image which was uploaded but could not be written in the database */
        Storage::delete(ServiceImage::path() . $extraimage);
    }
}
Another approach is to store the images in a temporary folder and only move them to their permanent location once all files have uploaded; if any error occurs, you delete the whole temporary directory instead of removing the files one by one in a loop.
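A minimal sketch of that temporary-directory idea, assuming the default Storage disk and a wrapping database transaction; the directory name, exception handling and DB facade usage are illustrative additions, not part of the answer above:
// Stage everything under a per-request temp directory and write the database
// rows inside a transaction; any failure rolls back and removes the directory.
$tempDir = 'tmp/'.uniqid('service_', true);
try {
    DB::transaction(function () use ($images, $service, $tempDir) {
        foreach ($images as $image) {
            $imageName = ServiceImage::generateRecordName($image);
            if (!$image->storeAs($tempDir, $imageName)) {
                throw new \RuntimeException('Upload failed for '.$imageName);
            }
            if (!$service->images()->save(new ServiceImage(['name' => $imageName]))) {
                throw new \RuntimeException('Database write failed for '.$imageName);
            }
        }
    });
    // Everything succeeded: promote the staged files to their permanent path.
    foreach (Storage::files($tempDir) as $file) {
        Storage::move($file, ServiceImage::path().basename($file));
    }
    Storage::deleteDirectory($tempDir);
} catch (\Throwable $e) {
    // Database changes roll back automatically; remove the staged files in one call.
    Storage::deleteDirectory($tempDir);
    return redirect()->back()->with(['errorMsg' => 'There was a problem when uploading images']);
}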

move_uploaded_file not working when uploading more than one files - Intermittently

What I need
I need to upload three files to the same directory at the same time and update a MySQL database.
The problem
Sometimes the files upload fine, but sometimes not all of them do. Let's say I have files A, B and C. Sometimes all of A, B and C upload, but sometimes one of them (either A, B or C) randomly fails to upload. The MySQL query always updates correctly, though; there is no issue with that.
Codes
Following is my code. Can anyone spot any mistakes, or suggest a better way of doing this?
if ($cadguiUpdated == 'cadYes' && $excelSheetUpdated == 'excelYes' && $tpi == 'tpiYes') {
    mkdir($pathCad, 0, true);
    mkdir($pathExcel, 0, true);
    mkdir($pathTpi, 0, true);
    move_uploaded_file($_FILES["fileCad"]["tmp_name"], $target_fileCad);
    move_uploaded_file($_FILES["fileExcel"]["tmp_name"], $target_fileExcel);
    move_uploaded_file($_FILES["fileTpi"]["tmp_name"], $target_fileTpi);
    $sql = "INSERT INTO dbtuts.tbl_uploads(prjId, date, cadFile, cadType,cadUpdate, cadTempPerm, cadRevision, cadRemarks, cadRemarksHistory, excelFile, excelType, excelUpdate, excelTempPerm, excelRevision, excelRemarks, excelRemarksHistory, tpiFile, tpiType, tpiUpdate, tpiRevision, tpiRemarks, tpiRemarksHistory, optionalRemarks, updatedPerson, activity) VALUES('$prjId','$updatedDate', '$save_fileCad','$fileTypeCad','$cadguiUpdated','$tempPermCad', '$revisionCad', '$cadguiRemarks','$cadInitialRemarks', '$save_fileExcel','$fileTypeExcel','$excelSheetUpdated','$tempPermExcelSheet', '$revisionExcelSheet', '$excelSheetRemarks','$excelInitialRemarks', '$save_fileTpi','$fileTypeTpi','$tpi', '$formNumber', '$tpiRemarks','$tpiInitialRemarks', '$optionalRemarks', '$updatedPerson', '$activity')";
    mysqli_query($conn, $sql);
    echo '<br><div class=" container alert alert-success alert-dismissable fade in">×<strong>Success </strong>: Successfully created</div>';
}
After trying many things, I finally realised the issue was due to file size. Sometimes the files I was uploading were bigger than the declared file size limit. To change the allowed upload file size limit I made a few changes in the php.ini file.
Step 1
Open the php.ini file and search for upload_max_filesize. Change it to your preferred value.
Step 2
Search for post_max_size in the php.ini file and change it to your desired value. Note that post_max_size limits the size of the entire POST body, so it must be at least as large as the files you upload.
Initially I did only Step 1, but I still couldn't upload because post_max_size was not set correctly. After doing Step 2 as well, I am able to upload files of the specified size without any issue.
NB: Make sure to restart XAMPP after making the changes.
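For illustration, the two directives together might look like this in php.ini (the values are only examples, chosen so that post_max_size is the larger of the two):
upload_max_filesize = 64M
post_max_size = 70M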

RuntimeException SplFileInfo::getSize(): stat failed for... Laravel 4 upload image

I work with Laravel 4 (latest update) and I created a form where we can upload an image (logo/avatar). I am on macOS and use Sublime Text 3, Laravel and MAMP. All my configuration is set up correctly; there are no problems running the project.
My problem is that I get this error when I submit the filled-in fields of my form: RuntimeException
SplFileInfo::getSize(): stat failed for /Applications/MAMP/tmp/php/phpRUJfMX
Here's the code from my form nameOfTheForm.blade.php:
@extends('layout.default')
@section('title')
Name Of My Project - EditProfile
@stop
@section('content')
{{Form::open(array('url'=>'uploadAvatar','files' => true))}}
<p>
    {{Form::label('pseudo','pseudo (*): ')}}
    {{Form::text('pseudo',Input::old('nom'))}}
</p>
@if ($errors->has('pseudo'))
    <p class='error'> {{ $errors->first('pseudo')}}</p>
@endif
<br>
<br>
<p>
    {{Form::label('url_Avatar','Avatar: ')}}
    {{Form::file('url_Avatar',Input::old('Url_Avatar'))}}
</p>
@if ($errors->has('url_Avatar'))
    <p class='error'> {{ $errors->first('url_Avatar')}}</p>
@endif
<br>
<br>
<p>
    {{Form::submit('Validate your avatar')}}
</p>
{{Form::close()}}
@stop
Here's the code from my controller:
public function uploadAvatar() {
    // *****UPLOAD FILE (on the server it's an image, in the DB it's a URL)*****
    $file = Input::File('url_Avatar');
    // set a destination path for the uploaded file
    $destinationPath = public_path().'/upload/';
    // use the client extension of the uploaded file and give it a random name: a random string of length 32 made up of alphanumeric characters [a-zA-Z0-9]
    $filename = $destinationPath . '' . str_random(32) . '.' . $file->getClientOriginalExtension();
    $uploaded = Input::File('url_Avatar')->move($destinationPath, $filename);
    // *****VALIDATOR INPUTS and RULES*****
    $inputs = Input::all();
    $rules = array(
        'pseudo' => 'required|between:1,64|unique:profile,pseudo',
        // url_Avatar is a URL in the database but we register it as an image on the server
        'url_Avatar' => 'required|image|min:1',
    );
The upload code itself works perfectly; the file is saved in the folder I selected. I have no problem with my routes (no need to show that part of the code).
But When I submit the form, I have this error:
RuntimeException
SplFileInfo::getSize(): stat failed for /Applications/MAMP/tmp/php/phpRUJfMX
error info details:
open:/Applications/MAMP/htdocs/nameOfMyProject/vendor/laravel/framework/src/Illuminate/Validation/Validator.php
}
elseif (is_array($value))
{
return count($value);
}
elseif ($value instanceof File)
{
return $value->getSize() / 1024;
}
else
It seems that Laravel needs stat (which gives information about a file), in other words it needs information about the uploaded file, here its size. So I tried this in my controller, just before the $uploaded line where I move the file to my selected folder:
// I added this line of code before
$size = $file->getSize();
$uploaded = Input::File('url_Avatar')->move($destinationPath, $filename);
But when I did that, I got another error: the validator doesn't recognize the file as an image and asks me to upload a valid format. I think I first need to correct the original error (SplFileInfo::getSize()).
If you have any ideas... Thank you.
Short version: Laravel's Symfony source returns a file size of 0 if the file is larger than php.ini's upload_max_filesize.
I recommend checking your php.ini file. I'm not sure how MAMP handles php.ini, but it's there if you're running Laravel. In php.ini, upload_max_filesize may be smaller than the file you're attempting to upload; that value defaulted to 2 MB on my Ubuntu 12.10 VM.
Check
/vendor/symfony/http-foundation/Symfony/Component/HttpFoundation/File/UploadedFile.php
and look at getMaxFilesize(). By default, this method grabs the upload_max_filesize value from the php.ini file on your host system.
The interesting thing about this is that if you do
$foo = Input::file('uploaded_file')
var_dump($foo)
or
use Laravel's die-and-dump helper, dd($foo), then a file that's larger than php.ini's upload_max_filesize reports a size of 0.
In addition to upload_max_filesize, PHP docs also mention checking php.ini so that:
- post_max_size must be larger than upload_max_filesize
- memory_limit should be larger than post_max_size
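If you want to catch this case explicitly rather than discovering a size of 0 later, a minimal sketch (using Laravel 4's Input facade from the question; the variable names and the redirect are illustrative) is to check the upload error before validating:
// Detect an upload rejected by upload_max_filesize before doing anything else.
$file = Input::file('url_Avatar');
if ($file === null || !$file->isValid()) {
    // getError() returns UPLOAD_ERR_INI_SIZE (1) when upload_max_filesize was exceeded.
    $error = $file ? $file->getError() : 'no file received';
    return Redirect::back()->with('error', 'Upload failed: '.$error);
}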
I got the same error in Laravel 6.x. In my case it was due to a null value for a column that was not nullable.
To resolve it I just changed the column to accept null; after that it worked fine.
I know this question has been answered quite some time ago, but I found something on the PHP website which might be useful because it actually explains why the error occurs:
If you're using Symfony's UploadedFile,
please be aware that if you call this method
after you call #move, you will most likely get
some obscenely untraceable error, that says:
stat failed
Which if you really think about it, it does makes sense,
the file has been moved by Symfony, but getSize is in SplFileInfo,
and SplFileInfo doesn't know that the file has been moved.
Source: https://www.php.net/manual/en/splfileinfo.getsize.php#122780
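In other words, read the size (and anything else you need from SplFileInfo) before you call move(). A minimal sketch of that ordering, reusing the variable names from the question above:
// Read everything you need from the uploaded file BEFORE moving it;
// after move() the temporary file is gone and getSize() will fail with "stat failed".
$file = Input::File('url_Avatar');
$size = $file->getSize();
$extension = $file->getClientOriginalExtension();
$filename = str_random(32).'.'.$extension;
$file->move($destinationPath, $filename);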
I solved my problem by editing the column's default attribute to "null", and the problem is gone.

Laravel file does not exist - file upload

I am using a form to upload video files. For some reason all I get is the following error:
Symfony \ Component \ HttpFoundation \ File \ Exception \ FileNotFoundException
The file "" does not exist
In my controller I had a validation rule to require files, like so
$validator = Validator::make(Input::all(),
    array(
        'file' => 'required'
    )
);
But because of the above rule I couldn't properly debug what was going on, so I ended up removing it, which is how the error above was produced.
In the php.ini file, change the following:
upload_max_filesize = 20M
post_max_size = 20M
This should make it work.
The problem is probably that upload_max_filesize has been exceeded (check your php.ini); however, it is good practice to first check whether the file is valid:
if (!$file->isValid()) {
    throw new \Exception('Error on upload file: '.$file->getErrorMessage());
}
//...
In 2020 with Laravel 5.8, I discovered this problem and your answers gave me clues, but I was still getting an error when I tried the isValid method. I found this was the best way to check if a file was too large:
$file_result = new \stdClass();
if ($request->file('file_to_upload')->getSize() === false) {
    $max_upload = min(ini_get('post_max_size'), ini_get('upload_max_filesize'));
    // From: https://gist.github.com/svizion/2343619
    $max_upload = str_replace('M', '', $max_upload);
    $max_upload = $max_upload * 1024;
    $file_result->error = "The file you are trying to upload is too large. The maximum size is " . $max_upload;
    // return whatever json_encoded data your client-side app expects.
}
It looks as if the getSize method successfully returns false if the file size exceeds the maximum size. In my experience isValid throws an obscure error. The upload_max_filesize parameter is there to protect the server so I wanted a reliable way of catching when the user attempts to upload a large file and my client-side validation is not set correctly.
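One caveat with the snippet above: min() compares the raw ini strings, and stripping only the M suffix misses values given in K or G. A hedged sketch of a small converter that normalises the shorthand to bytes first (this helper is mine, not part of the answer):
// Convert a php.ini shorthand value ("2M", "512K", "1G") to bytes.
function iniShorthandToBytes(string $value): int
{
    $number = (int) trim($value);
    switch (strtoupper(substr(trim($value), -1))) {
        case 'G': return $number * 1024 * 1024 * 1024;
        case 'M': return $number * 1024 * 1024;
        case 'K': return $number * 1024;
        default:  return $number;
    }
}

$maxUploadBytes = min(
    iniShorthandToBytes(ini_get('post_max_size')),
    iniShorthandToBytes(ini_get('upload_max_filesize'))
);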
To make the answer clear:
The problem is that you uploaded a file whose size exceeds the limit in the PHP configuration (php.ini), so when you try to access any property of the UploadedFile object you see the exception, because the file does not exist.
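A hedged sketch combining these checks in a controller on a recent Laravel version (the 'file' input name comes from the question; the size limit, 'videos' folder and redirect are illustrative):
// Guard against missing or oversized uploads before touching the file.
if (!$request->hasFile('file') || !$request->file('file')->isValid()) {
    return redirect()->back()->withErrors(['file' => 'No valid file was received; it may exceed the server upload limit.']);
}

$request->validate([
    'file' => 'required|file|max:20000', // max is in kilobytes
]);

$path = $request->file('file')->store('videos');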

PHP server file download cutoff unexpectedly

I have a web interface that I built into the admin section of a WordPress site. It scrapes a few tables in my database and just displays a big list of data row by row. There are about 30,000 rows of this data, displayed with a basic echo in a for loop. Displaying all 30,000 rows on a page works fine.
Additionally, I include an option to download a CSV file of the complete rows of data. I use fopen and then fputcsv to build the CSV file for download from the result of the data query. This feature used to work, but now that the dataset is at 30,000, the CSV will no longer generate correctly. What happens is the first 200~1000 rows will be written to the CSV file leaving out the majority of the data. I estimate that the CSV that is not properly generated in my case would be about 10 Megs. Then the file will download the first 200~1000 rows as though everything was working correctly.
Here is the code:
// This gets a huge list of data from a SP I built. This data is well formed
$data = $this->run_stats_stored_procedure($job_to_report);

// This is where the data is converted into a csv file. This part is broken
// the file may already exist at that location, burn it down if it does
if (file_exists(ABSPATH . "some/path/to/my/file/csv_export.csv")) {
    unlink(ABSPATH . "some/path/to/my/file/csv_export.csv");
}
$csv_file_handler = fopen(ABSPATH . "some/path/to/my/file/candidate_export.csv", 'w');
if (!empty($csv_file_handler)) {
    $title_array = array(
        "ID",
        "other_feild"
    );
    fputcsv($csv_file_handler, $title_array, ",");
    if (!empty($data)) {
        foreach ($data as $data_piece) {
            $array_as_csv_line = array();
            foreach ($data_piece as $object_property) {
                $array_as_csv_line[] = (string) $object_property;
            }
            fputcsv($csv_file_handler, $array_as_csv_line, ",");
            unset($array_as_csv_line);
        }
    } else {
        fputcsv($csv_file_handler, array("empty"), ",");
    }
    // pros clean everything up when they are done
    fclose($csv_file_handler);
}
I'm not sure what I need to change to get the entire CSV file to download. I believe this could be a configuration issue, but I'm not sure. I'm led to believe this because the function used to work with as many as 20,000 CSV rows; now the dataset is at 30,000 and it breaks. Please let me know if additional info would help. Has anyone bumped into issues with huge CSV files before? Thank you to anyone who can help.
Is the "download" taking more than say a minute, two minutes, or three minutes? If so, the webserver could be closing the connection. For example, if you're using the Apache FCGI module, it has this directive:
FcgidBusyTimeout
which defaults to 300 seconds.
This is the maximum time limit for request handling. If a FastCGI request does not complete within FcgidBusyTimeout seconds, it will be subject to termination.
Hope this helps you solve your problem.
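If that turns out to be the cause, one hedged example (assuming Apache with mod_fcgid; the value is only an example) is to raise the directive in your Apache configuration and reload the server:
# In the Apache configuration (e.g. fcgid.conf or the relevant vhost):
FcgidBusyTimeout 3600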
The answer that I am currently implementing is to allow the script to use more time. To do this, I am simply running the following code before the script runs:
set_time_limit(3600);
I am doing further research because this is not a sustainable solution. Any further advice would be greatly appreciated.
