Cross-browser file upload issue (HTML+PHP): how to NOT upload a file immediately

Is there a way to NOT upload a file immediately, but to store it (maybe in an array) and upload it in a later step?
What I want to ensure is that the user can navigate freely between the form pages and then upload the files in the last step. The only way I know is to encode the files and store them in the session, but that is anything but elegant.
It should also be a cross-browser solution.

If you have a form split over different URLs with a file input in a previous step, then no, it isn't possible.
Unless...
If your "cross-browser" requirement can ignore IE < 10 and other non-recent browsers, you may use the JavaScript File API to read the file client-side, optionally store it on the client with sessionStorage/localStorage, and send it to the server later.
See
http://caniuse.com/#feat=fileapi
https://developer.mozilla.org/en-US/docs/Using_files_from_web_applications
https://developer.mozilla.org/en-US/docs/Web/Guide/API/DOM/Storage

Whenever the user uploads a file, that file is moved to a temporary location.
That temporary file is deleted when the script finishes.
In your scenario, if the user moves from one page to another, the file will be deleted automatically.
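Because PHP discards the temporary upload at the end of the request, the only pure-PHP way to defer the upload is the "encode it and store it in the session" workaround the question already mentions. A minimal sketch of that workaround (the helper names are mine, and a plain array stands in for `$_SESSION`; in a real handler the tmp path would come from `$_FILES[...]['tmp_name']`):

```php
<?php
// Stash a file's contents in a session-like array as base64, so the data
// survives after PHP deletes the temporary upload at the end of the request.
function stash_upload(array &$session, string $field, string $tmpPath, string $clientName): void
{
    $session['pending_uploads'][$field] = [
        'name' => $clientName,
        'data' => base64_encode(file_get_contents($tmpPath)),
    ];
}

// In the final form step, write the stashed contents to their real destination.
function persist_stashed_upload(array &$session, string $field, string $destDir): string
{
    $entry = $session['pending_uploads'][$field];
    $dest  = rtrim($destDir, '/') . '/' . basename($entry['name']);
    file_put_contents($dest, base64_decode($entry['data']));
    unset($session['pending_uploads'][$field]);
    return $dest;
}
```

As the question notes, this is not elegant: base64 inflates the payload by about a third, and large files bloat the session storage, which is why the client-side File API route is attractive where browser support allows it.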

Related

CodeIgniter - How can I initiate the download of a file and reset the form to be used again?

Using Codeigniter, I'm trying to do something basic.
The site has a form where users upload files for processing. After the files are processed, new files are created and saved in a zip file. Then I use $this->zip->download() to start the download.
But after that I can't do anything to refresh the upload form. And if I reload the view first, the download doesn't work.
All I want to do is upload the files, create the zip, download the zip and wind up with a clean upload form.
Any suggestions how to approach this?
I don't think you will be able to do it on the same page because of mismatched headers. You could try the following: when the zip is ready, save it under a random/unique id (an md5, anything) and print the download link in the view as "click here to download your file" with a target="_blank" attribute so it opens in a new tab (browsers are usually smart enough to auto-close it once you accept the file). That way you can reset the form in the view as well.
Another way would be with AJAX and JS, I guess, but that would take much longer to write.
PS: just don't forget to delete the files after X amount of time (if you don't need them).
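The save-under-a-random-name step could look like this in plain PHP (the function name, paths, and link markup are illustrative, not CodeIgniter API):

```php
<?php
// Save the generated zip under a random, unguessable name and build the
// "click here" link that opens in a new tab, so the current page is free
// to re-render a clean upload form.
function publish_zip(string $zipContents, string $downloadDir): array
{
    $token = bin2hex(random_bytes(16));                    // random/unique id
    $file  = rtrim($downloadDir, '/') . "/$token.zip";
    file_put_contents($file, $zipContents);
    $link  = '<a href="/downloads/' . $token . '.zip" target="_blank">'
           . 'Click here to download your file</a>';
    return [$file, $link];
}
```

The view would echo the link alongside the freshly reset form; a cron job (as the PS suggests) would sweep old files out of the download directory.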

Is there any mechanism in Laravel 5 to delete unused uploaded files?

There are situations where a user uploads a file (say, an image field inside a form) but doesn't save the form and simply closes the browser. This leaves unused files residing on the server.
Some CMSs, like Drupal, have a mechanism to detect such files and delete them after a while. Drupal creates a table called file_managed and, for every uploaded file, stores the id of the content it belongs to, so it is easy to find unused files.
I would like to know whether there is any mechanism like this in Laravel that detects unused uploaded files.
Thanks.
The selected file won't be uploaded if the form is not submitted. In the case of an AJAX upload, place the file in a temporary folder first; when the user completes the form and submits it, move the uploaded picture to the correct path and remove it from the temporary folder.
You can write a cron job or queue to empty the temporary folder.
I know it's an old post, but to update it with a more recent solution to this problem (Laravel 8 at this time): this video helped me: https://www.youtube.com/watch?v=Vs4EQqFcD-c
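The cleanup job the answer describes can be sketched in plain PHP (this is a generic helper you would call from a scheduled command, not a Laravel API; the age threshold is an example value):

```php
<?php
// Delete files in a temporary upload folder that are older than $maxAge
// seconds; returns how many files were removed. Intended to run from a
// cron job or a queued job on a schedule.
function purge_old_uploads(string $dir, int $maxAge): int
{
    $deleted = 0;
    foreach (glob(rtrim($dir, '/') . '/*') as $path) {
        if (is_file($path) && time() - filemtime($path) > $maxAge) {
            unlink($path);
            $deleted++;
        }
    }
    return $deleted;
}
```

In Laravel you would typically wrap a call like `purge_old_uploads(storage_path('app/tmp'), 3600)` in a scheduled artisan command.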

Implementation of fully functional media uploading in web application

Suppose we have a web application which handles creating, reading, updating and deleting articles, and each article should have a gallery of images. I have to make a one-to-one relation between Article and Gallery and a one-to-many relation between Gallery and Media.
HTML5 gives us a lot of features like multi-upload, so I want to use the excellent http://blueimp.github.io/jQuery-File-Upload/ plugin for that. The problem is how to handle the file upload "in memory" like the other form data.
For example, when we show the page for creating a new article, we should be able to fill in the article's data fields and select images to upload; when we click the save button the images should start uploading, and after that the form should submit. When validation fails, the images should still be displayed on the frontend, but nothing should be saved on the server side.
One solution is to create something like a "create entity session temporary id" before displaying the form; that id can be used to create a temporary directory for uploads, and after the form is saved successfully the images can be moved to the appropriate directory. But how do we make that "create entity session temporary id"?
The other solution I can think of is the "with the edit id" approach, where we handle the uploads with a previously saved gallery id, but sometimes I can't save a new blank article with a gallery, because some of the fields shouldn't be empty in the db.
For the Rails I saw https://github.com/thoughtbot/paperclip gem which in the Readme says:
Paperclip is intended as an easy file attachment library for Active Record. The intent behind it was to keep setup as easy as possible and to treat files as much like other attributes as possible. This means they aren't saved to their final locations on disk, nor are they deleted if set to nil, until ActiveRecord::Base#save is called.
My question is: how does that work?
The problem with enabling file uploads on the create mask is that you eventually end up with orphaned files. This is because a user is able to trigger the upload without saving the actual entity. While creating my own UploadBundle I thought about this problem for a while and came to the conclusion that there is no truly proper solution.
I ended up implementing it like this:
Given the fact that our problem arises from orphaned files, I created an Orphanage which is in charge of managing these files. Uploaded files are first stored in a separate directory, along with the session_id. This helps distinguish files across different users. After submitting the form that creates the actual entity, you can retrieve the files from the orphanage using only your session id. If the form was valid, you can move the files from the temporary orphanage directory to their final destination.
This method has some pitfalls:
The orphanage directory itself should be cleaned on a regular basis using a cron job or the like.
If a user uploads files and chooses not to submit the form, but instead starts over with a new form, the newly uploaded files end up in the same per-session directory. When you then retrieve the uploaded files, you will get both the files from the first attempt and those from the second.
This is not the ultimate solution to this problem but more of a workaround. It is in my opinion however cleaner than using temporary entities or session based storage systems.
The mentioned bundle is available on Github and supports both Orphanage and the jQuery File Uploader plugin.
1up-lab/OneupUploaderBundle
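The orphanage flow described above can be sketched in plain PHP (the directory layout `orphanage/<session_id>/` follows the answer; the helper name is mine, not the bundle's API):

```php
<?php
// After the entity form validates, move the files uploaded into the
// per-session "orphanage" directory to their final destination and
// remove the now-empty per-session directory.
function adopt_orphans(string $orphanageDir, string $sessionId, string $finalDir): array
{
    $moved = [];
    $src = rtrim($orphanageDir, '/') . '/' . $sessionId;
    if (!is_dir($src)) {
        return $moved;                       // nothing was uploaded
    }
    foreach (glob($src . '/*') as $path) {
        $dest = rtrim($finalDir, '/') . '/' . basename($path);
        rename($path, $dest);
        $moved[] = $dest;
    }
    rmdir($src);
    return $moved;
}
```

The first pitfall above still applies: anything left in the orphanage because the form was never submitted has to be swept up by a cron job.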
I haven't worked with this case personally, but my co-worker had a similar conundrum. She used
punkave/symfony2-file-uploader-bundle
It's a bundle that wraps the jQuery File Upload plugin. It is in its early stages and a lot of things are missing, such as events, but we gave it a shot.
Here's what we do: in newAction() we create the entity, generate a unique dir ID, and store the ID in the entity (via a regular setDirId()). Then we create the form, which contains a hidden field dirId.
We upload the files to a temp dir on the server via AJAX, not during the submit. The AJAX request requires the ID and stores the files in temp_dir/prefix_ID.
Then it's quite simple. The form is sent. If the form is valid, we move the files from the temp dir to the destination dir. If not, we still have the ID and are able to show the images.
However, we do not save information about individual files in a separate table in the database. Each time, we read the contents of the folder that corresponds to our dirId.
I know it's not the solution you are asking for; it's more of a workaround.

How to handle multiple clients in the server when reading and writing to a file

I have a PHP page that, when visited, opens a txt file, reads from it, and then writes to it. But what happens if multiple people visit the PHP page at the same time? Is there a way to wait until the file is not being used before the PHP code reads/writes to it? In other words, to add synchronization to it.
Thanks.
You can create a temp file for each user. Once the user completes their action and logs out, a function can read the temp file's contents, append them to the original file, and then delete the temp file.
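The merge step from the answer might look like this (the function name is mine; the `flock()` call provides the "wait until the file is not being used" behaviour the question asks about, since it blocks while another request holds the lock):

```php
<?php
// Append a user's per-user temp file to the shared file and remove the
// temp file. flock() takes an exclusive advisory lock on the shared file
// so two requests finishing at once cannot interleave their writes.
function merge_temp_file(string $tempPath, string $sharedPath): void
{
    $data = file_get_contents($tempPath);
    $fh = fopen($sharedPath, 'ab');   // append mode: writes go to the end
    flock($fh, LOCK_EX);              // blocks until no one else holds the lock
    fwrite($fh, $data);
    fflush($fh);
    flock($fh, LOCK_UN);
    fclose($fh);
    unlink($tempPath);
}
```

Note that `flock()` is advisory: it only synchronizes scripts that also take the lock, so every reader/writer of the shared file needs to use it.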

How can I perform checks on each file that is to be uploaded?

I am trying to make a multiple-upload system for images, using an input of type file with the multiple attribute. I want to run checks on each and every file and upload it only if it passes all the checks; otherwise, stop uploading the current (failing) file but continue with the remaining files. I want to do it with JavaScript and PHP. Can anyone suggest what steps to follow? No jQuery, please.
<input id="uploadfiles" style="overflow:hidden" type="file" name="files[]" value="upload" multiple />
Let me summarize the generic upload process:
User chooses file(s)
They click upload (or anything that submits the form)
The files get uploaded via a multipart/form-data form to PHP's temporary folder
The PHP script (the target of the form) is called and is passed the file paths in the tmp folder
The PHP script decides whether to copy the files from the tmp folder to some folder on the site
This clearly shows one thing: if you want to validate files with a PHP script, all of the files must already have been fully uploaded, which means you cannot call the script for each file separately while the others are still uploading (not with HTML and regular forms).
Also (depending on the requirements you need to check), you may not be able to validate the files with JavaScript, for security reasons.
So there is a simple (but not very satisfying) solution: let the user upload all the files, then validate them one by one in your PHP script. If a file passes your requirements, copy it to the intended folder; if not, just leave it alone and it will be erased from the tmp folder. This has one downside: when uploading a large file, the user will have to wait a long time before learning that it did not pass the requirements.
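The post-upload, per-file check described above can be sketched like this (the extension whitelist and size limit are example values, not requirements from the question):

```php
<?php
// Per-file checks run server-side after the whole multipart request has
// arrived. Files that fail are simply skipped; PHP erases anything left
// in the tmp folder when the script ends.
function passes_checks(string $name, int $size, int $error, int $maxBytes = 2000000): bool
{
    if ($error !== UPLOAD_ERR_OK) {
        return false;                         // the transfer itself failed
    }
    $allowed = ['jpg', 'jpeg', 'png', 'gif'];
    $ext = strtolower(pathinfo($name, PATHINFO_EXTENSION));
    return in_array($ext, $allowed, true) && $size <= $maxBytes;
}

// Loop over <input name="files[]" multiple>: accept passing files,
// skip failing ones, and continue with the rest.
function accept_uploads(array $files, string $destDir): array
{
    $accepted = [];
    foreach ($files['name'] as $i => $name) {
        if (!passes_checks($name, $files['size'][$i], $files['error'][$i])) {
            continue;                         // skip this file, keep going
        }
        $dest = rtrim($destDir, '/') . '/' . basename($name);
        if (move_uploaded_file($files['tmp_name'][$i], $dest)) {
            $accepted[] = $dest;
        }
    }
    return $accepted;
}
```

A handler would call `accept_uploads($_FILES['files'], '/var/www/uploads')`; note that `$_FILES` groups a multiple upload by attribute (`name`, `size`, `error`, `tmp_name`), each an array indexed per file.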
