We're developing a website where users can change their (fullscreen) slider images. Each image is around 2000px by 2000px, and we let users upload as many images as they want through our HTML form (capped at 10).
Their upload speeds will be pretty slow, and I believe we'll easily exceed PHP's max_execution_time, which defaults to 30 seconds. In the future we'll also let users upload .rar/.zip files, capped at 100MB.
We have a few ideas, but I wanted to ask SO for a better solution/reviews.
1. We can raise the 30 seconds to a much higher value, since we have access to php.ini, and let users upload all images at once, but that may create performance-related issues in the long term. (This is not an option!)
2. We can make use of JavaScript on the client side. For each image path specified in the HTML form, JavaScript can POST it with XMLHttpRequest one by one and wait for a response. If the response is true, JavaScript moves on to the next image and uploads it. (Profit: each image starts PHP itself and gets its own 30-second lifetime.) A sketch of the PHP side follows this list.
3. The JavaScript solution won't work for file uploads above 50MB. Customers in our target region usually top out at 25kbps upload speed, so there is no way they can upload 50MB in 30 seconds. Similar to #2, we could use a script where the uploaded file is saved in chunks of bytes every 30 seconds and the client keeps pushing the remaining bytes, or something along those lines.
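For clarity, here is roughly what I imagine the PHP side of idea #2 looking like (the endpoint and field names are just placeholders):

<?php
// upload_one.php - receives a single image per request, so every image
// gets its own fresh max_execution_time budget.
header('Content-Type: application/json');

if (!isset($_FILES['image']) || $_FILES['image']['error'] !== UPLOAD_ERR_OK) {
    echo json_encode(false); // tells the JavaScript not to continue
    exit;
}

$target = __DIR__ . '/uploads/' . basename($_FILES['image']['name']);
echo json_encode(move_uploaded_file($_FILES['image']['tmp_name'], $target));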
Basically, how would you complete this task?
We don't want to rely on php.ini, so increasing max_execution_time shouldn't be an option.
Should we go with #2 for image uploads and what can you suggest for #3?
Take a look at chunked uploads.
You'll need to use some sort of uploader script like JUpload or plUpload. These let you specify how large each chunk sent to the server should be. For example, if you have a 10MB file, you can chunk it at 1MB, so ten 1MB chunks get uploaded to the server. In the case of a slow connection, just make the chunks smaller, e.g. 500KB.
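For example, a bare-bones PHP receiver for such chunks might look like this. This assumes the uploader posts 0-based "chunk" and total "chunks" fields the way Plupload does by default, and that chunks arrive in order; the paths and field names are placeholders:

<?php
// chunk_receiver.php - appends each incoming chunk to a .part file.
$chunk  = isset($_POST['chunk'])  ? (int) $_POST['chunk']  : 0;
$chunks = isset($_POST['chunks']) ? (int) $_POST['chunks'] : 1;
$name   = basename(isset($_POST['name']) ? $_POST['name'] : $_FILES['file']['name']);

$partial = __DIR__ . "/uploads/{$name}.part"; // assumes uploads/ exists and is writable

// The first chunk truncates the partial file, later chunks append to it.
$out = fopen($partial, $chunk === 0 ? 'wb' : 'ab');
$in  = fopen($_FILES['file']['tmp_name'], 'rb');
stream_copy_to_stream($in, $out);
fclose($in);
fclose($out);

// After the last chunk, rename the partial file to its final name.
if ($chunk === $chunks - 1) {
    rename($partial, __DIR__ . "/uploads/{$name}");
}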
I host a domain on a cheap hosting provider that only allows uploads of files under 50 MB. I want to upload larger files. Is there any working trick that doesn't involve changing php.ini? That isn't possible for me.
There isn't really an easy way around this. A couple of suggestions:
Resize images to make them smaller using JavaScript before uploading. You can use a FileReader and then a canvas to resize the image.
If you really need them that big, then split the images up into chunks: again use a FileReader to get the file into JavaScript, then extract the data in chunks and upload each chunk separately. You'd need some clever PHP code on the server to stick the chunks back together again. I would add some kind of chunk index to each upload, as you can't guarantee the chunks will arrive in order.
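A rough sketch of that reassembly, assuming the client sends made-up "index", "total" and "upload_id" fields with every chunk: each chunk is stored under its index, and the pieces are joined once they have all arrived, so order doesn't matter:

<?php
// reassemble.php - index-based chunk storage, tolerant of out-of-order arrival.
$index = (int) $_POST['index'];                         // 0-based chunk index
$total = (int) $_POST['total'];                         // total number of chunks
$id    = preg_replace('/\W/', '', $_POST['upload_id']); // sanitized upload id

$dir = __DIR__ . "/chunks/$id";
if (!is_dir($dir)) {
    mkdir($dir, 0755, true);
}
move_uploaded_file($_FILES['file']['tmp_name'], "$dir/$index");

// Once every chunk is present, concatenate them in index order.
if (count(glob("$dir/*")) === $total) {
    $out = fopen(__DIR__ . "/uploads/$id", 'wb');
    for ($i = 0; $i < $total; $i++) {
        fwrite($out, file_get_contents("$dir/$i"));
    }
    fclose($out);
}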
Since the max upload size is limited to 50MB, you can't do it this way, unless you split the file into parts smaller than 50MB and upload each one separately.
To split the file into many parts, you can use phpfsplit.
You can also use plupload.
I want to upload 1000-3000 images to a post using the 'Add media' functionality.
If I add them to the media upload window (drag and drop or select), the browser warns me that a script is lagging, e.g. on Firefox:
A script on this page may be busy, or it may have stopped responding. You can stop the script now, open the script in the debugger, or let the script continue. Script: ../wp-admin/load-scripts.php?c=0&load%5B%5D=jquery-core,jquery-migrate,utils,plupload,json2&ver=4.1:2
I'm guessing this is expected, as the ajax call to upload the images hasn't returned, hence it seems busy.
How can I tweak this to wait while this particular functionality runs?
Note: This is part of a plugin I am making where the user would be required to attach hundreds of images to each post (like a gallery). Of course, I want to use the existing functionality and not reinvent the wheel.
This is expected behaviour, as most ajax upload scripts send the files as you drag and drop them. Depending on the size of the photos, you could be hitting the maximum amount of RAM available to the browser (since most are 32-bit): 3000 images at 1 MB each is 3GB, which is near the limit. It would likely take a few hours to churn through that much data.
A suggestion would be to set up an SFTP account and then have a script import those files (see the sketch below). The transfer would take less time, and the bulk import wouldn't take long at all, a minute or two.
The reason I suggest this is that web browsers were not designed to do bulk uploads of files. Is it possible? Yes. Do I recommend it? No. Much like how I wouldn't recommend taking my Ferrari through a 3-foot-deep puddle. Stuffing the files through PHP for bulk uploads taxes your server as well. I wouldn't recommend trying to parallelize it either; you would add a significant load to your server and might cause the site to stop responding.
Doing the upload outside of your web server (Apache or nginx) is a much safer, more secure, less resource-draining solution.
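For example, once the files are on the server via SFTP, the import script could loop over them with WordPress's attachment API; a sketch with a made-up $post_id, run inside WordPress (e.g. from WP-CLI or a one-off admin page):

<?php
// import_media.php - registers files already on disk as attachments of a post.
require_once ABSPATH . 'wp-admin/includes/image.php';

$post_id = 123; // hypothetical target post
$uploads = wp_upload_dir();

foreach (glob($uploads['path'] . '/*.jpg') as $file) {
    $attachment_id = wp_insert_attachment(array(
        'post_mime_type' => 'image/jpeg',
        'post_title'     => basename($file),
        'post_status'    => 'inherit',
    ), $file, $post_id);

    // Generate the thumbnails/metadata a normal upload would get.
    $meta = wp_generate_attachment_metadata($attachment_id, $file);
    wp_update_attachment_metadata($attachment_id, $meta);
}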
If you want to add 1000 or more images to a post, upload them directly to
YOURSITENAME/wp-content/uploads/currentmonthfolder
and once that's done, add the img tags manually to the particular post.
I want a PHP script that uploads partial files, like the first 1 MB of a very large file. Can this be done in PHP with a regular form upload? Like, closing off the connection after 1MB has been uploaded...
I've looked at a lot of uploaders (HTML5/JavaScript/Flash), and some support 'chunking' and file size limits, which sounds like half of what I want... but I'd of course somehow need to know that what arrived is only part of a full file.
If you are open to using JavaScript, you can read the file into a blob on the client (which is far quicker than uploading), cut out everything after the first 1MB, and send that to PHP via ajax. It wouldn't technically be a traditional PHP upload form, but it would seem like one to the user.
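The PHP side then receives the truncated blob like any ordinary upload; a minimal sketch (the "partial" field name and target path are just examples) that double-checks the size:

<?php
// receive_partial.php - accepts the ~1MB blob sent via ajax.
$max = 1024 * 1024; // 1 MB

if (isset($_FILES['partial'])
    && $_FILES['partial']['error'] === UPLOAD_ERR_OK
    && $_FILES['partial']['size'] <= $max
) {
    move_uploaded_file($_FILES['partial']['tmp_name'], __DIR__ . '/partials/upload.bin');
    echo 'ok';
} else {
    http_response_code(400);
    echo 'expected at most 1MB';
}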
I have created a website with a Checkout button. Whenever a customer clicks the Checkout button, I need to create four image files in JPG format and save them on the server. These image files will later be printed by the site admin.
One of the image files can be 3MB to 4MB in size,
while the other three image files will mostly be between 250KB and 500KB.
The problem is that the first file, the 3MB-4MB one, takes some time to create, around 5-10 seconds (and that is when only one customer checks out at a time).
But when I performed simultaneous checkouts from two different devices, the total time increased to 20 seconds: one checkout was processed first while the other waited or ran a little slower, so one took 8 seconds and the other 12 seconds.
So I fear that if 10 or 20 customers check out simultaneously, some of them might have to wait for perhaps 2 or 3 minutes, or even more.
So can anyone please tell me how I can increase the rate at which PHP writes files, or increase PHP's execution speed? Or would increasing the RAM (currently 1GB) help?
Note:
-> I am using Imagick to create the image files.
-> I cannot reduce the JPG quality, since the images will be printed later at 150 dpi.
-> The first image file is a high-definition photograph, while the other three are only filled with solid colors.
-> Their resolutions are also high, between 2800 and 4400 pixels in both width and height.
-> A serial number is added to all four image files, so I cannot reduce their quality, otherwise the serial number might not be legible when all four are printed.
UPDATE:
The customer first uploads a photograph; then he can scale the image, crop it, or move it (for cropping purposes).
So in short, he is mainly cropping an uploaded image, and when he clicks the Checkout button the new image is created and a serial number is added to it.
I checked the code: the image gets processed/created in memory (RAM) in about 1 second, but when I add the code for writing the image to the server's hard drive, it takes 5-8 seconds with no simultaneous requests, or 10-15 minutes with 20-30 simultaneous requests.
Also, I have to show a success or a failure message, but I can only show the success message after all four image files are created, so even if I use ajax I still cannot show the success message before all the images are created.
Again, if I remove the code for writing the image, the time drops to 1 second with no simultaneous requests, or 5-10 seconds with 20-30 simultaneous requests.
So I think the problem is with writing the image file, not with processing or generating it.
Also, I cannot create the image file while the user is still cropping, as that would slow down the cropping process. And the situation would not improve but rather get worse: the problem would just shift from "what happens on simultaneous checkout" to "what happens on simultaneous crop". So this is not an option.
Also, I don't think it would look professional to tell customers that an email will be sent informing them whether their checkout succeeded.
Just a few thoughts; maybe one of them helps you with your task at hand:
Try to find out which of the steps you are performing on those images is the one eating away at your performance. Maybe show some code, so people can help.
Some image operations might take longer because of older hardware (CPU, RAM, HD). Is your hardware good enough?
Are you using the latest version of Imagick?
Try separating the image processing from the checkout process. Maybe write the raw data into a database, a file, or whatever first, and create a cron job (or trigger one) that does the processing, so your users don't have to wait for it (see the sketch below).
You might have to delve into your PHP and Apache configuration, too; unoptimized setups can cause performance issues as well.
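To make the fourth suggestion concrete, here is a minimal sketch of such a worker, assuming a made-up pending_jobs table that the checkout request inserts a row into, and a cron entry that runs the script every minute:

<?php
// worker.php - run from cron; generates the queued images so the checkout
// request only has to insert a row and return immediately.
$pdo = new PDO('mysql:host=localhost;dbname=shop', 'user', 'pass');

$jobs = $pdo->query('SELECT id, source_path FROM pending_jobs WHERE done = 0 LIMIT 10');
foreach ($jobs as $job) {
    $img = new Imagick($job['source_path']);
    // ... crop, add the serial number, etc. ...
    $img->writeImage('/var/output/' . $job['id'] . '.jpg');
    $img->destroy();

    $stmt = $pdo->prepare('UPDATE pending_jobs SET done = 1 WHERE id = ?');
    $stmt->execute(array($job['id']));
}

The catch, given the update above, is that the success message can no longer be shown synchronously; the page would have to poll (or be notified) until all four rows are marked done.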
I can't figure out a good solution for limiting the amount of storage a user's files may consume.
In the application, users are allowed to upload a limited amount of files, where the limit is based on total file size; i.e. a user might be allowed to store 50 MB on the server.
This number is stored in a database so that it can be easily increased/decreased.
The language used is PHP, but I guess the solution isn't dependent on the scripting language.
Very sorry if the question is unclear. I don't really know what to ask for beyond a strategy to implement.
Thanks!
Keeping track of how much space has been used should be straightforward: with each upload, store the space used in another table. The PHP filesize() function will tell you the size of a file on disk. Use a SUM() SQL query to get the total size of all the files uploaded by each user, and compare it against their quota limit.
The tricky bit is when you're approaching the limit: you can't tell how big a file is going to be before it's uploaded. So you'll have to let the user upload the file, then check its size and see if it takes them over quota. If the file is too big, delete it and let the user know they're out of space.
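In code, that check could look something like this; a sketch assuming a PDO connection and an uploads table with user_id and size columns (all names are illustrative):

<?php
// quota_check.php - reject an upload that would push the user over quota.
$pdo        = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');
$userId     = 42;               // the logged-in user (hypothetical)
$quotaBytes = 50 * 1024 * 1024; // e.g. 50 MB; read this from the database in practice

$size = $_FILES['upload']['size'];

// Total bytes this user has already stored.
$stmt = $pdo->prepare('SELECT COALESCE(SUM(size), 0) FROM uploads WHERE user_id = ?');
$stmt->execute(array($userId));
$used = (int) $stmt->fetchColumn();

if ($used + $size > $quotaBytes) {
    exit('You are out of space.'); // PHP discards the temp file when the request ends
}

// Under quota: move the file into place and record its size.
move_uploaded_file($_FILES['upload']['tmp_name'],
    __DIR__ . "/files/$userId/" . basename($_FILES['upload']['name']));
$stmt = $pdo->prepare('INSERT INTO uploads (user_id, size) VALUES (?, ?)');
$stmt->execute(array($userId, $size));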
A simple approach would be to store the filename, date, and size of a user's uploads in the database too. Then you can easily reject an upload when it exceeds their total storage.
This also makes it easy to show a list of files sorted in a variety of ways, allowing a user close to their limit to select some files for removal.
You could even use the average size of the files the user uploads to warn them when they are getting close to using up all their space.
You could use a script that iterates through a directory's contents, calculates file sizes, and then deletes files that don't fit or rejects new uploads. But I think this is better done with some sort of directory quota on the server. Unfortunately, I'm not a Linux guy, so I don't know exactly how to do that, but this post might be helpful.
drewm's solution is good; I just want to add a few words about the tricky part he mentioned.
Yes, it is impossible to predict the file size before the file is uploaded, as you cannot check the file size using JavaScript on the user's file upload page. However, you can do it with a Flash-based file uploader (swfupload.org, for example). With it you can check a file's size before the upload starts and compare it against your upload limit. This way you save the user's time (no need to upload the file just to get a "limit exceeded" error message).
As a bonus, you can show the user an upload progress bar as well.
Don't forget about OS-level solutions. If the files are stored in a user-specific directory, you can ask the OS for the disk space used in that directory. A Linux solution would be something like this:
// `du -ks` prints "<size in KB>\t<directory name>"; backticks run it as a shell command
$dirSize = explode("\t", `du -ks $userDir`); // [0] => size in KB, [1] => dir name
if ($dirSize[0] > MAX_DIR_LIMIT) print "USER IS OVER QUOTA"; // MAX_DIR_LIMIT is in KB