I have created a website with a Checkout button. Whenever a customer clicks the Checkout button, I need to create 4 image files in JPG format and save them on the server. These image files will later be printed by the site admin.
One of the image files can be 3MB to 4MB in size, while the other 3 are mostly between 250KB and 500KB.
The problem is that the first file (the 3MB-4MB one) takes some time to create, around 5-10 seconds, and that is when only one customer checks out at a time.
But when I performed simultaneous checkouts from 2 different devices, the total time increased to 20 seconds: one checkout was processed first while the other waited or ran a little slowly, so one finished in about 8 seconds and the other in about 12.
So I fear that if 10 or 20 customers check out simultaneously, some of them might have to wait 2 or 3 minutes, or perhaps even more.
Can anyone please tell me how I can increase the rate at which PHP writes files, or increase PHP's execution speed in general? Or would increasing the RAM (currently 1GB) help?
Note :-
-> I am using Imagick to create the image files.
-> I cannot reduce the JPG quality, as the images will be printed later at 150 DPI.
-> The first image file is a high-definition photograph, while the other 3 are only filled with solid colors.
-> Their resolutions are high, between 2800 and 4400 pixels in both width and height.
-> A serial number is added to all 4 image files, so I cannot reduce the quality, otherwise the serial number might not be legible when the 4 files are printed.
UPDATE :
The customer first uploads a photograph; then he can scale the image, crop it, or move it (for cropping purposes).
So in short, he is mainly cropping an uploaded image, and when he clicks the Checkout button the new image is created and a serial number is added to it.
I checked the code: the image gets processed/created in memory (RAM) in about 1 second, but once I add the code for writing the image to the server's hard drive, it takes 5-8 seconds with no simultaneous requests, or 10-15 minutes with 20-30 simultaneous requests.
I also have to show a success or failure message, and I can only show the success message after all 4 image files are created, so even if I use AJAX I still cannot show the success message before all the images exist.
Again, if I remove the code for writing the images, the time drops to 1 second with no simultaneous requests, or 5-10 seconds with 20-30 simultaneous requests.
So I think the problem is with writing the image files, not with processing or generating them.
I also cannot create the image file while the user is still cropping, as that would slow down the cropping process. And the situation would not change, but rather get worse: the problem would just shift from "what happens on simultaneous checkout" to "what happens on simultaneous crop". So this is not an option.
I also don't think it would look professional to tell the customer that an email will be sent informing them whether their checkout succeeded or not.
Just a few thoughts; maybe one of them helps you with your task at hand:
Try to find out which of the steps you perform on those images is the one eating away your performance. Maybe show some code, so people can help.
Some image operations might take longer because of older hardware (CPU, RAM, HD). Is your hardware good enough?
Are you using the latest version of IMagick?
Try separating the image processing from the checkout process. Maybe write the raw data into a database, a file, or whatever first, and create a cron job (or trigger one) which does the processing, so your users don't have to wait for it; see the sketch after this list.
You might have to delve into your PHP and Apache configuration, too. Unoptimized setups can also result in performance issues.
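To illustrate the queue-and-cron idea, here is a minimal sketch. The checkout_jobs table, the paths, and the variables $croppedImagePath and $serialNumber are placeholders I made up, not code from the question:

<?php
// Checkout side (sketch): record a job instead of writing the 4 images inline.
$pdo = new PDO('mysql:host=localhost;dbname=shop', 'user', 'pass');
$pdo->prepare("INSERT INTO checkout_jobs (source_path, serial, status)
               VALUES (?, ?, 'pending')")
    ->execute([$croppedImagePath, $serialNumber]); // data from the checkout request
// Respond to the customer right away; the heavy disk writes happen in the worker.

// Worker side (sketch): run from cron, e.g. * * * * * php process_jobs.php
$jobs = $pdo->query("SELECT id, source_path, serial
                     FROM checkout_jobs WHERE status = 'pending' LIMIT 5");
foreach ($jobs as $job) {
    $img  = new Imagick($job['source_path']);
    $draw = new ImagickDraw();
    $draw->setFontSize(72);
    $img->annotateImage($draw, 50, 100, 0, $job['serial']);     // stamp the serial number
    $img->writeImage('/var/www/output/' . $job['id'] . '.jpg'); // the slow write
    $pdo->prepare("UPDATE checkout_jobs SET status = 'done' WHERE id = ?")
        ->execute([$job['id']]);
}

Note that this conflicts with the asker's requirement to show a success message only after all 4 files exist; it trades that guarantee for a fast response.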
Related
I'm developing with IPS 4 and I have a profile popup.
The profile cover takes very long to load, because the cover's dimensions and file size are big. So I've decided to make a PHP API which resizes images to the needed size and then serves the resized image.
Is this a good idea to make the cover load faster?
You need to populate a 436x85 box with user-provided pictures.
My own digital camera has an 18 MPx sensor that produces 4896x3672 pictures weighing around 7 MB when compressed as JPEG. Imagine you display, say, a dozen profiles per page. That's 84 MB worth of network transfer (more than a typical MP3-encoded music album) for a single page. JPEG compression roughly achieves a 1/10 ratio, so you can assume 840 MB of RAM just to hold the decompressed pictures. And then you have the overhead of the browser resampling the pictures in real time.
On the other hand, a 436x85 JPEG uses 8 to 22 KB on average (depending on quality settings).
So if you use the raw pictures uploaded by users, of course it's not going to be fast.
Conclusion: always resize pictures yourself. And please do it only once; it's a heavy process even for your server.
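As a rough sketch of the resize-once approach with Imagick (the paths, the $userId variable, and the quality setting are placeholders; only the 436x85 box comes from the question):

<?php
// Sketch: build the 436x85 cover once at upload time, then serve the cached copy.
$src   = '/uploads/covers/original_' . $userId . '.jpg'; // the user's raw upload
$thumb = '/uploads/covers/cover_' . $userId . '.jpg';    // pre-rendered 436x85 version

if (!file_exists($thumb)) {
    $img = new Imagick($src);
    $img->cropThumbnailImage(436, 85);    // scale and center-crop to exactly 436x85
    $img->setImageCompressionQuality(82); // plenty for a small cover image
    $img->writeImage($thumb);
}
// Pages then embed the small file:
// <img src="/uploads/covers/cover_123.jpg" width="436" height="85">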
Yes, it is a good idea to store not only the original image but the resized versions too, because otherwise every user requesting the page downloads the big image, which is basically a waste of transfer, makes the user wait, and leads to a poor user experience.
You should write a script which resizes newly uploaded images and saves them to your server, and use those instead of the big originals. But also don't forget that resizing is really CPU-heavy, so it would be a good idea to queue this action and not do it instantly during the user's request.
I want to upload 1000-3000 images to a post using the 'Add media' functionality.
If I add them to the media upload window (drag and drop or select), the browser warns me that a script is lagging. E.g., on Firefox:
A script on this page may be busy, or it may have stopped responding. You can stop the script now, open the script in the debugger, or let the script continue. Script: ../wp-admin/load-scripts.php?c=0&load%5B%5D=jquery-core,jquery-migrate,utils,plupload,json2&ver=4.1:2
I'm guessing this is expected, as the AJAX call to upload the images hasn't returned, hence the script seems busy.
How can I tweak this to wait while this particular functionality runs?
Note: This is part of a plugin I am making where the user would be required to attach hundreds of images in each post (like a gallery). Of course I want to use the existing functionality and not reinvent the wheel.
This is expected behaviour, as most AJAX upload scripts send the files as you drag and drop them. Depending on the size of the photos, you could be consuming the browser's maximum amount of RAM (since most browsers are 32-bit): 3000 images at 1 MB each is 3 GB, which is near the limit. It would likely take a few hours to churn through that much data.
A suggestion would be to set up an SFTP account and then have a script import those files. The transfer would take less time, and the bulk import wouldn't take long at all, a minute or two.
The reason I suggest this is that web browsers were not designed for bulk file uploads. Is it possible? Yes. Do I recommend it? No, much like I wouldn't recommend taking my Ferrari through a 3-foot-deep puddle. Stuffing the files through PHP for bulk uploads taxes your server as well. I wouldn't recommend trying to parallelize it either: you would add a significant load to your server and might cause the site to stop responding.
Doing the upload outside of your web server (Apache or nginx) is a much safer, more secure, and less resource-draining solution.
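For the import half, a hedged sketch using WordPress's own attachment functions; the folder name and post ID are placeholders, and you would run this once via WP-CLI or a one-off admin script:

<?php
// import_images.php — attach SFTP-uploaded files to a post, sketch only.
require_once ABSPATH . 'wp-admin/includes/image.php'; // for wp_generate_attachment_metadata()

$dir     = wp_upload_dir()['basedir'] . '/bulk-import/'; // where SFTP dropped the files
$post_id = 123;                                          // the post to attach them to

foreach (glob($dir . '*.jpg') as $file) {
    $type = wp_check_filetype(basename($file));
    $attachment_id = wp_insert_attachment([
        'post_mime_type' => $type['type'],
        'post_title'     => sanitize_file_name(basename($file)),
        'post_status'    => 'inherit',
    ], $file, $post_id);
    // Build the intermediate sizes WordPress would normally create on upload.
    $meta = wp_generate_attachment_metadata($attachment_id, $file);
    wp_update_attachment_metadata($attachment_id, $meta);
}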
If you want to add 1000 or more images to a post, upload them directly to
YOURSITENAME/wp-content/uploads/currentmonthfolder
and once that's done, add the img tags manually to the particular post.
We're developing a website where users may change their slider images (a fullscreen slider). Each image is around 2000px by 2000px, and we allow users to upload as many images as they want in our HTML form (capped at 10).
Their upload speed will be pretty slow, and I believe we'll easily exceed PHP's max_execution_time, which defaults to 30 seconds. We will also let users upload .rar/.zip files in the future, capped at 100MB.
We had a few ideas, but I wanted to ask SO for a better solution/review.
1. We could raise the 30 seconds to a much higher value, since we have access to php.ini, and let users upload all images at once, but that may create performance issues in the long term. (This is not an option!)
2. We can make use of JavaScript on the client side. For each image path specified in the HTML form, JavaScript posts it with XMLHttpRequest one at a time and waits for a response; if the response is true, it moves on to the next image and uploads it. (Benefit: each image starts its own PHP request and gets its own 30-second lifetime.)
3. The JavaScript solution won't work for file uploads above 50MB. Customers usually top out at 25kbps upload speed in the target region, so there is no way they can upload 50MB in 30 seconds. Similar to #2, we may use a script where the uploaded file is saved in chunks every 30 seconds and the client keeps pushing the remaining bytes, or something along those lines.
Basically, how would you complete this task?
We don't want to rely on php.ini, so increasing max_execution_time shouldn't be an option.
Should we go with #2 for image uploads, and what can you suggest for #3?
Take a look into chunked uploads.
You'll need to use an uploader script like JUpload or Plupload. You can specify how large each chunk of a file sent to the server should be. For example, if you have a 10MB file, you can chunk it at 1MB, so 10 1MB chunks would be uploaded to the server. For slow connections, just make the chunks smaller, e.g. 500KB.
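On the server side, each chunk arrives as its own HTTP request, so each gets a fresh 30-second limit. A minimal receiving sketch; the chunk, chunks, and name field names follow Plupload's defaults, but verify them against your uploader's documentation:

<?php
// upload.php — sketch of the receiving end for chunked uploads.
$chunk  = isset($_POST['chunk'])  ? (int) $_POST['chunk']  : 0; // 0-based chunk index
$chunks = isset($_POST['chunks']) ? (int) $_POST['chunks'] : 1; // total chunk count
$name   = isset($_POST['name']) ? basename($_POST['name'])
                                : basename($_FILES['file']['name']);
$target = '/var/uploads/' . $name . '.part';

// Append this chunk to the partial file (truncate on the first chunk).
$out = fopen($target, $chunk === 0 ? 'wb' : 'ab');
$in  = fopen($_FILES['file']['tmp_name'], 'rb');
stream_copy_to_stream($in, $out);
fclose($in);
fclose($out);

// Last chunk received: promote the partial file to its final name.
if ($chunk === $chunks - 1) {
    rename($target, '/var/uploads/' . $name);
}
echo 'true'; // the client-side loop from idea #2 waits for this response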
I'm in the process of making a website for a school group, and the website will have a lot of pictures of its activities. I'm using Lightbox so the pictures display as a slideshow, and I changed the dimensions of the pictures so the 'current' picture in the slideshow isn't as big as the original. I still see about a 5-second delay when opening a picture or going to the next one. I'm wondering if there's a way to achieve a faster load time, or another method I haven't considered.
I'm using XHTML, CSS, and PHP for my site.
Sorry for posting this as an answer, but I can't write comments yet...
I do it another way: after upload, I resample the picture into a big version and a thumbnail.
Of course you could resample the picture every time instead, but that is probably the reason you have to wait so long, and it is a lot of work for the server if you have many visitors at the same moment.
So for me the best way is to decide that the big image is 640x480 max, and save the picture at that size right after upload, resampled at the same aspect ratio of course.
Edit: From your post, I can't tell whether you are resizing/resampling the image, where you do it (in HTML by setting height and width, or in PHP), and how often.
Let's say you have a picture on the server with dimensions of 8000x6000; its file size could be something like 10 MB.
Now let's say you want to display this image in a web page and do it like this:
<img src="largeImage.jpg" width="800" height="600"/>
The browser will download the large image (10 MB), which takes a whole lot of time, and will then resize it to 800x600 in memory to display it on the web page (which consumes memory and CPU time). Total time: 25 seconds.
Now suppose you resize this image to 800x600 and put this resized image on the server. You then display the image with
<img src="smallImage.jpg" width="800" height="600"/>
The small image will look identical to the user visiting your web page, but the browser will only have to download a small file, something like 100 KB (100 times less than the large image). The download time will be divided by 100 as well (0.25 seconds), and the browser won't have to load and resize a huge image in memory (less memory, less CPU time). Your image will be visible almost instantly.
There are many tools (I use IrfanView myself) that can take a large collection of images and resize them all at once. You should do that.
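If you'd rather script it on the server than use a desktop tool, a one-off batch pass with Imagick might look like this (the folder names are placeholders):

<?php
// batch_resize.php — resize every JPEG in a folder to fit 800x600, one time only.
foreach (glob('/var/www/photos/originals/*.jpg') as $src) {
    $dst = '/var/www/photos/small/' . basename($src);
    if (file_exists($dst)) {
        continue; // already resized — do the heavy work only once
    }
    $img = new Imagick($src);
    $img->thumbnailImage(800, 600, true); // bestfit: keep the aspect ratio
    $img->writeImage($dst);
    $img->destroy();
}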
I am building a web application that allows the user to upload high-res images (in the ballpark of 10MB). After an image is uploaded, the app creates a medium-sized and a thumbnail-sized version of it. It seems to take upwards of 100MB of memory allocated to PHP just to resize to the medium image. I don't have a lot of experience with this kind of scalability. Will the site easily crash? Will I need web servers with 16GB of memory just to handle the resizing load? Are there alternatives? Any information would be greatly appreciated!
Thank you!
You could create a queue of images to be resized and ensure that only x images are being resized at any given time, where x depends on the amount of available memory.
If you resize the images in real time as soon as they are uploaded, you are bound to run into a situation where more images are being resized than your memory can hold, which would cause a crash.
Instead, as the images are uploaded, add them to a DB. Then have a PHP script which fetches x images from the DB and forks a new process for each of them to resize it. As each process reports completion to the parent, the parent deletes that image's entry from the queue and fetches another. Wash, rinse, repeat.
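A bare-bones sketch of that fork-per-image loop, assuming the pcntl extension (CLI only) and a hypothetical resize_queue table:

<?php
// worker.php — resize queued images with at most $maxWorkers children in parallel.
$maxWorkers = 4; // tune to memory: workers * ~100MB per resize must fit in RAM

$pdo  = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');
$jobs = $pdo->query("SELECT id, path FROM resize_queue LIMIT 20")->fetchAll();

$running = 0;
foreach ($jobs as $job) {
    if ($running >= $maxWorkers) {
        pcntl_wait($status); // block until one child finishes
        $running--;
    }
    $pid = pcntl_fork();
    if ($pid === 0) {
        // Child: resize exactly one image, then exit so the parent can reap it.
        // (Children must not reuse the parent's $pdo handle after forking.)
        $img = new Imagick($job['path']);
        $img->thumbnailImage(1024, 0); // medium size; 0 keeps the aspect ratio
        $img->writeImage(preg_replace('/\.jpg$/', '_medium.jpg', $job['path']));
        exit(0);
    }
    $running++;
}
while ($running-- > 0) {
    pcntl_wait($status); // reap the remaining children
}
// The parent would delete finished rows from resize_queue here (omitted).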
- Run benchmarks.
- Do processing in the background.
- Do processing during off-peak hours.
- Use lighter libraries.