I want to upload a large number of images (1,000-3,000) to a post using the 'Add media' functionality.
If I add them to the media upload window (drag and drop or select), the browser warns me that the script is lagging, e.g. on Firefox:
A script on this page may be busy, or it may have stopped responding. You can stop the script now, open the script in the debugger, or let the script continue. Script: ../wp-admin/load-scripts.php?c=0&load%5B%5D=jquery-core,jquery-migrate,utils,plupload,json2&ver=4.1:2
I'm guessing this is expected, as the AJAX call to upload the images hasn't returned, so it seems like the script is busy.
How can I tweak this so the browser keeps waiting while this particular functionality runs?
Note: this is part of a plugin I am making where the user would be required to attach hundreds of images to each post (like a gallery). Of course I want to use the existing functionality and not reinvent the wheel.
This is expected behaviour, as most AJAX upload scripts send the files as you drag and drop them. Depending on the size of the photos, you could be hitting the maximum amount of RAM available to the browser (since most are still 32-bit): 3,000 images at 1 MB each is 3 GB, which is near the limit. It would likely take a few hours to churn through that much data.
A suggestion would be to set up an SFTP account and then have a script import those files. The transfer would take less time, and the bulk import wouldn't take long at all, a minute or two.
The reason I suggest this is that web browsers were not designed to do bulk uploads of files. Is it possible? Yes. Do I recommend it? No. Much like I wouldn't recommend taking my Ferrari through a 3-foot-deep puddle. Your method of stuffing the files through PHP for bulk uploads taxes your server as well. I wouldn't recommend trying to parallelize it either: you would add a significant load to your server and might cause the site to stop responding.
Doing the upload outside of your web server (Apache or Nginx) is a much safer, more secure, and less resource-draining solution.
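If the files have already been pushed up over SFTP, the import script could be quite small. A minimal sketch, assuming the files sit in the current month's uploads folder and using WordPress's own attachment functions; the wp-load.php path and the post ID are illustrative and need adjusting to your install:

```php
<?php
// bulk-import.php - run from the command line (php bulk-import.php),
// not through the browser, to avoid request timeouts.
// The wp-load.php path and the target post ID below are illustrative.
require_once '/var/www/html/wp-load.php';
require_once ABSPATH . 'wp-admin/includes/image.php';

$post_id = 123;                      // hypothetical post to attach to
$dir     = wp_upload_dir();          // e.g. .../wp-content/uploads/2015/03
$files   = glob( $dir['path'] . '/*.{jpg,jpeg,png,gif}', GLOB_BRACE );

foreach ( $files as $file ) {
    $type = wp_check_filetype( basename( $file ), null );

    // Register the already-transferred file as an attachment of the post.
    $attach_id = wp_insert_attachment( array(
        'post_mime_type' => $type['type'],
        'post_title'     => sanitize_file_name( basename( $file ) ),
        'post_content'   => '',
        'post_status'    => 'inherit',
    ), $file, $post_id );

    // Generate intermediate sizes/metadata as the media uploader would.
    wp_update_attachment_metadata(
        $attach_id,
        wp_generate_attachment_metadata( $attach_id, $file )
    );

    echo "Imported $file as attachment $attach_id\n";
}
```

Run once, this registers everything in the media library exactly as if it had gone through the uploader, minus the browser.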
If you want to add 1,000 or more images to a post, upload them directly to YOURSITENAME/wp-content/uploads/currentmonthfolder, and once that is done, add the img tags manually to the post in question.
I am creating an upload interface to upload files in PHP.
Files are uploading fine.
But I want to give the user some feedback on how much time the upload will take, how much of it has already been done, etc.
I have found code online that provides an AJAX plugin to do what I want.
But my question is more fundamental: WHERE do I get the data in PHP that tells me how much of the file has been received? What is the connection speed (connection speed and file size can be used to compute the time left)? And where does the other information needed come from?
Can I get this data from PHP, or am I looking in the wrong place?
Everything is in the documentation:
http://www.php.net/manual/en/session.upload-progress.php
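To make that concrete, here is a minimal sketch of a progress endpoint built on that feature, assuming the defaults (session.upload_progress.enabled = On) and a hidden form field named PHP_SESSION_UPLOAD_PROGRESS with the value "myupload" placed before the file input:

```php
<?php
// progress.php - poll this via AJAX while the upload is in flight.
// The upload form must contain, BEFORE the file input, a hidden field
// named after session.upload_progress.name (default:
// PHP_SESSION_UPLOAD_PROGRESS) whose value here is "myupload".
session_start();

$key = ini_get( 'session.upload_progress.prefix' ) . 'myupload';

if ( isset( $_SESSION[ $key ] ) && $_SESSION[ $key ]['content_length'] > 0 ) {
    $p = $_SESSION[ $key ];
    // bytes_processed / content_length is the fraction received so far;
    // combined with start_time you can also estimate the time left.
    $percent = round( $p['bytes_processed'] / $p['content_length'] * 100 );
} else {
    $percent = 0; // not started, or already finished and cleaned up
}

header( 'Content-Type: application/json' );
echo json_encode( array( 'percent' => $percent ) );
```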
WHERE do I get the data in PHP that tells me how much of the file has been received?
You don't. Uploading is handled by the webserver. When the upload is complete your PHP script will run and it gets a reference to the temp file(s) created by your webserver.
I think there are plugins or mods that do allow you to monitor these processes. There is session.upload-progress,
but there is a much easier solution!
Use the JS FileReader API to slice the file into small chunks (like 5 MB); then you can already make a pretty good loading bar. Additionally, you can monitor the progress of the XMLHttpRequests to see how many bytes have been sent. This should get you a pretty spot-on progress indicator.
This also alleviates common problems with exceeding the upload_max_filesize limit.
An actual client-side solution is quite involved, so I will refrain from posting one; you should be able to find samples or tutorials.
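The receiving end, though, is small enough to sketch. A minimal version, assuming the client POSTs each slice with hypothetical fields chunkIndex, totalChunks, fileName, and a file field named chunk:

```php
<?php
// upload-chunk.php - receives one slice per POST and appends it.
// Field names (chunkIndex, totalChunks, fileName, chunk) are hypothetical;
// match them to whatever your client-side slicing code actually sends.
$index = (int) $_POST['chunkIndex'];
$total = (int) $_POST['totalChunks'];
$name  = basename( $_POST['fileName'] );   // basename() blocks path tricks

$partial = sys_get_temp_dir() . '/' . $name . '.part';

// Append this chunk (assumes the client sends chunks strictly in order).
file_put_contents(
    $partial,
    file_get_contents( $_FILES['chunk']['tmp_name'] ),
    FILE_APPEND
);

if ( $index + 1 === $total ) {
    // Last chunk: move the assembled file to its final home.
    rename( $partial, __DIR__ . '/uploads/' . $name );
}

// Report progress back; each request stays well under upload_max_filesize.
echo json_encode( array( 'received' => $index + 1, 'of' => $total ) );
```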
I'm currently rewriting a website that needs a lot of different sizes for each image. In the past I did this by creating the thumbnail images for all sizes on upload. But now I have doubts about the performance of that approach, because I have to change my design and half of my images are no longer the right size. So I'm considering two solutions:
Keep doing this and add a button on the backend to regenerate all the images. The problem is that I always need to know every size needed by every part of the site.
Only upload the real-size image, and when displaying it, put something like src="thumbs.php?img=my-image-path/image.jpg&width=120&height=120" in the SRC attribute, then create the thumb and display it. My script would also check whether the thumb already exists; if it does, it doesn't need to be recreated and is just displayed. Every 5 days, a cron task would launch a script to delete all the thumbs (to be sure only the useful ones are kept).
I think the second solution is better, but I'm a little concerned by the fact that I need to call PHP every time an image is shown; even if the thumb has already been created, it's PHP that serves it...
Thanks for your advice.
Based on the original question and subsequent comments, it sounds like on-demand generation would be suitable for you, as it doesn't sound like you will have a demanding environment in terms of absolutely minimizing download time to the end client.
It seems you already have a grasp of the option to give your <img> tags a src value that is a PHP script, with that script either serving up a cached thumbnail if it exists, or generating it on the fly, caching it, and then serving it up, so let me give you another option.
Generally speaking, utilizing PHP to serve up static resources is not a great idea as you begin to scale your site, for two reasons:
First, it requires the additional overhead of invoking PHP to serve these requests, something a basic web server like Apache or Nginx is far better optimized for. This means your site will be able to handle less traffic per server, because it is using extra memory, CPU, etc. to serve up this static content.
Second, it makes it hard to move those static resources into a single repository outside of the server (such as a CDN). It means you would have to duplicate the files on each and every web server powering the site.
As such, my suggestion would be to still serve the images as static files via the web server, but generate thumbnails on the fly if they are missing. To achieve this you can simply create a custom redirect rule or 404 handler on the web server, such that requests in your thumbnail directory which do not match an existing thumbnail image are redirected to a PHP script that automatically generates the thumbnail and serves up the image (without the browser even knowing). Future requests against this thumbnail would then be served as a static image.
This scales quite nicely because, if in the future you need to move your static images to a single server (or a CDN), you can just use an origin-pull mechanism that tries to get the content from your main servers, which will auto-generate it via the same mechanism I just mentioned.
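As a rough sketch of that mechanism, assuming Apache, JPEG sources in an originals/ directory, and thumbnail URLs shaped like /thumbs/photo-120x120.jpg (every name here is illustrative):

```php
<?php
// thumb.php - generates a missing thumbnail once, caches it to disk,
// and serves it; every later request hits the static file directly.
// Routed to only when the file is missing, e.g. with Apache:
//   RewriteCond %{REQUEST_FILENAME} !-f
//   RewriteRule ^thumbs/(.+)-(\d+)x(\d+)\.jpg$ thumb.php?img=$1&w=$2&h=$3 [L]

$img = basename( $_GET['img'] );   // e.g. "photo" -> originals/photo.jpg
$w   = (int) $_GET['w'];
$h   = (int) $_GET['h'];

$source = __DIR__ . '/originals/' . $img . '.jpg';
$thumb  = __DIR__ . '/thumbs/' . $img . '-' . $w . 'x' . $h . '.jpg';

if ( ! file_exists( $source ) || $w < 1 || $h < 1 ) {
    http_response_code( 404 );
    exit;
}

// Resize with GD (this simple version does not preserve aspect ratio).
$src = imagecreatefromjpeg( $source );
$dst = imagecreatetruecolor( $w, $h );
imagecopyresampled( $dst, $src, 0, 0, 0, 0, $w, $h, imagesx( $src ), imagesy( $src ) );

// Cache to disk so the web server serves the next request statically...
imagejpeg( $dst, $thumb, 85 );

// ...and stream this first one ourselves, so the browser never notices.
header( 'Content-Type: image/jpeg' );
readfile( $thumb );
```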
Use the second option if you don't have much storage, and the first if you don't have much CPU.
Or you can combine them: generate and store the image on the first call to the PHP thumbnail generator, and from then on just give back the cached image.
With this solution you'll have only the necessary images, and you can periodically delete the older ones if you want.
I've been on a project for the past few days and hit a problem displaying large quantities of images (20+ GB total, ~1-2 GB per directory) in a gallery on one area of the site. The site is built on the Bootstrap framework. I've been trying to make massive carousels, but they ultimately do not function fluidly due to the combined size of the images. Question A: in this situation do I need to store the images in a database and read them from there? Is this faster than serving them from an /images folder on the front end?
And B) in my PHP script I need to assign the directories to variables, iterate through them, and display the images inside <li> elements. How do I go about putting controls on memory usage so as not to overload the browser? Any additions, suggestions, or alternatives would be greatly appreciated. I'm looking for the most direct means to an end here.
Though the question is a little generic, here are some thoughts in regards to your two questions:
A) No, performance pulling images from a database would most likely be worse than pulling straight from the file system. In general, it is not a good idea to store images or other binary data in databases unless you absolutely have to, because databases can't do much with this information and you are just adding an extra layer on top of the file system that doesn't need to be there. You would, however, want to store paths to images in your database, potentially along with other characteristics such as image dimensions, thumbnail paths, keywords, etc. Then your application would read the entries for the images to return the correct paths to the images.
B) You will almost certainly want to implement some sort of paging if you are displaying many hundreds or thousands of photos. If the final display must be a carousel, you will want to investigate the Javascript that drives it to determine how you could hook in a function that retrieves more results from your PHP application via an AJAX call when it reaches the end or near end of the current listing of images. If you are having problems with the browser crashing due to too many images, you will also want to remove images from the first part of the list of <li>s when you load new ones so that it keeps the DOM under control.
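To make the paging concrete, here is a minimal sketch of the PHP endpoint such an AJAX call could hit; the gallery database, the images table and its columns, and the credentials are all hypothetical:

```php
<?php
// images.php?page=3 - returns one page of image records as JSON,
// for the carousel to append when the user nears the end of the list.
$perPage = 50;
$page    = isset( $_GET['page'] ) ? max( 1, (int) $_GET['page'] ) : 1;
$offset  = ( $page - 1 ) * $perPage;

$pdo  = new PDO( 'mysql:host=localhost;dbname=gallery', 'user', 'pass' );
$stmt = $pdo->prepare(
    'SELECT path, thumb_path FROM images ORDER BY id LIMIT :lim OFFSET :off'
);
$stmt->bindValue( ':lim', $perPage, PDO::PARAM_INT );
$stmt->bindValue( ':off', $offset, PDO::PARAM_INT );
$stmt->execute();

header( 'Content-Type: application/json' );
echo json_encode( $stmt->fetchAll( PDO::FETCH_ASSOC ) );
```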
A) It's a bad idea to store that much binary data in a database. Even if the DB allows it, you shouldn't do it: it gives you much higher memory consumption, since all your data gets loaded into the database's memory space and then copied into PHP's memory space for you to handle, which eats up twice the memory, plus the overhead of running a database server, querying it, etc. So no, it's slower to use a database; accessing the filesystem directly is faster. If you also use Varnish or another front-end caching system, you'll be able to serve content much faster still.
What I would do is store the files on the filesystem. The best servers for static serving like that are G-WAN and Nginx, but do your reading and decide for yourself what suits you best. The point is: stay away from Apache, and preferably host all those static files on a separate server running a lightweight HTTP server.
ProTip: save multiple copies of the same image at scaled-down sizes, for example one at 50% and another at 25% of the original image size. That way you can send the thumbnails first for quick browsing, and when a user decides to view an image, you serve up the 50% or 100% size depending on their screen size. You save yourself bandwidth and memory, and you also save mobile users a big 3G bill.
B) This is where it makes some sense to use a database. You can index all the directories into a database and use it to store the location of each image in the FS, perhaps some tags, and maybe even the number of views, etc.
Then on the frontend you'd implement a script that fetches, for example, 50 thumbnails per page; the user can scroll around using some fancy jQuery, and when you need to fetch more, you simply get a new result set with 50 more thumbs, and so on.
This way you'll save memory and bandwidth, and your users will thank you for such a lightweight browsing experience!
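A minimal sketch of the indexing pass, under the same assumptions (a hypothetical images table with a unique index on its path column, so re-runs are harmless); run it from cron or after each batch of files lands on disk:

```php
<?php
// index-images.php - walks the image directories once and records each
// file's path in the database so the frontend can page through them.
// Credentials and the images table are illustrative.
$pdo = new PDO( 'mysql:host=localhost;dbname=gallery', 'user', 'pass' );
$ins = $pdo->prepare( 'INSERT IGNORE INTO images (path) VALUES (:path)' );

$it = new RecursiveIteratorIterator(
    new RecursiveDirectoryIterator( '/var/www/images', FilesystemIterator::SKIP_DOTS )
);

foreach ( $it as $file ) {
    // Only index actual image files.
    if ( preg_match( '/\.(jpe?g|png|gif)$/i', $file->getFilename() ) ) {
        $ins->execute( array( ':path' => $file->getPathname() ) );
    }
}
```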
Another tip:
If you want to be able to handle bigger traffic, you might want to consider using a CDN; there are many CDN services that aren't as expensive as Amazon S3, and a simple search will give you tons of resources!
Happy hacking!
It is possible to track the upload status in PHP (with APC) and I'm wondering if I can cancel the transfer somehow from PHP. Is it possible?
Unfortunately, no. The upload is something which is triggered and controlled by the client (think about it — it is a part of the POST request). This means that even if the upload is going to a 404, it will still continue to upload until the server returns the response.
Maybe you could use jQuery "chunked" file uploaders, so the upload is made in chunks (through jQuery) and the server can decide when to stop receiving. You'd have to modify the chunking plugin, I guess, but that shouldn't be too much effort.
It still doesn't offer the possibility to cancel in the middle of a chunk, but you could chunk in 1% bits and make sure you have plenty of opportunities to tell the jQuery script that you don't want any more.
See https://github.com/blueimp/jQuery-File-Upload/wiki/Chunked-file-uploads
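For what "the server can decide when to stop receiving" could look like on the PHP end, here is a minimal sketch; it assumes the client sends numbered chunks and checks the JSON reply before sending the next one, and both the field names and the cancel condition are illustrative:

```php
<?php
// receive-chunk.php - accepts one chunk at a time, but tells the client
// to stop as soon as a server-side cancel condition is met.
session_start();

$name    = basename( $_POST['fileName'] );
$partial = sys_get_temp_dir() . '/' . $name . '.part';

// Illustrative cancel conditions: a flag set elsewhere in the session,
// or a hard size cap on the assembled file.
$cancelled = ! empty( $_SESSION['cancel_upload'] )
    || ( file_exists( $partial ) && filesize( $partial ) > 50 * 1024 * 1024 );

if ( $cancelled ) {
    @unlink( $partial );            // discard the partial file
    http_response_code( 409 );      // any agreed-upon "stop" signal works
    echo json_encode( array( 'abort' => true ) );
    exit;
}

file_put_contents(
    $partial,
    file_get_contents( $_FILES['chunk']['tmp_name'] ),
    FILE_APPEND
);
echo json_encode( array( 'abort' => false ) );
```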
I've been thinking about various ways of handling file uploads in a sort of CMS. I'm writing here because I am not satisfied with what I've got right now...
The problem
Hmm, let's call it the Tumblr way ;-) The user should be able to upload a file, or several files, directly, without a file management view or anything like that. The downside is that if they delete the file in the WYSIWYG editor, the file stays on the server. In my case there is not only a WYSIWYG editor but also a media module...
The question
Is there a best practice for handling this? I've never programmed something like that before. Would you store the filenames in a MySQL table? Would you use a cron job to check whether the files are actually used in the document?
Any advice would be really appreciated!
Many thanks and best regards!
Personally I use a cron job that runs once a day and cleans up any orphaned uploaded files (orphans older than x days).
I admit, I'm curious about other people's approaches.
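For what it's worth, a minimal sketch of that kind of cleanup script, assuming filenames are recorded in a hypothetical media table whenever a file is actually used in a document; anything older than x days (here, 7) with no row in that table gets deleted:

```php
<?php
// cleanup-orphans.php - run daily from cron, e.g.:
//   0 3 * * * php /var/www/cron/cleanup-orphans.php
// Upload directory, credentials, and the media table are illustrative.
$uploadDir = '/var/www/uploads';
$maxAge    = 7 * 24 * 3600;   // the "x days" threshold: here, 7 days

$pdo   = new PDO( 'mysql:host=localhost;dbname=cms', 'user', 'pass' );
$check = $pdo->prepare( 'SELECT COUNT(*) FROM media WHERE filename = :name' );

foreach ( glob( $uploadDir . '/*' ) as $file ) {
    // Skip files that are too young; they may not be referenced *yet*.
    if ( time() - filemtime( $file ) < $maxAge ) {
        continue;
    }
    $check->execute( array( ':name' => basename( $file ) ) );
    if ( (int) $check->fetchColumn() === 0 ) {
        unlink( $file );       // orphan: on disk but referenced nowhere
    }
}
```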
Why so much hassle for some additional space used? Hard drive space shouldn't be a concern, since it's so cheap. And even if it were not, the images are very lightweight resources.
The only problem I can imagine is that your CMS's users are uploading very large files. In that case, you should process the images before saving them, lowering their quality and size.
I think a cron job would be more CPU-intensive than leaving a few 'ghost' files around.
However, you could try to catch the moment an image is deleted, but then again, that could be more trouble than it's worth.