I've built a CMS with a photo album. Pretty simple: mostly static HTML pages, no database, just (as few as possible) text files containing some JSON.
The web interface for the admin panel is all jQuery with a PHP (Zend Framework) backend. As much as possible is done in the browser, so the backend is pretty bare.
Now the photo album works currently like this:
Clicking link 'Media'
Fetching a JSON string from the backend containing an object with all albums and, for every album, all the photos
Rendering an unordered list with all the albums
Rendering an unordered list within each album list item with all the pictures
Uploading:
Drop one or more jpeg/png files into the browser to create a new album
Drop one or more jpeg/png files into an album to append those files to the album under the cursor
Send all dropped files (using this jQuery drag drop upload plugin) to the backend
Backend receives all files (while the web interface displays a nice progress bar)
Backend loops through all uploaded files, while the webinterface displays a nice spinner
Each file is resized to a specified maximum size, and a thumbnail is rendered at max 133x133 px
Each file is appended to an array with the serverside filename and thumbnail name
[Not yet implemented: rendering the (updated) static html page for the album overview and for each image]
Array with all newly uploaded files is converted to JSON and sent to client
Webinterface appends all new files as list items (displaying the thumbnail)
Uploading done
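The resize step above has to preserve aspect ratio while honoring the maximum sizes. A minimal sketch of that calculation (a hypothetical helper, not part of the upload plugin), with the 133x133 thumbnail bound from the steps above as the default:

```javascript
// Fit (width, height) into a square bounding box, preserving aspect ratio.
// maxSide defaults to 133 to match the thumbnail size described above.
function fitWithin(width, height, maxSide) {
  maxSide = maxSide || 133;
  var scale = Math.min(maxSide / width, maxSide / height, 1); // never upscale
  return {
    width: Math.round(width * scale),
    height: Math.round(height * scale)
  };
}
```

For example, fitWithin(1600, 1200) yields 133x100, and an image already smaller than the box is left untouched.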
This is all going pretty well, up to +- 600 images or +- 900MB. That's fine by me; if the user wants to upload more files at once, well, do it in two stages. The problem is that the backend processing is a bitch. Converting 100+ images of a decent size (1.5MB each) down to the maximum size and generating the thumbnails takes way too long. I'm doing this with PHP GD. It didn't take me much time (or any time at all) to find out that that's the bottleneck. My guess is that there's no way I'm going to speed this up within PHP.
So here are a few questions:
Will ImageMagick be faster? I'm not a fan, so please say no; I also don't really want to install it on my server.
Is there a really, really lightweight command-line program that does the same with just a few commands (and no, I'm not alluding to ImageMagick)?
If the answer to the previous question is no: what would be the best way to do this? Don't say Java, I'm not that big a fan of Java either. Some C (dialect)? Preferably one with a powerful yet lightweight image library supporting the nearest-neighbor, bilinear and bicubic interpolation algorithms.
Could my architecture be changed? At the moment, the images only start appearing in the browser once the whole JSON array is received, i.e. after the entire action has completed and all the image data has been generated, so no feedback reaches the browser until then. This means that the spinner (with no indication of how long the process will take or how many images have been completed) is displayed for a long, long time. Would it be an idea to use JavaScript's FileReader to read the images from the user's system, generate the thumbnails in the browser, and display them immediately once uploading is done? And on the backend: just receive the files, write them to disk, kick off a command-line conversion in the background, and immediately send the response to the browser?
How do I prevent the client-side abort event of an AJAX request? While uploading and converting, a warning should be displayed when the user wants to close the page or tries to change the #hash.
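For the last question: navigation can't reliably be blocked, only warned about. A sketch of the usual pattern (the counter variable and message text are assumptions), where the page tracks in-flight uploads and only shows the browser's leave-page dialog while that count is non-zero:

```javascript
var uploadsInProgress = 0; // increment before each AJAX upload, decrement when it completes

// Returns the warning text while uploads are running, undefined otherwise.
// Returning undefined from an onbeforeunload handler suppresses the dialog.
function confirmLeave() {
  if (uploadsInProgress > 0) {
    return 'Uploads are still being processed. Leave anyway?';
  }
}

// Browser wiring (sketch):
// window.onbeforeunload = confirmLeave;
// $(window).on('hashchange', function () {
//   if (uploadsInProgress > 0) { /* restore the previous #hash and warn */ }
// });
```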
Thanks. Hope you guys can help me. Just so you know: the client side is pretty complex, with way too much code. I'd rather change the backend.
Related
I'm currently rewriting a website that needs a lot of different sizes for each image. In the past I created the thumbnails for all sizes at upload time. But now I have doubts about its performance, because I had to change my design and half of my images are no longer the right size. So I'm thinking of 2 solutions:
Keep doing this and add a button in the backend to regenerate all the images. The problem is that I always need to know every size used by every part of the site.
Only upload the full-size image, and when displaying it, put something like src="thumbs.php?img=my-image-path/image.jpg&width=120&height=120" in the src attribute. The script then creates the thumb and displays it. It would also check whether the thumb already exists; if it does, it doesn't need to recreate it and just displays it. Every 5 days, a cron task launches a script that deletes all the thumbs (to be sure only the useful ones are kept).
I think the second solution is better, but I'm a little concerned by the fact that I need to call PHP every time an image is shown; even if the thumb has already been created, it's PHP that serves it...
Thanks for your advice
Based on the original question and subsequent comments, it sounds like on-demand generation would be suitable for you, as it doesn't sound like you have a demanding environment in terms of absolutely minimizing download time for the end client.
It seems you already have a grasp of the option of giving your <img> tags a src value that is a PHP script, with that script either serving up a cached thumbnail if it exists, or generating it on the fly, caching it, and then serving it up. So let me give you another option.
Generally speaking, using PHP to serve up static resources is not a great idea as you begin to scale your site, because:
It requires the additional overhead of invoking PHP to serve requests that a basic web server like Apache or Nginx handles much more efficiently. Your site will be able to handle less traffic per server because it is using extra memory, CPU, etc. in order to serve up this static content.
It makes it hard to move those static resources into a single repository outside of the server (such as a CDN). You would have to duplicate your files on each and every web server powering the site.
As such, my suggestion would be to still serve the images as static files via the web server, but generate thumbnails on the fly if they are missing. To achieve this you can simply create a custom redirect rule or 404 handler on the web server, so that requests in your thumbnail directory which do not match an existing thumbnail image are redirected to a PHP script that generates the thumbnail and serves up the image (without the browser even knowing). Future requests for that thumbnail are then served as a static image.
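As a concrete illustration of that redirect rule, here is an Apache mod_rewrite sketch (the thumbs/ directory and the thumb.php script name are assumptions for the example): any request under thumbs/ that does not resolve to an existing file is handed to the generator script, which creates the thumbnail, caches it to disk, and serves it; subsequent requests hit the static file directly.

```apache
RewriteEngine On
# If the requested thumbnail does not exist as a file on disk...
RewriteCond %{REQUEST_FILENAME} !-f
# ...hand the request to the generator script instead.
RewriteRule ^thumbs/(.+)$ /thumb.php?img=$1 [L,QSA]
```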
This scales quite nicely as, if in the future you have the need to move your static images to a single server (or CDN), you can just use an origin-pull mechanism to try to get the content from your main servers, which will auto-generate them via the same mechanism I just mentioned.
Use the second option if you don't have too much storage, and the first if you don't have too much CPU.
Or you can combine them: generate and store the image on the first request to the PHP thumbnail generator, and on subsequent requests just return the cached image.
With this solution you'll have only the necessary images, and if you want you can occasionally delete the older ones.
I am working on a project that consists of an online store selling printed circuit boards.
The whole system will be automated, and users will be able to view their Gerber files online (Gerber files contain the machine code for the PCB).
I need to choose the best way to output the file uploaded by the user to the web page, just for viewing the PCB before buying.
I have written all the PHP code to process the Gerber file, but I can't decide how the result should be delivered:
Save it as a PNG file (rendering would be done with the PHP image library, which is awful). If the user zooms or does anything else, it won't be sharp; I would need to render at a high resolution, which would take a lot of space and also time to load.
Render it as an SVG (vector) file and show it in the HTML as an embedded element (does that work in all new browsers? is it slow to process?). SVG is awesome for drawing lines...
Last but not least: generate a list of JavaScript commands that draw on a Canvas element. I have already implemented this and it works really well, but I don't like the idea that I'm actually 'rendering' to code...
Anyway, what do you think I should use? And if there's another way I haven't thought of, please tell me!
Here is an example of the output as Canvas (the source being a JavaScript function object containing many line-drawing commands):
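The command-list approach described above boils down to replaying recorded line segments against a 2D context. A minimal sketch (the data format here is an assumption, not the actual Gerber-derived output):

```javascript
// Each command is one line segment in board coordinates.
var commands = [
  { x1: 0,  y1: 0,  x2: 50, y2: 0  },
  { x1: 50, y1: 0,  x2: 50, y2: 30 }
];

// Replay the segments on any object exposing the 2D-context drawing methods.
function render(ctx, cmds) {
  cmds.forEach(function (c) {
    ctx.beginPath();
    ctx.moveTo(c.x1, c.y1);
    ctx.lineTo(c.x2, c.y2);
    ctx.stroke();
  });
}

// In the page, assuming a <canvas id="pcb"> element:
// render(document.getElementById('pcb').getContext('2d'), commands);
```

Because render only depends on the four context methods it calls, the same command list could later be replayed into an SVG builder instead of a canvas.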
I'm opting for the SVG file. Even if it's slower than canvas, considering there won't be any real-time drawing in terms of user interaction with the drawing object, it's a good choice. SVGs are actual vectors, so the images won't look pixelated at large sizes. There are some quite good libraries out there working with SVGs, such as:
http://processingjs.org
http://d3js.org/
http://fabricjs.com/
http://raphaeljs.com/
I think d3 is one of the best. I definitely won't recommend the PHP image library.
So some background: what I'm doing is creating a gallery that dynamically shows thumbnails of all the pictures in a server directory (it caches the thumbnails, don't worry). When a user clicks on a thumbnail, a loading gif is displayed until the image is ready, and then the image is displayed. The actual pictures are very large and might take a considerable amount of time to download to a user's computer.
What I would like to do, is show a percentage of the picture that is downloaded while the loading gif is playing.
I realize there are other questions like this, and from what research I've done so far, I also realize this might not be able to be accomplished without some server-side tricks.
From what I have come across in the last little bit, I've gathered (and I could be wrong, so please correct me if I am) that the client-side code knows how many bytes have been received, but not how large the file is.
So is there a possible configuration using some php/javascript tricks, so that the client side javascript can load an image from a web-server directory and be able to calculate downloaded percentage?
Possibly the PHP code sending an extra header to the client with the file size? Or even a second request to the web server for the file size? And how could you get the currently downloaded byte count?
You can use XMLHttpRequest Level 2 to load the data and hook into the progress events. The loaded data is turned into base64 and put into a data URI. Once loading has finished, you can assign the constructed URI as the image's source.
More info can be found here: http://blogs.adobe.com/webplatform/2012/01/13/html5-image-progress-events/
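A sketch of that technique (the URL and element id are assumptions, and a blob URL is used here in place of the base64 data URI the linked article builds). The percentage calculation is separable from the browser wiring:

```javascript
// Convert a progress event's counters into a whole-number percentage.
function percentDone(loaded, total) {
  if (!total) return 0; // e.lengthComputable may be false, leaving total at 0
  return Math.round((loaded / total) * 100);
}

// Browser wiring (sketch):
// var xhr = new XMLHttpRequest();
// xhr.open('GET', '/photos/large.jpg', true);
// xhr.responseType = 'arraybuffer';
// xhr.onprogress = function (e) {
//   if (e.lengthComputable) showPercent(percentDone(e.loaded, e.total));
// };
// xhr.onload = function () {
//   var blob = new Blob([xhr.response], { type: 'image/jpeg' });
//   document.getElementById('photo').src = URL.createObjectURL(blob);
// };
// xhr.send();
```

Note that lengthComputable depends on the server sending a Content-Length header, which answers the "extra header" idea above: for plain static files the web server already sends it.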
I'm building a web app where users can store links, with 200x200 pictures associated. By default, I'd like to crawl the link for images and then return thumbnails of the biggest ones (from which the user can select the "official" thumbnail). I want this all to happen via AJAX. My question is: what is the best way to do this?
Currently, I'm using the PHP Simple HTTP Parser to scan a URL. I then find the src attributes of all the <img> tags, use getimagesize to get the size of the image located at each URL, sort the array from biggest to smallest, and return the top 5 biggest image URLs via AJAX to the client. The client then sends a separate AJAX request for each one, which makes a server-side ImageMagick script download the image, cut it down to a thumbnail, save it in a temporary folder, and return the URL of the thumbnail, which the client finally loads in the browser.
Needless to say, this is a little complicated and probably really inefficient. Running this process on http://en.wikipedia.org takes about 10-15 seconds from start to finish. I'm not certain there's a more efficient way, however.
I'd do it in one AJAX request, with the script automatically resizing the biggest 5 images on the first pass, saving them, and returning a JSON array with the resized image URLs for the client.
You should probably use PHP's DOMDocument class to grab/parse the html page.
getimagesize() means you have to download each image and process it. Perhaps you should consider simply showing the user ALL the images, by placing <img> tags that link back to the original HTML page. You could size these however you like using the tag attributes. That way you don't have to download or process a single image until the user has actually selected one for the thumbnail.
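Once the dimensions are known (by whichever method), the "sort by size, take the top 5" step is trivial. A sketch in JavaScript (the field names are assumptions):

```javascript
// Given [{url, width, height}, ...], return the URLs of the n largest images by area.
function biggestImages(images, n) {
  return images
    .slice() // copy so the caller's array is not mutated
    .sort(function (a, b) {
      return (b.width * b.height) - (a.width * a.height);
    })
    .slice(0, n)
    .map(function (img) { return img.url; });
}
```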
Interested if / how you solved this?
In the end I looped through the images calling getimagesize() until both height and width were over a certain size, then broke the loop.
This way it's slightly more efficient, as it only downloads as many images as it needs.
I'm trying to figure out if this is possible:
web server running PHP collects a number of images from user input
web server takes those images and runs After Effects, which uses them in place of placeholders in a template video to create a personalised video for the user
web server makes the video available for download to the user.
Cheers,
Mark.
This would be mighty complex, but I think it might be possible. Here's how I think the process might go down:
Make your After Effects project, importing some placeholder images. Save the project.
Client uploads images. Those images need to be converted to the same image file type (PNG, JPEG, TIFF, whatever) as your placeholder images, renamed to the same names as your placeholder images, and placed in the same directory as the placeholder images referenced in your After Effects project.
Run After Effects from the command line using aerender. More info on that here.
Render to a public directory and give the link to the client.
Delete the client's uploaded images to make room for the next client.
Here's where things would get tricky:
I don't think it's feasible to edit the After Effects project file, so I think the client would be limited to the exact number of images you made in your template. Any more would not appear in the rendered movie, and any fewer would give a media offline error. I do not think it is possible to have After Effects import media via a script.
Yes, it is possible, though our stack is fairly involved. We are doing it at my startup, lumin8.me. Doable but complex, yet fun :)