So, some background: what I'm doing is creating a gallery that dynamically shows thumbnails of all the pictures in a server directory (it caches the thumbnails, don't worry). When a user clicks a thumbnail, a loading GIF is displayed until the image is ready, and then the image is shown. The actual pictures are very large and might take a considerable amount of time to download to a user's computer.
What I would like to do is show the percentage of the picture that has been downloaded while the loading GIF is playing.
I realize there are other questions like this, and from the research I've done so far, I also realize this might not be achievable without some server-side tricks.
What I've gathered so far (and I could be wrong, so please correct me if I am) is that the client-side code knows how many bytes have been received, but not how large the file is.
So is there a possible configuration, using some PHP/JavaScript tricks, so that the client-side JavaScript can load an image from a web-server directory and calculate the downloaded percentage?
Perhaps the PHP code could send an extra header to the client with the file size? Or even open a second request to the web server just for the file size? And how would you get the currently downloaded byte count?
You can use XMLHttpRequest2 to load the data and hook into its progress events. The loaded data is turned into base64 and put into a data URI. Once loading has finished, you can assign the constructed URI as the image's new source.
More info can be found here: http://blogs.adobe.com/webplatform/2012/01/13/html5-image-progress-events/
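For illustration, here's a minimal sketch of that approach (the URL and element IDs are placeholders; it assumes the server sends a Content-Length header, otherwise lengthComputable will be false):

var xhr = new XMLHttpRequest();
xhr.open('GET', '/images/big.jpg', true);
xhr.responseType = 'arraybuffer';
xhr.onprogress = function (e) {
    if (e.lengthComputable) {
        // Update the percentage shown next to the loading gif.
        var pct = Math.round(e.loaded / e.total * 100);
        document.getElementById('progress').textContent = pct + '%';
    }
};
xhr.onload = function () {
    // Turn the raw bytes into base64 and swap the data URI into the <img>.
    var bytes = new Uint8Array(xhr.response);
    var binary = '';
    for (var i = 0; i < bytes.length; i++) {
        binary += String.fromCharCode(bytes[i]);
    }
    document.getElementById('photo').src = 'data:image/jpeg;base64,' + btoa(binary);
};
xhr.send();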
I am working with a large number of pages (letters) that are the same except for the address and a few other minor details. I believe what slows the PDF creation down the most is the logo image I'm including on every page (even though it is fairly small).
I'm hoping to speed up the process some more by caching the logo, i.e. by loading the file once, storing it in a variable, and having TCPDF use that instead of loading the image every time. TCPDF can load a "PHP image data stream", and the example given is this:
$imgdata = base64_decode('iVBORw0KGgoAAAANSUhEUgAAABwAAAASCAMAAAB/2U7WAAAABlBMVEUAAAD///+l2Z/dAAAASUlEQVR4XqWQUQoAIAxC2/0vXZDrEX4IJTRkb7lobNUStXsB0jIXIAMSsQnWlsV+wULF4Avk9fLq2r8a5HSE35Q3eO2XP1A1wQkZSgETvDtKdQAAAABJRU5ErkJggg==');
$pdf->Image('#'.$imgdata);
However, I have no idea how to create an image stream like this from a file.
My logo is a small (4 kB) PNG file. If I use readfile($file) and send that to $pdf->Image with the '#' in front, it errors out with something about the cache folder, which is already set to chmod 777 (it's a test server; I'll work on proper permissions on the live server). I believe I also tried base64_encode, which also didn't work.
Any thoughts on how to do this?
PS: I already noticed that the more pages I include in the PDF, the slower it gets, so I'll find a good middle ground (probably 200-250 pages per file instead of the current 500).
Thanks!
I posted the same question in the TCPDF forum on SourceForge (sourceforge forum post), and the author of TCPDF answered.
He said that images are cached internally; however, if the images need processing, he suggests using the XObject() template system (see example 62 on the TCPDF site).
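For reference, a minimal sketch of that template approach, based on the method names in example 62 (the logo path, sizes, and the $letters variable are placeholders):

require_once('tcpdf.php');                 // adjust to your TCPDF path
$pdf = new TCPDF();
$pdf->AddPage();                           // a page must exist before the template
// Draw the logo once into a reusable XObject template.
$tpl = $pdf->startTemplate(30, 15);        // width/height in user units
$pdf->Image('logo.png', 0, 0, 30, 15);     // placeholder path and size
$pdf->endTemplate();
// Stamp the template on every page instead of re-processing the image.
foreach ($letters as $letter) {
    $pdf->AddPage();
    $pdf->printTemplate($tpl, 10, 10, 30, 15);
    // ... render the rest of the letter ...
}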
It took me a while to get it working (still not sure why it didn't work for me at first), but once I had it looking exactly like my original version using Image(), I ran a few tests with about 3,000 entries divided into PDF files of 500 pages each.
There was no speed gain at all between XObject() and Image(), and XObject() actually appeared to make the resulting files just a tiny bit larger (2.5kB in a 1.2MB file).
While this doesn't directly answer my original question (how to create a PHP data stream that can be used directly in TCPDF via Image('#'.$image)), it tells me what I really needed to know: the image is already cached, and caching via XObject() does not provide any advantage in my situation.
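For completeness, in case someone else lands here: the data stream appears to be nothing more than the raw file contents, so something like the following should be the file-based equivalent. Note that the TCPDF documentation I've seen uses '@' as the prefix for raw image data, so the '#' above may be version-specific; check your version.

// '@' (or '#' in some versions) followed by the raw bytes tells
// TCPDF to treat the string as image data instead of a filename.
$imgdata = file_get_contents('logo.png'); // placeholder path
$pdf->Image('@'.$imgdata, 10, 10, 30, 15);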
I've got an application I'm building with PHP which pulls photos from a database, however many images there are for a given plant species. I have the script resize the otherwise large photos to 100x100. This process REALLY takes a bite out of page load time, and my computer's CPU gets up to 100% and works quite hard.
I think it's because all the images are loading at once... Is there a way to have each one load only when the previous one has finished? Or is there a more efficient way of rendering images like this? Here is the snippet that loads 'em:
$imagesArray = explode(", ", $images);
unset($imagesArray[count($imagesArray)-1]); // get rid of the last array key, which is blank
echo '<tr><td>Images:</td><td>';
foreach ($imagesArray as $imgloc) {
    // Note: width/height here only scale the image in the browser;
    // the full-size file is still downloaded for every thumbnail.
    echo '<a target="_blank" href="plant_images/'.$imgloc.'"><img src="plant_images/'.$imgloc.'" width="100" height="100" alt="'.$row[2].'" title="'.$row[2].'" /></a> ';
}
Here is a screenshot of the page with a partially loaded image (and this is a lot better than what happens other times! Seriously, some species have 10-12 images, and my computer takes about 15 seconds to load the page, painfully):
http://www.captainscall.site11.com/temp_stuff/slow-img.png
I found this already, mildly helpful.
Thank you kindly,
Khanahk
The user's browser will usually cache the images, so the slow loading will only be experienced the first time the page is visited.
However, you should consider having thumbnails of all the images being displayed. You don't need to make these thumbnails yourself; in 2013 we have computers to do that.
A quick Google search will probably turn up software that can make resized copies of all your pictures in no time, or if you know some coding you can write your own script to do this. You can, for instance, use Imagick in PHP (see http://php.net/manual/en/imagick.scaleimage.php). Then just store two sizes of each image on your server: one thumbnail and one set at higher resolution.
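For example, a minimal Imagick sketch (paths are placeholders):

// Create a 100x100 (best-fit) thumbnail once and save it separately.
$img = new Imagick('plant_images/photo.jpg');
$img->scaleImage(100, 100, true);        // true = keep aspect ratio
$img->writeImage('plant_images/thumbs/photo.jpg');
$img->destroy();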
One advantage of doing this is that you will decrease the outgoing data from your server (if your server has a lot of unique visitors, there will be a lot of traffic due to the size of the images). Your users will also experience shorter loading times for your site (as you said, waiting for images to load is boring).
You could probably tell the browser with javascript in what order to load the images, but that wouldn't solve your problem, which mainly is that your images are too big.
I think you should:
Server side: cache each resized image the first time it is generated, and serve the cached version on subsequent requests. You can write the resized image to a physical file for later use (a sketch follows this list).
Client side: you can use a library to lazy-load your images, for instance:
http://www.appelsiini.net/projects/lazyload
You could also implement a JavaScript function that loads each image after a specific delay.
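A minimal sketch of the server-side caching idea, assuming GD and JPEG input (the cache folder and function name are placeholders):

function thumbnail($file) {
    $src  = 'plant_images/' . $file;
    $dest = 'plant_images/cache/' . $file; // hypothetical cache folder
    if (!file_exists($dest)) {
        // Generate the 100x100 version once; later requests hit the cache.
        list($w, $h) = getimagesize($src);
        $full  = imagecreatefromjpeg($src);
        $thumb = imagecreatetruecolor(100, 100);
        imagecopyresampled($thumb, $full, 0, 0, 0, 0, 100, 100, $w, $h);
        imagejpeg($thumb, $dest);
        imagedestroy($thumb);
        imagedestroy($full);
    }
    return $dest;
}

The <img> tags in the loop above would then point at thumbnail($imgloc) instead of the full-size file.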
I've built a CMS with a photo album. Pretty simple, mostly static: static HTML pages, no database, just (as few as possible) text files containing some JSON.
The web interface for the admin panel is all in jQuery with a PHP (Zend Framework) based backend. As much as possible is done within the browser, so the backend is pretty bare.
Now the photo album works currently like this:
Clicking link 'Media'
Fetching a JSON string from the backend containing an object with all albums and, for every album, all the photos
Rendering an unordered list with all the albums
Rendering an unordered list within each album list item with all the pictures
Uploading:
Drop one or more jpeg/png files into the browser to create a new album
Drop one or more jpeg/png files into an album to append those files to the album under the cursor
Send all dropped files (using this jQuery drag drop upload plugin) to the backend
Backend receives all files (while displaying a nice progress bar)
Backend loops through all uploaded files, while the webinterface displays a nice spinner
Each file is resized to a specified maximum size, and a thumbnail of at most 133x133 px is rendered
Each file is appended to an array with the serverside filename and thumbnail name
[Not yet implemented: rendering the (updated) static html page for the album overview and for each image]
Array with all newly uploaded files is converted to JSON and sent to client
Webinterface appends all new files as list items (displaying the thumbnail)
Uploading done
This is all going pretty well, up to roughly 600 images or 900 MB. That's fine by me; if the user wants to upload more files at once, well, do it in two stages. The problem is, the backend processing is a bitch. Converting 100+ images of a decent size (1.5 MB each) to the maximum size and generating the thumbnails takes way too long. I'm doing this with PHP GD. It didn't take me much time (no time at all, really) to find out that that's the problem. My guess is that there is no way I'm going to speed this up within PHP.
So here are a few questions:
Will ImageMagick be faster? I'm not a fan, so please say no; also, I don't want to install it on my server that badly..
Is there a really, really lightweight command-line program that does the same with just a few commands (and no, I'm not alluding to ImageMagick)?
If the answer to the previous question is no: what would be the best way to do this? Don't say Java; I'm not that big a fan of Java either. Some C(-dialect)? Preferably one with a powerful yet lightweight image library for the nearest-neighbor, bilinear, and bicubic interpolation algorithms.
Could my architecture be changed? At the moment, the images only start appearing in the browser once the thumbnails are inserted, i.e. after the whole JSON array is received; the entire action has to complete and all the image data has to be generated before the browser gets any feedback at all. This means that the spinner (without any indication of how long the process is going to take or how many images have been completed) is displayed for a long, long time. Would it be an idea to use JavaScript's FileReader to preload the images from the user's system, generate the thumbnails in the browser, and display them immediately when uploading is done? And on the backend: just receive the files, write them to disk, execute a command-line command, immediately send the response to the browser, and do the conversion in the background?
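A rough sketch of that client-side idea (element IDs are placeholders):

// For each dropped file, render a local thumbnail while the upload runs.
function previewFile(file) {
    var reader = new FileReader();
    reader.onload = function (e) {
        var img = new Image();
        img.onload = function () {
            // Scale to at most 133px and draw onto a canvas immediately.
            var scale  = Math.min(133 / img.width, 133 / img.height, 1);
            var canvas = document.createElement('canvas');
            canvas.width  = Math.round(img.width * scale);
            canvas.height = Math.round(img.height * scale);
            canvas.getContext('2d').drawImage(img, 0, 0, canvas.width, canvas.height);
            document.getElementById('album').appendChild(canvas);
        };
        img.src = e.target.result; // data URI produced by the FileReader
    };
    reader.readAsDataURL(file);
}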
How do I prevent the client side from aborting the AJAX request? While uploading and converting, a warning should be displayed when the user tries to close the page or change the #hash.
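For the close-the-page part, onbeforeunload is the usual trick; a sketch (the uploading flag stands in for your own state, and the #hash case would have to check the same flag in your hash-routing code):

var uploading = false; // set to true while the upload/convert runs

window.onbeforeunload = function () {
    if (uploading) {
        // Browsers show (a variant of) this text in a confirmation dialog.
        return 'An upload is still being processed. Leave anyway?';
    }
};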
Thanks. Hope you guys can help me. Just so you know: the client side is pretty complex, with way too much code. I'd rather change the backend.
I have a very large image generated on the fly with PHP and output to the browser (it's 5000 px wide and 1000-2000 px tall; it's a plot of the daily user activity on my site).
The problem is that nowadays the plot is too big and the PHP script gives memory-exhausted errors (though the generated PNG itself is quite small), so I can't get the image at all.
Is there a way to output this large image in multiple parts somehow, using GD, in PNG format?
(ps: the host where I run the site uses safe mode, so I can't modify the configuration and I think they're using the default PHP installation.)
EDIT1: It's an admin script. No users see it except me.
EDIT2: An example image can be seen here: http://users.atw.hu/calmarius/trash/wtfb2/x.png
(I also have the option to group the tracks by IP address.)
Every user+IP pair has its own 24-hour track on the plot, and every green mark denotes a user activity. As you can see, the image could be output track by track; there is no need to generate and output the whole thing at once.
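A hypothetical sketch of that track-by-track idea with GD, so only one small strip is in memory at a time (the strip height, filenames, and $tracks variable are placeholders):

foreach ($tracks as $i => $track) {          // one entry per user+IP pair
    $strip = imagecreatetruecolor(5000, 20); // one narrow strip per track
    // ... draw this track's green activity marks onto $strip ...
    imagepng($strip, "plot_strip_$i.png");
    imagedestroy($strip);                    // free memory before the next one
}
// The page then stacks the strips with one <img> per file.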
This website will be an online strategy game, and I want to use this graph in the future to make detecting multi-accounts easier (users who try to gain an advantage over those who only have one account by registering multiple accounts). But this is a different problem.
I'm using a PHP script because I'm too lazy to export the request log from the database, download it, and feed the data to a program that would make the plot for me. ;)
Set the memory limit to unlimited before processing the image.
ini_set('memory_limit', '-1');
It'd help to say how you're generating the image (GD library, ImageMagick) and how you're outputting it. Are you saving the file to a directory and then using readfile() to output it? If so, a fopen/fread/echo combination is about 50-60% faster than readfile() for sending files to the browser. Are you using gzip compression? What's the time limit on PHP execution? What's the exact error message you're getting?
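For reference, the fopen/fread/echo pattern mentioned above looks roughly like this (the path is a placeholder):

// Stream the PNG in 8 KB chunks instead of buffering the whole file.
header('Content-Type: image/png');
header('Content-Length: ' . filesize('x.png'));
$fp = fopen('x.png', 'rb');
while (!feof($fp)) {
    echo fread($fp, 8192);
    flush(); // push each chunk out to the client
}
fclose($fp);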
I'm trying to figure out if this is possible:
web server running PHP collects a number of images from user input
web server takes those images and runs After Effects, which uses the images instead of placeholders in a template video to create a personalised video for the user
web server makes the video available for download to the user.
Cheers,
Mark.
This would be mighty complex, but I think it might be possible. Here's how I think the process might go down:
Make your After Effects project, importing some placeholder images. Save the project.
Client uploads images. Those images need to be converted to the same image file type (PNG, JPEG, TIFF, whatever) as your placeholder images, renamed to the same names as your placeholder images, and placed in the same directory as the placeholder images referenced in your After Effects project.
Run After Effects from the command line using aerender (see the sketch after this list). More info on that here.
Render to a public directory and give the link to the client.
Delete the client's uploaded images to make room for the next client.
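Step 3 might look roughly like this from PHP; -project, -comp, and -output are documented aerender options, but the paths and composition name here are placeholders:

// Render the saved project from PHP once the uploaded images are in place.
$cmd = '"/Applications/Adobe After Effects CC/aerender"'
     . ' -project /path/to/template.aep'
     . ' -comp "MainComp"'
     . ' -output /var/www/public/renders/video.mov';
exec($cmd . ' 2>&1', $out, $status); // $status of 0 means the render finished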
Here's where things would get tricky:
I don't think it's feasible to edit the After Effects project file itself, so I think the client would be limited to the exact number of images you used in your template. Any more would not appear in the rendered movie, and any fewer would give a 'media offline' error. I do not think it is possible to have After Effects import media via a script.
Yes, it is possible; our stack is fairly involved. We are doing it at my startup, lumin8.me. Doable but complex, yet fun :)