Problem:
I could not find proper/working information when googling about caching dynamic images, as per my requirement.
I am using Zend Framework, and the URL of the images looks like http://localhost/media/image/id/1120. The images are stored in the database.
Is there any way I can cache my images?
One of the resources I looked at:
http://dtbaker.com.au/random-bits/how-to-cache-images-generated-by-php.html
This is the closest I could get; the images are cached on that website, but I can't make it work in a situation like mine.
Any help would be great.
If you are using Zend Framework, then your first port of call for caching anything should be Zend_Cache.
Having said that, caching images is probably the easiest caching problem to solve as it is just a matter of saving a local copy of any image to your server after it is generated and checking for a local copy before running the code that generates it.
Something like:
if (file_exists('image12345.png')) {
    readfile('image12345.png');                   // serve cached file
} else {
    $image = generateImage();                     // generate new file (your existing code)
    file_put_contents('image12345.png', $image);  // save new file to image12345.png
    readfile('image12345.png');                   // serve the freshly cached file
}
Looking at your user name, you should be able to work out real, working code for yourself.
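A fuller sketch of that check-generate-save flow as a reusable helper; the `$generator` callback stands in for whatever code currently builds the image from the database, and the cache directory name is an assumption:

```php
<?php
// Serve an image from a local file cache, generating and saving it on a miss.
// $generator is whatever code currently builds the image bytes (e.g. a DB fetch).
function serveCachedImage(string $id, string $cacheDir, callable $generator): string
{
    $path = $cacheDir . '/' . basename($id) . '.png'; // basename() blocks path traversal

    if (!file_exists($path)) {
        // Cache miss: generate the image bytes and save a local copy.
        $bytes = $generator($id);
        if (!is_dir($cacheDir)) {
            mkdir($cacheDir, 0775, true);
        }
        file_put_contents($path, $bytes);
    }

    return $path; // caller can readfile($path) after sending an image/png header
}
```

On a cache hit the generator never runs, which is the whole point; add an expiry check (e.g. comparing `filemtime()` against the row's last-modified time) if the source images can change.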
Related
I have a php web application which has a gallery.
This gallery uses a justified JavaScript layout.
Then it uses timthumb.php to resize the images without saving the resized copies on the server.
I would like to know which one would be better:
1. Loading all the images through timthumb.php, or
2. Saving resized images in a server cache folder and loading all the images from the cache folder.
I have tried both methods. Strangely, the second method is slower than the first on the first load.
Thank you for all the help.
Lynn
Timthumb tends to have security issues, and either way image processing requires a great deal of RAM, so having cache folders is the best option. Notice that I said folders, not a cache folder. On IIS or any Windows-based server you will run into slowness accessing folders that contain more than a few thousand files. Linux has the same problem, but not until you have a few hundred thousand files in a folder. Either way, if you're dealing with millions of images it is best to categorize them into separate folders so you don't end up with slowdowns from the OS trying to locate the file.
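One common way to split a large cache into separate folders is to shard on the first characters of a hash of the filename; a sketch of the idea (the two-level, 256-way layout is just one reasonable choice, not the only one):

```php
<?php
// Map a cached image name to a sharded path like cache/ab/cd/<name>,
// keeping any single directory from accumulating too many files.
function shardedCachePath(string $baseDir, string $filename): string
{
    $hash = md5($filename);
    return sprintf('%s/%s/%s/%s',
        $baseDir,
        substr($hash, 0, 2),   // first shard level: 256 buckets
        substr($hash, 2, 2),   // second shard level: 256 buckets each
        basename($filename));
}
```

With 65,536 leaf folders, even a few million images averages out to only a handful of files per directory, well under either OS's slowdown threshold.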
Honestly, I do not have much idea about timthumb.php.
Saving the photos in a server cache folder seems to be the better idea, though. You can save the path of each image in your datasource (normally a relational database), and then while retrieving the photos, read them from the cache folder.
It is possible that your cache is getting reloaded again and again, and that's why it takes some time on the first load.
Hi there. I am developing an app, and it requires the app to get the titles of MP3 files (MP3 files only). The options are getting them from PHP or from Android itself. I have tried the MediaMetadataRetriever class, but it just force closes; I can assume it only works for local files. As for PHP, I would need to install something (I can't remember what), but I can't do that since I switch servers very often. I would like something simple :)
Oh, and I tried an ffmpeg metadata retriever library as well; it works, but it turns my app into a massive size, so I would prefer something lightweight as well.
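For what it's worth, reading a title server-side in plain PHP needs no extensions or installs if the file carries an ID3v1 tag. A sketch, with the caveat that it only covers ID3v1 (the fixed 128-byte block at the end of the file), not the more common ID3v2 frames at the start:

```php
<?php
// Read the 30-byte title field from an ID3v1 tag (last 128 bytes of an MP3).
// Returns null if the file has no ID3v1 tag.
function id3v1Title(string $file): ?string
{
    $size = filesize($file);
    if ($size === false || $size < 128) {
        return null;
    }
    $fh = fopen($file, 'rb');
    fseek($fh, -128, SEEK_END);
    $tag = fread($fh, 128);
    fclose($fh);

    if (substr($tag, 0, 3) !== 'TAG') {
        return null; // no ID3v1 tag present
    }
    // The title occupies bytes 3..32, padded with NULs or spaces.
    return rtrim(substr($tag, 3, 30), "\0 ");
}
```

The app could then hit a tiny PHP endpoint returning this string, which moves the weight off the APK entirely; files tagged only with ID3v2 would need a fuller parser.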
I am currently working on a website project that requires the creation of an image based on user input on a form. Basically, I'm trying to create an image from existing HTML markup, with the form data replacing some of the text. The text is almost always unique.
I have explored several options for creating such an image, mainly:
imagecreate with PHP, which wasn't flexible enough;
PhantomJS, which I can't really install on the server; and
wkhtmltopdf and php-wkhtmltox.
I am working in a shared-hosting environment that limits my available options. The environment supports PHP (compiled with GD), Perl, and Python.
Is there a reliable way to implement such a behavior?
I use the CutyCapt tool to render images from HTML, and it works perfectly. The only issue is that you need to build it on your shared hosting. A possible workaround could be building it on your local machine, uploading it to your hosting and running it from there, but I don't believe that will work, because it requires an X server.
Another possible solution is to use the GD library to build the image manually. If you know what your HTML looks like, you can replicate it by drawing all the elements on a blank image. It is not easy to do, but it looks like the only solution you can use here.
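A minimal GD sketch of that approach, drawing the user-supplied text onto a blank canvas; the sizes, colours, and the built-in bitmap font are placeholders, and a real layout would use imagettftext() with a bundled TTF file instead:

```php
<?php
// Render a line of user text onto a blank image and save it as a PNG.
// Requires PHP compiled with GD (as the hosting environment here is).
function renderTextImage(string $text, string $outFile): bool
{
    if (!extension_loaded('gd')) {
        return false; // GD not available
    }
    $img = imagecreatetruecolor(400, 100);
    $bg  = imagecolorallocate($img, 255, 255, 255); // white background
    $fg  = imagecolorallocate($img, 0, 0, 0);       // black text
    imagefilledrectangle($img, 0, 0, 399, 99, $bg);
    imagestring($img, 5, 10, 40, $text, $fg);       // font 5 = largest built-in
    $ok = imagepng($img, $outFile);
    imagedestroy($img);
    return $ok;
}
```

Replicating a full HTML layout this way means hand-positioning every box and line, which is tedious but stays entirely within what shared hosting allows.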
I am developing an application in the Kohana PHP framework that assesses performance. The end result of the process is a webpage listing the overall scoring and a color coded list of divs and results.
The original idea was to offer the option to save this as a non-editable PDF file and email it to the user. After further research I have found this to be not as straightforward as I had hoped.
The best solution seemed to be installing the Unix application wkhtmltopdf, but as the destination is shared hosting I am unable to install it on the server.
My question is: what's the best option for saving a non-editable review of the assessment for the user?
Thank you for help with this.
I guess the only way to generate a snapshot, or "review" as you call it, is by storing it on the server side and only granting access via a read-only protocol. So basically, by offering it as a 'web page'.
Still, everyone can save and modify the markup. But that is the case for every file you generate, regardless of the type of file. OK, maybe except DRM-infected files. But you don't want to do that, trust me.
Oh, and you could also print the files. Printouts are pretty hard to edit. Though even that is not impossible...
I found a PHP version that is pre-built as a Kohana module: github.com/ryross/pdfview
I'm building a webapp that, as a small subset of one of its features, allows images to be uploaded. I'm running a LAMP stack, with Mongo instead of MySQL.
I use a JavaScript uploader with a PHP backend to upload the files. The whole framework is under version control, though, so I don't want to dump these files anywhere inside my framework: it would get messy with the version control, and it would be sub-optimal when I eventually migrate the media over to a CDN.
So, my question is: on a VPS, where should I drop these images for now? In some folder external to my framework? In my DB as BSON? I've heard Mongo does a decent job of handling binary data...
And, as a follow up, if I'm eventually planning on moving the content over to a CDN, how would you recommend structuring my schema for now?
My current plan would be something like the following:
- All uploads are named with a unique ID and dropped in an external folder, defined by a globally accessible variable of sorts.
- A reference to each image's name is stored in the db.
Is there anything obviously wrong with going about it that way that might cause me a scaling headache later?
Here's a summarized specific question, just so this is a little more of an SO friendly question:
Given a Linux, Apache, Mongo, PHP Framework on a VPS, what is the best way to store uploaded images while keeping scalability and speed as the 2 most important factors in deciding on the solution?
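The plan above can be sketched roughly like this; the folder path, the `uniqid()`-based naming, and the document fields are all assumptions on my part, not a settled schema:

```php
<?php
// Store an uploaded image under a unique name in a folder outside the
// framework tree, and return the metadata document to insert into Mongo.
function storeUpload(string $tmpPath, string $originalName, string $uploadDir): array
{
    $ext  = strtolower(pathinfo($originalName, PATHINFO_EXTENSION));
    $name = uniqid('img_', true) . ($ext !== '' ? '.' . $ext : '');

    if (!is_dir($uploadDir)) {
        mkdir($uploadDir, 0775, true);
    }
    rename($tmpPath, $uploadDir . '/' . $name); // move_uploaded_file() in real request handling

    // Only the bare name goes in the DB; the base folder (and later the
    // CDN base URL) lives in one globally accessible config value.
    return ['file' => $name, 'original' => $originalName, 'uploaded' => time()];
}
```

Because the DB stores only the bare name, switching the base path to a CDN URL later is a one-line config change rather than a data migration.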
If your plan is to move to a CDN, the answer couldn't be easier: create a subdomain on your VPS and drop your images there, and you will have a decent CDN simulation as well as reliable file storage.
I do agree that you should never put user-uploaded content inside your webapp, for any number of reasons, but then one of the problems is how to fill the img tag in HTML, which takes a src attribute relative to the webapp.
An easy workaround: create a folder, say /usr/ImagesUPloadByUser, on your Unix box and drop all the images there, then create a symbolic link (a Linux symlink) inside your webapp that points to that directory. Now the images are not residing in your webapp and you also have easy access.
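A sketch of that symlink setup from PHP (both paths are illustrative; the same thing can be done once by hand with `ln -s` on the command line):

```php
<?php
// Link a public path inside the webapp to an upload folder outside it,
// so <img src="/uploads/..."> resolves without the files living in the app.
function linkUploadDir(string $externalDir, string $webappLink): bool
{
    if (!is_dir($externalDir)) {
        mkdir($externalDir, 0775, true);
    }
    if (is_link($webappLink)) {
        return readlink($webappLink) === $externalDir; // already linked correctly
    }
    return symlink($externalDir, $webappLink);
}
```

Note that Apache only serves files through a symlink when `Options +FollowSymLinks` is enabled for that directory.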