I have a PHP web application which has a gallery.
The gallery uses a justified-layout JavaScript plugin.
It then uses timthumb.php to resize the images on the fly, without saving the resized images on the server.
I would like to know which approach would be better:
loading all the images through timthumb.php,
or saving resized images in a cache folder on the server and loading all
the images from that cache folder.
I have tried both methods. Strangely, the second method is slower than the first on the first load.
Thank you for all the help.
Lynn
TimThumb has a history of security issues, and either way on-the-fly image processing requires a great deal of RAM, so having cache folders is the best option. Notice that I said folders and not a cache folder. On IIS or any Windows-based server you will run into slowness when accessing folders that contain more than a few thousand files. Linux has the same problem, but not until you have a few hundred thousand files in a folder. Either way, if you're dealing with millions of images it is best to categorize them into separate folders in some way, so you don't end up with slowdowns caused by the OS trying to locate the files.
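A minimal sketch of that kind of bucketing, assuming a local cache root and md5-based bucket names (the paths and the function name are only illustrative):

function bucketedCachePath($sourcePath, $cacheRoot = '/var/www/cache')
{
    // Hash the source path and use the first two hex characters as the bucket,
    // giving at most 256 sub-folders instead of one huge directory.
    $hash   = md5($sourcePath);
    $bucket = $cacheRoot . '/' . substr($hash, 0, 2);

    if (!is_dir($bucket)) {
        mkdir($bucket, 0755, true);   // create the bucket on first use
    }
    return $bucket . '/' . $hash . '.jpg';
}

Two hex characters give 256 buckets; with millions of files you could key on the first three or four characters instead.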
Honestly, I don't know much about timthumb.php.
Saving the photos in a cache folder on the server seems to be the better idea. You can save the path of each image in your datasource (normally a relational database), and then, when retrieving the photos, serve them from the cache folder.
It might be that your cache is being rebuilt again and again, and that is why the first load takes some time.
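A minimal sketch of that resize-once-and-cache approach using GD (the paths, the 300px width and the database step are only placeholders):

$source    = '/var/www/uploads/photo.jpg';    // original image
$cacheFile = '/var/www/cache/photo_300.jpg';  // resized copy

if (!file_exists($cacheFile)) {
    // Resize once with GD and write the result to the cache folder.
    $img   = imagecreatefromjpeg($source);
    $thumb = imagescale($img, 300);           // 300px wide, height kept in proportion
    imagejpeg($thumb, $cacheFile, 85);
    imagedestroy($img);
    imagedestroy($thumb);
    // Store $cacheFile (or just its filename) in your database here.
}

header('Content-Type: image/jpeg');
readfile($cacheFile);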
Related
I've had a weird issue lately and I am not sure what causes it. I've got a dedicated box with a few websites on it, and it seems like memcache only works on half of them.
Two WordPress websites both run W3 Total Cache with the same settings, yet one of them is stuck while the other works fine. If I manually clear the cache from W3 it will work until the next post.
At first I thought it was the plugin's fault, but then I noticed similar issues on some of my other sites. For example, when I update a PHP file the old version keeps being served unless the new file's size is different.
Another occasion was __DIR__, which would point to the old folder name (the one used when it was uploaded) until I edited all the files in that folder that used __DIR__.
Any ideas/suggestions?
PS: All these things started happening after I installed php curl on my server.
I am running a CodeIgniter application whose page cache holds ~350k files and takes up ~26 GB of disk space in ONE folder. Is that too much?
If memory serves, CI puts every page-cache file in a single folder. Assuming you are on a FAT32 Linux machine and the files aren't grouped into sub-folders, I imagine you may be taking a performance hit.
Referencing this: How many files can I put in a directory?
A reasonable solution could be to override the default caching library so that it groups cache files into sub-folders (perhaps based on the first N characters of the hashed name) to improve performance.
I've got nearly a million images for my site and they are stored in one folder on my Windows server.
Since opening this folder directly on the desktop drives me and my CPU crazy, I am wondering whether fetching one of them with my PHP script for an HTTP request is also laborious. So, will separating them into different folders improve performance?
No, the performance does not depend on the number of files that are in a directory. The reason opening the folder in Windows Explorer is slow is that it has to render icons and various other GUI-related things for each file.
When the web server fetches a file, it doesn't need to do that. It (more or less) goes directly to the location of the file on disk.
EDIT: Millions is kind of pushing the limits of your file system (I assume NTFS in your case). It appears that anything over 10,000 files in a directory starts to degrade performance. So not only from a performance standpoint, but from an organizational standpoint as well, you may want to consider separating them into subdirectories.
Often the best answer in a case like this is to benchmark it. It shouldn't be too hard to create a program that opens 1000 hard-coded file names and closes them. Run the test on your million-plus directory and another directory containing only those 1000 files being tested and see if there's a difference.
Not only does the underlying file system make a difference, but accessing over a network can affect the results too.
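A minimal sketch of such a benchmark in PHP (the directory path and file pattern are placeholders; run it once against the huge directory and once against a small one and compare):

$files = glob('/path/to/huge-directory/*.jpg');        // or a hard-coded list of names
$files = array_slice($files, 0, 1000);

$start = microtime(true);
foreach ($files as $file) {
    $fh = fopen($file, 'rb');                          // open ...
    fread($fh, 1024);                                  // ... read a little ...
    fclose($fh);                                       // ... and close, as suggested above
}
printf("Opened %d files in %.3f seconds\n", count($files), microtime(true) - $start);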
Separating your files into separate directories will most likely help performance, but as Mark suggests, it's probably worth benchmarking.
Problem:
I could not find proper/working information by googling about caching dynamic images for my requirement.
I am using Zend Framework, and the URL of the images looks like http://localhost/media/image/id/1120. The images are stored in the database.
Is there any way I can cache my images?
One of the resources I looked at:
http://dtbaker.com.au/random-bits/how-to-cache-images-generated-by-php.html
This is the closest I could get; the images are cached on that website, but I can't make it work in a situation like mine.
Any help would be great.
If you are using Zend Framework, then your first port of call for caching anything should be Zend_Cache.
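A minimal sketch of that, assuming Zend Framework 1's Zend_Cache with the File backend (the cache directory, lifetime, id scheme and the database lookup are placeholders):

$cache = Zend_Cache::factory('Core', 'File',
    array('lifetime' => 86400, 'automatic_serialization' => false),
    array('cache_dir' => '/path/to/cache/')
);

$cacheId = 'image_' . (int) $imageId;            // e.g. 1120 from /media/image/id/1120
if (($data = $cache->load($cacheId)) === false) {
    $data = fetchImageFromDatabase($imageId);    // hypothetical: your existing DB lookup
    $cache->save($data, $cacheId);
}
header('Content-Type: image/jpeg');
echo $data;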
Having said that, caching images is probably the easiest caching problem to solve as it is just a matter of saving a local copy of any image to your server after it is generated and checking for a local copy before running the code that generates it.
Something like:-
$cacheFile = 'cache/image12345.png';        // wherever you keep the generated copies
if (file_exists($cacheFile)) {
    // serve the cached file
    header('Content-Type: image/png');
    readfile($cacheFile);
} else {
    // generate the new file, save it, then serve the cached copy
    $data = generateImage();                // your existing image-generation code
    file_put_contents($cacheFile, $data);
    header('Content-Type: image/png');
    echo $data;
}
Looking at your user name you should be able to work out real, working code for yoursel(f)(ves).
I'm building a webapp that, as a small subset of one of its features, allows images to be uploaded. I'm running a LAMP stack, with Mongo instead of MySQL.
I use a JavaScript uploader with a PHP backend to upload the files. The whole framework is under version control, though, so I don't want to dump these files anywhere inside the framework: it would get messy with version control and be sub-optimal when I eventually migrate the media over to a CDN.
So, my question is: on a VPS, where should I drop these images for now? In some folder external to my framework? In my DB as BSON? I've heard Mongo does a decent job of handling binary data...
And, as a follow up, if I'm eventually planning on moving the content over to a CDN, how would you recommend structuring my schema for now?
My current plan would be something like the following:
All uploads are named with a unique ID and dropped in an external folder, defined by a globally accessible variable of sorts.
A reference to each image's name is stored in the db.
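A minimal sketch of that plan, assuming the legacy MongoClient driver and a hypothetical external folder (every name and path here is just a placeholder):

$uploadDir = '/var/uploads';                              // folder external to the framework
$id        = uniqid('', true);
$ext       = pathinfo($_FILES['image']['name'], PATHINFO_EXTENSION);
$filename  = $id . '.' . $ext;

if (move_uploaded_file($_FILES['image']['tmp_name'], $uploadDir . '/' . $filename)) {
    $mongo = new MongoClient();
    $mongo->myapp->images->insert(array(
        'filename'    => $filename,                       // the reference stored in the db
        'uploaded_at' => new MongoDate(),
    ));
}

If the media later moves to a CDN, only the base path or URL prepended to the filename has to change; the stored references stay the same.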
Is there anything obviously stupid about going about it that way, possibly causing me a scaling headache later?
Here's a summarized, specific question, just to make this a little more SO-friendly:
Given a Linux, Apache, Mongo, PHP Framework on a VPS, what is the best way to store uploaded images while keeping scalability and speed as the 2 most important factors in deciding on the solution?
If your plan is to move to a CDN, the answer couldn't be easier: create a subdomain on your VPS and drop your images there. You will have a decent CDN simulation as well as reliable file storage.
I do agree that you should never put user-uploaded content inside your webapp, for any number of reasons, but then one of the problems is how to point an img tag in HTML at the files, since its src attribute is resolved relative to the webapp.
An easy workaround: create a folder such as /usr/ImagesUploadByUser on your Unix box and drop all the images there, then create a link (a Linux symlink) inside your webapp which points to that directory. Now the images are not residing in your webapp, and you still have easy access to them.
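A minimal sketch of that, assuming the webapp's public directory lives at /var/www/myapp/public (all paths are placeholders; the same thing can be done once by hand with ln -s):

// shell equivalent: ln -s /usr/ImagesUploadByUser /var/www/myapp/public/uploads
$target = '/usr/ImagesUploadByUser';          // where the images actually live
$link   = '/var/www/myapp/public/uploads';    // path visible to the webapp

if (!is_link($link)) {
    symlink($target, $link);                  // create the symlink once, e.g. at deploy time
}
// Images can then be referenced in HTML as /uploads/<filename>

Note that Apache needs Options FollowSymLinks (or SymLinksIfOwnerMatch) enabled for that directory for the symlink to be followed.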