Dynamic picture loading with PHP and proxy cache? - php

I am the developer of Wave Framework, a lightweight framework that includes a number of features that make it easier to deploy APIs and serve resources dynamically.
One of those features is on-demand image editing. For example, on my server I have this file:
http://www.waher.net/w/resources/images/logo.png
But in my HTML, I load my image from a URL like this:
http://www.waher.net/w/resources/images/160x160&logo.png
This '160x160&logo.png' file does not actually exist; the only file that exists is 'logo.png'. Every HTTP request is routed to PHP, and the parameters in the file URL are parsed in order to apply additional functionality, such as resizing the image.
Why is this useful? If my system has a large number of user avatars and my design changes, I can easily change the avatar picture URLs and everything works as expected. I never have to regenerate the avatars of all my users, especially those of users who no longer exist in my system and would just waste resources.
But here's my problem: if I want to implement Nginx to serve static files on my server, my system does not work. This is because Nginx will attempt to load the static file itself and returns a 404 Not Found if the picture does not exist. I assume that the same is true with Apache and Squid.
One of my clients specifically requested that they wish to serve images and resources through Nginx instead, but they would still like the dynamic images for the ease of development and design.
Is it possible to tell Nginx or Squid to send the request to PHP if the image file itself is not found, get the 'dynamic' image from PHP, send it to the user through Nginx, and then serve it from Nginx's cache on any subsequent request for the same file?
I want to have the flexibility of dynamically loaded image files, but also have the speed of Nginx when serving image files. Is something like this possible? Do I need to set specific file headers in PHP that allow for this? (I already set cache and expire headers).
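For illustration, this is the kind of hypothetical Nginx configuration I imagine (the paths, socket, and cache-zone names are my guesses, not a tested setup):

```nginx
# Hypothetical sketch, not a tested setup -- paths and names are placeholders.

# In the http {} block: a cache zone for PHP-generated images.
fastcgi_cache_path /var/cache/nginx/img levels=1:2 keys_zone=imgcache:10m;

server {
    # Serve the file directly if it exists; otherwise fall through to PHP.
    location /w/resources/images/ {
        try_files $uri @dynamic;
        expires 30d;
    }

    # Hand the request to PHP-FPM and cache the generated image.
    location @dynamic {
        include fastcgi_params;
        fastcgi_param SCRIPT_FILENAME $document_root/index.php;
        fastcgi_pass unix:/var/run/php-fpm.sock;
        fastcgi_cache imgcache;
        fastcgi_cache_key $request_uri;
        fastcgi_cache_valid 200 30d;
    }
}
```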
Thank you!

Related

Store the number of views of an image

I have a folder on my web server containing lots of PNG images. I would like to store in a database the number of views each image gets. I do not really know how to achieve this. I can only think of using a .htaccess file to rewrite the URLs in that folder to a PHP script that would serve those images and also store the visit in MySQL. But maybe there is a better way than serving all the images through a PHP script. I am looking for the simplest possible way to do this. Also, the current URLs should not be changed.
There isn't a single "good" way to do this, but I can give you a few ideas.
(recommended) Make a proxy PHP script which takes the ID (which could be the file name) of the picture, increments the counter in the DB, and then redirects to the original image.
This is a very clear way to achieve the expected behaviour. The disadvantage is that it changes the current filenames, so you have to use the following schema:
CURRENT_URL -> (mod_rewrite) -> PHP PROXY SCRIPT -> NEW_URL
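As a rough sketch of this proxy idea (untested; the script name `count.php`, the DB credentials, the `views` table, and the image directory are all placeholders, not from the original post):

```php
<?php
// Untested sketch: count.php?id=logo.png -- increment a counter,
// then redirect to the real file so the web server serves it statically.
// DB credentials, table name, and paths are placeholders.

function valid_image_id(string $id): bool
{
    // allow only simple file names: letters, digits, _, ., -
    return $id !== '' && preg_match('/^[\w.-]+$/', $id) === 1;
}

if (PHP_SAPI !== 'cli') {   // only act when serving a real web request
    $id = basename($_GET['id'] ?? '');   // strip any path components
    if (!valid_image_id($id)) {
        http_response_code(400);
        exit;
    }

    $pdo = new PDO('mysql:host=localhost;dbname=stats', 'user', 'pass');
    $pdo->prepare(
        'INSERT INTO views (image, hits) VALUES (?, 1)
         ON DUPLICATE KEY UPDATE hits = hits + 1'
    )->execute([$id]);

    // redirect to the original image
    header('Location: /images/' . rawurlencode($id), true, 302);
}
```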
Redirect all image URLs to a PHP proxy script which serves the image directly (sets the appropriate headers and outputs the content of the file).
This is a 100% transparent way to do it, but the obvious disadvantage is that every image is processed by PHP, so on a heavily loaded server this could be a problem.
If you have access to the logs of the web server (access.log for Apache), you can process the file, say, once a day and update the counters in the DB. This works very well when an approximation is good enough and your server is heavily loaded, because you can parse the logs on a different machine.
Apache already keeps a log file of all requests (/var/log/apache2/access.log). This file contains the URL requested, time of request etc.
A possible approach could be:
Create a script which parses the access log and updates the database
Configure a cronjob which invokes the script periodically
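A minimal sketch of such a parsing script (untested; the log path and URL prefix are placeholders, and it prints the tallies where a real cron job would update the database):

```php
<?php
// Untested sketch: tally PNG requests found in an Apache
// combined-format access log. Log path and /images/ prefix are
// placeholders; a real job would write the counts to the database.

function image_path_from_log_line(string $line): ?string
{
    // the request line is the first quoted field: "GET /path HTTP/1.1"
    if (preg_match('#"GET (/images/[^ "]+\.png) HTTP#', $line, $m)) {
        return $m[1];
    }
    return null;
}

$logFile = '/var/log/apache2/access.log';
if (is_readable($logFile)) {
    $counts = [];
    foreach (file($logFile, FILE_IGNORE_NEW_LINES) as $line) {
        $path = image_path_from_log_line($line);
        if ($path !== null) {
            $counts[$path] = ($counts[$path] ?? 0) + 1;
        }
    }
    foreach ($counts as $path => $hits) {
        echo "$path: $hits\n";
    }
}
```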

Difference between getting an image from .jpg and from .php?type=3&item_id=013c23

I wonder about the pros/cons of the methods below for getting an image:
http://image.anobii.com/anobi/image_book.php?type=3&item_id=013c23a6dd4c6115e4&time=1282904048
http://static.anobii.com/anobii/static/image/welcome/icon_welcome.png
The first one uses PHP to get the image, while the second one is just a direct URL.
E.g., which one is faster?
They potentially serve very different purposes. If you are able to link directly to the .png resource, it is likely (but not guaranteed to be) a real file which is world accessible on the web. When using a PHP script to serve the image content, a lot of different things may be happening behind the scenes.
For example, PHP is able to check the user's session or authentication credentials to provide authorization for the image. The image binary data could be stored in a database instead of in the filesystem, or if in the filesystem, the image file could be stored outside the web server's document root, preventing direct access to it. One common usage is to deny access to the file when a user is not authorized, and instead serve other image data in its place, like an "access denied" default image.
Another potential use of the PHP script could be per-session hit counting on the resource, or rate limiting clients from hitting a resource too many times.
When serving a static file, authorization, logging, etc. are limited to the capabilities of the web server as it is configured.
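To illustrate, here is a minimal (untested) sketch of the placeholder-on-unauthorized pattern; the file paths and the session key are hypothetical:

```php
<?php
// Untested sketch: serve the real image only to a logged-in user,
// otherwise serve a default "access denied" image. Paths and the
// 'user_id' session key are illustrative; the files live outside
// the document root so they cannot be fetched directly.

// Pure helper: pick the real file only for an authenticated user.
function image_for_user(?string $userId, string $real, string $placeholder): string
{
    return ($userId !== null && $userId !== '') ? $real : $placeholder;
}

if (PHP_SAPI !== 'cli') {   // only act when serving a real web request
    session_start();
    $file = image_for_user(
        $_SESSION['user_id'] ?? null,
        '/srv/private-images/avatar.png',
        '/srv/private-images/access-denied.png'
    );

    header('Content-Type: image/png');
    header('Content-Length: ' . filesize($file));
    readfile($file);
}
```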
The question to ask isn't really which is faster, but which suits the application's business need.

PHP File Upload in Sharded Server Configuration

We use multiple servers to handle incoming web requests which are load-balanced in a round-robin fashion. I've run into an issue that I'm not sure how to solve.
Using AJAX (qqFileUploader), I am uploading a file. By default it goes into the /tmp folder which is fine. The problem is when I try to retrieve that file, that retrieval request gets handled by the next server in line which does not have the file I uploaded. If I keep repeating the request over and over again, it will eventually reach the original server (via round robin load balancing) where the file was stored and then I can open it. Obviously that is not a good solution.
Here is essentially the code: http://jsfiddle.net/Ap27Z/. I removed some of it for brevity. You will see that the uploader object makes a call to a PHP file to do the file upload and then after the file upload is complete, another AJAX call is made to a script to process the .csv file. This is where the process is getting lost in the round-robin.
I read a few questions here on SO relating to uploading files to memory and it seems that it is essentially not currently feasible. Is there another option I can use to upload a file and handle it all within the same request?
The classic solution to this type of problem is to use sticky sessions on your load balancer. That may not be a good solution for you, as it would mean modifying your whole setup to fix a small problem.
I would suggest adding a sub-domain prefix for each machine: e.g. the upload goes to www.example.com, and each server is allocated an additional subdomain (www1.example.com, www2.example.com) which is always routed directly to that server, rather than through the round-robin DNS.
As part of the success result, you could pass back the server name that points to the exact server, rather than the load-balanced name, and then all subsequent Ajax calls that reference the uploaded data use that server specific domain name, rather than the generic load balanced domain name.
Is there another option I can use to upload a file and handle it all within the same request?
Sure, why not? The code that handles the POSTing of the data can do whatever you want it to do.
There are (at least) 2 solutions to your problem:
You change the load-balancing.
There are several load balancing proxies out there which support session affinity a.k.a. "sticky sessions". That means that a user always gets the same server within a session.
Two programs that can act in this way are HAProxy (related question here on SO) and nginx with a custom module (tutorial here).
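For nginx specifically, the built-in ip_hash directive gives a rough form of affinity (same client IP, same backend) without any extra module. A sketch with illustrative hostnames:

```nginx
# Sketch (hostnames are placeholders): ip_hash keeps each client IP
# on the same backend instead of pure round robin.
upstream app_servers {
    ip_hash;
    server app1.internal:8080;
    server app2.internal:8080;
}

server {
    listen 80;
    location / {
        proxy_pass http://app_servers;
    }
}
```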
You change the files' location.
The other choice would be to change the location of your stored files to some place that all of your servers can access in the same way. This could be, for example, an NFS mount or a database (with the files stored as BLOBs). This way it doesn't matter which server processes the request, as all of them have access to the file.

Cache (static) content from another website on my own webserver

My website is loading slowly and I ran this test: http://www.webpagetest.org/result/120227_MD_3CQZM/1/performance_optimization/
It indicates that files stored on gametrackers.com are not being cached.
Apache and Joomla already cache the content that is on my server.
I'm using a script from gametrackers.com to show my TeamSpeak 3 statistics on my website.
However, this script sometimes loads slowly due to issues with gametrackers.com's server, and that's why I'd like to store a copy of it on my own webserver as a cache and refresh it every 30 minutes from the gametrackers website.
If the gametrackers website is down (which is quite common), it should keep the last successfully cached copy.
How would I do this with Apache 2.4.1 and possibly PHP?
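A minimal PHP sketch of the refresh-every-30-minutes-with-stale-fallback behaviour described above (untested; the widget URL and cache path are placeholders, and it assumes allow_url_fopen is enabled):

```php
<?php
// Untested sketch: keep a local copy of the remote widget, refresh it
// every 30 minutes, and keep serving the stale copy when the remote
// host is down. URL and cache path are placeholders; assumes
// allow_url_fopen is enabled.

function is_stale(int $mtime, int $now, int $ttl): bool
{
    return ($now - $mtime) > $ttl;
}

if (PHP_SAPI !== 'cli') {   // only act when serving a real web request
    $cacheFile = __DIR__ . '/cache/gametracker.html';
    $ttl = 30 * 60;   // 30 minutes

    if (!is_file($cacheFile) || is_stale(filemtime($cacheFile), time(), $ttl)) {
        $fresh = @file_get_contents('http://www.gametracker.com/WIDGET-URL-HERE');
        if ($fresh !== false) {
            file_put_contents($cacheFile, $fresh);   // refresh succeeded
        } elseif (is_file($cacheFile)) {
            touch($cacheFile);   // remote down: reuse the stale copy,
                                 // retry again after the TTL expires
        }
    }

    if (is_file($cacheFile)) {
        readfile($cacheFile);
    }
}
```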
If it's possible, I'd also like to use CSS sprites, because webpagetest.org indicates:
The following images served from gametracker.com should be combined into as few images as possible using CSS sprites.
http://cache.www.gametracker.com/images/components/html0/gt_icon.gif
http://cache.www.gametracker.com/images/components/html0/online.gif
http://cache.www.gametracker.com/images/flags/nl.gif
http://cache.www.gametracker.com/images/game_icons/ts3.png
http://cache.www.gametracker.com/images/server_info/16x16_channel_green.png
http://cache.www.gametracker.com/images/server_info/16x16_player_off.png
http://cache.www.gametracker.com/images/server_info/vs_tree_item.gif
http://cache.www.gametracker.com/images/server_info/vs_tree_last.gif
http://cache.www.gametracker.com/images/server_info/vs_tree_outer.gif
http://www.gametracker.com/images/game_icons/ts3.png
CSS sprites are a technique where you combine several icons and other items into a single image, positioned so that only one request loads all of them.
If the images aren't on your site it will be very difficult to implement, and to do so you need strict patterns.
Check: http://coding.smashingmagazine.com/2009/04/27/the-mystery-of-css-sprites-techniques-tools-and-tutorials/
If you have a VPS / dedicated server, you can use mod_pagespeed; it automatically applies several of the combinations of things that website optimizers like.
But don't just assume that website optimizers and testing tools like that are accurate.
They just suggest measures that could help; some are practical, some aren't.
Good luck.

Forcing the browser to cache images in a PHP website

I've a PHP-based website and would like the browser to cache the images for 30 days. I am using a shared hosting solution where I do not have access to the Apache config to enable mod_headers or other modules, so I cannot use .htaccess mechanisms for this.
My site is a regular PHP app with both HTML content and images. I would like the browser to cache images only. I've seen PHP's "header" function, but couldn't find a way to force only image caching. How do I go about it?
Thanks
As far as I know, if you can't get access to Apache to set the headers, your only other option is to serve images from a PHP script so you can use PHP's header() function to set them.
In this case, you'd need to write a PHP image handler and replace all your image tags with calls to this handler (e.g. http://mysite.com/imagehandler.php?image=logo.png). You would then have your imagehandler.php script retrieve the image from the file system, set the mime type and cache control headers, and stream the image back to the client.
You could write your own, or if you google, you will find image handler PHP scripts. Either way, make sure you focus on security: don't allow the client to retrieve arbitrary files from your web server, because that would be a fairly major security hole.
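A minimal sketch of such a handler (untested; the images directory and the 30-day lifetime are illustrative):

```php
<?php
// Untested sketch of imagehandler.php: validate the requested name,
// then send 30-day cache headers with the image. The image directory
// is a placeholder.

function content_type_for(string $name): ?string
{
    $types = ['png' => 'image/png', 'jpg' => 'image/jpeg',
              'jpeg' => 'image/jpeg', 'gif' => 'image/gif'];
    $ext = strtolower(pathinfo($name, PATHINFO_EXTENSION));
    return $types[$ext] ?? null;
}

if (PHP_SAPI !== 'cli') {   // only act when serving a real web request
    $name = basename($_GET['image'] ?? '');   // blocks directory traversal
    $path = __DIR__ . '/images/' . $name;
    $type = content_type_for($name);

    if ($type === null || !is_file($path)) {
        http_response_code(404);
        exit;
    }

    $maxAge = 30 * 24 * 3600;   // 30 days in seconds
    header('Content-Type: ' . $type);
    header('Cache-Control: public, max-age=' . $maxAge);
    header('Expires: ' . gmdate('D, d M Y H:i:s', time() + $maxAge) . ' GMT');
    readfile($path);
}
```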