I'm looking for a way to make a web server cache and provide resized images from another remote server.
Let's say there's Site A located somewhere in Africa. On Site A are JPEG images that are refreshed every five minutes (they're webcams). If you visit Site A from the US, the images take quite a while to load since the server is in Africa.
I have Site B. I would like Site B to display the four images from Site A, but I would like them to be served from a server here in the US (which is also where Site B is hosted). That way, they'll of course load much more quickly.
I'm not familiar with script creation. I tried a few PHP scripts from CodeCanyon and the concept works, but there always seem to be a few minor bugs that cause the entire system to fail.
They work by appending the remote image URL to the local site URL (e.g. http://www.SiteB.com/image_cacher.php?=http://www.SiteA.com/image1.jpg). That's exactly what I want to do.
I'd like the cached images to be stored on the Site B server in a folder called "cache." That way, I can use a cron job to automatically delete the cached images every five minutes, the same frequency at which the webcam images are updated.
I've solved the cron job issue though, so my only dilemma now is creating some type of script (preferably PHP) that can achieve this.
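To make the goal concrete, the scripts I tried work roughly along these lines (a minimal sketch of the concept only, not tested code; the parameter name, host check, and cache path are just examples):

    <?php
    // image_cacher.php -- sketch of the remote-image caching idea.
    // Called as: image_cacher.php?url=http://www.SiteA.com/image1.jpg
    // (the "url" parameter name is an example; the scripts I tried used "?=").
    $remoteUrl = $_GET['url'] ?? '';

    // Only fetch images from the one known remote host.
    if (strpos($remoteUrl, 'http://www.SiteA.com/') !== 0) {
        http_response_code(403);
        exit('Forbidden');
    }

    $cacheFile = __DIR__ . '/cache/' . md5($remoteUrl) . '.jpg';

    // Serve from cache when available; the cron job empties the cache
    // folder every five minutes, matching the webcam refresh rate.
    if (!is_file($cacheFile)) {
        $data = file_get_contents($remoteUrl); // the slow transatlantic fetch
        if ($data === false) {
            http_response_code(502);
            exit('Upstream fetch failed');
        }
        file_put_contents($cacheFile, $data);
    }

    header('Content-Type: image/jpeg');
    readfile($cacheFile);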
There are many similar questions here, but they all differ slightly, and unfortunately I haven't been able to find anything that performs this exact task.
Thanks!
My website is loading slowly and I ran this test: http://www.webpagetest.org/result/120227_MD_3CQZM/1/performance_optimization/
It indicates that the files served from gametrackers.com are not being cached.
Apache and Joomla already cache content that is on my server.
I'm using a script from gametrackers.com to show my TeamSpeak 3 statistics on my website.
However, this script sometimes loads slowly due to issues with gametrackers.com's server, which is why I'd like to store a copy of it on my own web server as a cache and refresh it every 30 minutes from the gametrackers website.
If the gametrackers website is down (which is quite common), it should keep serving the last successfully cached copy.
How would I do this with Apache 2.4.1 and possibly PHP?
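For the PHP half, a rough sketch of the keep-the-last-good-copy idea might look like this (the remote URL, file names, and content type are placeholders I made up; this is the gist, not a tested implementation):

    <?php
    // Cache a remote file for 30 minutes, but fall back to the last
    // successful copy whenever the remote host is down.
    $remote = 'http://www.gametracker.com/some/stats/image.png'; // placeholder URL
    $cache  = __DIR__ . '/cache/gametracker_stats.png';          // placeholder path
    $maxAge = 30 * 60; // refresh every 30 minutes

    $stale = !is_file($cache) || (time() - filemtime($cache)) > $maxAge;

    if ($stale) {
        // Suppress warnings: a failed fetch must not wipe the old copy.
        $data = @file_get_contents($remote);
        if ($data !== false && $data !== '') {
            file_put_contents($cache, $data);
        } elseif (is_file($cache)) {
            // Remote is down: bump the mtime so we don't retry on every hit.
            touch($cache);
        }
    }

    if (is_file($cache)) {
        header('Content-Type: image/png');
        readfile($cache);
    } else {
        http_response_code(503); // no cached copy yet and the remote is down
    }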
If it's possible, I'd also like to use CSS sprites, because webpagetest.org indicates:
The following images served from gametracker.com should be combined into as few images as possible using CSS sprites.
http://cache.www.gametracker.com/images/components/html0/gt_icon.gif
http://cache.www.gametracker.com/images/components/html0/online.gif
http://cache.www.gametracker.com/images/flags/nl.gif
http://cache.www.gametracker.com/images/game_icons/ts3.png
http://cache.www.gametracker.com/images/server_info/16x16_channel_green.png
http://cache.www.gametracker.com/images/server_info/16x16_player_off.png
http://cache.www.gametracker.com/images/server_info/vs_tree_item.gif
http://cache.www.gametracker.com/images/server_info/vs_tree_last.gif
http://cache.www.gametracker.com/images/server_info/vs_tree_outer.gif
http://www.gametracker.com/images/game_icons/ts3.png
CSS sprites are a technique where several icons and other small items are combined into a single image, positioned so that you can load all of them with only one request.
If the images aren't hosted on your own site, it will be very difficult to implement this, and doing it properly requires strict, consistent patterns.
Check: http://coding.smashingmagazine.com/2009/04/27/the-mystery-of-css-sprites-techniques-tools-and-tutorials/
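For illustration, the technique boils down to something like this (a toy example; the sprite file name and coordinates are made up):

    /* One combined image, sprites.png, holding two 16x16 icons side by side. */
    .icon        { width: 16px; height: 16px; background: url(sprites.png) no-repeat; }
    .icon-online { background-position: 0 0; }     /* first tile  */
    .icon-flag   { background-position: -16px 0; } /* second tile */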
If you have a VPS or dedicated server, you can use mod_pagespeed; it automatically applies many of the kinds of optimizations that website testing tools like to see.
But don't just assume that website optimizers and testing tools like that are accurate.
They just suggest measures that could help; some are practical, some are not.
Good luck.
I have a file-uploading site which currently runs on a single server, i.e. the same server is used both for users to upload files to and for content delivery.
What I want to implement is a CDN (content delivery network). I would like to buy a server farm, and if I had a mechanism to spread files out across the different servers, that would balance my load a whole lot better.
However, I have a few questions regarding this:
Assuming my server farm consists of 10 servers for content delivery,
Since, at the user's end, the upload script lives at one location only, i.e. <form action=upload.php>, it has to reside on a single server, correct? How can I duplicate the script across multiple servers and direct each user's upload to the server with the least load?
How should I determine which files get sent to which server? During the upload process, should I send all files to random servers? If the user sends 10 files, should I send each to a random server? Is there a mechanism to send them to the server with the least load, or some other algorithm for deciding which server the files should go to?
How will the files be sent from the upload server to the CDN? Using FTP? Wouldn't that introduce additional overhead, plus the need for error checking to detect FTP connection breaks, verify that each file transferred successfully, and so on?
Assuming you're using an Apache server, there is a module called mod_proxy_balancer. It handles all of the load-balancing work behind the scenes. The user will never know the difference -- except when their downloads and uploads are 10 times faster.
If you use this, you can have a complete copy on each server.
mod_proxy_balancer will handle this for you.
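A minimal configuration sketch of that (the host names are placeholders; check the mod_proxy_balancer documentation for your Apache version):

    # Round-robin two backends behind one public URL.
    # Needs mod_proxy, mod_proxy_http, mod_proxy_balancer
    # (and mod_lbmethod_byrequests on Apache 2.4).
    <Proxy "balancer://mycluster">
        BalancerMember "http://server1.example.com"
        BalancerMember "http://server2.example.com"
        ProxySet lbmethod=byrequests
    </Proxy>
    ProxyPass        "/upload" "balancer://mycluster/upload"
    ProxyPassReverse "/upload" "balancer://mycluster/upload"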
Each server can have its own sub-domain. You will have a database on your 'main' server, which matches up each of your download pages to the physical server it is located on. Then an on-the-fly URL is generated using a hashing algorithm, which prevents hard-linking to the download and increases your page hits. The hash input could be a mix of personal and miscellaneous information, e.g. the user's IP and the time of day. The download server then checks the hash, and either accepts or denies the request.
If everything checks out, the download starts, your load is balanced, and the users don't have to worry about any of this behind-the-scenes stuff.
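A sketch of that token idea in PHP (the secret, file path, and host name here are my own illustration, not a vetted design):

    <?php
    // On the main server: build a time-limited, per-user download URL.
    $secret  = 'replace-with-a-long-random-secret'; // shared with download servers
    $file    = 'uploads/archive.zip';               // hypothetical file path
    $expires = time() + 600;                        // link valid for 10 minutes
    $token   = hash_hmac('sha256',
                         $file . '|' . $expires . '|' . $_SERVER['REMOTE_ADDR'],
                         $secret);
    $url = 'http://dl3.example.com/get.php?f=' . urlencode($file)
         . '&e=' . $expires . '&t=' . $token;

    // On the download server (get.php): recompute the hash and compare.
    $expected = hash_hmac('sha256',
                          $_GET['f'] . '|' . $_GET['e'] . '|' . $_SERVER['REMOTE_ADDR'],
                          $secret);
    if (!hash_equals($expected, $_GET['t'] ?? '') || time() > (int)$_GET['e']) {
        http_response_code(403);
        exit('Link expired or invalid');
    }
    // ...stream the requested file here...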
Note: I have done Apache administration and web development, but I have never managed a large CDN, so this is based on what I have seen on other sites and on general knowledge. Anyone who has something to add here, or corrections to make, please do.
Update
There are also companies that manage it for you. A simple Google search will get you a list.
A client asked me to create an image for him, and he wants me to set up a script that will track how many times the image is clicked AND how many times the image is loaded. He wants this displayed for him in the WP dashboard, so I don't think I can use the server's access logs. I have never done anything like this, so I turn to the stackoverflow community. :)
Can somebody just point me in the right direction? Thanks a million.
The easiest way is probably just to use the access logs of your web server (for Apache, the most common one, check: http://httpd.apache.org/docs/2.0/logs.html). Every hit (both the link page and the image) is logged there, so you can get the information from that.
Most web servers have some tools available that can make this a bit easier than counting by hand (or with grep); just search the web for your web server + log analyzer (e.g. http://awstats.sourceforge.net/ for Apache). If you're not hosting the site yourself, chances are that your web host has pre-installed such a tool for you already.
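For example, a quick-and-dirty count straight from the log could look like this in PHP (the log path and image path are assumptions, and log locations vary between hosts):

    <?php
    // Count image loads by scanning the Apache access log for the image path.
    $log   = '/var/log/apache2/access.log'; // adjust to your setup
    $image = '/images/client-banner.png';   // hypothetical image path

    $loads = 0;
    $fh = fopen($log, 'r');
    if ($fh === false) {
        exit('Cannot open the access log');
    }
    while (($line = fgets($fh)) !== false) {
        if (strpos($line, 'GET ' . $image) !== false) {
            $loads++;
        }
    }
    fclose($fh);

    echo "Image loaded {$loads} times\n";
    // Clicks need their own trail, e.g. by linking the image to a small
    // redirect script whose URL you can count in the same way.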
I'm in the process of building a simple website, or to be more precise a simple component for a website, that adds a watermark to an image, creates a few different-sized versions of it, and overlays it onto a few products. These edits will be made every time someone queries an image in a certain directory on the server.
I know this can all be done with imagemagick, my only concern is that the whole website will grind to a halt every time someone views their image for the first time (after the edit's been made once, the database is updated to get the edited version every time a user accesses it).
The website isn't hosted yet, for the time being I'm testing on XAMPP, but I figured for this I'm going to need a virtual or dedicated server, I just need some advice on what sort of hardware specs I ought to be looking at. I doubt more than 2 or 3 people will be viewing photos at any one time, but at a guess I need to be sure that the server can handle up to 10 or so and still be functional.
Hope someone can advise on this, cheers!
Image processing needs some CPU power, and if the images are large, it will also consume memory. But I think you should work with caching in any case. I don't know your application, but there are certainly ways to cache images to the filesystem once they have been rendered.
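A sketch of that filesystem-caching idea, using PHP's Imagick extension (the paths and the watermark file are assumptions):

    <?php
    // Render the watermarked image once, then serve the cached file on
    // every subsequent request. All paths are placeholders.
    $source = __DIR__ . '/originals/product.jpg';
    $cached = __DIR__ . '/cache/product_watermarked.jpg';

    if (!is_file($cached)) {
        $image     = new Imagick($source);
        $watermark = new Imagick(__DIR__ . '/watermark.png');
        // Composite the watermark into the bottom-right corner.
        $image->compositeImage(
            $watermark,
            Imagick::COMPOSITE_OVER,
            $image->getImageWidth()  - $watermark->getImageWidth()  - 10,
            $image->getImageHeight() - $watermark->getImageHeight() - 10
        );
        $image->writeImage($cached);
    }

    header('Content-Type: image/jpeg');
    readfile($cached);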
I have a website. It's my first website with Zend Framework, but I think it's written well. Generation time is about 0.9s now; I'll get it down to something like 0.2s, but I'll leave that for now. When you press any link on the website, it takes about 1.5-2s before the web browser starts loading the page, and then 0.15s to show it. So if execution time is 0.9s, where are the other 1.1s? Ping is about 13ms. The website address is http://zgarnijlicke.pl
Edit:
Strange. The second domain, http://lottek.eu, works fine. Look at http://lottek.eu/picostreamer: it isn't lagging like the zgarnijlicke.pl domain.
Edit 2:
There is a problem with Zend Framework. I set up an action without rendering the view (layout disabled too), and it works as fast as the server can manage. I'll open a new question for it.
Here's a webpagetest.org report for your site: http://www.webpagetest.org/result/100721_1P0Y/
If you view the waterfall graph for the first view, you'll see that the browser gets your HTML source at around the 1.2 second mark, and is first able to render your page at just after 4 seconds. What happens in between those two is downloading of your three javascript files and two CSS files. So, this is where you want to start. Some suggestions:
Consider using a free CDN for jquery.js instead of serving it from your server, e.g. Google's: http://code.google.com/apis/ajaxlibs/ . This way, users are more likely to already have it cached, Google will serve it from a location geographically closer to the user, and (I think) in compressed format.
For jquery.corner.js and jquery.media.js, consider merging them into one file and serving them compressed (the Apache module mod_deflate makes this very easy to do; see the config sketch after this list)
Same for your CSS files - consider merging them into one file and serving them compressed.
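For reference, the compression half takes only a few lines once mod_deflate is enabled (the MIME types here are the usual suspects, not a complete list):

    # Compress JS, CSS and HTML responses with mod_deflate.
    <IfModule mod_deflate.c>
        AddOutputFilterByType DEFLATE application/javascript text/css text/html
    </IfModule>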
Those will give you some quick wins. However there are other things you can improve:
Add width and height attributes to your image tags. Without these, some browsers will halt rendering while they download the images so that they know how much space they'll occupy. None of your image tags have these attributes.
Make sure you're using the right image format for the job. Your banner.png image is over 300k, which is far too large. I converted this to a JPEG image (80% quality) and it was 30k.
As for the execution time, 0.9 seconds seems quite high. Are you using APC or similar? Is the page doing any heavy database work?
Try putting some timer code in your php that measures the length of time it takes to generate the page content. This way you can confirm or rule out server problems.
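Something as simple as this at the top and bottom of the entry script would do (a throwaway sketch):

    <?php
    // At the very top of the entry script:
    $pageStart = microtime(true);

    // ... generate the page ...

    // At the very end:
    $elapsed = microtime(true) - $pageStart;
    error_log(sprintf('Page generated in %.3f s', $elapsed));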
You might also use network tools like ping and traceroute to see if your problem is caused by network latency.
A quick test with wget here gets an overall execution time of 1.5s to transfer one of the pages, with an actual download time of 0.2 seconds, so 1.3s of overhead. The pause occurs before the transfer starts, so that's a server-side problem.
Is that site on a virtual server? It's possible that if the underlying physical server is heavily loaded, your virtual machine image could be getting swapped out or otherwise CPU-starved, and take ~1 second to become responsive again.
Perhaps it's an internal resource thing: are you connecting to a DB, especially a remote one? Even if some or most of the pages aren't DB-driven, the overhead of connecting to a DB could be causing this slowdown, and then the image gets swapped out or delayed again, as there's little further activity to keep it active.
It could even be something as silly as Apache being configured with 'IdentityCheck' on, though unlikely, as this would slow down all requests. I'm not seeing any slowdown on the requests for .css/.js files from your server when viewed from HTTPFox. Interestingly, requesting the .css/.js via wget returns a '500 Internal Server Error'.
I found it. It's a problem with ZF, because when I made a hello.php page with content like this:
hello world
without any <?php ?> at all, the script took 0.4s to complete.