I am writing a mobile version of a website. I want to use WURFL-PHP to perform the mobility check on the client, and to re-use the existing .htaccess files. I don't want to parse anything manually; that is better left to Apache.
For simplicity, let's say I have two flat files: file.d.html and file.m.html. In index.php, I am considering making an HTTP request for the appropriate one of these files, depending on the client being served (mobile vs. non-mobile), and forwarding the response to the client. That way, everything would work the way the .htaccess files prescribe.
To make the request, there is a PHP extension called pecl_http. The problem is that I only have a shared hosting plan, so my web host is reluctant to make it available.
So, can someone tell me if this is reasonable, or whether I'm just complicating things?
Also, is there any way to accomplish this without an extra HTTP request? I wonder if Apache has some way to test for mobility. Perhaps I can add something to .htaccess instead of coding it in PHP.
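For instance, something along these lines in .htaccess is roughly what I imagine (the user-agent pattern is only an illustration, certainly not a complete mobile check):

    # Rough sketch: send clients whose User-Agent looks mobile to the mobile file
    RewriteEngine On
    RewriteCond %{HTTP_USER_AGENT} (iphone|android|blackberry|palm) [NC]
    RewriteRule ^index\.php$ file.m.html [L]
    RewriteRule ^index\.php$ file.d.html [L]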
Many thanks.
Personally I wouldn't do this server-side, but with JavaScript and CSS media queries. Specifically, there is a process called 'Responsive Web Design', where the browser picks your styles based on the screen width in pixels. This avoids the evil of browser sniffing and therefore will generally support future mobile platforms without any changes. You can use the display style to hide whole blocks of content if necessary, or just use different values for width and float to change from horizontal to vertical layouts. The Wikipedia page on RWD warns that some mobile platforms don't support media queries, but in practice, as of 2012, that's an extremely small share of the mobile market.
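As a tiny sketch (the class names are just placeholders for whatever your layout actually uses), a media query can hide a block and stack the columns on narrow screens:

    /* Rough illustration: adjust the layout for viewports up to 480px wide */
    @media screen and (max-width: 480px) {
        .sidebar { display: none; }             /* hide a whole block of content */
        .column  { float: none; width: 100%; }  /* horizontal layout becomes vertical */
    }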
I'm working on a website that can be found here:
http://odesktestanswers2013.com/Metareviewer
The index page appears to be unusually slow (it slows down the browser as it loads), even though YSlow doesn't seem to see anything particularly wrong with it and my PHP microtime returns a fine value.
What are the other things I should be looking into?
Using Chrome Developer Tools, the network tab shows a timeline of what's loading in your page.
There are also plenty of good practices that aren't being followed here. Some of these can also be flagged up by using Google Chrome's Audit tool (F12 menu), but in my opinion the most important are:
Use a CDN for serving common library code. Do you really need to host jQuery yourself? (Side-rant: do you really need jQuery at all?)
Your JavaScript files are taking a long time to load, because they are all served as separate HTTP calls. You can combine them into a single JavaScript file, and also minify them to save lots of bandwidth.
Foundation.css is very large - not that there's a problem with large CSS files, but it looks like there are over 2000 rules in the CSS file that aren't being used on your site. Do you need this file?
CACHE ALL THE THINGS - there are 26 uncached HTTP requests, meaning that everyone who visits your site has to download everything on every request.
The whole bandwidth can be reduced by about two thirds if you enable gzip compression on your server (or even better, implement SPDY, but that's a newer technology with less of a community); a sample .htaccess snippet follows below.
Take a look at http://caniuse.com - there are a lot of CSS features that are supported in modern browsers without the need for -webkit or -moz prefixes, which could save a fortune of kilobytes.
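For the caching and gzip points above, a rough .htaccess sketch (assuming mod_deflate and mod_expires are enabled on the server) could look something like this:

    # Sketch only: compress text resources and give static assets long cache lifetimes
    <IfModule mod_deflate.c>
        AddOutputFilterByType DEFLATE text/html text/css application/javascript
    </IfModule>
    <IfModule mod_expires.c>
        ExpiresActive On
        ExpiresByType image/png              "access plus 1 month"
        ExpiresByType image/gif              "access plus 1 month"
        ExpiresByType text/css               "access plus 1 week"
        ExpiresByType application/javascript "access plus 1 week"
    </IfModule>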
If I could change one thing on your site...
Having said all of that, each point above will make a very small (but cumulative) difference to the speed of your site, but it's probably a good idea to attack the worst offender first.
Look at the network graph. While all that JavaScript is downloading, it blocks the rest of the site from downloading.
If you're lazy, just move it all to the end of the document body. That way, the rest of the page will download before the JavaScript does, but this could harm the execution of your scripts if they are written in particular styles.
Hope this helps.
You should also consider using http://www.webpagetest.org/
It's one of the best tools when it comes to benchmarking your site's performance.
You can use this site (http://gtmetrix.com/) to analyze the causes and to fix them. The site provides the reasons as well as solutions, such as optimized versions of your JS and CSS.
According to its report, you need to optimize your images and minify your JS and CSS files. The optimized images and minified JS and CSS files can be downloaded from the site.
Use Google Chrome -> F12 -> Network and check the connect, send, receive, etc. times for each resource used in your page.
It looks like your CSS and JS files have very long connect and wait times.
You can use one of the best add-ons available for both Chrome and Firefox: YSlow.
YSlow analyzes web pages and suggests ways to improve their performance based on a set of rules for high performance web pages.
The link above is for the Firefox add-on; you can also search for the Chrome version, and it is freely available.
YSlow gives you details about your website's front end. Most likely you have a script that is looping one too many times in the background.
If you suspect that a sequence of code is hanging server side then you need to do a stack trace to pinpoint exactly where the overhead is taking place.
I recommend using New Relic.
Try to use Opera. Right click -> Inspect element -> Profiler.
Look at Inspect element -> Errors.
The question is simple:
When is it a good idea to serve static files (images, JS, CSS, etc.) from a subdomain, when is it not a good idea, and why?
Right now I am using shared hosting, but in the future I will use a dedicated server.
Thanks
It's not the subdomain itself that has any relevance; it's the content delivery mechanism behind it that is relevant...
If your site has high traffic such that performance is poor or your current host is unable to keep up, it can make sense to offload the serving of commonly requested or large-size assets (images, video, some CSS or Javascript, etc) to a more specialised content delivery server.
For example, I use Amazon's CloudFront Content Delivery Network (CDN) to serve all my static images. It is much faster at serving static content (it caches the content in several edge locations around the world), and it frees up my server to perform better in processing and serving dynamic content (dynamic PHP and CSS files). Until recently, Amazon's CDN could only serve static files, so it made sense that way.
I map my own subdomains to the Amazon CDN server, but you don't have to -- there's no requirement saying that you have to map a subdomain to do that. You can use your content delivery provider's default URL if you prefer.
So how do you know if your site would benefit from serving static assets from a CDN? I made the assessment using YSlow. YSlow profiles your site's responsiveness and speed and makes recommendations to speed up the site. I measured before and afterwards and saw an improvement. Be careful though -- it's easy to get addicted to profiling! The best gauge is analysing your user behaviour. Do they give up and leave your site too early? Do people complain about poor performance? If not, then find other areas that may better improve your bottom line.
My website is loading slowly and I ran this test: http://www.webpagetest.org/result/120227_MD_3CQZM/1/performance_optimization/
The test indicates that files stored on gametrackers.com are not being cached.
Apache and joomla already cache content that is on my server.
I'm using a script from gametrackers.com to show my TeamSpeak 3 statistics on my website.
However, this script sometimes loads slowly due to issues with the gametrackers.com server, and that's why I'd like to store a copy of it on my own web server as a cache and refresh it every 30 minutes from the gametrackers website.
If the gametrackers website is down (which is quite common), it should keep serving the last successfully cached copy.
How would I do this with Apache 2.4.1 and possibly PHP?
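Roughly, something like this in PHP is what I have in mind (the URL and cache path below are placeholders, not the real ones):

    <?php
    // Sketch only: serve a cached copy of the remote script, refresh it every 30 minutes,
    // and fall back to the last successful fetch if the remote site is down.
    $remoteUrl = 'http://example.com/ts3-widget.html'; // placeholder for the gametracker script URL
    $cacheFile = __DIR__ . '/cache/ts3_widget.html';   // placeholder cache location
    $maxAge    = 30 * 60;                              // 30 minutes

    $stale = !is_file($cacheFile) || (time() - filemtime($cacheFile)) > $maxAge;
    if ($stale) {
        $fresh = @file_get_contents($remoteUrl);       // may fail when the remote site is down
        if ($fresh !== false) {
            file_put_contents($cacheFile, $fresh);     // only overwrite the cache on success
        }
    }
    if (is_file($cacheFile)) {
        echo file_get_contents($cacheFile);            // fresh copy, or the last good one
    }

Would something along those lines be sensible, or is there a cleaner way to do it in Apache itself?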
If it's possible I'd also like to use CSS sprites, because webpagetest.org indicates:
The following images served from gametracker.com should be combined into as few images as possible using CSS sprites.
http://cache.www.gametracker.com/images/components/html0/gt_icon.gif
http://cache.www.gametracker.com/images/components/html0/online.gif
http://cache.www.gametracker.com/images/flags/nl.gif
http://cache.www.gametracker.com/images/game_icons/ts3.png
http://cache.www.gametracker.com/images/server_info/16x16_channel_green.png
http://cache.www.gametracker.com/images/server_info/16x16_player_off.png
http://cache.www.gametracker.com/images/server_info/vs_tree_item.gif
http://cache.www.gametracker.com/images/server_info/vs_tree_last.gif
http://cache.www.gametracker.com/images/server_info/vs_tree_outer.gif
http://www.gametracker.com/images/game_icons/ts3.png
CSS sprites are a technique where you use one image containing several icons and other items, positioned so that with only one request you effectively load several images.
If the images aren't on your site, it will be very difficult to implement that, and to do it well you need a consistent, predictable pattern for the image positions.
Check: http://coding.smashingmagazine.com/2009/04/27/the-mystery-of-css-sprites-techniques-tools-and-tutorials/
If you have a VPS or dedicated server you can use mod_pagespeed; it automatically does several of the things that web site optimizers like.
But don't just assume that web site optimizers and testing tools like that are accurate.
They just suggest measures that could help; some are practical, some aren't.
Good luck.
I am curious to know whether the basic PHP file upload procedure will work on all devices.
Such as:
All Major Browsers
Mobile Phones (iPhone, Blackberry, Android, Palm)
I need to be able to upload videos/photos from any device onto our server; are there any issues that you can foresee?
First off, PHP is not what handles the upload on the client side; that depends on the browser and how it handles it. The browser is the one sending the information about the file (contents, size, type).
Second, mobile platforms will not do this well. The iPhone doesn't allow file uploads from its browser. I'm not sure about Android, but the iPhone alone should be enough to make you re-think this.
The browser doesn't "see" PHP; PHP is server-side code. The interaction on the front end might happen with Flash (often used in "AJAX-style" uploads), JavaScript, and/or HTML. HTML is about as basic as it gets, so 99% of the devices out there can deal with it.
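For reference, the "basic procedure" the question refers to is nothing more than an HTML form posting to a PHP script, roughly like this (the field and directory names are only examples):

    <?php
    // Minimal sketch of the standard upload flow; add real validation before using it.
    if (isset($_FILES['upload']) && $_FILES['upload']['error'] === UPLOAD_ERR_OK) {
        $dest = __DIR__ . '/uploads/' . basename($_FILES['upload']['name']);
        move_uploaded_file($_FILES['upload']['tmp_name'], $dest);
        echo 'Uploaded ' . htmlspecialchars($_FILES['upload']['name']);
    }
    ?>
    <form method="post" enctype="multipart/form-data">
        <input type="file" name="upload">
        <input type="submit" value="Send">
    </form>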
Just imagine the chaos if we had to coordinate our browsers with the back-end technology. It's tough enough as it is with the lack of current standards and the half-decade-in-the-dark browser team that does IE... thanks, Microsoft.
I'd personally caution you against using bleeding-edge technology such as CSS3 or HTML5 if you're trying to make something "universally" compatible. For example, AOL's browser, which is still in use, believe it or not, is essentially an old version of IE6. Some of the phone browsers are even more basic, though I'd question whether people would ever REALLY use those in an upload situation. It's not fun to use "old" technology, but especially when clients are very specific about backwards compatibility, you often don't have a choice.
What are some important optimizations that can be made to a website to reduce the loading time?
Remove/minimize any bottlenecks on the server side. For this purpose, use a profiler like Xdebug or Zend Debugger to find out where your application is doing expensive and slow operations. Implement caching where possible, and use an opcode cache. If this still isn't fast enough, consider investing in more CPU, RAM, or SSDs (depending on whether you are CPU-, IO-, or memory-bound).
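If you go the Xdebug route, a minimal profiling setup in php.ini looks roughly like this (assuming Xdebug 2.x; the output directory is just an example):

    ; Sketch only: profile requests on demand rather than on every request
    xdebug.profiler_enable         = 0
    xdebug.profiler_enable_trigger = 1        ; profile only when XDEBUG_PROFILE is passed
    xdebug.profiler_output_dir     = /tmp/xdebug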
For general server/client side optimizations, see the Yahoo YSlow! User Guide.
It basically boils down to:
Minimize HTTP Requests
Use a Content Delivery Network
Add an Expires or a Cache-Control Header
Gzip Components
Put StyleSheets at the Top
Put Scripts at the Bottom
Avoid CSS Expressions
Make JavaScript and CSS External
Reduce DNS Lookups
Minify JavaScript and CSS
Avoid Redirects
Remove Duplicate Scripts
Configure ETags
Make AJAX Cacheable
Use GET for AJAX Requests
Reduce the Number of DOM Elements
No 404s
Reduce Cookie Size
Use Cookie-Free Domains for Components
Avoid Filters
Do Not Scale Images in HTML
Make favicon.ico Small and Cacheable
Before attempting any optimizations, you first need to be able to profile: get Firebug for Firefox. Then you can run an analysis with YSlow that will tell you exactly what to do. The fundamental things that you should do are listed here.
You definitely want to look at caching, as round trips to the DB are expensive.
Also, minify your JS.
Here are a few "best practice" things:
Caching CSS, JavaScript, images, etc.
Minifying Javascript files.
gzip content.
Place links to JavaScript files, JavaScript code, and links to CSS files at the bottom of your page when possible.
Load only what is necessary.
For an existing website, before you do any of this, determine where your bottlenecks are with tools like Firebug and, as someone else mentioned, YSlow (I highly recommend this tool).
Install Firebug and the PageSpeed plugin, follow all the PageSpeed directives (as far as possible), and be happy:
http://code.google.com/intl/it/speed/page-speed/
Anyway, the most important optimization in my experience is to reduce the number of HTTP requests to a minimum...
The simple options I can think of are:
Gzip (x)html, so a compressed file should arrive more quickly to the user
minify the CSS
minify the JS
use caching where possible
use a content-delivery network
use a tool such as YSlow to identify bottlenecks and get further suggestions
There are two sides you can care about when optimizing:
The server side: what matters is generating the output faster.
The client side: what matters is getting everything that has to be displayed to the user faster.
Note: we, as developers, often think about optimizing the server side first... which in most cases represents less than 10% of the loading time of the page!
On the server side, you'll generally want to:
profile, to determine what's slow
optimize your SQL queries, and reduce their number
use caching
For more information, you can take a look at the answer I gave some time ago to this question: Optimizing Kohana-based Websites for Speed and Scalability
On the client side, the biggest gains are generally achieved by :
Reducing the number of HTTP requests -- the easiest way being to reduce the number of JS/CSS/images files, by combining several files into one
Compressing CSS/JS/HTML, using for instance Apache's mod_deflate.
On that subject, there is a lot of great material from Yahoo's Exceptional Performance team: they've released lots of good practices and tools, such as YSlow.
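As a quick illustration of combining several files into one request, a simple PHP front controller for CSS could look something like this (the file names are just examples):

    <?php
    // Sketch only: concatenate a fixed list of stylesheets into a single response
    // and let the browser cache the result for a week.
    $files = array('reset.css', 'layout.css', 'theme.css'); // example file names

    header('Content-Type: text/css');
    header('Expires: ' . gmdate('D, d M Y H:i:s', time() + 7 * 24 * 3600) . ' GMT');

    foreach ($files as $file) {
        readfile(__DIR__ . '/css/' . $file);
        echo "\n";
    }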
We recently did this on our web site. Here we outlined nine techniques that seemed to have the highest impact with the least difficulty: http://mentormate.com/blog/easy-ways-speed-website-load-time/
The first optimisation is: Decide if it is slow, and if not, don't bother.
This is trickier than it sounds, because it's not like testing a desktop app or game. A game is slow if, when you play it on the target hardware, the frame rate is too low. This is very easy to measure.
A web site is trickier, because you, as the developer, are probably using a local test system with a very fast network. Even when you use your staging / system test servers, you're probably still on the local network. Even your production servers are, in all likelihood, on the same continent.
The same is possibly not true for quite a lot of your users.
Therefore the options which exist are:
Find out by asking your users whether they find it to be slow
Simulate a high latency environment and test it yourself (or your QA team)
Guesswork
The latter is not recommended.
One thing which the holier-than-thou Yahoo web performance book (which, yes, is a book you can buy) doesn't mention much is HTTPS. Most web applications which handle important data run mostly or entirely over HTTPS, which changes the rules of the game rather a lot. Remember to do all testing with it enabled.
I wrote some things about this; see:
Google page speed test optimization
As already mentioned, you can use the YSlow or PageSpeed Firefox extensions. But you can also use GTmetrix, an online service that scans your page with both tools.
Features I like / use:
soft, clean and usable presentation
comparison with another page - it's really interesting to see where your friends / competitors stand
(By the way, I'm not affiliated with GTmetrix!)
To reduce network traffic, you can minify static files, such as CSS and Javascript, and use gzip compression on generated content. You can also try using tools such as optipng to reduce the size of images.
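If you can't enable compression at the server level, a quick sketch of gzipping generated content from PHP itself (this assumes the zlib extension is available):

    <?php
    // Sketch only: compress the page output if the client supports it
    ob_start('ob_gzhandler');

    echo '<html><body>generated page content goes here</body></html>';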
However, the first step to take is to actually analyse what's taking all of the time -- whether it's sending the bits over the network, or actually generating the content to send. There's no point making your CSS files 10% smaller if it takes a minute to generate each HTML page.
Don't ship unnecessary whitespace in your code; minify it before serving.
Load balancing would help to reduce the loading time immensely.