I have an established Magento site that is only just starting to get complaints about its performance. It's on a shared server. The internal server-side cache is also enabled.
Some of the problems I have noticed are:
Many HTTP Requests
No minification of CSS or JavaScript.
No CSS sprites
Shared server isn't recommended for Magento, or so I've read
Unfortunately, I cannot get a dedicated host, and I don't want to hack the codebase to minify all the JS/CSS.
Are there any plugins that will minify CSS/JS?
Do you have any experience with speeding up Magento's performance?
To minify CSS/JS files, try this extension:
Fooman Speedster
I created a similar topic on Magento performance improvements here; the comments there may help you.
CSS sprites and JS minification are good general techniques, but since they require hacking the default Magento installation, they're a less-than-stellar option here. You're likely to get more performance for your effort by setting far-future expiry dates on your files (CSS/JS/images can be cached for a while, especially once you've finished development) and making sure that Apache gzips those files. This will reduce the size of your page loads with an empty cache and reduce the number of HTTP requests on cached pages.
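As a rough illustration of those two tweaks, here is a minimal PHP sketch (not Magento-specific; the asset path is hypothetical) that serves a static file with gzip compression and a far-future expiry header. On a real host you'd normally enable this in Apache via mod_deflate/mod_expires instead:

```php
<?php
// Serve a static asset with gzip + far-future expiry: a minimal sketch.
// ob_gzhandler only compresses when the client sends Accept-Encoding: gzip.
if (!ob_start('ob_gzhandler')) {
    ob_start();
}

$ttl = 30 * 24 * 3600; // let browsers/proxies cache for 30 days
header('Cache-Control: public, max-age=' . $ttl);
header('Expires: ' . gmdate('D, d M Y H:i:s', time() + $ttl) . ' GMT');

header('Content-Type: text/css');
readfile(__DIR__ . '/styles.css'); // hypothetical asset path
```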
After that (and as also recommended on the other topic mentioned), I'd recommend looking at a PHP bytecode cache such as APC or XCache. Magento loads many PHP files on every request, so such a cache will lead to significant improvements in performance. Ask your shared host whether they offer any such cache.
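If your host's answer is vague, a quick way to check for yourself is a one-line script (the extension names below are the common ones; your host may use something else):

```php
<?php
// Report which common PHP opcode-cache extensions are loaded on this host.
foreach (array('apc', 'xcache', 'eaccelerator') as $ext) {
    echo $ext, ': ', extension_loaded($ext) ? 'loaded' : 'not loaded', "\n";
}
```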
Also, here's the Magento article on performance, which has some good suggestions:
http://www.magentocommerce.com/blog/comments/performance-is-key-notes-on-magentos-performance/
Edit: Forgot to mention, block caching can significantly reduce the amount of time that Magento churns on a page, speeding up your page loads. Google "magento block caching" for some good resources.
You can get a VPS for barely more than a standard shared host. Or better yet, get two VPS accounts and use one only for the database.
You're really limited with what you can do to speed up Magento on a shared host, because so many tweaks rely on server configuration (which you can do with a VPS).
Also, turn on every cache that you possibly can. See:
http://www.magentocommerce.com/blog/comments/performance-is-key-notes-on-magentos-performance/
http://www.yoast.com/magento-performance-hosting/
I assume you already know this, but the bottom line is that Magento should NOT be run on shared hosting. It wasn't designed at all for it. It's like trying to run the latest Photoshop on a 7 year old computer.
My website has a long response time (3 sec).
I tried deactivating all plugins one by one and checking, but its waiting time is still 2 sec.
I also use w3-cache and wp-cache on my live domain, but the waiting time is 3 sec.
Today I downloaded the new WordPress, and configured all settings.
I did not add any themes or plugins and tested its speed; I see the WordPress waiting time is 2 sec.
It means my WordPress response time is long. How can I reduce it?
Some pretty high traffic sites run WordPress.
There are several layers in the technology stack that could be the problem. The most common bottlenecks are network speed, server resources (RAM/CPU), and database resources.
Since your initial page is very slow but your static resources load quickly, I would probably rule out network speed as the issue. Loading a static HTML test page will confirm that your network speed is okay.
Next, install WP Super Cache on your plain install of WordPress using the default theme. Is it fast? If so, I would suspect your host's MySQL server is overloaded and overworked.
By default, most web hosts and WP aren't set up for 100% speed efficiency, but with a few little tweaks you can make massive improvements in load times.
I use GTmetrix to measure a website's speed, and implement their suggestions to increase it, including:
Enable gzip via .htaccess
Defer JavaScript loading
Minify JS and CSS with Autoptimize
Reduce image sizes with Smush.it
Those are only a few of the options; GTmetrix will help you understand what is slowing your site down and how to speed it up. A sketch of one of them, deferring JavaScript in WordPress, follows.
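This is a minimal sketch for WordPress 4.1+ (which added the script_loader_tag filter); the script handles listed are hypothetical, so substitute the ones your theme and plugins actually register:

```php
<?php
// Add the defer attribute to selected scripts so they stop blocking rendering.
add_filter('script_loader_tag', function ($tag, $handle) {
    $defer = array('my-theme-js', 'some-plugin-js'); // hypothetical handles
    if (in_array($handle, $defer, true)) {
        $tag = str_replace(' src=', ' defer src=', $tag);
    }
    return $tag;
}, 10, 2);
```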
A default WordPress install should be lightning fast if it's on even a remotely decent server. Try removing the database, recreating it, and reinstalling. Or use phpMyAdmin to repair and optimize your tables. If that is not an option, try some janitorial DB queries like:
DELETE um FROM `wp_usermeta` um LEFT JOIN `wp_users` u ON (um.user_id = u.ID) WHERE u.ID IS NULL;
Same concept with posts & post meta, just different primary keys.
Also, if your server is not set up properly for caching (i.e., XCache or memcache not installed, configured, or loaded) and you're not running an uber-high-traffic site, you probably don't need caching at all. If it ain't broke, don't fix it.
Or, if you have existing data that you cannot afford to lose, use WP Migrate DB to back up your existing install by exporting to an SQL file, follow the steps above for removing the original database, then import the database anew from phpMyAdmin after recreating it.
Getting the desired performance from your WordPress sites can be tricky, as the WordPress platform doesn't seem to have been designed with performance optimization in mind.
You can get the desired results with manual performance-optimization techniques, but these can be burdensome for less technical WordPress users, especially if they just want a fast WordPress blog.
There are really good plugins out there that make it easy, though some still require technical knowledge. I recommend these two plugins, which are fairly easy for less technical users to configure:
Speed up your website with JCH Optimize - this plugin makes it really easy to combine and minify CSS, JavaScript, and HTML. There are plenty more features in the paid version.
Cache your site with W3TC - one of the most popular caching plugins. A must-have if you want some extra speed.
Just wondering... does including unused PHP files affect performance? And by how much?
For example, including 20 .php files with classes in them, but without actually using the classes (though they might be used).
I will give a slightly different answer to this:
If you are running on a tuned VPS or dedicated server: a trivial amount.
If you are running on a shared hosting service: it can considerably degrade performance of your script execution time.
Why? Because in the first case you should have configured a PHP opcode cache such as APC or XCache, which can, in practical terms, eliminate script load and compilation overheads. Even where files need to be read or stat-checked, the metadata and file data will be "hot" and therefore largely sitting in the file-system cache if the (virtual) server is dedicated to the application.
On a shared service everything runs in the opposite direction: PHP runs as a per-request image under the user's UID; no opcode caching solution supports this mode, so everything needs to be compiled on each request. The killer here is that the files also need to be read, and many (perhaps most) shared LAMP hosting providers use a scalable server farm for the LAMP tier, with the user data on shared NFS-mounted NAS infrastructure. Since these NFS mounts will have an acregmin (attribute-cache timeout) of less than 1 min, the I/O requests will require RPCs off-server. My blog article gives some benchmarks here. The details for a shared IIS hosting template are different, but the net effects are similar.
I run the phpBB forum package on my shared service and roughly halved response times by aggregating the common set of source includes, as I describe here.
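For a rough idea of what that aggregation looks like, here is a minimal sketch (file names are hypothetical): a build step concatenates the includes that every request loads, so the runtime does one file read instead of many:

```php
<?php
// build_combined.php -- aggregate common includes into one file (a sketch).
$includes = array('config.php', 'functions_core.php', 'functions_db.php');

$combined = "<?php\n";
foreach ($includes as $file) {
    $src = file_get_contents(__DIR__ . '/' . $file);
    // Strip open/close tags so the files can be safely concatenated.
    $src = preg_replace('/^\s*<\?php\s*/', '', $src);
    $src = preg_replace('/\?>\s*$/', '', $src);
    $combined .= "\n// --- {$file} ---\n" . $src;
}
file_put_contents(__DIR__ . '/combined.php', $combined);
// Each request then does a single: require __DIR__ . '/combined.php';
```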
Yes, though by how much depends on a number of things. The performance cost isn't too high if you are using a PHP accelerator, but it will drastically slow things down if you aren't. Your best bet is generally to use autoloading, so you only load things at the point of actual use rather than loading everything just in case. That may reduce your memory consumption too.
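A minimal autoloading sketch with spl_autoload_register (the underscore-to-directory mapping is an assumed convention; adapt it to your layout):

```php
<?php
// Load class files lazily, only when a class is first used.
spl_autoload_register(function ($class) {
    // Assumed convention: class Foo_Bar lives in classes/Foo/Bar.php
    $path = __DIR__ . '/classes/' . str_replace('_', '/', $class) . '.php';
    if (is_file($path)) {
        require $path;
    }
});

// Hypothetical class: classes/Database/Connection.php is read only here,
// not on every request that merely *might* need it.
$db = new Database_Connection();
```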
Of course it affects the performance. Everything you do in PHP does.
How much depends on how much code is in them and how long it takes to execute them or, in the case of classes, to parse them.
If you're not using them, why include them? I assume you're using some main engine file or header file, and you should rethink your method of including files.
EDIT: Or, as @Pekka pointed out, you can autoload classes.
Short answer - yes it will.
For longer answers, a quick Google search revealed these: Will including unnecessary php files slow down website?; PHP Performance on including multiple files
Searching helps!
I installed Drupal Commons from Acquia and am using it for my college intranet website. I configured it on Ubuntu Lucid Lynx Desktop edition running the latest XAMPP. I want to increase the performance of the website. My database server and web server are on the same machine.
Can anyone suggest methods to increase the performance on the following points?
What should be the ideal hardware configuration?
What parameters should I change in PHP to run it at best performance?
How can I optimize Apache and MySQL to get the best performance out of both?
Are there tweaks in Drupal that can make it faster?
Are there any additional packages for caching etc. that can improve the speed?
Also, try Varnish if you're using Pressflow, as suggested by berkes. It helps a lot if you have to serve content to anonymous users.
Varnish can cache in memory all the content that Drupal produces, reducing hits to your web server and database.
Here a good start point for configuring Varnish with Pressflow:
https://wiki.fourkitchens.com/display/PF/Configure+Varnish+for+Pressflow
Google some for more details.
And don't forget about non-Drupal-related optimization, like reducing the number of HTTP requests, serving web page elements from different domains to increase parallel downloads in the browser, etc. Use YSlow and follow Yahoo's excellent rules. Google "Yahoo Best Practices for Speeding Up Your Web Site" (can't include the link due to the SO limitation for new users).
This is not specific to Drupal; it applies to every PHP setup and, more generally, to every web app. I advise you to start with O'Reilly's Building Scalable Websites.
See above. For Drupal, note the memory limit; many people just crank it up to ridiculous values, following the logic: "Drupal needs more than 38MB, so I'll just give it 250MB, to be safe."
Again, see above. For Drupal, pay extra attention to the number of queries. If you focus on slow queries only, you may miss that single tiny query hammering your DB 100+ times per request.
Lots. My advice is to start looking at Pressflow, an optimized Drupal. It has all the tweaks you are looking for built in. And more.
Yes, many, but start with memcached. And if you rely on search a lot, consider moving search to Solr.
Many more tips for starters can be found on the Drupal performance blog.
The question you ask is very broad, so it is hard to give specifics in answers. A good place to start is Drupal's own handbook on performance tuning.
I would also highly recommend the Boost module if your site serves largely anonymous users, as this allows requests to be served entirely from a static cache without even hitting Drupal.
Drupal's Devel module has a Performance module that will log memory usage and access times to the Reports section of your site.
Use this to determine which pages on your site are slow.
Load Xdebug (a PHP extension) and turn on the profiling feature. Make requests to your performance-intensive pages and it will create (very large) dumps of the entire request. Open the cachegrind output file in a program like KCacheGrind or WinCacheGrind and you will be able to see every function call Drupal made when building the page. From there you can see which parts are slowest and optimize them.
This should get you a good 30-80% improvement in performance if you have a slow site. In my experience, there's usually a few blocks or views that account for a huge part of any performance issues.
Pro Drupal 7 Development has a whole section on fine-tuning called "Optimizing Drupal".
I think you will find it quite interesting. It also discusses hardware architectures, which are of interest to you.
Regarding the 4th question, you can start by checking out the Boost module and disabling modules you are not using.
Additionally, to improve page performance you can enable page caching under Configuration -> Performance. On the same page you can enable "Aggregate and compress CSS/JS files into one"; this reduces the number of HTTP requests per page and the overall size of the downloaded page.
You should also check that cron is set up. Not running cron can fill up the DB with log entries, stale cache, and other garbage.
A last suggestion is to convert your DB from MyISAM to InnoDB, but this requires some investigation because InnoDB is not always faster. With InnoDB there is less time lost to table locking, while MyISAM is faster at table reads.
Is Magento usually so terribly slow?
This is my first experience with it and the admin panel simply takes ages to load and save changes. It is a default installation with the test data.
The server where it is hosted serves other non-Magento sites super fast. What is it about the PHP code that Magento uses that makes it so slow, and what can be done to fix it?
I've only been tangentially involved in optimizing Magento for performance, but here are a few reasons why the system is so slow:
Parts of Magento use an EAV database system implemented on top of MySQL. This means querying for a single "thing" often means querying multiple rows.
There are a lot of things behind the scenes (application configuration, system config, layout config, etc.) that involve building up giant XML trees in memory and then "querying" those same trees for information. This takes both memory (storing the trees) and CPU (parsing the trees). Some of these (especially the layout tree) are huge. Also, unless caching is on, these trees are built up from files on disk on each request.
Magento uses its configuration system to allow you to override classes. This is a powerful feature, but it means that any time a model, helper, or controller is instantiated, extra PHP instructions need to run to determine whether the original class file or an override class file is needed (see the sketch after this list). This adds up.
Besides the layout system, Magento's template system involves a lot of recursive rendering. This adds up.
In general, the Magento engineers were tasked, first and foremost, with building the most flexible, customizable system possible, and worrying about performance later.
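To make the class-override point concrete, here is a generic sketch of a config-driven factory; this is not Magento's actual code, just the shape of the mechanism. Every instantiation pays for a rewrite-map lookup before `new` runs:

```php
<?php
// A generic sketch of config-based class overrides (not Magento's real code).
class Factory
{
    /** @var array requested class => override class, loaded from XML config */
    private $rewrites = array(
        'Catalog_Product' => 'MyModule_Catalog_Product', // hypothetical override
    );

    public function getModel($class)
    {
        // Extra instructions on every instantiation: check for an override.
        if (isset($this->rewrites[$class])) {
            $class = $this->rewrites[$class];
        }
        return new $class();
    }
}
```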
The first thing you can do to ensure better performance is turn caching on (System -> Cache Management). This will relieve some of the CPU/disk blocking that goes on while Magento is building up its various XML trees.
The second thing you'll want to do is ensure your host and operations team has experience performance tuning Magento. If you're relying on the $7/month plan to see you through, well, good luck with that.
Further to Alan Storm's recommendations on caching, there are two things I'd specifically recommend you look into related to caching:
- Make sure caching is in memcached, rather than on disk.
I look after a couple of Magento installs, and once you get any sort of load on the system, memcached starts to perform much faster. And it's dead easy to change over (relative to doing other Magento stuff, at least!).
A good starting point is here: http://www.magentocommerce.com/boards/viewthread/12998/P30/ - but if you've not used memcached at all before, it's worth looking at some general info about it as well.
- Enable template/view caching.
This is a good article: http://inchoo.net/ecommerce/magento/magento-block-caching/
There are good ones on the Magento site too (Google "magento block caching"), but it's down at the moment.
To add my two cents on block caching: I'd advise you to create your own blocks in /app/code/local, extending the core ones and defining the cache parameters; name them xxx_Cache, then update your layout to use these blocks instead of the core ones. This way you avoid losing your changes or breaking the system when you upgrade Magento.
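A minimal sketch of such a block, assuming Magento 1.x conventions (the module and block names are hypothetical):

```php
<?php
// app/code/local/MyCompany/Catalog/Block/Product/View/Cache.php (hypothetical)
// Extend a core block and enable block caching via the standard data keys.
class MyCompany_Catalog_Block_Product_View_Cache
    extends Mage_Catalog_Block_Product_View
{
    protected function _construct()
    {
        parent::_construct();
        $this->addData(array(
            'cache_lifetime' => 3600, // seconds
            'cache_tags'     => array(Mage_Core_Model_Store::CACHE_TAG),
            'cache_key'      => 'product_view_' . $this->getRequest()->getParam('id'),
        ));
    }
}
```

Then point your layout XML at MyCompany_Catalog_Block_Product_View_Cache instead of the core block.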
If you haven't seen it yet, Magento and Rackspace teamed up to create a white paper on performance tuning Magento. It's excellent.
https://support.rackspace.com/whitepapers/building-secure-scalable-and-highly-available-magento-stores-powered-by-rackspace-solutions/
--- edit ---
Another great resource, newly available (Oct 2011) is:
http://www.sessiondigital.com/assets/Uploads/Mag-Perf-WP-final.pdf
(Thanks due to Alan Storm on this one.)
There is possibly also a very non-obvious reason why your admin interface is slow. Magento has a module named Mage_AdminNotification. Try disabling that extension. What it does is query magentocommerce.com for new update messages; if their servers are slow, your admin page waits on them and is in effect slow because of the network lag and the loading of the external news. If you have restricted outgoing server connections through a firewall, this can be even more frustrating, since the admin interface will wait for the timeout when it cannot reach magentocommerce.com.
To disable it: go to System -> Configuration, scroll to the bottom and click Advanced (in the Advanced section). Now disable Mage_AdminNotification and save.
I only have superficial experience with Magento. I installed it on a shared grid-server and the page loading was dismal: ~5+ seconds. On a lark, I installed it on my dedicated server (optimized for CMS sites), and it felt very, very snappy.
My dedicated host had ~10 Joomla! sites and a vBulletin site running.
My guess is it's just not going to be performant on shared hosting. The oversubscription just won't allow enough resources for Magento to run as it ought to.
I'm more involved in managed server optimization at my company, but I may have a few tips for you. First, you can look at the code more closely using the code-tracing feature of Zend Server. It will show you where and when things get dirty.
I totally share benlumley's consideration regarding the cache. Most of the sites we host don't even have block caching enabled. This cache has to be explicitly invoked, not "assumed", so if your code hasn't yet taken advantage of this mechanism, it's something you definitely want to try. If you have the EE version, you can enable the full-page cache to get the best out of the beast.
A reverse proxy will also help a lot. It'll cache the static resources, significantly lowering the pressure on the PHP interpretation stack of your front servers.
Don't forget to write the sessions and Magento cache to a RAM disk. This will also definitely take you to another level of performance.
There's still a lot to be said here, but I'm running out of time. You should know that a good site, well coded on a 1.4.1 CE version, running on a 2x5650 Xeon + 16 GB RAM server with a reverse proxy on top, can handle up to 50,000 unique visitors a day with smooth pages for everybody.
Switching from Apache to LiteSpeed helped us a lot, in addition to editing MySQL's settings, installing Fooman Speedster (a module to compress/combine JS and CSS files), and installing APC. Magento has also posted a white paper on how to get the best performance out of the Enterprise edition, but it is equally applicable to the other versions: http://www.magentocommerce.com/whitepaper/
There are many reasons why your Magento shopping cart could be running slow, but no excuses, as there are a variety of ways to alleviate the problem and make it pretty darn fast. Enabling gzip by modifying your .htaccess file is a start. You can also install the Fooman Speedster extension. The type of server used will also determine the speed of your store. More tips and a better explanation here: http://www.interactone.com/how-to-speed-up-magento/
Magento is very slow because the database design is not very good. The code is a mess and very hard to update and optimize, so all optimization is done via caching instead of in the code itself.
On the other hand, it is a webshop with a lot of tools. So if you need a flexible webshop, just buy a very powerful server and you will be OK.
When I first installed it, I had pages that were taking 30 seconds to load. My server was not maxed out on RAM or processor, so I didn't know what to do. Looking at Firebug's Net panel, it was loading about 100 files per page, and each one took a long time to connect. After installing Fooman Speedster and enabling gzip in .htaccess, load times were down to 3 seconds, like they had been for other shopping carts on my server.
It will also come down to functionality versus performance.
Raw performance is gained using nginx, PHP-FPM, memcached, APC, and a properly designed server.
Functionality like Plesk and Magento performance can be balanced by keeping the entire infrastructure in perspective when designing a Magento performance cloud.
What is the best way of implementing a cache for a PHP site? Obviously, there are some things that shouldn't be cached (for example search queries), but I want to find a good solution that will make sure that I avoid the 'digg effect'.
I know there is WP-Cache for WordPress, but I'm writing a custom solution that isn't built on WP. I'm interested in either writing my own cache (if it's simple enough) or being pointed to a nice, light framework. I don't know much about Apache, though, so a PHP framework would be a better fit.
Thanks.
You can use output buffering to selectively save the parts of your output that you want to cache, and replay them to the next user if the cached copy hasn't expired yet. This way you're still rendering the other parts of the page on the fly (e.g., customizable boxes, personal information).
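A minimal sketch of that idea: a file-backed fragment cache built on output buffering (the cache directory, key, and TTL are assumptions):

```php
<?php
// Fragment caching with output buffering: a minimal sketch.
function cache_fragment_start($key, $ttl = 300) {
    $file = sys_get_temp_dir() . '/frag_' . md5($key) . '.html';
    if (is_file($file) && (time() - filemtime($file)) < $ttl) {
        readfile($file);   // still fresh: emit the cached copy
        return false;      // tell the caller to skip regeneration
    }
    ob_start();            // stale or missing: capture fresh output
    return $file;
}

// Usage: cache the "popular posts" box; render everything else live.
if ($file = cache_fragment_start('popular_posts')) {
    echo render_popular_posts();              // hypothetical expensive render
    file_put_contents($file, ob_get_flush()); // save the fragment and send it
}
```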
If a proxy cache is out of the question, and you're serving complete HTML files, you'll get the best performance by bypassing PHP altogether. Study how WP Super Cache works.
Uncached pages are copied to a cache folder with a URL structure similar to your site's. On later requests, mod_rewrite notices the existence of the cached file and serves it instead. Other RewriteCond directives make sure commenters/logged-in users see live PHP requests, but the majority of visitors will be served by Apache directly.
The best way to go is to use a proxy cache (Squid, Varnish) and serve appropriate Cache-Control/Expires headers, along with ETags: see Mark Nottingham's Caching Tutorial for a full description of how caches work and how you can get the most performance out of a caching proxy.
Also check out memcached, and try to cache your database queries (or better yet, pre-rendered page fragments) in there.
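A minimal sketch of caching a query result in memcached, using the pecl/memcache extension (the server address, key, and TTL are assumptions):

```php
<?php
// Cache an expensive query result in memcached for 60 seconds: a sketch.
$memcache = new Memcache();
$memcache->connect('127.0.0.1', 11211); // assumed local memcached

$key  = 'top_products_v1';   // hypothetical cache key
$rows = $memcache->get($key);
if ($rows === false) {
    $rows = run_expensive_query();      // hypothetical DB helper
    $memcache->set($key, $rows, 0, 60); // flags = 0, expire in 60s
}
```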
I would recommend Memcached or APC. Both are in-memory caching solutions with dead-simple APIs and lots of libraries.
The trouble with those two is that you need to install them on your web server, or on another server in Memcached's case.
APC
Pros:
Simple
Fast
Speeds up PHP execution also
Cons:
Doesn't work for distributed systems, each machine stores its cache locally
Memcached
Pros:
Fast(ish)
Can be installed on a separate server for all web servers to use
Highly tested, developed at LiveJournal
Used by all the big guys (Facebook, Yahoo, Mozilla)
Cons:
Slower than APC
Possible network latency
Slightly more configuration
I wouldn't recommend writing your own; there are plenty out there. You could go with a disk-based cache if you can't install software on your web server, but there are possible race conditions to deal with: one request could be writing to the file while another is reading it.
You actually could cache search queries, even for a few seconds to a minute. Unless your db is being updated more than a few times a second, some delay would be ok.
The PHP Smarty template engine (http://www.smarty.net) includes a fairly advanced caching system.
You can find details in the caching section of the Smarty manual: http://www.smarty.net/manual/en/caching.php
You seem to be looking for a PHP cache framework.
I recommend the template system TinyButStrong, which comes with a very good CacheSystem plugin.
It's simple, light, and customizable (you can cache whatever part of the HTML file you want), yet very powerful ^^
For simple caching of pages, or parts of pages, use the PEAR Cache_Lite class. I also use APC and memcache for different things, but the other answers I've seen so far are for more complete and complex systems. If you just need to save the effort of rebuilding part of a page, Cache_Lite with a file-backed store is entirely sufficient and very simple to implement.
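A minimal Cache_Lite sketch (the cache directory, lifetime, and render helper are assumptions):

```php
<?php
// PEAR Cache_Lite: cache one page fragment with a file-backed store.
require_once 'Cache/Lite.php';

$cache = new Cache_Lite(array(
    'cacheDir' => '/tmp/', // assumed cache directory
    'lifeTime' => 600,     // seconds
));

if ($html = $cache->get('sidebar')) {
    echo $html;                     // cache hit
} else {
    $html = build_sidebar();        // hypothetical expensive render
    $cache->save($html, 'sidebar'); // store for next time
    echo $html;
}
```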
Project Gazelle (an open-source torrent site) provides a step-by-step guide to setting up memcached, which you can easily apply to any other website that needs to handle a lot of traffic.
Grab the source and read the documentation.