I am working on a Drupal-based site and notice there are a lot of separate CSS and JS files. Wading through some of the code, I can also see quite a few cases where many queries are used too.
What techniques have you tried to improve the performance of Drupal and what modules (if any) do you use to improve the performance of Drupal 'out of the box'?
Going to the admin/settings/performance page, turning on CSS and JS aggregation, and enabling page caching with a minimum lifetime of 1 minute will give you an immediate boost on a high-traffic site. If you're writing your own code and doing any queries, consider writing your own discrete caching for expensive functions. The linked article covers Drupal 5, not 6, but the only changes in D6 are the elimination of the serialization requirement and the function signatures of the cache_set() and cache_get() functions. (Both noted in comments on the article.)
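For illustration, a discrete cache wrapper in Drupal 6 looks roughly like this (a minimal sketch; the cache ID and the expensive_calculation() helper are hypothetical):

    function mymodule_expensive_data() {
      // Try the cache first; cache_get() returns an object with a ->data member.
      $cache = cache_get('mymodule:expensive_data', 'cache');
      if ($cache && isset($cache->data)) {
        return $cache->data;
      }
      // Cache miss: do the expensive work, then store it. In D6 you no longer
      // serialize $data yourself; cache_set() handles that.
      $data = expensive_calculation();
      cache_set('mymodule:expensive_data', $data, 'cache', CACHE_TEMPORARY);
      return $data;
    }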
On large-scale sites, also consider dropping a memcached server onto the network: using the memcached module, you can entirely bypass the Drupal database for cached data. If you have huge amounts of content and search is a hot spot, you can also consider using Lucene/Solr as your search indexer instead of Drupal's built-in one. It's nice for a built-in indexer, but it's not designed for heavy loads (hundreds or thousands of new pieces of content an hour, say, with heavy faceted searching). The Apache Solr module can tie in with that.
If you're making heavy use of Views, be sure that you've checked the queries it generates for unindexed fields; sorting and filtering by CCK fields in particular can be slow, because CCK doesn't automatically add indexes beyond the primary keys. In D6, preview the View in the admin screen, copy the text of the query, and run it through EXPLAIN in mysql or whatever query analysis tools you have.
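As a sketch of that workflow (the table and field names here are made up; substitute the ones from your View's actual query):

    -- Paste the query from the Views preview into the mysql client:
    EXPLAIN SELECT node.nid, story.field_date_value
    FROM node
    INNER JOIN content_type_story story ON story.vid = node.vid
    ORDER BY story.field_date_value DESC
    LIMIT 10;

    -- If EXPLAIN shows no usable key on the CCK column (join type ALL,
    -- "Using filesort"), add an index yourself:
    ALTER TABLE content_type_story ADD INDEX field_date_value (field_date_value);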
Tools like YSlow and Firebug can also help you spot slow stuff like massive image files, JS hosted on remote servers, and so on.
Drupal 6, out-of-the-box, provides css and javascript aggregation --- most css and js files will be combined into a single file (and thus a single HTTP request), and are also whitespace-shortened (to reduce bandwidth consumption). You can enable this under /admin/settings/performance .
Also on that screen are controls for Drupal's (very effective) cache, which helps reduce the number of database queries.
Additionally, because Drupal (and all the modules you'll probably have installed) has a ton of PHP source, using a PHP opcode cache such as APC helps significantly decrease the request time.
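A minimal php.ini fragment for APC might look like the following (a sketch; the extension filename and shared-memory size are illustrative, and on some APC versions shm_size must be a bare number of megabytes):

    extension = apc.so
    apc.enabled = 1
    ; Size the shared memory segment to hold all of your compiled scripts.
    apc.shm_size = 64M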
I strongly second Benedict's recommendation of the Boost module - it alone will make your website fly on shared hosting, if configured correctly, and is not really buggy at all.
Turn on CSS/JS aggregation, turn on Boost, and your site can perform very well for anonymous users.
If your site deals with mostly logged-in users, you're going to have to do a lot more work making sure sessions are cached well, and probably consider using memcached and more SQL query caching.
The biggest performance gains always come from caching—but monitoring and adjusting your slow queries, monitoring and adjusting apache and PHP configurations, and being smart about the modules you use are very important as well.
Other than the default Drupal cache, there are other methods to increase performance.
The Boost module is one of the best.
Memcache, Varnish (Drupal 7/Pressflow), and a CDN are other methods that can improve performance.
It's also worth mentioning the performance boost from using quality SSD storage. It has consistently cut my initial response load times by 30% or more when migrating to SSD (I went from about ~450ms to ~250ms on my last project, using identical Apache/Memcache/Cloudfront EC2 configs), not to mention how much nicer it is to manage a snappy server where every command or script you throw at it is nearly instantaneous. I will never go back.
I was working on a Magento 1.7 site last night, which wasn't the fastest but was at least acceptable in terms of speed. I tried running a simple re-compile (System > Tools > Compilation), which took forever to complete (5+ minutes) so I cancelled it, flushed cache (2+ minutes), then re-compiled again, which took a while but eventually finished.
Now the site is running extremely slow: page load speeds > 8 seconds, and admin page load speeds are > 20 seconds.
I made a couple of other minor changes prior to recompiling: in Admin > Configuration > Checkout, I changed Redirect to Checkout from Yes to No for when a user clicks Add-To-Cart, and made some basic code changes to alter the layout of the category list page.
I basically don't know where to start at this point. The site is http://www.vapetropolis.ca
Edit: Just thought of this - Wordpress was installed in a subdirectory of the site before compiling. Could this be the problem? Will try removing it and recompiling and see what happens...
Edit 2: Problem persists
Edit 3: Confirmed, compilation is part of the problem. After disabling compilation, the site runs much faster: slower than when it was previously compiled and working, but much faster than the broken compiled state.
My guess is that you've cleared the cache during the recompile, and that it just needs time to start caching again. Think of it this way: a cache is stored memory in which something is held for frequent, fast access. When you clear the cache, things will move slower until they're re-cached.
In addition, there are lots of things you can do to make this problem "less" apparent when it occurs.
First: the recompiling is not actually going to help with speed unless you have some sort of PHP opcode caching, such as APC, installed server-side. Something like APC is a STAPLE, and you will see increased performance and decreased load times.
1.7 is more of a pig than previous versions, but it seems to respond well to Varnish. Our implementation of Varnish full page cache saved about 70% on load times alone. If you can implement Varnish, this is a must.
For search and category pages, SOLR is a GREAT tool. It uses its own index (created by Magento) and does not use MySQL full-text searching. This not only decreases load times on your search result pages, but your category flats will fly as well.
Hardware -- Magento needs a decent amount of processing power, but RAM is what's really needed when using tools like APC and Varnish, as they store their data in the machine's much faster RAM rather than on disk. Even though top may not indicate high RAM "usage", install the munin tools and look at the RAM space reserved for said tools; I'll bet you are using close to all of it efficiently.
I understand that you are concerned about Magento moving slowly after a recompile. And my answer is "that's an expected result." -- By doing the above, you can dramatically reduce the effects of clearing your Magento Cache.
Solution!
I deleted the Wordpress subdirectory that had been installed before compiling. The compiler must have choked on the non-Magento files. After deleting the directory, flushing all caches, reindexing all data, and disabling compilation and then recompiling, the site is now up to speed.
Try enabling the cache if it is not already on: go to System > Cache Management, select Enable from the drop-down (at the right side), and click "Submit".
If it is already on, try flushing all the cache types and then opening your website again.
I have built a site on Drupal.
My site has 7500 users, with approximately 20 to 50 anonymous and 2 to 10 logged-in users online at a time (and this is not heavy traffic, I think).
The site is on a dedicated server. I have enabled the performance settings in the Drupal admin and also installed memcache and eAccelerator.
Looking at the query logs with the Devel module, the site fires a total of 600 to 900 queries on each page.
Installing the path.inc patch to reduce the queries from drupal_lookup_path() brought that down to around 400.
I have also made some positive changes in the MySQL (my.cnf) file, but many of the same queries from the user_load() function still run again and again.
I have 60 to 70 modules enabled, and all are useful; I can't remove them.
Still the site is running slow, taking approximately 10 to 15 seconds per page.
Now I don't know why the site is running so slow.
Is it because Drupal has a large PHP codebase?
Is it because it fires so many queries on each page?
Would the InnoDB engine improve performance?
Please, any kind of suggestion is welcome.
400 queries per request is suicide (and even 50+ is too many).
You should implement an HTML cacher. My website generally doesn't even make the DB connection; it just serves the HTML cached in a file.
Some additional things to look into:
Install a tool like YSlow/PageSpeed to see how much of those 10-15s is client time and how much is server time.
Install XHProf (on a development site, not live) together with Devel to see which functions use the most time, and look into those first. Edit, now with link: http://groups.drupal.org/node/82889
Using Pressflow might help a bit, but since you are already using the path.inc patch, probably not much.
You mentioned that you installed memcache. Did you also install the memcache module and configure the cache plugin to use memcache?
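For reference, wiring Drupal 6's cache to the memcache module is a settings.php change along these lines (a sketch; it assumes the module lives under sites/all/modules and memcached listens on its default port):

    // settings.php
    $conf['cache_inc'] = './sites/all/modules/memcache/memcache.inc';
    $conf['memcache_servers'] = array('127.0.0.1:11211' => 'default');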
EDIT: Yes, switching to InnoDB can help. One of the main performance advantages of InnoDB is row-level locking (as opposed to table-level locking of MyISAM), which means that multiple INSERT/UPDATE queries against the same table won't block each other unless really necessary. However, InnoDB does not perform well at all out of the box, you really need to fine-tune your mysql configuration for your specific site. So this is a step that you should only take carefully and after testing and optimizing on a development site. There are various questions already on this site and elsewhere about InnoDB tuning...
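If you do go down that road, the conversion itself is one statement per table, and the tuning happens in my.cnf (the table name and sizes below are only illustrative; test on a development copy first):

    -- Convert a table; repeat for each MyISAM table you want to move.
    ALTER TABLE node ENGINE = InnoDB;

    -- Then tune my.cnf, e.g.:
    --   innodb_buffer_pool_size = 512M   (large enough to hold your working set)
    --   innodb_log_file_size    = 64M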
Anything else than that is then site specific and depends on the modules you are using. But especially things like complex node_access setups and multiple languages (i18n!) tend to either cause slow queries and/or a lot of them.
Not all modules make use of the caching mechanisms you can switch on in the performance settings area. It would be worth trying to identify which ones are doing the most/slowest queries and attempting to get the developer(s) to improve them.
Alternatively, examine whether you could achieve things with fewer modules. Some modules do overlap somewhat in functionality, so you may be able to reorganise the way the site functions a bit.
Additionally, you need to look at whether your MySQL settings are allowing enough memory for these queries to be carried out. Most MySQL distributions come with different versions of my.ini labelled 'small', 'medium', 'huge' etc. Copy the 'huge' one to my.ini (back up the old one first) and restart the DB to see if maxing out all the cache sizes makes a difference. You may well have a bottleneck there, but it can be hard to work out which setting is causing it.
Same goes for PHP. Set memory_limit in php.ini to 500MB or something and see if it helps. Of course, you may not be able to do this, depending on your hosting arrangements, but it will eliminate one possible cause (or not) if you can.
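Concretely, the two knobs discussed above look like this (the values are just starting points for the experiment, not recommendations):

    ; php.ini
    memory_limit = 512M

    # my.cnf (values along the lines of the bundled "huge" template)
    key_buffer_size  = 384M
    query_cache_size = 32M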
Performance of your Drupal website also depends on how well your hosting platform is tuned for Drupal. Drupal requires special optimization of LAMP stack components. You can try Drupal-specific hosting companies http://www.drupalspecific.com to make your website run faster.
I'm facing the Drupal slowness issue myself, but have a very different issue than the ones mentioned by others.
I disabled all the content, and also the Drupal header, for a Drupal page of a specific content type.
Still, the time taken by this page to load is above 20 secs!
I took the help of YSlow and Firebug's Net panel.
Upon looking at them, I noticed:
Each JS and CSS file inclusion takes 2 to 3 secs, and there are a fair number of inclusions happening; as a result it takes around 20 secs.
But I am not able to figure out why the JS and CSS inclusions take so much time (this includes normal Drupal core JS and CSS files as well).
I installed Acquia's Drupal distribution and am using it for my college intranet website. I configured it on Ubuntu Lucid Lynx Desktop edition running the latest XAMPP. I want to increase the performance of the website. My database server and web server are on the same machine.
Can anyone suggest methods to increase the performance on the following points?
What should the ideal hardware configuration be?
What parameters should I change in PHP for the best performance?
How can I optimize Apache and MySQL to get the best performance out of both?
Are there tweaks in Drupal which can make it faster?
Are there any additional packages, for caching etc., which can improve the speed?
Also, try Varnish if you're using PressFlow, as suggested by berkes. It helps a lot if you have to serve content for anonymous users.
Varnish can cache in memory all the content that Drupal produces, reducing hits to your web server and database.
Here a good start point for configuring Varnish with Pressflow:
https://wiki.fourkitchens.com/display/PF/Configure+Varnish+for+Pressflow
Google around for more details.
And don't forget about non-Drupal-related optimization, like reducing the number of HTTP requests and serving web page elements from different domains to work around the browser's per-domain connection limit. Use YSlow and follow Yahoo's excellent rules. Google for "yahoo Best Practices for Speeding Up Your Web Site" (can't include the link due to the SO limitation for new users).
This is not specific to Drupal, but applies to every PHP setup; more generally, to every web app. I advise you to start with O'Reilly's Building Scalable Websites.
See above. For Drupal, note the memory limit; many people just crank it up to ridiculous values, following the logic: Drupal needs more than 38MB, so I'll just give it 250MB, to be safe.
Again, see above. For Drupal, pay extra attention to the number of queries. If you focus on slow queries only, you may miss the single tiny query hammering your DB 100+ times per request.
Lots. My advice is to start by looking at Pressflow, an optimized Drupal. It has all the tweaks you are looking for built in, and more.
Yes, many, but start with memcached. And if you rely on search a lot, consider moving search to Solr.
Many more tips for starters can be found at Drupal performance Blog
The question you ask is very broad, so it is hard to give any specifics in answers. A good place to start is drupal's own handbook on performance tuning.
I would also highly recommend the boost module if your site serves largely anonymous users, as this allows requests to not even go to drupal and be served entirely from a static cache.
Drupal's Devel module has a Performance module that will log memory usage and access times to the Reports section of your site.
Use this to determine which pages on your site are slow.
Load xdebug (a PHP extension) and turn on the profiling feature. Make requests to your performance-intensive pages and it will create (very large) dumps of the entire request. Open up the cache file in a program like KCacheGrind or WinCacheGrind and you will be able to see every function call that Drupal made when building the page. From here you can see which parts are slowest and optimize them.
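Enabling the profiler is a few php.ini lines (a sketch for xdebug 2.x; the extension path and output directory will differ on your machine):

    zend_extension = /usr/lib/php5/xdebug.so
    ; Write a cachegrind.out.* file for every request...
    xdebug.profiler_enable = 1
    ; ...or, less invasively, only when the request carries XDEBUG_PROFILE:
    ; xdebug.profiler_enable_trigger = 1
    xdebug.profiler_output_dir = /tmp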
This should get you a good 30-80% improvement in performance if you have a slow site. In my experience, there's usually a few blocks or views that account for a huge part of any performance issues.
Pro Drupal 7 Development has a whole section on fine-tuning called "Optimizing Drupal".
I think you will find it quite interesting. It also discusses hardware architectures, which is of interest to you.
Regarding the 4th question, you can start by checking out the Boost module and disabling modules you are not using.
Additionally, for improving page performance you can enable page caching from Configuration -> Performance. On the same page you can enable "Aggregate and compress CSS (and JS) files into one"; this way you reduce the number of HTTP requests per page and the overall size of the downloaded page.
You should also check whether cron is set up. Not running cron can fill up the DB with logs, stale cache, and other "garbage".
A last suggestion is to convert your DB from MyISAM to InnoDB, but this requires some investigation because it is not always the case that InnoDB is faster: InnoDB loses less time to table locking, while MyISAM is faster at table reads.
The question, simply put, is the one in the title. Is it possible?
So far, my experience with scripting languages is that, to increase performance, you need to cache everything and later just serve the generated HTML files.
That's ok for some use cases, but when you really need to generate a new page in realtime, it's just impossible.
Drupal can take up to 3 seconds (or more!) to render some web pages (PHP execution time, not DB). That's crazy. Completely crazy.
If many projects (like Facebook) are using PHP, obviously the problem is mine. But googling for this problem shows that it's common. Too common.
(Of course I installed APC for PHP. It certainly helps, but PHP is still ultra-slow).
Must I assume this is the reality for Drupal / PHP?
Thanks.
Short answer is no. But why would you not want to cache?
What do you mean by 'generate a new page in realtime'? Authenticated users (anyone logged in) can see new content right away. Anonymous users may have to wait a little bit (if you are using Boost, for example), BUT, you can always control it, or flush it when new content is added. You should cache as much as you can.
You can install Boost (static HTML files), Memcache, and enable Drupal cache. It's encouraged, especially the last one. You can also run nginx on the server.
You can also try using Pressflow, a drop-in replacement for Drupal that will give you better performance.
http://pressflow.org/
It's been discussed many times: you can make Drupal extremely fast if you want to. Check out some of the 2bits articles:
http://2bits.com/contents/articles
Utilizing the available methods of caching will help you keep your hosting cost low, instead of throwing more hardware at an unoptimized site.
As you say, Facebook uses PHP, and they clearly have reason to need good performance. Their solution was to write their own compiler for PHP called HipHop, which they released as open source. If you're worried about PHP's performance, you should give it a try as it will definitely improve things.
The downside is that it doesn't (yet) cover 100% of the PHP function set, so some PHP programs may not compile. I don't know where Drupal fits into this, but it would be worth trying it out - there's nothing to be lost by doing a test compilation; if its not going to work, you won't have lost anything.
On a similar vein, there is a project in the Drupal community to convert parts of the Drupal Core into a PHP Extension, meaning that some key Drupal functions are then built-in to the PHP runtime as compiled code. See the project page here. But note that this is still in a fairly early stage of development: it's still listed as experimental, and only covers a small number of functions. It might be worth keeping an eye on the project, though.
According to http://groups.drupal.org/node/34076, yes you can get a < 200ms response time with Drupal without caching.
The tip that I've received from some friends regarding Drupal load performance is to install fewer than 40 modules.
Go beyond 40, especially with contrib modules that use a lot of hooks and memory, and performance decreases.
Other tips:
remove Imagecache UI and Views UI on the production site
if possible, move your .htaccess rules into vhost.conf so they are read only once, at Apache startup
use the Throttle module
use gzip for all HTML, CSS and JS files
use the CDN module and an Amazon server solution
use AJAX for some parts or blocks of your site
last, and if there is enough budget, migrate to Oracle
I want to know when building a typical site on the LAMP stack how do you optimize it for the best possible load times. I am picturing a typical DB-driven site.
This is a high-level look, so let me break the question down into each layer of the stack.
L - At the system level (setup and filesystem), what can you do to improve speed? One thing I can think of is image sizes; can compression here help optimize anything?
A - There have to be a ton of settings related to site speed here in the web server. Not my forte. It probably depends a lot on how many sites are running concurrently.
M - MySQL: in a database-driven site, DB performance is key. Is there a better normalization approach, e.g., using link tables? Web developers often just make simple monolithic tables resembling 1NF, and this can kill performance.
P - Aside from performance-boosting settings like caching, what can the programmer do to affect performance at a high level? I would really like to know if MVC design approaches hit performance more than quick-and-dirty code. Other simple tips, like whether sessions are faster than cookies, would be interesting to know.
Obviously you have to get down and dirty into the details and find what code is slowing you down. Also, I realize that many sites have many different performance characteristics, but let's assume a typical site that has more reads than writes.
I am just wondering if we can compile a bunch of best practices, and I fully expect people to link other questions so we can effectively work up a checklist.
My goal is to see if, in addition to the usual performance issues, some oddball things you might not think of crop up, to go along with a best-practices summary.
So my question is, if you were starting from scratch, how would you make sure your LAMP site was fast?
Here's a few personal must-dos that I always set up in my LAMP applications.
Install mod_deflate for apache, and do not use PHP's gzip handlers. mod_deflate will allow you to compress static content, like javascript/css/static html, as well as the usual dynamic PHP output, and it's one less thing you have to worry about in your code.

Be careful with .htaccess files! Enabling .htaccess files for directories in your app means that Apache has to scan the filesystem constantly, looking for .htaccess directives. It is far better to put directives inside the main configuration or a vhost configuration, where they are loaded once. Any time you can get rid of a directory-level access file by moving it into a main configuration file, you save disk access time.

Prepare your application's database layer to utilize a connection manager of some sort (I use a Singleton for most applications; a minimal sketch follows this list). It's not very hard to do, and reducing the number of database connections your application opens saves resources.

If you think your application will see significant load, memcached can perform miracles. Keep this in mind while you write your code... perhaps one day instead of creating objects on the fly, you will be getting them from memcached. A little foresight will make implementation painless.

Once your app is up and running, set MySQL's slow query time to a small number and monitor the slow query log diligently. This will show you where your problem queries are coming from, and allow you to optimize your queries and indexes before they become a problem.

For serious performance tweakers, you will want to compile PHP from source. Installing from a package installs a lot of libraries that you may never use. Since PHP environments are loaded into every instance of an Apache thread, even a 5MB memory overhead from extra libraries quickly becomes 250MB of lost memory when there's 50 Apache threads in existence. I keep a list of my standard ./configure line I use when building PHP here, and I find it suits most of my applications. The downside is that if you end up needing a library, you have to recompile PHP to get it. Analyze your code and test it in a devel environment to make sure you have everything you need.

Minify your Javascript.

Be prepared to move static content, such as images and video, to a non-dynamic web server. Write your code so that any URLs for images and video are easily configured to point to another server in the future. A web server optimized for static content can easily serve tens or even hundreds of times faster than a dynamic content server.
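As promised above, a minimal singleton connection manager can be as small as this (the class name and credentials are placeholders; adapt them to your app):

    class Db {
      private static $instance = NULL;

      // Every caller shares one PDO connection instead of opening its own.
      public static function get() {
        if (self::$instance === NULL) {
          self::$instance = new PDO('mysql:host=localhost;dbname=app',
            'user', 'secret');
        }
        return self::$instance;
      }
    }

    // Usage: $rows = Db::get()->query('SELECT 1')->fetchAll();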
That's what I can think of off the top of my head. Googling around for PHP best practices will find a lot of tips on how to write faster/better code as well (Such as: echo is faster than print).
First, realize that performance is an iterative process. You don't build a web application in a single pass, launch it, and never work on it again. On the contrary, you start small, and address performance issues as your site grows.
Now, onto specifics:
Profile. Identify your bottlenecks. This is the most important step. You need to focus your effort where you'll get the best results. You should have some sort of monitoring solution in place (like cacti or munin), giving you visibility into what's going on on your server(s)
Cache, cache, cache. You'll probably find that database access is your biggest bottleneck on the back end -- but you should verify this on your own. Fortunately, you'll probably find that a lot of your traffic is for a small set of resources. You can cache those resources in something like memcached, saving yourself the database hit, and resulting in better backend performance.
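The usual pattern is a read-through cache, sketched here with the pecl Memcache extension (the key scheme and load_from_database() are hypothetical stand-ins for your own code):

    $mc = new Memcache();
    $mc->connect('127.0.0.1', 11211);

    function get_article($mc, $id) {
      $key = 'article:' . $id;
      $article = $mc->get($key);
      if ($article === FALSE) {
        // Cache miss: pay the database cost once...
        $article = load_from_database($id);
        // ...then keep the result for 5 minutes (the 0 is the flags argument).
        $mc->set($key, $article, 0, 300);
      }
      return $article;
    }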
As others have mentioned above, take a look at the YDN performance rules. Consider picking up the accompanying book. This'll help you with front end performance
Install PHP APC, and make sure it's configured with enough memory to hold all your compiled PHP bytecode. We recently discovered that our APC installation didn't have nearly enough RAM; giving it enough to work with cut our CPU time in half, and disk activity by 10%.
Make sure your database tables are properly indexed. This goes hand in hand with monitoring the slow query log.
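A minimal my.cnf fragment for the slow query log might look like this (directive names as of MySQL 5.1; older versions used log_slow_queries, and the paths and threshold are illustrative):

    [mysqld]
    slow_query_log      = 1
    slow_query_log_file = /var/log/mysql/slow.log
    long_query_time     = 1    # log anything slower than one second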
The above will get you very far. That is to say, even a fairly db-heavy site should be able to survive a frontpage digg on a single modestly-spec'd server if you've done the above.
You'll eventually hit a point where the default apache config won't always be able to keep up with incoming requests. When you hit this wall, there are two things to do:
As above, profile. Monitor your apache activity -- you should have an idea of how many connections are active at any given time, in addition to the max number of active connections when you get sudden bursts of traffic
Configure apache with this in mind. This is the best guide to apache config I've seen: Practical mod_perl chapter 11
Take as much load off of apache as you can. Apache's too heavy-duty to serve static content efficiently. You should be using a lighter-weight reverse proxy (like squid) or webserver (lighttpd or nginx) to serve static content, and to take over the job of spoon-feeding bytes to slow clients. This leaves Apache to do what it does best: execute your code. Again, the mod_perl book does a good job of explaining this.
Once you've gotten this far, it's largely an issue of caching more, and keeping an eye on your database. Eventually, you'll outgrow a single server. First, you'll probably add more front end boxes, all backed by a single database server. Then you're going to have to start spreading your database load around, probably by sharding. For an excellent overview of this growth process, see this livejournal presentation
For a more in-depth look at much of the above, check out Building Scalable Web Sites, by Cal Henderson, of Flickr fame. Google has portions of the book available for preview
I've used MySQLTuner for performance analysis on my MySQL servers, and it's given good insight into further issues for googling, as well as making its own recommendations.
A resource you might find helpful is the YDN set of performance rules.
Don't forget the fact that your users will be thousands of miles away from your server, and downloading dozens of files to render a single page. That latency, and the overhead of rendering the page in their browsers can be larger than the amount of time that you spend collecting the information, and generating the page.
See the pages at Yahoo Developer Network about Best Practices for Speeding Up Your Web Site, and the YSlow tool for seeing what part of the downloading of the site is taking time.
Don't forget to turn off atime for your filesystem!
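That's a one-word change in /etc/fstab (the device, mount point, and filesystem type here are examples; remount or reboot afterwards):

    /dev/sda1  /var/www  ext3  defaults,noatime  0  2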
I'd recommend using Jet Profiler for MySQL to find any bad queries. I've successfully used it on a couple of my sites. Really helpful, and much easier to digest than the slow query log.
I'd recommend starting with http://highscalability.com/
As for your suggestions:
Compression for images: definitely not. Filesystem tuning: yes, that can have some effect, but it's minimal. The best option, actually, is an in-memory reverse proxy, or even better a CDN.
For Apache, basically only load the modules you need, and nothing else. Since you're running PHP, you can only use the forking (prefork) MPM, so it's important to keep it slim. As for optimal settings, well, you have to fine-tune them to the specific application, hardware, etc. If you have enough CPU, using mod_deflate is recommendable: the faster the server can send data to the client, the faster it can start processing the next request.