I was working on a Magento 1.7 site last night, which wasn't the fastest but was at least acceptable in terms of speed. I tried running a simple re-compile (System > Tools > Compilation), which took forever to complete (5+ minutes) so I cancelled it, flushed cache (2+ minutes), then re-compiled again, which took a while but eventually finished.
Now the site is running extremely slow: page load speeds > 8 seconds, and admin page load speeds are > 20 seconds.
I made a couple of other minor changes before recompiling: in Admin > Configuration > Checkout, I changed Redirect to Checkout from Yes to No for when a user clicks Add to Cart, and I made some basic code changes to alter the layout of the category list page.
I basically don't know where to start at this point. The site is http://www.vapetropolis.ca
Edit: Just thought of this - Wordpress was installed in a subdirectory of the site before compiling. Could this be the problem? Will try removing it and recompiling and see what happens...
Edit 2: Problem persists
Edit 3: Confirmed, compilation is part of the problem. After disabling compilation, the site runs much faster. Slower than when it was previously compiled and working, but much faster than the broken compiled state
My guess is that you cleared the cache during the recompile, and that it just needs time to start caching again. Think of it this way: a cache is stored memory in which something is held for frequent, fast access. Once you clear it, things will move more slowly until they are re-cached.
In addition, there are lots of things you can do to make this problem "less" apparent when it occurs.
First, recompiling is not actually going to help with speed unless you have some sort of PHP opcode cache, such as APC, installed server-side. Something like APC is a staple, and you will see increased performance and decreased load times.
1.7 is more of a pig than previous versions, but it seems to respond well to Varnish. Our implementation of Varnish full page cache saved about 70% on load times alone. If you can implement Varnish, this is a must.
For search and category pages, Solr is a great tool. It uses its own index (created by Magento) and does not use MySQL fulltext searching. This not only decreases load times on your search result pages; your category flats will fly as well.
Hardware -- Magento needs a decent amount of processing power, but RAM is essential when using tools like APC and Varnish, as they store their data in the machine's much faster RAM rather than on disk. Even though top may not indicate high RAM "usage", install the munin tools and look at the RAM reserved for those tools; I'll bet you are using close to all of it.
I understand that you are concerned about Magento moving slowly after a recompile. And my answer is "that's an expected result." -- By doing the above, you can dramatically reduce the effects of clearing your Magento Cache.
Solution!
I deleted the WordPress subdirectory that had been installed before compiling. The compiler must have choked on the non-Magento files. After deleting the directory, flushing all caches, reindexing all data, disabling compilation, and then recompiling, the site is now back up to speed.
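For reference, the same compiler operations can be run from the command line rather than through the admin UI, which avoids admin-page timeouts. A sketch assuming a stock Magento 1.x install, run from the Magento root (paths are illustrative):

```shell
# shell/compiler.php ships with Magento 1.x; run from the Magento root
php shell/compiler.php disable   # turn compilation off
php shell/compiler.php clear     # remove previously generated files from includes/src
php shell/compiler.php compile   # rebuild the compiled class files
php shell/compiler.php state     # confirm compiler status
```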
Try enabling the cache if it is not already on:
Go to System > Cache Management, select Enable from the drop-down (at the right side), and click Submit.
If it is already on, try flushing all of the cache types and then open your website again.
My website has a long response time (3 sec).
I tried deactivating all plugins one by one and checking, but the waiting time is still 2 sec.
I also use w3-cache and wp-cache on my live domain, but waiting time is 3 sec.
Today I downloaded the new WordPress, and configured all settings.
I did not add any themes or plugins and tested its speed; I see the WordPress waiting time is 2 sec.
It means my WordPress response time is long. How can I reduce it?
Some pretty high traffic sites run WordPress.
There are several layers in the technology stack that could be the problem. The most common bottlenecks are network speed, server resources (RAM/CPU), and database resources.
Since your initial page is very slow while your static resources load faster, I would probably eliminate network speed as an issue. Loading a static HTML test page will confirm whether your network speed is okay.
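One way to run that static-file check, assuming curl is available (the URL is a placeholder for a real test file on your host):

```shell
# Print the total transfer time for a URL; compare a static test page
# against your WordPress front page. example.com is a placeholder.
time_total () {
  curl -o /dev/null -s -w '%{time_total}' "$1"
}
printf 'static page: %ss\n' "$(time_total https://example.com/test.html)"
```

If the static page is fast but the WordPress front page is slow, the bottleneck is in PHP or MySQL, not the network.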
Next, install WP Super Cache on your plain install of WordPress using the default theme. Is it fast? If so, I would suspect your host's MySQL server is overloaded and overworked.
By default, most web hosts and WordPress aren't set up for 100% speed efficiency, but with small tweaks you can make massive improvements in load times.
I use GTmetrix to measure a website's speed and implement their suggestions to increase it, including:
Enable GZip compression via .htaccess
Defer JavaScript loading
Minify JS and CSS with Autoptimize
Reduce image sizes with Smush.it
Those are only a few of the options; GTmetrix will help you understand what is slowing your site down and how to speed it up.
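For the GZip item, a minimal .htaccess sketch (this assumes Apache with mod_deflate available; the MIME types listed are just common choices):

```apache
<IfModule mod_deflate.c>
  # Compress common text responses before sending them to the browser
  AddOutputFilterByType DEFLATE text/html text/css text/plain text/xml
  AddOutputFilterByType DEFLATE application/javascript application/x-javascript
</IfModule>
```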
A default WordPress install should be lightning fast if it's on an even remotely decent server. Try removing the database, recreating it, and reinstalling. Or, use phpMyAdmin to repair and optimize your tables. If this is not an option, try some janitorial DB queries like:
DELETE um FROM `wp_usermeta` um LEFT JOIN `wp_users` u ON (um.user_id = u.ID) WHERE u.ID IS NULL
(MySQL's multi-table DELETE syntax requires naming the target alias right after DELETE; adjust the `wp_` table prefix to match your install, and back up first.)
Same concept with posts & post meta, just different primary keys.
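For instance, the posts/post-meta variant might look like this (a sketch assuming the default wp_ table prefix; back up the database before running it):

```sql
-- Delete meta rows whose parent post no longer exists
DELETE pm FROM wp_postmeta pm
LEFT JOIN wp_posts p ON pm.post_id = p.ID
WHERE p.ID IS NULL;
```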
Also, if your server is not set up properly for caching (e.g. XCache or memcached not installed, properly configured, or loaded) and you're not running an uber-high-traffic site, you probably don't even need caching at all. If it ain't broke, don't fix it.
Or, if you have existing data that you cannot afford to lose, use WP Migrate DB to back up your existing install exporting to SQL file, follow steps above in regards to removing the original database, then import the database anew from phpMyAdmin after recreating.
Getting the desired performance from your WordPress sites can be tricky, as the WordPress platform does not seem to have been designed with performance in mind.
You can get the desired results by looking into performance-optimization techniques. These can be applied manually to your sites, but that can be burdensome for less technical WordPress users, especially if they just want a fast-performing WordPress blog.
There are really good plugins out there that make it easy, though some still require technical knowledge. I recommend these two plugins, as they are quite easy to configure for less technical users:
Speed up your website with JCH Optimize - this plugin makes it really easy to combine and minify CSS, JavaScript, and HTML. There are many more features in the paid version.
Cache your site with W3 Total Cache - one of the most popular caching plugins. A must if you want some extra speed.
Is there a PHP command that writes asynchronously (as in, I don't care when the data is written; it has very low priority)?
I am hosting 1,300 domains on a server. I know it's a lot. But each takes very little memory, very little CPU, very little bandwidth. The bottleneck is random writes. Too many random writes (and random read). The server is Linux.
I read here:
Hence, SYNC-reads and ASYNC-writes are generally a good way to go as
they allow the kernel to optimise the order and timing of the
underlying I/O requests.
There you go. I bet you now have quite a lot of things to say in your
presentation about improving disk-IO. ;-)
Most of the writes are just cache files. I lose nothing if one is wrong once in a while. Basically, I want to set up my system so that most reads and writes go to memory, and then have the server write data to disk in sequential blocks when it gets around to it.
I have a huge amount of memory that can be used as cache. On Windows I use SuperCache for this and everything loads faster. People tell me that Linux already does block caching. Well, the block caching isn't helping, and I am still researching how to make most writes go to memory first so that data is flushed only occasionally.
For now, I just want to modify my PHP program so that when it writes data it tells the kernel, "just write it sometime in the future; I can wait and I don't mind."
Is there an option for that? I could use SSDs, but they are small.
If most of the write activity is due to caching, then I would use either APC or memcached as a backend for the cache instead of the filesystem. Any writes that must be persisted to the filesystem can then be handled separately, without the competing cache I/O hitting the disk. The only consideration is that since your cache is now in memory, it will be emptied after a reboot; if that would cause a performance issue, you may want to implement a cache-warming process.
Most of the major PHP frameworks (e.g. Zend Framework) have memcached and APC available as storage adapters for their caching modules, so that would be my first choice. Failing that, you can use the memcached and APC extensions directly.
Another option is to look at implementing a queue that you can send the pending writes to. That queue can then be processed periodically by a scheduled task that performs the writes at an acceptable rate. I have used Zend_Queue in the past for a similar purpose.
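On the kernel side, the "write it sometime in the future" behaviour the question asks about can also be encouraged through the virtual-memory writeback settings. A sketch for /etc/sysctl.conf; the values are illustrative and should be tuned to your RAM size and your tolerance for losing unflushed data in a crash:

```
# Let dirty pages accumulate in RAM longer before the kernel writes them back
vm.dirty_background_ratio = 20      # background writeback starts at 20% of RAM dirty
vm.dirty_ratio = 50                 # writers block only once 50% of RAM is dirty
vm.dirty_expire_centisecs = 6000    # a dirty page may sit for 60s before writeback
vm.dirty_writeback_centisecs = 1500 # flusher threads wake every 15s
```

Apply with `sysctl -p`. This lets the kernel batch many small random writes into fewer, larger sequential ones.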
I have made a site with Drupal.
The site has 7,500 users, with roughly 20 to 50 anonymous and 2 to 10 logged-in users online at any time (not heavy traffic, I think).
The site is on a dedicated server. I have enabled the performance settings in the Drupal admin and also installed memcache and eAccelerator.
Looking at the query log from the Devel module, the site fires a total of 600 to 900 queries on each page.
After installing the path.inc patch to reduce the queries from drupal_lookup_path(), that dropped to around 400.
I have also made some positive changes to the MySQL configuration (my.cnf), but there are still many identical queries run from user_load() again and again.
I have 60 to 70 modules enabled, and all are useful; I can't remove any of them.
Still, the site is slow, taking approx 10 to 15 seconds per page, and I don't know why.
Is it because Drupal runs so much PHP code?
Is it because it fires so many queries on each page?
Would the InnoDB engine improve performance?
Please, any kind of suggestion is welcome.
400 queries per request is suicide (and even 50+ is a lot).
You should implement some HTML caching. My website generally doesn't even make a DB connection; it just serves the HTML cached in a file.
Some additional things to look into:
Install a tool like Yslow/PageSpeed to see how much of those 10-15s are client and server time.
Install XHProf (on a development site, not live) together with Devel to see which functions use the most time, and look into those first. Edit, now with link: http://groups.drupal.org/node/82889
Using Pressflow might help a bit, but since you are already using the path.inc patch, probably not so much.
You mentioned that you installed memcache. Did you also install the memcache module and configure the cache plugin to use memcache?
EDIT: Yes, switching to InnoDB can help. One of the main performance advantages of InnoDB is row-level locking (as opposed to table-level locking of MyISAM), which means that multiple INSERT/UPDATE queries against the same table won't block each other unless really necessary. However, InnoDB does not perform well at all out of the box, you really need to fine-tune your mysql configuration for your specific site. So this is a step that you should only take carefully and after testing and optimizing on a development site. There are various questions already on this site and elsewhere about InnoDB tuning...
Anything else than that is then site specific and depends on the modules you are using. But especially things like complex node_access setups and multiple languages (i18n!) tend to either cause slow queries and/or a lot of them.
Not all modules make use of the caching mechanisms you can switch on in the performance settings area. It would be worth trying to identify which ones are doing the most/slowest queries and attempting to get the developer(s) to improve them.
Alternatively, examine whether you could achieve things with fewer modules. Some modules do overlap somewhat in functionality, so you may be able to reorganise the way the site functions a bit.
Additionally, you need to look at whether your MySQL settings allow enough memory for these queries to be carried out. Most MySQL distributions come with different versions of my.ini labelled 'small', 'medium', 'huge', etc. Copy the 'huge' one to my.ini (back up the old one first) and restart the DB to see if maxing out all the cache sizes makes a difference. You may well have a bottleneck there, but it can be hard to work out which setting is causing it.
Same goes for PHP. Set memory_limit in php.ini to 500MB or something and see if it helps. Of course, you may not be able to do this, depending on your hosting arrangements, but it will eliminate one possible cause (or not) if you can.
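As a concrete starting point, these are the kinds of my.cnf values to experiment with. The numbers are illustrative for a dedicated box with a few GB of RAM, not recommendations for any specific site:

```ini
[mysqld]
key_buffer_size     = 256M   # MyISAM index cache
query_cache_size    = 64M    # cache for repeated identical SELECTs
query_cache_limit   = 2M     # don't cache single results bigger than this
tmp_table_size      = 64M    # in-memory temp table cap is the minimum of
max_heap_table_size = 64M    #   these two, so raise them together
table_cache         = 1024   # open table handles kept around
```

Change one group of settings at a time and re-measure, so you can tell which one actually moved the needle.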
Performance of your Drupal website also depends on how well your hosting platform is tuned for Drupal. Drupal requires special optimization of LAMP stack components. You can try Drupal-specific hosting companies http://www.drupalspecific.com to make your website run faster.
I am facing the Drupal slowness issue myself, but have a very different problem than the others mentioned.
I disabled all the content and also the Drupal header for a page of a specific content type.
Still, the time taken by this page to load is above 20 seconds!
I took the help of the YSlow and Net panels in Firebug.
Looking at them, I noticed:
The JS and CSS files each take 2 to 3 seconds to load, and there are a fair number of inclusions happening; as a result, the page takes around 20 seconds.
But I am not able to figure out why the JS and CSS inclusions take so much time (this includes normal Drupal core JS and CSS files as well).
Is Magento usually so terribly slow?
This is my first experience with it and the admin panel simply takes ages to load and save changes. It is a default installation with the test data.
The server where it is hosted serves other non-Magento sites super fast. What is it about the PHP code that Magento uses that makes it so slow, and what can be done to fix it?
I've only been tangentially involved in optimizing Magento for performance, but here's a few reasons why the system is so slow
Parts of Magento use an EAV database system implemented on top of MySQL. This means querying for a single "thing" often means querying multiple rows
There's a lot going on behind the scenes (application configuration, system config, layout config, etc.) that involves building up giant XML trees in memory and then "querying" those same trees for information. This takes both memory (storing the trees) and CPU (parsing the trees). Some of these, especially the layout tree, are huge. Also, unless caching is on, these trees are built up from files on disk on each request.
Magento uses its configuration system to allow you to override classes. This is a powerful feature, but it means that any time a model, helper, or controller is instantiated, extra PHP instructions need to run to determine whether the original class file or an override class file is needed. This adds up.
Besides the layout system, Magento's template system involves a lot of recursive rendering. This adds up.
In general, the Magento Engineers were tasked, first and foremost, with building the most flexible, customizable system possible, and worry about performance later.
The first thing you can do to ensure better performance is turn caching on (System -> Cache Management). This will relieve some of the CPU/disk blocking that goes on while Magento is building up its various XML trees.
The second thing you'll want to do is ensure your host and operations team has experience performance tuning Magento. If you're relying on the $7/month plan to see you through, well, good luck with that.
Further to Alan Storm's recommendations on caching, there's two things I'd specifically recommend you look into related to caching:
- Make sure caching is in memcached, rather than on disk.
I look after a couple of Magento installs, and once you get any sort of load on the system, memcached starts to perform much faster. And it's dead easy to change over (relative to doing other Magento stuff, at least!)
A good starting point is here: http://www.magentocommerce.com/boards/viewthread/12998/P30/ - but if you've not used memcached at all before, it's worth looking at some general info about it as well.
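For reference, switching Magento 1.x's cache backend to memcached is configured in app/etc/local.xml. A minimal sketch; the host and port shown are memcached's usual defaults, so adjust to your setup:

```xml
<config>
  <global>
    <cache>
      <backend>memcached</backend>
      <memcached>
        <servers>
          <server>
            <host><![CDATA[127.0.0.1]]></host>
            <port><![CDATA[11211]]></port>
            <persistent><![CDATA[1]]></persistent>
          </server>
        </servers>
      </memcached>
    </cache>
  </global>
</config>
```

Flush the existing cache after making the change so Magento starts writing to the new backend.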
- Enable template/view caching.
This is a good article: http://inchoo.net/ecommerce/magento/magento-block-caching/
There are good ones on the Magento site too (google "magento block caching"), but it's down at the moment.
To add my two cents on block caching: I'd advise you to create your own blocks in /app/code/local, extending the core ones and defining the cache parameters; name them xxx_Cache and then update your layout to use these blocks instead of the core ones. This way, you avoid losing your changes or breaking the system when you upgrade Magento.
If you haven't seen it yet, Magento and Rackspace teamed up to create a white paper on performance tuning Magento. It's excellent.
https://support.rackspace.com/whitepapers/building-secure-scalable-and-highly-available-magento-stores-powered-by-rackspace-solutions/
--- edit ---
Another great resource, newly available (Oct 2011) is:
http://www.sessiondigital.com/assets/Uploads/Mag-Perf-WP-final.pdf
(Thanks due to Alan Storm on this one.)
There is possibly also a very non-obvious reason why your admin interface is slow. Magento has a module named Mage_AdminNotification. Try disabling that extension. What it does is query magentocommerce.com for new update messages; if their servers are slow, your admin page waits on them, and is in effect slow because of the network lag while loading the external news. If you have locked down outgoing connections with a firewall, this can be even more frustrating, since the admin interface will wait for the timeout whenever it cannot reach magentocommerce.com.
To disable it: go to System -> Configuration, scroll to the bottom and hit Advanced(in the Advanced section). Now disable Mage_AdminNotification and save!
I only have superficial experience with Magento. I installed it on a shared grid-server and the page loading was dismal: ~5+ seconds. On a lark, I installed it on my dedicated server, which is optimized for CMS sites, and it felt very, very snappy.
My dedicated host was already running ~10 Joomla! sites and a vBulletin site.
My guess is it's just not going to be performant on shared hosting. The over-subscription just won't allow enough resources for Magento to run as it ought.
I'm more involved in the managed server optimization in my company but I may have a few tips for you. First, you can look at the code more closely using the code tracing feature of Zend server. It will allow you to see where and when the things get dirty.
I totally share benlumley's view regarding the cache. Most of the sites we host don't even have block caching enabled. This cache has to be explicitly called, not "assumed", so if your code doesn't yet take advantage of this mechanism, it's something you definitely want to try. If you have the EE version, you can enable the full-page cache to get the best out of the beast.
A reverse proxy will also help a lot. It'll cache the static resources, significantly lowering the pressure on the PHP interpretation stack of your front servers.
Don't forget to write the sessions and the Magento cache to a RAM disk. This will also definitely get you to another level of performance.
There's still a lot to be said here, but I'm running out of time. You should know that a good site, well coded on a 1.4.1 CE version, running on a 2x5650 Xeon + 16 GB RAM server with a reverse proxy on top, can take up to 50,000 unique visitors a day with smooth pages for everybody.
Switching from Apache to LiteSpeed helped us a lot. In addition to: Editing MySQL's settings, installing Fooman Speedster (module to compress/combine js and css files), and installing APC. Magento has also posted a white paper on how to get the best performance out of the enterprise edition, but it is equally applicable to the other versions: http://www.magentocommerce.com/whitepaper/
There are many reasons why your Magento shopping cart could be running slow, but no excuses, for there are a variety of ways to alleviate the problem and make it pretty darn fast. Enabling GZip by modifying your .htaccess file is a start. You can also install the Fooman Speedster extension. The type of server used will also determine the speed of your store. More tips and a better explanation here: http://www.interactone.com/how-to-speed-up-magento/
Magento is very slow because the database design is not very good. The code is a mess and very hard to update and optimize, so all optimization is done via caching instead of in the code.
On the other hand, it is a webshop with a lot of tools. So if you need a flexible webshop, just buy a very powerful server and you will be OK.
When I first installed, I had pages that were taking 30 seconds to load. My server was not maxed out on RAM or processor, so I didn't know what to do. Looking at Firebug's Net panel, each page was loading about 100 files, and each one took a long time to connect. After installing Fooman Speedster and the gzip rules in .htaccess, load times were down to 3 seconds, like they had been on other shopping carts on my server.
It also comes down to functionality versus performance.
Raw performance is gained using nginx, PHP-FPM, memcached, APC, and a properly designed server.
Conveniences like Plesk add their own overhead, so Magento performance is best managed by taking the entire infrastructure into perspective when designing the hosting environment.
I am working on a Drupal-based site and notice there are a lot of separate CSS and JS files. Wading through some of the code, I can also see quite a few cases where many queries are used.
What techniques have you tried to improve the performance of Drupal and what modules (if any) do you use to improve the performance of Drupal 'out of the box'?
Going to the admin/settings/performance page and turning on CSS and JS aggregation, plus page caching with a minimum lifetime of 1 minute, will give you an immediate boost on a high-traffic site. If you're writing your own code and doing any queries, consider writing your own discrete caching for expensive functions. The linked article covers Drupal 5, not 6, but the only changes in D6 are the elimination of the serialization requirement and the function signatures of cache_set() and cache_get(). (Both noted in comments on the article.)
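A minimal sketch of that discrete-caching pattern in Drupal 6 (the functions beginning with mymodule_ are hypothetical placeholders for your own code):

```php
<?php
function mymodule_expensive_report() {
  // Serve from the cache table when a fresh copy exists.
  if ($cached = cache_get('mymodule:report', 'cache')) {
    return $cached->data;
  }
  $data = mymodule_build_report();  // hypothetical slow work
  // In D6 the caller no longer needs to serialize the data;
  // here the entry expires one hour from now.
  cache_set('mymodule:report', $data, 'cache', time() + 3600);
  return $data;
}
```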
On large scale sites also consider dropping a memcached server onto the network: Using the memcached module, you can entirely bypass the drupal database for cached data. If you have huge amounts of content and search is a hot spot, you can also consider using lucene/solr as your search indexer instead of drupal's built-in search indexer. It's nice for a built-in indexer but it's not designed for heavy loads (hundreds or thousands of new pieces of content an hour, say, with heavy faceted searching). The apache solr module can tie in with that.
If you're making heavy use of Views, be sure that you've checked the queries it generates for unindexed fields; sorting and filtering by CCK fields in particular can be slow, because CCK doesn't automatically add indexes beyond the primary keys. In D6, preview the View in the admin screen, copy the text of the query, and run it through EXPLAIN in mysql or whatever query analysis tools you have.
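For example, if a View sorts on a CCK date field and EXPLAIN shows a full table scan, an index along these lines can help (the table and column names are hypothetical; check your own schema first):

```sql
-- Index the CCK value column used by the View's sort/filter
ALTER TABLE content_field_event_date
  ADD INDEX field_event_date_value_idx (field_event_date_value);
```

Re-run EXPLAIN afterwards to confirm the query now uses the new index.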
Tools like YSlow and Firebug can also help you spot slow stuff like massive image files, JS hosted on remote servers, and so on.
Drupal 6, out-of-the-box, provides css and javascript aggregation --- most css and js files will be combined into a single file (and thus a single HTTP request), and are also whitespace-shortened (to reduce bandwidth consumption). You can enable this under /admin/settings/performance .
Also on that screen are controls for Drupal's (very effective) cache, which helps reduce the number of database queries.
Additionally, because Drupal (and all the modules you'll probably have installed) has a ton of PHP source, using a PHP opcode cache such as APC helps significantly decrease the request time.
I strongly second Benedict's recommendation of the Boost module - it alone will make your website fly on shared hosting, if configured correctly, and is not really buggy at all.
Turn on CSS/JS aggregation, turn on Boost, and your site can perform very well for anonymous users.
If your site deals with mostly logged-in users, you're going to have to do a lot more work making sure sessions are cached well, and probably consider using memcached and more SQL query caching.
The biggest performance gains always come from caching, but monitoring and adjusting your slow queries, tuning your Apache and PHP configurations, and being smart about the modules you use are very important as well.
Other than the default Drupal cache, there are some other methods to increase performance.
The Boost module is one of the best.
Memcache, Varnish (Drupal 7/Pressflow), and a CDN are other methods that can improve performance.
It's also worth mentioning the performance boost from using quality SSD storage. It has consistently cut my initial response load times by 30% or more when migrating to SSD (I went from about ~450ms to ~250ms on my last project, using identical Apache/Memcache/Cloudfront EC2 configs), not to mention how much nicer it is to manage a snappy server where every command or script you throw at it is nearly instantaneous. I will never go back.