I have a simple PHP page that requests a list of addresses from a MySQL database. The table has 1257 entries. I also include a dynamically loaded side menu for browsing to other pages.
In total I have 5 MySQL queries:
The addresses
Pagination
Check whether the user has permission to browse
Get all the groups for the side menu
Get all sub-entries for the side menu
The site takes about 5 seconds to load.
I Googled for ways to improve site load time and found the Google Developer Tools with PageSpeed. I made all the improvements it suggested, like enabling deflate and changing the banner size, but the loading time is still nearly the same. I would like to know if this is common or if there is anything else I can do to improve it.
EDIT: I have also indexed the columns and enabled the MySQL query cache. The sub-entries table also uses foreign keys that reference the menu group table.
EDIT2: I found the solution. The problem was that I used localhost to connect to my DB, but since I'm using Windows 7 it tried to connect via IPv6. I changed every localhost to 127.0.0.1 and the page now takes only about 126 ms to load.
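For reference, the fix amounts to swapping the host string in the connection call; a minimal mysqli sketch with placeholder credentials:

    // Connecting via the IPv4 loopback address avoids the slow IPv6 lookup
    // that "localhost" can trigger on some Windows setups.
    // "dbuser", "secret" and "mydb" are placeholder values.
    $mysqli = new mysqli('127.0.0.1', 'dbuser', 'secret', 'mydb');
    if ($mysqli->connect_errno) {
        die('Connection failed: ' . $mysqli->connect_error);
    }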
First of all, find out what is taking the page so long to load, using the browser's developer tools (network/console tab). If the cause of the delay is on the server side, e.g. the HTML itself takes a long time to generate, then check the following:
Try to log slow MySQL queries and make sure that you have none (a sample configuration sketch follows at the end of this answer):
http://dev.mysql.com/doc/refman/5.0/en/slow-query-log.html
If you really have some expensive calculations going on (which is not likely in your case), try to cache their results.
Don't forget about the benefits of PHP code accelerators like APC, and MySQL optimizations (query cache, etc.).
... There are many other ways to speed things up, but you have to profile the app itself and see what's going on.
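On the slow-query-log point above, a minimal my.cnf sketch for enabling it (option names assume MySQL 5.1 or later; on 5.0 the option is log-slow-queries, and the log file path here is just a placeholder):

    [mysqld]
    slow_query_log      = 1
    slow_query_log_file = /var/log/mysql/slow.log
    long_query_time     = 1    # log anything that takes longer than 1 second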
Have you indexed the columns used in the WHERE condition? If not, please index those columns and check again.
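As a purely hypothetical example (the table and column names are made up), adding an index on a column used in the WHERE clause and verifying it with EXPLAIN:

    ALTER TABLE addresses ADD INDEX idx_user_id (user_id);
    -- the query plan should now show the index being used instead of a full scan ("ALL")
    EXPLAIN SELECT * FROM addresses WHERE user_id = 42;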
Related
Currently I am using shared hosting for my site, but we now have nearly 1,100,000 rows in one of the tables, so the webpage takes a lot of time to load. We want to implement database caching techniques like APC or memcache for our site, but on shared hosting we don't have those facilities available; we only have eAccelerator, and eAccelerator does not cache DB calls, if I'm not wrong. Considering all these points we want to move to a VPS. In that case, which database caching technique should we use to decrease the page load time, APC or memcache? Please advise on the VPS and on which of the two caching techniques is better.
We have a similar website and we use APC.
APC will cache the opcodes as well as the HTML that is generated. This helps to avoid unnecessary regeneration of the page.
You should also enable the MySQL query cache to cache the results of your queries.
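Beyond opcode caching, APC also offers a user-data cache (apc_fetch / apc_store) that can hold query results; a minimal sketch, with a made-up query and key, assuming $pdo is an existing PDO connection:

    // Try the APC user cache first, fall back to the database on a miss.
    $key = 'articles_front_page';
    $rows = apc_fetch($key, $hit);
    if (!$hit) {
        $stmt = $pdo->query('SELECT id, title FROM articles ORDER BY id DESC LIMIT 50');
        $rows = $stmt->fetchAll(PDO::FETCH_ASSOC);
        apc_store($key, $rows, 300); // cache for 5 minutes
    }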
I had a task where I needed to fetch rows from a database table that had more than 100,000 records on a scrollable page. What I did was fetch the first 50 records and cache the next 50 in the first call. On scroll-down events an AJAX request checked whether the data was available in the cache; if not, it fetched it from the database and also cached the following 50. It worked pretty well and solved the inconvenient load time.
If you have a similar scenario you might benefit from this approach.
PS: I used memcache.
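A minimal sketch of that prefetch idea with the Memcached extension (the server address, table name, key scheme and block size are assumptions, and $pdo is an existing PDO connection):

    // assumes memcached is running locally
    $mc = new Memcached();
    $mc->addServer('127.0.0.1', 11211);

    // fetch one block of 50 rows, preferring the cache over the database
    function fetchBlock($pdo, $mc, $page, $size = 50) {
        $key  = 'rows_block_' . $page;
        $rows = $mc->get($key);
        if ($rows === false) {
            $offset = $page * $size;
            $rows = $pdo->query(sprintf(
                'SELECT * FROM records ORDER BY id LIMIT %d, %d', $offset, $size
            ))->fetchAll(PDO::FETCH_ASSOC);
            $mc->set($key, $rows, 300); // keep for 5 minutes
        }
        return $rows;
    }

    // serve the requested block and pre-warm the cache for the next one
    $page = isset($_GET['page']) ? (int) $_GET['page'] : 0;
    $data = fetchBlock($pdo, $mc, $page);
    fetchBlock($pdo, $mc, $page + 1);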
From your comment I take it you're doing a LIKE %..% query and want to paginate the result. First of all, investigate whether FULLTEXT indices are an option for you, as they should perform better. If that's not an option, you can add a simple cache like so:
Treat each unique search term as an id, i.e. if in your URL you have ..?search=foobar, then "foobar" is the id of the result set. Keep that in all your links, e.g. ..?search=foobar&page=2.
If the result set does not yet exist (see below), create it:
Query the database with your slow query.
Get all the results into an array. Don't overdo it, you don't want to be storing hundreds of megabytes.
Create a unique filename per query, e.g. sha1($query), or maybe sha1(strtolower($query)).
Serialize the data and store it in the file.
Get the data from the file, unserialize it, display the portion of the array corresponding to the requested page.
Occasionally, delete old cached results. You can do that with something like if (rand(0, 100) == 1) .., which will run the cleanup job every 100 queries on average. Strike a balance between server load and data freshness. Cache invalidation is a topic whole books can be written about, BTW.
That's a simple poor man's cache implementation. It's not great, but if you have absolutely nothing else to work with, it's better than running slow queries over and over.
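A rough sketch of that poor man's cache (the cache/ directory, page size and the query itself are placeholders; $pdo is an existing PDO connection and error handling is omitted):

    $search  = isset($_GET['search']) ? $_GET['search'] : '';
    $page    = isset($_GET['page']) ? max(1, (int) $_GET['page']) : 1;
    $perPage = 20;

    $cacheFile = __DIR__ . '/cache/' . sha1(strtolower($search)) . '.dat';

    if (is_file($cacheFile)) {
        $results = unserialize(file_get_contents($cacheFile));
    } else {
        // the slow LIKE query
        $stmt = $pdo->prepare('SELECT id, name FROM addresses WHERE name LIKE ?');
        $stmt->execute(array('%' . $search . '%'));
        $results = $stmt->fetchAll(PDO::FETCH_ASSOC);
        file_put_contents($cacheFile, serialize($results));
    }

    // display only the slice for the requested page
    $rows = array_slice($results, ($page - 1) * $perPage, $perPage);

    // occasional cleanup: drop cache files older than a day, roughly every 100th request
    if (rand(0, 100) == 1) {
        foreach (glob(__DIR__ . '/cache/*.dat') as $f) {
            if (filemtime($f) < time() - 86400) {
                unlink($f);
            }
        }
    }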
APC is the Alternative PHP Cache and works only with PHP, whereas Memcached works independently with any language.
I am running an application (built on PHP & MySQL) on a VPS. I have an article table which has millions of records in it. Whenever a user logs in, I display the last 50 records for each section.
So every time a user logs in or refreshes the page, it executes an SQL query to get those records. Now there are lots of users on the website, and because of that my page speed has dropped significantly.
I did some research on caching and found that I can read the MySQL data per section and number of articles, e.g. (section = 1 and number of articles = 50), and store it in a disk file, cache/md5(section no.).
Then, in the future, when I get a request for that section, I just get the data from cache/md5(section no).
The above solution looks great, but before I go ahead I would really like to clarify a few doubts with the experts:
Will it really speed up my application? (I know disk I/O is faster than a MySQL query, but I don't know by how much.)
I am currently using pagination on my page, e.g. display the first 5 articles, and when the user clicks "display more" display the next 5 articles, etc. This can easily be done in a MySQL query, but I have no idea how I should do it if I store all 50 records in a cache file. If someone could share some info that would be great.
Any alternative solution if you believe the above will not work?
Any open-source application (PHP) if you know of one?
Thank you in advance
Regards,
Raj
I ran into the same issue where every page load results in 2+ queries being run. Thankfully they're very similar queries being run over and over so caching (like your situation) is very helpful.
You have a couple options:
offload the database to a separate VPS on the same network to scale it up and down as needed
cache the data from each query and try to retrieve from the cache before hitting the database
In the end we chose both, installing Memcached and its PHP extension for query caching purposes. Memcached is a key-value store (much like PHP's associative array) with a set expiration time measured in seconds for each value stored. Since it stores everything in RAM, the tradeoff for volatile cache data is extremely fast read/write times, much better than the filesystem.
Our implementation was basically to run every query through a filter; if it's a SELECT statement, cache it by setting the Memcached key to "namespace_[md5 of query]" and the value to a serialized version of an array with all resulting rows. Caching for 120 seconds (2 minutes) should be more than enough to help with the server load.
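A simplified sketch of that filter (the 'myapp_' namespace prefix and 120-second TTL are placeholders; $pdo is an existing PDO connection and $mc an existing Memcached instance):

    function cachedSelect($pdo, $mc, $sql, array $params = array()) {
        // only SELECT statements go through the cache
        if (stripos(ltrim($sql), 'SELECT') !== 0) {
            $stmt = $pdo->prepare($sql);
            $stmt->execute($params);
            return $stmt->rowCount();
        }
        $key  = 'myapp_' . md5($sql . serialize($params));
        $rows = $mc->get($key);
        if ($rows === false) {
            $stmt = $pdo->prepare($sql);
            $stmt->execute($params);
            $rows = $stmt->fetchAll(PDO::FETCH_ASSOC);
            $mc->set($key, $rows, 120); // cache for two minutes
        }
        return $rows;
    }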
If Memcached isn't a viable solution, store all 50 articles for each section as an RSS feed. You can pull all articles at once, grabbing the content of each article with SimpleXML and wrapping it in your site's article template HTML, as per the site design. Once the data is there, use CSS styling to only display X articles, using JavaScript for pagination.
Since two processes modifying the same file at the same time would be a bad idea, have the act of adding a new story to a section trigger an event which adds the story to a message queue. That message queue would be processed by a worker which does two consecutive things (a sketch follows this list), also using SimpleXML:
Remove the oldest story at the end of the XML file
Add a newer story given from the message queue to the top of the XML file
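A rough sketch of that worker step, assuming a conventional RSS layout (a single channel containing item elements, newest first) and that $story is the array handed over by the queue; it uses DOMDocument rather than SimpleXML for the prepend, since SimpleXML cannot insert a node before existing ones:

    $doc = new DOMDocument();
    $doc->load('/path/to/section-1.xml');          // placeholder feed path
    $channel = $doc->getElementsByTagName('channel')->item(0);
    $items   = $channel->getElementsByTagName('item');

    // 1. remove the oldest story at the end of the file
    if ($items->length > 0) {
        $channel->removeChild($items->item($items->length - 1));
    }

    // 2. add the newest story from the message queue to the top
    $new = $doc->createElement('item');
    $new->appendChild($doc->createElement('title', htmlspecialchars($story['title'])));
    $new->appendChild($doc->createElement('description', htmlspecialchars($story['body'])));
    $channel->insertBefore($new, $items->item(0));

    $doc->save('/path/to/section-1.xml');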
If you'd like, RSS feeds according to section can be a publicly facing feature.
I'm going crazy trying to solve a VERY weird error in a PHP script (Joomla). I have a page that displays multiple select dropdown lists, all of them showing the same list of items (the selected item changes from one list to another, but the listed items are the same). This list has around 35-40 items. This works fine up to a certain number of selects, but when I put more than 20 or 25 selects on the same page it doesn't work and shows only a white page. No errors, no text displayed, no errors in the PHP logs, nothing; just a white page. If, using THE SAME CODE, I display 11 dropdown select lists... it works.
I'm guessing that this problem is related to memory or something like that, but I can't be sure because, as I've said, there are no errors displayed. Does anyone know about a similar issue? Can anyone give me a tip about how to address this problem? I don't know what to do; I've tried many things but it still doesn't work. Any help will be very much appreciated and welcomed...
NOTE: The select lists are filled with values from a DB table, and each select list has a different selected item based on contents from another table. It's not very complex code and, as I've said, it works fine when I use fewer select lists on the same page. The problem appears when I reach a certain number of select lists on the same page (I think it's around 20 or 25 input selects). I don't think the amount of data is exaggerated, so I can't understand why it doesn't work!?
A quick Google for your issue turns this up:
For jos_session, which is the only table I suggested you empty: any logged-on users will be logged off, and any work in progress (forum posts/articles) would be lost...
I also empty recaptcha...
Please remember to always back up your db first.
I empty these two once a week for a higher-volume Joomla 1.5 system. We also set the session lifetime (no activity) to somewhere between 60 and 90 minutes; it's a 6-7k/day volume site. This also helps our Akeeba backup, as the two aforementioned tables can get very large without proper maintenance.
Just some general ramblings...
You should also review your MySQL site report via phpMyAdmin's "Show MySQL runtime information". Look for things that are in 'red'.
As for your overall question about performance: please remember that there are many ways to improve website performance. It's yet another job required of site administrators, and at least it's an interesting process.
The Joomla performance forum is a great place to have your site reviewed and to get good help tuning it, including the minimum base server you need (shared/VPS/dedicated).
IMHO, the first objective is to turn off the Joomla cache and Joomla gzip and instead enable standard server modules like mod_deflate and mod_expires (mod_expires is one of the best fixes for returning visitors). Make sure your MySQL configuration enables the query_cache, or can be set to. You will need a minimum of a VPS. And there's more! Haha.
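A minimal .htaccess sketch for those two modules (the MIME types and lifetimes here are just examples, and both modules must already be enabled on the server):

    <IfModule mod_deflate.c>
        AddOutputFilterByType DEFLATE text/html text/css application/javascript
    </IfModule>
    <IfModule mod_expires.c>
        ExpiresActive On
        ExpiresByType image/png              "access plus 1 month"
        ExpiresByType text/css               "access plus 1 week"
        ExpiresByType application/javascript "access plus 1 week"
    </IfModule>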
A little note about running on a shared server and not having certain server modules available:
check this out: http://www.webogroup.com/ It's really one heck of a product. I used it on the aforementioned site until I could implement the changes on the server. As I implemented each new server module I turned off the corresponding Webo feature... the site is now boring fast.
have fun
I have a website that lets each user create a webpage (to advertise his product). Once the page is created it will never be modified again.
Now, my question: is it better to keep the page content (only a few parts are editable) in a MySQL database and generate the page with queries every time it is accessed, or to create a static webpage containing all the info and store it on the server?
If I store every page on disk, I may end up with around 200,000 files.
If I store each page in the MySQL database, I would have to run a query each time the page is requested, and with around 200,000 entries and 5-6 queries/second I think the website will be slow...
So what's better?
MySQL will be able to handle the load if you create the tables properly (normalized and indexed). But if the content of the page doesn't change after creation, it's better if you cache the page statically. You can organize the files into buckets (folders) so that one folder doesn't have too many files in it.
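A small sketch of that bucketing idea (the directory layout, page-ID scheme and staticCachePath helper are all made up, and $generatedHtml stands for the rendered page): derive the folder from a hash of the page ID so the files spread evenly across directories.

    // e.g. page 4711 -> cache/1a/4711.html (the "1a" bucket comes from the hash)
    function staticCachePath($pageId) {
        $bucket = substr(md5((string) $pageId), 0, 2);   // 256 possible buckets
        $dir = __DIR__ . '/cache/' . $bucket;
        if (!is_dir($dir)) {
            mkdir($dir, 0755, true);
        }
        return $dir . '/' . $pageId . '.html';
    }

    // write the generated page once, then serve the file on later requests
    file_put_contents(staticCachePath(4711), $generatedHtml);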
Remember to cache only the content areas and not the templates. Unless each user has complete control over how his/her page shows up.
200,000 files writable by the Apache process is not a good idea.
I recommend using a database.
Database imports/exports are easier, not to mention the difference in maintenance costs.
Databases use caching, and if nothing has changed they will pull up the last result without running the query again. This doesn't stand; thanks JohnP.
If you want to redesign your webpages sometime later, you should be using MySQL to store them, as you can't really change them (unless you dig into regexps) after making them static.
About the time issue: it's not an issue if you set your indexes right.
If the data is small to moderate, prefer static hardcoding, i.e. putting the data in the HTML; but if it is huge, computational, or dynamic and changing, you have no option but to connect to the database.
I believe that a proper caching technique with the right attributes (a long expiration time) would be better than static pages or retrieving everything from MySQL every time.
Static content is usually a good thing if you have a lot of traffic, but 5-6 queries a second is not hard for the database at all, so with your current load it doesn't matter.
You can spread the static files across different directories by file name and set up rewrite rules in your web server (mod_rewrite on Apache, basic location matching with regexps on Nginx, and similar on other web servers). That way you won't even have to invoke the PHP interpreter.
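As a rough illustration of that idea with mod_rewrite (the cache/ directory, URL scheme and generate.php script are all assumptions): serve the pre-generated file when it exists and fall back to PHP otherwise.

    RewriteEngine On
    # if a pre-generated file exists under cache/, serve it directly
    RewriteCond %{DOCUMENT_ROOT}/cache/$1.html -f
    RewriteRule ^page/(.+)$ /cache/$1.html [L]
    # otherwise let PHP build (and store) the page
    RewriteRule ^page/(.+)$ /generate.php?slug=$1 [L,QSA]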
A database and proper caching. 200,000 pages times, what, 5 KB? That's 1 GB. Easy to keep in RAM. Besides, 5-6 queries per second is easy on a database. Program first, then benchmark.
// insert quip about premature optimisation
I am looking to display 60,000 records on a webpage with PHP pulling the records from a MySQL database on localhost. These 60,000 records may change depending on the data input.
The records have 5 text fields, and due to the sheer number of records a significant amount of time is taken to send the data from the MySQL server to the web browser. Even on localhost, the time taken is around 15 seconds. During this time, the page is empty.
I would like to seek a professional opinion on how to either:
1. display the data in an alternative way (though I'm not sure what way), or
2. hasten the sending of data from the MySQL server to the web browser using caching technology like memcache.
In the end I will be deploying the application on the internet, where the lag would be immensely unacceptable (i.e. > 15 seconds).
Thank you and Best Regards!
I would suggest trying AJAX pagination. No user will be able to see and analyze 60k records at one time. You can have PHP display the first x records (however many fit on an average screen or two) to fill 2-3 pages, and have JavaScript listen for a scroll change. If a user starts scrolling down, have it automatically query the next y records and add them to the display list, possibly also removing the records from the top of the list.
Also, adding some quick-jump links or a search feature could help, as you wouldn't want to scroll down 60k records to make changes.
This will significantly lighten the server and client load, as it would only have to serve up a couple hundred records at a time.
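A minimal sketch of the server side of that pagination endpoint (the records.php name, table, field list and page size are made up; $pdo is an existing PDO connection); the JavaScript scroll listener would simply request records.php?offset=N and append the returned rows:

    // records.php - returns one "page" of rows as JSON for the infinite-scroll list
    $offset = isset($_GET['offset']) ? max(0, (int) $_GET['offset']) : 0;
    $limit  = 200; // a couple hundred rows per request keeps responses small

    $stmt = $pdo->query(sprintf(
        'SELECT id, field1, field2, field3, field4, field5 FROM records ORDER BY id LIMIT %d, %d',
        $offset,
        $limit
    ));

    header('Content-Type: application/json');
    echo json_encode($stmt->fetchAll(PDO::FETCH_ASSOC));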
DataTable
You should have a look at YUI's DataTable. You should hook the DataTable up to autocomplete. There is also an example of how they did it in YUI2 (help), but YUI3 is a lot faster.
Caching
Caching is also important. You say you could use memcached, so that is very good. I am a big fan of Redis (both will work, but the nice thing is that Redis is, I think, better suited for autocomplete). There is even a free plan of Redis To Go.
Another important tip is to make sure you are getting your data from the database in the form you want it displayed. In other words, if there is any calculation or processing that you have to do, avoid doing it in PHP code inside loops. Use SQL functions to process data, name fields, etc. Databases are good at that sort of thing. Of course, this may or may not apply to exactly what you're doing.
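For instance, a hedged example of pushing formatting into the query instead of a PHP loop (the table and column names are hypothetical; $pdo is an existing PDO connection):

    // instead of concatenating/formatting each row in PHP...
    $stmt = $pdo->query(
        "SELECT CONCAT(first_name, ' ', last_name) AS full_name,
                DATE_FORMAT(created_at, '%Y-%m-%d') AS created
         FROM records
         ORDER BY created_at DESC
         LIMIT 50"
    );
    foreach ($stmt as $row) {
        // $row['full_name'] and $row['created'] arrive ready to display
        echo $row['full_name'], ' - ', $row['created'], PHP_EOL;
    }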