I have a WordPress plug-in, and users are requesting a view counter feature.
I only know a few approaches to building a view counter, and the problem is that I want to optimize for performance and memory usage.
I have done a small amount of research, and it seems mod_log_mysql may be a good approach, but I have no prior knowledge of how this module works, nor any idea how to connect it to the WordPress plug-in.
Or I could use the database: when the page is viewed, an UPDATE or an INSERT (which is said to be faster than an UPDATE) occurs.
So my options are:
1. UPDATE/INSERT on the server side when the page is requested.
2. Research mod_log_mysql further and find a way to connect it with the plug-in.
3. Find a premade view counter.
If there is a better approach, I would like to hear it, in the hope that it will solve my problem.
It really depends on what you want to achieve and how much time you want to spend on it.
If you need more than a simple view counter per page/event, go for a premade one.
If you need something simple, I would go for option #1.
If you're worried about performance, use a MEMORY table for staging the counts, and then have a PHP script move them into a regular table periodically (e.g. via a cron job). I wouldn't expect updating a view counter in a memory table to have any significant performance impact.
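As a rough sketch of that staging setup (table and column names are just placeholders):

```sql
-- Staging table held entirely in memory; cheap to update on every page view.
CREATE TABLE view_counts_staging (
    post_id INT UNSIGNED NOT NULL,
    views   INT UNSIGNED NOT NULL DEFAULT 0,
    PRIMARY KEY (post_id)
) ENGINE=MEMORY;

-- Permanent table that reports are run against.
CREATE TABLE view_counts (
    post_id INT UNSIGNED NOT NULL,
    views   INT UNSIGNED NOT NULL DEFAULT 0,
    PRIMARY KEY (post_id)
) ENGINE=InnoDB;

-- On every page view:
INSERT INTO view_counts_staging (post_id, views) VALUES (123, 1)
ON DUPLICATE KEY UPDATE views = views + 1;

-- Periodically, from a cron-driven PHP script: merge and reset the staging table.
-- (Not atomic as written; in practice you might rename the staging table first
-- so views arriving between the two statements aren't lost.)
INSERT INTO view_counts (post_id, views)
SELECT post_id, views FROM view_counts_staging
ON DUPLICATE KEY UPDATE views = views + VALUES(views);
DELETE FROM view_counts_staging;
```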
Option #2 could easily fall into the premature-optimization category.
Option 1 by far seems the easiest and probably most efficient. There is little overhead associated with making a single call to a database whose connection is already open from other operations done on the page previously.
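For example, here is a minimal sketch of option #1 as a WordPress hook; the table name, column layout and choice of hook are assumptions to adapt to your plug-in:

```php
<?php
// Count a view for single posts; assumes a custom table with post_id as its
// primary key, e.g. wp_myplugin_views (post_id INT PRIMARY KEY, views INT).
function myplugin_count_view() {
    if ( ! is_single() ) {
        return; // only count single post views in this sketch
    }

    global $wpdb;
    $post_id = get_queried_object_id();

    // INSERT the first time this post is seen, otherwise bump the counter.
    $wpdb->query( $wpdb->prepare(
        "INSERT INTO {$wpdb->prefix}myplugin_views (post_id, views)
         VALUES (%d, 1)
         ON DUPLICATE KEY UPDATE views = views + 1",
        $post_id
    ) );
}
add_action( 'template_redirect', 'myplugin_count_view' );
```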
What is the best practice for storing page views in a database?
Is it costly to update the page view value every time the page is loaded?
And are there any possible errors or threats if the page view is being updated at the same time that someone else is modifying the same row's data?
For example, a table Item with the columns ID, Name, Description and PageView.
I understand it might not be a very big issue here, but the same applies to other data that updates very often, so I would like to know the best approach for handling it.
The scenario that came to mind was: if a lot of pages are being served simultaneously, will there be any performance issue?
Pardon my English, and many thanks in advance.
The best approach is to write a new row into the table for every view, rather than keeping a rolling tally, which could lead to locking problems on a large system. For additional performance, you can add rows using the INSERT DELAYED syntax. This will allow the DB handle to return immediately, and your script won't wait for the insert to complete. Documentation here:
http://dev.mysql.com/doc/refman/5.5/en/insert-delayed.html
It might also be worth looking at the ARCHIVE storage engine, which is aimed specifically at this type of logging. You can only INSERT and SELECT data, but it is optimized for rapid writes.
http://dev.mysql.com/doc/refman/5.1/en/archive-storage-engine.html
To view pageview information you simply query the number of rows for any given page. A major advantage of this approach is that by logging a timestamp with every page view, you can analyse data by time of day, days of week etc. and see what shape your traffic is.
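Putting those pieces together, a sketch along these lines (names are placeholders) gives you both the counter and the traffic analysis:

```sql
-- One row per page view; ARCHIVE only supports INSERT and SELECT, but it
-- compresses well and is built for append-only logging like this.
CREATE TABLE page_views (
    page_id   INT UNSIGNED NOT NULL,
    viewed_at DATETIME NOT NULL
) ENGINE=ARCHIVE;

-- On every page view (DELAYED lets the client return immediately):
INSERT DELAYED INTO page_views (page_id, viewed_at) VALUES (42, NOW());

-- Total views for a page:
SELECT COUNT(*) FROM page_views WHERE page_id = 42;

-- Views per hour over the last day, to see the shape of the traffic:
SELECT DATE_FORMAT(viewed_at, '%Y-%m-%d %H:00') AS hour, COUNT(*) AS views
FROM page_views
WHERE page_id = 42 AND viewed_at >= NOW() - INTERVAL 1 DAY
GROUP BY hour;
```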
I'm wondering if it's worth caching a result that is only going to change every couple of hours but is shown on every page of the website. It makes sense to me, but I wanted a second or third opinion.
The situation is that each page shows two top-5 lists. These do change, but only really need to be updated every hour or so. Each top 5 is the result of a MySQL query. Would it be a smart idea to cache these results? What would be the best way to cache them (using PHP)?
I've had a look through some of the questions asked here but they don't relate well enough so thought it'd be best to ask again.
Thanks.
Your use case is precisely the reason that caching was developed! The most common pattern is to use a caching service, such as memcached. (see Caching with PHP to take stress away from MySQL)
However, I would encourage you to think about whether or not this is a premature optimization. Is your database unable to cope with the requests? Are all pages loading slower than you need? Further, are these database queries slow for some reason? Are you properly using indexes?
I would encourage you to add a cache only when you need to, and to ensure you have optimized as well as possible before then.
Check out https://stackoverflow.com/a/1600252/1058853 for a good commentary.
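For completeness, a minimal sketch of that memcached pattern in PHP; the queries, cache keys and connection details are placeholders:

```php
<?php
// Cache each "top 5" query result for an hour, so the MySQL query only runs
// once per hour no matter how many pages are served.
$cache = new Memcached();
$cache->addServer('127.0.0.1', 11211);

$pdo = new PDO('mysql:host=localhost;dbname=mysite', 'user', 'pass');

function get_top5(PDO $pdo, Memcached $cache, $key, $sql) {
    $rows = $cache->get($key);
    if ($rows === false) {                          // cache miss or expired
        $rows = $pdo->query($sql)->fetchAll(PDO::FETCH_ASSOC);
        $cache->set($key, $rows, 3600);             // keep for one hour
    }
    return $rows;
}

$topViewed    = get_top5($pdo, $cache, 'top5_viewed',
                         'SELECT title FROM articles ORDER BY views DESC LIMIT 5');
$topCommented = get_top5($pdo, $cache, 'top5_commented',
                         'SELECT title FROM articles ORDER BY comment_count DESC LIMIT 5');
```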
I am making a comment system. Users will log in with their details on the main page, which has already been built, but on the second page, where the comments will be, I want to show each comment in order of the time it was created. Would it be better to store the comments in a MySQL database, or to put the comments into files and read them from the files on the server?
XML [aka a 'flat file' database] may seem preferable for simplicity's sake, but you will find that your performance degrades ridiculously fast once you get a reasonable amount of traffic. Say you have a separate XML file storing the comments for each page, and you want to display the last 5 comments with the newest first: the server has to read the entire file from start to finish just to figure out which 5 comments are the last.
What happens when you have 10,000 comments on something? What happens when you have 100 posts with 10,000 comments each and 1,000 page views per second? Basically, you're putting so much I/O load on your disk that everything else grinds to a halt behind the queued I/O.
The point of an RDBMS like MySQL is that the information is indexed when it is put into the database, and that index is held in memory. That way the application doesn't have to re-examine 100% of the data each time a request is made. A MySQL query is written to consult the index and have the system retrieve only the desired information, i.e. the last 5 comments.
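As a sketch (names are placeholders), that looks something like this:

```sql
-- The index on (page_id, created_at) is what lets MySQL jump straight to the
-- newest comments for one page instead of re-reading everything.
CREATE TABLE comments (
    id         INT UNSIGNED NOT NULL AUTO_INCREMENT PRIMARY KEY,
    page_id    INT UNSIGNED NOT NULL,
    user_id    INT UNSIGNED NOT NULL,
    body       TEXT NOT NULL,
    created_at DATETIME NOT NULL,
    KEY page_time (page_id, created_at)
) ENGINE=InnoDB;

-- The last 5 comments for a page, newest first:
SELECT body, created_at
FROM comments
WHERE page_id = 7
ORDER BY created_at DESC
LIMIT 5;
```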
Trust me, I worked for a hosting provider and code like this is constantly causing problems of the variety "it worked fine for months and now my site is slow as hell". You will not regret taking a little extra time to do it right the first time rather than scrambling to re-do the application when it grinds to a halt.
Definitely in MySQL. The actions are easy, and traceability is easy.
Unless you want to go the NoSQL route.
Storing the comments in a database is the better option. You get more power with a database; by "more power" I mean you can easily do things like the following (see the queries sketched after this list):
When a user gets deleted, you can decide whether or not to delete their comments
You can show the top 10 (or so) comments
You can show all comments from a user on some other page
Etc.
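For illustration, assuming a comments table with user_id, created_at and a rating column, each of those is a one-line query:

```sql
-- Delete a user's comments when the account is removed (or simply skip this):
DELETE FROM comments WHERE user_id = 42;

-- Top 10 comments (by rating here; "top" is whatever measure you pick):
SELECT body FROM comments ORDER BY rating DESC LIMIT 10;

-- All comments from one user, for their profile page:
SELECT body, created_at FROM comments WHERE user_id = 42 ORDER BY created_at DESC;
```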
I would recommend a database; it's a lot easier to access the data that way. XML is always a pain when it comes to structuring the code that pulls the data out. It might be worth getting a more detailed explanation from someone else though, as I've not had that much experience with XML.
Plus you have more control over moderating the comments.
Hope that helps!
MySQL every time!
It's perfect for doing this and very suitable/flexible/powerful.
I have an application that fetches several e-commerce websites using cURL, looking for the best price.
This process returns a table comparing the prices across all of the searched websites.
But now we have a problem: the number of stores is starting to increase, and the loading time has become unacceptable from a user-experience point of view (currently around a 10-second page load).
So we decided to create a database and start storing all of the filtered cURL results in it, in order to reduce the external calls and speed up the page load.
I want to know: despite all our efforts, is there still an advantage to implementing a Memcache module?
I mean, will it help even more, or is it just a waste of time?
The Memcache idea was inspired by this topic, of a guy that had a similar problem: Memcache to deal with high latency web services APIs - good idea?
Memcache could be helpful, but (in my opinion) it's kind of a weird way to approach the issue. If it was me, I'd go about it this way:
Firstly, I would indeed cache everything I could in my database. When the user searches, or whatever interaction triggers this, I'd show them a "searching" page with whatever results the server currently has, and a progress bar that fills up as the asynchronous searches complete.
I'd use AJAX to add additional results as they become available. I'm imagining that the search takes about ten seconds - it might take longer, and that's fine. As long as you've got a progress bar, your users will appreciate and understand that Stuff Is Going On.
Obviously, the more searches go through your system, the more up-to-date data you'll have in your database. I'd use cached results that are under a half-hour old, and I'd also record search terms and make sure I kept the top 100 (or so) searches cached at all times.
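A rough sketch of that freshness rule; the table layout and the fetch_price_with_curl() helper are assumptions standing in for your existing cURL code:

```php
<?php
// Return a cached price if it is under 30 minutes old, otherwise fetch it
// live and store the result. Assumes a unique key on (store_id, product).
function get_price(PDO $pdo, $store_id, $product) {
    $stmt = $pdo->prepare(
        'SELECT price FROM price_cache
         WHERE store_id = ? AND product = ?
           AND fetched_at >= NOW() - INTERVAL 30 MINUTE'
    );
    $stmt->execute(array($store_id, $product));
    $price = $stmt->fetchColumn();

    if ($price !== false) {
        return $price;                              // fresh enough, no external call
    }

    $price = fetch_price_with_curl($store_id, $product);   // your existing cURL code

    $pdo->prepare(
        'REPLACE INTO price_cache (store_id, product, price, fetched_at)
         VALUES (?, ?, ?, NOW())'
    )->execute(array($store_id, $product, $price));

    return $price;
}
```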
Know your customers and have what they want available. This doesn't have much to do with any specific technology, but it is all about your ability to predict what they want (or write software that predicts for you!)
Oh, and there's absolutely no reason why PHP can't handle the job. Tying together a bunch of unrelated interfaces is one of the things PHP is best at.
Your bottleneck lies outside of PHP itself. Don't bother hacking together a result in PHP at request time when a cron job could easily be used to populate your database, so that your PHP script can simply query your database.
If you plan to stick with only PHP, then I suggest you change your script to read from the database you have populated with those results. To populate them, have a cron job hit a PHP script that is not accessible to users and that performs all of your cURL work.
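A minimal sketch of that split; the URLs, table layout and the extract_price() parser are placeholders for your existing code:

```php
<?php
// update_prices.php - run from cron, e.g. every 15 minutes:
//   */15 * * * * php /path/to/update_prices.php
// Not reachable from the web; it only refreshes the price_cache table that the
// user-facing page reads from.
$pdo = new PDO('mysql:host=localhost;dbname=prices', 'user', 'pass');

$stores = array(
    1 => 'https://store-one.example/search?q=',
    2 => 'https://store-two.example/find?item=',
);

foreach ($stores as $store_id => $base_url) {
    $ch = curl_init($base_url . urlencode('some-product'));
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_setopt($ch, CURLOPT_TIMEOUT, 10);
    $html = curl_exec($ch);
    curl_close($ch);

    $price = extract_price($html);   // your existing parsing logic

    $pdo->prepare(
        'REPLACE INTO price_cache (store_id, product, price, fetched_at)
         VALUES (?, ?, ?, NOW())'
    )->execute(array($store_id, 'some-product', $price));
}
```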
I'm currently developing a Zend Framework project, using Doctrine as ORM.
I ran into the typical situation where you have to show a list of items (around 400) in a table, and of course, I don't want to show them all at once.
I've already used Zend_Paginator before (only some basic usage), but I always used to fetch all the items from the DB and then paginate them, and now that doesn't feel quite right.
My question is this: is it better to get all the items from the DB first and then "paginate" them, or to fetch "pages" of items as they are requested? Which would have a larger impact on performance?
For me, it is better to fetch part of the data and paginate through it.
If you get all the data from the DB, you paginate with the help of JavaScript, and that has several drawbacks:
The first load of the page will take a long time (with 400 records it's still OK).
The browser has limited memory. If a user opens a lot of tabs and your data takes up a lot of memory, it will slow down both the browser and your application.
You only have 400 records now, but data tends to grow very quickly.
At worst, the whole browser may freeze when the page is opened.
And what if the browser doesn't support JavaScript at all?
If you fetch only part of the data from the DB, the only drawback is for a user with a very slow Internet connection (but that is also a problem with the first option, on the initial page load), and moving to another page will take a little longer than with JavaScript.
The second option is better (for me) in the long run, because if it works, it will keep working for years.
The database engines are usually best suited to do the retrieval for you. So, in general, if you can delegate a data-retrieval task to the DB engine instead of doing it in-memory and using your programming language, the best bet for performance is to let the DB engine do it for you.
But also remember that if you don't configure the indices correctly or don't run a good query, you won't get the best result out of your DB engine.
However, most DB engines nowadays are capable of optimizing your queries for you and running them in an efficient form.
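As a sketch of the per-page query (plain PDO here to keep it framework-neutral; with Doctrine you would put the same limit and offset on the query object):

```php
<?php
// Fetch only the requested page of items and let MySQL do the slicing.
$perPage = 20;
$page    = isset($_GET['page']) ? max(1, (int) $_GET['page']) : 1;
$offset  = ($page - 1) * $perPage;

$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');

// $perPage and $offset are already integers, so interpolating them is safe.
$sql   = sprintf('SELECT id, name FROM items ORDER BY name LIMIT %d OFFSET %d',
                 $perPage, $offset);
$items = $pdo->query($sql)->fetchAll(PDO::FETCH_ASSOC);

// Total row count, so the view can render the page links.
$total = (int) $pdo->query('SELECT COUNT(*) FROM items')->fetchColumn();
$pages = (int) ceil($total / $perPage);
```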