I was searching the web for this but found no satisfying answer.
I am not talking about the time it takes the browser to render and display.
Only the part where the HTML is generated in the server itself.
<?php
$script_start = microtime(true); // start time in seconds, as a float

// ... page-generation code ...

echo (microtime(true) - $script_start); // elapsed server-side generation time
?>
What is the accepted/normal generation time for web pages?
Let's say the page has a calendar, a poll, content, menus (with submenus), and some other modules.
Is it okay if it is less than 0.05 seconds?
What do you think is the highest normal/accepted time it should take?
I've got this bit of string, how long should it be?
Your page will take as long as it needs to, based on what you're trying to do, how you're trying to do it, what platform you're running on, whether you're marshalling data from third parties, and a thousand and one other unknowable variables.
There will be an upper limit on what your users find acceptable, and if you find yourself frequently breaching that bound, then you could try some workarounds, e.g. caching data, lowsrc, asynchronous elements, etc.
But as it stands, there's no specific answer to this general question.
You should read this story about Google's measurements on this very topic.
I don't think there is any such thing as a highest accepted time. As @Johannes points out, it depends on how many users you have. Execution speed matters for Facebook - they even wrote a compiler for it :) There are some nice benchmarks on http://www.phpbench.com/ and some optimizing tips on http://phplens.com/lens/php-book/optimizing-debugging-php.php
There's no correct answer to this, satisfying or otherwise. You should obviously be aiming to render the html in as little time as possible, but you can't put an arbitrary figure on how long that should be.
Having said that, if your pages are rendering in less than 0.05 seconds I don't think you've got anything to worry about!
That's one of the so-called "non-functional requirements". Too often they're forgotten. Others are "how often should my page crash", and "what's the desired uptime", and "should the page look different when printed?"...
You should take a look at how your php should be used: is it going to be called from other web-pages, or is it a stand-alone app? Is the user going to be bothered if the html-generation becomes the bigger part of the latency?...
It's usually more productive to observe the following:
How long database queries take
How long it takes to get data from off-site requests
... which individually add up to the single page load time. There's no sense in measuring how long a page takes to load if you can't narrow down the bottlenecks.
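For example, you can wrap each query in its own timer so you can see where the page time actually goes. A minimal sketch, assuming a PDO/MySQL connection and an illustrative articles table:
<?php
// Time an individual database query rather than only the whole page.
$pdo = new PDO('mysql:host=localhost;dbname=site', 'user', 'password');

$t0   = microtime(true);
$rows = $pdo->query('SELECT id, title FROM articles ORDER BY created_at DESC LIMIT 10')
            ->fetchAll(PDO::FETCH_ASSOC);
$queryTime = microtime(true) - $t0;

error_log(sprintf('headline query took %.4f s', $queryTime)); // log it rather than echoing on production pages
?>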
If it takes more than a second or two, someone is likely going to start fiddling with their back or refresh buttons, or just close the browser tab. Again, that's subjective and based on my idea of how a 'typical someone' expects things to work.
I don't know where to begin and need some guidance...
Looking for a simple page-hit counter for a directory website. Each page of the directory (100+ pages) will have its own publicly viewable hit counter. Thanks in advance!
The question is a bit diffuse as it is now. It is not clear from your question whether you are searching for an existing hit counter solution (and possibly help on how to implement that on your site) or whether you want to code your own solution.
In either case you should try to get some things sorted out before you start:
What are the requirements for the hit counter? Or to put it differently: What features should it offer?
Should it just count the overall hits of every page or should it provide a finer resolution, e.g. showing the hits per day?
Should it provide an option to not show or not count hits on some special pages?
etc.
Hit counters usually use some kind of database to store the hit counts for the various pages. What kinds of databases are available on your server that could be used for that purpose?
If these questions are answered, you could either look for an existing solution that meets your requirements or start working on your own implementation. It is usually easier to work towards achieving some end, if you have a clear goal in mind. (Maybe you already have that, but then the question does not show it.)
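If you do decide to code your own, a minimal sketch with PHP and MySQL might look like the following. The page_hits table, its columns, and the PDO connection details are all assumptions, not an existing solution:
<?php
// Minimal hit-counter sketch; table and column names are assumptions.
// CREATE TABLE page_hits (page_id VARCHAR(255) PRIMARY KEY, hits INT UNSIGNED NOT NULL DEFAULT 0);
$pdo    = new PDO('mysql:host=localhost;dbname=site', 'user', 'password');
$pageId = 'directory/some-page'; // identify the current page however suits your site

// Increment the counter (inserting the row on the first hit), then read the current count.
$stmt = $pdo->prepare(
    'INSERT INTO page_hits (page_id, hits) VALUES (:id, 1)
     ON DUPLICATE KEY UPDATE hits = hits + 1'
);
$stmt->execute([':id' => $pageId]);

$stmt = $pdo->prepare('SELECT hits FROM page_hits WHERE page_id = :id');
$stmt->execute([':id' => $pageId]);
echo 'Hits: ' . (int) $stmt->fetchColumn();
?>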
I'm wondering if it's worth caching a result that is only going to change every couple of hours but is shown on every page of the website. Makes sense to me to do it but just wanting a second or third opinion.
The situation is that on each page there are two top-5 lists. These do change, but really only need to be updated every hour or so. The results for these top-5 lists come from a MySQL query. Would it be a smart idea to cache these results? What would be the best way to cache them (using PHP)?
I've had a look through some of the questions asked here but they don't relate well enough so thought it'd be best to ask again.
Thanks.
Your use case is precisely the reason that caching was developed! The most common pattern is to use a caching service, such as memcached (see "Caching with PHP to take stress away from MySQL").
However, I would encourage you to think about whether or not this is a premature optimization. Is your database unable to cope with the requests? Are all pages loading slower than you need? Further, are these database queries slow for some reason? Are you properly using indexes?
I would encourage you to add a cache only when you need to, and to ensure you have optimized as much as possible before then.
Check out https://stackoverflow.com/a/1600252/1058853 for a good commentary.
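As a rough illustration of that cache-aside pattern, here is a minimal sketch assuming the PECL Memcached extension, a PDO/MySQL connection, and an illustrative top-5 query; the cache key and the one-hour TTL are also assumptions:
<?php
// Minimal cache-aside sketch; all names and the query are illustrative.
$cache = new Memcached();
$cache->addServer('127.0.0.1', 11211);

$key  = 'top5_articles';
$top5 = $cache->get($key);

if ($top5 === false) {
    // Cache miss: run the (assumed) expensive query and keep the result for an hour.
    $pdo  = new PDO('mysql:host=localhost;dbname=site', 'user', 'password');
    $top5 = $pdo->query('SELECT id, title FROM articles ORDER BY views DESC LIMIT 5')
                ->fetchAll(PDO::FETCH_ASSOC);
    $cache->set($key, $top5, 3600); // TTL in seconds
}

// $top5 can now be rendered on every page without hitting MySQL each time.
?>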
I am making a comment system. The user will log in with their details on the main page, which has been built, but on the second page, where the comments will be, I want to show each comment in order of time created. Would this be better done by storing the comments in a MySQL database, or by putting the comments into a file and reading them from files on the server?
XML [aka a 'flat file' database] may seem preferable for simplicity's sake, but you will find that your performance degrades ridiculously fast once you get a reasonable amount of traffic. Let's say that you have a separate XML file storing comments for each page, and you want to display the last 5 comments with the newest first. The server has to read the entire file from start to finish just to figure out which 5 comments are the last.
What happens when you have 10,000 comments on something? What happens when you have 100 posts with 10,000 comments and 1,000 pageviews per second? Basically, you're putting so much I/O load on your disk that everything else will grind to a halt waiting on queued I/O.
The point of an RDBMS like MySQL is that the information is indexed when it is put into the database, and that index is held in memory. That way the application doesn't have to re-examine 100% of the data each time a request is made. A MySQL query is written to consult the index and have the system retrieve only the desired information, i.e., the last 5 comments.
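For example, with an index on the creation timestamp, fetching the newest five comments only touches a handful of rows. A minimal sketch, assuming a PDO/MySQL connection and illustrative table/column names:
<?php
// Sketch only; table and column names are assumptions.
// An index such as: CREATE INDEX idx_page_created ON comments (page_id, created_at);
// lets MySQL jump straight to the newest rows instead of scanning the whole table.
$pdo  = new PDO('mysql:host=localhost;dbname=site', 'user', 'password');
$stmt = $pdo->prepare(
    'SELECT author, body, created_at
       FROM comments
      WHERE page_id = :page
   ORDER BY created_at DESC
      LIMIT 5'
);
$stmt->execute([':page' => 42]);
$latest = $stmt->fetchAll(PDO::FETCH_ASSOC); // the five newest comments, newest first
?>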
Trust me, I worked for a hosting provider and code like this is constantly causing problems of the variety "it worked fine for months and now my site is slow as hell". You will not regret taking a little extra time to do it right the first time rather than scrambling to re-do the application when it grinds to a halt.
Definitely in MySQL. The actions are easy, and traceability is easy.
Unless you want to go the NoSQL route.
Storing the comments in a database is the better option. You get more power with a database. By more power, I mean you can easily do things like the following (a minimal schema sketch follows this list):
When a user gets deleted, you can decide whether or not to delete their comments
You can show the top 10 (or so) comments
You can show all comments from a user on some other page
Etc
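A minimal sketch of such a schema (all table and column names are assumptions, not a prescribed design):
-- Sketch only; adjust names, types, and the ON DELETE behaviour to your application.
CREATE TABLE users (
    id       INT UNSIGNED AUTO_INCREMENT PRIMARY KEY,
    username VARCHAR(64) NOT NULL
);

CREATE TABLE comments (
    id         INT UNSIGNED AUTO_INCREMENT PRIMARY KEY,
    user_id    INT UNSIGNED NOT NULL,
    page_id    INT UNSIGNED NOT NULL,
    body       TEXT NOT NULL,
    created_at DATETIME NOT NULL,
    FOREIGN KEY (user_id) REFERENCES users (id) ON DELETE CASCADE, -- or RESTRICT; SET NULL needs a nullable user_id
    INDEX idx_page_created (page_id, created_at)
);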
I would recommend a database; it's a lot easier to access the data that way. XML is always a pain when it comes to laying out the code for pulling out the data. It might be worth getting a more detailed explanation from someone, though, as I've not had that much experience with XML.
Plus you have more control over moderating the comments.
Hope that helps!
MySQL every time!
It's perfect for doing this and very suitable/flexible/powerful.
I have a page that will pull many headlines from multiple categories based off a category id.
I'm wondering if it makes more sense to pull all the headlines and then sort them out via PHP if/else statements, or whether it is better to run multiple queries that each fetch the headlines from one category.
Why not do it in one query? Something like:
SELECT headline FROM headlines WHERE category_id IN (1, 2, 3, ...);
If you filter your headlines in PHP, think how many you'll be throwing away. If you end up removing just 10% of the headlines, it won't matter as much as when you'd be throwing away 90% of the results.
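If you still need the headlines grouped per category on the page, you can do that with the single IN() query and one pass in PHP. A minimal sketch, assuming a PDO/MySQL connection and illustrative column names:
<?php
// One query for all wanted categories, then group the rows in PHP.
$pdo  = new PDO('mysql:host=localhost;dbname=site', 'user', 'password');
$stmt = $pdo->prepare(
    'SELECT category_id, headline FROM headlines WHERE category_id IN (1, 2, 3)'
);
$stmt->execute();

$byCategory = [];
foreach ($stmt as $row) {
    $byCategory[$row['category_id']][] = $row['headline'];
}
// $byCategory[1], $byCategory[2], ... now hold the headlines for each category.
?>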
These kinds of questions are always hard to answer because the situation determines the best course. There is never a truly correct answer, only better ways. In my experience it doesn't really matter whether you attempt to do the work in PHP or in the database, because you should always try to cache the results of any expensive operation using a caching engine such as memcached. That way you won't spend a lot of time in the DB or in PHP itself, since the results will be cached and ready instantaneously. When it comes down to it, unless you profile your application using a tool like Xdebug, what you think are your performance bottlenecks are just guesses.
It's usually better not to overload the DB, because you might cause a bottleneck if you have many simultaneous queries.
However, handling your processing in PHP is usually better, as Apache will fork threads as it needs to handle multiple requests.
As usual, it all comes down to: "How much traffic is there?"
MySQL can already do the selecting and ordering of the data for you. I suggest to be lazy and use this.
Also, I'd look for a single query that fetches all the categories and their headlines at once. Would an ORDER BY category, publishdate or something do?
Every trip to the database costs you something. Returning extra data that you then decide to ignore costs you something. So you're almost certainly better to let the database do your pruning.
I'm sure one could come up with some case where deciding what data you need makes the query hugely complex and thus difficult for the database to optimize, while you could do it in your code easily. But if we're talking about "select headline from story where category='Sports'" followed by "select headline from story where category='Politics'" then "select headline from story where category='Health'" etc, versus "select category, headline from story where category in ('Health','Sports','Politics')", the latter is clearly better.
On the topic of "Faster to query in MySQL or to use PHP logic", which is how I ended up on this question 10 years later: I have determined that the correct answer is "it depends". There are just too many examples where using the DB saves processing time over writing PHP code... but there are just as many examples where writing PHP code saves time over excessively complex MySQL queries.
There is no right answer here. If you end up here, like I did, then the best I can suggest is to try to solve your problem with the skills that you have. Start first with the query and try to solve it; if you run into issues, then start thinking about just gathering the data and running the logic through PHP code to come up with a solution.
At the end of the day, you need to solve a problem... if you solve it, but it's not fast enough, then that's another problem... work on optimizing, which may end up meaning that you go back to writing more MySQL logic.
Use the 80/20 rule and try to get things 80% of the way there as quickly as possible. You can go back and optimize once it's workable. Spending all your effort on making it perfect the first time will surely mean you miss your deadline.
That's my $0.02.
Okay, so I'm sure plenty of you have built crazy database intensive pages...
I am building a page that I'd like to pull all sorts of unrelated database information from. Here are some sample different queries for this one page:
article content and info
IF the author is a registered user, their info
UPDATE the article's view counter
retrieve comments on the article
retrieve information for the authors of the comments
if the reader of the article is signed in, query for info on them
etc...
I know these are basically going to be pretty lightning quick, and that I could combine some, but I wanted to make sure that this isn't abnormal.
How many fairly normal and un-heavy queries would you limit yourself to on a page?
As many as needed, but not more.
Really: don't worry about optimization (right now). Build it first, measure performance second, and IFF there is a performance problem somewhere, then start with optimization.
Otherwise, you risk spending a lot of time on optimizing something that doesn't need optimization.
I've had pages with 50 queries on them without a problem. A fast query to a non-large (ie, fits in main memory) table can happen in 1 millisecond or less, so you can do quite a few of those.
If a page loads in less than 200 ms, you will have a snappy site. A big chunk of that is being used by latency between your server and the browser, so I like to aim for < 100ms of time spent on the server. Do as many queries as you want in that time period.
The big bottleneck is probably going to be the amount of time you have to spend on the project, so optimize for that first :) Optimize the code later, if you have to. That being said, if you are going to write any code related to this problem, write something that makes it obvious how long your queries are taking. That way you can at least find out you have a problem.
I don't think there is any one correct answer to this. I'd say as long as the queries are fast, and the page follows a logical flow, there shouldn't be any arbitrary cap imposed on them. I've seen pages fly with a dozen queries, and I've seen them crawl with one.
Every query requires a round-trip to your database server, so the cost of many queries grows larger with the latency to it.
If it runs on the same host there will still be a slight speed penalty, not only because a socket sits between your application and the server, but also because the server has to parse your query, build the response, check access, and deal with whatever other overhead SQL servers bring.
So in general it's better to have less queries.
You should try to do as much as possible in SQL, though: don't get stuff as input for some algorithm in your client language when the same algorithm could be implemented without hassle in SQL itself. This will not only reduce the number of your queries but also help a great deal in selecting only the rows you need.
Piskvor's answer still applies in any case.
WordPress, for instance, can run up to 30 queries per page. There are several things you can use to reduce the load on MySQL - one of them being memcached - but right now, and as you say, if it will be straightforward, just make sure all the data you pull is properly indexed in MySQL and don't worry much about the number of queries.
If you're using a framework (CodeIgniter, for example) you can generally pull data on page-creation times and check what's pulling your site down.
As others have said, there is no single number. Whenever possible, please use SQL for what it was built for and retrieve sets of data together.
Generally, an indication that you may be doing something wrong is when you have a SQL query inside a loop.
When possible, use joins to retrieve data that belongs together rather than sending several statements (there is a sketch contrasting the two after this list).
Always try to make sure your statements retrieve exactly what you need with no extra fields/rows.
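To illustrate the query-in-a-loop anti-pattern versus a join, here is a minimal sketch assuming a PDO/MySQL connection and illustrative comments/users tables:
<?php
$pdo = new PDO('mysql:host=localhost;dbname=site', 'user', 'password');

// Anti-pattern: one query per comment author (N+1 queries in total).
// foreach ($comments as $comment) {
//     $author = $pdo->query('SELECT username FROM users WHERE id = ' . (int) $comment['user_id'])->fetch();
// }

// Better: one query that joins the data that belongs together.
$stmt = $pdo->prepare(
    'SELECT c.body, c.created_at, u.username
       FROM comments c
       JOIN users u ON u.id = c.user_id
      WHERE c.page_id = :page
   ORDER BY c.created_at DESC'
);
$stmt->execute([':page' => 42]);
$comments = $stmt->fetchAll(PDO::FETCH_ASSOC); // comments with their authors, in one round-trip
?>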
If you need the queries, you should just use them.
What I always try to do is have them all executed at once in the same place, so that there is no need for different parts of the page (if they're separated...) to make database connections. I figure it's more efficient to store everything in variables than to have every part of a page connect to the database.
In my experience, it is better to make two queries and post-process the results than to make a single query that you don't have to post-process but that takes ten times longer to run. That said, it is also better not to repeat queries if you already have the result, and there are many different ways this can be achieved.
But all of that is oriented around performance optimization. So unless you really know what you're doing (hint: most people in this situation don't), just make the queries you need for the data you need and refactor it later.
I think you should limit yourself to as few queries as possible. Try to combine queries to multitask and save time.
Premature optimisation is a problem, as people have mentioned before, but that's where you're messing up your code just to make it run 'fast'. People take this 'maxim' too far, though.
If you want to design with scalability in mind, just make sure whatever you do to load data is sufficiently abstracted and calls are centralized; this will make it easier when you need to implement a shared-memory cache, as you'll only have to change a few things in a few places.
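For example, funnelling all page data access through one centralized loader gives you a single place to bolt a cache onto later. A minimal sketch; the function name, connection details, and query are all illustrative:
<?php
// All page code goes through this one function, so adding a cache later
// means changing only this function, not every caller.
function load_data(PDO $pdo, string $sql, array $params = []): array
{
    // A shared-memory cache (e.g. memcached or APCu) could be consulted here first.
    $stmt = $pdo->prepare($sql);
    $stmt->execute($params);
    return $stmt->fetchAll(PDO::FETCH_ASSOC);
}

// Usage: every query on the page is funnelled through the same entry point.
$pdo       = new PDO('mysql:host=localhost;dbname=site', 'user', 'password');
$headlines = load_data($pdo, 'SELECT headline FROM headlines WHERE category_id = ?', [3]);
?>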