Display php browser content successively - php

I'm currently migrating a webpage to a new server, from PHP 5.0.x to PHP 5.6.
Now I have one page that loops a database query 7 times, and the results are loaded into a table. I know this is not the best way of doing things, but it works quite well, and I currently have no other option because the results come from different databases. Caching is not an option because the content always needs to be up to date.
From accessing the site until it is displayed takes around three to four seconds. The queries were optimized recently but still take a long time to execute.
Now to my problem: the old webserver shows the results table by table, so the user sees that the page is already working on the request. When the content of table 1 is loaded, it gets displayed while the server takes care of the second result set.
The new webserver generates the website all at once, and the content is not shown until all 7 result sets are loaded and the data is mapped to the grids.
Is there an option, maybe in php.ini, with which I can get the same behaviour as on the old webserver? I really don't know how to google this, so I'm asking you guys.
On the other hand, maybe there's an option to run all the queries at once (multithreading?) instead of in sequence?
Sorry for my "improvable" English.

It's probably something related to output buffering. You can force the buffer contents out to the client with the ob_flush() function; just add a call to ob_flush() after each query execution.
The relevant php.ini setting is output_buffering. Also check the zlib.output_compression setting, as compression can prevent the buffer from being flushed.
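A minimal sketch of that approach, assuming the page is built from 7 query/render steps (runQueryForTable() and renderTable() are placeholders for your existing code). ob_flush() empties PHP's own buffer and flush() asks the web server to send what it already has; note that zlib.output_compression and some proxies can still hold output back.

<?php
for ($i = 1; $i <= 7; $i++) {
    $rows = runQueryForTable($i);   // placeholder: your existing query code
    echo renderTable($rows);        // placeholder: your existing table markup

    if (ob_get_level() > 0) {
        ob_flush();                 // push PHP's output buffer onward
    }
    flush();                        // ask the web server to send it to the client now
}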

Related

Caching in PHP for speeding up

I am running an application (built on PHP & MySQL) on a VPS. I have an article table with millions of records in it. Whenever a user logs in, I display the last 50 records for each section.
So every time a user logs in or refreshes the page, an SQL query is executed to get those records. There are now lots of users on the website, and because of that my page speed has dropped significantly.
I did some research on caching and found that I can read the MySQL data based on section and number of articles, e.g. (section = 1 and no. of articles = 50), and store it in a disk file, cache/md5(section no.).
Then, for future requests for that section, I just get the data from cache/md5(section no).
The above solution looks great, but before I go ahead I would really like to clarify a few doubts with the experts:
Will it really speed up my application? (I know disk I/O is faster than a MySQL query, but I don't know by how much.)
I currently use pagination on my page: display the first 5 articles, and when the user clicks "display more", show the next 5 articles, and so on. This is easy to do in a MySQL query, but I have no idea how to do it if I store all 50 records in a cache file. If someone could share some info, that would be great.
Any alternative solution, if you believe the above will not work?
Any open-source application (PHP), if you know of one?
Thank you in advance
Regards,
Raj
I ran into the same issue, where every page load resulted in 2+ queries being run. Thankfully they're very similar queries being run over and over, so caching (as in your situation) is very helpful.
You have a couple options:
offload the database to a separate VPS on the same network to scale it up and down as needed
cache the data from each query and try to retrieve from the cache before hitting the database
In the end we chose both, installing Memcached and its PHP extension for query caching purposes. Memcached is a key-value store (much like PHP's associative arrays) with a set expiration time, measured in seconds, for each value stored. Since it stores everything in RAM, the trade-off for volatile cache data is extremely fast read/write times, much better than the filesystem.
Our implementation was basically to run every query through a filter; if it's a select statement, cache it by setting the Memcached key to "namespace_[md5 of query]" and the value to a serialized version of an array with all resulting rows. Caching for 120 seconds (2 minutes) should be more than enough to help with the server load.
If Memcached isn't a viable solution, store all 50 articles for each section as an RSS feed. You can pull all articles at once, grabbing the content of each article with SimpleXML and wrapping it in your site's article template HTML, as per the site design. Once the data is there, use CSS styling to display only X articles, with JavaScript for pagination.
Since two processes modifying the same file at the same time would be a bad idea, have adding a new story to a section trigger an event, which would add the story to a message queue. That message queue would be processed by a worker which does two consecutive things (a sketch follows the list):
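A rough sketch of that filter, assuming the Memcached PECL extension and an existing mysqli connection ($db); the "myapp" namespace and the helper name are made up for illustration. The extension serializes arrays itself, so the rows can be stored directly.

<?php
function cachedSelect(mysqli $db, Memcached $cache, $sql)
{
    $key  = 'myapp_' . md5($sql);   // "namespace_[md5 of query]"
    $rows = $cache->get($key);
    if ($rows !== false) {
        return $rows;               // cache hit: skip the database entirely
    }

    $rows   = array();
    $result = $db->query($sql);
    while ($row = $result->fetch_assoc()) {
        $rows[] = $row;
    }
    $cache->set($key, $rows, 120);  // keep for 120 seconds (2 minutes)
    return $rows;
}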
Remove the oldest story at the end of the XML file
Add a newer story given from the message queue to the top of the XML file
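A rough sketch of that worker, under some assumptions: the feed file path and story fields are made up, and since SimpleXML cannot insert an element at an arbitrary position, the sketch uses PHP's DOM extension for both steps instead.

<?php
function updateSectionFeed($feedFile, array $story, $maxItems = 50)
{
    $doc = new DOMDocument();
    $doc->load($feedFile);

    $channel = $doc->getElementsByTagName('channel')->item(0);
    $items   = $channel->getElementsByTagName('item');

    // Step 1: remove the oldest story at the end of the feed.
    if ($items->length >= $maxItems) {
        $channel->removeChild($items->item($items->length - 1));
    }

    // Step 2: add the new story from the message queue to the top.
    $item  = $doc->createElement('item');
    $title = $doc->createElement('title');
    $title->appendChild($doc->createTextNode($story['title']));
    $item->appendChild($title);
    $desc = $doc->createElement('description');
    $desc->appendChild($doc->createTextNode($story['body']));
    $item->appendChild($desc);

    $channel->insertBefore($item, $items->item(0));
    $doc->save($feedFile);
}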
If you'd like, RSS feeds according to section can be a publicly facing feature.

Optimize loading time of php page

I have a simple PHP page requesting a list of addresses from a MySQL database. The database table has 1257 entries. I also include a dynamically loaded side menu for browsing to other pages.
In total I have 5 MySQL queries:
the addresses
pagination
check whether the user has permission to browse
get all the groups for the side menu
get all sub-entries for the side menu
The site takes about 5 seconds to load.
I googled for ways to improve site load time and found the Google Developer Tools with PageSpeed. I did all the improvements it suggested, like enabling deflate, changing banner sizes, and so on, but the loading time is still nearly the same. I would like to know whether this is common, or if there is anything I can do to improve the loading time.
EDIT: I have also indexed the columns and enabled the MySQL query cache. I also use foreign keys in the sub-entries table, referencing the menu group table.
EDIT 2: I found the solution. The problem was that I used localhost to connect to my DB, but since I'm on Windows 7 it tried to connect via IPv6. I changed all occurrences of localhost to 127.0.0.1 and now the page takes only about 126 ms to load.
In the first place, find out what's taking the page so long to load using the browser's developer console. If the cause of the delay is on the server side, e.g. the HTML file itself takes a long time to generate, then check the following:
Try to log slow MySQL queries and make sure that you have none (a quick application-side check is sketched after this list).
http://dev.mysql.com/doc/refman/5.0/en/slow-query-log.html
If you really have some expensive calculations going on (which is not likely in your case), try to cache them.
Don't forget about the benefits of PHP code accelerators like APC, and MySQL optimizations (query cache etc.).
... there are many other ways to speed things up, but you've got to profile the app itself and see what's going on.
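The slow query log linked above is the proper tool; as a quick application-side check, a sketch (the 0.5-second threshold is arbitrary, and $db is assumed to be an existing mysqli connection):

<?php
function timedQuery(mysqli $db, $sql, $thresholdSeconds = 0.5)
{
    $start  = microtime(true);
    $result = $db->query($sql);
    $took   = microtime(true) - $start;

    if ($took > $thresholdSeconds) {
        error_log(sprintf('Slow query (%.3fs): %s', $took, $sql));
    }
    return $result;
}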
Have you indexed the columns used in the WHERE conditions? If not, please index those columns and check again.

Getting all data once for future use

Well, this is a question about how to design a website that uses fewer resources than a normal website, and is mobile-optimized as well.
Here it goes: say I display an overview of e.g. 5 posts (from e.g. a blog). If I then click on the first post, I load this post in a new window. But instead of connecting to the database again and fetching that specific post by its ID, I'd just look up the post (in PHP) in the array of 5 posts that I created earlier, when the page was fetched for the first time.
Would it save data to download? PHP works server-side as well, so that's why I'm not sure.
OK, I'll explain again:
Method 1:
User connects to my website
5 posts are displayed & saved to an array (with all their data)
User clicks on the first post and expects more information about it
My program looks up the post in the array and displays it
Method 2:
User connects to my website
5 posts are displayed
User clicks on the first post and expects more information about it
My program connects to MySQL again and fetches the post from the server
First off, this sounds like a case of premature optimization. I would not start caching anything outside of the database until measurements prove that it's a wise thing to do. Caching takes your focus away from the core task at hand, and introduces complexity.
If you do want to keep DB results in memory, an array allocated during a PHP-processed HTTP request will not be sufficient: once the request has been processed, memory allocated at that scope is no longer available.
You could certainly put the results in SESSION scope. The advantage of saving some DB results in the session is that you avoid DB round trips. Disadvantages include the increased complexity of programming the solution, the use of web server memory for data that may never be accessed, and the increased initial load on the DB to retrieve extra pages that may or may not ever be requested by the user.
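A minimal sketch of the SESSION approach (table and key names are assumptions, and fetching via get_result() assumes the mysqlnd driver):

<?php
session_start();

function getPost(mysqli $db, $postId)
{
    // Serve from the per-user session cache if this post was already fetched.
    if (isset($_SESSION['posts'][$postId])) {
        return $_SESSION['posts'][$postId];
    }

    $stmt = $db->prepare('SELECT id, title, content FROM posts WHERE id = ?');
    $stmt->bind_param('i', $postId);
    $stmt->execute();
    $post = $stmt->get_result()->fetch_assoc();

    $_SESSION['posts'][$postId] = $post;   // cache for later requests
    return $post;
}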
If DB performance, after measurement, really is causing you to miss your performance objectives you can use a well-proven caching system such as memcached to keep frequently accessed data in the web server's (or dedicated cache server's) memory.
Final note: You say
PHP works server-side as well
That's not accurate. PHP works server-side only.
Have you thought about saving the posts in hidden divs and only making them visible when the user clicks somewhere? Here is how to do that.
Put some sort of cache between your code and the database.
So your code will look like
if (isPostInCache()) {
    loadPostFromCache();
} else {
    loadPostFromDatabase();
}
Go for some caching system; the web is full of them. You can use memcached, or static caching you make yourself (i.e. saving posts as text files on the server).
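A rough sketch of such a home-made file cache (the cache directory, which must exist and be writable, and the 300-second lifetime are arbitrary assumptions):

<?php
function loadPost(mysqli $db, $postId)
{
    $cacheFile = __DIR__ . '/cache/post_' . (int) $postId . '.txt';

    // Fresh cache file? Serve it without touching the database.
    if (is_file($cacheFile) && filemtime($cacheFile) > time() - 300) {
        return unserialize(file_get_contents($cacheFile));
    }

    $result = $db->query('SELECT id, title, content FROM posts WHERE id = ' . (int) $postId);
    $post   = $result->fetch_assoc();

    file_put_contents($cacheFile, serialize($post));
    return $post;
}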
To me, this is a little more inefficient than making a second call to the database, and here is why.
The first query should only pull the fields you want, like title, author, and date. The content of the post may make for a heavy query, so I'd exclude it (you can pull a teaser if you'd like).
Then, if the user wants the details of the post, I would query for the content using an indexed key column.
That way you're not pulling content for 5 posts that may never be seen.
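A sketch of that two-step approach, with made-up table and column names ($db is assumed to be an existing mysqli connection):

<?php
// Step 1: the overview pulls only the light fields for the 5 newest posts.
$list = $db->query(
    'SELECT id, title, author, created_at FROM posts
     ORDER BY created_at DESC LIMIT 5'
);

// Step 2: the detail page fetches the heavy content column by primary key,
// only when the user actually opens a post.
$postId = isset($_GET['id']) ? (int) $_GET['id'] : 0;
$result = $db->query('SELECT title, author, content FROM posts WHERE id = ' . $postId);
$post   = $result->fetch_assoc();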
If your PHP code is constantly re-connecting to the database, you've configured it wrong and aren't using connection pooling properly. The execution time of a query should be a few milliseconds at most if your stack is properly tuned. Do not cache unless you absolutely have to.
What you're advocating here is side-stepping a serious problem. Database queries should be effortless provided your database is properly configured. Fix that issue and you won't need to go down the caching road.
Saving data from one request to the next is a broken design, and if not done perfectly it can lead to embarrassing data-bleed situations where one user sees content intended for another. This is why caching is an option usually pursued only after all other avenues have been exhausted.
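For reference, PHP itself doesn't pool connections, but persistent connections avoid the reconnect cost on every request; a sketch with placeholder credentials:

<?php
// With PDO, PDO::ATTR_PERSISTENT reuses an existing connection instead of
// opening a new one on every request.
$pdo = new PDO(
    'mysql:host=127.0.0.1;dbname=mydb',
    'user',
    'password',
    array(PDO::ATTR_PERSISTENT => true)
);

// The mysqli equivalent: prefix the host with "p:".
$db = new mysqli('p:127.0.0.1', 'user', 'password', 'mydb');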

What php function to use to pull data from MySQL periodically and display it on website?

I'm running a server and website for an online game.
I need to pull information from the MySQL database containing player names and messages and display it on our website.
I managed to do that with a php script which directly pulls the info from the MySQL database whenever someone opens the page.
However, this puts unnecessary load on the server.
I need the function to pull the data only at certain intervals, for example every hour.
I have no idea how to do this and what to even look for in terms of functions.
Could anyone show me the right direction?
If you're worried about the load from the call being made every time a user opens the site, perhaps you should first look into caching, with something like APC or memcached. That way there isn't an actual DB lookup; it just returns the same data it grabbed last time. You set the period you want the cache to hold the data, so that in time it grabs a fresh copy; in your case this would be an hour.
There are a bunch of questions on APC, memcached etc. on Stack Overflow that should be able to help you out, e.g. The best way of PHP Caching.
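A rough sketch using APC's apc_fetch()/apc_store() (the key name, the query, and the mysqli connection $db are assumptions; the one-hour TTL matches the interval from the question):

<?php
function getPlayerMessages(mysqli $db)
{
    $messages = apc_fetch('player_messages', $hit);
    if ($hit) {
        return $messages;               // served from cache, no DB query
    }

    $messages = array();
    $result = $db->query('SELECT player_name, message FROM messages ORDER BY id DESC LIMIT 50');
    while ($row = $result->fetch_assoc()) {
        $messages[] = $row;
    }

    apc_store('player_messages', $messages, 3600);  // refresh at most hourly
    return $messages;
}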

Static web page vs MySql generated

I have a website that lets each user create a webpage (to advertise his product). Once the page is created, it will never be modified again.
Now, my question: is it better to keep the page content (only a few parts are editable) in a MySQL database and generate the page with queries every time it is accessed, or to create a static webpage containing all the info and store it on the server?
If I store every page on disk, I may end up with around 200,000 files.
If I store each page in the MySQL database, I would have to run a query each time a page is requested; with around 200,000 entries and 5-6 queries/second, I think the website will be slow...
So what's better?
MySQL will be able to handle the load if you create the tables properly (normalized and indexed). But if the content of the page doesn't change after creation, it's better if you cache the page statically. You can organize the files into buckets (folders) so that one folder doesn't have too many files in it.
Remember to cache only the content areas and not the templates. Unless each user has complete control over how his/her page shows up.
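A sketch of that bucket layout, with made-up paths ($pageId and $generatedHtml are placeholders): hashing the page ID and using the first two hex characters as the folder spreads the 200,000 files across 256 directories.

<?php
function cachePath($pageId)
{
    $hash   = md5($pageId);
    $bucket = substr($hash, 0, 2);            // "00" .. "ff": 256 buckets
    return __DIR__ . "/cache/$bucket/$hash.html";
}

// Writing a generated page:
$path = cachePath($pageId);
if (!is_dir(dirname($path))) {
    mkdir(dirname($path), 0755, true);        // create the bucket on demand
}
file_put_contents($path, $generatedHtml);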
200,000 files writable by the Apache process is not a good idea.
I recommend using a database.
Database imports/exports are easier, not to mention the difference in maintenance costs.
Databases use caching, and if nothing has changed they will return the last result without running the query again. This doesn't stand; thanks, JohnP.
If you want to redesign your webpage some time later, you should be using MySQL to store the pages, as you can't really change them (unless you dig into regexps) after making them static.
About the time issue: it's not an issue if you set your indexes right.
If the data is small to moderate, prefer static hardcoding, i.e. putting the data in the HTML; but if it is huge, computational, or dynamic and changing, you have no option but to use a database connection.
I believe that a proper caching technique with suitable attributes (a long expiration time) would be better than either static pages or retrieving everything from MySQL every time.
Static content is usually a good thing if you have a lot of traffic, but 5-6 queries a second is not hard for the database at all, so with your current load it doesn't matter.
You can spread the static files across different directories by file name and set up rewrite rules in your web server (mod_rewrite on Apache, basic location matching with regexps on nginx, and similar on other web servers). That way you won't even have to invoke the PHP interpreter (a sketch follows).
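A sketch of the Apache variant, in mod_rewrite syntax (the URL scheme and cache layout are assumptions): if a pre-generated file exists, it is served directly and PHP is never invoked; otherwise the request falls through to a PHP front controller.

# Serve the static copy if one exists for this page ID.
RewriteEngine On
RewriteCond %{DOCUMENT_ROOT}/cache/pages/$1.html -f
RewriteRule ^page/([0-9]+)$ /cache/pages/$1.html [L]
# Otherwise generate the page dynamically.
RewriteRule ^page/([0-9]+)$ /page.php?id=$1 [L]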
A database and proper caching. 200,000 pages times, what, 5 KB? That's about 1 GB, easy to keep in RAM. Besides, 5-6 queries per second is easy for a database. Program first, then benchmark.
// insert quip about premature optimisation
