Cookie slows down my website on first start: is it always like this? - php

As soon as I implemented a single cookie on my website, the entire site became very slow on first load.
I guess it's because it is fetching the cookie information, but is this always the case?
There is no heavy code behind fetching the cookie, just plain simple PHP like this:
$arr = $_COOKIE['name']; // array with a maximum of 10 values
one for loop, and nothing else!
Should I be worried?
By "slows down", I mean loading for 3-5 seconds.
Thanks

This is always the case the first time. The second time, the browser gets assets from its internal cache and temporary files.

Retrieving a cookie shouldn't slow your site. Something else is going on. You'll need to do some profiling.
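If it helps, here is a minimal timing sketch (not a full profiler), assuming the cookie name and loop from the question; wrapping the suspect block in microtime() calls will show whether the 3-5 seconds are really spent there:
// Time just the cookie block in isolation.
// 'name' and the loop body are placeholders from the question.
$t0 = microtime(true);

$arr = isset($_COOKIE['name']) ? $_COOKIE['name'] : null;
if (is_array($arr)) {
    foreach ($arr as $value) {
        // ... whatever the loop does with each value ...
    }
}

error_log('cookie block took ' . round(microtime(true) - $t0, 4) . 's');
If that logs a few milliseconds, the slowdown is elsewhere (DNS, server config, another query) and the cookie is innocent.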

Related

Very bad TTFB time [duplicate]

I have a query which gets a list of users from a table, sorted by creation time. I got the following timing diagram from the Chrome developer tools.
You can see that the TTFB (time to first byte) is too high.
I am not sure whether it is because of the SQL sort. If that is the reason, how can I reduce this time?
I saw blogs which say that TTFB should be low (< 1 sec), but for me it shows > 1 sec. Is it because of my query or something else?
I am using Angular. Should I use Angular to sort the table instead of SQL sort? (Many posts say that shouldn't be the issue.)
I am actually new to this; this is a task given to me by my team members. I am not sure how I can reduce the TTFB. I saw many posts, but was not able to understand them properly. What exactly is TTFB? Is it the time taken by the server?
The TTFB is not the time to the first byte of the body of the response (i.e., the useful data, such as JSON, XML, etc.), but rather the time to the first byte of the response received from the server. That byte is the start of the response headers.
For example, if the server sends the headers before doing the hard work (like heavy SQL), you will get a very low TTFB, but it isn't "true".
In your case, TTFB represents the time you spend processing data on the server.
To reduce the TTFB, you need to do the server-side work faster.
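A hedged sketch of where to look first: time the query on its own; if it dominates, an index on the sort column usually helps far more than sorting in Angular. The table and column names (users, created_at) and the $db mysqli connection are assumptions for illustration:
// Measure just the query, assuming an existing mysqli connection $db.
$t0 = microtime(true);
$result = $db->query('SELECT * FROM users ORDER BY created_at DESC LIMIT 50');
$rows = $result->fetch_all(MYSQLI_ASSOC);
error_log('query took ' . round(microtime(true) - $t0, 4) . 's');

// If the query is the bottleneck, let the database sort via an index
// instead of scanning and sorting every row on each request:
//   ALTER TABLE users ADD INDEX idx_created_at (created_at);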
I have met the same problem. My project runs on a local server. I checked my PHP code:
$db = mysqli_connect('localhost', 'root', 'root', 'smart');
I use localhost to connect to my local database. That may be the cause of the problem you're describing: resolving the hostname can add a noticeable delay. You can modify your HOSTS file and add the line
127.0.0.1 localhost
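An alternative sketch of the same idea: skip hostname resolution entirely by connecting through the loopback IP (credentials copied from the snippet above). Note that on some setups 'localhost' means a Unix socket rather than TCP, so measure both:
// Connect by IP so no hostname lookup is involved.
$db = mysqli_connect('127.0.0.1', 'root', 'root', 'smart');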
TTFB is determined by what happens behind the scenes on the server; your browser knows nothing about it.
You need to look into what queries are being run and how the website connects to the server.
This article might help understand TTFB, but otherwise you need to dig deeper into your application.
If you are using PHP, try calling <?php flush(); ?> after </head> and before </body>, or after whatever section you want to output quickly (like the header or content). It will send the output generated so far without waiting for the script to end. Don't use this function everywhere, or the speed increase won't be noticeable.
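A minimal sketch of that placement (the surrounding HTML is only an illustration):
<!DOCTYPE html>
<html>
<head>
    <title>My page</title>
    <link rel="stylesheet" href="style.css">
</head>
<?php flush(); // send the <head> now so the browser can start fetching CSS ?>
<body>
<?php
// ... slow server-side work happens here ...
?>
</body>
</html>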
I would suggest you read this article and focus on how to optimize the overall response to the user's request (a page, a search result, etc.).
A good argument for this is the example they give about using gzip to compress the page: even though the TTFB is faster when you do not compress, the overall experience for the user is worse because it takes longer to download content that is not zipped.

PHP - Using Sessions to temporarily save variables for ajax application

I wonder if the following would be a good idea or rather counterproductive performance-wise:
An Ajax application, for example a page browser, needs some language and config values which are stored in the database. So each time the user uses this app, the Ajax script runs the MySQL query to get the variables again and again. For a page browser there might be 10 or more requests (back and forward, back, forward, and so on), i.e. 10 database selects, while only one is actually needed.
My idea was to save the config vars in a session array the first time the Ajax app is requested. If the session array exists, the MySQL query isn't run again.
If the user calls another regular page, this session array is deleted again.
Now I'm not really sure what would consume more server resources: using sessions in the way described above to save the vars temporarily, or just running a MySQL query to get the vars each time the user clicks in the Ajax app.
Thanks in advance, Jayden
If you are working with a massive amount of data, you could also consider using cookies instead of sessions to spare server resources, since cookies are stored in the user's browser.
I'd bet sessions would be more efficient, but the best way is to test and measure the different execution times.
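A minimal sketch of the session approach described in the question, assuming a hypothetical fetch_config_from_db() helper for the query:
session_start();

// Run the config query only once per session.
if (!isset($_SESSION['config'])) {
    $_SESSION['config'] = fetch_config_from_db(); // hypothetical helper
}
$config = $_SESSION['config'];

// On a regular (non-Ajax) page, drop the cached copy again:
// unset($_SESSION['config']);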

Why does my basic php hit counter count too many hits?

I am using a simple bit of PHP code in my header file which increments a counter in my SQL database by 1 once per PHP session. I've tested this and it works fine.
However, when I leave it for a day, the counter goes up by far more than I believe it should; compared to the pageview counter in my Google Analytics, it is far too high.
What could be happening and how could I stop this?
Google Analytics counts visits very differently from a simple session-based counter. I can't tell you exactly how it counts, because it is closed source on that aspect, but there are definitely cookies, sessions and JavaScript involved. In particular, bots and crawlers usually don't execute JavaScript and don't keep session cookies, so GA never sees them while your PHP counter counts each of their requests as a new session.
If you want my opinion: I built my own stats system once and it was hell, with all the robot detection, trends and false visits. I switched to GA and it was worse, because the client then started complaining that the numbers weren't the same on both sites.
IMO? Don't use both. Make your own or use GA only, but not the two; you'll probably NEVER hit the same numbers.
Good luck
What do you mean by "once per session"?
You need to call session_start() and then set a variable to ensure you are only counting unique sessions:
session_start();
if (!isset($_SESSION['started'])) {
    doHitCounter(); // your existing increment function
    $_SESSION['started'] = true;
}

PHP - Display status of loop

I have a PHP script something like:
$i = 0;
for (; $i < 500; ++$i) {
    // Do some operation with files numbered 0 to 499
}
The thing is, the script works and displays the end results, but the operation takes a while, and watching a blank screen can be frustrating. I was wondering if there is some way I can continuously update the page on the client's end, detailing which file is currently being worked on. That is, can I display and continuously update the current value of $i?
The Solution
Thanks everyone! The output buffering is working as suggested. However, David has offered valuable insight and am considering that approach as well.
You can buffer and control the output from the PHP script.
However, you may want to consider the scalability of this design. In general, heavy processing shouldn't be done online. Your particular case may be an edge case in that the wait is acceptable, but consider something like this as an alternative for an improved user experience:
1. The user kicks off a process. This can be as simple as setting a flag on a record in the database or inserting some "to be processed" records into the database.
2. The user is immediately directed to a page indicating that the process has been queued.
3. An offline process (either kicked off by the PHP script on the server or scheduled to run regularly) checks the data and does the heavy processing.
4. In the meantime, the user can refresh the page (manually, by navigating elsewhere and coming back to check, or even via an AJAX polling mechanism) to check the status of the processing. In this case, it sounds like you'd have several hundred records in a database table queued for processing. As each one finishes, it can be flagged as done. The page can just check how many are left, which one is current, etc. from the data.
5. When the processing is completed, the page shows the result.
In general this is a better user experience because it doesn't force the user to wait. The user can navigate around the site and check back on progress as desired. Additionally, this approach scales better. If your heavy processing is done directly on the page, what happens when you have many users or the data processing load increases? Will the page start to time out? Will users have to wait longer? By making the process happen outside of the scope of the website you can offload it to better hardware if needed, ensure that records are processed in serial/parallel as business rules demand (avoid race conditions), save processing for off-peak hours, etc.
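A hedged sketch of that design; the jobs table, its columns, and the connection details are assumptions for illustration:
// enqueue.php - the page only queues the work (step 1).
$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');
$stmt = $pdo->prepare('INSERT INTO jobs (file_no, done) VALUES (?, 0)');
for ($i = 0; $i < 500; $i++) {
    $stmt->execute([$i]);
}
echo 'Your files are queued for processing.';

// status.php - polled by the page to report progress (step 4).
$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');
$left = $pdo->query('SELECT COUNT(*) FROM jobs WHERE done = 0')->fetchColumn();
echo json_encode(['remaining' => (int) $left]);
A separate worker (cron job or long-running script) would pick up rows with done = 0, process the corresponding file, and flag the row as done.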
Check out PHP's Output Buffering.
Try to use:
flush();
http://php.net/manual/ru/function.flush.php
Try the flush() function. Calling this function forces PHP to send whatever output it has so far to the client, instead of waiting for the script to end.
However, some web servers will only send the output once the entire page is done being built, so calling flush() would have no effect in this case.
Also, browsers themselves buffer incoming data, so you may run into problems there. For example, certain versions of IE won't start displaying the page until 256 bytes have been received.
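Putting those pieces together, a minimal sketch of streaming progress from the loop in the question; whether it works depends on your server's buffering, as noted above:
// Drop PHP's own output buffers so each echo can leave immediately.
while (ob_get_level() > 0) {
    ob_end_flush();
}

// Pad early output past typical browser buffering thresholds.
echo str_repeat(' ', 1024);

for ($i = 0; $i < 500; ++$i) {
    // ... do some operation with file $i ...
    echo "Processing file $i of 500<br>\n";
    flush(); // push the progress line to the client now
}
echo 'Done.';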

PHP session handling when the same client requests the same script multiple times at once

So here's my test setup:
session_start();
if (!isset($_SESSION['bahhhh'])) {
    $_SESSION['bahhhh'] = 0;
}
$_SESSION['bahhhh']++;
sleep(5);
die('a' . $_SESSION['bahhhh']);
What I expect to happen is that each time I hit the page, it returns a different number.
But if I use multiple tabs and refresh each of them within 5 seconds of the first, they all return the same number. (This isn't client-side caching, as the 5-second delay is still evident.)
Why is it doing this, and how do I fix this?
It seems to have the same strange caching issue with file and database data as well, and is preventing me from building a working mutex to prevent running the same code more than once at a time.
Here's another, simpler example:
echo microtime();
sleep(10);
Run this 3 times, each 2 seconds apart, and all three return the same microtime value. WTF?
Session data, by default, is not saved until the request terminates, so your increment is not saved while the script sleeps. If you want to save the session prematurely, check out session_write_close().
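A sketch of the test script with that change; the increment is persisted (and the session lock released) before the sleep:
session_start();
if (!isset($_SESSION['bahhhh'])) {
    $_SESSION['bahhhh'] = 0;
}
$_SESSION['bahhhh']++;
session_write_close(); // write the session now instead of at script end
sleep(5);
die('a' . $_SESSION['bahhhh']);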
I would have the script itself append something to a log file to verify the script is actually getting executed as many times as you think. Maybe you have some software somewhere that is intercepting the request and returning a cached response.
If it weren't for your comment that this also happens with microtime(), I would have given an explanation of how PHP manages concurrency with sessions, and when it might not.
Apparently this is some bug in my browser itself. Opera behaves this way while Internet Explorer does not.
I did initial testing in IE with the same results but with more complex code. Apparently that complex code had an error that triggered the misbehavior in IE, and this simplified code does not.
Sorry to bother everyone.
