Domain lookup script execution time - PHP

I need to look up domain names from an XML file and then loop through each domain to see whether it exists or not. I'm using the two approaches below:
1. fsockopen()
2. checkdnsrr()
The XML file contains around 120 records, and I'm using AJAX to get the results.
Results (averages):
1. Approach 1, localhost: 13-14 s
2. Approach 1, live server: 25-30 s
3. Approach 2, localhost: 6-8 s
4. Approach 2, live server: 19-22 s
Why the difference between localhost and the live server? In both cases I am testing from a machine with a 2 Mbps connection.
I would also like to show the availability of each domain entry as soon as it is scanned, rather than dumping all the results at once when the AJAX call returns. How am I supposed to achieve this?
Any help is appreciated.
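For reference, a minimal sketch of approach 2, assuming the XML file holds the names in <domain> elements (the element and file names are assumptions):
<?php
// Sketch of approach 2: one checkdnsrr() call per domain.
// 'domains.xml' and the <domain> element name are assumptions.
$xml = simplexml_load_file('domains.xml');
foreach ($xml->domain as $domain) {
    $name  = trim((string) $domain);
    // An existing NS record is a reasonable proxy for "registered".
    $taken = checkdnsrr($name, 'NS');
    echo $name . ': ' . ($taken ? 'taken' : 'possibly available') . "\n";
}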

First of all, queries on localhost might be faster because the DNS results are already cached.
You should do these tests on a machine with a clean DNS cache, but it's always tricky to clear DNS cache entries. Your browser may cache some results too (see DNS Flusher).
About the AJAX requests: what you are looking for is asynchronous requests.
AJAX works in both modes:
With synchronous calls, the script waits/hangs for each response before going on, so it's slower, but sequential.
With asynchronous calls, the script makes the call and moves on; the response might arrive or not, and the script continues anyway. The responses are handled as they arrive, possibly not in the order you made the calls.
Check out http://javascript.about.com/od/ajax/a/ajaxasyn.htm
In jQuery, you have an async: true parameter to achieve this.
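To show each domain's availability as soon as it is scanned, one option is to check a single domain per request and fire one asynchronous request per domain, rendering each answer as it comes back. A hedged sketch of such an endpoint (the file name check.php and the parameter name are assumptions):
<?php
// check.php -- checks exactly one domain per request, so the client
// can fire ~120 asynchronous requests and paint each result as it
// returns. The file and parameter names are illustrative assumptions.
$domain = isset($_GET['domain']) ? trim($_GET['domain']) : '';
if ($domain === '') {
    header('HTTP/1.0 400 Bad Request');
    exit('missing domain');
}
header('Content-Type: application/json');
echo json_encode(array(
    'domain' => $domain,
    'taken'  => checkdnsrr($domain, 'NS'),
));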
Good luck with your project.


Very bad TTFB time [duplicate]

I have a query that gets a list of users from a table, sorted by the time each row was created. I got the following timing diagram from the Chrome developer tools.
You can see that the TTFB (time to first byte) is too high.
I am not sure whether it is because of the SQL sort. If that is the reason, how can I reduce this time?
I have seen blogs saying that TTFB should be low (< 1 s), but for me it shows > 1 s. Is it because of my query or something else?
I am using Angular. Should I sort the table in Angular instead of SQL? (Many posts say that shouldn't be the issue.)
What I want to know is how I can reduce the TTFB. I am actually new to this; it is a task given to me by my team members. I have read many posts but was not able to understand them properly. What is TTFB? Is it the time taken by the server?
The TTFB is not the time to the first byte of the body of the response (i.e., the useful data, such as JSON, XML, etc.), but rather the time to the first byte of the response received from the server. That byte is the start of the response headers.
For example, if the server sends the headers before doing the hard work (like heavy SQL), you will get a very low TTFB, but it isn't "true".
In your case, the TTFB represents the time you spend processing data on the server.
To reduce the TTFB, you need to do the server-side work faster.
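A simple first step to confirm where the time goes is to time the query on the server itself; a minimal sketch (credentials, table, and column names are assumptions):
<?php
// Rough check of whether the SQL sort is the bottleneck: time the
// query and log the result. All names here are illustrative assumptions.
$db = mysqli_connect('127.0.0.1', 'user', 'pass', 'mydb');
$t0 = microtime(true);
$result  = mysqli_query($db, 'SELECT * FROM users ORDER BY created_at');
$elapsed = microtime(true) - $t0;
error_log(sprintf('user list query took %.3f s', $elapsed));
If the logged time is close to the observed TTFB, the query (for example, a missing index on the sort column) is the place to optimize.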
I have met the same problem. My project runs on a local server, and I checked my PHP code:
$db = mysqli_connect('localhost', 'root', 'root', 'smart');
I was using localhost to connect to my local database. That may be the cause of the problem you're describing: on some systems, localhost is resolved as the IPv6 address ::1 first, and the fallback to IPv4 adds a delay. You can modify your HOSTS file by adding the line
127.0.0.1 localhost
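Alternatively, connect by IP address directly, which skips the hostname lookup altogether; a one-line variant of the code above (same illustrative credentials):
$db = mysqli_connect('127.0.0.1', 'root', 'root', 'smart');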
TTFB is something that happens behind the scenes; your browser knows nothing about what happens back there.
You need to look into what queries are being run and how the website connects to the server.
This article might help you understand TTFB, but otherwise you need to dig deeper into your application.
If you are using PHP, try using <?php flush(); ?> after </head> and before </body>, or after whatever section you want to output quickly (like the header or the content). It outputs the code generated so far without waiting for PHP to finish. Don't use this function all the time, or the speed increase won't be noticeable.
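A hedged sketch of where such a call might go in a page template (if output buffering is active, ob_flush() has to run before flush() for the bytes to actually leave the server):
<!DOCTYPE html>
<html>
<head>
    <title>Report</title>
</head>
<?php
// Push the head out early so the browser can start fetching assets.
if (ob_get_level() > 0) { ob_flush(); }
flush();
?>
<body>
<?php
// The slow, query-heavy part of the page would run here.
?>
</body>
</html>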
More info
I would suggest you read this article and focus on how to optimize the overall response to the user's request (whether a page, a search result, etc.).
A good argument for this is the example they give about using gzip to compress the page: even though the TTFB is faster when you do not compress, the overall experience for the user is worse, because it takes longer to download content that is not zipped.
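For reference, a minimal way to try this trade-off in PHP is to compress the output buffer and compare timings (a sketch; ob_gzhandler requires the zlib extension and falls back to plain output if the browser does not advertise gzip support):
<?php
// Compress the whole page body before it is sent.
ob_start('ob_gzhandler');
?>
<!DOCTYPE html>
<html><body>
<p>large page content here</p>
</body></html>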

DB Connection on Ajax

I have a chat on my website, and it runs on AJAX calls. Given that the PHP script is being run 2-3 times per second, is it a bad idea to connect to a database and pull/insert data? I am wondering if it will slow down my PHP significantly, or not change much at all.
Sorry, I can't comment yet, so I don't know if this answers your question.
Of course this will cause a lot of traffic on your database. Depending on the web server, that might not be a big deal. But if a client's physical computer is from the year 2000, their tab will lag because the browser is sending requests to your database all the time and waiting for the answers.
Still, I think this is the easiest method to get the live data you need for your chat. I would, however, suggest running the AJAX request only every 2 seconds or so. I don't know your exact purpose, but for a normal chat (not real-time data exchange) this will be enough.
By the way, I am not sure how you are initializing your AJAX request, but I would suggest doing it with jQuery:
$(document).ready(function(){
    call_php_ajax();
});
function call_php_ajax(){
    $("#div_toinsert").load("ajax.php");
    setTimeout(call_php_ajax, 3000); // 3 seconds
}
And in your MySQL query file (ajax.php) you perform your queries.
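For completeness, a minimal sketch of what ajax.php might contain (connection details, table, and column names are assumptions):
<?php
// ajax.php -- return the latest chat messages as an HTML fragment.
// Credentials, table, and column names are illustrative assumptions.
$db = mysqli_connect('localhost', 'user', 'pass', 'chat');
$result = mysqli_query($db, 'SELECT user, body FROM messages ORDER BY id DESC LIMIT 20');
while ($row = mysqli_fetch_assoc($result)) {
    echo '<p><b>' . htmlspecialchars($row['user']) . ':</b> '
       . htmlspecialchars($row['body']) . '</p>';
}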

How to minimize MySQL connections and AJAX calls in a jQuery game (ping function)?

I'm writing a browser-based game. The client is in jQuery and the server is in PHP/MySQL.
It's a turn-based game, so most of the client-server communication is a call-and-respond pattern (jQuery-PHP). These calls happen whenever the user clicks a button or any other active element. After a call from jQuery, the PHP controller creates some classes, connects to the database, and returns a response. That communication works well for me and doesn't cause problems with the number of connections etc. (it's similar to the standard traffic of using the website).
Unfortunately, I also need some 'calls' from the server side. An example is the list of active games to join: the client must be notified any time the game list changes, with a maximum delay of no more than 1 second.
For now I do it by sending a 'ping' call from the client (jQuery), and the server answers with "nothing" most of the time, or "game2 created" etc. But those pings will be sent every second by each of the players, and for each of them the server will create classes and connect to MySQL, which results in a "Database connection error".
Is there any way to minimize the MySQL connections and/or AJAX calls?
I use a standard web server and don't have a root account.
Start with this:
But those pings will be sent every second by each of the players
Instead of both players calling the server every second (which is actually 2 calls, with the number going up for every player connected), you can optimize by watching the idle time, i.e. how long nothing has happened: if nothing has been returned for 2 consecutive calls, increase the call delay to 2 seconds, then to 4 seconds, etc. (just play with setInterval and make it run continuously).
This gives your app some room to breathe (I used this in a game of my own).
The next thing to look at is the calling policy. Instead of calling the server on every player command, store the player's commands in a JS array and send the whole array off every X seconds; if there are no commands, make no AJAX call. Yes, you'll get a delay, but think of many users connected to a possibly underpowered server...
You can also use Comet techniques if you want to push things further.
Having said that, this may be a possible duplicate, as mentioned by eggyal.
About the ping-and-response (client pings server) scenario:
1) Have you considered writing temporary files (e.g. a Smarty compiled file with caching time X)? Once player Y has taken their turn, remove the file (or write something into it). Each player makes an AJAX request with a game_id (or anything unique) that just checks for the existence of that compiled file. This will save you many MySQL calls; you only hit MySQL when the caching time of the file has expired. A sketch follows below.
2) Ugly: try whether MySQL persistent connections help (I am not sure).
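A hedged sketch of suggestion 1, answering the ping from a flag file so that no database connection is opened unless something changed (the file naming scheme and the game_id parameter are assumptions):
<?php
// ping.php -- answer the 1-second ping from a flag file; MySQL is
// only touched when the flag says the game list actually changed.
// File name scheme and parameter name are illustrative assumptions.
$gameId = isset($_GET['game_id']) ? (int) $_GET['game_id'] : 0;
$flag   = sys_get_temp_dir() . '/game_' . $gameId . '.flag';
if (is_file($flag)) {
    unlink($flag);   // consume the event
    echo 'changed';  // client now does a full, MySQL-backed refresh
} else {
    echo 'nothing';  // no database connection was opened at all
}
Whatever code changes the game state simply calls touch() on the same flag file to signal the waiting clients.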

How can I create a progress bar attached to a server-side process?

We just ran into a problem with our cloud host: they've changed their Apache settings to force a much shorter page timeout, and now during certain processes (report creation, etc.) that take more than 15 seconds (which the client is fine with; we're processing huge amounts of data) we get an error:
The proxy server received an invalid response from an upstream server.
The proxy server could not handle the request POST /administrator/index.php.
Reason: Error reading from remote server
I have confirmed that our code is still running correctly in the background, and double-checked with the host that this is really just a timeout. Their suggestion was to create a progress bar that is associated with the backend code; that way Apache knows something is still going on and won't time out.
I've done progress bars associated with page load events (i.e. when all images are loaded, etc.) but have no idea how to go about creating a progress bar associated with backend code. This is a Joomla site, coded in MVC PHP, and the code that's causing the issue is part of the model; the various pieces that could be involved are all doing humongous queries. The tables are indexed correctly and the queries are optimized; the issue is not how to make the processes take less time, because on a cloud server the timeout limit could be changed to 5 seconds tomorrow without any warning. What I need is someone to point me in the right direction of how to create a progress bar that is actually associated with the function being run in the model.
Any ideas? I'm a complete beginner as far as this goes.
The easiest way I can think of is to use a two-step process:
1. Have the model write out events to a text file or something when it gets to a critical point.
2. Use some AJAX method to have the page regularly check that file for updates, and update the progress bar accordingly.
Whatever the background process does, it should update something like a file or database entry with a percentage completed every X seconds or at set places in its flow. Then you can call another script from JavaScript every X seconds, and it returns the percentage completed via the database record:
updateRecord(0);
readLargeFile();
updateRecord(25);
encodeLargeFile();
updateRecord(50);
writeLargeFile();
updateRecord(75);
celebrate();
updateRecord(100);
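A hedged sketch of what updateRecord() and a matching polling script could look like, using a plain text file as the shared state (the file path is an assumption, and one file per job would be needed if several reports can run at once):
<?php
// Writer side: called from the model at set points in its flow.
function updateRecord($percent)
{
    // The path is an illustrative assumption.
    file_put_contents('/tmp/report_progress.txt', (string) (int) $percent);
}

// Reader side: a separate tiny script, requested from JavaScript on
// an interval, echoes the stored percentage back to the progress bar.
function readRecord()
{
    $raw = @file_get_contents('/tmp/report_progress.txt');
    return ($raw === false) ? 0 : (int) $raw;
}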

cURL multi hanging/ignoring timeout

I'm using a 'rolling' cURL multi implementation (like this SO post, based on this cURL code). It works fine for processing thousands of URLs using up to 100 requests at the same time, with 5 instances of the script running as daemons (yeah, I know, this should be written in C or something).
Here's the problem: after processing ~200,000 URLs (across the 5 instances), curl_multi_exec() seems to break for all instances of the script. I've tried shutting the scripts down and then restarting, and the same thing happens (not after another 200,000 URLs, but right on restart): the script hangs calling curl_multi_exec().
I put the script into 'single' mode, processing one regular cURL handle at a time, and that works fine (but it's not quite the speed I need). My logging leads me to suspect that it may have hit a patch of slow/problematic connections (since every so often it seems to process one URL and then hang again), but that would mean my CURLOPT_TIMEOUT is being ignored for the individual handles. Or maybe it's just something about running that many requests through cURL.
Anyone heard of anything like this?
Sample code (again based on this):
// some logging shows it hangs right here, only looping a time or two,
// so the hang seems to be in the curl call
while (($execrun = curl_multi_exec($master, $running)) == CURLM_CALL_MULTI_PERFORM);
// code to check for errors or process whatever was returned
I have CURLOPT_TIMEOUT set to 120, but in the cases where curl_multi_exec() finally returns some data, it's after 10 minutes of waiting.
I still have a bunch of testing and checking to do, but thought this might ring a bell with someone.
After much testing, I believe I've found what is causing this particular problem. I'm not saying the other answer is incorrect, just that in this case it is not the issue I am having.
From what I can tell, curl_multi_exec() does not return until all DNS lookups have resolved (in failure or success). If there are a bunch of URLs with bad domains, curl_multi_exec() doesn't return for at least:
(time it takes to get a resolve error) * (number of URLs with bad domains)
Here's someone else who has discovered this:
Just a note on the asynchronous nature of cURL’s multi functions: the DNS lookups are not (as far as I know today) asynchronous. So if one DNS lookup of your group fails, everything in the list of URLs after that fails also. We actually update our hosts.conf (I think?) file on our server daily in order to get around this. It gets the IP addresses there instead of looking them up. I believe it’s being worked on, but not sure if it’s changed in cURL yet.
Also, testing shows that cURL (at least my version) does follow the CURLOPT_CONNECTTIMEOUT setting. Of course, the first step of a multi cycle may still take a long time, since cURL waits for every URL to resolve or time out.
I think your problem is related to:
(62) CURLOPT_TIMEOUT does not work properly with the regular multi and multi_socket interfaces. The work-around for apps is to simply remove the easy handle once the time is up.
See also: http://curl.haxx.se/bug/view.cgi?id=2501457
If that is the case, you should watch your cURL handles for timeouts and remove them from the multi pool.
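A hedged sketch of that work-around: remember when each easy handle was added, and remove it from the multi handle yourself once your own deadline passes (the URL list is an illustrative assumption; a real implementation would also reap completed handles via curl_multi_info_read()):
<?php
$urls    = array('http://example.com/', 'http://example.org/');
$timeout = 120; // seconds, the same value the asker uses
$master  = curl_multi_init();
$pool    = array();

foreach ($urls as $url) {
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 10);
    curl_multi_add_handle($master, $ch);
    $pool[] = array('handle' => $ch, 'start' => time());
}

do {
    curl_multi_exec($master, $running);
    curl_multi_select($master, 1.0);

    // Enforce the timeout by hand: drop handles that have overstayed.
    foreach ($pool as $i => $entry) {
        if (time() - $entry['start'] > $timeout) {
            curl_multi_remove_handle($master, $entry['handle']);
            curl_close($entry['handle']);
            unset($pool[$i]);
        }
    }
} while ($running > 0);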
