I am using the gethostbyname function to get the IP address for our domain names.
Whether it runs under Apache or from the PHP command line, it takes 5.0695459842682 seconds to return a complete result.
<?php
$domain_name = $argv[1];

$stime = microtime(true);
$ip = gethostbyname($domain_name);
$etime = microtime(true);

$ttime = $etime - $stime;
echo "Total time for gethostbyname : $ttime\n";
echo $ip . "\n";
?>
When I run the above script from the PHP command line, passing google.com as the domain, it returns the result below.
Total time for gethostbyname : 5.0695459842682
216.58.203.142
Can anyone please help me reduce this time to less than 1 second?
Regards,
Vignesh Kumar K
Maybe the problem is that your system is trying to connect to the DNS server over IPv6, but your network is not configured properly for that to succeed? If so, you can try disabling IPv6.
On Debian you can do that as described here: https://wiki.debian.org/DebianIPv6#How_to_turn_off_IPv6
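If you want to confirm that theory before changing anything system-wide, one quick check is to time an A-record-only lookup next to gethostbyname(); if the resolver is stalling on IPv6 (or on one broken nameserver), the two timings usually diverge sharply. A minimal diagnostic sketch, with the hostname as a placeholder:

<?php
// Compare a plain gethostbyname() call with an A-record-only query.
$host = isset($argv[1]) ? $argv[1] : 'google.com';

$start = microtime(true);
$ip = gethostbyname($host);
printf("gethostbyname        : %.4f s (%s)\n", microtime(true) - $start, $ip);

$start = microtime(true);
$records = dns_get_record($host, DNS_A) ?: [];
printf("dns_get_record DNS_A : %.4f s (%d records)\n", microtime(true) - $start, count($records));
?>

A delay of roughly 5 seconds often points to a single resolver query timing out (the default timeout in /etc/resolv.conf is 5 seconds) before a retry succeeds.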
It depends on the internet speed you are using. When I run the code I get less than 2 seconds, as my connection speed is nearly 90 Mbps.
Hence you can't do anything to optimize the code; rather, you need to increase your internet speed.
Related
I have an odd situation. I am using the code below:
$currentTime = date("YmdHis");
echo 'Current Time = '.$currentTime;
echo '<br>';
When running this, it is sometimes around 7 minutes slow. Example:
Image - Actual Time 11:35, PHP Current Time 11:27
But then sometimes it runs perfectly. Example:
Image - Actual Time 11:37, PHP Current Time 11:37
Is there any reason why this happens? I am basing calculations on the current time, which obviously causes problems when it is slow!
I'm using zend_mail from zend-framework2 in my project to send a number of emails in a loop, but sometimes it takes longer than usual to send an email.
After doing some research, I found that the delay occurs in the stream_socket_client function.
I tried setting this function's timeout to an acceptable value, but it seems to ignore that setting.
I also tried using STREAM_CLIENT_PERSISTENT to limit the number of opened sockets, but with no luck.
Average stream_socket_client times are about 0.03 seconds, but occasionally it takes from 5 seconds up to even 40 seconds. Any value above 0.5 seconds is unacceptable for me. I'm out of ideas as to what could cause this issue.
Current setup:
$start = microtime(true);
$this->socket = #stream_socket_client($remote, $errorNum, $errorStr, self::TIMEOUT_CONNECTION, STREAM_CLIENT_CONNECT|STREAM_CLIENT_PERSISTENT);
echo 'Stream socket: '.(microtime(true) - $start);
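One thing worth ruling out (just a guess from the symptoms: stalls of 5 to 40 seconds look more like name resolution or a busy mail server than a slow TCP handshake) is to split the DNS lookup from the connect, so you can see which step carries the delay. A rough sketch; $smtpHost and $smtpPort are placeholders for whatever $remote is built from:

<?php
// Hypothetical diagnostic: time DNS resolution and the TCP connect separately.
$smtpHost = 'smtp.example.com';
$smtpPort = 25;

$start = microtime(true);
$ip = gethostbyname($smtpHost);                      // DNS step
$dnsTime = microtime(true) - $start;

$start = microtime(true);
$socket = @stream_socket_client(                     // TCP connect step, IP only
    "tcp://{$ip}:{$smtpPort}",
    $errorNum,
    $errorStr,
    5,                                               // connect timeout in seconds
    STREAM_CLIENT_CONNECT
);
$connectTime = microtime(true) - $start;

printf("DNS: %.3f s, connect: %.3f s\n", $dnsTime, $connectTime);
?>

If the DNS step carries the delay, the connect timeout generally won't help, since the stall happens before the TCP handshake ever starts.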
The question is simple: I would like to check a database in order to serve customised content to a site visitor, but fail over and serve a generic page if this lookup takes more than 800 ms to execute. (The target time for the server response is 1000 ms.)
I've seen the set_time_limit function; however, it takes an integer number of seconds as its argument.
My question: is there something similar that can be used with values of less than 1 second?
I'm looking for something like:
void set_time_limit_ms ( int $milliseconds )
set_time_limit_ms (800)
It doesn't exist, but you could emulate it with a tick function:
declare(ticks=1); // or a higher tick count if checking after every statement is too costly
$start = microtime(true);
register_tick_function(function () use ($start) {
    (microtime(true) - $start < 0.8) or die();
});
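Applied to the original scenario (serve a generic page if the customised lookup takes more than 800 ms), the same idea could look roughly like this; fetch_customised_content() and show_generic_page() are placeholders for your own code:

<?php
// Sketch: tick-based watchdog around the customised-content lookup.
declare(ticks=1);

$start = microtime(true);
register_tick_function(function () use ($start) {
    if (microtime(true) - $start >= 0.8) {
        show_generic_page();   // failover content (placeholder)
        exit;
    }
});

echo fetch_customised_content();  // the customised lookup (placeholder)
?>

Note the caveat spelled out in the next answer, though: ticks only fire between PHP statements, so a single blocking call (one slow database query, for example) will not be interrupted until it returns.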
You will not be able to use this function to prevent a query from running longer than you expected. It only measures the actual script execution time. Here is a bit from the manual:
The set_time_limit() function and the configuration directive
max_execution_time only affect the execution time of the script
itself. Any time spent on activity that happens outside the execution
of the script such as system calls using system(), stream operations,
database queries, etc. is not included when determining the maximum
time that the script has been running. This is not true on Windows
where the measured time is real.
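Because of that, a hard time budget for the database call usually has to be enforced at the driver or server level rather than via set_time_limit(). The exact option depends on your driver; as one hedged example, PDO exposes PDO::ATTR_TIMEOUT (whose meaning is driver-specific), and mysqli has MYSQLI_OPT_CONNECT_TIMEOUT. A sketch with placeholder credentials:

<?php
// Sketch: push the time limit down to the database driver.
// DSN and credentials are placeholders; PDO::ATTR_TIMEOUT is interpreted
// differently by each driver, so test what it actually limits in your setup.
$pdo = new PDO('mysql:host=127.0.0.1;dbname=app', 'user', 'secret', [
    PDO::ATTR_TIMEOUT => 1,                        // seconds, driver-specific
    PDO::ATTR_ERRMODE => PDO::ERRMODE_EXCEPTION,   // surface failures as exceptions
]);

try {
    $row = $pdo->query('SELECT content FROM pages WHERE id = 1')->fetch();
} catch (PDOException $e) {
    $row = false;   // timed out or failed: fall back to the generic page
}
?>

Note that these options take whole seconds, so they cannot express an 800 ms budget exactly; combining a 1-second driver timeout with the tick-based check above gets reasonably close.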
Yes there is; I often use microtime. See
http://php.net/manual/en/function.microtime.php
<?php
$time_start = microtime(true);
// Sleep for a while
usleep(100);
$time_end = microtime(true);
$time = $time_end - $time_start;
echo "Did nothing in $time seconds\n";
?>
I'm executing a PHP script that takes about a minute to finish, and although the default time limit is set to 30 seconds, the script continues executing after that limit.
I found out that the limit only affects the time spent inside the script itself, not the time spent in library functions like database queries, etc.
Is there a way to find out how much time is actually spent inside the script? I tried to use getrusage, but it doesn't seem to return the appropriate values for this problem.
Example:
<?php
$time = microtime(TRUE);
sleep(100);
echo 'Time: ', microtime(TRUE) - $time;
?>
The script waits for 100 seconds and does not terminate after the time limit of 30 seconds.
According to the documentation of set_time_limit, the time spent inside the sleep function (100 seconds) is not counted towards the execution time, because it is an external (library) function.
I just want to know how much of the execution time is spent inside my script and how much is spent in library functions.
Use the Xdebug profiler for this.
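For reference, enabling the profiler is an ini-level change. The directive names below assume Xdebug 3 (Xdebug 2 used xdebug.profiler_enable and xdebug.profiler_output_dir instead):

; php.ini sketch for the Xdebug 3 profiler
zend_extension=xdebug
xdebug.mode=profile
xdebug.start_with_request=yes   ; or "trigger" to profile only selected requests
xdebug.output_dir=/tmp          ; cachegrind.out.* files land here

The resulting cachegrind files break the run down per function, including internal PHP functions, which is exactly the "my code vs. library calls" split you are after; open them with a viewer such as QCachegrind or webgrind.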
I'm guessing you want something like this:
<?php
// the whole operation
$time1 = time();
// perform some functions
$s = file_get_contents("somefilename");
$time2 = time();
// perform some library functions
$rs = some_third_party_library_function();
$time3 = time();
// Show results
echo "Overall Time spent: " . ($time3 - $time1) . "s <br>";
echo "Time spent reading file: ". ($time2 - $time1) . "s <br>";
echo "Time spent by Third Party Library: " . ($time3 - $time2) . "s <br>";
?>
You can record the start time before the script's work begins, record the end time after it finishes, and then echo the difference.
<?php
$a = time();
sleep(5);
$b = time();
echo $b - $a;
?>
If you need to measure the execution time of individual functions vs standard or built-in PHP functions, this is best done with a proper debugger and code profiler. Xdebug does exactly that.
I don't know how complicated your script is, but you could simply comment out all of your calls to library functions. If your code needs their return values in order to run properly, replace them with constant values (of the kind those functions would have returned). For example:
$result = mysql_query( ... )
replace with:
// $result = mysql_query( ... )
$result = true; // a resource, boolean or whatever you want
Then run the script and calculate the execution time using Coding-Freak's suggestion.
If you prefer Rio Bautista's approach, you may find some functions that calculate each section's time.
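If it helps, here is roughly what such a helper could look like (purely illustrative; timeCall() is not a built-in):

<?php
// Hypothetical helper: run any callable and report how long it took.
function timeCall(callable $fn, ...$args)
{
    $start  = microtime(true);
    $result = $fn(...$args);

    return array($result, microtime(true) - $start);
}

// Usage: wrap the calls you suspect of being slow.
list($contents, $seconds) = timeCall('file_get_contents', 'somefilename');
printf("file_get_contents took %.4f s\n", $seconds);
?>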
Is it time()?
Is it time().substr(microtime(), 2, 2)?
Is it time().substr(microtime(), 2, 3)?
I'm kind of lost with the following snippet.
function updateClock ( ) {
var timeStamp = <?php echo time().substr(microtime(), 2, 2);?>;
var currentTime = new Date ( );
currentTime.setTime( timeStamp );
...
...
}
My goal is to use the server time and start ticking from there in the client's browser window. The code above either returns the current client computer time or some time in 1973. I guess I'm not getting the right timestamp format for setTime()?
Thanks!
Multiply it by 1000, i.e. time() * 1000.
I tried that, but the web page still shows my local time after I upload js.php (which renders the JavaScript code) to my server. My server's time is about 12 hours different from mine. My guess is: does PHP take the client-side time into account when running time()? I mean, browsers do send the request time to Apache, right?
I copied the time() * 1000 value returned by the page running on my server and pasted it into a local page:
<script type="text/javascript">
var d = new Date();
d.setTime(1233760568000);
document.write(d);
</script>
And it is indeed my local time, hence the guess.
Is there any way to specify a time zone for time()?
Date.setTime expects the number of milliseconds since 1970-01-01. PHP's time function yields the number of seconds since 1970-01-01. Therefore, you can just use
var TimeStamp = <?php echo time()*1000;?>
Due to latency (the browser needs to load the whole page before starting JavaScript), the time will usually drift by a second or two, though.
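Putting that together with the updateClock() idea from the question, a js.php along these lines should seed the clock from the server and then tick locally (a sketch; the "clock" element id is just an example):

<?php
// js.php sketch: seed the client-side clock with the server's time, then tick locally.
header('Content-Type: application/javascript');
?>
var serverNow = <?php echo time() * 1000; ?>;    // PHP seconds -> JS milliseconds

function updateClock() {
    var d = new Date(serverNow);
    document.getElementById('clock').innerHTML = d.toLocaleTimeString();
    serverNow += 1000;                           // advance one second per tick
}

setInterval(updateClock, 1000);
updateClock();

One caveat: an epoch timestamp has no time zone, so Date will render it in the visitor's local zone. If you specifically want the server's wall-clock time displayed, you also need to send the server's UTC offset (for example via date('Z')) and apply it on the client.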
Multiply by 1000. JavaScript expects milliseconds while PHP returns seconds.
Date.setTime() wants milliseconds since the Unix Epoch, and time() returns seconds since then. If absolute precision isn't required (and given your methodology, I don't think it is), just multiply the value you get from time() by 1000.
Edit: beaten twice--D'oh