A few days ago I wrote a PHP script that goes through all my music, reads the ID3 tags for each song, and inserts them into a MySQL database. Here's a snippet for context:
exec ( "find /songs -type f -iname '*mp3'", $song_path );
$number_of_songs = count($song_path);
for($i=0; $i<$number_of_songs; $i++){
//read id3 tags
//store id3 tags into database
}
I changed the PHP directive max_execution_time in apache2/php.ini to get a better understanding of what this directive does.
When I set max_execution_time = 10, my script runs for about 45 seconds, successfully reading the ID3 tags for about 150 songs (out of thousands) and inserting them into the MySQL database, before terminating and outputting the following to the screen:
Fatal error: Maximum execution time of 10 seconds exceeded in /websites/.../public_html/GetID3()/getid3/module.audio.mp3.php on line 1894
From the documentation: "The maximum execution time is not affected by system calls, stream operations etc." (http://www.php.net/manual/en/info.configuration.php#ini.max-execution-time)
What can I infer from the difference between max_execution_time being set to 10 seconds and the script running for a total of 45 seconds before terminating? Does this mean that of the 45 seconds, 35 were spent on non-PHP activities like reading the ID3 tags and inserting data into MySQL, while 10 were spent on PHP activities like iterating the loop?
Is there a way I can calculate the execution time and print it to the screen?
EDIT
Using the timer Dagon suggested, I called the getTime() function at the end of each loop iteration (there were 100+ iterations). Here is the output in my browser:
0.1163 seconds
0.8142 seconds
1.1379 seconds
1.5555 seconds
...
76.7847 seconds
77.2008 seconds
77.6071 seconds
Fatal error: Maximum execution time of 10 seconds exceeded in /websites/.../public_html/GetID3()/getid3/module.audio.mp3.php on line 505
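The gap between the 10-second limit and roughly 77 seconds of wall time is consistent with the documentation quoted above: on non-Windows systems, the limit counts only the time PHP itself spends executing, not time spent in system calls, stream operations, or database queries. A minimal sketch to observe this behavior, assuming a Linux host:

<?php
// On non-Windows systems, max_execution_time counts time spent in PHP code
// itself, so five seconds inside sleep() -- a system call -- does not trip
// a two-second limit.
set_time_limit(2);

$wall = microtime(true);
sleep(5); // not counted against the limit on non-Windows systems
echo 'Finished after ', round(microtime(true) - $wall, 2), " wall-clock seconds\n";
?>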
<!-- Paste this at the top of the page -->
<?php
$mic_time = microtime();
$mic_time = explode(" ",$mic_time);
$mic_time = $mic_time[1] + $mic_time[0];
$start_time = $mic_time;
?>
<!-- Write your script (executable code) here -->
<!-- Paste this code at the bottom of the page -->
<?php
$mic_time = microtime();
$mic_time = explode(" ",$mic_time);
$mic_time = $mic_time[1] + $mic_time[0];
$endtime = $mic_time;
$total_execution_time = ($endtime - $start_time);
echo "Total Executaion Time ".$total_execution_time." seconds";
?>
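As a side note on the snippet above: since PHP 5.0, microtime(true) returns the timestamp directly as a float, so the explode() juggling can be skipped entirely:

<?php
$start_time = microtime(true);

// ... your script here ...

echo 'Total execution time: ' . round(microtime(true) - $start_time, 4) . ' seconds';
?>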
I don't believe the script is actually running for more than 10 seconds; you need to put a proper timer in it.
<!-- put this at the top of the page -->
<?php
function getTime() {
    $timer = explode(' ', microtime());
    $timer = $timer[1] + $timer[0];
    return $timer;
}
$start = getTime();
?>
<!-- put other code and html in here -->
<!-- put this code at the bottom of the page -->
<?php
$end = getTime();
$time_took = round($end - $start, 4) . ' seconds';
echo $time_took;
?>
This type of script should really be executed in the CLI environment, not in a PHP process executed by your web server. As the manual docs on how the PHP command-line environment differs from other PHP SAPIs put it:
PHP in a shell environment tends to be used for a much more diverse
range of purposes than typical Web-based scripts, and as these can be
very long-running, the maximum execution time is set to unlimited.
While it doesn't answer your question, it does solve your problem :)
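As a rough sketch of that approach, the importer could refuse to run under a web SAPI and be launched from a shell instead (the file name import_songs.php is hypothetical):

<?php
// Launched as: php import_songs.php
// The CLI SAPI sets max_execution_time to 0 (unlimited) by default.
if (PHP_SAPI !== 'cli') {
    exit("This script must be run from the command line.\n");
}

// ... find the MP3s, read ID3 tags, insert into MySQL ...
?>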
It seems you are not only trying to measure the script's duration but also to limit it, and in your case max_execution_time is not what you want.
Basically, "The maximum execution time is not affected by system calls, stream operations etc" is correct. If you need to limit the real (wall-clock) running time of a script, you need to implement your own time measurement. People usually write a small benchmark class for this, which will also be helpful in optimizing the script, but something as simple as
$timer['start'] = time();
$timer['max_exec_time'] = 10; // seconds
at start and
if (time() > $timer['start'] + $timer['max_exec_time'])
    break; // or exit; etc.
at the end of the loop or anywhere else you want should be enough.
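Assembled into one loop over the $song_path array from the question (process_song() is a hypothetical stand-in for the ID3-reading and MySQL-insert work):

$timer['start'] = time();
$timer['max_exec_time'] = 10; // seconds of wall time allowed

foreach ($song_path as $path) {
    process_song($path); // hypothetical: read ID3 tags, insert into MySQL

    // check the budget only after a complete song, so we never stop mid-insert
    if (time() > $timer['start'] + $timer['max_exec_time']) {
        break;
    }
}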
Related
I have a database of 5,000 users, and there is already a cron job that runs once a week.
Initially, when there were only a few hundred users, things worked fine. Now that the user count has reached 5,000, the cron job runs for some 500-600 users and then breaks down. I researched this and came to the conclusion that since HTTP is a stateless protocol, whenever a new request comes in, the cron job breaks down partway through. Now my question is: how can I run the cron job for all 5,000 users without it breaking down? Please help me.
I would first check your PHP error logs, as you may be hitting time and memory limits. If you are performing database queries, I would also check those logs to see if any limits are being hit.
PHP Memory Limit Increase
Increasing the memory limit will allow your script to run for longer if it is currently running out of memory.
Option One
Update your php.ini file, changing 256 to suit your requirements:
memory_limit = 256M
Option Two
Alternatively, call ini_set('memory_limit', ...) from your script to increase the memory limit, again changing 256 to suit your requirements:
ini_set('memory_limit','256M');
PHP Execution Limit Increase
Add ini_set('max_execution_time', n) or set_time_limit(n) to your PHP file to increase the current execution timeout (changing 300 to suit your requirements):
ini_set('max_execution_time', 300); //300 seconds = 5 minutes
Split Up Database Results (Batches)
If you are performing a query that returns a large number of rows, it could be timing out. You can try implementing the following example logic, which splits a big query into smaller chunks using LIMIT and OFFSET.
// $pdo is assumed to be an existing PDO connection to the database

// Get the total row count
$total_rows = $pdo->query('SELECT COUNT(id) FROM users')->fetchColumn();

// Set a block size
$block_size = 300;

// Walk the table one block at a time
for ($block_offset = 0; $block_offset < $total_rows; $block_offset += $block_size) {
    // Query one block of rows
    $stmt = $pdo->prepare('SELECT * FROM users LIMIT :limit OFFSET :offset');
    $stmt->bindValue(':limit', $block_size, PDO::PARAM_INT);
    $stmt->bindValue(':offset', $block_offset, PDO::PARAM_INT);
    $stmt->execute();
    $data = $stmt->fetchAll();

    // Loop through each row and process it here
    foreach ($data as $row) {
        // ... code here ...
    }

    // You can also echo out something here so the script is returning some
    // data; if nothing is sent back for a while it can cause issues (not
    // generally for a cron, though), e.g.
    echo 'Done block ' . $block_offset;
}
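One design note on the block approach: with very large tables, OFFSET itself becomes slow, because the database still has to scan past all the skipped rows. If the table has an auto-increment primary key, keyset pagination avoids that cost (a sketch under that assumption, with $pdo again an open PDO connection):

// Remember the last id seen instead of counting an ever-growing offset
$last_id = 0;
do {
    $stmt = $pdo->prepare('SELECT * FROM users WHERE id > :last ORDER BY id LIMIT :limit');
    $stmt->bindValue(':last', $last_id, PDO::PARAM_INT);
    $stmt->bindValue(':limit', $block_size, PDO::PARAM_INT);
    $stmt->execute();
    $rows = $stmt->fetchAll();

    foreach ($rows as $row) {
        // ... process the row here ...
        $last_id = $row['id'];
    }
} while (count($rows) === $block_size);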
In short: how can I break out of the loop after the code has been running for more than 53 seconds? Something like this:
while (5 < 200)
{
    // do some work here, which will take around 1-6 seconds
    if (this loop has been running for more than 53 seconds)
        break;
}
If you want to know why I want to do this:
OK, this is what I am doing: I am copying data from 7 JSON pages on the web and inserting it into my MySQL database.
The code is something like this:
$start_time = microtime(TRUE); // helps to count PHP script execution time

connect to database
$number = get inserted number from database

while ($number > ($number + 20))
{
    open 7 JSON file links like this - "example.com/$number?api=xxxyyzz"
    use prepared PDO statements to insert data into MySQL
}
// Code to count PHP script execution time
$end_time = microtime(TRUE);
$time_taken = $end_time - $start_time;
$time_taken = round($time_taken,5);
echo '<p>Page generated in '.$time_taken.' seconds.</p>';
So in my case, it takes around 5.2 seconds to complete one whole loop of adding all the data. But some JSON files are empty, and a loop takes only 1.4 seconds when they are.
Like that, I want to complete millions of loops (adding data from millions of JSON files). So if my code runs 24/7, it will take me about a month to complete the task.
But after the code runs for 90 seconds, I get a timeout error.
I am using a cron job to do the task, and it looks like the server gives the same error to the cron job.
So I want the cron job to run every minute, so that I do not get the timeout error.
But I am afraid of this: what if the script has inserted data into half of the rows when the minute runs out, and it does not insert data into the other half? Then, when the new minute starts, the code starts from the next $number.
So I want to break; out of the loop after 53 seconds (if the code starts another iteration at 52 seconds and breaks at the end of it, that will be around 58-59 seconds).
I mean, I will put the break; just before the end of the loop (before the closing }), so I never exit the loop while data has been inserted into only half of the rows.
I guess that your PHP max_execution_time is set to 90 seconds. You can change max_execution_time with set_time_limit(), but I don't think that is a good approach for this.
Give pcntl or pthreads a try; it could save you a lot of time.
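For the 53-second budget described in the question, checking the elapsed wall time once per completed iteration is enough (a minimal sketch; the body comment stands in for the real JSON-fetch-and-insert work):

$budget  = 53; // seconds of wall time to allow
$started = microtime(true);

while (true) {
    // ... fetch one JSON page, insert its rows with prepared PDO statements ...

    if (microtime(true) - $started > $budget) {
        break; // the iteration in progress has fully finished before we stop
    }
}

Wrapping each page's inserts in a single transaction ($pdo->beginTransaction() before, $pdo->commit() after) would additionally guarantee that a page is stored either completely or not at all, which addresses the half-inserted-rows worry directly.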
I have a problem with PHP execution time. First, let me explain what I did in the PHP file: I need to generate a CSV file, so I wrote a query to fetch data from the database. My query returns 1 million records, and I use PHP's fwrite() to write the data into the CSV file. But my script takes 76.481406211853 seconds to generate one CSV file containing 1 million records. How can I reduce this execution time? Sometimes I also get a fatal error:
Fatal error: Allowed memory size of 1048576 bytes exhausted (tried to allocate 793601 bytes)
So, how do I manage memory?
PHP code here:
<?php
// start of script
$time_start = microtime(true);
echo_memory_usage();

// ... my logic here ...

// end of script
echo 'Total execution time in seconds: ' . (microtime(true) - $time_start);
echo_memory_usage();

function echo_memory_usage() {
    $mem_usage = memory_get_usage(true);
    if ($mem_usage < 1024)
        echo $mem_usage . " bytes";
    elseif ($mem_usage < 1048576)
        echo round($mem_usage / 1024, 2) . " kilobytes";
    else
        echo round($mem_usage / 1048576, 2) . " megabytes";
    echo "<br/>";
}
?>
My result is:
512 kilobytes       // script start
1006.25 megabytes   // script end
Total execution time in seconds: 76.481406211853
The CSV file with 1 million records is written successfully, but the execution time is too long, and it sometimes throws the fatal error above. Also, how should I use ini_set()? Currently I set ini_set('memory_limit', '1023M');
The reason I set 1023 is that the memory usage after the script runs is 1006.25 MB. So do I have to set 1023M? Is that the correct way?
Execution time depends on external connections, database connections, and the loops in the script. Try one step at a time:
1. Let the script just connect to the database and do nothing else.
2. Let the script read the data.
3. Let the script prepare the data for saving.
Write down the execution time for each case to find where the bottleneck is.
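A sketch of that step-by-step measurement ($dsn, $user, and $pass are placeholders for the real connection settings, and users stands in for the real table):

<?php
$t0 = microtime(true);
$pdo = new PDO($dsn, $user, $pass);                                   // 1. connect only
$t1 = microtime(true);

$rows = $pdo->query('SELECT * FROM users')->fetchAll(PDO::FETCH_NUM); // 2. read the data
$t2 = microtime(true);

$lines = array();                                                     // 3. prepare for saving
foreach ($rows as $row) {
    $lines[] = implode(',', $row);
}
$t3 = microtime(true);

printf("connect: %.3fs  read: %.3fs  prepare: %.3fs\n",
       $t1 - $t0, $t2 - $t1, $t3 - $t2);
?>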
Option 1:
You can change memory_limit in your php.ini file.
Option 2:
Maybe you are using heavy objects or slow connections.
If you want to improve your code, you can benchmark two candidate versions against each other like this:
// Code 1
$start = microtime(true);
/* Here the code 1 */
$timeTotal1 = microtime(true) - $start;
echo 'Code 1: '. $timeTotal1 .'<br>';
// Code 2
$start = microtime(true);
/* Here the code 2 */
$timeTotal2 = microtime(true) - $start;
echo 'Code 2: '. $timeTotal2 .'<br>';
// Winner
echo '<b>Winner:</b> '. ( $timeTotal1 > $timeTotal2 ? 'Code 2' : 'Code 1');
Normally you would do this via your web server's php.ini, but you can also do it like this:
ini_set('memory_limit','16M');
Values are specified like 16M, 32M, 64M, always ending with "M" for megabytes.
Or just remove the limit:
ini_set("memory_limit",-1);
I'm getting some weird data in my database (well, it does not add up to what I'm trying to do).
Here's my for loop, which counts the weeks from 1 to 52 and builds a URL out of each week number to hand to a function for processing:
for ($week = 1; $week < 52; $week++) {
    kalenderFetch("heren2kalender", "http://kovv.mavari.be/xlsKalenderScheidsrechter.aspx?&reeks=H3A&week=" . $week);
}
The function it posts to then processes the URL, extracts the data from the table, and inserts it into a database.
As far as I can tell, it does not insert all the data from all weeks; it just randomly stops. Sometimes I get more data, sometimes less.
I was also getting an awful lot of errors (like time and memory exceptions).
First of all, use:
for ($week = 1; $week <= 52 ; $week++)
Further, if you get time and memory exceptions, try:
set_time_limit(0);
ini_set('memory_limit', '128M');
Whether this works or not depends on your PHP settings; you may also have to increase the memory limit in your php.ini file.
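To see which values are actually in effect before changing anything, ini_get() can be used:

<?php
// Print the currently effective limits
echo 'max_execution_time: ' . ini_get('max_execution_time') . "\n"; // 0 means unlimited
echo 'memory_limit: ' . ini_get('memory_limit') . "\n";
?>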
Image : http://i40.tinypic.com/2hodx55.png
I have built a network interface monitor using PHP and SNMP, but when I execute it on localhost, my graph keeps dropping to the origin (0) again and again (please see the image), and the speed on the Y axis is also wrong; at times it runs into the millions.
Please, can anyone tell me what the problem is in the code below?
<?php
$int = "wlan0";
session_start();

// Read the interface in/out octet counters (IF-MIB ifInOctets/ifOutOctets
// for interface index 3), twice, five seconds apart
$rx0 = snmpget('localhost', 'public', '.1.3.6.1.2.1.2.2.1.10.3');
$tx0 = snmpget('localhost', 'public', '.1.3.6.1.2.1.2.2.1.16.3');
sleep(5);
$rx1 = snmpget('localhost', 'public', '.1.3.6.1.2.1.2.2.1.10.3');
$tx1 = snmpget('localhost', 'public', '.1.3.6.1.2.1.2.2.1.16.3');

// Strip the "Counter32: " prefix from the returned strings
$rx0 = substr($rx0, 11);
$tx0 = substr($tx0, 11);
$rx1 = substr($rx1, 11);
$tx1 = substr($tx1, 11);

// Delta between the two samples
$tBps = $tx1 - $tx0;
$rBps = $rx1 - $rx0;
$round_rx = $rBps;
$round_tx = $tBps;

// JavaScript-style timestamp in milliseconds
$time = date("U") . "000";

$_SESSION['rx'][] = "[$time, $round_rx]";
$_SESSION['tx'][] = "[$time, $round_tx]";

$data['label'] = $int;
$data['data'] = $_SESSION['rx'];

// Keep only the most recent 60 samples
if (count($_SESSION['rx']) > 60) {
    $x = min(array_keys($_SESSION['rx']));
    unset($_SESSION['rx'][$x]);
}

echo '{"label":"' . $int . '","data":[' . implode(",", $_SESSION['rx']) . ']}';
?>
What you are seeing here is a classic case of polling a counter faster than its refresh interval. It is often the case that counters (in this case, interface counters) are updated every few seconds (10-15 seconds is a common value).
If the counter updates every 15 seconds, and you ask for data every 5 seconds, then you will receive the same value once or twice in a row (depending on latency, processing time, etc.). If you receive the same value twice, then you will see a zero value for the delta (which is what your image shows).
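In the script itself, such repeated samples can at least be detected and skipped rather than logged as zero deltas (a sketch against the variables used in the question's code):

// If neither counter advanced between the two snmpget calls, the agent's
// cached value was read twice; skip the sample instead of recording 0
if ($rx1 === $rx0 && $tx1 === $tx0) {
    exit; // counters not refreshed yet; try again on the next poll
}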
There are two ways to get around this:
Ask for data less frequently than the counters are updated (30-second polling usually works fine). Obviously, if you can find out the exact refresh interval, then you can use that.
Modify the configuration of your equipment to refresh its counters faster. Sometimes this is possible, sometimes it is not; it just depends on the manufacturer, the software, and what has been implemented.
For Net-SNMP "snmpd" daemons, you can walk NET-SNMP-AGENT-MIB::nsCacheTable (1.3.6.1.4.1.8072.1.5.3) for more information about its internal caching of counters.
For example:
snmpwalk -v2c -cpublic localhost 1.3.6.1.4.1.8072.1.5.3 | grep .1.3.6.1.2.1.2.2
NET-SNMP-AGENT-MIB::nsCacheTimeout.1.3.6.1.2.1.2.2 = INTEGER: 3
NET-SNMP-AGENT-MIB::nsCacheStatus.1.3.6.1.2.1.2.2 = INTEGER: cached(4)
Here, you can see that my particular box is caching IF-MIB::ifTable (.1.3.6.1.2.1.2.2), which is the table that you're using, every three seconds. In my case, I would not ask for data any more often than every three seconds. NET-SNMP-AGENT-MIB::nsCacheTimeout (.1.3.6.1.4.1.8072.1.5.3.1.2) is marked as read-write, so you might be able to issue a "set" command to change the caching duration.
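If the agent permits SNMP SETs, the timeout could even be changed from PHP (a hedged sketch: 'private' is an assumed write community, and the instance OID is nsCacheTimeout indexed by the ifTable OID):

<?php
// Attempt to raise the ifTable cache timeout to 30 seconds
// OID = nsCacheTimeout (.1.3.6.1.4.1.8072.1.5.3.1.2) + index (.1.3.6.1.2.1.2.2)
$oid = '.1.3.6.1.4.1.8072.1.5.3.1.2.1.3.6.1.2.1.2.2';
if (!snmpset('localhost', 'private', $oid, 'i', 30)) {
    echo "SET failed; the agent may not allow writes to this object.\n";
}
?>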