While Loop, Maximum execution time of 30 seconds exceeded - php

I need to call the script function open() when the counter exceeds 5, but when I load my webpage, it shows me a fatal error: Maximum execution time of 30 seconds exceeded. Is the problem my while loop?
$result = mysqli_query($dbconnect, "SELECT id, temp, hum, lum FROM weather ORDER BY id DESC LIMIT 5");
while ($row = mysqli_fetch_assoc($result)) {
    $weather["temp"] = $row["temp"];
    $weather["lum"] = $row["lum"];
    $tempcounter = 0;
    while ($weather["temp"] > 27) {
        $tempcounter++;
    }
    if ($tempcounter > 5) {
        echo "<script>open()</script>";
    }
}

Your while loop is infinite. The loop starts if $weather["temp"] is greater than 27, and inside it you increment $tempcounter but never change $weather["temp"], so the condition stays true and you never get out of the loop.
You could copy $weather["temp"] into a variable and then increment that variable inside the loop. Most importantly, the condition has to match the direction: if you increment, you need to check <, and if you decrement, you check >.
If you increment:
$weatherTemp = $weather["temp"];
while ($weatherTemp < 27) {
    $weatherTemp++;
}
If you decrement:
$weatherTemp = $weather["temp"];
while ($weatherTemp > 27) {
    $weatherTemp--;
}
And then you can compute $tempcounter like this:
$tempcounter = $weatherTemp - $weather["temp"]; // temperature change needed to reach 27
if ($tempcounter > 5) {
    echo "<script>open()</script>";
}
This code is untested, so it might not work as-is.
If what you want is the difference between 27 and your temperature, you are not using the right logic here. You could simply take 27 minus the temperature from the query; if the result is negative, just change it to positive and you get the difference.
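A minimal sketch of that approach, assuming the temperature comes straight from the query row as in the original code:
// abs() flips a negative difference to positive
$difference = abs(27 - $weather["temp"]); // degrees away from 27
if ($difference > 5) {
    echo "<script>open()</script>";
}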

I think it is a timeout problem.
You can fix this problem in one of two ways:
Set max_execution_time to a larger value, e.g. 300 or more.
At the beginning of your PHP script, call the set_time_limit() function:
set_time_limit(0);
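For reference, a minimal sketch of both options (ini_set() changes the directive at runtime; the same value can also be set in php.ini):
// Option 1: raise the limit for this script at runtime
// (equivalent to max_execution_time = 300 in php.ini)
ini_set('max_execution_time', 300);
// Option 2: remove the limit entirely; 0 means no time limit
set_time_limit(0);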

In this loop:
$tempcounter = 0;
while ($weather["temp"] > 27) {
    $tempcounter++;
}
You're not updating $weather["temp"], so it will loop forever if that condition is satisfied.
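A guess at the intended logic, since the question doesn't spell it out: count how many of the fetched rows are above 27 degrees, then decide once after the fetch loop.
// Assumption: count hot readings across the fetched rows
$tempcounter = 0;
while ($row = mysqli_fetch_assoc($result)) {
    if ($row["temp"] > 27) {
        $tempcounter++;
    }
}
// Note: the query fetches only 5 rows, so a strict "> 5" can never
// be reached; ">= 5" (or a larger LIMIT) may be what is wanted here.
if ($tempcounter >= 5) {
    echo "<script>open()</script>";
}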


Checking previous elements of array when looping through

I have a (relatively) simple platform which handles running competitions. When calculating the results, the first placed finisher gets 20 points, second placed 19 points, and so on, down to a minimum of 3 points just for taking part.
The existing loop looks like this ($results is an array of objects, ordered by finish time, ascending):
$pos = 1;
$posscore = 20;
foreach ($results as $result) {
    $result->position = $pos;
    $result->race_points = $posscore;
    $result->save();
    $pos += 1;
    if ($posscore > 3) {
        $posscore -= 1;
    }
    // Other, unrelated, code removed
}
The problem arises when it handles two (or more) finishers with the same finish time. The first one in the array will get a higher finish position (i.e. a lower number) and higher points than the second one, when the ideal outcome would be for both to get the same points and finishing position, with the following finisher's position then skipping one place.
So at the moment, if the 4th and 5th finishers both finish at the same time, it will give:
1st / 20
2nd / 19
3rd / 18
4th / 17
5th / 16
6th / 15
when the actual outcome should be
1st / 20
2nd / 19
3rd / 18
4th / 17
4th / 17
6th / 15
How would I best go about maintaining a note of, or otherwise accessing, the previous finish times whilst iterating through the loop?
Try this code. Of course, you will need to change arrived_time to your own field name.
<?php
$maxScore = 20;
foreach ($results as $key => $result) {
    $result->position = $key + 1;
    $result->race_points = $maxScore - $result->position + 1;
    // Look at the previous finisher (note: index into $results, the array,
    // not $result, the current object)
    if (isset($results[$key - 1])) {
        $previous = $results[$key - 1];
        if ($previous->arrived_time == $result->arrived_time) {
            // Tied finish: share the previous finisher's position and points
            $result->position = $previous->position;
            $result->race_points = $previous->race_points;
        }
    }
    // Minimum of 3 points just for taking part
    if ($result->race_points <= 3) {
        $result->race_points = 3;
    }
    $result->save();
}
P.S.: This is not optimized code, but I guess if you only have around 20 items there is no need to over-engineer here.

Simple For-Loop results in memory exhausted error

After working more than 4 hours on localizing the problem, I finally found the part of my code that leads to the following error:
PHP Fatal error: Allowed memory size of 268435456 bytes exhausted
(tried to allocate 8388608 bytes) in
I'm fetching data from my database. The query returns an array of data with 33 entries (not that much...)
I iterate over that array with a for loop to get a time difference and flag the entry if the difference is too big (a simple if-else). It's quite difficult to explain, but I guess you'll get what I mean by looking at my code; it's pretty simple:
for ($i = 0; $i <= count($customerList); $i++) {
    $date1 = date_create($customerList[$i]["lastUpdate"]);
    $interval = date_diff($date1, $date2);
    $min  = $interval->format('%i');
    $sec  = $interval->format('%s');
    $hour = $interval->format('%h');
    $mon  = $interval->format('%m');
    $day  = $interval->format('%d');
    $year = $interval->format('%y');
    $date3 = date("H");
    if (($year > 0) || ($mon > 0) || ($day > 0)) {
        $customerList[$i]["flag"] = 1;
    }
    else {
        if ($date3 >= 6 && $date3 <= 21) {
            if (($hour > 0) || ($min > 29)) {
                $customerList[$i]["flag"] = 1;
            }
        }
    }
}
The variable $date2 is defined before the for-loop with the following line of code:
$date2 = date_create($lastUpdateTime["lastUpdate"]);
Sadly I really don't see how this code can result in that error, since in my opinion this shouldn't really use that much memory...
I'm 100% sure that the error comes from this for loop, since when I remove it the error is gone. Also, when I place this for loop inside other views, it results in the same error.
I solved the problem by increasing PHP's maximum memory limit, but I still wonder how this for loop can cause that problem.
I would really appreciate any explanations, since I can't find anything about it on the internet...
Oh, I'll just add the query here so you can see that I don't fetch a lot of data (the site handles much bigger queries without any problems...):
SELECT c.id, c.type, c.name, c.environment, cd.customer, MAX(cd.createdAt) AS lastUpdate FROM customerdata AS cd JOIN customers c ON cd.customer = c.id GROUP BY customer ORDER BY c.name ASC
Thank you!
Your for loop condition is $i <= count($customerList) instead of $i < count($customerList). This means your access of $customerList[$i] goes one past the end of the array, creating a new element along the way. Now count($customerList) is one larger, so the loop takes another iteration, ad infinitum, thereby using more and more RAM for the growing array.
You should also get a warning that the element does not (yet) exist when you first access it to read "lastUpdate". Later, when you set the "flag" element, the array element is created.
To make the loop conditions easier to read and prevent typos like this, you could use a foreach loop:
foreach ($customerList as &$customer) { // reference, so "flag" can still be set
    $date1 = date_create($customer["lastUpdate"]);
    $interval = date_diff($date1, $date2);
    ...
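Or, if you prefer to keep the original indexed loop, the minimal fix is the comparison operator:
// The one-character fix: < instead of <=, so $i stays in bounds
for ($i = 0; $i < count($customerList); $i++) {
    $date1 = date_create($customerList[$i]["lastUpdate"]);
    // ... rest of the loop body unchanged
}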

execute a script for minimum 1 second

Here is my pseudocode:
For loop, repeat 30 times
    Call and execute an API script, which takes less than 1 second
    I want the system to sleep for some moment (less than 1 second)
Loop end
I want the above script to take a minimum of 30 seconds to finish (2-3 seconds longer is not a problem, but it must not be less).
Can you write some sample code for me (only the time-related part)?
Here's a little program that implements the essence of what I believe you are looking for:
$minTime = 30;            // minimum total run time, in seconds
$start = microtime(true); // high-resolution start timestamp
print("This must take $minTime seconds.\n");
// Replace this loop with what you actually need done
for ($i = 0; $i < 10; $i++) {
    print("Doing stuff #{$i}... ");
    sleep(1);
}
// Check the elapsed time to calculate the remaining time
$remain = $minTime - (microtime(true) - $start);
var_dump(['remain' => $remain]);
if ($remain > 0) {
    print("\n$remain seconds remaining... ");
    $whole = (int) floor($remain);
    sleep($whole); // whole seconds first; sleep() alone would truncate the fraction
    usleep((int) ceil(($remain - $whole) * 1000000)); // fractional remainder, in microseconds
}
print("DONE\n");
This code has been updated to use microtime() instead of time(), which only permits intervals of whole seconds. On systems that support gettimeofday(), microtime() lets us work with microseconds instead.
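As an aside, if you also want the sub-second pause inside each loop iteration from the pseudocode, usleep() takes microseconds (callApi() here is a hypothetical stand-in for the API script):
// Hypothetical loop matching the pseudocode: 30 calls with a
// half-second pause between them
for ($i = 0; $i < 30; $i++) {
    callApi();      // stand-in for the actual API call
    usleep(500000); // 500000 microseconds = 0.5 seconds
}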

Limit the number of results in a for loop in reverse order

I created a for loop to populate a table on my site. Originally I limited the number of results by using if ($i++ <= 10) {.
However, I adjusted the for loop so it is reverse order (most recent item first). Here is my code:
for ($i = count($shipmentInfoArray['ShipmentList']) - 1; $i >= 0; $i--) {
    $shipmentUrl = BASE_URL . "shipment.php?ShipmentID=" . $shipmentInfoArray['ShipmentList'][$i]['ShipmentID'];
Obviously, since it's counting down instead of up, my original limit code won't work.
I tried this, but it didn't work:
if ($i-- >= (count($shipmentInfoArray['ShipmentList']) - (count($shipmentInfoArray['ShipmentList']) - 10))) {
My hope was that it would take the total number of entries, then subtract from it the total minus the number I want to display (100 - (100 - 10) = 10 entries).
I also tried adding a 'break' at the end, assuming that this didn't work because the "count" wasn't finished yet. However that didn't work either :/
Any suggestions? Thanks guys!
I feel you're making this over-complicated, and confusing yourself as you try to reason through the resulting complexity. There are many approaches; I don't know that one is any better than another.
You could initialize a second counter (say j) to 0 and increment it on each iteration, as one simple solution.
Something like the following should work:
// Set the limit to the max number of results
$LIMIT = 10;
foreach (array_reverse($shipmentInfoArray['ShipmentList']) as $shipmentIndex => $shipment) {
    if ($shipmentIndex >= $LIMIT) {
        break;
    }
    // Use $shipment
}
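Another option, a sketch assuming the same $shipmentInfoArray structure: slice off the entries you want before looping, so no counter or break is needed at all.
// Take the first $LIMIT entries of the reversed list (newest first)
$recent = array_slice(array_reverse($shipmentInfoArray['ShipmentList']), 0, $LIMIT);
foreach ($recent as $shipment) {
    // Use $shipment
}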

PHP File Writing (fwrite / file_put_contents) speed/optimization

So, I have a database with big data; currently about 2.6 GB of it needs to be used.
All the data needs to be written to a text file for later use in other scripts.
The data is limited per file and split into multiple parts: 100 results per file (around 37 MB each). That comes to about 71 files.
The data is JSON that is serialized and then encrypted with OpenSSL.
The data is correctly written to the files until the max execution time is reached after 240 seconds. That's after about 20 files...
Well, I can just extend that time, but that's not the problem.
The problem is the following:
Writing file 1-6: +/- 5 seconds
Writing file 7-8: +/- 7 seconds
Writing file 9-11: +/- 12 seconds
Writing file 12-14: +/- 17 seconds
Writing file 14-16: +/- 20 seconds
Writing file 16-18: +/- 23 seconds
Writing file 19-20: +/- 27 seconds
Note: times are per file.
In other words, with every file I write, the writing time per file goes up significantly, which of course makes the script slow.
The structure of the script is a bit like this:
$needed_files = count needed files/parts
for ($part = 1; $part <= $needed_files; $part++) { // loop through parts
    $query > mysqli select data
    $data > json_encode > serialize > openssl_encrypt
    file_put_contents($filename.$part, $data, LOCK_EX);
}
WORKING CODE AFTER HELP
$notchDetails = mysqli_query($conn, "SELECT * FROM notches WHERE projectid = ".$projectid."");
$rec_count = 0;
$limit = 100;
$part = 1;
$data1 = []; // collect rows in an array
while ($notch = mysqli_fetch_assoc($notchDetails)) {
    $data1[] = $notch;
    $rec_count++;
    if ($rec_count >= $limit) {
        $data = json_encode($data1);
        $data = openssl_encrypt(bin2hex($data), "aes128", $pass, false, $iv);
        $filename = $mainfolder."/".$projectfolder."/".$subfolder."/".$fname.".part".$part."".$fext;
        file_put_contents($filename, $data, LOCK_EX);
        $part++;
        $rec_count = 0;
        $data1 = []; // reset to an empty array ($data1[] on a string is a fatal error in PHP 7.1+)
    }
}
if (!empty($data1)) { // catch the last partial batch
    $data = json_encode($data1);
    $data = openssl_encrypt(bin2hex($data), "aes128", $pass, false, $iv);
    $filename = $mainfolder."/".$projectfolder."/".$subfolder."/".$fname.".part".$part."".$fext;
    file_put_contents($filename, $data, LOCK_EX);
}
mysqli_free_result($notchDetails);
Personally I would have coded this as a single SELECT with no LIMIT, and then written the output files from inside the single while-fetch loop based on a $rec_per_file = ?; counter. That also avoids re-running the query for every part: with LIMIT/OFFSET, the server has to read and discard all previously fetched rows each time, which is why each successive file takes longer to produce.
Excuse the cryptic code, you didn't give us much of a clue:
<?php
//ini_set('max_execution_time', 600); // only use if you have to

$filename = 'something';
$filename_suffix = 1;
$rec_per_file = 100;

$sql = "SELECT ....";
Run query

$rec_count = 0;
$data = array();
while ( $row = fetch a row ) {
    $data[] = serialize > openssl_encrypt
    $rec_count++;
    if ($rec_count >= $rec_per_file) {
        $json_string = json_encode($data);
        file_put_contents($filename.$filename_suffix, $json_string, LOCK_EX);
        $filename_suffix++; // inc the suffix
        $rec_count = 0;     // reset counter
        $data = array();    // clear data
        // add 30 seconds to the remaining max_execution_time,
        // or at least a number >= the time you expect this
        // while loop to take to get back to this if statement
        set_time_limit(30);
    }
}
// catch the last few rows
$json_string = json_encode($data);
file_put_contents($filename.$filename_suffix, $json_string, LOCK_EX);
Also, I am not sure why you would want both serialize() and json_encode().
I had a thought, based on your comment about execution time: if you place a set_time_limit(seconds) call inside the if inside the while loop, it is cleaner, and you do not have to raise ini_set('max_execution_time', 600) to a very large number, which, if you have a real error in here, might let PHP continue processing for a long time before kicking the script out.
From the manual:
Set the number of seconds a script is allowed to run. If this is reached, the script returns a fatal error. The default limit is 30 seconds or, if it exists, the max_execution_time value defined in the php.ini.
When called, set_time_limit() restarts the timeout counter from zero. In other words, if the timeout is the default 30 seconds, and 25 seconds into script execution a call such as set_time_limit(20) is made, the script will run for a total of 45 seconds before timing out.
