CodeIgniter MySQL loop doesn't complete - php

I'm having a problem and I don't know how to solve it.
I have 500 records in another database, and the query to fetch them returns all of them, but when I loop over them to insert/update them in my database, the foreach loop doesn't reach the end and it doesn't show any error.
Here is the loop:
foreach ($this->getMetaEmpregado()->getAll()->result_array() as $modelData) {
    $oEmpregado = $this->getEmpregadoObject($modelData);
    $arrayEmpWhere = array(
        'idempregado' => $oEmpregado->getIdEmpregado(),
        'idsociedade' => $oEmpregado->getIdSociedade(),
        'nif'         => $oEmpregado->getNif()
    );
    if ($this->getWayUtilizador()->get($arrayEmpWhere)->num_rows() == 0) {
        $countInsert++;
        $this->insertNewEmp($oEmpregado);
    } else {
        $countUpdate++;
        $this->UpdateEmp($oEmpregado);
    }
}
echo "Total Updates: $countUpdate Total Inserts: $countInsert<br>";
It doesn't show the echo at the end because it stops around record 260; sometimes it reaches 300, other times not even 100.
Regards, Elkas

It seems that the script is taking too long. You need to increase max_execution_time in your php.ini.
You can use the ini_set() function to change it at runtime, like this:
ini_set('max_execution_time', 600); // 600 seconds = 10 minutes
Hope this helps.
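For example, the override could go right before the import loop from the question; this placement is just an illustration:
ini_set('max_execution_time', 600); // 600 seconds = 10 minutes
set_time_limit(600);                // same effect; also resets the time already spent

foreach ($this->getMetaEmpregado()->getAll()->result_array() as $modelData) {
    // ... insert/update logic from the question, unchanged ...
}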

Related

Keep data from reaching below 0

I'm working on a browser game where I need to handle energy. The user spends and gains energy, but I've come across an issue that I've tried to fix: when a user has, for example, 20 energy and does something that costs 30 energy, it goes through. Of course I understand why, because I'm only checking that the value isn't already at 0, like this:
if (0 < $resultenergy[0]['energy'])
This means that unless the user's energy is at 0 he can spend whatever he wants. What would be a better way to check the data so it alerts when "trying" to go under 0? I appreciate all help, and if you have an idea of a different approach I'm all ears.
Do the calculation once.
$remaining_energy = $current_energy - $energy_cost;
Then check the result, and act accordingly.
if ($remaining_energy >= 0) {
    $current_energy = $remaining_energy;
    // do whatever required the energy
    // send the client a success indicator
} else {
    // don't change current energy
    // send the client an error indicator
}
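Put together, a minimal sketch of that pattern as a helper (the function name and return shape are made up for illustration):
// Hypothetical helper illustrating the "subtract once, then check" pattern.
function spend_energy($current_energy, $energy_cost)
{
    $remaining_energy = $current_energy - $energy_cost;
    if ($remaining_energy >= 0) {
        // Enough energy: commit the new value and report success.
        return array('ok' => true, 'energy' => $remaining_energy);
    }
    // Not enough energy: keep the old value and report the failure.
    return array('ok' => false, 'energy' => $current_energy);
}

// Example: 20 energy, 30 cost -> rejected, energy stays at 20.
$result = spend_energy(20, 30);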
You could subtract and then evaluate, e.g.:
if ($resultenergy[0]['energy'] - $energyCost >= 0) {
    $resultenergy[0]['energy'] = $resultenergy[0]['energy'] - $energyCost;
    return true;
} else {
    return false;
}
$energyCost is the cost of whatever action they're trying to do.

Http Request alternatives

I have a requirement to call nearly 50+ URLs to fetch data. I tried many approaches and finally stuck with HttpRequestPool, because it was faster than anything else. I tried the methods below:
1. file_get_contents($url) - took about 2 minutes to complete all 50 requests
2. curl - took about 45 to 50 seconds to complete all 50 requests
3. HttpRequest - took about 20 to 30 seconds to complete all 50 requests
4. HttpRequestPool - took about 10 to 15 seconds to complete all 50 requests
But 10 to 15 seconds is still slow compared to my competitors. I want to complete all 50 requests in at most 3 to 6 seconds. How can I achieve that target time frame? Are there any alternatives in PHP apart from the above? If so, can anybody please tell me about them?
Please help me to resolve this issue.
Thanks,
Sudhakar
Below is my code.
for ($i = 0; $i < $url_cnt; $i++) {
    $req[$i] = new HttpRequest($URLs[$i], HttpRequest::METH_GET, array('redirect' => 4, 'timeout' => 5, 'connecttimeout' => 7));
}

$pool = new HttpRequestPool();
$pool->enablePipelining(true);

// Attaching the requests
try {
    foreach ($req as $r) {
        $pool->attach($r);
    }
} catch (HttpRequestPoolException $ex) { }

try {
    $pool->send();
} catch (HttpRequestPoolException $x) { }

foreach ($pool as $request) {
    $url_contents[] = $request->getResponseBody();
}

foreach ($req as $r) {
    $pool->detach($r);
}

$pool->__destruct();
unset($pool);
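One alternative worth benchmarking is PHP's built-in curl_multi API, which also sends the requests in parallel. A minimal sketch, assuming the same $URLs array and roughly the same timeout settings as the code above, filling $url_contents the same way:
$mh = curl_multi_init();
$handles = array();
foreach ($URLs as $i => $url) {
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true); // return the body instead of printing it
    curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
    curl_setopt($ch, CURLOPT_MAXREDIRS, 4);
    curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 7);
    curl_setopt($ch, CURLOPT_TIMEOUT, 5);
    curl_multi_add_handle($mh, $ch);
    $handles[$i] = $ch;
}

// Run all transfers in parallel.
$running = null;
do {
    curl_multi_exec($mh, $running);
    if (curl_multi_select($mh) === -1) {
        usleep(10000); // avoid busy-waiting if select() is unavailable
    }
} while ($running > 0);

// Collect the response bodies in the same order as $URLs.
$url_contents = array();
foreach ($handles as $i => $ch) {
    $url_contents[$i] = curl_multi_getcontent($ch);
    curl_multi_remove_handle($mh, $ch);
    curl_close($ch);
}
curl_multi_close($mh);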

PHP long poll fails

I have this loop to long poll:
$time = time(); // send a blank response after 15 seconds
while ((time() - $time) < 15) {
    $last_modif = filemtime("./logs.txt");
    if ($_SESSION['lasttime'] != $last_modif) {
        $_SESSION['lasttime'] = $last_modif;
        $logs = file_get_contents("./logs.txt");
        print nl2br($logs);
        ob_flush();
        flush();
        die();
    }
    usleep(10000);
}
The problem is: the "if" condition is never entered in the middle of the while loop, even if logs.txt is modified. I have to wait 15 seconds until the next call to this file to obtain the updated content (so it becomes regular, "setTimeout-style" AJAX polling, not a long poll). Any idea why?
This is because of the filemtime() function: its results are cached, so each time your loop runs, the timestamp stays the same for the full 15 seconds.
I have not tried it myself, but according to w3schools:
The results of this function are cached. Use clearstatcache() to clear the cache.
Hope that helps!
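Applied to the loop from the question, a minimal sketch of that fix is to clear the stat cache on every iteration before calling filemtime():
$time = time(); // send a blank response after 15 seconds
while ((time() - $time) < 15) {
    clearstatcache(); // force filemtime() to re-read the modification time
    $last_modif = filemtime("./logs.txt");
    if ($_SESSION['lasttime'] != $last_modif) {
        $_SESSION['lasttime'] = $last_modif;
        print nl2br(file_get_contents("./logs.txt"));
        ob_flush();
        flush();
        die();
    }
    usleep(10000);
}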

PHP script supposed to take 6 hours but stops after 30 minutes

I've made a basic web crawler to scrape info from a website, and I estimated that it should take around 6 hours (multiplying the number of pages by how long it takes to grab the info), but after around 30-40 minutes of looping through my function it stops working and I only have a fraction of the info I wanted. When it is working, the page looks like it's loading and it outputs where it's up to on the screen, but when it stops, the page stops loading and the output stops showing.
Is there any way that I can keep the page loading so I don't have to start it again every 30 minutes?
EDIT: Here's my code:
function scrape_ingredients($recipe_url, $recipe_title, $recipe_number, $this_count) {
    $page = file_get_contents($recipe_url);
    $edited = str_replace("<h2 class=\"ingredients\">", "<h2 class=\"ingredients\"><h2>", $page);
    $split = explode("<h2 class=\"ingredients\">", $edited);
    preg_match("/<div[^>]*class=\"module-content\">(.*?)<\\/div>/si", $split[1], $ingredients);
    $ingred = str_replace("<ul>", "", $ingredients[1]);
    $ingred = str_replace("</ul>", "", $ingred);
    $ingred = str_replace("<li>", "", $ingred);
    $ingred = str_replace("</li>", ", ", $ingred);
    echo $ingred;
    // Escape the scraped values so quotes don't break (or inject into) the INSERT
    $ingred = mysql_real_escape_string($ingred);
    $recipe_title = mysql_real_escape_string($recipe_title);
    $recipe_url = mysql_real_escape_string($recipe_url);
    mysql_query("INSERT INTO food_tags (title, link, ingredients) VALUES ('$recipe_title', '$recipe_url', '$ingred')");
    echo "<br><br>Recipes indexed: $recipe_number<hr><br><br>";
}

$get_urls = mysql_query("SELECT * FROM food_recipes WHERE id>3091");
while ($row = mysql_fetch_array($get_urls)) {
    $count++;
    $thiscount++;
    scrape_ingredients($row['link'], $row['title'], $count, $thiscount);
    sleep(1);
}
What is max_execution_time set to in your php.ini? It must be 0 for the script to be able to run indefinitely.
Try adding
set_time_limit(0);
at the top of your script.
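In other words, something like this at the very top, ahead of the loop from the question (the loop itself stays as it is):
set_time_limit(0); // no execution time limit while the crawler runs

$get_urls = mysql_query("SELECT * FROM food_recipes WHERE id>3091");
while ($row = mysql_fetch_array($get_urls)) {
    $count++;
    $thiscount++;
    scrape_ingredients($row['link'], $row['title'], $count, $thiscount);
    sleep(1);
}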

Prevent timeout during large request in PHP

I'm making a large request to the brightcove servers to make a batch change of metadata in my videos. It seems like it only made it through 1000 iterations and then stopped - can anyone help in adjusting this code to prevent a timeout from happening? It needs to make about 7000/8000 iterations.
<?php
include 'echove.php';

$e = new Echove(
    'xxxxx',
    'xxxxx'
);

// Read Video IDs
# Define our parameters
$params = array(
    'fields' => 'id,referenceId'
);

# Make our API call
$videos = $e->findAll('video', $params);
//print_r($videos);

foreach ($videos as $video) {
    //print_r($video);
    $ref_id = $video->referenceId;
    $vid_id = $video->id;
    switch ($ref_id) {
        case "":
            $metaData = array(
                'id' => $vid_id,
                'referenceId' => $vid_id
            );
            # Update a video with the new meta data
            $e->update('video', $metaData);
            echo "$vid_id updated successfully!<br />";
            break;
        default:
            echo "$ref_id was not updated. <br />";
            break;
    }
}
?>
Thanks!
Try the set_time_limit() function. Calling set_time_limit(0) will remove any time limits for execution of the script.
Also use ignore_user_abort() so the script keeps running even if you close the browser (use with caution).
Try sending a 'Status: 102 Processing' every now and then to prevent the browser from timing out (your best bet is about 15 to 30 seconds in between). After the request has been processed you may send the final response.
The browser shouldn't time out any more this way.
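A minimal sketch of the first two suggestions, placed just before the foreach loop in the script above (the loop body itself stays unchanged):
set_time_limit(0);       // remove PHP's execution time limit for this script
ignore_user_abort(true); // keep running even if the browser disconnects

foreach ($videos as $video) {
    // ... switch/update logic from the question ...
}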
