PHP Script Timeout

I have a custom script for a bulletin board system that counts the number of threads a user has made and updates a column accordingly. This works fine; however, with 100,000+ users it times out the first time it is run.
I've tried adding the following before the query, but it still times out (500 error).
set_time_limit(0);
ignore_user_abort(true);
Additional: I'm using this script on my VPS.
Query:
set_time_limit(0);
ignore_user_abort(true);

$db->write_query("ALTER TABLE `".TABLE_PREFIX."users` ADD `numthreads` int(10) unsigned NOT NULL default '0'");

// load users into an array so each user's threads can be counted
$query = $db->simple_select("users", "uid");
while($user = $db->fetch_array($query))
{
    $users[$user['uid']] = $user;
}

foreach($users as $user)
{
    // get total number of threads for this user
    $query = $db->simple_select("threads", "COUNT(tid) AS threads", "uid = '{$user['uid']}'");
    $numthreads = intval($db->fetch_field($query, "threads"));
    $db->update_query("users", array("numthreads" => $numthreads), "uid = '{$user['uid']}'");
}

Use this:
ini_set('max_execution_time', 0);
in place of:
set_time_limit(0);
ignore_user_abort(true);
You can also edit php.ini (note that comments in php.ini start with a semicolon):
max_execution_time = 60    ; Maximum execution time of each script, in seconds
max_input_time = 60        ; Maximum amount of time each script may spend parsing request data
Hope this helps.

First you should separate out your ALTER statement: execute the ALTER first, and then do the rest. ALTER TABLE can be expensive on a big table. You can run it manually, using phpMyAdmin or via the shell (even better, since there is no PHP timeout there), so it cannot time out.
Then remove the ALTER from the script and run the script.
And then use:
$query = $db->simple_select("users", "uid");
while($user = $db->fetch_array($query))
{
    // use a separate result variable so the outer loop's $query is not overwritten
    $threadquery = $db->simple_select("threads", "COUNT(tid) AS threads", "uid = '{$user['uid']}'");
    $numthreads = intval($db->fetch_field($threadquery, "threads"));
    $db->update_query("users", array("numthreads" => $numthreads), "uid = '{$user['uid']}'");
}
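If even that per-user loop is too slow, the whole recount can usually be pushed into a single SQL statement and left to MySQL. A minimal sketch, assuming the same users/threads tables and columns as in the question (run the ALTER separately first, as above):

$db->write_query("
    UPDATE ".TABLE_PREFIX."users u
    SET u.numthreads = (
        SELECT COUNT(t.tid)
        FROM ".TABLE_PREFIX."threads t
        WHERE t.uid = u.uid
    )
");

One statement like this avoids 100,000+ round trips to the database, though it can still take a while and will lock the users table while it runs.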

Try this:
ini_alter("max_execution_time", 600000000);
$tmp = ini_get("max_execution_time");
set_time_limit(600000000);

A common reason people cannot change the max execution time is that their host does not allow it. Check with your host to make sure they do not block this.
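One quick way to verify whether the override is actually being applied at runtime (a value of 0 means no limit):

ini_set('max_execution_time', 0);
// if the host blocks the change, ini_get() will still report the old value
var_dump(ini_get('max_execution_time'));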

Related

PHP While Loop with MySQL Has Inconsistent Execution Time, How Can I Fix This?

My problem is simple. On my website I'm loading several results from MySQL tables inside a while loop in PHP, and for some reason the execution time varies from reasonably short (0.13s) to confusingly long (11s), and I have no idea why. Here is a short version of the code:
<?php
$sql =
    "SELECT * FROM test_users, image_uploads
     WHERE test_users.APPROVAL = 'granted'
     AND test_users.NAME = image_uploads.OWNER
     ".$checkmember."
     ".$checkselected."
     ORDER BY ".$sortingstring." LIMIT 0, 27";
$result = mysqli_query($mysqli, $sql);
$data = "";
$c = 0;
$start = microtime(true);
while ($value = mysqli_fetch_array($result)) {
    $files_key = $value["KEY"];
    $file_hidden = "no";
    // KEY is a reserved word in MySQL, so it has to be quoted with backticks
    $inner_query = "SELECT * FROM my_table WHERE `KEY` = '".$files_key."' AND HIDDEN = '".$file_hidden."'";
    $inner_result = mysqli_query($mysqli, $inner_query);
    while ($row = mysqli_fetch_array($inner_result)) {
        // getting all variables with $row[n]
    }
    $sql = "SELECT * FROM some_other_table WHERE THM=? AND MEMBER=?";
    $fstmt = $mysqli->prepare($sql);
    // bind_param() takes its arguments by reference, so a literal cannot be passed directly
    $member = 'username';
    $fstmt->bind_param("ss", $value['THM'], $member);
    $fstmt->execute();
    $fstmt->store_result();
    if ($fstmt->num_rows > 0) {
        $part0 = 'some elaborate string';
    } else {
        $part0 = 'some different string';
    }
    $fstmt->close();
    // generate a document using the gathered data
    include "../data.php"; // produces $partsMerged
    // save to data string
    $data .= $partsMerged;
    $c++;
}
$time_elapsed_secs = substr(microtime(true) - $start, 0, 5);
// sometimes takes only 0.13 seconds
// and other times up to 11 seconds or more
?>
I was wondering where the problem could be.
Does it have to do with my DB connection, or is my code flawed? I didn't have this problem when I first implemented it, but for a few months now it has been behaving strangely. Sometimes it loads very fast; other times, as I said, it takes 11 seconds or even more.
How can I fix this?
There are a few ways to debug this.
Firstly, consider any dynamic variables that form part of your query (e.g. $checkmember): we have no way of knowing whether these are the same or different each time you execute the query. If they are different, then you are executing a different query each time, so it goes without saying that it may take longer depending on which query is being run.
Regardless of the answer, try running the SQL through the MySQL command line and see how long that query takes.
If it's similar (i.e. not an 11-second range), then the answer is that it has nothing to do with the query itself.
You should also say whether you are running this on a web server, e.g. accessing the PHP script via a browser, or executing the script via the command line.
There isn't enough information to answer your question, but you need to at least establish some of these things first.
The rule of thumb is that if your raw SQL executes on the MySQL command line in a similar amount of time on subsequent attempts, the problem area is elsewhere (e.g. the connection from the browser to the web server). This can be monitored in the Network tab of your browser.
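To narrow it down inside the script itself, you can also time the outer query and the loop separately, for example (a rough sketch reusing the question's own variables):

$t0 = microtime(true);
$result = mysqli_query($mysqli, $sql);      // outer query only
$t1 = microtime(true);
while ($value = mysqli_fetch_array($result)) {
    // ... inner queries and document generation exactly as in the question ...
}
$t2 = microtime(true);
printf("outer query: %.3fs, loop body: %.3fs", $t1 - $t0, $t2 - $t1);

If the loop body accounts for the 11 seconds, the per-row queries (the my_table and some_other_table lookups) are the place to look rather than the outer SELECT.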

Cassandra DataStax | TimeoutException

I have a problem with Cassandra: I am getting the following error.
I have linked a picture of it below.
Code:
public function find($db_table = null, $db_id = null) {
    $filter = "";
    $return = array();
    $cluster = $this->cluster();
    $session = $cluster->connect($this->keyspace);
    if(isset($db_table)) {
        $filter .= " WHERE db_table like '%".$db_table."%' ";
        if($db_id != null) {
            $filter .= " AND db_id = '".$db_id."' ALLOW FILTERING";
        }
    }
    $query = new Cassandra\SimpleStatement("SELECT * FROM ".$this->keyspace.".log $filter;");
    $result = $session->executeAsync($query);
    $rows = $result->get();
Cassandra Error picture
You should not use ALLOW FILTERING unless you know what you are doing.
SELECT * FROM prod.log WHERE db_id = 13913 AND db_table LIKE '%%' LIMIT 5000 is timing out because you seem to have a lot of entries in the DB and ALLOW FILTERING forces a full table scan.
You should adapt the table design to match your queries.
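For example, if the lookups are always by db_table and db_id, a table keyed on those columns lets Cassandra go straight to the right partition instead of scanning. A hedged sketch only (the extra columns and types are guesses, and the driver calls follow the style used elsewhere on this page):

// table keyed the way it is queried
$session->execute(new Cassandra\SimpleStatement(
    "CREATE TABLE IF NOT EXISTS prod.log_by_table_and_id (
         db_table  text,
         db_id     text,
         logged_at timeuuid,
         payload   text,
         PRIMARY KEY ((db_table, db_id), logged_at)
     )"
));

// the lookup now targets a single partition and needs no ALLOW FILTERING
$prepared = $session->prepare(
    "SELECT * FROM prod.log_by_table_and_id WHERE db_table = ? AND db_id = ?"
);
$result = $session->execute($prepared, array('arguments' => array($db_table, $db_id)));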
More detail on the cluster builder options can be found here:
https://docs.datastax.com/en/developer/php-driver/1.2/api/Cassandra/Cluster/class.Builder/
Using withConnectTimeout may help avoid a TimeoutException
$cluster = $this->cluster()->withConnectTimeout(60);
You may also increase the timeout values by changing them in /etc/cassandra/cassandra.yaml,
similar to below:
sudo nano /etc/cassandra/cassandra.yaml    (to edit the cassandra.yaml file)
# How long the coordinator should wait for read operations to complete
read_request_timeout_in_ms: 50000
# How long the coordinator should wait for seq or index scans to complete
range_request_timeout_in_ms: 100000
# How long the coordinator should wait for writes to complete
write_request_timeout_in_ms: 20000
# How long the coordinator should wait for counter writes to complete
counter_write_request_timeout_in_ms: 50000
# How long a coordinator should continue to retry a CAS operation
# that contends with other proposals for the same row
cas_contention_timeout_in_ms: 10000
# How long the coordinator should wait for truncates to complete
# (This can be much longer, because unless auto_snapshot is disabled
# we need to flush first so we can snapshot before removing the data.)
truncate_request_timeout_in_ms: 600000
# The default timeout for other, miscellaneous operations
request_timeout_in_ms: 100000
# How long before a node logs slow queries. Select queries that take longer than
# this timeout to execute, will generate an aggregated log message, so that slow queries
# can be identified. Set this value to zero to disable slow query logging.
slow_query_log_timeout_in_ms: 5000
using "LIKE" on cassandra i dont think so :(
your query :( try do something more clean instead use ".$db_table." do the properly bind use . ? and then inside your exec(query,['value'])
what do your mean ? with SELECT * FROM ".$this->keyspace.".log
this is not a query ! if you use this in php, the syntax definitely this is completely wrong.
you wrote
SELECT * FROM keyspace_name.log WHERE table like 'wherever' and id = 'something' :(
and worse select ALL
this never gonna happen
u use $this for invoke your cluster from where ? did this come from ?
ok i can enumerate 10 different reasons to your code doesn't work, but this is not my goal i want help u so just try something simple.
Good:
<?php
$cluster = Cassandra::cluster()
    ->withContactPoints('127.0.0.1')
    ->build();
$session = $cluster->connect("your_k_space");

// collect the table names of the keyspace so the requested table can be validated
$table_list = array();
$rows = $session->execute("SELECT table_name FROM system_schema.tables WHERE keyspace_name = 'your_k_space'");
foreach ($rows as $row) {
    $table_list[] = $row['table_name'];
}

if (in_array($db_table, $table_list)) {
    // a table name cannot be a bound parameter, but it has been validated above;
    // only the value is bound
    $options = array('arguments' => array($db_id));
    $result = $session->execute("SELECT * FROM your_k_space.".$db_table." WHERE db_id = ? ALLOW FILTERING", $options);
    foreach ($result as $key => $value) print_r($value);
} else {
    die('table not found');
}

Yii large SQL queries consume a large amount of memory

I am using Yii 1.1.14 with PHP 5.3 on CentOS 6, and I am using CDbCommand to fetch data from a very large table; the result set is ~90,000 records over 10 columns. I am exporting it to a CSV file, and the file size is about 15MB.
The script always crashed without any error message, and only after some research did I figure out that I needed to raise the memory_limit in php.ini in order to execute the script successfully.
The only problem is that for a successful execution I had to raise the memory limit to 512MB(!), which is a lot, and if 10 users are executing the same script my server will not respond very well...
I was wondering if anyone knows a way to reduce memory consumption of SQL queries with Yii?
I know I can split the query into multiple queries using limits and offsets, but it just doesn't seem logical that a 15MB query would consume 512MB.
Here is the code:
set_time_limit(0);
$connection = new CDbConnection($dsn,$username,$password);
$command = $connection->createCommand('SELECT * FROM TEST_DATA');
$result = $command->queryAll(); //this is where the script crashes
print_r($result);
Any ideas would be greatly appreciated!
Thanks,
Instead of using queryAll(), which returns all the rows in a single array (that is where the real memory problem is), you should simply use a foreach loop over a data reader (take a look at CDbDataReader), e.g.:
$command = $connection->createCommand('SELECT * FROM TEST_DATA');
$rows = $command->query();   // returns a CDbDataReader; rows are fetched one at a time
foreach ($rows as $row)
{
    // process/write out $row here; only one row is held in memory at a time
}
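Since the goal here is a CSV export, you can also write each row out as soon as it is read instead of accumulating anything in PHP, so memory use stays roughly constant regardless of the table size. A minimal sketch (the file path is just an example):

$fp = fopen('/tmp/test_data.csv', 'w');
$rows = $connection->createCommand('SELECT * FROM TEST_DATA')->query();
foreach ($rows as $row)
{
    fputcsv($fp, $row);   // one row at a time, nothing kept in memory
}
fclose($fp);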
EDIT: Using LIMIT
$count = Yii::app()->db->createCommand('SELECT COUNT(*) FROM TEST_DATA')->queryScalar();
$maxRows = 1000;
$maxPages = ceil($count / $maxRows);

for ($i = 0; $i < $maxPages; $i++)
{
    $offset = $i * $maxRows;
    $rows = $connection->createCommand("SELECT * FROM TEST_DATA LIMIT $offset,$maxRows")->query();
    foreach ($rows as $row)
    {
        // Here your code
    }
}

Why does str_shuffle always generate similar patterns?

I am trying to generate 1500 authentication codes using the following code:
<?php
include "../include/top.php";
set_time_limit(0);
ini_set("memory_limit", "-1");

$end = 0;
while ($end < 1500)
{
    // generate an authentication code
    $string = "ABCDEFGHJKLMNPQRSTUVWXYZ123456789";
    $string = substr(str_shuffle($string), 5, 8);

    // check whether the generated code already exists
    $query = "select count(*) from auth where code = '$string' ";
    $stmt = prepare($query);
    execute($stmt);
    $bind = mysqli_stmt_bind_result($stmt, $count);
    check_bind_result($bind);
    mysqli_stmt_fetch($stmt);
    mysqli_stmt_free_result($stmt);

    // if the generated code does not already exist, insert it into the database table
    if ($count == 0)
    {
        echo $string."<br>";
        $query = "insert into auth (Code) values ('$string')";
        $stmt = prepare($query);
        execute($stmt);
        $end++;
    }
}
?>
It generated and inserted 1024 codes into the database and printed 667 codes in the browser within 15 seconds, then the browser kept loading without inserting/printing any further codes until I closed the browser window half an hour later.
After that, when opening any web page served by WAMP, the browser just keeps loading and never shows the content. That is, I need to restart WAMP after running this script before I can open any web pages.
I have tried this many times.
Why does the script not generate 1500 codes, and why does it always stop when it reaches 667/1024?
UPDATE
As an experiment, I added an ELSE clause to the IF condition with code to print "Code Already Exist", and ran the script against an empty (truncated) copy of the same table. It printed and inserted 1024 codes, and after that it printed "Code Already Exist" continuously (around 700,000+ times within 5 minutes, and counting). And when running the script again with the table already holding those 1024 rows, it doesn't print or insert even a single code; instead it prints "Code Already Exist" indefinitely.
Another thing I observed is that the first 1024 iterations of the WHILE loop pass the IF condition (if the table is empty), and all subsequent iterations fail it.
I don't think the randomiser behind str_shuffle() is up to this.
If I run it once, I get 1024 unique codes and then it just generates duplicates. If I then restart it, it will generate another 976 unique codes, giving a total of 2000 codes in the database.
I therefore assume that the randomiser used by str_shuffle() needs a reset to accomplish the generation of the required 1500 unique codes.
Try this minor modification; it will at least stop the execution after 15,000 failed attempts at generating a unique code.
Basically, I think you have to come up with a much better randomisation mechanism.
<?php
include "../include/top.php";
set_time_limit(0);
ini_set("memory_limit", "-1");

$end = 0;
$dups = 0;
while ($end < 1500 && $dups < 15000)
{
    // generate an authentication code
    $string = "ABCDEFGHJKLMNPQRSTUVWXYZ123456789";
    $string = substr(str_shuffle($string), 5, 8);

    // check whether the generated code already exists
    $query = "select count(*) from auth where code = '$string' ";
    $stmt = prepare($query);
    execute($stmt);
    $bind = mysqli_stmt_bind_result($stmt, $count);
    check_bind_result($bind);
    mysqli_stmt_fetch($stmt);
    mysqli_stmt_free_result($stmt);

    // if the generated code does not already exist, insert it into the database table
    if ($count == 0) {
        echo $string."<br>";
        $query = "insert into auth (Code) values ('$string')";
        $stmt = prepare($query);
        execute($stmt);
        $end++;
    } else {
        $dups++;
        echo "DUPLICATE for $string, Dup Count = $dups<br>";
    }
}
?>
Why you need to restart: you set the PHP timeout to 0, so the script never times out.
I don't see any specific coding errors. The use of str_shuffle() for creating an authentication code is peculiar because it prevents duplicated characters, which makes the range of possible values much smaller. So it may just be repeating the patterns.
Try something like this instead, so that you are shuffling the already-shuffled string:
$origstring = str_shuffle("ABCDEFGHJKLMNPQRSTUVWXYZ123456789");
while ($end < 1500 && $dups < 15000)
{
    $origstring = str_shuffle($origstring);
    $string = substr($origstring, 5, 8);
Or, use something like this to generate the string so that you can have duplicates, creating a much larger range of possible values:
$characters = 'ABCDEFGHIJKLMNOPQRSTUVWXYZ1234567890';
$code = '';
for ($i = 0; $i < 8; $i++)
{
    $code .= $characters[mt_rand(0, 35)];
}
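Wrapped up as a small helper (hypothetical function name, same idea as the snippet above):

function generate_code($length = 8)
{
    $characters = 'ABCDEFGHIJKLMNOPQRSTUVWXYZ1234567890';
    $code = '';
    for ($i = 0; $i < $length; $i++) {
        $code .= $characters[mt_rand(0, strlen($characters) - 1)];
    }
    return $code;
}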
You have to fine-tune some variables in your php.ini configuration file, find those:
; Maximum execution time of each script, in seconds
; http://php.net/max-execution-time
; Note: This directive is hardcoded to 0 for the CLI SAPI
max_execution_time = 600
And, not mandatory, you can change those also:
suhosin.post.max_vars = 5000
suhosin.request.max_vars = 5000
After modification, restart your web server.

Server error executing a large file

I have created a script which reads an XML file and adds it to the database. I am using XMLReader for this.
The problem is that my XML contains 500,000 products, which causes my page to time out. Is there a way for me to work around this?
My code below:
$z = new XMLReader;
$z->open('files/NAGardnersEBook.xml');
$doc = new DOMDocument;

# move to the first <EBook/> node
while ($z->read() && $z->name !== 'EBook');

# now that we're at the right depth, hop to the next <EBook/> until the end of the tree
while ($z->name === 'EBook')
{
    $node = simplexml_import_dom($doc->importNode($z->expand(), true));

    # Get the value of each node
    $title = mysql_real_escape_string($node->Title);
    $Subtitle = mysql_real_escape_string($node->SubTitle);
    $ShortDescription = mysql_real_escape_string($node->ShortDescription);
    $Publisher = mysql_real_escape_string($node->Publisher);
    $Imprint = mysql_real_escape_string($node->Imprint);

    # Get attributes
    $isbn = $z->getAttribute('EAN');
    $contributor = $node->Contributors;
    $author = $contributor[0]->Contributor;
    $author = mysql_real_escape_string($author);
    $BicSubjects = $node->BicSubjects;
    $Bic = $BicSubjects[0]->Bic;
    $bicCode = $Bic[0]['Code'];
    $formats = $node->Formats;
    $type = $formats[0]->Format;
    $price = $type[0]['Price'];
    $ExclusiveRights = $type[0]['ExclusiveRights'];
    $NotForSale = $type[0]['NotForSale'];

    $arr[] = "UPDATE onix_d2c_data SET is_gardner='Yes', TitleText = '".$title."', Subtitle = '".$Subtitle."', PersonName='".$author."', ImprintName = '".$Imprint."', PublisherName = '".$Publisher."', Text = '".$ShortDescription."', BICMainSubject = '".$bicCode."', ExcludedTerritory='".$NotForSale."', RightsCountry='".$ExclusiveRights."', PriceAmount='".$price."', custom_category= 'Uncategorised', drm_type='adobe_drm' WHERE id='".$isbn."' ";

    # go to next <EBook/>
    $z->next('EBook');
    $isbns[] = $isbn;
}

foreach ($isbns as $isbn) {
    $sql = "SELECT * FROM onix_d2c_data WHERE id='".$isbn."'";
    $query = mysql_query($sql);
    $count = mysql_num_rows($query);
    if ($count > 0) {
        # row already exists, nothing to do
    } else {
        $sql = "INSERT INTO onix_d2c_data (id) VALUES ('".$isbn."')";
        $query = mysql_query($sql);
    }
}

foreach ($arr as $sql) {
    mysql_query($sql);
}
Thank you,
Julian
You could use the function set_time_limit to extend the allowed script execution time or set max_execution_time in your php.ini.
You need to set these variables. Make sure you have permission to change them:
set_time_limit(0);
ini_set('max_execution_time', '6000');
You're executing two queries for each ISBN, just to check whether the ISBN already exists. Instead, set the ISBN column to unique (if it isn't already; it should be), then just go ahead and insert without checking. MySQL will return an error if it detects a duplicate, which you can handle. This reduces the number of queries and improves performance.
You're inserting each title with a separate call to the database. Instead, use the extended INSERT syntax to batch up many inserts in one query; see the MySQL manual for the full syntax. Batching, say, 250 inserts will save a lot of time.
If you're not happy with batching inserts, use mysqli prepared statements, which reduce parsing time and transmission time, so they should improve your overall performance.
You can probably trust Gardners' list; consider dropping some of the escaping you're doing. I wouldn't normally recommend this for user input, but this is a special case.
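As a rough illustration of the batching point above (table and column names taken from the question; the batch size of 250 is arbitrary, and INSERT IGNORE assumes id is a unique/primary key so duplicates are silently skipped):

$values = array();
foreach ($isbns as $isbn) {
    $values[] = "('".mysql_real_escape_string($isbn)."')";
    if (count($values) == 250) {
        mysql_query("INSERT IGNORE INTO onix_d2c_data (id) VALUES ".implode(',', $values));
        $values = array();
    }
}
if ($values) {
    // flush the final partial batch
    mysql_query("INSERT IGNORE INTO onix_d2c_data (id) VALUES ".implode(',', $values));
}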
Have you tried adding set_time_limit(0); at the top of your PHP file?
EDIT :
ini_set('memory_limit','16M');
Specify your limit there.
If you don't want to change max_execution_time as proposed by others, you could also split the job into several smaller tasks and let the server run them via a cron job at regular intervals (see the sketch below).
E.g. 10,000 products each minute.
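A minimal sketch of that idea, assuming the importer can be run from the command line and keeps its position in a small state file (the file name and chunk size are made up):

// import_chunk.php, run from cron e.g. once a minute:
//   * * * * * php /path/to/import_chunk.php
$stateFile = __DIR__.'/import_offset.txt';   // hypothetical progress file
$offset = file_exists($stateFile) ? (int) file_get_contents($stateFile) : 0;
$limit = 10000;                               // products to handle per run

// ... open the XML with XMLReader, skip the first $offset <EBook/> nodes,
//     and process at most $limit of them as in the question ...

file_put_contents($stateFile, $offset + $limit);   // remember where to resume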
Thank you all for such fast feedback. I managed to get the problem sorted by using array_chunk. Example below:
$thumbListLocal = array_chunk($isbns, 4, true);   // true = preserve keys
$thumbListLocalCount = count($thumbListLocal);
$i = 0;
while ($i < $thumbListLocalCount):
    $sqlConstruct = array();
    foreach ($thumbListLocal[$i] as $index => $thumbName):
        $sqlConstruct[] = "INSERT IGNORE INTO onix_d2c_data (id) VALUES ('".$thumbName."')";
    endforeach;
    foreach ($sqlConstruct as $processSql) {
        mysql_query($processSql);
    }
    unset($thumbListLocal[$i]);
    $i++;
endwhile;
I hope this helps someone.
Julian
