I've got a script that I needed to change because the data being inserted into the database got too big to insert all at once. So I created a loop that splits the array into blocks of 6000 rows and inserts them one block at a time.
I don't know exactly whether the data is too big for the server to process at once or too big to upload, but at the moment both steps are split up into these 6000-row blocks.
Code:
for ($j = 0; $j <= ceil($alength / 6000); $j++) {
    $array = array_slice($arraysource, $j * 6000, 5999);
    $sql = "INSERT INTO Ranking (rank, name, score, kd, wins, kills, deaths, shots, time, spree) VALUES ";
    foreach ($array as $a => $value) {
        // transforming code for array
        $ra = $array[$a][0];
        $na = str_replace(",", ",", $array[$a][1]);
        $na = str_replace("\\", "\\\\", $na);
        $na = str_replace("'", "\'", $na);
        $sc = $array[$a][2];
        $kd = $array[$a][3];
        $wi = $array[$a][4];
        $ki = $array[$a][5];
        $de = $array[$a][6];
        $sh = $array[$a][7];
        $ti = $array[$a][8];
        $sp = $array[$a][9];
        $sql .= "('$ra',' $na ','$sc','$kd','$wi','$ki','$de','$sh','$ti','$sp'),";
    }
    $sql = substr($sql, 0, -1);
    $conn->query($sql);
}
$conn->close();
Right now it only inserts the first 5999 rows and nothing more, as if the loop only ran once. No error messages.
Don't know if this'll necessarily help, but what about using array_chunk, array_walk, and checking error codes (if any)? Something like this:
function create_query(&$value, $key) {
    // returns query statements; destructive though
    $value[1] = str_replace(",", ",", $value[1]);
    $value[1] = str_replace("\\", "\\\\", $value[1]);
    $value[1] = str_replace("'", "\'", $value[1]);
    $queryvalues = implode("','", $value);
    $value = "INSERT INTO Ranking (rank, name, score, kd, wins, kills, deaths, shots, time, spree) VALUES ('" . $queryvalues . "');";
}

$array = array_chunk($arraysource, 6000);
foreach ($array as $key => $value) {
    array_walk($value, 'create_query');
    // run each generated statement in the chunk and check the error code
    foreach ($value as $statement) {
        if (!$conn->query($statement)) {
            printf("Error code: %d\n", $conn->errno);
        }
    }
}
$conn->close();
Secondly, have you considered using mysqli::multi_query? It lets you send several queries at once, but you'll have to keep the maximum allowed packet size (max_allowed_packet) in mind.
Another tip would be to check out the response from the query, which your code doesn't include.
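To illustrate the multi_query route, here's a minimal sketch that reuses the create_query helper and the 6000-row chunks from above; it assumes $conn is a mysqli connection and that each batch stays under max_allowed_packet:
// minimal sketch: send one batch of statements per chunk via multi_query
$chunks = array_chunk($arraysource, 6000);
foreach ($chunks as $chunk) {
    array_walk($chunk, 'create_query');      // build one INSERT per row, as above
    $batch = implode(' ', $chunk);           // join the statements into one packet
    if (!$conn->multi_query($batch)) {
        printf("Error code: %d\n", $conn->errno);
    }
    // drain the result queue so the next multi_query call doesn't fail
    while ($conn->more_results() && $conn->next_result()) {
        // INSERT statements return no result sets to fetch
    }
}
$conn->close();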
Thanks for the tips, but I figured it out. I didn't think about this ^^
It was the first line after the for loop that I didn't include in my question:
array_unshift($array[$a], $a + 1);
This adds an additional value in front of each user: the rank. But the numbers would repeat once one pass of the loop finished, and it can't import users with the same rank.
Now it works:
array_unshift($array[$a], $a + 1 + $j * 5999);
Related
I have a large text file with rows of data that need to be imported into the database. But the file contains around 300,000 rows and I can't get it to work, because the query seems too big.
$all_inserts = array();
$count = 0;
foreach ($file as $line) {
    if ($count > 0) {
        $modelnummer = trim(substr($line, 0, 4));
        $datum = trim(substr($line, 4, 8));
        $acc_nummer = trim(substr($line, 12, 4));
        $acc_volgnr = trim(substr($line, 16, 1));
        $prijs = trim(substr($line, 17, 5));
        $mutatiecode = trim(substr($line, 22, 1));
        $all_inserts[] = array($modelnummer, $datum, $acc_nummer, $acc_volgnr, $prijs, $this->quote($mutatiecode));
    }
    $count++;
}

$query = 'INSERT INTO accessoire_model_brommer (modelnummer, datum, acc_nummer, acc_volgnr, prijs, mutatiecode) VALUES ';
$rows = array();
foreach ($all_inserts as $one_insert) {
    $rows[] = '(' . implode(',', $one_insert) . ')';
}
$query .= ' ' . implode(',', $rows);
$db->query($query);
I used the above code for smaller files and it works fine and fast, but it doesn't work for the bigger files. Does anyone know a better way to read and insert this file?
I also tried using one insert statement per row inside a transaction, but that doesn't work either.
Note: I don't know the exact limitations of PDO or SQL for query length
If the query seems too big, perhaps you could split up the text file, so instead of running one query with 300k values you run 100 queries with 3,000 values each.
You could create a buffer, fill it with 3,000 rows and run the query, then empty the buffer, fill it with the next 3,000 rows and run the query again, as sketched below.
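A minimal sketch of that buffering idea, assuming the same $file, $db and $this->quote() as in the question (the batch size of 3,000 is just an example):
$buffer = array();
$count = 0;
foreach ($file as $line) {
    if ($count > 0) {
        $row = array(
            trim(substr($line, 0, 4)),                // modelnummer
            trim(substr($line, 4, 8)),                // datum
            trim(substr($line, 12, 4)),               // acc_nummer
            trim(substr($line, 16, 1)),               // acc_volgnr
            trim(substr($line, 17, 5)),               // prijs
            $this->quote(trim(substr($line, 22, 1))), // mutatiecode
        );
        $buffer[] = '(' . implode(',', $row) . ')';
    }
    $count++;

    // flush the buffer every 3000 rows
    if (count($buffer) >= 3000) {
        $db->query('INSERT INTO accessoire_model_brommer (modelnummer, datum, acc_nummer, acc_volgnr, prijs, mutatiecode) VALUES ' . implode(',', $buffer));
        $buffer = array();
    }
}

// flush whatever is left after the loop
if (!empty($buffer)) {
    $db->query('INSERT INTO accessoire_model_brommer (modelnummer, datum, acc_nummer, acc_volgnr, prijs, mutatiecode) VALUES ' . implode(',', $buffer));
}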
I need to update the tags column so that each cell has content like this:
2-5-1-14-5
or
3-9-14-19-23
or similar (five integers, in the range 1-25).
The id column is not consecutive from 1 to 117, but the minimum id is 1 and the maximum is 117.
$arr = [];
$str = '';
$id = 1;

for ($x = 1; $x <= 25; $x++) {
    array_push($arr, $x);
}

while ($id < 117) {
    shuffle($arr);
    array_splice($arr, 5, 25);
    foreach ($arr as $el) {
        $str .= $el . '-';
    }
    $str = rtrim($str, '-');
    $db->query("update posts set tags = '" . $str . "' where id = " . $id);
    $id += 1;
}
I'm not sure how to describe the final result, but it seems that the majority of cells end up being written multiple times.
Any help?
To combine my comments into one piece of code:
$full = range(1, 25);
$id = 1;
while ($id < 117) {
    shuffle($full);
    $section = array_slice($full, 0, 5);
    $str = implode('-', $section);
    $db->query("update posts set tags = '" . $str . "' where id = " . $id);
    $id += 1;
}
So the reset of $str is no longer needed, since I have inserted the implode() where it seems to belong. The other bits of code could probably be improved as well.
Two warnings:
Using PHP variables directly in queries is not a good idea; please use parameter binding (see the sketch after these warnings). This particular piece of code might not be vulnerable to SQL injection, but if you do the same elsewhere it might be.
Your database doesn't seem to be normalized. This might cause trouble for you in the long run when you expand your application.
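For the first warning, here's a minimal sketch of the parameter-binding version, assuming $db is (or can be swapped for) a PDO connection; the id range 1-117 is taken from the question:
// minimal sketch using a PDO prepared statement; assumes $db is a PDO instance
$full = range(1, 25);
$stmt = $db->prepare('UPDATE posts SET tags = :tags WHERE id = :id');

for ($id = 1; $id <= 117; $id++) {
    shuffle($full);
    $tags = implode('-', array_slice($full, 0, 5));
    $stmt->execute(array(':tags' => $tags, ':id' => $id));
}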
I have the following code - it produces a series of queries that are sent to a database:
$a = 'q';
$aa = 1;
$r = "$a$aa";
$q = 54;

while ($aa <= $q) {
    $query .= "SELECT COUNT(" . $r . ") as Responses FROM tresults;";
    $aa = $aa + 1;
    $r = "$a$aa";
}
The issue I have is simple: within the database, the numbering is not sequential.
I have fields that go from q1 to q13 but then goes q14a, q14b, q14c, q14d and q14e and then from q15 to q54.
I've looked at continue, but that's more for skipping iterations and hasn't helped me.
I'm struggling to adapt the above code to handle this non-sequential situation. Any ideas and suggestions are welcome.
I have fields that go from q1 to q13 but then goes q14a, q14b, q14c, q14d and q14e and then from q15 to q54.
for ($i = 1; $i <= 54; ++$i) {
    if ($i != 14) {
        echo 'q' . $i . "<br>";
    } else {
        for ($j = 'a'; $j <= 'e'; ++$j) {
            echo 'q14' . $j . "<br>";
        }
    }
}
If you don't need to execute the statements in order of numbering, you could also just skip number 14 in the first loop and then have a second loop (not nested inside the first one) that does the q14 columns afterwards, as sketched below.
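A minimal sketch of that two-loop variant, reusing the query-building pattern from the question (it assumes the columns are exactly q1-q13, q14a-q14e and q15-q54):
$query = '';
// first loop: q1 to q54, skipping the split-up q14
for ($i = 1; $i <= 54; ++$i) {
    if ($i == 14) {
        continue;
    }
    $query .= "SELECT COUNT(q" . $i . ") as Responses FROM tresults;";
}
// second loop: the q14 sub-columns afterwards
foreach (array('a', 'b', 'c', 'd', 'e') as $suffix) {
    $query .= "SELECT COUNT(q14" . $suffix . ") as Responses FROM tresults;";
}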
You could get the columns from the table and test to see if they start with q (or use a preg_match):
$result = query("DESCRIBE tresults");   // pseudo-code: use your DB layer's query/fetch here
while ($row = fetch($result)) {
    if (strpos($row['Field'], 'q') === 0) {
        $query .= "SELECT COUNT(" . $row['Field'] . ") as Responses FROM tresults;";
    }
}
Or build the columns array and use it:
$columns = array('q1', 'q2', 'q54'); // etc...
foreach ($columns as $r) {
    $query .= "SELECT COUNT(" . $r . ") as Responses FROM tresults;";
}
I want to generate pairs of random numbers in the range 1234567890-9876543210 (10 digits each).
I made the code below. It works fine and generates a pair of random numbers, BUT when I try to insert them into the database I get the same results multiple times. Let's say I get 1234567890 more than once. If I echo the insert statements I get different results, but when I run them against the database I get the same results.
$numbers = array(0, 1, 2, 3, 4, 5, 6, 7, 8, 9);
srand(time());
$f = fopen('sql0.txt', 'w');

for ($i = 0; $i < 100000; $i++) {
    $r = NULL;
    $r2 = NULL;
    for ($x = 0; $x < 10; $x++) {
        $n = rand(0, 9);
        $r .= $numbers[$n];
    }
    for ($x = 0; $x < 10; $x++) {
        $n1 = rand(0, 9);
        $r2 .= $numbers[$n1];
    }
    echo("INSERT INTO ci_codes VALUES (NULL, '$r', '$r2', '0')<br>");
}
Do you need the INSERT expression inside your loop? That might be causing the trouble.
As others have mentioned, it is better to just generate the numbers once with the PHP function and then run your MySQL query.
$r = mt_rand(0, 10000000);
$r2 = mt_rand(0, 10000000);
echo("INSERT INTO ci_codes VALUES (NULL, '$r', '$r2', '0')");
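If you need actual 10-digit values in the 1234567890-9876543210 range from the question, here is a minimal sketch; it assumes PHP 7+ (for random_int) on a 64-bit build so the upper bound fits in an integer, and it keeps a lookup array so no value repeats within the batch:
// minimal sketch: unique 10-digit pairs, generated once and echoed as INSERTs
$seen = array();
for ($i = 0; $i < 100000; $i++) {
    do {
        $r = random_int(1234567890, 9876543210);
    } while (isset($seen[$r]));
    $seen[$r] = true;

    do {
        $r2 = random_int(1234567890, 9876543210);
    } while (isset($seen[$r2]));
    $seen[$r2] = true;

    echo "INSERT INTO ci_codes VALUES (NULL, '$r', '$r2', '0')<br>";
}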
How can I get this random string generator to actually create random strings? I keep getting repeats. The arrays generally consist of between 0 and 10 items, but it still produces the same number of beds and baths on the repeats, and I know that statistically that's messed up.
How can I eliminate the repeats?
for ($i = 0; $i <= 1000000; $i++) {
    srand($i);
    $price = rand(20000, 1000000);
    $bed = rand(0, 20);
    $bath = rand(0, 7);
    $addressnum = rand(100, 10000);
    $address = (int) preg_replace('/\D/', '', $addressnum) . " lol st";
    $province = $f_contents[rand(0, count($f_contents) - 1)];
    $postedby = 3;
    $description = $de_contents[rand(0, count($de_contents) - 1)];
    $status = "Unsold";
    $type = $status_a[array_rand($status_a)];
    $category = $category_type[array_rand($category_type)];
    $size = rand(100, 100000);
    $builtin = rand(1850, 2013);

    $queryString = "INSERT INTO listings
        (PRICE, ADDRESS, PROVINCE, DESCRIPTION, STATUS, TYPE, CATEGORY, SIZE, BUILTIN, BED, BATH, POSTED_BY) VALUES
        ($price, '$address', '$province', '$description', '$status', '$type', '$category', $size, $builtin, $bed, $bath, $postedby)";

    echo $queryString . "<br>";
    $query = $db->query($queryString);
}
Stop using rand(). Using mt_rand() instead should solve this problem.
I've come across problems several times with rand() and repeated values. I have to admit that in those cases I never really took the time to check why rand() seems so wonky.
Take srand() outside the loop. srand() should be called ONCE, and only once, in your program. Thereafter, rand() will produce random results. If you call srand() repeatedly, you're not getting random numbers at all, you're getting a hash value of the seeds you call it with.
Sure, switching to mt_rand() will give you even higher-quality random numbers, but that wasn't the problem with your code.
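To make that concrete, here's a minimal sketch of the fix applied to the loop from the question; only the seeding changes, the rest of the row-building and INSERT code stays as it is:
// seed once before the loop (optional: PHP seeds the generator automatically)
srand();
for ($i = 0; $i <= 1000000; $i++) {
    // no srand($i) in here any more
    $price = rand(20000, 1000000);
    $bed   = rand(0, 20);
    $bath  = rand(0, 7);
    // ... build $queryString and run $db->query($queryString) as in the question ...
}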