I'm trying to slim down the code:
I want to make a for loop out of this part, but it wouldn't work.
$line1 = $frage1[0] . '|' . $frage1[1] . '|' . $frage1[2] . '|' . $frage1[3];
$line2 = $frage2[0] . '|' . $frage2[1] . '|' . $frage2[2] . '|' . $frage2[3];
$line3 = $frage3[0] . '|' . $frage3[1] . '|' . $frage3[2] . '|' . $frage3[3];
$line4 = $frage4[0] . '|' . $frage4[1] . '|' . $frage4[2] . '|' . $frage4[3];
$line5 = $frage5[0] . '|' . $frage5[1] . '|' . $frage5[2] . '|' . $frage5[3];
This is my attempt:
for ($i=1; $i<6; $i++){
${line.$i} = ${frage.$i}[0] . '|' . ${frage.$i}[1] . '|' . ${frage.$i}[2] . '|' . ${frage.$i}[3];
}
EDIT:
This is the solution that works (just so simple :-p):
for ($i=1; $i<18; $i++){
${"line".$i} = implode("|", ${"frage".$i});
fwrite($antworten, ${"line".$i});
}
for ($i=1; $i<6; $i++){
$line[$i] = ${"frage$i"}[0] . '|' . ${"frage$i"}[1] . '|' . ${"frage$i"}[2] . '|' . ${"frage$i"}[3];
}
This will insert the same values you posted above into $line[1] through $line[5]. Is this what you are looking for?!
for ($i=1; $i<6; $i++){
${"line$i"} = ${"frage$i"}[0] . '|' . ${"frage$i"}[1] . '|' . ${"frage$i"}[2] . '|' . ${"frage$i"}[3];
}
That works too!
TL;DR
You might find the documentation on variable variables useful, but it looks like the problem is that PHP cannot parse the unquoted bareword inside your curly braces.
So instead of ${frage.$i} you need ${"frage$i"}.
Another Approach
However, this is probably not the clearest way to solve this problem. It certainly gives me a bit of a headache trying to work out what this code is trying to do. Instead I would recommend adding all of your $frage to an array first and then looping as follows:
$lines = array();
$frages = array($frage1, $frage2, $frage3, $frage4, $frage5);
foreach($frages as $frage) {
$lines[] = join('|', $frage);
}
Note in particular that you can use join to concatenate each $frage with a | in between each element, rather than doing this concatenation manually. You could always use array_slice if you really did only want the first 4 elements of the array to be joined.
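For instance, a quick sketch of that array_slice variant (the sample data here is made up, since the original $frage contents aren't shown):

```php
<?php
// Hypothetical question array with a 5th element we don't want in the line
$frage = array("Q1", "a", "b", "c", "extra");

// Keep only the first 4 elements, then join them with '|'
$line = join('|', array_slice($frage, 0, 4));

echo $line; // Q1|a|b|c
```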
If you have a significant number of $frage variables and don't want to add them to an array manually then:
$frages = array();
for($i = 1; $i < x; $i++) {
$frages[] = ${"frage$i"};
}
If you really need each $line variable rather than an array of lines, then you can use extract although this will give you variables like $line_1 rather than $line1:
extract($lines, EXTR_PREFIX_ALL, "line");
However, I would recommend taking a serious look at why you have these numbered $frage being generated and why you need these numbered $line as your output. I would be very surprised if your code could not be re-factored to just use arrays instead, which would make your code much simpler, less surprising and easier to maintain.
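For illustration, such an array-only refactoring could look like the following sketch (the question data is invented, since the real $frage contents aren't shown):

```php
<?php
// All questions in one nested array instead of numbered $frage1..$frage5
$fragen = array(
    array("Q1", "a", "b", "c"),
    array("Q2", "d", "e", "f"),
);

$lines = array();
foreach ($fragen as $frage) {
    $lines[] = join('|', $frage); // one pipe-separated line per question
}

echo $lines[0]; // Q1|a|b|c
```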
Related
I have searched everywhere, but nothing I find seems to help solve this. I have an HTML web form (in a PHP document) that writes data to a CSV file, and below the form is a table that filters the CSV data back in based on a keyword. I have no problems with my existing code for that part. However, I need an auto-number function that assigns a number to each form. I need help on even where to start. I'm still relatively new to coding, so any help would be great.
Edit: Here is the code I use to write my data to the csv file.
if($_POST['formSubmit'] == "Submit")
{
$fs = fopen("fixturerequests.csv","a");
fwrite($fs,$varFixNum . ", " . $varRequester . ", " . $varDept . ", " . $varSupervisor . ", " . $varDesc . ", " . $varParts . ", " . $varWC . ", " . $varAddinfo . ", " . $varDateReq . ", " . $varDateNeed . ", " .$varStatus . "\n");
fclose($fs);
header("Location: successfullysubmitted.php");
exit;
}
Any guidance would be excellent. Thank you.
You can use this function
function next_available_form_id(){
$rows = file('fixturerequests.csv'); //put our csv file into an array
if(empty($rows)) return 1; //if our csv is empty we start from 1
$data = str_getcsv(array_pop($rows)); //array_pop gets the last row
return $data[0]+1; //we get first field and add 1 to it
//Just use the field where you store the form number
//e.g if you store the form number in the
//4th field replace $data[0] with $data[3]
}
Based on the code you provided, you can use the function above to get the next form id before storing the record in the csv file. Just make this modification to your code after opening the csv file:
$fs = fopen("fixturerequests.csv","a");
$form_id=next_available_form_id(); //ADD THIS to get the next available id
//And insert $form_id as the first field in your csv file
fwrite($fs, $form_id . ", " . $varFixNum . ", " . $varRequester . ", " . $varDept . ", " . $varSupervisor . ", " . $varDesc . ", " . $varParts . ", " . $varWC . ", " . $varAddinfo . ", " . $varDateReq . ", " . $varDateNeed . ", " . $varStatus . "\n");
Notice:
Of course, since the csv you have now does not have the form id as the first field, you should either create your csv file from scratch or add form numbers to your existing records. In the example I use awk to do that:
awk '{printf "%d,%s\n", NR, $0}' < fixturerequests.csv
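As an aside, fputcsv can handle separators and quoting for you when appending records. A minimal sketch with the form id as the first field (the other values are shortened stand-ins for the question's $var* variables):

```php
<?php
// Sketch: append one record with the form id as the first field.
$form_id = 1;                                  // e.g. from next_available_form_id()
$fields  = array($form_id, "FX-100", "Alice"); // shortened stand-in fields

$fs = fopen("fixturerequests.csv", "a");
fputcsv($fs, $fields); // quotes/escapes each field automatically
fclose($fs);

echo file_get_contents("fixturerequests.csv"); // 1,FX-100,Alice
```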
I have the following code in one of my pages. Prior to this I execute a query that returns multiple rows keyed off of alias_code. This code creates an array of string arrays to be echoed into a JavaScript function for populating points on a graph. I've profiled this multiple times, but I still have the feeling that there's a more efficient way to do this. I do realize that I run the risk of running out of memory if my strings are too big, but I'll constrain this in my query since I'd like to avoid an additional sub array or the use of implode/join. Does anyone have any thoughts on speeding this up?
$detailArray = array();
$prevAliasCode = '';
$valuesStr = '';
while ($detailRow = mysqli_fetch_array($detailResult)) {
$aliasCode = $detailRow['alias_code'];
if ($aliasCode <> $prevAliasCode) {
if ($valuesStr <> '')
$detailArray[$prevAliasCode] = $valuesStr;
$valuesStr = '';
}
if ($valuesStr <> '')
$valuesStr = $valuesStr . ', ';
$valuesStr = $valuesStr .
"['" .
$detailRow['as_of_date'] . "', " .
$detailRow['difficulty'] . ", " .
$detailRow['price_usd'] . "]";
$prevAliasCode = $aliasCode;
}
$detailArray[$prevAliasCode] = $valuesStr;
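For comparison, one way to avoid the manual string bookkeeping is to collect each alias's point strings in a sub-array and implode once per alias at the end (this does use the extra sub-array the question wanted to avoid, so treat it only as a benchmark candidate; the rows below are made-up stand-ins for the mysqli result):

```php
<?php
// Stand-in for the mysqli result set
$rows = array(
    array('alias_code' => 'A', 'as_of_date' => '2020-01-01', 'difficulty' => 1, 'price_usd' => 10),
    array('alias_code' => 'A', 'as_of_date' => '2020-01-02', 'difficulty' => 2, 'price_usd' => 11),
    array('alias_code' => 'B', 'as_of_date' => '2020-01-01', 'difficulty' => 3, 'price_usd' => 12),
);

$grouped = array();
foreach ($rows as $row) {
    // Collect each point string under its alias code
    $grouped[$row['alias_code']][] =
        "['" . $row['as_of_date'] . "', " . $row['difficulty'] . ", " . $row['price_usd'] . "]";
}

// Join each alias's points once, instead of appending ', ' on every row
$detailArray = array_map(function ($points) {
    return implode(', ', $points);
}, $grouped);

echo $detailArray['A']; // ['2020-01-01', 1, 10], ['2020-01-02', 2, 11]
```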
Hey all, I am creating 2 random numbers like so:
$firstlink = intval(mt_rand(100, 999) . mt_rand(100, 999) . mt_rand(1, 9) . mt_rand(100, 999)); // 10 digit
$secondLink = intval(mt_rand(1000, 999) . mt_rand(1, 999) . mt_rand(10, 99) . mt_rand(100, 999));
And this is my insert code:
$result = mysql_query("INSERT INTO userAccount
(Category,Fname,LName,firstlink,secondLink,AccDate)
VALUES ( '" . $cat . "',
'" . $fname . "',
'" . $lname . "',
" . $firstlink . ",
" . $secondLink . ",
'" . date('Y-m-d g:i:s',time()). "');");
It has no errors and it places the data into the mysql database. However, it's always the same number for BOTH firstlink and secondLink no matter who I add to the database, and I have no idea why it's doing it!
The datatype for both columns is INT(15)
Remove intval and all will work fine.
$firstlink = mt_rand(100, 999) . mt_rand(100, 999) . mt_rand(1, 9) . mt_rand(100, 999); // 10 digit
32-bit systems have a maximum signed integer range of -2147483648 to 2147483647. Your concatenated 10-digit string overflows that range, so with intval you mostly got 2147483647.
You can simplify your code and improve the randomness of the code like this:
$firstlink = mt_rand(10000,99999) . mt_rand(10000,99999);
$secondLink = mt_rand(10000,99999) . mt_rand(10000,99999);
echo "INSERT INTO userAccount
(Category,Fname,LName,firstlink,secondLink,AccDate)
VALUES ( '" . $cat . "',
'" . $fname . "',
'" . $lname . "',
" . $firstlink . ",
" . $secondLink . ",
'" . date('Y-m-d g:i:s',time()). "');";
PHPFiddle: http://phpfiddle.org/main/code/6nf-wpk
This will build your random 10-digit code by making two random 5-digit codes and joining them together. It is simpler and easier to follow, with fewer parts making it up. It had to be done with two mt_rand()s because the maximum number possible is 2147483647. For each mt_rand() call you're using, you're preventing a 0 from being the first digit in that section of the number, because you're starting the first digit at between 1 and 9.
If you don't care about the first number being only a 1 or 2 (and never being 3-9 or 0) you can simply use
$firstlink = mt_rand(1000000000,2147483647); // random 10 digit number
$secondLink = mt_rand(1000000000,2147483647); // random 10 digit number
As general coding tips:
Be consistent with how you name variables. You have a lowercase "L" in "$firstlink" and a capital "L" in "$secondLink". PHP is case-sensitive and you'll end up using the wrong name and getting unexpected (blank) results elsewhere in your program.
Be careful never to put any user-provided data into a SQL command without protecting against SQL injection attacks. Use parameterized queries as a rule. See How can I prevent SQL injection in PHP? for more details and examples.
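A minimal sketch of such a parameterized insert (this uses PDO with an in-memory SQLite database so the example runs standalone; with MySQL only the DSN and credentials would change, and the sample values are made up):

```php
<?php
// Sketch: parameterized insert; user data never touches the SQL string.
$pdo = new PDO('sqlite::memory:');
$pdo->exec("CREATE TABLE userAccount (
    Category TEXT, Fname TEXT, LName TEXT,
    firstlink INTEGER, secondLink INTEGER, AccDate TEXT)");

$stmt = $pdo->prepare(
    "INSERT INTO userAccount (Category, Fname, LName, firstlink, secondLink, AccDate)
     VALUES (?, ?, ?, ?, ?, ?)"
);
// Even a value containing a quote is stored safely
$stmt->execute(array("cat", "John", "O'Brien", 1234567890, 1987654321, date('Y-m-d H:i:s')));

echo $pdo->query("SELECT LName FROM userAccount")->fetchColumn(); // O'Brien
```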
I have a task where I need to parse an extremely big file and write the results into a mysql database. "Extremely big" means we are talking about 1.4GB of sort-of-CSV data, totalling approx 10 million lines of text.
The question is not "HOW" to do it, but how to do it FAST. My first approach was to just do it in PHP without any speed optimization and then let it run for a few days until it's done. Unfortunately, it's been running for 48 hours straight right now and has processed only 2% of the total file. Therefore, that's not an option.
The file format is as follows:
A:1,2
where the amount of comma-separated numbers following the ":" can be 0-1000. The example dataset has to go into a table as follows:
| A | 1 |
| A | 2 |
So right now, I did it like this:
$fh = fopen("file.txt", "r");
$line = ""; // buffer for the data
$i = 0; // line counter
$start = time(); // benchmark
while($line = fgets($fh))
{
$i++;
echo "line " . $i . ": ";
//echo $i . ": " . $line . "<br>\n";
$line = explode(":", $line);
if(count($line) != 2 || !is_numeric(trim($line[0])))
{
echo "error: source id [" . trim($line[0]) . "]<br>\n";
continue;
}
$targets = explode(",", $line[1]);
echo "node " . $line[0] . " has " . count($targets) . " links<br>\n";
// insert links in link table
foreach($targets as $target)
{
if(!is_numeric(trim($target)))
{
echo "line " . $i . " has malformed target [" . trim($target) . "]<br>\n";
continue;
}
$sql = "INSERT INTO link (source_id, target_id) VALUES ('" . trim($line[0]) . "', '" . trim($target) . "')";
mysql_query($sql) or die("insert failed for SQL: ". mysql_error());
}
}
echo "<br>\n--<br>\n<br>\nseconds wasted: " . (time() - $start);
This is obviously not optimized for speed in ANY way. Any hints for a fresh start? Should I switch to another language?
The first optimization would be to insert with a transaction - each 100 or 1000 lines commit and begin a new transaction. Obviously you'd have to use a storage engine that supports transactions.
Then observe the CPU usage with the top command. If you have multiple cores, the mysql process is not doing much, and the PHP process is doing most of the work, rewrite the script to accept a parameter that skips n lines from the beginning and imports only 10000 lines or so. Then start multiple instances of the script, each with a different starting point.
The third solution would be to convert the file into a CSV with PHP (no INSERT at all, just writing to a file) and then using LOAD DATA INFILE as m4t1t0 suggested.
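To give an idea of that third option, here is a sketch that only converts the "A:1,2" format into a plain two-column CSV (a tiny sample input is created inline so the sketch runs standalone; the real LOAD DATA statement then runs in the MySQL client):

```php
<?php
// Sketch: convert "A:1,2"-style lines into a two-column CSV for LOAD DATA INFILE.
$infile  = tempnam(sys_get_temp_dir(), "src");
$outfile = tempnam(sys_get_temp_dir(), "csv");
file_put_contents($infile, "A:1,2\nB:3\n"); // tiny sample input

$in  = fopen($infile, "r");
$out = fopen($outfile, "w");
while (($line = fgets($in)) !== false) {
    $parts = explode(":", trim($line), 2);
    if (count($parts) != 2) continue;        // skip malformed lines
    foreach (explode(",", $parts[1]) as $target) {
        fwrite($out, $parts[0] . "," . trim($target) . "\n");
    }
}
fclose($in);
fclose($out);

echo file_get_contents($outfile); // one source/target pair per line
// Afterwards, in MySQL:
// LOAD DATA INFILE '/path/to/links.csv' INTO TABLE link
//   FIELDS TERMINATED BY ',' (source_id, target_id);
```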
As promised, attached you'll find the solution I went for in this post. I benchmarked it, and it turned out that it is 40 times (!) faster than the old one :)
Sure, there's still much room for optimization, but it's fast enough for me right now :)
$db = mysqli_connect(/*...*/) or die("could not connect to database");
$fh = fopen("data", "r");
$line = ""; // buffer for the data
$i = 0; // line counter
$start = time(); // benchmark timer
$node_ids = array(); // all (source) node ids
mysqli_autocommit($db, false);
while($line = fgets($fh))
{
$i++;
echo "line " . $i . ": ";
$line = explode(":", $line);
$line[0] = trim($line[0]);
if(count($line) != 2 || !is_numeric($line[0]))
{
echo "error: source node id [" . $line[0] . "] - skipping...\n";
continue;
}
else
{
$node_ids[] = $line[0];
}
$targets = explode(",", $line[1]);
echo "node " . $line[0] . " has " . count($targets) . " links\n";
// insert links in link table
foreach($targets as $target)
{
if(!is_numeric($target))
{
echo "line " . $i . " has malformed target [" . trim($target) . "]\n";
continue;
}
$sql = "INSERT INTO link (source_id, target_id) VALUES ('" . $line[0] . "', '" . trim($target) . "')";
mysqli_query($db, $sql) or die("insert failed for SQL: " . mysqli_error($db));
}
if($i%1000 == 0)
{
$node_ids = array_unique($node_ids);
foreach($node_ids as $node)
{
$sql = "INSERT INTO node (node_id) VALUES ('" . $node . "')";
mysqli_query($db, $sql);
}
$node_ids = array();
mysqli_commit($db);
mysqli_autocommit($db, false);
echo "committed to database\n\n";
}
}
echo "<br>\n--<br>\n<br>\nseconds wasted: " . (time() - $start);
I find your description rather confusing - and it doesn't match up with the code you've provided.
if(count($line) != 2 || !is_numeric(trim($line[0])))
the trim here is redundant - whitespace doesn't change the behaviour of is_numeric. But you've said elsewhere that the start of the line is a letter - therefore this will always fail.
If you want to speed it up then switch to using stream processing of the input rather than message processing (PHP arrays can be very slow) or use a different language and aggregate the insert statements into multi-line inserts.
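A sketch of what "aggregating into multi-line inserts" means: batch many value tuples into one statement instead of one INSERT per row (the tuples here are invented stand-ins for validated source/target pairs):

```php
<?php
// Build one multi-row INSERT per batch instead of one statement per row.
$pairs = array(array(1, 2), array(1, 3), array(4, 5)); // validated (source_id, target_id)

$values = array();
foreach ($pairs as $p) {
    // (int) casts guard against anything non-numeric slipping through
    $values[] = "(" . (int)$p[0] . ", " . (int)$p[1] . ")";
}
$sql = "INSERT INTO link (source_id, target_id) VALUES " . implode(", ", $values);

echo $sql; // INSERT INTO link (source_id, target_id) VALUES (1, 2), (1, 3), (4, 5)
```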
I would first just use the script to create a SQL file. Then lock the table using this http://dev.mysql.com/doc/refman/5.0/en/lock-tables.html by placing the appropriate commands at the start/end of the SQL file (you could get your script to do this).
Then just use the command tool to inject the SQL into the database (preferably on the machine where the database resides).
Good eve everyone!
For some reason Database::fetchArray() is skipping the first $row of the query result set.
It prints all rows properly, only it keeps missing out the first one for some reason; I assume there's something wrong with my fetchArray() function?
I ran the query in phpMyAdmin and it returned 4 rows; when I tried it on my localhost with the php file (code below) it only printed 3 rows, using the same 'WHERE tunes.riddim' value of course. Most similar topics on Google show that a common mistake is to use mysql_fetch_array() before the while(), which advances the pointer and causes the missing first row; unfortunately I only have one mysql_fetch_array() call (the one within the while() head).
<?php
$db->query("SELECT " .
"riddims.riddim AS riddim, " .
"riddims.image AS image, " .
"riddims.genre AS genre, " .
"tunes.label AS label, " .
"tunes.artist AS artist, " .
"tunes.tune AS tune, " .
"tunes.year AS year," .
"tunes.producer AS producer " .
"FROM tunes " .
"INNER JOIN riddims ON tunes.riddim = riddims.riddim " .
"WHERE tunes.riddim = '" . mysql_real_escape_string(String::plus2ws($_GET['riddim'])) . "'" .
"ORDER BY tunes.year ASC");
$ar = $db->fetchArray();
for($i = 0; $i < count($ar) - 1; $i++)
{
echo $ar[$i]['riddim'] . " - " . $ar[$i]['artist'] . " - " . $ar[$i]['tune'] . " - " . $ar[$i]['label'] . " - " . $ar[$i]['year'] . "<br>";
}
?>
Database::fetchArray() looks like:
public function fetchArray()
{
$ar = array();
while(($row = mysql_fetch_array($this->result)) != NULL)
$ar[] = $row;
return $ar;
}
Any suggestions appreciated!
You should remove the -1 from the for loop; it makes the loop stop one row early, so one row of the result set is never printed.
The problem's in your for loop:
for($i = 0; $i < count($ar) - 1; $i++)
if count($ar) is 1, because there's one entry, your loop body will never run; try tweaking the check part:
for($i = 0; $i < count($ar) ; $i++)
You can also use a simple foreach:
foreach($db->fetchArray() as $row)
{
echo $row['riddim'] # ...
}
It'll make your code more readable too.