<?php
$con = mysqli_connect("localhost", "root", "", "addressbook");
$file = "localhost/IMDBAPI/-title.ratings.tsv";
$row = 1;
if (($handle = fopen($file, "r")) !== FALSE) {
    while (($data = fgetcsv($handle, 1000000, "\t")) !== FALSE) {
        $num = count($data);
        $row++;
        $id = $data[0];
        $name = $data[1];
        $address = $data[2];
        $phone = $data[3];
        $sql = "INSERT INTO adress (First_Name, Surname, Address) VALUES ('" . $name . "','" . $address . "','" . $phone . "')";
        mysqli_query($con, "SELECT * FROM adress");
    }
}
fclose($file);
?>
I have a TSV file named title.ratings.tsv and I am trying to insert values from it into a MySQL table named adress (yes, it is spelled incorrectly), into the columns First_Name, Surname, and Address, but I am getting these errors:
( ! ) Warning: fopen(localhost/IMDBAPI/-title.ratings.tsv): failed to open stream: No such file or directory in C:\wamp64\www\tsv.php on line 9
and
( ! ) Warning: fclose() expects parameter 1 to be resource, string given in C:\wamp64\www\tsv.php on line 25
First, check the file name. Is it
"-title.ratings.tsv" or
"title.ratings.tsv" (without the leading minus)?
Also note that "localhost/IMDBAPI/-title.ratings.tsv" is treated as a relative filesystem path, not a URL, so fopen() looks for a directory literally called localhost below the current directory; point $file at the real path of the TSV instead. Have you set read permissions on the file?
chmod 777 -- -title.ratings.tsv
(the -- is needed so chmod doesn't parse the leading minus as an option). As for the second warning: you are passing the filename string to fclose(); it should be fclose($handle). Note also that the $sql you build is never executed; mysqli_query() is called with a SELECT instead.
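A corrected sketch of the script above (assuming the TSV really sits next to the script; the misspelled table name adress and its columns are kept as-is from the question):

```php
<?php
$con = mysqli_connect("localhost", "root", "", "addressbook");

// Use a real filesystem path; "localhost/IMDBAPI/..." is not something fopen() resolves.
$file = __DIR__ . "/title.ratings.tsv";

if (($handle = fopen($file, "r")) !== FALSE) {
    // Prepared statement instead of concatenating values into the SQL string
    $stmt = mysqli_prepare($con,
        "INSERT INTO adress (First_Name, Surname, Address) VALUES (?, ?, ?)");
    while (($data = fgetcsv($handle, 0, "\t")) !== FALSE) {
        mysqli_stmt_bind_param($stmt, "sss", $data[1], $data[2], $data[3]);
        mysqli_stmt_execute($stmt); // the original built $sql but never ran it
    }
    fclose($handle); // close the handle, not the filename string
}
```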
Related
I'm trying to import a pretty big CSV file into my database (locally). The file is 230 MB, about 8.8 million lines.
The problem isn't opening the CSV or not knowing how to import it: the file opens, about 500,000 lines get imported, and then the script quits and throws no error or timeout or anything; I just get to see my web page.
This is the code:
try {
    $conn = new PDO("mysql:host=$servername;dbname=adresses_database", $username, $password);
    // set the PDO error mode to exception
    $conn->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);
    echo "Connected successfully";
    $row = 1;
    if (($handle = fopen("bagadres.csv", "c+")) !== FALSE) {
        while (($data = fgetcsv($handle, '', ";")) !== FALSE) {
            if (!isset($write_position)) { // move the line to previous position, except the first line
                $write_position = 0;
                $num = count($data); // $num is 15
                $row++; // i dont need this?
                $stmt = $conn->prepare("INSERT INTO adresses (openbareruimte, huisnummer, huisletter, huisnummertoevoeging, postcode, woonplaats, gemeente, provincie, object_id, object_type, nevenadres, x, y, lon, lat) VALUES (:openbareruimte, :huisnummer, :huisletter, :huisnummertoevoeging, :postcode, :woonplaats, :gemeente, :provincie, :object_id, :object_type, :nevenadres, :x, :y, :lon, :lat)");
                $stmt->bindParam(':openbareruimte', $data[0]);
                $stmt->bindParam(':huisnummer', $data[1]);
                $stmt->bindParam(':huisletter', $data[2]);
                $stmt->bindParam(':huisnummertoevoeging', $data[3]);
                $stmt->bindParam(':postcode', $data[4]);
                $stmt->bindParam(':woonplaats', $data[5]);
                $stmt->bindParam(':gemeente', $data[6]);
                $stmt->bindParam(':provincie', $data[7]);
                $stmt->bindParam(':object_id', $data[8]);
                $stmt->bindParam(':object_type', $data[9]);
                $stmt->bindParam(':nevenadres', $data[10]);
                $stmt->bindParam(':x', $data[11]);
                $stmt->bindParam(':y', $data[12]);
                $stmt->bindParam(':lon', $data[13]);
                $stmt->bindParam(':lat', $data[14]);
                $stmt->execute();
            } else {
                $read_position = ftell($handle); // get actual line
                fseek($handle, $write_position); // move to previous position
                fputs($handle, $line); // put actual line in previous position
                fseek($handle, $read_position); // return to actual position
                $write_position += strlen($line); // set write position to the next loop
            }
            fflush($handle); // write any pending change to file
            ftruncate($handle, $write_position); // drop the repeated last line
            flock($handle, LOCK_UN);
        }
        fclose($handle);
    }
} catch (PDOException $e) {
    echo "Connection failed: " . $e->getMessage();
}
$conn = null;
I got this far looking for help on Stack Overflow and in the PHP manual, and I also checked whether it was a MySQL error, but I cannot figure this out.
(For any suggestions about MySQL settings: I'm using Linux Mint 18.)
I would strongly recommend that you use MySQL's LOAD DATA INFILE, which is probably the fastest and most efficient way to get CSV data into a MySQL table. The command for your setup would look something like this:
LOAD DATA INFILE 'bagadres.csv'
INTO TABLE adresses
FIELDS TERMINATED BY ';'
ENCLOSED BY '"'
LINES TERMINATED BY '\n'
IGNORE 1 ROWS
If your fields are not enclosed by quotes, or are enclosed by something other than double quotes, remove or modify the ENCLOSED BY clause accordingly. Also, IGNORE 1 ROWS skips the first row, which makes sense if the first line of your file is a header row (i.e. column labels rather than actual data). Note that FIELDS TERMINATED BY ';' matches the semicolon delimiter your fgetcsv() call uses.
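If you want to drive this from PHP, the same statement can be issued over a PDO connection. This is a sketch, not the question's code: it assumes LOCAL INFILE is enabled on both the MySQL server and the PDO MySQL driver, and it reuses the $servername/$username/$password variables from the question:

```php
<?php
// Sketch: LOAD DATA LOCAL INFILE over the existing PDO connection.
// Assumes local_infile is enabled server-side and client-side.
$conn = new PDO(
    "mysql:host=$servername;dbname=adresses_database",
    $username,
    $password,
    [PDO::MYSQL_ATTR_LOCAL_INFILE => true] // allow the client to send the file
);
$conn->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

// The question's file is semicolon-separated, hence the ';' delimiter.
$conn->exec("
    LOAD DATA LOCAL INFILE 'bagadres.csv'
    INTO TABLE adresses
    FIELDS TERMINATED BY ';'
    ENCLOSED BY '\"'
    LINES TERMINATED BY '\n'
    IGNORE 1 ROWS
");
```

This moves the whole import into a single server-side operation, which avoids both the per-row prepare/execute overhead and PHP's execution time limit.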
For some reason this code is causing odbc_execute() to attempt to open a file...
$file = fopen('somefile.csv', 'r');
fgetcsv($file); // Skip the first line
$data = [];
while (($line = fgetcsv($file)) != false) {
    $data[] = $line;
}
fclose($file);
try {
    $conn = odbc_connect("Teradata", "User", "Pass");
    odbc_autocommit($conn, false);
    odbc_exec($conn, 'DELETE FROM table');
    foreach ($data as &$test) {
        $stmt = odbc_prepare($conn, 'INSERT INTO table (experiment_code, experiment_name, variant_code, variant_name, version_number, report_start_date, report_end_date, status, transaction_date, experiment_test_id, test_manager, product_manager, pod, created_date, last_updated_date) VALUES (?,?,?,?,?,?,?,?,?,?,?,?,?,?,?)');
        odbc_execute($stmt, $test);
    }
    odbc_commit($conn);
    $result = odbc_exec($conn, 'SELECT * FROM table');
    odbc_result_all($result);
} catch (Exception $e) {
    odbc_rollback($conn);
    echo $e->getMessage();
}
Here is a snippet of the CSV file...
H1225,Some random text,H1225:001.000,Control,3,02/06/2014,03/31/2014,Completed,,HMVT-1225,Some random name,Some random name,Checkout,03/31/2014 16:54,02/06/2014 16:38
H1225,Some random text,H1225:001.000,Control,3,02/06/2014,03/31/2014,Completed,,HMVT-1225,Some random name,Some random name,Checkout,03/31/2014 16:54,02/06/2014 16:38
And here is the type of error I am getting...
Warning: odbc_execute(): Can't open file Control in C:\wamp\www\HEXinput\assets\php\dumpCSV.php on line 19
I get multiple versions of the same error, just with a different "file" name each time. The name seems to come from column 3 (0-based). Another weird thing is that some lines actually do insert correctly.
The final error I get is...
Fatal error: Maximum execution time of 120 seconds exceeded in C:\wamp\www\HEXinput\assets\php\dumpCSV.php on line 27
I am using Teradata's ODBC drivers, version 15, on Windows 7 64-bit.
What could be causing this?
It turns out that some of the fields in the CSV file had single quotes around them, which broke the query: per the PHP manual, any parameter passed to odbc_execute() that both starts and ends with a single quote is treated as a filename to read from, which is why the error names "Control" as a file.
A simple but annoying oversight.
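If you want to keep the CSV as-is, one workaround is to strip the wrapping quotes from each field before handing the row to odbc_execute(). This is a sketch; the helper name strip_outer_quotes is mine, not from the question:

```php
<?php
// Hypothetical helper: remove one pair of wrapping single quotes, since
// odbc_execute() treats a parameter that starts AND ends with a single
// quote as a filename to read (per the PHP manual).
function strip_outer_quotes($v)
{
    if (is_string($v) && strlen($v) >= 2 && $v[0] === "'" && substr($v, -1) === "'") {
        return substr($v, 1, -1);
    }
    return $v;
}

// Applied to each row before execution:
// odbc_execute($stmt, array_map('strip_outer_quotes', $test));
```

Fields that are not quoted pass through unchanged, so this is safe to apply to every row.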
This has been inserting each line from the CSV into the database twice, and as of today, three times. Nothing else I put in the loop runs more often than it should.
$file_handle = fopen("uploads/numbers.csv", "r");
$stmt = $db->prepare("INSERT INTO database
    (firstname, lastname, phonenumber) VALUES
    (:field1, :field2, :field3)");
while (($line_of_data = fgetcsv($file_handle, 1000, ",")) !== FALSE) {
    $stmt->execute(array(':field1' => $line_of_data[0], ':field2' => $line_of_data[1], ':field3' => $line_of_data[2]));
}
Set up a proper primary key on the table: either (firstname, lastname) or (firstname, lastname, phonenumber), depending on the usage. Ta-da, no more duplicates.
I'm going to assume James was right about the columns, given that the CSV contains data that already exists in the database, but either way, a primary key will prevent duplicates.
If you use the firstname, lastname key and you want to have the script be able to update the phone number, you could use REPLACE instead of INSERT.
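A sketch of that variant, assuming the same PDO handle $db and the question's table (which really is named database); the key choice and the sample values are illustrative:

```php
<?php
// One-time: add the composite key so (firstname, lastname) is unique.
$db->exec("ALTER TABLE `database` ADD PRIMARY KEY (firstname, lastname)");

// REPLACE deletes any existing row with the same key, then inserts the new
// one, so re-running the import updates phone numbers instead of
// accumulating duplicate rows.
$stmt = $db->prepare("REPLACE INTO `database`
    (firstname, lastname, phonenumber) VALUES
    (:field1, :field2, :field3)");
$stmt->execute(array(':field1' => 'Jane', ':field2' => 'Doe', ':field3' => '555-0100'));
```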
Your check is here:
while (($line_of_data = fgetcsv($file_handle, 1000, ",")) !== FALSE)
The !== FALSE part is fine; the problem is what fgetcsv() returns for a blank line in the file: not FALSE, but an array containing a single null. So the loop body still runs for empty lines and the query executes with empty values. Guard against that before executing:
while (($line_of_data = fgetcsv($file_handle, 1000, ",")) !== FALSE) {
    // array_filter() keeps only fields that are non-null and non-empty
    // after trimming, so blank lines (and lines of empty fields) are skipped
    $non_empty = array_filter($line_of_data, function ($field) {
        return $field !== null && trim($field) !== '';
    });
    if (!empty($non_empty)) {
        $stmt->execute(array(':field1' => $line_of_data[0], ':field2' => $line_of_data[1], ':field3' => $line_of_data[2]));
    }
}
Note that you cannot simply write trim(fgetcsv(...)): fgetcsv() returns an array, and trim() expects a string, so the blank-line check has to look at the individual fields.
This question already has answers here and was closed 10 years ago.
Possible Duplicate: Import CSV to mysql
Right, I need some help with this:
I am trying to import a .csv file into a MySQL database using PHP, rather than doing it manually through phpMyAdmin.
This is the code I have at the moment:
if ($_REQUEST['func'] == "iid") {
    $db->conn = new mysqli(DB_SERVER, DB_USER, DB_PASSWORD, DB_NAME) or
        die('There was a problem connecting to the database.');
    $csv = $_POST['csv-file'];
    $path = $csv;
    $row = 1;
    if (($handle = fopen($path, "r")) !== FALSE) {
        while (($data = fgetcsv($handle, 1000, ";")) !== FALSE) {
            $row++;
            $data_entries[] = $data;
        }
        fclose($handle);
    }
    // this you'll have to expand
    foreach ($data_entries as $line) {
        $sql = $db->conn->prepare('INSERT INTO `bd_results`');
        $db->execute($line);
    }
}
However I get the following error:
Fatal error: Call to undefined method stdClass::execute() in /homepages/19/d372249701/htdocs/business-sites/bowlplex-doubles-new/admin/scores.php on line 44
For reference, I am using code taken from: Here
I am not well versed in the $db->conn style (I'm used to mysql_connect()), so any help would be appreciated.
Try this simple one.
if (($handle = fopen("google.csv", "r")) !== FALSE) {
    while (($data = fgetcsv($handle, 1000, ",")) !== FALSE) {
        // note: the original was missing the table name after INSERT INTO,
        // and values built by implode() are not escaped, so this only
        // works for trusted input
        $db->conn->query("INSERT INTO `bd_results` VALUES ('" . implode("','", $data) . "')");
    }
    fclose($handle);
}
Your script might also need to add quotes to the CSV values if they don't already have them. If you need to deal with quotes and escaping in your CSV files, I recommend you look at my blog post at http://www.fusionswift.com/2012/07/php-import-csv-to-mysql/
foreach ($data_entries as $line) {
    $stmt = $db->conn->prepare('INSERT INTO `bd_results` (field1, field2) VALUES (?, ?)');
    $stmt->bind_param('ss', $field1Value, $field2Value); // mysqli's method is bind_param(), not bindParam()
    list($field1Value, $field2Value) = $line; // bind_param() binds by reference, so assigning before execute() works
    $stmt->execute();
}
Where $field1Value is the first CSV column and $field2Value is the second, both of type string, as declared by the 'ss' type string passed to bind_param().
Basically, you prepare the query in its entirety, then bind variables to it, and once those variables hold the desired values you run the query with execute().
This is how you use prepared statements. Personally I'd go with Mahn's suggestion, though, and avoid prepared statements for a task like this unless you need to process the data along the way.
mysqli_stmt::bind_param
mysqli_stmt::execute
I'm trying to save data retrieved from the database into a .json file. This is what I just tried:
$sql = mysql_query("SELECT `positive`, `time` FROM sentiment WHERE acctid=1");
$response = array();
$posts = array();
while ($row = mysql_fetch_array($sql)) {
    $positive = $row['positive'];
    $time = $row['time'];
    $posts[] = array('positive' => $positive, 'time' => $time);
}
$response['posts'] = $posts;

$fp = fopen('results.json', 'w');
fwrite($fp, json_encode($response));
fclose($fp);
I got the following errors:
Warning: fopen(results.json) [function.fopen]: failed to open stream: Permission denied in /Applications/XAMPP/xamppfiles/htdocs/test/getjson.php on line 29
Warning: fwrite() expects parameter 1 to be resource, boolean given in /Applications/XAMPP/xamppfiles/htdocs/test/getjson.php on line 30
Warning: fclose() expects parameter 1 to be resource, boolean given in /Applications/XAMPP/xamppfiles/htdocs/test/getjson.php on line 31
What could the problem be?
The folder /Applications/XAMPP/xamppfiles/htdocs/test isn't writable by the web server: fopen() fails and returns FALSE, which is also why the subsequent fwrite() and fclose() complain about being given a boolean. Change the permissions so Apache can write to it. Since this is XAMPP on macOS (not Windows), do it from a terminal, e.g.:
chmod -R 775 /Applications/XAMPP/xamppfiles/htdocs/test
or change the folder's owner to the user Apache runs as.
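For what it's worth, here is a defensive sketch of the writing step (the sample data is a placeholder, and it writes to the system temp directory purely to guarantee a writable location; in the real script you would point $target at the htdocs/test folder once its permissions are fixed):

```php
<?php
// Placeholder data standing in for the rows fetched from `sentiment`.
$response = array('posts' => array(
    array('positive' => '12', 'time' => '2014-05-01 10:00:00'),
));

// The system temp dir is writable by definition; swap in the real path later.
$target = sys_get_temp_dir() . '/results.json';

// Check writability up front instead of letting fopen() emit a warning.
if (!is_writable(dirname($target))) {
    die('Cannot write to ' . dirname($target));
}

// file_put_contents() replaces the fopen()/fwrite()/fclose() trio and
// returns false on failure, giving one clear failure point.
if (file_put_contents($target, json_encode($response)) === false) {
    die('Write failed');
}
```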