I have a WordPress plugin that exports form entries to a txt file. I need to write a PHP script to add them to a SQL database, because I want the submissions added to a database on a different domain (otherwise I'd just get the plugin to do it for me). I'm fine with how to connect to the database; the question is how to interpret the data, since the column names always sit next to each field, as shown:
{"Entry_ID":"235","Name":"matt","Email":"matt#gmail.com","Date":"03/10/2017"}{"Entry_ID":"236","Name":"matt","Email":"matt#btinternet.com","Date":"10/10/2017"}
Is there a way to get it to ignore the column name and only read the value inside the quotes after each colon?
Once these have been added to the SQL database I would then need the lines removed from the txt file.
So far I have this but it isn't working...
$file = fopen('http://mpcreations.staging.wpengine.com/wp-content/themes/red-seal-resources/test.txt', 'r');
while (($data = fgetcsv($file)) !== FALSE) {
    $object = json_encode($data[0]);
    $servername = "";
    $username = "";
    $password = "";
    $dbname = "";
    // Create connection
    $conn = mysqli_connect($servername, $username, $password, $dbname);
    // Check connection
    if (!$conn) {
        die("Connection failed: " . mysqli_connect_error());
    }
    $query = "INSERT INTO 'wp_forms' LINES TERMINATED BY '\n';
    if (mysqli_multi_query($conn, $query)) {
        echo "New records created successfully";
    } else {
        echo "Error: " . $query . "<br>" . mysqli_error($conn);
    }
    mysqli_close($conn);
}
Any help would be greatly appreciated.
Thank you
Each line in the txt file has JSON data? Process the txt file, parse the data and INSERT it into the database table.
$file = fopen('file.txt', 'r');
while (($line = fgets($file)) !== false) {
    $object = json_decode($line, true); // decode each JSON line into an array
    // Prepare INSERT query here...
}
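A fuller, hedged sketch of that approach, assuming each JSON object sits on its own line, the script runs on the server that holds test.txt (so the file can be emptied afterwards), and the table and column names follow the sample data and the wp_forms name from the question:
<?php
// Connection details as in the question (fill in your own).
$conn = mysqli_connect($servername, $username, $password, $dbname);
if (!$conn) {
    die("Connection failed: " . mysqli_connect_error());
}
// Prepared statement so the values are escaped for us.
$stmt = mysqli_prepare($conn, "INSERT INTO wp_forms (Entry_ID, Name, Email, Date) VALUES (?, ?, ?, ?)");
$path = 'test.txt';
$file = fopen($path, 'r');
while (($line = fgets($file)) !== false) {
    // json_decode() gives an associative array, so values can be pulled out by key.
    $entry = json_decode(trim($line), true);
    if (!is_array($entry)) {
        continue; // skip blank or malformed lines
    }
    mysqli_stmt_bind_param($stmt, "ssss", $entry['Entry_ID'], $entry['Name'], $entry['Email'], $entry['Date']);
    mysqli_stmt_execute($stmt);
}
fclose($file);
// Once everything is inserted, empty the txt file so the processed lines are removed.
file_put_contents($path, '');
mysqli_stmt_close($stmt);
mysqli_close($conn);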
Related
How to save long text from textarea-input line per line
I have a form with a text area, and I want to save a long text line by line in MySQL.
I have no idea how to do this.
$handle = fopen("inputfile.txt", "r");
if ($handle) {
    while (($line = fgets($handle)) !== false) {
        // process the line read.
    }
    fclose($handle);
}
You need to learn how PHP interacts with the MySQL database. You will likely need the VARCHAR datatype, depending on how big your text area is. So if the field from the form is up to 250 characters, the text area column's datatype would be VARCHAR(250).
You would do a POST request to a file with something like this:
$post = $_POST;
//set other fields here, I recommend sanitizing your inputs.
...
$textarea = $_POST['text_area'];
$servername = "HOST";
$username = "username";
$password = "password";
// Create connection
$conn = new mysqli($servername, $username, $password);
// Check connection
if ($conn->connect_error) {
    die("Connection failed: " . $conn->connect_error);
}
$sql = "INSERT INTO MyGuests (...other columns you have, textarea)
        VALUES (..., '$textarea')";
if ($conn->query($sql) === TRUE) {
    echo "New record created successfully";
} else {
    echo "Error: " . $sql . "<br>" . $conn->error;
}
Two links I highly suggest looking at:
How to filter inputs via php(used before sql execution)
PHP MySQL Insert Data
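Following the input-filtering advice in those links, a rough sketch of the same INSERT done with a prepared statement (the MyGuests table, textarea column, and database name are placeholders, as above):
<?php
$conn = new mysqli("HOST", "username", "password", "your_database");
if ($conn->connect_error) {
    die("Connection failed: " . $conn->connect_error);
}
$textarea = $_POST['text_area'];
// A placeholder instead of interpolating $textarea into the SQL string.
$stmt = $conn->prepare("INSERT INTO MyGuests (textarea) VALUES (?)");
$stmt->bind_param("s", $textarea);
if ($stmt->execute()) {
    echo "New record created successfully";
} else {
    echo "Error: " . $stmt->error;
}
$stmt->close();
$conn->close();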
It would have been easier if I had a look at your form data. I'll assume my own form data to try to answer your question.
form.php
<form action="processing.php" method="POST">
<textarea required name="records" class="form-control" rows="8" cols="4" placeholder="Enter records separated by new line"></textarea>
<button type="submit" name="addRecords" class="btn btn-warning">Add Records</button>
</form>
Then processing.php
if (isset($_POST['addRecords'])) {
    $record = $_POST['records'];
    //explode records based on new line \n
    $records = explode("\n", $record);
    foreach ($records as $new) {
        $data = $new;
        //Here you'll write your sql code to insert records in the database (see the sketch below)
    }
}
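A hedged sketch of what that per-line insert could look like with a prepared statement (the records table and line column are made-up names for illustration):
<?php
if (isset($_POST['addRecords'])) {
    $conn = new mysqli("HOST", "username", "password", "your_database");
    if ($conn->connect_error) {
        die("Connection failed: " . $conn->connect_error);
    }
    // One row per line of the textarea.
    $stmt = $conn->prepare("INSERT INTO records (line) VALUES (?)");
    foreach (explode("\n", $_POST['records']) as $new) {
        $line = trim($new);
        if ($line === '') {
            continue; // skip empty lines
        }
        $stmt->bind_param("s", $line);
        $stmt->execute();
    }
    $stmt->close();
    $conn->close();
}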
I am new to PHP.
I am trying to load a large 14 MB .csv (~400,000 rows) into a MySQL table, but it is not fully uploaded into the DB, probably because of the file size; the page eventually errors out with "took too long to respond".
Is there a faster way to do it?
My DB is on Amazon RDS and PHP runs on EC2.
My current code is
<?php
header('Access-Control-Allow-Origin: *');
require "../config.php";
//$user_id = $_REQUEST['user_id'];
// Create connection
$conn = new mysqli($servername, $username, $password, $dbname);
// Check connection
if ($conn->connect_error) {
    die("Connection failed: " . $conn->connect_error);
}
// path where your CSV file is located
define('CSV_PATH', './');
$csv_file = CSV_PATH . "data_unique.csv";
if (($handle = fopen($csv_file, "r")) !== FALSE) {
    fgetcsv($handle);
    while (($data = fgetcsv($handle, 1000, ",")) !== FALSE) {
        $num = count($data);
        for ($c = 0; $c < $num; $c++) {
            $col[$c] = $data[$c];
        }
        $col1 = $col[0];
        $col2 = $col[1];
        $col3 = $col[2];
        $col4 = $col[3];
        $col5 = $col[4];
        $col6 = $col[5];
        // SQL Query to insert data into DataBase
        $query = "INSERT INTO uniqueid_master(autoid,package_id,unique_id,user_id,issued,book_code) VALUES('".$col1."','".$col2."','".$col3."','".$col4."','".$col5."','".$col6."')";
        $result = $conn->query($query);
    }
    fclose($handle);
}
echo "File data successfully imported to database!!";
$conn->close();
?>
I think you should try the LOAD DATA MySQL statement. This will be very fast since you don't have to read everything into PHP.
mysqli_query($dblink, '
LOAD DATA LOCAL INFILE "'.$file.'"
INTO TABLE transactions
FIELDS TERMINATED by ","
OPTIONALLY ENCLOSED BY "\'"
LINES TERMINATED BY "\n"
');
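One caveat worth noting: LOAD DATA LOCAL INFILE only works when local data loading is allowed on both sides (the server-side local_infile setting must be ON). With mysqli you would typically enable it on the client before connecting, roughly like this:
<?php
$dblink = mysqli_init();
// Allow this client to send a local file to the server.
mysqli_options($dblink, MYSQLI_OPT_LOCAL_INFILE, true);
mysqli_real_connect($dblink, $servername, $username, $password, $dbname);
// ... then run the LOAD DATA LOCAL INFILE query shown above.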
It could also be written like this:
$sql = "LOAD DATA LOCAL INFILE '/path/to/file.csv'
REPLACE INTO TABLE table_name FIELDS TERMINATED BY ','
ENCLOSED BY '\"' LINES TERMINATED BY '\r\n' IGNORE 1 LINES";
$result = $mysqli->query($sql);
OR
For an alternative method, refer to this question too.
This will raise some of PHP's performance limits.
ini_set('memory_limit','-1');
ini_set('max_execution_time', 0);
This might solve your problem, but it is still possible that you hit a memory error. In that case, divide the CSV file into multiple chunks and handle them one after another (a batched-insert sketch is shown below the link).
This is a good way to handle it.
How to extract data from csv file in PHP
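As an illustration of the chunking idea above, here is a sketch that batches the CSV rows from the question's script into multi-row INSERTs; the batch size and the escaping are assumptions, while the table, columns, and variables come from the code above:
<?php
$batch = [];
$batchSize = 500; // assumed chunk size
$insertHead = "INSERT INTO uniqueid_master(autoid,package_id,unique_id,user_id,issued,book_code) VALUES ";
if (($handle = fopen($csv_file, "r")) !== FALSE) {
    fgetcsv($handle); // skip the header row, as in the original script
    while (($data = fgetcsv($handle, 1000, ",")) !== FALSE) {
        // Escape each value and build one "(v1,...,v6)" group per row.
        $vals = array_map(function ($v) use ($conn) {
            return "'" . $conn->real_escape_string($v) . "'";
        }, $data);
        $batch[] = "(" . implode(",", $vals) . ")";
        if (count($batch) >= $batchSize) {
            $conn->query($insertHead . implode(",", $batch));
            $batch = [];
        }
    }
    if ($batch) { // flush the remaining rows
        $conn->query($insertHead . implode(",", $batch));
    }
    fclose($handle);
}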
Can someone please help me make this script check whether the file exists before it truncates the table?
If the file does not exist, I want to stop the import.
<?php
//set the connection variables
$hostname = "host";
$username = "username";
$password = "pass";
$database = "database";
$filename = "filename.csv";
//connect to mysql database
$connection = mysqli_connect($hostname, $username, $password, $database) or die("Error " . mysqli_error($connection));
mysqli_query($connection, "TRUNCATE TABLE `my_tablename`");
// open the csv file
$fp = fopen($filename,"r");
//parse the csv file row by row
while (($row = fgetcsv($fp, 500, ",")) != FALSE)
{
    //insert csv data into mysql table
    $sql = "INSERT INTO Pristabell (Produkt, Pris, Rabattkr, Rabattprosent, Lagerstatus, Butikk, TAGS) VALUES('" . implode("','", $row) . "')";
    if (!mysqli_query($connection, $sql))
    {
        die('Error : ' . mysqli_error($connection));
    }
}
fclose($fp);
//close the db connection
mysqli_close($connection);
?>
Thanks :-)
http://php.net/manual/en/function.file-exists.php
if (file_exists($pathtofile)) {
    //do import
} else {
    //stop
}
One simple solution is to use file_exists() in an if() block before the TRUNCATE and the while(): if it returns true, continue; otherwise exit or throw an exception.
I am adding data from a file to my database. Currently the files are limited to those inside the directory D:/. I want to be able to support adding files from multiple directories.
<?php
$servername = "localhost";
$username = "root";
$password = "root";
$dbname = "stdprt";
// Create connection
$conn = new mysqli($servername, $username, $password, $dbname);
// Check connection
if ($conn->connect_error) {
    die("Connection failed: " . $conn->connect_error);
}
$filename = "d:/" . $_POST['fname'];
$handle = fopen($filename, "r");
while (($data = fgetcsv($handle)) !== FALSE) {
    $num = count($data);
    $row;
    $sql = "INSERT into marks(regno,semister,subcode,subname,internals,externals,credits)values('$data[0]','$data[1]','$data[2]','$data[3]','$data[4]','$data[5]','$data[6]')";
    //echo "INSERT into marks(regno,semister,subcode,subname,internals,externals,credits)values('$data[0]','$data[1]','$data[2]','$data[3]','$data[4]','$data[5]','$data[6]')";
    if ($conn->query($sql) === TRUE) {
        // echo "New record created successfully";
    } else {
        echo "Error: " . $sql . "<br>" . $conn->error;
    }
    echo "<br>";
}
?>
<h2>Uploaded Successfully....</h2>
back
If you want to choose a file on another drive, you can modify this line in your code and change the directory at the start.
$filename="d:/".$_POST['fname'];
So for example if you wanted to change the directory to drive F it would be like so:
$filename="f:/".$_POST['fname'];
If you wanted to let the request specify a custom directory, you could pass it through the same way you are passing fname. Say, for example, you passed your custom directory along in a key named cust_dir; you could add it as the directory like so.
if (!empty($_POST['cust_dir'])) {
    $filename = $_POST['cust_dir'] . $_POST['fname'];
} else {
    $filename = "d:/" . $_POST['fname'];
}
The code above would use a custom directory path that you passed in the $_POST variable if you passed one. If you do not pass cust_dir then it will default to directory d:/.
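Since cust_dir would then come straight from the request, you may want to check it against a short whitelist and strip any path components from fname; a sketch, with the allowed directories made up for illustration:
<?php
// Hypothetical whitelist of directories files may be loaded from.
$allowedDirs = ["d:/", "f:/", "e:/data/"];
if (!empty($_POST['cust_dir']) && in_array($_POST['cust_dir'], $allowedDirs, true)) {
    $filename = $_POST['cust_dir'] . basename($_POST['fname']);
} else {
    $filename = "d:/" . basename($_POST['fname']);
}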
Hope someone can help me with what I think will be something minor (I'm still learning...). I'm trying to write the entire contents of a server-based CSV file to an SQL database; here is the code I presently have. The commented-out line writes perfectly and generates a new record, but the $ar[0] values generate no entries in the table named order, even though the CSV file is about 100 lines long. I just get:
Error: INSERT INTO order (Picker,Order_Number,Timestamp,System)values ('','','','')
$file = "Pal.ORD.csv";
$tbl = "order";
$f_pointer=fopen("$file","r"); // file pointer
while(! feof($f_pointer)){
$ar=fgetcsv($f_pointer);
//$sql="INSERT INTO `order` (Picker,Order_Number,Timestamp,System)values ('Me','9999','23-01-2015','ORD')";
$sql="INSERT INTO `order` (Picker,Order_Number,Timestamp,System)values ('$ar[0]','$ar[1]','$ar[2]','$ar[3]')";
echo $sql;
echo "<br>";
}
if ($connect->query($sql) === TRUE) {
echo "New records created successfully";
} else {
echo "Error: " . $sql . "<br>" . $conn->error;
}
What I think may be going on is that your file probably has an empty line/carriage return as the last line in the file and is using that to insert the data as blank entries.
I can't be 100% sure about this since you have not provided a sample of your CSV file, however that is what my tests revealed.
Based on the following CSV test model: (Sidenote: blank lines will be ignored)
a1,a2,a3,a4
b1,b2,b3,b4
c1,c2,c3,c4
Use the following and replace with your own credentials.
This will create a new entry/row for each line found in a given file based on the model I have provide above.
<?php
$DB_HOST = 'xxx';
$DB_USER = 'xxx';
$DB_PASS = 'xxx';
$DB_NAME = 'xxx';
$db = new mysqli($DB_HOST, $DB_USER, $DB_PASS, $DB_NAME);
if ($db->connect_errno > 0) {
    die('Connection failed [' . $db->connect_error . ']');
}
$file = "Pal.ORD.csv";
$delimiter = ',';
if (($handle = fopen("$file", "r")) !== FALSE) {
    while (($data = fgetcsv($handle, 1000, $delimiter)) !== FALSE) {
        foreach ($data as $i => $content) {
            $data[$i] = $db->real_escape_string($content);
        }
        // echo $data[$i].""; // test only not required
        $db->query("INSERT INTO `order`
                    (Picker, Order_Number, Timestamp, System)
                    VALUES ('" . implode("','", $data) . "');");
    }
    fclose($handle);
}
if ($db) {
    echo "Success";
} else {
    echo "Error: " . $db->error;
}
At a quick glance it seems like this:
$f_pointer=fopen("$file","r"); // file pointer
Should be this:
$f_pointer=fopen($file,"r"); // file pointer
You might not be reading anything from the file. You can try outputting the file contents to see if that part is working, since you've confirmed that you can insert into the DB.
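For reference, here is a minimal sketch that runs the INSERT inside the loop and skips blank rows, which also addresses the empty-row issue described in the first answer; $connect is assumed to be the open mysqli connection from the question:
<?php
$f_pointer = fopen($file, "r");
$stmt = $connect->prepare("INSERT INTO `order` (Picker, Order_Number, Timestamp, System) VALUES (?, ?, ?, ?)");
while (($ar = fgetcsv($f_pointer)) !== false) {
    // A blank last line comes back as a single null field (or empty strings) - skip it.
    if ($ar === [null] || trim(implode('', $ar)) === '') {
        continue;
    }
    $stmt->bind_param("ssss", $ar[0], $ar[1], $ar[2], $ar[3]);
    if ($stmt->execute()) {
        echo "Record inserted<br>";
    } else {
        echo "Error: " . $stmt->error . "<br>";
    }
}
$stmt->close();
fclose($f_pointer);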