I have a CSV file exported from Access (dates already formatted to match MySQL).
I need to import the data into a MySQL database through code.
When I import the file through phpMyAdmin it works beautifully.
$fname = $_FILES['sel_file']['name'];
$filename = $_FILES['sel_file']['tmp_name'];
$sql="LOAD DATA INFILE '../filepath/data.txt' INTO TABLE table1 FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '\"';";
mysql_query($sql)or die(mysql_error());
I tried using the literal file path and the variable $filename, and both gave me the same error.
I get the following error:
Access denied for user 'uname'@'%' (using password: YES)
I set the file permissions to 777, and the database was created allowing direct access. (I am able to upload the file and read from it using an "INSERT" statement, but not with LOAD DATA.)
A. Is the LOAD DATA statement wrong?
B. What other things do I have to take care of in order to use the LOAD DATA command?
Thank you all!
Try LOAD DATA LOCAL INFILE
It may be an authentication problem. Please check the permissions for importing/exporting data in MySQL.
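Building on the LOCAL suggestion above, a rough sketch that reuses the uploaded temp file from the question (same table and delimiters as the question's code; LOCAL also needs local_infile enabled on both the client and the server):
$filename = $_FILES['sel_file']['tmp_name'];      // uploaded temp file on the web server
$safe     = mysql_real_escape_string($filename);  // keep the quoted path intact

$sql = "LOAD DATA LOCAL INFILE '$safe' INTO TABLE table1 FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '\"' LINES TERMINATED BY '\\n'";
mysql_query($sql) or die(mysql_error());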
I tried to change from INSERT to LOAD DATA INFILE for uploading a large CSV file.
I explored some examples of how to use LOAD DATA INFILE, but it's not working and gives this error:
SQLSTATE[HY000]: General error: 29 File '/direct1/#tmp/phpFZLLYA' not found (Errcode: 13 "Permission denied").
This is my code:
$target_dir = '/direct1/#tmp/';
$target_file = $target_dir . basename($_FILES["file"]["tmp_name"]);
$stmt = $dbcon->prepare("LOAD DATA INFILE '$target_file' INTO TABLE Rain FIELDS TERMINATED BY ',' LINES TERMINATED BY '\r\n' IGNORE 1 LINES(id, date, vol, day)");
$stmt->execute();
When I change to LOAD DATA LOCAL INFILE based on a related discussion, the error code no longer appears, but the data is not inserted into the database. I am using the InnoDB storage engine.
I solved the error by using the move_uploaded_file function, after which the file can be accessed.
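For anyone hitting the same Errcode 13, a rough sketch of that fix, using the paths, table, and column list from the question (adjust for your setup, and keep secure_file_priv in mind since the MySQL server itself reads the file):
$target_dir  = '/direct1/#tmp/';
$target_file = $target_dir . basename($_FILES['file']['name']);

// The PHP upload temp file is often unreadable to mysqld, so move it first
if (!move_uploaded_file($_FILES['file']['tmp_name'], $target_file)) {
    die('Could not move the uploaded file.');
}

$dbcon->exec("LOAD DATA INFILE " . $dbcon->quote($target_file) . " INTO TABLE Rain FIELDS TERMINATED BY ',' LINES TERMINATED BY '\r\n' IGNORE 1 LINES (id, date, vol, day)");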
I am working on a project that accepts an SQLite file via upload, saves it, and parses the data in the SQLite database. I am finding that when I use a smaller DB file the "-wal" and "-shm" files are not created, but when using a larger DB (at least 2 MB) they are.
Here is the code that I am using:
$destination = storage_path().'/tempDir/databases/'.$userId.'/';
//store the file in the directory
$request->file('database')->move($destination . 'database', 'database.db'); // move(directory, filename)
$destination = $destination . 'database/database.db';
Config::set('database.connections.sqlite.database', $destination);
$sqliteDbConnection = \DB::connection('sqlite');
$getUsers = $sqliteDbConnection->table("users")->get();
Whenever I try and do this, I am getting the following error:
"SQLSTATE[HY000]: General error: 10 disk I/O error (SQL: select * from \"users\")"
Does anyone have an idea why that might be? I can confirm that the file being saved to the server is in good standing, I am able to open it with other SQLite viewers.
UPDATE: Is it possible that the link between Homestead and my Mac is preventing the file from being accessed by Laravel?
UPDATE 2: The problem is definitely in the file-sharing mechanism between VirtualBox and the host computer. I tried this via XAMPP and it works fine. WTF?
This is how I managed to get a Builder instance with my custom PDO connection:
$pathname = '/absolute/path/to.sqlite';
$connection = new \Illuminate\Database\SQLiteConnection(new \PDO('sqlite:' . $pathname));
$builder = new \Illuminate\Database\Query\Builder($connection);
$builder->newQuery()->from('my_table')->get()->all();
I want to import a text file that contains comma-separated data. I read in several sources that most people use LOAD DATA INFILE, so I figured it would work for me too.
However, I get a permissions error when I do so. This is the command I ran:
LOAD DATA INFILE '/public_html/nyccrash.txt' INTO TABLE nyccrash;
But it gives me this error:
ERROR 1045 (28000): Access denied for user 'username'@'%' (using password: YES)
I read in some other threads that all I had to do was include the full file path; I did, but it still didn't work.
Is there another way to import a text file into a table in my database, using SQL or PHP?
EDIT:
I found this command I can use:
<?php
$row = 1;
$handle = fopen("nyccrash.txt", "r");
echo("<table>");
while (($data = fgetcsv($handle, 1000, ",")) !== FALSE) {
    echo("<tr>\r\n");
    foreach ($data as $index => $val) {
        echo("\t<td>$val</td>\r\n");
    }
    echo("</tr>\r\n");
}
echo("</table>");
fclose($handle);
?>
That allows me to read the data, build an HTML table, and print it. I can also use an INSERT INTO statement after collecting the data as above, but I'm not sure how to insert the values into the table, that is, how to loop through the values for insertion. The data in the txt file doesn't contain the attributes or headers of what's contained, so I'm a little confused about how to sort the data into the right columns.
In order to load data via LOAD DATA LOCAL INFILE you need two things:
FILE privilege. Have a superuser run GRANT FILE ON *.* TO 'username'@'%';.
Set local_infile to 1 in my.cnf. To avoid having to restart mysql, have a superuser run SET GLOBAL local_infile=1;.
CAVEAT: both of these changes would be considered a security risk.
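If you are connecting from PHP with PDO, the client side of LOCAL also needs to be switched on; a rough sketch (the DSN, credentials, and file path are placeholders):
// Enable LOCAL INFILE on the client side of this PDO connection
$dbcon = new PDO(
    'mysql:host=localhost;dbname=mydb;charset=utf8mb4',
    'username',
    'password',
    [PDO::MYSQL_ATTR_LOCAL_INFILE => true]
);

$dbcon->exec(
    "LOAD DATA LOCAL INFILE '/home/username/public_html/nyccrash.txt'
     INTO TABLE nyccrash
     FIELDS TERMINATED BY ','"
);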
I made sure I gave my txt file permissions: chmod 711
Then I used LOAD DATA LOCAL INFILE 'nyccrash.txt' INTO TABLE nyccrash FIELDS TERMINATED BY ','; and it worked.
If you're only going to do this a few times, you can build the SQL by importing the data into Excel, using CONCAT to construct the INSERT lines, and copying/pasting them into the client to execute. That isn't very practical for lots of tables, though; in that case it's not hard to use PHP to upload a CSV file and populate the table.
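For instance, a rough sketch of that PHP route, reusing the fgetcsv loop from the question with a prepared INSERT (the connection details and the column names col_a..col_d are placeholders, since the text file has no header row; list the columns in the order the fields appear):
$mysqli = new mysqli('localhost', 'username', 'password', 'database');

$stmt = $mysqli->prepare('INSERT INTO nyccrash (col_a, col_b, col_c, col_d) VALUES (?, ?, ?, ?)');

$handle = fopen('nyccrash.txt', 'r');
while (($data = fgetcsv($handle, 1000, ',')) !== false) {
    // bind inside the loop because fgetcsv returns a fresh array each time
    $stmt->bind_param('ssss', $data[0], $data[1], $data[2], $data[3]);
    $stmt->execute();
}
fclose($handle);
$stmt->close();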
Try something along these lines:
// Connect to the database
$db_Host     = "";
$db_Username = "";
$db_Password = "";
$db_Database = "";

mysql_connect($db_Host, $db_Username, $db_Password) or die("MySQL - Connection Error");
mysql_select_db($db_Database) or die("MySQL - Cannot Select Database");

mysql_query("LOAD DATA LOCAL INFILE '/home/username/public_html/Database.txt' INTO TABLE yourtablename")
    or die("MySQL - Query Error - " . mysql_error());

// MySQL is automatically disconnected from when PHP ends.
Is there any way I can save the data of a specific table of the SugarCRM database into a DOC file?
I have a custom module with a username, some notes, and a date. I want to write this into the database and into a file as well.
It's not just a PHP file: I want to use logic hooks and write the code there, using the hooks to access the database and then write the data into the file.
Thanks in advance
Saving as a DOC file probably isn't the best idea, since that format is primarily about formatting information. A standard .txt file is usually what you would use for such a process.
With that said, there aren't any methods built into Sugar that will let you do this. You will need to build the capability into the module.
What exactly are you attempting to accomplish? There is a very powerful auditing tool set, which is good for seeing revisions to a module object. If you just want to monitor changes to the table, you can set up logging for that table/database inside of SQL.
+++ OK, if you are just looking to write to a file after saves, follow the instructions at http://cheleguanaco.blogspot.com/2009/06/simple-sugarcrm-logic-hook-example.html for a quick how-to on getting logic hooks working. You are going to want to make a PHP file that uses the data passed to it via the bean class, and either writes to the file directly from the data within the bean, or uses the bean->id parameter to do a SQL query and write to the file from that data.
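For reference, a rough sketch of what that wiring might look like (the file locations, class name, and fields written are just examples for illustration, not anything SugarCRM ships with):
// custom/modules/YourModule/logic_hooks.php -- registers the hook (names are examples)
$hook_array['after_save'][] = Array(
    1,                                               // processing order
    'Write saved record to a file',                  // description
    'custom/modules/YourModule/WriteToFileHook.php', // file containing the class
    'WriteToFileHook',                               // class
    'writeRecord'                                    // method
);

// custom/modules/YourModule/WriteToFileHook.php -- the hook class itself
class WriteToFileHook
{
    function writeRecord($bean, $event, $arguments)
    {
        // $bean already carries the saved record, so no SQL is strictly required
        $line = date('c') . ' ' . $bean->id . ' ' . $bean->name . "\n";
        file_put_contents('custom/record_log.txt', $line, FILE_APPEND);
    }
}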
Also, is this a DOC file that is going to be immediately generated and then destroyed at the end of the transaction? Or is it more of a log file that will be persistent?
++++That is simple enough then
Where you have the Query right now, replace it with:
$file = fopen($pathAndNameOfFile, 'a+') or die('Could not open file.');

$query  = "SELECT * FROM data_base.table";
$result = $bean->db->query($query, true);
$dbRowData = $bean->db->fetchByAssoc($result);

// print_r() only returns the output as a string when its second argument is true
$printedArray = print_r($dbRowData, true);

fwrite($file, $printedArray) or die('Could not write to file.');
fclose($file);
*A quick note: you might need to set permissions in order to read/write the file, but those are specific to the machine type, so if you encounter errors with either, search for how to set permissions for your particular server type.
**Also, 'SELECT * FROM database.table' is going to return ALL of the rows in the entire table. This will generate a very large file and become a performance hindrance as the table grows. You should use the bean class to fetch only the record that was just saved:
$file = fopen($pathAndNameOfFile, 'a+') or die('Could not open file.');

$query  = "SELECT * FROM data_base.table WHERE id = '" . $focus->id . "'";
$result = $bean->db->query($query, true);
$dbRowData = $bean->db->fetchByAssoc($result);

$printedArray = print_r($dbRowData, true);

fwrite($file, $printedArray) or die('Could not write to file.');
fclose($file);
You can export/dump MySQL databases into SQL files using mysqldump:
mysqldump -u userName -p databaseName tableName > fileName.sql
If I use the following in a mysql_query command:
SELECT *
FROM mytable
INTO OUTFILE '/tmp/mytable.csv'
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
LINES TERMINATED BY '\n';
Where is the file written relative to: the MySQL server somehow, or the PHP file?
If it does not exist, will it be created?
If I would like it to appear one folder up from the PHP file which runs the query, how would I do that?
According to the documentation on SELECT, the file is stored on the server and not on the client:
The SELECT ... INTO OUTFILE 'file_name' form of SELECT writes the selected rows to a file. The file is created on the server host, so you must have the FILE privilege to use this syntax. file_name cannot be an existing file, which among other things prevents files such as /etc/passwd and database tables from being destroyed. As of MySQL 5.0.19, the character_set_filesystem system variable controls the interpretation of the file name.
And, more to the point:
The SELECT ... INTO OUTFILE statement is intended primarily to let you very quickly dump a table to a text file on the server machine. If you want to create the resulting file on some other host than the server host, you normally cannot use SELECT ... INTO OUTFILE since there is no way to write a path to the file relative to the server host's file system.
So, don't use it in production to generate CSV files. Instead, build the CSV in PHP using fputcsv:
$result = $mysqli->query($sql);
if (!$result) {
    // SQL error
}

$f = fopen('mycsv.csv', 'w');
if (!$f) {
    // Could not open file!
}

while ($row = $result->fetch_assoc()) {
    fputcsv($f, $row);
}
fclose($f);
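And for the "one folder up from the PHP file" part: once PHP is writing the file itself, the location is entirely up to you, so you could simply replace the fopen() line above with something like this (the file name is just an example):
// __DIR__ is the directory of the running script, so '/..' is one folder up from it
$f = fopen(__DIR__ . '/../mytable.csv', 'w');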
Where is the tmp file relative to?
A: The file will contain the result of the SELECT * FROM mytable.
If it does not exist will it be created?
A: Yes.
If I would like it to appear 1 folder up from the php file which does it, how would I do that?
A: If you want it one folder up from fileYouAreRunning.php, then write the path like this: "../mytable.csv".
Your current query uses an absolute path, so the outfile will not be relative to anything; it will be saved to /tmp/mytable.csv.
I'd say the safest bet is to keep using absolute paths, so work out in your PHP what the absolute path to the parent directory is, and then add it to your query using a variable.
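A rough sketch of that, assuming a mysqli connection like the one in the earlier fputcsv example (and bearing in mind that INTO OUTFILE only writes on the MySQL server host, so this does what you expect only when the web server and MySQL run on the same machine):
// realpath() turns '..' into an absolute path; the MySQL server user needs
// write access to that directory, and the target file must not already exist
$outfile = realpath(__DIR__ . '/..') . '/mytable.csv';

$sql = "SELECT * FROM mytable
        INTO OUTFILE '" . $mysqli->real_escape_string($outfile) . "'
        FIELDS TERMINATED BY ',' ENCLOSED BY '\"'
        LINES TERMINATED BY '\\n'";

$mysqli->query($sql) or die($mysqli->error);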