Txt file:
942,Nike%27s+cities,2469,2,2,164
316,just+me,820,1,1,286
827,RES+4+GOLD,1523,1,1,214
1171,KnightBlood,1550,1,1,211
1172,athens,2095,2,2,177
The file above contains alliances from a game I'm trying to make a tool for. The format is the following:
id, name, points, towns, members, rank
In my database I'm storing 3 additional columns:
id, name, points, towns, members, rank, world_server, world_id, date
I've looked around and tried multiple different options/queries, but I can't seem to get it to work. My current query, in PHP, is:
"LOAD DATA LOCAL INFILE /path/file.txt
INTO TABLE alliance
FIELDS TERMINATED BY ',' LINES TERMINATED BY '\n'
SET world_server = $world[server], world_id = $world[id], date = $curDate"
I get that this is probably a duplicate question, but I have searched through Stack Overflow, the MySQL docs, etc. for multiple days now and I don't understand what's wrong with my query. I am not getting any error messages either, which doesn't help; my database just stays empty.
Hope someone can help,
A desperate student.
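For anyone hitting the same wall, two things stand out in the query above: the file path is not quoted, and since the table has more columns than the file, the six file columns should be listed explicitly so the SET clause can fill in the rest. A sketch of what the corrected statement might look like (the world values are placeholders, and `rank` is backticked because it is a reserved word in newer MySQL versions):

```sql
-- Sketch only: column names assumed from the question's description.
LOAD DATA LOCAL INFILE '/path/file.txt'
INTO TABLE alliance
FIELDS TERMINATED BY ','
LINES TERMINATED BY '\n'
(id, name, points, towns, members, `rank`)
SET world_server = 'en', world_id = 1, date = CURDATE();
```

Also check the return value of the query call in PHP (e.g. mysqli_error()); if local_infile is disabled on the client or server, LOAD DATA LOCAL is rejected, and without error checking the table simply stays empty.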
My issue is this:
I have a table in my database which has more than 1 million rows. Sometimes I need an SQL dump of the table on my PC, and exporting the whole table takes very long.
Actually I only need the exported table with, for example, the last 5000 rows.
So is there a way to export a MySQL table by selecting the last X rows?
I know some ways to do it with terminal commands, but I need a pure MySQL query if possible.
Thanks
If I understand correctly, you could try the INTO OUTFILE functionality provided by MySQL. Of course I don't know your current query, but you can easily adapt this structure to yours:
SELECT *
FROM table_name
ORDER BY id DESC
LIMIT 5000
INTO OUTFILE '/tmp/your_dump_table.csv'
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
LINES TERMINATED BY '\n';
Since the user @Hayk Abrahamyan has expressed a preference for exporting the dump as a .sql file, let's look at a valid alternative:
We can run the query from phpMyAdmin or (certainly a better solution) the MySQL Workbench SQL editor console, and save the result by pressing the export button.
The resulting .sql output file will have a structure like the one below:
/*
-- Query: SELECT * FROM mytodo.todos
ORDER BY id DESC
LIMIT 5000
-- Date: 2018-01-07 13:15
*/
INSERT INTO `todos` (`id`,`description`,`completed`) VALUES (3,'Eat something',1);
INSERT INTO `todos` (`id`,`description`,`completed`) VALUES (2,'Buy something',1);
INSERT INTO `todos` (`id`,`description`,`completed`) VALUES (1,'Go to the store',0);
I can insert into my database; the part I need help with is modifying the code.
I have tables that contain the labor and part numbers and costs in a MySQL database. I am then inserting the data into a SQL database.
The SQL database display cannot be changed. We perform service work, and there are two lines for every product we fix:
number 1 the parts line, number 2 the labor line. I am able to get this information and insert it into the SQL database one line at a time, but I am trying to figure out how to do it from an array.
So Product 1 has a parts line and a labor line, then Product 2 has a parts line and a labor line, and so forth.
Example SQL Database View
So this is causing some issues. Any help would be greatly appreciated.
Not sure what the 'causing some issues' might be, as you don't say, but you CAN insert two rows in one statement (assuming you have the data for both), if that might help the 'issue'.
Something like this:
INSERT INTO table (field1, field2, field3, ...)
VALUES (data1, data2, data3, ...),(data1, data2, data3, ...);
The first value set can be the parts info, the second value set can be the labor info.
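Applied to the parts/labor case, a sketch (the table and column names here are invented for illustration, not taken from your schema):

```sql
-- Hypothetical schema: one multi-row INSERT adds both the parts line
-- and the labor line for a product in a single statement.
INSERT INTO service_lines (product_id, line_no, line_type, item_number, cost)
VALUES
  (1, 1, 'Parts', 'P-1001', 49.95),
  (1, 2, 'Labor', 'L-2001', 80.00);
```

In PHP, you can build the VALUES list by looping over your array of products, collecting one `(..., ...)` row string per product line and imploding them with commas (ideally with bound parameters rather than raw string concatenation).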
I have a site where I want to add songs with all details like artist, label, etc. I upload a song using the addfile.php file. This page has individual text input fields, and the form POSTs to addfile_db.php, where I have some SQL queries.
The query to insert into file works OK; I don't want any changes to it:
$qryUpdate = "insert into file
(name,artist,music,label,lyrics, recommended,dname,cid,ext,thumbext,size,`desc`,newtag,date,imagetype)
VALUES
('$name','$artist','$music','$label','$lyrics','$recommended','$newname',$cid,'$ext','$thumbext','$fileSize','$des','$newtag','(now())',0)";
$db->query($qryUpdate);
I also want to insert the $artist value into the artist table. A single song may have multiple artists. $artist holds comma-separated values like Artist1,Artist2,Artist3, and I explode the values into an array:
$artist_array = explode(",", $artist);
I tried this query:
$ArtQry = "insert into artist (name) VALUES ('" . implode("'), ('", $artist_array) . "')";
$db->query($ArtQry);
The above code works perfectly; it adds 3 rows to the artist table with unique ids like 1, 2, 3, 4, ....
The problem is that I want to insert into artist only if the name does not already exist.
I have tried the code below, but it doesn't work; it inserts nothing.
INSERT INTO artist (name)
SELECT * FROM (SELECT '" . implode("', '", $artist_array) . "') AS tmp
WHERE NOT EXISTS (
SELECT name FROM artist WHERE name = '" . implode("'), ('", $artist_array) . "')";
) LIMIT 1;
You can use INSERT IGNORE or ON DUPLICATE KEY UPDATE name=name, assuming name is a UNIQUE key. https://dev.mysql.com/doc/refman/5.5/en/insert.html
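For example, with a UNIQUE key in place, the de-duplicating insert reduces to this (a sketch using the sample artist values from the question):

```sql
-- One-time setup: INSERT IGNORE only skips duplicates if a unique index exists.
ALTER TABLE artist ADD UNIQUE KEY uq_artist_name (name);

-- Rows whose name already exists are silently skipped.
INSERT IGNORE INTO artist (name)
VALUES ('Artist1'), ('Artist2'), ('Artist3');
```

The ON DUPLICATE KEY UPDATE name=name form behaves the same here, though note that both approaches can still burn auto-increment ids on the skipped rows.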
...but enough about that...
If you want a proper First Normal Form (1NF) design, then never doom your database with columns that hold comma-separated values.
As a specific example, let's say you want to INSERT an artist called "Does It Offend You, Yeah?" or "What Price, Wonderland?"
Uh oh: you are going to be exploding/imploding in PHP and JOINing on IN/LOCATE/FIND_IN_SET in MySQL based solely on those delimiting commas, resulting in unwanted surgery on artist names (no delimiter is safe given all the funky artist-name possibilities). But things get worse...
What about two or more bands that have the same name? You don't have any means to differentiate them.
If you have any shred of interest in future-proofing your data, you will need to expand your database design with sufficient separate tables to avoid CSVs.
If you don't, you are going to find yourself doing some unreliable and slow-performing queries using FIND_IN_SET and other workarounds that could have been fully avoided if you had invested your early development time setting up a healthy set of tables, each with unique primary keys.
If this is coming off sounding too judgmental, I apologize. You should find some solace in the fact that developers big and small are prolifically building database tables with CSVs -- some out of ignorance, some out of habit, some think it is more efficient, and on and on.
Some SO members may chime in to say that each case is different and developers must choose what is best for their specific agenda, but for your case I have already listed two different varieties of conflict.
Truth be told, in my earlier days I inherited some CSV tables and had to learn how to hack around them. In my ignorance, I copied what I saw and set up CSVs in my later DB builds, until someone (or some site) enlightened me; now I can't see myself ever going back.
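To make the 1NF point concrete, here is a sketch of one possible normalized layout (table and column names are illustrative only):

```sql
-- A many-to-many between songs and artists via a junction table,
-- instead of a comma-separated artist column on the song row.
CREATE TABLE artist (
  id   INT UNSIGNED AUTO_INCREMENT PRIMARY KEY,
  name VARCHAR(255) NOT NULL  -- deliberately not UNIQUE: two bands can share a name
);

CREATE TABLE song_artist (
  song_id   INT UNSIGNED NOT NULL,
  artist_id INT UNSIGNED NOT NULL,
  PRIMARY KEY (song_id, artist_id)
);
```

With this layout, the comma in "Does It Offend You, Yeah?" is just data inside one name column, and two same-named bands are told apart by id.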
I'm trying to export data from a database into a set format so that it can be interpreted by MYOB correctly. I have the exporting function working great and I have half worked out the SQL query, but I'm having trouble putting it all together.
The following is a simple example to illustrate what I'm trying to achieve:
$sql = "SELECT (CONCAT(PICKUPID,DROPOFFID) AS ITEM),
(CONCAT(Rego, Pickup, Dropoff, booker, date) AS Description)
FROM booking, myob";
The issue is that I need to pull the pickup and dropoff IDs out of the myob table based on what is in the booking table.
booking Table Columns
Rego
Pickup
Dropoff
Date
booker
myob Table Columns
MYOBID (where MYOBID = PICKUPID OR DROPOFFID)
Address (where Address = Dropoff or Pickup)
I am hoping to do this in an SQL query or a PL/SQL transaction, but I'm having trouble getting my head around it. Any help would be appreciated (hope the question's not too confusing).
Data in Tables
booking
Rego , Pickup , Dropoff , Date , booker
123, bris, sydn, 1/2/12, barry
myob
MYOBID , Address
Q, bris
N, sydn
OUTPUT
ITEM , Description
QN, 123 bris sydn 1/2/12 barry
You need to JOIN the two tables; the MySQL manual's documentation on JOIN syntax is a good starting point.
Please come back to us if you are still stuck after reading it.
Also, I see little advantage in CONCAT'ing your fields at query time; I think you'd better leave the formatting of the output to your application layer. Of course, this is probably overkill if all you are trying to achieve is a quick-and-dirty one-time export.
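For reference, the lookup can be done by joining myob twice, once for the pickup and once for the dropoff; a sketch against the sample data shown above:

```sql
-- p resolves the Pickup address to its MYOBID, d resolves the Dropoff.
SELECT CONCAT(p.MYOBID, d.MYOBID) AS ITEM,
       CONCAT_WS(' ', b.Rego, b.Pickup, b.Dropoff, b.Date, b.booker) AS Description
FROM booking AS b
JOIN myob AS p ON p.Address = b.Pickup
JOIN myob AS d ON d.Address = b.Dropoff;
```

Against the sample rows this should yield ITEM 'QN' and Description '123 bris sydn 1/2/12 barry', matching the desired output.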
I am trying to understand how to write a query to import my goodreads.com book list (csv) into my own database. I'm using PHP.
$query = "load data local infile 'uploads/filename.csv' into table books
RETURN 1 LINES
FIELDS TERMINATED BY ','
ENCLOSED BY '\"'
LINES TERMINATED BY '\n'
(Title, Author, ISBN)";
This results in data in the wrong columns, and I also can't get the ISBN no matter what I try; I'm guessing the equals (=) sign is causing some problems.
Here's the first two lines of the CSV:
Book Id,Title,Author,Author l-f,Additional Authors,ISBN,ISBN13,My Rating,Average Rating,Publisher,Binding,Number of Pages,Year Published,Original Publication Year,Date Read,Date Added,Bookshelves,Bookshelves with positions,Exclusive Shelf,My Review,Spoiler,Private Notes,Read Count,Recommended For,Recommended By,Owned Copies,Original Purchase Date,Original Purchase Location,Condition,Condition Description,BCID
11487807,"Throne of the Crescent Moon (The Crescent Moon Kingdoms, #1)","Saladin Ahmed","Ahmed, Saladin","",="0756407117",="9780756407117",0,"3.87","DAW","Hardcover","288",2012,2012,,2012/02/19,"to-read","to-read (#51)","to-read",,"",,,,,0,,,,,
What I want to do is import only the Title, Author, and ISBN for all books into a table that has the following fields: id (auto), title, author, isbn. I want to skip the first row as well, because I don't want the column headers, but every time I have tried that it fails.
What's wrong with my query?
If you want to use the native MySQL CSV parser, the simplest route is to create a staging table that matches the column count of the CSV file and simply drop the unnecessary columns when the import is finished. Alternatively, LOAD DATA can read the unwanted fields into user variables and discard them, so the table does not have to match the file.
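A hedged sketch of the user-variable approach, assuming the CSV columns are in the order of the header row shown above: unwanted fields go into throwaway @variables, the header row is skipped with IGNORE 1 LINES, and the ="..." wrapper Goodreads puts around ISBNs is stripped in the SET clause.

```sql
LOAD DATA LOCAL INFILE 'uploads/filename.csv'
INTO TABLE books
FIELDS TERMINATED BY ',' ENCLOSED BY '"'
LINES TERMINATED BY '\n'
IGNORE 1 LINES
(@book_id, @title, @author, @author_lf, @addl_authors, @isbn, @isbn13,
 @my_rating, @avg_rating, @publisher, @binding, @pages, @year_pub,
 @orig_pub_year, @date_read, @date_added, @shelves, @shelf_pos,
 @excl_shelf, @review, @spoiler, @notes, @read_count, @rec_for,
 @rec_by, @owned, @purch_date, @purch_loc, @cond, @cond_desc, @bcid)
SET title  = @title,
    author = @author,
    -- Goodreads exports ISBNs as ="0756407117"; strip the wrapper.
    isbn   = REPLACE(REPLACE(@isbn, '="', ''), '"', '');
```

The 31 @variables simply mirror the 31 header columns; only the three you care about are copied into real table columns.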