Compare Query Result with CSV/Array and get missing entries - php

I have a huge CSV file with 20,000+ user entries and user fields that I have to compare with the users in our database.
The aim is to archive every user in our database that is not in the CSV file.
My solution would be:
Get a multidimensional array out of the CSV file
Get every user from the database
While fetching each user, iterate through the CSV array and check whether the user is in the CSV
It is a solution that works, but it performs far too badly:
~20,000 users in the CSV * ~20,000 users in the database
=> 400,000,000 iterations (in the worst case, when no user is found)
Is there a way to reduce the iterations to ~20,000?

Yes, you can import the CSV data into another table and use an SQL join to fetch the desired result. That way your data will be fetched much faster than before. Use a temporary table to import the CSV file.
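A minimal sketch of that approach, assuming users are matched by an email column, that an archived flag column exists, and that LOAD DATA LOCAL is enabled on client and server (all names are placeholders):

<?php
$pdo = new PDO('mysql:host=localhost;dbname=your_db', 'user', 'pass', [
    PDO::MYSQL_ATTR_LOCAL_INFILE => true,
]);
// Stage the CSV in a temporary table keyed on the matching column
$pdo->exec('CREATE TEMPORARY TABLE csv_users (email VARCHAR(255) PRIMARY KEY)');
// Bulk-load the CSV (far faster than 20,000 single INSERTs)
$pdo->exec("LOAD DATA LOCAL INFILE '/path/to/users.csv'
    INTO TABLE csv_users
    FIELDS TERMINATED BY ','
    IGNORE 1 LINES
    (email)");
// One pass: archive every database user with no matching CSV row
$pdo->exec('UPDATE users u
    LEFT JOIN csv_users c ON c.email = u.email
    SET u.archived = 1
    WHERE c.email IS NULL');
?>

The LEFT JOIN with the IS NULL check replaces the nested loops entirely, so the comparison is a single indexed pass instead of 400,000,000 iterations.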

Related

How to customize exporting query result (PHP mysql) to excel file?

I have some questions about customizing an export result (Excel) with PHP. The idea is that I want to export a query of my raw data (MySQL table) to an Excel file, but with some customization of the result.
For example, I want a result which is a summary of the table, like the table below:
The 3rd through 7th columns are named based on the last 5 days of my report date.
My idea is:
1. Create a temporary table with the same format as the result table I want to generate
2. Fill it from my raw data
3. Drop the temporary table
Is that effective? Or is there a better idea?
You can always use a view, which is essentially a stored SELECT statement over your data, and which will be updated whenever your tables are updated. Then you can just do a 'SELECT * FROM view_name' and export that into your Excel file.
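A sketch of the view approach, with made-up table and column names (raw_data, item_id, report_date, amount) and only two of the five day columns pivoted via conditional aggregation:

<?php
$pdo = new PDO('mysql:host=localhost;dbname=your_db', 'user', 'pass');
// The view stores the SELECT, not the data, so it always reflects the base table
$pdo->exec("CREATE OR REPLACE VIEW report_view AS
    SELECT item_id,
           SUM(CASE WHEN report_date = CURDATE() - INTERVAL 1 DAY THEN amount ELSE 0 END) AS day_1,
           SUM(CASE WHEN report_date = CURDATE() - INTERVAL 2 DAY THEN amount ELSE 0 END) AS day_2
    FROM raw_data
    GROUP BY item_id");
// Exporting is then just a plain SELECT against the view
$rows = $pdo->query('SELECT * FROM report_view')->fetchAll(PDO::FETCH_ASSOC);
?>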
Unless the data set is very large, there is no need to think about performance here.
Edit the data before
You can use a temp table. Depending on the data, this is very fast if you can select and insert the data based on indexes. Then you run a SELECT * FROM tmp_table; and you have all your data.
Edit the data after
You can just join over the different tables, get the data, and then loop (read: foreach) over the result array, change the data, and export it afterwards.
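A sketch of that "edit after" route, assuming PDO and hypothetical names, writing a CSV that Excel opens directly:

<?php
$pdo = new PDO('mysql:host=localhost;dbname=your_db', 'user', 'pass');
$rows = $pdo->query('SELECT id, amount, report_date FROM raw_data')->fetchAll(PDO::FETCH_ASSOC);
$out = fopen('report.csv', 'w');
if (!empty($rows)) {
    fputcsv($out, array_keys($rows[0])); // header row from the column names
}
foreach ($rows as $row) {
    // reshape/adjust each row here before writing it out
    $row['amount'] = number_format((float) $row['amount'], 2, '.', '');
    fputcsv($out, $row);
}
fclose($out);
?>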

Reporting across multiple CSV files

This might be a vague question. I am given 4 CSV files with about 500k rows in each of them on a daily basis. I need to perform 'join' and 'where' equivalent RDBMS operations on them to create daily reports. For example, the workflow could be:
Join 2 CSV files based on a column with IDs
Filter dataset down based on a date column
Join the new filtered dataset with another CSV file based on some where conditions
Further filter them down based on more criteria
.... // Repeat
Output final dataset into a CSV file
I was thinking of writing a PHP script to:
Load each CSV file into a relational database like MySQL
Perform the joins and where conditions with SQL
Load results into a temporary table
Repeat 2 and 3
Load final data into a table
Export the table into a CSV file.
What do you think is the best approach?
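If you go the MySQL route, the first steps of that script might look like this (file names, table names, and columns are made up for illustration; LOAD DATA LOCAL must be enabled on both client and server):

<?php
$pdo = new PDO('mysql:host=localhost;dbname=reports', 'user', 'pass', [
    PDO::MYSQL_ATTR_LOCAL_INFILE => true,
]);
// Step 1: bulk-load each CSV; LOAD DATA is far faster than 500k single INSERTs
$pdo->exec("LOAD DATA LOCAL INFILE '/data/file_a.csv'
    INTO TABLE file_a FIELDS TERMINATED BY ',' IGNORE 1 LINES");
$pdo->exec("LOAD DATA LOCAL INFILE '/data/file_b.csv'
    INTO TABLE file_b FIELDS TERMINATED BY ',' IGNORE 1 LINES");
// Steps 2-3: join + filter into a temporary table; repeat with further joins as needed
$pdo->exec("CREATE TEMPORARY TABLE step1 AS
    SELECT a.*, b.extra_col
    FROM file_a a
    JOIN file_b b ON b.id = a.id
    WHERE a.report_date >= CURDATE() - INTERVAL 7 DAY");
?>

Indexing the join columns (id here) before loading makes each intermediate join dramatically faster at this row count.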

Move MySQL Data from one Table to Another Skip duplicate (or overwrite)

I need to upload a CSV into MySQL. Regularly.
So I am planning to upload the CSV to a temporary table, then move the data from temporary_table to main_table.
Now, I need help:
How can I move the data, and (a) skip duplicates, or (b) overwrite duplicates?
The CSV currently contains 55566 rows and will grow day by day, so how do I handle execution time?
What is the best practice to import a CSV into MySQL?
How can I move the data?
Use INSERT ... SELECT.
And (a) skip duplicates, or (b) overwrite duplicates?
Define a UNIQUE key constraint on the columns that determine whether records are duplicates or not; then:
(a) use INSERT IGNORE; or
(b) use either INSERT ... ON DUPLICATE KEY UPDATE or REPLACE.
The CSV currently contains 55566 rows and will grow day by day, so how do I handle execution time?
Rotate your CSV file after each upload so that past records are not repeatedly uploaded.
What is the best practice to import a CSV into MySQL?
Use either mysqlimport or LOAD DATA INFILE.
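Putting those pieces together, a sketch (assuming a UNIQUE key already exists on main_table; the file path, column name, and table names are placeholders):

<?php
$pdo = new PDO('mysql:host=localhost;dbname=your_db', 'user', 'pass', [
    PDO::MYSQL_ATTR_LOCAL_INFILE => true,
]);
// Stage the CSV in a temporary table shaped like the target
$pdo->exec('CREATE TEMPORARY TABLE tmp_import LIKE main_table');
$pdo->exec("LOAD DATA LOCAL INFILE '/path/to/upload.csv'
    INTO TABLE tmp_import FIELDS TERMINATED BY ',' IGNORE 1 LINES");
// (a) skip duplicates:
$pdo->exec('INSERT IGNORE INTO main_table SELECT * FROM tmp_import');
// (b) or overwrite duplicates instead:
// $pdo->exec('INSERT INTO main_table SELECT * FROM tmp_import
//     ON DUPLICATE KEY UPDATE amount = VALUES(amount)');
?>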

modify csv with php before import / upload with php

I want to upload a CSV to a database with PHP, but before I do that I want to modify some of the content.
The database table the CSV will come from has 3 columns: id, postcode, debt_amount
The database table the CSV will go to has 4 columns: id, postcode, debt_amount, count
What I want to do first is modify the full postcode to just show the first part, before the space.
Then I want to consolidate all the rows that have the same modified postcode, which will do two things:
Count the number of rows with the same modified postcode and place the total number into the consolidated row in the 'count' column.
Add up the 'debt_amount' column for the same modified postcode and put the total amount into the consolidated row under the 'debt_amount' column.
These processes would need to run together.
After that is done I want to upload it to the database.
I don't know if this is the best way of doing it, or if I should process the data in the first database first and export it to a CSV, to just allow me to upload the CSV to the other database.
Any help on either process would be good.
Thanks
I think it is best to process this data in MySQL itself. You may decide if you would like to process it in the source database or the target database.
So, if the processing involves:
modify the postcode
count #rows with same modified-postcode
sum debt_amount for same modified-postcode
Either do the processing in the source database, store the results in a temporary table, generate the CSV and then import to the target database. Or generate the CSV from the source DB, import to the target database in a temporary table, do the processing and store the results to the final table.
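For example, the whole consolidation can be a single SQL statement run from PHP, assuming hypothetical source/target table names and using SUBSTRING_INDEX to keep the part of the postcode before the first space:

<?php
$pdo = new PDO('mysql:host=localhost;dbname=your_db', 'user', 'pass');
// Truncate each postcode at the first space, then group and aggregate
$pdo->exec("INSERT INTO target_table (postcode, debt_amount, `count`)
    SELECT SUBSTRING_INDEX(postcode, ' ', 1) AS short_pc,
           SUM(debt_amount),
           COUNT(*)
    FROM source_table
    GROUP BY short_pc");
?>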
Do the standard file upload.
Read the CSV content from the temporary upload file.
Process the CSV data (e.g. with SplFileObject).
Insert the processed data into your database.
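A sketch of the processing and insert steps using SplFileObject; the database details, file name, and target table are assumptions:

<?php
$file = new SplFileObject('upload.csv');
$file->setFlags(SplFileObject::READ_CSV);

$totals = [];
foreach ($file as $fields) {
    if ($fields === [null] || $fields === false) {
        continue; // skip blank lines
    }
    list($id, $postcode, $debt) = $fields;
    // keep only the part of the postcode before the space
    $short = strstr($postcode, ' ', true) ?: $postcode;
    $totals[$short]['debt']  = ($totals[$short]['debt']  ?? 0) + (float) $debt;
    $totals[$short]['count'] = ($totals[$short]['count'] ?? 0) + 1;
}

$pdo = new PDO('mysql:host=localhost;dbname=your_db', 'user', 'pass');
$stmt = $pdo->prepare('INSERT INTO target_table (postcode, debt_amount, `count`) VALUES (?, ?, ?)');
foreach ($totals as $pc => $t) {
    $stmt->execute([$pc, $t['debt'], $t['count']]);
}
?>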

How can I import a file (CSV/Excel) with partial data to a table in a database through phpMyAdmin?

I needed updated information from my client to fill a clients table.
I exported the table that I wanted to an Excel file and asked them to fill it with the new information (there was only one column that I needed updated), and they've sent me back the file.
Now I want to import that information back into my table column.
Trial and error many times: I converted the Excel file to CSV and imported it through phpMyAdmin.
But it didn't update any column.
What am I doing wrong?
If you just need to generate UPDATE statements from CSV data, you may want to take a look at my FOSS CSV tool, CSVFix, which can do this and a lot more without you having to write any code, PHP or otherwise.
If you have the file in a .csv and you know some PHP, you can just write a script which loops through the file and inserts/updates the records in the database.
For example, let's say that each line in your CSV is structured like this:
id,name,address,email,date
E.g:
1,bob smith,344 abc street,test@example.com,2009-04-01
You could loop through it in this way:
<?php
// Read the whole file and split it into an array of lines/rows
$data = file_get_contents('your-file.csv');
$rows = explode("\n", $data);

$pdo = new PDO('mysql:host=localhost;dbname=your_db', 'user', 'pass');
$stmt = $pdo->prepare('UPDATE your_table SET name = ?, address = ?, email = ?, date = ? WHERE id = ?');

foreach ($rows as $row) {
    // Remove any excess whitespace from start and end of the row:
    $row = trim($row);
    if ($row === '') {
        continue; // skip blank lines
    }
    // Split the row into its comma-separated fields:
    list($id, $name, $address, $email, $date) = str_getcsv($row);
    $stmt->execute([$name, $address, $email, $date, $id]);
}
?>
PHP has a function called fgetcsv() to parse CSV files.
You could use that to loop through your CSV file and create MySQL UPDATE statements, which you could execute through your database driver (e.g. PDO or mysqli) or just copy and paste into the query window in phpMyAdmin.
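For example, a minimal fgetcsv() loop with a prepared query; the table (clients), column names, and file name are assumed from the question above:

<?php
$pdo = new PDO('mysql:host=localhost;dbname=your_db', 'user', 'pass');
$stmt = $pdo->prepare('UPDATE clients SET extra_info = ? WHERE id = ?');

$handle = fopen('clients.csv', 'r');
fgetcsv($handle); // skip the header row, if the file has one
while (($fields = fgetcsv($handle)) !== false) {
    // fgetcsv handles quoting and embedded commas for you
    list($id, $extraInfo) = $fields;
    $stmt->execute([$extraInfo, $id]);
}
fclose($handle);
?>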
