So I was able to parse the txt file into a csv file:
$data = array();
while ($line = fgets($fh)) {
    $stack = array($LAUS, $FIPS, $CountyName, $Date, $_CLF, $_EMP, $_UNEMP, $RATE);
    array_push($data, $stack);
}

$file = fopen('file.csv', 'w');
foreach ($data as $fields) {
    fputcsv($file, $fields, ',', '"');
}
fclose($file);
My question is: what is the best way to create multiple csv files that are separated by month and year (like Jan01.csv, Jan02.csv)?
I'm taking a bit of a guess at the formatting of your date:
while ($line = fgets($fh)) {
    // Not sure where you're getting these values, but I'm assuming it's correct
    $stack = array($LAUS, $FIPS, $CountyName, $Date, $_CLF, $_EMP, $_UNEMP, $RATE);

    // Assuming $Date looks like this: '2011-10-04 15:00:00'
    $filename = date('My', strtotime($Date)) . '.csv';

    $file = fopen($filename, 'a+');
    fputcsv($file, $stack, ',', '"');
    fclose($file);
}
This will be a little slow since you're opening and closing files constantly, but since I don't know the size of your original data set I don't want to use up all the memory caching the result before I write it.
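If the data set does turn out to be large, a middle ground is to keep one handle per month open in an array and close them all at the end, instead of reopening a file for every row. A rough sketch, reusing the same $fh, $Date, and $stack variables as above:

$handles = array();

while ($line = fgets($fh)) {
    $stack = array($LAUS, $FIPS, $CountyName, $Date, $_CLF, $_EMP, $_UNEMP, $RATE);
    $filename = date('My', strtotime($Date)) . '.csv';

    // Open each monthly file once and reuse the handle for later rows
    if (!isset($handles[$filename])) {
        $handles[$filename] = fopen($filename, 'a+');
    }
    fputcsv($handles[$filename], $stack, ',', '"');
}

// Close every monthly file once the input has been read completely
foreach ($handles as $handle) {
    fclose($handle);
}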
Be aware that running this multiple times will end up with duplicate data being inserted into your CSV files. You may want to add some code to remove/clear out any currently existing CSV files before you run this bit of code.
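A minimal sketch of that cleanup, assuming the monthly files sit in the current directory and follow the MonYY.csv naming used above (adjust the pattern to your actual layout):

// Remove previously generated monthly files (e.g. Jan01.csv, Oct11.csv)
// so a re-run starts from a clean slate
foreach (glob('[A-Z][a-z][a-z][0-9][0-9].csv') as $oldCsv) {
    unlink($oldCsv);
}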
I'm coding a plugin that runs every day at 5am. It combines multiple csv files (that have a .txt extension).
Currently, it is working... HOWEVER, the output format is incorrect.
The input will look like this:
"","","","","email#gmail.com","PARK PLACE 109 AVE","SOME RANDOM DATA","","","",""
And so on; this is only a partial row.
The output of this code does not return the same format. It produces something like this, without the " in columns without data:
,,,,email#gmail.com,"PARK PLACE 109 AVE","SOME RANDOM DATA",,,,
Here is the part of the function that combines everything:
function combine_and_email_csv_files() {
    // Get the current time and date
    $now = new DateTime();
    $date_string = $now->format('Y-m-d_H-i-s');

    // Get the specified directories
    $source_directory = get_option('csv_file_combiner_source_directory');
    $destination_directory = get_option('csv_file_combiner_destination_directory');

    // Load the CSV files from the source directory
    $csv_files = glob("$source_directory/*.txt");

    // Create an empty array to store the combined CSV data
    $combined_csv_data = array();

    // Loop through the CSV files
    foreach ($csv_files as $file) {
        // Load the CSV data from the file
        $csv_data = array_map('str_getcsv', file($file));

        // Add the CSV data to the combined CSV data array
        $combined_csv_data = array_merge($combined_csv_data, $csv_data);
    }

    // Create the combined CSV file
    $combined_csv_file = fopen("$destination_directory/$date_string.txt", 'w');

    // Write the combined CSV data to the file
    foreach ($combined_csv_data as $line) {
        fputcsv($combined_csv_file, $line);
    }

    // Close the combined CSV file
    fclose($combined_csv_file);
}
No matter what I've tried, it's not working. I'm missing something simple, I know.
Thank you Nigel!
So this thread, Forcing fputcsv to Use Enclosure For *all* Fields, helped me get there...
Using fputs instead of fputcsv and forcing "" on null values is the short answer for me. Works beautifully... code is below:
function combine_and_email_csv_files() {
    // Get the current time and date
    $now = new DateTime();
    $date_string = $now->format('Y-m-d_H-i-s');

    // Get the specified directories
    $source_directory = get_option('csv_file_combiner_source_directory');
    $destination_directory = get_option('csv_file_combiner_destination_directory');

    // Load the CSV files from the source directory
    $csv_files = glob("$source_directory/*.txt");

    // Create an empty array to store the combined CSV data
    $combined_csv_data = array();

    // Loop through the CSV files
    foreach ($csv_files as $file) {
        // Load the CSV data from the file
        $csv_data = array_map('str_getcsv', file($file));

        // Add the CSV data to the combined CSV data array
        $combined_csv_data = array_merge($combined_csv_data, $csv_data);
    }

    // Create the combined CSV file
    $combined_csv_file = fopen("$destination_directory/$date_string.txt", 'w');

    // Write the combined CSV data to the file
    foreach ($combined_csv_data as $line) {
        // Enclose each value in double quotes
        $line = array_map(function($val) {
            if (empty($val)) {
                return "\"\"";
            }
            return "\"$val\"";
        }, $line);

        // Convert the line array to a CSV formatted string
        $line_string = implode(',', $line) . "\n";

        // Write the string to the file
        fputs($combined_csv_file, $line_string);
    }

    // Close the combined CSV file
    fclose($combined_csv_file);
}
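One caveat with hand-rolling the quoting: if a value can itself contain a double quote, it has to be doubled up first, per the usual CSV convention. A small adjustment to the callback, only needed if such values can appear in the source files:

$line = array_map(function($val) {
    // Double up any embedded quotes, then wrap the value ("" for empty fields)
    $escaped = str_replace('"', '""', (string) $val);
    return "\"$escaped\"";
}, $line);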
Thank you Sammitch
After much haggling with this problem, Sammitch pointed out: why not just concatenate the files? Simplicity is the ultimate sophistication... right?
*Note: this will only work for my specific circumstance. All I'm doing now is concatenating the files, checking that each file ends with a newline, and just plain skipping the CSV manipulation.
Code below:
function combine_and_email_csv_files() {
    // Get the current time and date
    $now = new DateTime();
    $date_string = $now->format('Y-m-d_H-i-s');

    // Get the specified directories
    $source_directory = get_option('csv_file_combiner_source_directory');
    $destination_directory = get_option('csv_file_combiner_destination_directory');

    // Load the files from the source directory
    $files = glob("$source_directory/*.txt");

    // Create the combined file
    $combined_file = fopen("$destination_directory/$date_string.txt", 'w');

    // Loop through the files
    foreach ($files as $file) {
        // Read the contents of the file
        $contents = file_get_contents($file);

        // Ensure that the file ends with a newline character
        if (substr($contents, -1) != "\n") {
            $contents .= "\n";
        }

        // Write the contents of the file to the combined file
        fwrite($combined_file, $contents);
    }

    // Close the combined file
    fclose($combined_file);
}
I have a .csv file which I can use with Google maps API to successfully create map data.
What I'm looking to do is merge 2 (or more) .csv files and display the TOTAL data on the Google map in the same way. They are all in the same format.
I have the paths to the 2 csv files and if need be, a blank .csv file in the same directory where the files could be merged to...
Unfortunately, the .csv files all have an initial 'header row' which would be awesome to omit when merging...
If anyone can point me in the right direction, I'd be very happy. Thanks
edit: I've tried:
$data1 = file_get_contents('google_map_data.csv');
$data2 = file_get_contents('google_map_data2.csv');
$TOTALdata = "google_map_dataALL.csv";
function joinFiles(array $files, $result)
{
    if (!is_array($files)) {
        throw new Exception('`$files` must be an array');
    }

    $wH = fopen($result, "w+");
    foreach ($files as $file) {
        $fh = fopen($file, "r");
        while (!feof($fh)) {
            fwrite($wH, fgets($fh));
        }
        fclose($fh);
        unset($fh);
        fwrite($wH, "\n"); // usually last line doesn't have a newline
    }
    fclose($wH);
    unset($wH);
}

joinFiles(array($data1, $data2), $TOTALdata);
I'm assuming both files are small, so loading them all in one go should be OK.
The code loads both files, then removes the first line from the second one. It also removes any end of line from the first file, but adds its own to ensure it always has a newline...
$data1 = file_get_contents('google_map_data.csv');
$data2 = file_get_contents('google_map_data2.csv');
$TOTALdata = "google_map_dataALL.csv";
$data2 = ltrim(strstr($data2, PHP_EOL));
file_put_contents($TOTALdata, rtrim($data1).PHP_EOL.$data2);
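If you end up with more than two files, the same idea generalises: keep the header from the first file and strip it from the rest. A sketch along those lines (the file list and output name here are just examples):

$files = array('google_map_data.csv', 'google_map_data2.csv', 'google_map_data3.csv');
$output = 'google_map_dataALL.csv';

$combined = '';
foreach ($files as $i => $file) {
    // Normalise the line ending so every file contributes complete rows
    $contents = rtrim(file_get_contents($file)) . PHP_EOL;

    // Drop the header row on every file after the first
    if ($i > 0) {
        $contents = ltrim(strstr($contents, PHP_EOL));
    }
    $combined .= $contents;
}
file_put_contents($output, $combined);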
I have a csv file that looks something like this (there are many more rows):
Jim,jim#email.com,8882,456
Bob,bob#email.com,8882,343
What I want to do is change all the values in the fourth column (456, 343) to 500.
I'm new to php and am not sure how to do this.
I have tried
<?php
$file = fopen('myfile.csv', 'r+');
$toBoot = array();
while ($data = fgetcsv($file)) {
    echo $data[3];
    $data[3] = str_replace($data[3], '500');
    array_push($toBoot, $data);
}
//print_r($toBoot);
echo $toBoot[0][3];
fputcsv($file, $toBoot);
fclose($file)
?>
But it prints
Jim,jim#email.com,8882,456
Bob,bob#email.com,8882,343
Array,Array
not
Jim,jim#email.com,8882,500
Bob,bob#email.com,8882,500
I've looked at this post, PHP replace data only in one column of csv, but it doesn't seem to work.
Any help appreciated. Thanks
You can use preg_replace and replace all values at once and not loop each line of the CSV file.
Two lines of code is all that is needed.
$csv = file_get_contents($path);
file_put_contents($path, preg_replace("/(.*),\d+/", "$1,500", $csv));
Where $path is the path to the CSV file.
You can see it in action here: https://3v4l.org/Mc3Pm
A quick and dirty way to solve your problem would be:
foreach (file("old_file.csv") as $line) {
    $new_line = preg_replace('/^(.*),[\d]+/', "$1,500", $line);
    file_put_contents("new_file.csv", $new_line, FILE_APPEND);
}
To change one field of the CSV, just assign to that array element, you don't need to use any kind of replace function.
$data[3] = "500";
fputcsv() is used to write one line to a CSV file, not the entire file at once. You need to call it in a loop. You also need to go back to the beginning of the file and remove the old contents.
fseek($file, 0);
ftruncate($file, 0);
foreach ($toBoot as $row) {
    fputcsv($file, $row);
}
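Putting those pieces together, a corrected version of the original script might look like this (a sketch, assuming the whole file comfortably fits in memory):

$file = fopen('myfile.csv', 'r+');

// Read every row and overwrite the fourth column
$toBoot = array();
while (($data = fgetcsv($file)) !== false) {
    $data[3] = '500';
    $toBoot[] = $data;
}

// Rewind, wipe the old contents, then write the updated rows back
fseek($file, 0);
ftruncate($file, 0);
foreach ($toBoot as $row) {
    fputcsv($file, $row);
}
fclose($file);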
I am new here and need a bit of help. I have a php script which is pulling data out of a database and creating .csv files. I need to add some logic to the script which can compare two files and then rename the file if the file size is equal to or greater than a specific (TBD) size.
Basically this script runs twice an hour, and I would only like the .csv files rewritten if the file size is large enough. This is all in hopes that it will prevent .csv files being created which are incomplete or too small.
This is a bit of the code which is creating the .csv documents. Any help would be much appreciated.
$course_csv = fopen('/Course.csv', 'w');
$courses_u = array_unique($courses, SORT_REGULAR);
foreach ($courses_u as $course) {
    fputcsv($course_csv, $course, '|');
}
fclose($course_csv);

$data = file('/Course.csv');
$handle = fopen("/Course.csv", "w");
foreach ($data as $line) {
    $line = str_replace(array("\r\n", ',', '"'), "", $line);
    fwrite($handle, "{$line}");
}
fclose($handle);

$maxfilesize = 2048;
$myfilesize = filesize('/Course.csv');
if ($myfilesize > $maxfilesize) {
    rename('/Course.csv', '/CourseToBig.csv');
}
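One way to get the "only rewrite if big enough" behaviour is to write the export to a temporary file first and only swap it in when it reaches a minimum size, so an incomplete export never replaces the existing Course.csv. A rough sketch of the idea (the threshold and temp path are just placeholders):

$minfilesize = 2048;           // TBD minimum size in bytes
$tmpfile = '/Course.tmp.csv';  // temporary location for the fresh export

// Write the fresh export to the temporary file first
$course_csv = fopen($tmpfile, 'w');
foreach (array_unique($courses, SORT_REGULAR) as $course) {
    fputcsv($course_csv, $course, '|');
}
fclose($course_csv);

// Only replace the live file if the new export looks complete
clearstatcache();
if (filesize($tmpfile) >= $minfilesize) {
    rename($tmpfile, '/Course.csv');
} else {
    unlink($tmpfile); // keep the previous Course.csv untouched
}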
I have an excel (file.xls) / csv (file.csv) file that contains, or will contain, hundreds of thousands of entries, maybe even millions. Is it possible to split it into multiple files, like file.xls into file1.xls, file2.xls, file3.xls and so on?
Are there any libraries to use? Is this possible in PHP? Or how about JavaScript?
And can I specify how many rows to include in each file?
Thanks
Quick and dirty way of splitting a CSV file into several CSV files:
$inputFile = 'input.csv';
$outputFile = 'output';
$splitSize = 10000;
$in = fopen($inputFile, 'r');
$rowCount = 0;
$fileCount = 1;
while (!feof($in)) {
    if (($rowCount % $splitSize) == 0) {
        if ($rowCount > 0) {
            fclose($out);
        }
        $out = fopen($outputFile . $fileCount++ . '.csv', 'w');
    }

    $data = fgetcsv($in);
    if ($data) {
        fputcsv($out, $data);
    }
    $rowCount++;
}
fclose($out);
Yes, it is possible to do that in PHP and with CSV files. You basically iterate over the large file and chunk it every X rows, forwarding those rows to another file.
You find the information how to open the large CSV file as an iterator in this answer here:
Answer to "how to extract data from csv file in php"
Then you need to chunk the iterator into parts of X rows each. That can be done as outlined here:
Answer to "Need some advice with PHP loop"
Just instead of outputting into multiple <ul>...</ul> HTML lists, you copy the rows over into new files. That basically works as outlined in:
Answer to "How can I split a CSV file in PHP?"
However, this time you want to use the SplFileObject::fputcsv method. Take care to use the latest stable PHP for this; otherwise you will need to do it differently, see fputcsv().
If the first line of the original file contains column-headers, you might be as well interested in the following:
Answer to "Process CSV Into Array With Column Headings For Key"
It just shows some ways to extend / process the incoming file. You might not need the full abstraction done there; just keeping the first line around might do it already.
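For reference, here is a compact sketch of that approach using SplFileObject, repeating the header row at the top of every chunk (the file names and chunk size are just example values):

$input = new SplFileObject('input.csv', 'r');
$input->setFlags(SplFileObject::READ_CSV | SplFileObject::SKIP_EMPTY | SplFileObject::READ_AHEAD);

$chunkSize = 10000;
$header = null;
$out = null;
$rowCount = 0;
$fileCount = 0;

foreach ($input as $row) {
    if ($header === null) {
        $header = $row; // remember the column headings from the first line
        continue;
    }
    if ($rowCount % $chunkSize === 0) {
        // Start a new chunk file and repeat the header at its top
        $out = new SplFileObject('output' . ++$fileCount . '.csv', 'w');
        $out->fputcsv($header);
    }
    $out->fputcsv($row);
    $rowCount++;
}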
I think you can also use "split by file size":
$part = 1;
$maxSize = 50; // 50 MB
$fopen = fopen('filename.csv', 'r') or die('ERROR');

while (($line = fgetcsv($fopen, 10000, ";")) !== FALSE) {
    // Append the row to the current part file
    $ftowrite = fopen("Part_$part.csv", 'a');
    fputcsv($ftowrite, $line);
    fclose($ftowrite);

    // Start a new part once the current file exceeds the size limit
    clearstatcache();
    $size = filesize("Part_$part.csv") / 1000000;
    if ($size > $maxSize) {
        $part++;
    }
}
fclose($fopen);