If I have a CSV saved on a server, how can I use PHP to write a given line, say 142,fred,elephants to the bottom of it?
Open the CSV file for appending (see the fopen docs):
$handle = fopen("test.csv", "a");
Then add your line (see the fputcsv docs):
fputcsv($handle, $line); // $line is an array of fields, e.g. array('142', 'fred', 'elephants')
Then close the handle (see the fclose docs):
fclose($handle);
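Putting the three steps together, a minimal sketch for the line from the question (assuming test.csv sits next to the script and is writable):
$line = array('142', 'fred', 'elephants');
$handle = fopen("test.csv", "a");
if ($handle !== false) {
    fputcsv($handle, $line); // appends: 142,fred,elephants
    fclose($handle);
}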
You can also use the object-oriented interface for a file, SplFileObject (PHP 5 >= 5.4.0): http://php.net/manual/en/splfileobject.fputcsv.php
$file = new SplFileObject('file.csv', 'a');
$file->fputcsv(array('aaa', 'bbb', 'ccc', 'dddd'));
$file = null;
This solution works for me:
<?php
$list = array(
    'Peter,Griffin,Oslo,Norway',
    'Glenn,Quagmire,Oslo,Norway',
);
$file = fopen('contacts.csv', 'a'); // 'a' appends to the file - it is created if it doesn't exist
foreach ($list as $line) {
    fputcsv($file, explode(',', $line));
}
fclose($file);
?>
Ref: https://www.w3schools.com/php/func_filesystem_fputcsv.asp
If you want each split file to retain the headers of the original, here is a modified version of hakre's answer:
$inputFile = './users.csv'; // the source file to split
$outputFile = 'users_split'; // this will be appended with a number and .csv e.g. users_split1.csv
$splitSize = 10; // how many rows per split file you want
$in = fopen($inputFile, 'r');
$headers = trim(fgets($in)); // get the header row of the original file (trimmed so the newline doesn't end up in the last split field)
// No need to touch below this line..
$rowCount = 0;
$fileCount = 1;
while (!feof($in)) {
    if (($rowCount % $splitSize) == 0) {
        if ($rowCount > 0) {
            fclose($out);
        }
        $out = fopen($outputFile . $fileCount++ . '.csv', 'w');
        fputcsv($out, explode(',', $headers));
    }
    $data = fgetcsv($in);
    if ($data) {
        fputcsv($out, $data);
    }
    $rowCount++;
}
fclose($out);
Related
This exports one file with 2 rows in it. How can I export 2 files, each with one row in it?
<?php
$list = array(
    array("Peter", "Griffin", "Oslo", "Norway"),
    array("Glenn", "Quagmire", "Oslo", "Norway")
);
$file = fopen("contacts.csv", "w");
foreach ($list as $line) {
    fputcsv($file, $line);
}
fclose($file);
?>
I tried this:
$list = array(
    array("Peter", "Griffin", "Oslo", "Norway"),
    array("Glenn", "Quagmire", "Oslo", "Norway")
);
$file = fopen('php://output', 'w');
$i = 1;
foreach ($list as $line) {
    header('Content-Disposition: attachment; filename="' . $i . 'wp.csv"');
    fputcsv($file, $line);
    fclose($file);
    $i++;
}
But it only downloads one file. At least it only has the one row in it though.
I found this example. Even though it creates 2 separate csv files, the data is identical in each file instead of each file containing an individual record.
// some data to be used in the csv files
$headers = array('id', 'name', 'age', 'species');
$records = array(
    array('1', 'gise', '4', 'cat'),
    array('2', 'hek2mgl', '36', 'human')
);
// create your zip file
$zipname = 'file.zip';
$zip = new ZipArchive;
$zip->open($zipname, ZipArchive::CREATE);
// loop to create 3 csv files
for ($i = 0; $i < 3; $i++) {
    // create a temporary file
    $fd = fopen('php://temp/maxmemory:1048576', 'w');
    if (false === $fd) {
        die('Failed to create temporary file');
    }
    // write the data to csv
    fputcsv($fd, $headers);
    foreach ($records as $record) {
        fputcsv($fd, $record);
    }
    // return to the start of the stream
    rewind($fd);
    // add the in-memory file to the archive, giving it a name
    $zip->addFromString('file-' . $i . '.csv', stream_get_contents($fd));
    // close the file
    fclose($fd);
}
// close the archive
$zip->close();
header('Content-Type: application/zip');
header('Content-disposition: attachment; filename='.$zipname);
header('Content-Length: ' . filesize($zipname));
readfile($zipname);
// remove the zip archive
// you could also use the temp file method above for this.
unlink($zipname);
You can only put one "file" into an HTTP response.
If you want to generate multiple CSV files then you'll need to get more exotic. You could generate an HTML document of links to URLs that each generate a CSV file, or you could generate a zip file containing the CSV files.
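For instance, a rough sketch of the zip approach (the contacts.zip name and the one-row-per-file naming are just examples), reusing the $list array from above:
$zipname = 'contacts.zip';
$zip = new ZipArchive;
$zip->open($zipname, ZipArchive::CREATE | ZipArchive::OVERWRITE);
foreach ($list as $i => $line) {
    // build each one-row CSV in memory, then add it to the archive
    $fd = fopen('php://temp', 'w+');
    fputcsv($fd, $line);
    rewind($fd);
    $zip->addFromString('contact-' . ($i + 1) . '.csv', stream_get_contents($fd));
    fclose($fd);
}
$zip->close();
header('Content-Type: application/zip');
header('Content-Disposition: attachment; filename=' . $zipname);
header('Content-Length: ' . filesize($zipname));
readfile($zipname);
unlink($zipname);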
Just open a second file and use it like this:
$file2 = fopen("contacts2.csv", "w");
fputcsv($file2, $list[1]);
You always write to the same file here, because you are referring to $file:
fputcsv($file, $list[1]);
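A generalized sketch of the same fix (the contactsN.csv names are only examples): open a fresh handle for each row so every row of $list ends up in its own file.
foreach ($list as $i => $line) {
    $handle = fopen('contacts' . ($i + 1) . '.csv', 'w'); // contacts1.csv, contacts2.csv, ...
    fputcsv($handle, $line);
    fclose($handle);
}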
I'm trying to delete one line from a CSV file by its line number, which I get as a URL parameter.
I saw some discussions here, but they were mainly about "delete a line by its id stored in the first column" and so on. I tried to do it the same way as in those discussions, but it does not work; I only changed the condition.
if (isset($_GET['remove']))
{
    $RowNo = $_GET['remove']; // getting row number
    $row = 1;
    if (($handle = fopen($FileName, "w+")) !== FALSE)
    {
        while (($data = fgetcsv($handle, 1000, ";")) !== FALSE)
        {
            //Here, I don't understand, why this condition does not work.
            if ($row != $RowNo)
            {
                fputcsv($handle, $data, ';');
            }
            $row++;
        }
        fclose($handle);
    }
}
I supposed that it should work for me too, because only the condition was changed. But it does not; it clears the whole file. Could you help me with it, please?
Thank you very much for any advice. Daniel.
You could load the file as an array of lines by using file().
Then remove the line and write the file back.
// read the file into an array of lines (each line keeps its trailing newline)
$fileAsArray = file($fileName);
// the line to delete is the line number minus 1, because arrays begin at zero
$lineToDelete = $_GET['remove'] - 1;
// check that the line to delete actually exists in the file
if ($lineToDelete < 0 || $lineToDelete >= count($fileAsArray)) {
    throw new Exception("Given line number was not found in file.");
}
// remove the line
unset($fileAsArray[$lineToDelete]);
// open the file for writing (w+ truncates it)
if (!is_writable($fileName) || ($fp = fopen($fileName, 'w+')) === false) {
    // report the error
    throw new Exception("Cannot open file ($fileName)");
}
// if $fp is valid
if ($fp) {
    // write the remaining lines back to the file
    foreach ($fileAsArray as $line) {
        fwrite($fp, $line);
    }
    // close the file
    fclose($fp);
}
If you are on a Unix system you could also use the sed command:
exec("sed -i -e '{$lineToDelete}d' {$FileName}"); // -i edits the file in place; note that sed line addresses are 1-based
Remember to sanitize command parameters if any user input is used:
https://www.php.net/manual/de/function.escapeshellcmd.php
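For example, one possible way to escape both the sed expression and the filename with escapeshellarg() (the $lineNumber variable is introduced here just for illustration):
$lineNumber = (int) $_GET['remove'];            // the 1-based line number to delete, cast to int
exec('sed -i -e ' . escapeshellarg($lineNumber . 'd') . ' ' . escapeshellarg($FileName));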
Option if your CSV can fit in memory:
// Read the CSV into an array of lines (newlines stripped)
$lines = file($fileName, FILE_SKIP_EMPTY_LINES | FILE_IGNORE_NEW_LINES);
// Remove the element from the array - validate that it exists first!
unset($lines[$rowNo - 1]);
// Rewrite your CSV file
$handle = fopen($fileName, "w+");
foreach ($lines as $line) {
    fwrite($handle, $line . PHP_EOL); // the newlines were stripped by file(), so add them back
}
fclose($handle);
Option if your CSV cannot fit in memory:
Use the code from the question, but write to a separate file and later replace the original with it:
$handle = fopen($FileName, "r");
$row = 1;
// Read the file while not End-Of-File
while (!feof($handle)) {
    $line = fgets($handle); // always read the line so the pointer advances, even for the row we skip
    if ($row != $RowNo) {
        file_put_contents($FileName . '.tmp', $line, FILE_APPEND);
    }
    $row++;
}
fclose($handle);
// Remove the old file and rename .tmp to the previously removed file
unlink($FileName);
rename($FileName . '.tmp', $FileName);
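A variant of the same idea, sketched with the same $FileName and $RowNo variables: keep a second handle open instead of calling file_put_contents() once per line, so the temporary file is not reopened on every iteration.
$in = fopen($FileName, "r");
$out = fopen($FileName . '.tmp', "w");
$row = 1;
while (($line = fgets($in)) !== false) {
    if ($row != $RowNo) {
        fwrite($out, $line);
    }
    $row++;
}
fclose($in);
fclose($out);
rename($FileName . '.tmp', $FileName); // replaces the original; unlink() it first if your platform won't overwrite on rename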
I want to read a file line by line, but without completely loading it in memory.
My file is too large to open in memory, and if I try to do so I always get out of memory errors.
The file size is 1 GB.
You can use the fgets() function to read the file line by line:
$handle = fopen("inputfile.txt", "r");
if ($handle) {
    while (($line = fgets($handle)) !== false) {
        // process the line read.
    }
    fclose($handle);
}
if ($file = fopen("file.txt", "r")) {
    while (!feof($file)) {
        $line = fgets($file);
        # do the same stuff with $line
    }
    fclose($file);
}
You can use the object-oriented interface for a file, SplFileObject (PHP 5 >= 5.1.0): http://php.net/manual/en/splfileobject.fgets.php
<?php
$file = new SplFileObject("file.txt");
// Loop until we reach the end of the file.
while (!$file->eof()) {
    // Echo one line from the file.
    echo $file->fgets();
}
// Unset the file to call __destruct(), closing the file handle.
$file = null;
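Optionally, SplFileObject can also strip the line endings and skip blank lines for you via its flags; a small sketch:
$file = new SplFileObject("file.txt");
// DROP_NEW_LINE removes the trailing newline, SKIP_EMPTY skips blank lines (READ_AHEAD helps SKIP_EMPTY take effect during iteration)
$file->setFlags(SplFileObject::READ_AHEAD | SplFileObject::SKIP_EMPTY | SplFileObject::DROP_NEW_LINE);
foreach ($file as $line) {
    echo $line, "\n";
}
$file = null;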
If you want to use foreach instead of while when opening a big file, you probably want to encapsulate the while loop inside a Generator to avoid loading the whole file into memory:
/**
 * @return Generator
 */
$fileData = function() {
    $file = fopen(__DIR__ . '/file.txt', 'r');
    if (!$file) {
        return; // die() is a bad practice, better to use return
    }
    while (($line = fgets($file)) !== false) {
        yield $line;
    }
    fclose($file);
};
Use it like this:
foreach ($fileData() as $line) {
    // $line contains the current line
}
This way you can process individual file lines inside the foreach().
Note: Generators require PHP >= 5.5
There is a file() function that returns an array of the lines contained in the file. Note, however, that it reads the whole file into memory at once, so it is not suitable for the very large files this question is about.
foreach (file('myfile.txt') as $line) {
    echo $line . "\n";
}
The obvious answer isn't in any of the other responses: PHP has a neat streaming delimiter parser, stream_get_line(), made for exactly this purpose.
$fp = fopen("/path/to/the/file", "r");
while (($line = stream_get_line($fp, 1024 * 1024, "\n")) !== false) {
    echo $line;
}
fclose($fp);
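Unlike fgets(), stream_get_line() does not include the delimiter in the returned string, and the delimiter can be anything, not just "\n"; for example (records.txt and the ';' separator are just placeholders):
$fp = fopen("records.txt", "r");
while (($record = stream_get_line($fp, 1024 * 1024, ";")) !== false) {
    echo $record, "\n"; // $record comes back without the trailing ';'
}
fclose($fp);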
Use buffering techniques to read the file.
$filename = "test.txt";
$source_file = fopen( $filename, "r" ) or die("Couldn't open $filename");
while (!feof($source_file)) {
$buffer = fread($source_file, 4096); // use a buffer of 4KB
$buffer = str_replace($old,$new,$buffer);
///
}
foreach (new SplFileObject(__FILE__) as $line) {
echo $line;
}
One of the popular solutions to this question will have issues with the newline character. It can be fixed pretty easily with a simple str_replace():
$handle = fopen("some_file.txt", "r");
if ($handle) {
    while (($line = fgets($handle)) !== false) {
        $line = str_replace("\n", "", $line);
    }
    fclose($handle);
}
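If the file may have Windows-style line endings, rtrim() with a character list is a handy alternative to str_replace():
$line = rtrim($line, "\r\n"); // strips a trailing "\n" or "\r\n"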
This is how I handle very big files (tested with up to 100 GB), and it's faster than fgets():
$block = 1024 * 1024; // 1 MB, or anything higher than HDD block_size * 2
if ($fh = fopen("file.txt", "r")) {
    $left = '';
    while (!feof($fh)) { // read the file block by block
        $temp = fread($fh, $block);
        $lines = explode("\n", $temp);
        $lines[0] = $left . $lines[0]; // glue the leftover partial line from the previous block onto the first piece
        if (!feof($fh)) {
            $left = array_pop($lines); // the last piece may be an incomplete line; keep it for the next block
        }
        foreach ($lines as $k => $line) {
            // do smth with $line
        }
    }
    fclose($fh);
}
Be careful with the while (!feof(...)) { fgets() } pattern: fgets() can hit an error (returning false) and loop forever without ever reaching the end of the file. codaddict's answer was closest to being correct, but when your while (fgets()) loop ends, check feof(); if it is not true, then you had an error.
SplFileObject is useful when it comes to dealing with large files.
function parse_file($filename)
{
    try {
        $file = new SplFileObject($filename);
    } catch (LogicException $exception) {
        die('SplFileObject: ' . $exception->getMessage());
    }
    while ($file->valid()) {
        $line = $file->fgets();
        // do something with $line
    }
    // don't forget to free the file handle.
    $file = null;
}
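Usage is then a single call (the filename below is only a placeholder); because SplFileObject reads one line at a time, memory use stays flat even for very large files:
parse_file('big_input.txt');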
<?php
echo '<meta charset="utf-8">';
$k = 1;
$f = 1;
$fp = fopen("texttranslate.txt", "r");
while (!feof($fp)) {
    $contents = '';
    for ($i = 1; $i <= 1500; $i++) {
        $line = fgets($fp); // read each line only once, then both echo it and collect it
        echo $k . ' -- ' . $line . '<br>';
        $k++;
        $contents .= $line;
    }
    echo '<hr>';
    file_put_contents('Split/new_file_' . $f . '.txt', $contents);
    $f++;
}
?>
A function that reads the file and returns its contents as an array of 4 KB chunks:
function read_file($filename = '') {
    $buffer = array();
    $source_file = fopen($filename, "r") or die("Couldn't open $filename");
    while (!feof($source_file)) {
        $buffer[] = fread($source_file, 4096); // use a buffer of 4KB
    }
    fclose($source_file);
    return $buffer;
}
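A quick usage sketch (test.txt is only an example name); note the returned array holds 4 KB chunks rather than lines, and the entire file still ends up in memory:
$chunks = read_file('test.txt');
echo count($chunks) . " chunks read";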
I saw this question about using fseek to insert a string before the last line, but it doesn't solve my problem. I don't use the "?>" tag. PHP version: 5.4.
example line1
example line2
//i need insert here
lastline $eg};
My code is working, but it adds empty lines after every line:
$filename = 'example.php';
$arr = file($filename);
if ($arr === false) {
    die('Error ' . $filename);
}
array_pop($arr);
file_put_contents($filename, implode(PHP_EOL, $arr));
/// I'm deleting last line here
$person = "my text here\n";
file_put_contents($filename, $person, FILE_APPEND);
$person = "andherelastline";
file_put_contents($filename, $person, FILE_APPEND);
//and then add again here
$file = "tmp/saf.txt";
$fc = fopen($file, "r");
while (!feof($fc)) {
    $buffer = fgets($fc, 4096);
    $lines[] = $buffer;
}
fclose($fc);
//open same file and use "w" to clear file
$f = fopen($file, "w") or die("couldn't open $file");
$lineCount = count($lines);
// loop through the array, writing the lines up to the second-last
for ($i = 0; $i < $lineCount - 1; $i++) {
    fwrite($f, $lines[$i]);
}
fwrite($f, 'Your insert string here'.PHP_EOL);
//write the last line
fwrite($f, $lines[$lineCount-1]);
fclose($f);