Is it possible to get a resource handle from file data? - php

I currently have a function that takes a CSV file and returns an array of the data from it. I want to minimally alter this function to take the file data instead of the file itself.
Using the following code, I would like to get a resource handle from the passed-in data instead of from a file, so that I can keep the rest of the function the same. Is this possible?
public function returnRawCSVData($filepath, $separator = ',')
{
    $file = fopen($filepath, 'r');
    $csvrawdata = array();
    // I WANT TO CHANGE $filepath to $file_data and get a resource from it to pass into fgetcsv below.
    while (($row = fgetcsv($file, $this->max_row_size, $separator, $this->enclosure)) !== false) {
        if ($row[0] != null) { // skip empty lines
        }
    }
    fclose($file);
    return $csvrawdata;
}

It seems you're looking for a way to create a new file resource from the source text?
If so, you can create a file resource in-memory like so:
/**
 * Return an in-memory file resource handle from source text
 * @param string $csvtxt CSV source text
 * @return resource File resource handle
 */
public static function getFileResourceFromSrcTxt($csvtxt)
{
    $tmp_handle = fopen('php://temp', 'r+');
    fwrite($tmp_handle, $csvtxt);
    rewind($tmp_handle); // reset the pointer so the next read starts at the beginning
    return $tmp_handle;
}
/**
 * Parse csv data from source text
 * @param string $file_data CSV source text
 * @see self::getFileResourceFromSrcTxt
 */
public function returnRawCSVData($file_data, $separator = ',')
{
    $file = self::getFileResourceFromSrcTxt($file_data);
    $csvrawdata = array();
    while (($row = fgetcsv($file, $this->max_row_size, $separator, $this->enclosure)) !== false) {
        if ($row[0] != null) { // skip empty lines
            $csvrawdata[] = $row;
        }
    }
    fclose($file);
    return $csvrawdata;
}
It's worth noting that you can also use "php://memory" in place of "php://temp" -- the difference being that 'memory' ONLY stores things in memory while 'temp' will store something in memory until it reaches a given size (the default is 2 MB), then transparently switch to the filesystem.
The PHP documentation on I/O stream wrappers covers this topic in more detail.
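As a minimal, self-contained sketch of the in-memory approach (the function name and separator default are mine, not from the question):

```php
<?php
// Sketch: build an in-memory handle from CSV text, rewind it, and parse it
// with fgetcsv(). Without the rewind(), reads would start at EOF.
function csvTextToRows(string $csvtxt, string $separator = ','): array
{
    $handle = fopen('php://temp', 'r+');
    fwrite($handle, $csvtxt);
    rewind($handle); // reset the pointer before reading
    $rows = [];
    while (($row = fgetcsv($handle, 0, $separator)) !== false) {
        $rows[] = $row;
    }
    fclose($handle);
    return $rows;
}

$rows = csvTextToRows("a,b\n1,2\n");
// $rows[0] is ['a', 'b'] and $rows[1] is ['1', '2']
```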

If you're trying to pass around file handles, you can treat them as such:
$in_file = fopen('some_file.csv', 'r');
// Do stuff with input...
// Later, pass the file handle to a function and let it read from the file too.
$data = doStuffWithFile($in_file);
fclose($in_file);

function doStuffWithFile($file_handle)
{
    $line = fgetcsv($file_handle);
    return $line;
}


Best way to read a large file in php [duplicate]

This question already has answers here:
Reading very large files in PHP
(8 answers)
Closed 1 year ago.
I have a file with around 100 records for now.
The file has users in json format per line.
Eg
{"user_id" : 1,"user_name": "Alex"}
{"user_id" : 2,"user_name": "Bob"}
{"user_id" : 3,"user_name": "Mark"}
Note: this is just a very simple example; I have more complex json values per line in the file.
I am reading the file line by line and storing each line in an array, which will obviously get big if there are a lot of items in the file.
public function read(string $file) : array
{
    // Open the file in "reading only" mode.
    $fileHandle = fopen($file, "r");
    // If we failed to get a file handle, throw an Exception.
    if ($fileHandle === false) {
        throw new Exception('Could not get file handle for: ' . $file);
    }
    $lines = [];
    // While we haven't reached the end of the file.
    while (!feof($fileHandle)) {
        // Read the current line in.
        $lines[] = json_decode(fgets($fileHandle));
    }
    // Finally, close the file handle.
    fclose($fileHandle);
    return $lines;
}
Next, I'll process this array, take only the parameters I need (some might be further processed), and then export the array to CSV.
public function processInput($users) {
    $data = [];
    foreach ($users as $key => $user) {
        $data[$key]['user_id'] = $user->user_id;
        $data[$key]['user_name'] = strtoupper($user->user_name);
    }
    // Call export to csv $data.
}
What is the best way to read the file, in case it is a big file?
I know file_get_contents is not optimal here and that fgets is a better approach.
Is there a much better way, considering a big file read followed by export to CSV?
You need to modify your reader to make it more "lazy" in some sense. For example consider this:
public function read(string $file, callable $rowProcessor) : void
{
    // Open the file in "reading only" mode.
    $fileHandle = fopen($file, "r");
    // If we failed to get a file handle, throw an Exception.
    if ($fileHandle === false) {
        throw new Exception('Could not get file handle for: ' . $file);
    }
    // While we haven't reached the end of the file.
    while (!feof($fileHandle)) {
        // Read the current line in and hand it to the callback.
        $line = json_decode(fgets($fileHandle));
        $rowProcessor($line);
    }
    // Finally, close the file handle.
    fclose($fileHandle);
}
Then you will need different code that works with this:
function processAndWriteJson($filename) { // Names are hard
    $writer = fopen('output.csv', 'w');
    read($filename, function ($row) use ($writer) {
        // Do processing of the single row here
        $processedRow = (array) $row;
        fputcsv($writer, $processedRow);
    });
    fclose($writer);
}
If you want to get the same result as before from your read method, you can do:

$lines = [];
read($filename, function ($row) use (&$lines) {
    $lines[] = $row;
});
It does provide some more flexibility. Unfortunately, it means you can only process one line at a time, and scanning up and down the file is harder.
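An alternative sketch of the same lazy idea uses a generator, so callers can simply foreach over the file; the function name is mine, not from the question:

```php
<?php
// Generator-based lazy reader (sketch): yields one decoded JSON line at a
// time instead of invoking a callback, keeping memory usage flat.
function readJsonLines(string $file): \Generator
{
    $fileHandle = fopen($file, 'r');
    if ($fileHandle === false) {
        throw new Exception('Could not get file handle for: ' . $file);
    }
    try {
        while (($line = fgets($fileHandle)) !== false) {
            $line = trim($line);
            if ($line !== '') { // skip blank lines, e.g. a trailing newline
                yield json_decode($line);
            }
        }
    } finally {
        fclose($fileHandle); // runs even if the caller stops iterating early
    }
}

// Usage: foreach (readJsonLines('users.jsonl') as $user) { ... }
```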

How to remove duplicate emails in csv file

How to remove duplicate email addresses across different csv files.
Example:
I have 10,000 email addresses (all_members.csv). After sending out a mailing, I received 2,550 invalid emails (invalid.csv).
I want to remove those invalid emails.
My code:
<?php
$all = file('all_email.csv'); // all_members.csv
$invalid = file('invalid.csv'); // invalid_email.csv
$correctEmails = array_diff($all, $invalid);
foreach ($correctEmails as $email) {
    echo $email . "<br>";
}
$result = array_intersect($all, $invalid);
?>
For removing emails only, this PHP code works.
The problem is that when I want to remove emails under multiple columns, it does not work.
Can anyone help? I would greatly appreciate it, thanks.
I'd recommend encapsulating this in a function that streams each line, builds an array of field -> value pairs, then checks whether the value is in the data you want to remove; if it's not, write the line to an output file.
Something like...
<?php
/**
 * Given a CSV file to read, the delimiter, and what to remove, write the filtered CSV data
 * @param $filename string The /path/to/file.csv
 * @param $outputFile string Where to write the output CSV data to
 * @param $delimiter string How the fields are delimited in the CSV
 * @param $removeHeader string The header to remove data from
 * @param $removeData array The data to omit from the output
 *
 * @return boolean
 **/
function remove_duplicates($filename, $outputFile, $delimiter = ',', $removeHeader = '', $removeData = [])
{
    // If the file doesn't exist or isn't readable - return false
    if (!file_exists($filename) || !is_readable($filename)) {
        return false;
    }
    $header = null;
    $writeHandle = fopen($outputFile, 'w');
    if (false !== ($readHandle = fopen($filename, 'r'))) {
        // While there are rows in the CSV, get this as an array of values
        while (false !== ($row = fgetcsv($readHandle, 1000, $delimiter))) {
            // On the first iteration, get the headers from the CSV
            if (!$header) {
                $header = $row;
                fputcsv($writeHandle, $header);
            } else {
                // Combine the headers with the row to create an associative array representing a line
                $line = array_combine($header, $row);
                // Looking at the removeHeader field in this line, check to see if the value is in removeData
                if (!in_array($line[$removeHeader], $removeData)) {
                    // If it's not, then it's a valid line
                    fputcsv($writeHandle, $line);
                }
            }
        }
        fclose($readHandle);
        fclose($writeHandle);
    }
    // Return
    return true;
}
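A compact, testable variant of the same streaming idea (the function name and the "email" column used below are mine, not from the question):

```php
<?php
// Sketch: stream a CSV, drop rows whose value in one named column appears in
// a blacklist, and write everything else to the output file.
function filterCsvByColumn(string $inFile, string $outFile, string $column, array $badValues): int
{
    $bad = array_flip(array_map('trim', $badValues)); // O(1) lookups
    $in  = fopen($inFile, 'r');
    $out = fopen($outFile, 'w');
    $header = fgetcsv($in);       // first row holds the column names
    fputcsv($out, $header);
    $col = array_search($column, $header, true);
    $removed = 0;
    while (($row = fgetcsv($in)) !== false) {
        if (isset($bad[trim($row[$col])])) {
            $removed++;           // matched the blacklist: skip it
            continue;
        }
        fputcsv($out, $row);
    }
    fclose($in);
    fclose($out);
    return $removed;
}
```

Assuming all_members.csv has an "email" header column, usage could look like `filterCsvByColumn('all_members.csv', 'clean.csv', 'email', file('invalid.csv', FILE_IGNORE_NEW_LINES));`.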

Convert csv to excel with PHPExcel in laravel?

I have found this answer:
PHP Converting CSV to XLS - phpExcel error
I have tried it in Laravel 4 but am not able to get it to work; any help would be appreciated.
My code:
public function CsvExcelConverter($filename)
{
    $objReader = Excel::createReader('CSV');
    $objReader->setDelimiter(";");
    $objPHPExcel = $objReader->load('uploads/' . $filename);
    $objWriter = Excel::createWriter($objPHPExcel, 'Excel5');
    // new file: use the base name, without the extension
    $new_filename = explode('.', $filename);
    $new_name = $new_filename[0];
    $objWriter->save($new_name . '.xls');
    return $new_name . '.xls';
}
Thanks for the answers, but for some reason we can't seem to set the delimiter on load. I have found that you can set it in the config file instead:
vendor/maatwebsite/excel/src/config/csv.php
Then just specify the delimiter there. This way, when loading the file it actually separates each entry, and when converting, each entry ends up in its own cell.
Thanks for all the help.
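The relevant config entry presumably looks something like this (key names taken from the library's 1.x/2.x CSV config; verify against your installed version):

```php
<?php
// vendor/maatwebsite/excel/src/config/csv.php (sketch)
return array(
    'delimiter'   => ';',   // field separator used when reading/writing CSV
    'enclosure'   => '"',
    'line_ending' => "\r\n",
);
```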
/* Get the excel.php class here: http://www.phpclasses.org/browse/package/1919.html */
require_once("../classes/excel.php");

$inputFile = $argv[1];
$xlsFile = $argv[2];
if (empty($inputFile) || empty($xlsFile)) {
    die("Usage: " . basename($argv[0]) . " in.csv out.xls\n");
}
$fh = fopen($inputFile, "r");
if (!is_resource($fh)) {
    die("Error opening $inputFile\n");
}
/* Assuming that first line is column headings */
if (($columns = fgetcsv($fh, 1024, "\t")) === false) {
    print("Error, couldn't get header row\n");
    exit(-2);
}
$numColumns = count($columns);
/* Now read each of the rows, and construct a
   big array that holds the data to be Excel-ified: */
$xlsArray = array();
$xlsArray[] = $columns;
while (($rows = fgetcsv($fh, 1024, "\t")) !== false) {
    $rowArray = array();
    for ($i = 0; $i < $numColumns; $i++) {
        $key = $columns[$i];
        $val = $rows[$i];
        $rowArray[$key] = $val;
    }
    $xlsArray[] = $rowArray;
    unset($rowArray);
}
fclose($fh);
/* Now let the excel class work its magic. excel.php
   has registered a stream wrapper for "xlsfile:/"
   and that's what triggers its 'magic': */
$xlsFile = "xlsfile://" . $xlsFile;
$fOut = fopen($xlsFile, "wb");
if (!is_resource($fOut)) {
    die("Error opening $xlsFile\n");
}
fwrite($fOut, serialize($xlsArray));
fclose($fOut);
exit(0);
If you use the maatwebsite/excel library in Laravel, you can only use native PHPExcel instance methods, not static methods. To convert from CSV to Excel, this code can be found on the documentation page:
Excel::load($filename, function($file) {
    // modify file content
})->setFileName($new_name)->store('xls');
In theory, you should create your custom class to set delimiter:
class CSVExcel extends Excel {
    protected $delimiter = ';';
}
and now you could use:
CSVExcel::load('csvfilename.csv')->setFileName('newfilename')->export('xls');
But the problem is that $delimiter isn't used in this case. Delimiter support seems to have been added not long ago, so maybe there is a bug, or it needs to be used in another way. I've filed an issue just in case: https://github.com/Maatwebsite/Laravel-Excel/issues/262

How to delete a line from the file with php?

I have a file whose path is $dir and a string named $line. I know that this string is a complete line of that file, but I don't know its line number, and I want to remove it from the file. What should I do?
Is it possible to use awk?
$contents = file_get_contents($dir);
$contents = str_replace($line, '', $contents);
file_put_contents($dir, $contents);
Read the lines one by one, and write all but the matching line to another file. Then replace the original file.
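That stream-and-replace approach can be sketched like this (the function name and `.tmp` suffix are assumptions):

```php
<?php
// Sketch: copy every line except the match to a temp file, then swap the
// temp file in place of the original.
function deleteLineStreaming(string $path, string $lineToDelete): void
{
    $tmpPath = $path . '.tmp';
    $in  = fopen($path, 'r');
    $out = fopen($tmpPath, 'w');
    while (($line = fgets($in)) !== false) {
        if (rtrim($line, "\r\n") !== $lineToDelete) {
            fwrite($out, $line); // keep every non-matching line as-is
        }
    }
    fclose($in);
    fclose($out);
    rename($tmpPath, $path); // replace the original file
}
```

This never holds more than one line in memory, so it also works for files too large for file_get_contents().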
This just loops over every line; if it is not the one you want to delete, it gets pushed to an array that is written back to the file:
$DELETE = "the_line_you_want_to_delete";
$data = file("./foo.txt");
$out = array();
foreach ($data as $line) {
    if (trim($line) != $DELETE) {
        $out[] = $line;
    }
}
$fp = fopen("./foo.txt", "w+");
flock($fp, LOCK_EX);
foreach ($out as $line) {
    fwrite($fp, $line);
}
flock($fp, LOCK_UN);
fclose($fp);
It can be solved without the use of awk:
function remove_line($file, $remove) {
    $lines = file($file, FILE_IGNORE_NEW_LINES);
    foreach ($lines as $key => $line) {
        if ($line === $remove) {
            unset($lines[$key]);
        }
    }
    $data = implode(PHP_EOL, $lines);
    file_put_contents($file, $data);
}
Another approach is to read the file line by line until you find a match, then truncate the file to that point, and then append the rest of the lines.
This is also good if you're looking for a substring (ID) in a line and want to replace the old line with a new line.
Code:
$contents = file_get_contents($dir);
$new_contents = "";
if (strpos($contents, $id) !== false) { // if file contains ID
    $contents_array = explode(PHP_EOL, $contents);
    foreach ($contents_array as &$record) { // for each line
        if (strpos($record, $id) !== false) { // if we have found the correct line
            continue; // we've found the line to delete - so don't add it to the new contents
        } else {
            $new_contents .= $record . "\r"; // not the correct line, so we keep it
        }
    }
    file_put_contents($dir, $new_contents); // save the records to the file
    echo json_encode("Successfully updated record!");
} else {
    echo json_encode("failed - user ID " . $id . " doesn't exist!");
}
Example:
input: "123,student"
Old file:
ID,occupation
123,student
124,brick layer
Running the code will change file to:
New file:
ID,occupation
124,brick layer
All answers here have one thing in common: they load the complete file into memory. Here is an implementation that removes one (or more) line(s) without copying the file's contents into a variable.
The idea is to iterate over the file's lines. If a line should be removed, its length is added to $byte_offset. The next line is then moved $byte_offset bytes "upwards". This is done with all following lines. Once all lines are processed, the file's last $byte_offset bytes are removed.
I guess this is faster for bigger files because nothing is copied. And I guess that at some file size the other answers do not work at all, while this one should. But I didn't test it.
Usage:
$file = fopen("path/to/file", "r+"); // "r+", not "a+": append mode forces all writes to the end of the file
// remove lines 1 and 2 and the line containing only "line"
fremove_line($file, 1, 2, "line");
fclose($file);
The code of the fremove_line() function:
/**
 * Remove the `$lines` by either their line number (as an int) or their content
 * (without trailing new-lines).
 *
 * Example:
 * ```php
 * $file = fopen("path/to/file", "r+"); // must be opened readable and writable
 * // remove lines 1 and 2 and the line containing only "line"
 * fremove_line($file, 1, 2, "line");
 * fclose($file);
 * ```
 *
 * @param resource $file The file resource opened by `fopen()`
 * @param int|string ...$lines The one-based line number(s) or the full line
 *                             string(s) to remove; lines that do not exist are ignored
 *
 * @return bool True on success, false on failure
 */
function fremove_line($file, ...$lines): bool
{
    // set the pointer to the start of the file
    if (!rewind($file)) {
        return false;
    }
    // get the stat for the full size to truncate the file later on
    $stat = fstat($file);
    if (!$stat) {
        return false;
    }
    $current_line = 1; // change to 0 for zero-based $lines
    $byte_offset = 0;
    while (($line = fgets($file)) !== false) {
        // the bytes of the line ("number of ASCII chars")
        $line_bytes = strlen($line);
        if ($byte_offset > 0) {
            // move the line upwards: go back by `$byte_offset` plus the line itself
            fseek($file, -1 * ($byte_offset + $line_bytes), SEEK_CUR);
            // move the line upwards, until the `$byte_offset` is reached
            if (!fwrite($file, $line)) {
                return false;
            }
            // set the file pointer to the current line again; `fwrite()` added
            // `$line_bytes` already
            fseek($file, $byte_offset, SEEK_CUR);
        }
        // remove trailing line endings for comparing
        $line_content = preg_replace("~[\n\r]+$~", "", $line);
        if (in_array($current_line, $lines, true) || in_array($line_content, $lines, true)) {
            // the `$current_line` should be removed, so skip this number of bytes
            $byte_offset += $line_bytes;
        }
        // keep track of the current line
        $current_line++;
    }
    // remove the end of the file
    return ftruncate($file, $stat["size"] - $byte_offset);
}
Convert the text to an array, remove the first line, and reconvert to text:
$line = explode("\r\n", $text);
unset($line[0]);
$text = implode("\r\n", $line);
I think the best way to work with files is to treat them like strings:
/**
 * Removes the first found line inside the given file.
 *
 * @param string $line The line content to be searched.
 * @param string $filePath Path of the file to be edited.
 * @param bool $removeOnlyFirstMatch Whether to remove only the first match or
 * all matches.
 * @return bool If any matches found (and removed) or not.
 *
 * @throws \RuntimeException If the file is empty.
 * @throws \RuntimeException When the file cannot be updated.
 */
function removeLineFromFile(
    string $line,
    string $filePath,
    bool $removeOnlyFirstMatch = true
): bool {
    // You can wrap it inside a try-catch block
    $file = new \SplFileObject($filePath, "r");
    // Checks whether the file size is not zero
    $fileSize = $file->getSize();
    if ($fileSize !== 0) {
        // Read the whole file
        $fileContent = $file->fread($fileSize);
    } else {
        // File is empty
        throw new \RuntimeException("File '$filePath' is empty");
    }
    // Free file resources
    $file = null;
    // Divide file content into its lines
    $fileLineByLine = explode(PHP_EOL, $fileContent);
    $found = false;
    foreach ($fileLineByLine as $lineNumber => $thisLine) {
        if ($thisLine === $line) {
            $found = true;
            unset($fileLineByLine[$lineNumber]);
            if ($removeOnlyFirstMatch) {
                break;
            }
        }
    }
    // We don't need to update the file if the line was not found
    if (!$found) {
        return false;
    }
    // Join lines together
    $newFileContent = implode(PHP_EOL, $fileLineByLine);
    // Finally, update the file
    $file = new \SplFileObject($filePath, "w");
    if ($file->fwrite($newFileContent) !== strlen($newFileContent)) {
        throw new \RuntimeException("Could not update the file '$filePath'");
    }
    return true;
}
Here is a brief description of what is being done: Get the whole file content, split the content into its lines (i.e. as an array), find the match(es) and remove them, join all lines together, and save the result back to the file (only if any changes happened).
Let's now use it:
// $dir is your filename, as you mentioned
removeLineFromFile($line, $dir);
Notes:
You can use the fopen() family of functions instead of SplFileObject, but I recommend the object form, as it's exception-based, more robust, and more efficient (in this case, at least).
It's safe to unset() an element of an array being iterated with foreach. (There's a comment here claiming it can lead to unexpected results, but it's wrong: as the example code shows, the value is copied, not a reference, so removing an array element does not affect the iteration.)
$line should not contain trailing newline characters like \n; otherwise, you may perform lots of redundant searches.
Don't use
$fileLineByLine[$lineNumber] = "";
// Or even
$fileLineByLine[$lineNumber] = null;
instead of
unset($fileLineByLine[$lineNumber]);
The reason is that the first approach doesn't remove the line; it just clears it (and an unwanted empty line will remain).
Hope it helps.
Like this:
file_put_contents($filename, str_replace($line . "\r\n", "", file_get_contents($filename)));

How to implement a Yahoo currency cache?

I have a Yahoo currency script on my site, but it takes too much time to load and slows my site down. How can I cache the results and refresh the cache every 3600 seconds?
You need some place to store these results. MySQL is a popular choice, but if the data does not need to stick around or have historical values, using memcache would be easier. Depending on your host, both of these options may be available.
The idea is:
create some sort of cache dir and set the defined cache age
then, at the very beginning of your function, check for a cached value
if it exists, check its age
if within range, use it
if the cache is too old,
fetch live data and write that data into the cache file.
Something like this should do the trick:
define('CACHE_DIR', 'E:/xampp/xampp/htdocs/tmp');
define('CACHE_AGE', 3600);

/**
 * Reads data from the cache, if the cache file exists and is fresh enough.
 * @param string $path the path to the cache file (not dir)
 * @return string|false false if there is no cache file or the cache file is older than CACHE_AGE; the cached data if the file exists and is within CACHE_AGE
 */
function get_cache_value($path) {
    if (file_exists($path)) {
        $now = time();
        $file_age = filemtime($path);
        if (($now - $file_age) < CACHE_AGE) {
            return file_get_contents($path);
        } else {
            return false;
        }
    } else {
        return false;
    }
}

function set_cache_value($path, $value) {
    return file_put_contents($path, $value);
}
function kv_euro() {
    $path = CACHE_DIR . '/euro.txt';
    $kveuro = get_cache_value($path);
    if (false !== $kveuro) {
        echo "\nFROM CACHE\n";
        return round($kveuro, 2);
    } else {
        echo "\nFROM LIVE\n";
        $from = 'EUR'; /* change it to your required currencies */
        $to = 'ALL';
        $url = 'http://finance.yahoo.com/d/quotes.csv?e=.csv&f=sl1d1t1&s=' . $from . $to . '=X';
        $result = '';
        $handle = @fopen($url, 'r');
        if ($handle) {
            $result = fgets($handle, 4096);
            fclose($handle);
        }
        $allData = explode(',', $result); /* Get all the contents to an array */
        $kveuro = $allData[1];
        set_cache_value($path, $kveuro);
        return $kveuro;
    }
}
Also, rather than fgets(), which reads the file line by line and is rather slow, and since you are not manipulating individual lines anyway, you should consider using file_get_contents() instead.
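For example, the fgets() loop could become a single file_get_contents() fetch, with str_getcsv() splitting the one-line quote. The sample payload below is illustrative only (the Yahoo endpoint itself no longer exists):

```php
<?php
// Sketch: parse a one-line CSV quote payload, e.g. the string returned by
// file_get_contents($url), without opening a stream handle.
function parseQuoteRate(string $payload): float
{
    $fields = str_getcsv(trim($payload));
    return (float) $fields[1]; // in this feed format, field 1 held the rate
}

// e.g. $kveuro = parseQuoteRate(file_get_contents($url));
$rate = parseQuoteRate("\"EURALL=X\",121.50,\"6/1/2015\",\"1:00pm\"");
// $rate is 121.5
```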
