This question already has answers here:
How to use return inside a recursive function in PHP
(4 answers)
Closed 9 months ago.
I have the following code and it is not working:
index.php:
include("loadData.php");
$my_var = loadData("myTxt.txt");
var_dump($my_var);
loadData.php:
function loadData($my_file){
    if(file_exists($my_file)){
        $file_contents = file_get_contents($my_file);
        $file_contents = json_decode($file_contents);
    }else{
        // If file doesn't exist, creates the file and runs the function again
        $data_to_insert_into_file = simplexml_load_file("http://site_with_content.com");
        $fp = fopen($my_file, "w");
        fwrite($fp, json_encode($data_to_insert_into_file));
        fclose($fp);
        // Since the file is created I will call the function again
        loadData($my_file);
        return;
    }
    // Do things with the decoded file contents (this is supposed to run after the file is loaded)
    $result = array();
    $result = $file_contents['something'];
    return $result;
}
This works as expected the second time (after the file has been created): I can display the info on index.php. But the first time I run it (before the file is created), it always shows $result as NULL. I can't understand why, since I call the function again...
Any idea?
Thank you
You don't return anything when you do your fetch:
if (...) {
    $file_contents = file_get_contents(...);
    // no return call here
} else {
    ...
    return; // returns nothing, i.e. null
}
return $result; // $result is NEVER set in your code
You should have return $file_contents. Or better yet:
if (...) {
    $result = /* get cached data */;
} else {
    $result = /* fetch/get new data */;
}
return $result;
by using the proper variable names everywhere.
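Putting that together, a corrected loadData could look like the sketch below. It is an illustration with labelled assumptions, not the asker's exact fix: json_decode is given true so the contents come back as an array (otherwise $file_contents['something'] would fail on an object), and 'something' is the asker's placeholder key.

function loadData($my_file){
    if(!file_exists($my_file)){
        // Cache the remote content to the file first
        $data = simplexml_load_file("http://site_with_content.com");
        file_put_contents($my_file, json_encode($data));
    }
    // Decode as an associative array so keys like 'something' work
    $file_contents = json_decode(file_get_contents($my_file), true);
    return $file_contents['something'];
}

Note that the recursion becomes unnecessary: once the file is guaranteed to exist, execution falls through to the normal read path. If you keep the recursive call instead, it must be return loadData($my_file); so the inner result propagates to the original caller.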
This question already has answers here:
Reading very large files in PHP
(8 answers)
Closed 1 year ago.
I have a file with around 100 records for now.
The file has one user in JSON format per line.
E.g.:
{"user_id" : 1,"user_name": "Alex"}
{"user_id" : 2,"user_name": "Bob"}
{"user_id" : 3,"user_name": "Mark"}
Note: This is just a very simple example; I have more complex JSON values per line in the file.
I am reading the file line by line and storing it in an array, which will obviously get big if there are a lot of items in the file.
public function read(string $file) : array
{
    //Open the file in "reading only" mode.
    $fileHandle = fopen($file, "r");

    //If we failed to get a file handle, throw an Exception.
    if ($fileHandle === false) {
        throw new Exception('Could not get file handle for: ' . $file);
    }

    $lines = [];

    //While we haven't reached the end of the file.
    while (!feof($fileHandle)) {
        //Read the current line in.
        $lines[] = json_decode(fgets($fileHandle));
    }

    //Finally, close the file handle.
    fclose($fileHandle);

    return $lines;
}
Next, I'll process this array, take only the parameters I need (some parameters might be further processed), and then export the array to CSV.
public function processInput($users){
    $data = [];
    foreach ($users as $key => $user)
    {
        $data[$key]['user_id'] = $user->user_id;
        $data[$key]['user_name'] = strtoupper($user->user_name);
    }
    // Call export to csv $data.
}
What would be the best way to read the file, in case it is a big one?
I know file_get_contents is not an optimized approach, and that fgets is better.
Is there an even better way, considering a big file has to be read and then written out to CSV?
You need to modify your reader to make it more "lazy" in some sense. For example, consider this:
public function read(string $file, callable $rowProcessor) : void
{
    //Open the file in "reading only" mode.
    $fileHandle = fopen($file, "r");

    //If we failed to get a file handle, throw an Exception.
    if ($fileHandle === false) {
        throw new Exception('Could not get file handle for: ' . $file);
    }

    //While we haven't reached the end of the file.
    while (!feof($fileHandle)) {
        //Read the current line in and hand it to the row processor.
        $line = json_decode(fgets($fileHandle));
        $rowProcessor($line);
    }

    //Finally, close the file handle.
    fclose($fileHandle);
}
Then you will need different code that works with this:

function processAndWriteJson($filename) { //Names are hard
    $writer = fopen('output.csv', 'w');
    read($filename, function ($row) use ($writer) {
        // Do processing of the single row here
        $processedRow = [$row->user_id, strtoupper($row->user_name)];
        fputcsv($writer, $processedRow);
    });
    fclose($writer);
}
If you want to get the same result as before with your read method you can do:

$lines = [];
read($filename, function ($row) use (&$lines) {
    // Capture $lines by reference so the callback can append to it
    $lines[] = $row;
});
It does provide some more flexibility. Unfortunately, it does mean you can only process one line at a time, and scanning up and down the file is harder.
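A generator gives the same lazy, one-row-at-a-time behaviour without callbacks; this is a sketch of that alternative, not part of the original answer:

public function read(string $file) : \Generator
{
    $fileHandle = fopen($file, "r");
    if ($fileHandle === false) {
        throw new Exception('Could not get file handle for: ' . $file);
    }
    //Yield one decoded row at a time; only that row is held in memory.
    while (($line = fgets($fileHandle)) !== false) {
        yield json_decode($line);
    }
    fclose($fileHandle);
}

The caller then iterates it directly, e.g. foreach ($this->read($file) as $row) { fputcsv($writer, [$row->user_id, strtoupper($row->user_name)]); }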
I'm a beginner, so sorry if my question is inappropriate.
I'm trying to create a function that loads and displays a CSV file in PHP, but it keeps giving me an error.
Here is the function:
<?php
function load_csv_file($nom){
    $tableau_asso = array();
    $fichier = fopen($nom, "r");
    while($ligne = fgetcsv($fichier, 1024, ';')){
        array_push($tableau_asso, $ligne);
    }
    fclose($fichier);
    foreach($tableau_asso as $ligne){
        print($ligne[0]);
        print(", ");
        print($ligne[1]);
        print("<br>");
    }
    return $tableau_asso;
}
?>
Then I created another document where I call this function:
<?php
include("library.php");
$nom = 'C:\\MAMP\\htdocs\\MEDAS-PHP\\data.csv';
$mon_tableau = load_csv_file($nom);
?>
When I try to load it, nothing happens. What am I doing wrong?
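One likely culprit is fopen failing silently when the path is wrong or unreadable: fgetcsv then returns false immediately, the loop never runs, and nothing is printed. A defensive sketch of the same function (the added guards are an assumption about the failure mode, not a confirmed diagnosis):

<?php
function load_csv_file($nom){
    $tableau_asso = array();
    // Guard against a missing or unreadable file instead of failing silently
    if(!file_exists($nom)){
        print("File not found: " . $nom);
        return $tableau_asso;
    }
    $fichier = fopen($nom, "r");
    if($fichier === false){
        print("Could not open: " . $nom);
        return $tableau_asso;
    }
    while($ligne = fgetcsv($fichier, 1024, ';')){
        array_push($tableau_asso, $ligne);
    }
    fclose($fichier);
    return $tableau_asso;
}
?>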
I have a very strange problem that I have been unable to find an answer to. I have a PHP function that reads CSV data into an array, returns true if the data was successfully read, and passes the array back by reference.
function ReadCsvDataIntoArray($path, &$headers, &$array, $idFilter = NULL){
    if(file_exists($path)){
        $fh = fopen($path, 'r');
        if($fh){
            $headers = fgetcsv($fh);
            $rowIdx = 0;
            while($row = fgetcsv($fh)){
                $addRow = true;
                if($idFilter != NULL){
                    if(isset($row[0])){
                        if(!in_array($row[0], $idFilter)){
                            $addRow = false;
                        }
                    }
                }
                if($addRow){
                    $colIdx = 0;
                    foreach($row as $val){
                        $array[$rowIdx][$headers[$colIdx]] = $val;
                        $colIdx++;
                    }
                    $rowIdx++;
                }
            }
            fclose($fh);
            return true;
        } else {
            echo "Unable to open file: ".$path;
        }
    } else {
        echo "CSV doesn't exist: ".$path;
    }
    return false;
}
If the function returns true, I then check to make sure the array wasn't passed back as null or empty, then sort the data.
if($this->ReadCsvDataIntoArray($client_library_path, $headers, $CSVdata, $log)){
    if($CSVData != NULL){
        usort($CSVdata, create_function('$a, $b', 'return $a["engagement"] < $b["engagement"];'));
        // Do stuff with the sorted array
    } else {
        echo "CSV data is NULL.\n";
    }
}
I keep getting "CSV data is NULL" from this. If I change the logic to if($CSVData == NULL) or even if(empty($CSVData)), it enters the if statement, attempts to sort the array (which is full, even though the if statement says it's empty), and does stuff with the data.
This is where my second issue comes in. This usort works on my localhost:
usort($CSVdata, function($a, $b) { return $a["scheduled"] < $b["scheduled"]; });
but it doesn't work on the server because of its PHP version, so I have changed it to:
usort($CSVData, create_function('$a, $b', 'return $a["scheduled"] < $b["scheduled"];'));
But with the create_function version of the usort I get this error message
Warning: usort(): The argument should be an array
I am guessing this has something to do with the fact that my full array is somehow being evaluated as empty and null even when it isn't.
You say this:
…and passes the array back by reference variable…
And this:
If the function returns as true I then check to make sure the array
wasn't passed back as null or empty, then sort the data.
Why are you doing this? If you are checking true or false, and then also checking whether the array is null or empty, what value does the boolean add? Just check whether it is null or empty by returning the array itself, like this:
            // return true;
            return $array;
        } else {
            echo "Unable to open file: ".$path;
        }
    } else {
        echo "CSV doesn't exist: ".$path;
    }
    // return false;
    return $array;
And then get rid of the by-reference $array in the function signature:
function ReadCsvDataIntoArray($path, &$headers, $array, $idFilter = NULL){
Your true-or-false logic is probably broken and not worth debugging at all. Why spend time reinventing the wheel when the returned value being null or empty is exactly what you are acting on?
Also, you can then adjust your $CSVData logic to fit the new structure:

$CSVData = $this->ReadCsvDataIntoArray($client_library_path, $headers, $CSVdata, $log);
if(!empty($CSVData)){
    usort($CSVData, create_function('$a, $b', 'return $a["engagement"] < $b["engagement"];'));
    // Do stuff with the sorted array
} else {
    echo "CSV data is empty.\n";
}
Also, your whole return-true logic is based strictly on whether the file itself can be opened:
$fh = fopen($path, 'r');
if($fh){
    // Code removed for structural illustration purposes.
    // ...
    // ...
    fclose($fh);
    return true;
} else {
But you say this (emphasis mine):
I have a PHP function that reads CSV data into an array then returns
true if the data was successfully read…
No. Your logic does not check whether the data was read successfully. It simply returns true if the file itself can be opened, which does not mean the contents of the file are valid. Have you checked the file itself? Or did you check whether this line:
while($row = fgetcsv($fh)){
Actually has values in $row by doing something like this?
echo '<pre>';
print_r($row);
echo '</pre>';
I think your CSV may have line-ending issues, e.g. it was saved on a Windows machine but is now being read on a Mac OS X or Linux machine, or the other way around. Look at this note in the documentation for fgetcsv:
Note: If PHP is not properly recognizing the line endings when reading
files either on or created by a Macintosh computer, enabling the
auto_detect_line_endings run-time configuration option may help
resolve the problem.
So perhaps add this line enabling auto_detect_line_endings to your function like this:
function ReadCsvDataIntoArray($path, &$headers, &$array, $idFilter = NULL){
    ini_set("auto_detect_line_endings", true);
I have several files to parse (with PHP) in order to insert their respective content in different database tables.
First point: the client gave me 6 files; 5 are CSV with values separated by commas. The last one does not come from the same database, and its content is tab-delimited.
I built a FileParser that uses SplFileObject to execute a method on each line of the file-content (basically, create an Entity with each dataset and persist it to the database, with Symfony2 and Doctrine2).
But I cannot manage to parse the tab-delimited text file with SplFileObject; it does not split the content into lines as I expect it to...
// In my controller context
$parser = new MyAmazingFileParser();
$parser->parse($filename, $delimitor, function ($data) use ($em) {
    $e = new Entity();
    $e->setSomething($data[0]);
    // [...]
    $em->persist($e);
});
// In my parser
public function parse($filename, $delimitor = ',', $run = null) {
    if (is_callable($run)) {
        $handle = new SplFileObject($filename);
        $infos = new SplFileInfo($filename);
        if ($infos->getExtension() === 'csv') {
            // Everything is going well here
            $handle->setCsvControl(',');
            $handle->setFlags(SplFileObject::DROP_NEW_LINE + SplFileObject::READ_AHEAD + SplFileObject::SKIP_EMPTY + SplFileObject::READ_CSV);
            foreach (new LimitIterator($handle, 1) as $data) {
                $result = $run($data);
            }
        } else {
            // Why does the Iterator way not work?
            $handle->setCsvControl("\t");
            // I have tried all the possible flag combinations, without success...
            foreach (new LimitIterator($handle, 1) as $data) {
                // It always only gets the first line...
                $result = $run($data);
            }

            // And the old-memory-killing-dirty way works?
            $fd = fopen($filename, 'r');
            $contents = fread($fd, filesize($filename));
            foreach (explode("\t", $contents) as $line) {
                // Gets every line as I want... But it's dirty and memory-expensive!
                $result = $run($line);
            }
        }
    }
}
It is probably related to the horrible formatting of my client's file, but after a long discussion with them, they really cannot provide another format, for some acceptable reasons (constraints on their side), unfortunately.
The file is currently 49,459 lines long, so I really think memory matters at this step; I have to make the SplFileObject way work, but I do not know how.
An extract of the file can be found here :
Data-extract-hosted
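The fact that explode("\t", $contents) yields the rows suggests the records in this file are separated by tabs rather than newlines, which would also explain why the iterator only ever sees one giant "line". Assuming that is the case (an assumption; the hosted extract would confirm it), stream_get_line can read up to an arbitrary delimiter without loading the whole file into memory:

// In my parser: a sketch for tab-separated records (assumes tab is the
// record separator, and omits the CSV branch for brevity)
public function parse($filename, $delimitor = ',', $run = null) {
    if (is_callable($run)) {
        $fd = fopen($filename, 'r');
        // Read one record at a time, up to the tab delimiter, so only
        // a single record is held in memory at once.
        while (($line = stream_get_line($fd, 65536, "\t")) !== false) {
            $result = $run($line);
        }
        fclose($fd);
    }
}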
This question already has answers here:
Closed 10 years ago.
Possible Duplicate:
Execute a PHP file, and return the result as a string
PHP capture print/require output in variable
I am trying to get the contents of an include to a string. Is that possible?
For example, if I have a test.php file:
echo 'a is equal to '.$a;
I need a function, say include_to_string, that includes test.php and returns what it would output, as a string.
Something like:
$a = 4;
$string = include_to_string(test.php); // $string = "a is equal to 4"
ob_start();
include 'test.php';
$string = ob_get_clean();
I think this is what you want. See output buffering.
ob_start();
include($file);
$contents = ob_get_contents(); // data is now in here
ob_end_clean();
You can do this with output buffering:
function include2string($file) {
    ob_start();
    include($file);
    return ob_get_clean();
}
#DaveRandom points out (correctly) that the issue with wrapping this in a function is that your script ($file) will not have access to variables defined globally. That might not be an issue for many scripts included dynamically, but if it is an issue for you, then this technique can be used (as others have shown) outside of a function wrapper.
Importing variables
One thing you can do is to add a set of data you would like to expose to your script as variables. Think of it like passing data to a template.
function include2string($file, array $vars = array()) {
    extract($vars);
    ob_start();
    include($file);
    return ob_get_clean();
}
You would call it this way:
include2string('foo.php', array('key' => 'value', 'variableName' => $variableName));
and now $key and $variableName would be visible inside your foo.php file.
You could also provide a list of global variables to "import" for your script if that seems clearer to you.
function include2string($file, array $import = array()) {
    extract(array_intersect_key($GLOBALS, array_fill_keys($import, 1)));
    ob_start();
    include($file);
    return ob_get_clean();
}
And you would call it, providing a list of the globals you would like exposed to the script:
$foo='bar';
$boo='far';
include2string('foo.php', array('foo'));
foo.php should be able to see $foo, but not $boo.
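As a quick illustration, a hypothetical foo.php (not from the original answer) could be:

// foo.php
echo 'foo is ' . $foo;                                // sees the imported global: "foo is bar"
echo isset($boo) ? $boo : ' and boo is not defined';  // $boo was not imported

Then echo include2string('foo.php', array('foo')); prints: foo is bar and boo is not defined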
You could also use the approach below, but I recommend the answers above. (Note that file_get_contents returns the raw file contents without executing the PHP in them.)
// Option 1: read the file contents directly
$File = 'filepath';
$Content = file_get_contents($File);
echo $Content;

// Option 2: wrap it with an existence check
function FileGetContents($File){
    if(!file_exists($File)){
        return null;
    }
    $Content = file_get_contents($File);
    return $Content;
}

$FileContent = FileGetContents('filepath');
echo $FileContent;
Function in the PHP manual: file_get_contents