JSON output to file readable again in PHP

I want to be able to transfer a PHP array from one server to another via FTP in the form of a file.
The receiving server needs to be able to open the file, read its contents, and use the array it contains.
I've thought about going about this in two ways. The first is writing a PHP file on server 1 containing the array as PHP code, then simply including that file on server 2. However, writing such a file gets tricky when the depth of the array is unknown.
So I thought about writing the array to the file JSON-encoded, but I don't know how the second server could open and read that data.
Could I simply do:
$jsonArray= json_encode($masterArray);
$fh = fopen('thefile.txt' , 'w');
fwrite($fh, $jsonArray);
fclose($fh);
Then on the other server open the data into a variable:
$data = json_decode( include('thefile.txt') );
Has anyone had any experience of this before?

On the first server, connect to the second server over FTP and write the JSON into a file there:
$jsonArray = json_encode($masterArray);
$stream = stream_context_create(array('ftp' => array('overwrite' => true)));
file_put_contents('ftp://user:pass@host/folder/thefile.txt', $jsonArray, 0, $stream);
Then use file_get_contents() on the second server:
$data = json_decode( file_get_contents('/path/to/folder/thefile.txt') );

If you're only going to be interested in reading the file using PHP, have you thought about using serialize() and unserialize()?
See http://php.net/manual/en/function.serialize.php
It's also probably faster than json_encode() / json_decode() (see http://php.net/manual/en/function.serialize.php#103761).
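For example, a minimal serialize() round trip (a sketch, assuming the same file transfer as in the question):
$masterArray = array('Test', array('nested' => 'values'));
// serialize() preserves PHP types exactly, including nested arrays and string keys
file_put_contents('thefile.txt', serialize($masterArray));
// ... transfer the file, then on the second server:
$masterArray = unserialize(file_get_contents('thefile.txt'));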

The PHP function you're looking for is file_get_contents():
$masterArray = array('Test','Test2','Test3');
$jsonArray= json_encode($masterArray);
$fh = fopen('thefile.txt' , 'w');
fwrite($fh, $jsonArray);
fclose($fh);
Then on the other server:
$masterArray = json_decode( file_get_contents('thefile.txt') );
var_dump($masterArray);

To "transfer" the array between servers, using a file as medium, you found a nice solution by using json_encode and json_decode. The serialize and unserialize functions would perform the same goal nicely.
$my_array = array('contents', 'et cetera');
$serialized = serialize($my_array);
$json_encoded = json_encode($my_array);
// here you send the files to the other server (you said you already know how to do this),
// for example (the *_destination variables are placeholder file paths):
file_put_contents($serialized_destination, $serialized);
file_put_contents($json_encoded_destination, $json_encoded);
In the receiving server, you just need to read the file contents and apply the corresponding "parse" function:
$serialized = file_get_contents($serialized_destination);
$json_encoded = file_get_contents($json_encoded_destination);
$my_array1 = unserialize($serialized);
$my_array2 = json_decode($json_encoded);
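Note that json_decode() returns stdClass objects for JSON objects by default; if the original array had string keys, pass true as the second argument to get an associative array back:
$my_array2 = json_decode($json_encoded, true); // decode to associative array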

Related

MAMP strange behaviour: PHP reading an external file over http:// is very slow, but over https:// is quick

I have a simple PHP script that reads a remote file line by line and then JSON-decodes it. On the production server all works OK, but on my local machine (MAMP stack, OS X) PHP hangs. It is very slow, and takes more than 2 minutes to produce the JSON. I think it's the json_decode() that is freezing. Why only on MAMP?
I think it's stuck in the while loop, because I can't echo the final $str variable that should hold all the lines.
In case you are wondering why I need to read the file line by line, it's because in the real scenario the remote JSON file is a 40 MB text file, and this is the only approach that has given me acceptable performance. Any suggestions?
Is there a configuration in php.ini that could help solve this?
// The path to the JSON file
$fileName = 'http://www.xxxx.xxx/response-single.json';
// Open the file in "read only" mode.
$fileHandle = fopen($fileName, "r");
// If we failed to get a file handle, throw an Exception.
if ($fileHandle === false) {
    error_log("error getting file handle");
    throw new Exception('Could not get file handle for: ' . $fileName);
}
// While we haven't reached the end of the file, read each line in.
$str = "";
while (!feof($fileHandle)) {
    $line = fgets($fileHandle);
    $str .= $line;
}
// Finally, close the file handle.
fclose($fileHandle);
$json = json_decode($str, true); // decode the JSON into an associative array
Thanks for your time.
I found the cause: it was the URL protocol.
With
$filename = 'http://www.yyy/response.json';
it freezes the server for 1 to 2 minutes.
I moved the file to another server reachable over the https protocol, used
$filename = 'https://www.yyy/response.json';
and it works.
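If you have to stay on http://, a stream context timeout at least makes the hang fail fast instead of freezing for minutes (a sketch; the 10-second value is an arbitrary choice):
// give up on the HTTP stream after 10 seconds instead of hanging
$context = stream_context_create(array('http' => array('timeout' => 10)));
$fileHandle = fopen($fileName, "r", false, $context);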

php - Pass contents of file to function that's expecting a filename

I have the contents of a file in a string. I need to pass this file to a function where the function is expecting the parameter to be the name of the file, not the contents. The obvious and probably simplest way to do this would be to write the contents to a temp file, then pass that file name to the function, and unlink the file once I'm finished.
However, I'm looking for a solution that doesn't involve writing the file out to the file system and then reading it back in. I've had a need for this in multiple cases, so I'm not looking for a work-around to a specific function, but more of a generic method that will work for any function expecting a file name (like file_get_contents(), for instance).
Here are some thoughts, but I'm not sure how to pursue them yet:
Is it possible to write the contents somewhere in memory, and then pass that to the function as a filename? Perhaps something using php://memory.
Is it possible to write the contents to a pipe, then pass the name of the pipe to the function?
I did a short proof of concept with php://memory as follows, but no luck:
$data = "This is some file data.\n";
file_put_contents( 'php://memory', $data );
echo file_get_contents( 'php://memory' );
I'd be interested in knowing of good ways to address this. Googling hasn't come up with anything for me.
It mainly depends on what the target function does with the file name. If you're lucky, you can register your own stream wrapper:
stream_wrapper_register('demo', 'DemoStream');
$data = "This is some file data.\n";
$filename = 'demo://foo';
file_put_contents($filename, $data );
echo file_get_contents($filename);
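The answer leaves DemoStream undefined; a minimal in-memory wrapper class (my sketch, implementing only what these two calls need) could look like this:
class DemoStream {
    private static $data = array(); // shared storage, keyed by URL
    private $path;
    private $pos = 0;

    public function stream_open($path, $mode, $options, &$opened_path) {
        $this->path = $path;
        if ($mode[0] === 'w') {
            self::$data[$path] = ''; // truncate on write
        } elseif (!isset(self::$data[$path])) {
            return false; // reading a URL that was never written
        }
        return true;
    }

    public function stream_write($data) {
        self::$data[$this->path] .= $data;
        return strlen($data);
    }

    public function stream_read($count) {
        $chunk = substr(self::$data[$this->path], $this->pos, $count);
        $this->pos += strlen($chunk);
        return $chunk;
    }

    public function stream_eof() {
        return $this->pos >= strlen(self::$data[$this->path]);
    }

    public function stream_flush() { return true; }
    public function stream_close() {}
    public function stream_stat() { return array(); }
}
Note the data lives in a static property, so it survives between the file_put_contents() and file_get_contents() calls, unlike php://memory.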
Why not use a file in the /tmp/ directory? Like this:
<?php
$filename = '/tmp/mytmpfile';
$data = "This is some file.\n";
file_put_contents($filename, $data);
$result = file_get_contents($filename);
var_dump($result);
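If several requests can run at once, a fixed name like /tmp/mytmpfile can collide; tempnam() hands you a unique path instead (a small sketch):
$filename = tempnam(sys_get_temp_dir(), 'mytmp'); // create a uniquely named temp file
file_put_contents($filename, $data);
$result = file_get_contents($filename);
unlink($filename); // clean up when finished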
Well, as you say you don't want to use a file, you shouldn't use file_get_contents().
But you can achieve the same result by using stream_get_contents(), like this:
<?php
$data = "This is some file data.\n";
$handle = fopen('php://memory', 'r+'); // open an r/w handle to memory
fputs($handle, $data); // write the data
rewind($handle); // rewind the pointer
echo stream_get_contents($handle); // retrieve the contents
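A related option is php://temp, which behaves like php://memory but transparently spills to a temporary file once the data grows past a threshold (2 MB by default), making it safer for large payloads:
$handle = fopen('php://temp', 'r+'); // memory-backed until it gets large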

Parsing a Zipped (GZ) JSON file in PHP

With help from the guys on Stack Overflow I can now parse JSON from a file and save a value into a database.
However, the file I intend to read from is actually a massive 2 GB file. My web server will not hold this file, but it will hold a gzipped (.gz) version of it, which is about 80 MB.
I believe there is a way to parse JSON from a gzipped file. Can anybody help?
I have found the function below, which I believe will do this, but I don't know how to link it to my code:
private function uncompressFile($srcName, $dstName) {
    $sfp = gzopen($srcName, "rb");
    $fp = fopen($dstName, "w");
    while ($string = gzread($sfp, 4096)) {
        fwrite($fp, $string, strlen($string));
    }
    gzclose($sfp);
    fclose($fp);
}
My current PHP code is below and works. It reads a basic small file, JSON-decodes it (the JSON is a series of separate lines, hence the need for FILE_IGNORE_NEW_LINES), and then takes a value and saves it to a MySQL database.
However, I believe I need to somehow combine these two bits of code so I can read a gzipped file without exceeding my 100 MB storage on my web server.
$file="CIF_ALL_UPDATE_DAILY_toc-update-sun";
$trains = file($json_filename, FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES);
foreach ($trains as $train) {
$json=json_decode($train,true);
foreach ($json as $key => $value) {
$input=$value['main_train_uid'];
$q="INSERT INTO railstptest (main_train_uid) VALUES ('$input')";
$r=mysqli_query($mysql_link,$q);
}
}
}
if (is_null($json)) {
die("Json decoding failed with error: ". json_last_error());
}
mysqli_close($mysql_link);
Many Thanks
EDIT
Here is a short snippet of the JSON. There is a series of these lines.
I would only want to get a few key values, for example the values G90491 and P20328; a lot of the info I would not need.
{"JsonAssociationV1":{"transaction_type":"Delete","main_train_uid":"G90491","assoc_train_uid":"G90525","assoc_start_date":"2013-09-07T00:00:00Z","location":"EDINBUR","base_location_suffix":null,"diagram_type":"T","CIF_stp_indicator":"O"}}
{"JsonAssociationV1":{"transaction_type":"Delete","main_train_uid":"P20328","assoc_train_uid":"P21318","assoc_start_date":"2013-08-23T00:00:00Z","location":"MARYLBN","base_location_suffix":null,"diagram_type":"T","CIF_stp_indicator":"C"}}
It may be possible to do stream extraction of the file and then use a stream JSON parser. ZipArchive has getStream, and someone created a streaming JSON parser for PHP.
You will have to write a listener that inserts the database values as they are found and discards unnecessary JSON so it does not consume memory.
$zip = new ZipArchive;
$zip->open("file.zip");
$parser = new JsonStreamingParser_Parser(
    $zip->getStream("file.json"),
    new DB_Value_Inserter
);
$parser->parse();
Based on your question, you're working with gzip instead of zip. To get the stream you can use
fopen("compress.zlib://path/to/file.json", "r");
It's difficult to write the DB_Value_Inserter since you haven't provided the format of the JSON you need, but it seems like you can probably just override the Listener::value method and just write the string values you receive.
PHP has compression wrappers that can help with opening and reading lines from compressed files. One is for reading gzip files:
$gzipFile = 'CIF_ALL_UPDATE_DAILY_toc-update-sun.gz';
$trains = new SplFileObject("compress.zlib://{$gzipFile}", 'r');
$trains->setFlags(
    SplFileObject::DROP_NEW_LINE | SplFileObject::READ_AHEAD | SplFileObject::SKIP_EMPTY
);
Because SplFileObject is iterable, you can keep your outer foreach loop the way it is. Of course, fgets() remains an alternative to using SplFileObject.
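So, sketching it against the code from the question, the loop body is untouched:
foreach ($trains as $train) {
    $json = json_decode($train, true);
    // ... same per-line handling and INSERT as in the original code
}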

How to read the contents of a zipped URL using php

I am using PHP for one of my projects. I have a URL which serves the XML feed of a specific site. The URL is in the format www.sitefeed.com/somekey&&gZipCompress=yes. I want to read the contents of this file without downloading it onto my server.
I tried compress.zlib: in front of the URL but it returned an empty array.
Thanks
If you have the PHP setting allow_url_fopen enabled, you can use:
$lrc = gzopen($the_link, "r");
$text = "";
while (!gzeof($lrc)) {
    $text .= gzread($lrc, 1024);
}
gzclose($lrc);
If not, you can use file_get_contents(), or even cURL.
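If the endpoint returns the gzip bytes as a plain response body, another option (assuming PHP 5.4+ for gzdecode(), and that allow_url_fopen is on) is:
$raw = file_get_contents($the_link); // fetch the compressed bytes
$text = gzdecode($raw);              // inflate them in memory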

Best way to download a file in PHP

Which would be the best way to download a file from another domain in PHP?
e.g. a zip file.
The easiest is file_get_contents(); a more advanced way would be cURL, for example. You can store the data to your hard drive with file_put_contents().
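A minimal cURL sketch of that (the URL and file names are just placeholders):
$ch = curl_init('http://www.example.com/file.zip');
$fp = fopen('local_name_of_file.zip', 'w');
curl_setopt($ch, CURLOPT_FILE, $fp);            // stream the response straight to disk
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true); // follow redirects
curl_exec($ch);
curl_close($ch);
fclose($fp);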
Normally the fopen functions work for remote files too, so you could do the following to stay within the memory limit (though it's slower than file_get_contents()):
<?php
$remote = fopen("http://www.example.com/file.zip", "rb");
$local = fopen("local_name_of_file.zip", 'w');
while (!feof($remote)) {
    $content = fread($remote, 8192);
    fwrite($local, $content);
}
fclose($local);
fclose($remote);
?>
copied from here: http://www.php.net/fread
You can do this with a single line of code:
copy($url, $destination); // source may be a remote URL, destination a local path
This function returns TRUE on success and FALSE on failure.
