PHP: Strange numbers at the end of JSON output

I am reading and saving weather JSON data from the forecast.io API. Because I am using the free API, which has a limit of 1000 requests per day, I only request the API every 10 minutes. I save the update time as a timestamp and then use this timestamp to check whether 10 minutes have elapsed or not. However, when I read the JSON file and echo it, a strange number like '18706' or '22659' comes out at the end. I have no idea where it is coming from. How do I solve this problem?
Result in browser:
....madis-stations":["UTTT"],"units":"si"}}22659
PHP:
<?php
$t = time();
$last_updated_timestamp = file_get_contents("last_updated_timestamp.txt");
$delta = ($t - $last_updated_timestamp) / 60;

if ($delta > 10) {
    $json = file_get_contents('https://api.forecast.io/forecast/MY_API_KEY/41.2667,69.2167?units=si&lang=ru');
    $obj = json_decode($json);
    echo $obj->access_token;

    $fp = fopen('tw.json', 'w');
    fwrite($fp, json_encode($obj));
    fclose($fp);

    $fp2 = fopen('last_updated_timestamp.txt', 'w');
    fwrite($fp2, $t);
    fclose($fp2);
}

echo readfile("tw.json");
?>

Change:
echo readfile("tw.json");
to just:
readfile("tw.json");
readfile writes the contents of the file to the output buffer, and then returns the number of bytes that it wrote. You're then echoing that number of bytes.
It seems like you confused readfile with file_get_contents, which returns the contents of the file as a string.
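To make the difference concrete, here is a minimal sketch using the tw.json file from the question:
<?php
// Wrong: readfile() prints the file itself AND returns the number of
// bytes it wrote, so the echo appends that count (e.g. 22659):
echo readfile("tw.json");

// Right: either let readfile() do the printing and ignore its return value...
readfile("tw.json");

// ...or read the file into a string and echo that:
echo file_get_contents("tw.json");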

Remove the echo before readfile. readfile already prints the content of the file. The return value of readfile is the number of bytes read, which is what you are echoing.

Related

PHP File Writing (fwrite / file_put_contents) speed/optimization

So, I have a database with big data. The data in use is currently about 2.6 GB.
All the data needs to be written to a text file for later use in other scripts.
The data is limited per file and split into multiple parts: 100 results per file (around 37 MB each file). That's about 71 files.
The data is JSON that is serialized and then encrypted with openssl.
The data is correctly written to the files, until the max execution time of 240 seconds is reached. That's after about 20 files...
Well, I can just extend that time, but that's not the problem.
The problem is the following:
Writing file 1-6: +/- 5 seconds
Writing file 7-8: +/- 7 seconds
Writing file 9-11: +/- 12 seconds
Writing file 12-14: +/- 17 seconds
Writing file 14-16: +/- 20 seconds
Writing file 16-18: +/- 23 seconds
Writing file 19-20: +/- 27 seconds
Note: the times are per file.
In other words, with every file I write, the writing time per file goes up significantly, which of course makes the script slow.
The structure of the script is roughly this:
$needed_files = count needed files/parts
for ($part = 1; $part <= $needed_files; $part++) { // loop through parts
    $query > mysqli select data
    $data > json_encode > serialize > openssl_encrypt
    file_put_contents($filename.$part, $data, LOCK_EX);
}
WORKING CODE AFTER HELP
$notchDetails = mysqli_query($conn, "SELECT * FROM notches WHERE projectid = ".$projectid."");

$rec_count = 0;
$limit = 100;
$part = 1;

while ($notch = mysqli_fetch_assoc($notchDetails)) {
    $data1[] = $notch;
    $rec_count++;

    if ($rec_count >= $limit) {
        // batch is full: encode, encrypt and flush it to its own part file
        $data = json_encode($data1);
        $data = openssl_encrypt(bin2hex($data), "aes128", $pass, false, $iv);
        $filename = $mainfolder."/".$projectfolder."/".$subfolder."/".$fname.".part".$part."".$fext;
        file_put_contents($filename, $data, LOCK_EX);
        $part++;
        $rec_count = 0;
        $data = $data1 = "";
    }
}

// write out whatever is left after the last full batch
if ($data1 != "") {
    $data = json_encode($data1);
    $data = openssl_encrypt(bin2hex($data), "aes128", $pass, false, $iv);
    $filename = $mainfolder."/".$projectfolder."/".$subfolder."/".$fname.".part".$part."".$fext;
    file_put_contents($filename, $data, LOCK_EX);
}

mysqli_free_result($notchDetails);
Personally I would have coded this as a single SELECT with no LIMIT, and then written the output files from within the single while-fetch loop, based on a $rec_per_file = ?; counter.
Excuse the cryptic code, you didn't give us much of a clue.
<?php
//ini_set('max_execution_time', 600); // only use if you have to

// $conn, $pass and $iv are assumed to exist, as in the question's code
$filename = 'something';
$filename_suffix = 1;
$rec_per_file = 100;

$result = mysqli_query($conn, "SELECT ....");

$rec_count = 0;
$data = array();

while ($row = mysqli_fetch_assoc($result)) {
    $data[] = openssl_encrypt(serialize($row), "aes128", $pass, false, $iv);
    $rec_count++;

    if ($rec_count >= $rec_per_file) {
        $json_string = json_encode($data);
        file_put_contents($filename.$filename_suffix, $json_string, LOCK_EX);

        $filename_suffix++; // inc the suffix
        $rec_count = 0;     // reset counter
        $data = array();    // clear data

        // add 30 seconds to the remaining max_execution_time,
        // or at least a number >= the time you expect this
        // while loop to need to get back to this if statement
        set_time_limit(30);
    }
}

// catch the last few rows
$json_string = json_encode($data);
file_put_contents($filename.$filename_suffix, $json_string, LOCK_EX);
Also, I am not sure why you would want both serialize() and json_encode(); one encoding is enough to round-trip the data.
I had a thought, based on your comment about execution time. If you place a set_time_limit(seconds) inside the if inside the while loop, it might be cleaner, and you would not have to set ini_set('max_execution_time', 600) to a very large number, which, if you have a real error in here, may cause PHP to continue processing for a long time before kicking the script out.
From the manual:
Set the number of seconds a script is allowed to run. If this is reached, the script returns a fatal error. The default limit is 30 seconds or, if it exists, the max_execution_time value defined in the php.ini.
When called, set_time_limit() restarts the timeout counter from zero. In other words, if the timeout is the default 30 seconds, and 25 seconds into script execution a call such as set_time_limit(20) is made, the script will run for a total of 45 seconds before timing out.

PHP counter wraps/overflows after only 1 byte of data, counter resets (race condition)

I know this is a simple question, but I downloaded a PHP counter script from http://www.stevedawson.com/scripts/text-counter.php, which is the first result on Google for PHP counter scripts, and it worked great as expected.
I tried to see if it messes up by holding refresh in my browser: after 255 requests it overflowed back to 0. How would I fix this script? I think the culprit is filesize(), which probably gets only 1 byte of data, but that doesn't make sense, since 255 is actually 3 bytes of data, right? It is saved in plain-text format, after all.
Why would it overflow? This is PHP; it shouldn't overflow, it should just automatically mutate into a bigger datatype.
<?php
$orderCountFile = "order_num_count.txt";

if (file_exists($orderCountFile)) {
    $fil = fopen($orderCountFile, r);
    $dat = fread($fil, filesize($orderCountFile));
    echo $dat+1;
    fclose($fil);
    $fil = fopen($orderCountFile, w);
    fwrite($fil, $dat+1);
} else {
    $fil = fopen($orderCountFile, w);
    fwrite($fil, 1);
    echo '1';
    fclose($fil);
}
?>
Yeah, I started to remake the script for another purpose: I want to use it to keep track of order numbers for my website.
As a fix, I think I have to recast $dat to a bigger integer type, but can you even cast in PHP?
Also, those r and w are supposed to be strings, I think, but they are used as (undefined) constants; it doesn't seem to cause any trouble as far as I know.
Use file_get_contents and file_put_contents instead. You still have to consider that there is a hard limit for that counter as well (see PHP_INT_MAX), but it's significantly higher.
<?php
$file = "counter.txt";
$counter = 0;

if (file_exists($file)) {
    $counter = file_get_contents($file);
}

$counter = $counter + 1;
file_put_contents($file, $counter);
echo $counter;
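Since the title mentions a race condition, note that two overlapping requests can still both read the same value and write back the same incremented number. A minimal sketch of one common way to serialize access, using an exclusive flock (the file name is the one from the snippet above):
<?php
$counter = 0;
$fp = fopen("counter.txt", "c+"); // create if missing, don't truncate
if (flock($fp, LOCK_EX)) {        // exclusive lock: one writer at a time
    $counter = (int) stream_get_contents($fp);
    $counter++;
    ftruncate($fp, 0);            // wipe the old value
    rewind($fp);
    fwrite($fp, (string) $counter);
    fflush($fp);                  // push the write out before unlocking
    flock($fp, LOCK_UN);
}
fclose($fp);
echo $counter;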

How do you get the last few lines of a file via SFTP in PHP?

I need to log in to a production server, retrieve a file, and update my database with the data in this file. Since this is a production server, I don't want to fetch the whole file every 5 minutes, as the file may be huge and this could impact the server. I need to get the last 30 lines of this file at 5 minute intervals, with as little impact as possible.
The following is my current code, I would appreciate any insight to how best accomplish this:
<?php
$user = "id";
$pass = "passed";
$c = curl_init("sftp://$user:$pass@server1.example.net/opt/vmstat_server1");
curl_setopt($c, CURLOPT_PROTOCOLS, CURLPROTO_SFTP);
curl_setopt($c, CURLOPT_RETURNTRANSFER, true);
$data = curl_exec($c);
curl_close($c);
$data = explode("\n", $data);
?>
Marc B is wrong. SFTP is perfectly capable of partial file transfers. Here's an example of how to do what you want with phpseclib, a pure PHP SFTP implementation:
<?php
include('Net/SFTP.php');

$sftp = new Net_SFTP('www.domain.tld');
if (!$sftp->login('username', 'password')) {
    exit('Login Failed');
}

$size = $sftp->size('filename.remote');
// outputs the last ten bytes of filename.remote
echo $sftp->get('filename.remote', false, $size - 10);
?>
In fact, I'd recommend an approach like this anyway, since some SFTP servers don't let you run commands via the system shell. Plus, SFTP can work against Windows SFTP servers, whereas tail is unlikely to be available there even if you do have shell access. Overall, it's a lot more portable a solution.
If you want to get the last x lines of a file, you could loop repeatedly, reading however many bytes each time, until you have encountered x newline characters: get the last 10 bytes, then the next-to-last 10 bytes, then the ten bytes before those ten bytes, and so on.
An answer by @Sammitch to a duplicate question, Get last 15 lines from a large file in SFTP with phpseclib:
The following should result in a blob of text with at least 15 lines from the end of the file that you can then process further with your existing logic. You may want to tweak some of the logic depending on if your file ends with a trailing newline, etc.
$filename = './file.txt';
$filesize = $sftp->size($filename);
$buffer_size = 4096;
$offset = $filesize; // start at the end
$result = '';
$lines = 0;

while ($offset > 0 && $lines < 15) {
    // work backwards in $buffer_size chunks
    // (the final chunk at the start of the file may be smaller)
    if ($offset < $buffer_size) {
        $length = $offset;
        $offset = 0;
    } else {
        $offset -= $buffer_size;
        $length = $buffer_size;
    }
    $buffer = $sftp->get($filename, false, $offset, $length);
    // count the number of newlines as we go
    $lines += substr_count($buffer, "\n");
    $result = $buffer . $result;
}
SFTP is not capable of partial file transfers. You might have better luck using a full-blown SSH connection and a remote 'tail' operation to get the last lines of the file, e.g.
$lines = shell_exec("ssh user@remote.host 'tail -30 the_file'");
Of course, you might want something a little more robust that can handle things like network glitches that prevent ssh from getting through, but as a basic starting point, this should do the trick.
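If you can't shell out to a local ssh binary, a rough equivalent of this tail approach using phpseclib's SSH component might look like the sketch below (host and credentials are placeholders):
<?php
include('Net/SSH2.php');

$ssh = new Net_SSH2('remote.host');
if (!$ssh->login('user', 'password')) {
    exit('Login Failed');
}

// run tail on the remote side and collect its output
$lines = $ssh->exec('tail -30 the_file');
echo $lines;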

How do I choose a specific line from a file?

I'm trying to make (as immature as this sounds) an online application that prints random insults. I have a list that is 140 lines long, and I would like to print one entire line at random. There is mt_rand(min, max), but when I use it alongside fgets($file, $length), I don't get the line at that random number; I get that many characters. Any help? I have all the code so far below.
<?php
$file = fopen("Insults.txt","r");
echo fgets($file, (mt_rand(1, 140)));
fclose($file);
?>
Try this, it's an easier version of what you want to do:
$file = file('Insults.txt');
echo $file[array_rand($file)];
$lines = file("Insults.txt");
echo $lines[array_rand($lines)];
Or within a function:
function random_line($filename) {
$lines = file($filename) ;
return $lines[array_rand($lines)] ;
}
$insult = random_line("Insults.txt");
echo $insult;
Use file() for this. It returns an array with the lines of the file:
$lines = file($filename);
$line = mt_rand(0, count($lines) - 1); // array indexes are zero-based
echo $lines[$line];
First: you've got fgets() completely wrong; please refer to the manual about the meaning of the second parameter (it is a maximum read length in bytes, plainly not the line number you think it is).
Second: the file() solution will work... until the file size exceeds a certain point and exhausts the available PHP memory. Keep in mind: file() reads the complete file into an array.
You might be better off reading line by line, even if that means you'll have to discard most of the read data.
$fp = fopen(...);
$line = 129;

// read (and ignore) the first 128 lines in the file
$i = 1;
while ($i < $line) {
    fgets($fp);
    $i++;
}

// at last: this is the line we wanted
$theLine = fgets($fp);
(not tested!)
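For what it's worth, SPL can do the line skipping for you; a minimal sketch of the same idea, using the Insults.txt file from the question:
<?php
$file = new SplFileObject("Insults.txt");
$file->seek(128);      // seek() takes a zero-based line number: line 129
echo $file->current(); // print that line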

check size of memory before exporting it to .php file

I wrote a PHP script that indexes the contents of my site and stores them in multi-dimensional arrays. Then I export what is in memory, that array, to a .php file to include and access later.
How can I check the size of the memory, or the size of the memory associated with the variable I'm exporting, before I save it to a file?
For example: if it is less than 1 GB, export; else, do nothing and erase what's in memory associated with the variable $x.
How could I do this?
Try with:
memory_get_usage()
Call this function before and after creating your array, and take the difference.
echo "At the start we're using (in bytes): ",
    memory_get_usage(), "\n<br>";

... array ...

echo "After, we're using (in bytes): ",
    memory_get_usage(), "\n<br>";

or

$before = memory_get_usage();

... array ...

$after = memory_get_usage();
echo round(($after - $before) / 1024 / 1024, 2) . " MB\n";
Try with this (note: it checks the size of a file on disk, in bytes):
$filename = 'somefile.txt';
$filesize = filesize($filename); // in bytes

if ($filesize < 1073741824) { // 1 GB = 1024 * 1024 * 1024 bytes
    // do export
} else {
    echo 'Exceeded 1GB';
}
How will you save the array to the file? If you are going to serialize the array, then you may do that first (if you have enough RAM) and then check the string length of the serialized variable before saving it to a file.
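A minimal sketch of that idea, assuming the array lives in $x, the 1 GB budget from the question, and a hypothetical output file name:
<?php
$serialized = serialize($x);

if (strlen($serialized) < 1073741824) { // under 1 GB
    // write a .php file that rebuilds the array when included
    file_put_contents(
        'index_cache.php',
        '<?php return unserialize(' . var_export($serialized, true) . ');'
    );
} else {
    unset($x); // too big: drop it, as the question suggests
}
Loading it back later is then just $x = include 'index_cache.php';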
