So, I have a database with big data. The set to export is currently about 2.6 GB.
All of the data needs to be written to a text file for later use in other scripts.
The data is limited per file and split into multiple parts: 100 results per file (around 37 MB per file). That's about 71 files.
The data is JSON that is serialized and then encrypted with OpenSSL.
The data is correctly written to the files, until the max execution time is reached after 240 seconds. That's after about 20 files...
Well, I can just extend that time, but that's not the problem.
The problem is the following:
Writing file 1-6: +/- 5 seconds
Writing file 7-8: +/- 7 seconds
Writing file 9-11: +/- 12 seconds
Writing file 12-14: +/- 17 seconds
Writing file 14-16: +/- 20 seconds
Writing file 16-18: +/- 23 seconds
Writing file 19-20: +/- 27 seconds
Note: the times shown are per file.
In other words, with every file I'm writing, the time per file goes up significantly, which of course makes the script slow.
The structure of the script is roughly like this:
$needed_files = count needed files/parts
for ($part = 1; $part <= $needed_files; $part++) { // Loop through parts
    $query > mysqli select data
    $data > json_encode > serialize > openssl_encrypt
    file_put_contents($filename.$part, $data, LOCK_EX);
}
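As a rough, runnable reading of that structure (a sketch only: it assumes the per-part query fetches its 100 rows with LIMIT/OFFSET, and borrows the table name and the $pass/$iv variables from the working code below; the actual original code may have differed):

<?php
// Sketch of the assumed per-part approach, not the actual original code
$rows_per_file = 100;
$total = mysqli_fetch_row(mysqli_query($conn, "SELECT COUNT(*) FROM notches WHERE projectid = ".(int)$projectid))[0];
$needed_files = (int) ceil($total / $rows_per_file);

for ($part = 1; $part <= $needed_files; $part++) {
    $offset = ($part - 1) * $rows_per_file;
    $result = mysqli_query($conn, "SELECT * FROM notches WHERE projectid = ".(int)$projectid." LIMIT $rows_per_file OFFSET $offset");
    $rows = mysqli_fetch_all($result, MYSQLI_ASSOC);

    $data = openssl_encrypt(bin2hex(json_encode($rows)), "aes128", $pass, false, $iv);
    file_put_contents($filename.$part, $data, LOCK_EX);
}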
WORKING CODE AFTER HELP
$notchDetails = mysqli_query($conn, "SELECT * FROM notches WHERE projectid = ".$projectid."");

$rec_count = 0;
$limit = 100;
$part = 1;
$data1 = array();

while ($notch = mysqli_fetch_assoc($notchDetails)) {
    $data1[] = $notch;
    $rec_count++;

    if ($rec_count >= $limit) {
        $data = json_encode($data1);
        $data = openssl_encrypt(bin2hex($data), "aes128", $pass, false, $iv);
        $filename = $mainfolder."/".$projectfolder."/".$subfolder."/".$fname.".part".$part."".$fext;
        file_put_contents($filename, $data, LOCK_EX);

        $part++;
        $rec_count = 0;
        $data = "";
        $data1 = array(); // reset as an array (not a string) so the next $data1[] append works
    }
}

// catch the remaining rows that did not fill a whole part
if (!empty($data1)) {
    $data = json_encode($data1);
    $data = openssl_encrypt(bin2hex($data), "aes128", $pass, false, $iv);
    $filename = $mainfolder."/".$projectfolder."/".$subfolder."/".$fname.".part".$part."".$fext;
    file_put_contents($filename, $data, LOCK_EX);
}

mysqli_free_result($notchDetails);
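For the other scripts that later consume these files, the read-back is just the reverse of the pipeline above; a minimal sketch, assuming the same $pass and $iv are available there:

<?php
// Reverse of: json_encode -> bin2hex -> openssl_encrypt (aes128, base64 output)
$encrypted = file_get_contents($filename);              // one of the .partN files
$hex       = openssl_decrypt($encrypted, "aes128", $pass, false, $iv);
$rows      = json_decode(hex2bin($hex), true);           // back to an array of rows

foreach ($rows as $notch) {
    // ... use $notch as needed
}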
Personally I would have coded this as a single SELECT with no LIMIT, and then, based on a $rec_per_file = ?; value, written the output files from within the single while-fetch-results loop.
Excuse the cryptic code, you didn't give us much of a clue.
<?php
//ini_set('max_execution_time', 600); // only use if you have to

$filename = 'something';
$filename_suffix = 1;
$rec_per_file = 100;

$sql = "SELECT ....";
Run query

$rec_count = 0;

while ( $row = fetch a row ) {

    $data[] = serialize > openssl_encrypt
    $rec_count++;

    if ( $rec_count >= $rec_per_file ) {

        $json_string = json_encode($data);

        file_put_contents($filename.$filename_suffix,
                          $json_string,
                          LOCK_EX);

        $filename_suffix++; // inc the suffix
        $rec_count = 0;     // reset counter
        $data = array();    // clear data

        // add 30 seconds to the remaining max_execution_time
        // or at least a number >= to the time you expect this
        // while loop to get back to this if statement
        set_time_limit(30);
    }
}

// catch the last few rows
$json_string = json_encode($data);
file_put_contents($filename.$filename_suffix, $json_string, LOCK_EX);
Also, I am not sure why you would want to both serialize() and json_encode() the data.
I had a thought, based on your comment about execution time. If you place a set_time_limit(seconds) call inside the if inside the while loop, it is cleaner, and you do not have to raise ini_set('max_execution_time', 600) to a very large number, which, if you have a real error in here, could keep PHP processing for a long time before kicking the script out.
From the manual:
Set the number of seconds a script is allowed to run. If this is reached, the script returns a fatal error. The default limit is 30 seconds or, if it exists, the max_execution_time value defined in the php.ini.
When called, set_time_limit() restarts the timeout counter from zero. In other words, if the timeout is the default 30 seconds, and 25 seconds into script execution a call such as set_time_limit(20) is made, the script will run for a total of 45 seconds before timing out.
Related
Here is my pseudocode:
For loop, repeat 30 times
    Call and execute an API script, which takes less than 1 second
    Sleep for a moment (less than 1 second)
Loop end
I want the above script to finish execution in a minimum of 30 seconds (2-3 seconds longer is not a problem, but it must not be less).
Can you write some sample code for me (only the time-related part)?
Here's a little program that implements the essence of what I believe you are looking for:
$minTime = 30;       // minimum total runtime in seconds
$perSec  = 10000;    // internal resolution: 1/10000ths of a second
$start   = microtime(TRUE) * $perSec;

print("This must take $minTime seconds.\n");

// Replace this loop with what you actually need done
for ($i = 0; $i < 10; $i++) { print("Doing stuff #{$i}... "); sleep(1); }

// Check difference in timestamps to calculate remaining time
$remain = ($minTime * $perSec - (microtime(TRUE) * $perSec - $start)) / $perSec;
var_dump(['remain' => $remain]);

if ($remain > 0) {
    print("\n$remain seconds remaining... ");
    // sleep() only accepts whole seconds, which would round the wait down and
    // finish early; wait the integer part with sleep() and the fraction with usleep()
    sleep((int) floor($remain));
    usleep((int) round(($remain - floor($remain)) * 1000000));
}
print("DONE\n");
This code has been updated to use microtime() instead of time(), which only permitted intervals of integer seconds. On systems that support gettimeofday() we can work with microseconds, instead.
I need to call the script function open() when the counter exceeds 5, but when I load my webpage it shows me a fatal error: Maximum execution time of 30 seconds exceeded. Is this a problem with my while loop?
$result = mysqli_query($dbconnect, "SELECT id, temp, hum, lum FROM weather ORDER BY id DESC LIMIT 5");

while ($row = mysqli_fetch_assoc($result)) {
    $weather["temp"] = $row["temp"];
    $weather["lum"] = $row["lum"];

    $tempcounter = 0;
    while ($weather["temp"] > 27) {
        $tempcounter++;
    }

    if ($tempcounter > 5) {
        echo "<script>open()</script>";
    }
}
Your while loop is infinite. The loop starts if $weather["temp"] is greater than 27, and inside it you increment $tempcounter but never change $weather["temp"], so the condition stays true and the loop never exits.
You could put $weather["temp"] into a variable and increment that variable inside the loop. Most importantly, you need to reverse your condition: if you increment, check <; if you decrement, check >.
If you increment:
$weatherTemp = $weather["temp"];
while ($weatherTemp < 27) {
    $weatherTemp++;
}
If you decrement:
$weatherTemp = $weather["temp"];
while ($weatherTemp > 27) {
    $weatherTemp--;
}
And then you can declare $tempcounter like this:
$tempcounter = $weatherTemp - $weather["temp"]; // Temperature difference needed to get to 27
if ($tempcounter > 5) {
    echo "<script>open()</script>";
}
The code is not tested; it might not work as it is.
If what you want is the difference between 27 and your temperature, you are not using the right logic here. You could simply take 27 minus the temperature from the query. If the result is negative, just change it to positive and you get the difference.
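A short sketch of that simpler approach, using the $weather row from the fetch loop:

// Positive difference between the 27-degree threshold and the current reading
$difference = abs(27 - $weather["temp"]);
if ($difference > 5) {
    echo "<script>open()</script>";
}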
I think it is a timeout problem.
You can fix this problem in one of two ways:
Set max_execution_time to a large value, e.g. 300 or more.
At the beginning of your PHP script, call the set_time_limit() function:
set_time_limit(0);
In this loop:
$tempcounter = 0;
while ($weather["temp"] > 27) {
    $tempcounter++;
}
You're not updating $weather["temp"], so it will loop forever if that condition is satisfied.
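If the intent was to count how many of the fetched rows are above 27 degrees, an if inside the fetch loop would do it without any inner while at all; a sketch of that reading of the question (my assumption about the intent, not something stated in it):

$tempcounter = 0;
while ($row = mysqli_fetch_assoc($result)) {
    // count rows whose temperature exceeds the threshold
    if ($row["temp"] > 27) {
        $tempcounter++;
    }
}
if ($tempcounter > 5) {
    echo "<script>open()</script>";
}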
I am reading and saving weather JSON data from the forecast.io API. Because I am using the free API, which has a limit of 1,000 requests per day, I request the API only every 10 minutes. I save the update time as a timestamp and then use this timestamp to check whether 10 minutes have elapsed. However, when I read the JSON file and echo it, a strange number like '18706' or '22659' comes out. I have no idea where it is coming from. How do I solve this problem?
Result in browser:
....madis-stations":["UTTT"],"units":"si"}}22659
PHP:
<?php
$t = time();
$last_updated_timestamp = file_get_contents("last_updated_timestamp.txt");
$delta = ($t - $last_updated_timestamp) / 60;

if ($delta > 10) {
    $json = file_get_contents('https://api.forecast.io/forecast/MY_API_KEY/41.2667,69.2167?units=si&lang=ru');
    $obj = json_decode($json);
    echo $obj->access_token;

    $fp = fopen('tw.json', 'w');
    fwrite($fp, json_encode($obj));
    fclose($fp);

    $fp2 = fopen('last_updated_timestamp.txt', 'w');
    fwrite($fp2, $t);
    fclose($fp2);
}

echo readfile("tw.json");
?>
Change:
echo readfile("tw.json");
to just:
readfile("tw.json");
readfile writes the contents of the file to the output buffer, and then returns the number of bytes that it wrote. You're then echoing that number of bytes.
It seems like you confused readfile with file_get_contents, which returns the contents of the file as a string.
Remove the echo before readfile. readfile() already prints the contents of the file; its return value is the number of bytes read, which is what you are echoing.
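Either of these prints the file without the stray byte count:

// readfile() writes straight to output; don't echo its return value
readfile("tw.json");

// ...or read the contents into a string and echo that instead
echo file_get_contents("tw.json");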
I need to log in to a production server, retrieve a file, and update my database with the data in this file. Since this is a production system, I don't want to fetch the whole file every 5 minutes, as the file may be huge and that could impact the server. I need to get the last 30 lines of this file at 5-minute intervals with as little impact as possible.
The following is my current code; I would appreciate any insight into how best to accomplish this:
<?php
$user="id";
$pass="passed";
$c = curl_init("sftp://$user:$pass@server1.example.net/opt/vmstat_server1");
curl_setopt($c, CURLOPT_PROTOCOLS, CURLPROTO_SFTP);
curl_setopt($c, CURLOPT_RETURNTRANSFER, true);
$data = curl_exec($c);
curl_close($c);
$data = explode("\n", $data);
?>
Marc B is wrong. SFTP is perfectly capable of partial file transfers. Here's an example of how to do what you want with phpseclib, a pure PHP SFTP implementation:
<?php
include('Net/SFTP.php');
$sftp = new Net_SFTP('www.domain.tld');
if (!$sftp->login('username', 'password')) {
    exit('Login Failed');
}
$size = $sftp->size('filename.remote');
// outputs the last ten bytes of filename.remote
echo $sftp->get('filename.remote', false, $size - 10);
?>
In fact, I'd recommend an approach like this anyway, since some SFTP servers don't let you run commands via the system shell. Plus, SFTP works against Windows SFTP servers, whereas tail is unlikely to do so even if you do have shell access, so overall it's a much more portable solution.
If you want to get the last x lines of a file, you could loop repeatedly, reading a fixed number of bytes each time, until you have encountered x newline characters: i.e. get the last 10 bytes, then the next-to-last 10 bytes, then the 10 bytes before those, and so on.
An answer by @Sammitch to a duplicate question, Get last 15 lines from a large file in SFTP with phpseclib:
The following should result in a blob of text with at least 15 lines from the end of the file that you can then process further with your existing logic. You may want to tweak some of the logic depending on if your file ends with a trailing newline, etc.
$filename = './file.txt';
$filesize = $sftp->size($filename);
$buffer_size = 4096;
$offset = $filesize; // start at the end
$result = '';
$lines = 0;

while ($offset > 0 && $lines < 15) {
    // work backwards
    if ($offset < $buffer_size) {
        $chunk_size = $offset; // only this many bytes are left before the start of the file
        $offset = 0;
    } else {
        $chunk_size = $buffer_size;
        $offset -= $buffer_size;
    }
    $buffer = $sftp->get($filename, false, $offset, $chunk_size);
    // count the number of newlines as we go
    $lines += substr_count($buffer, "\n");
    $result = $buffer . $result;
}
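If you then need exactly the last 15 lines rather than "at least 15", you can trim the accumulated blob afterwards; a small sketch using $result from the loop above:

// Split the accumulated text and keep only the final 15 lines
$lines  = explode("\n", rtrim($result, "\n")); // ignore a trailing newline, if any
$last15 = array_slice($lines, -15);
echo implode("\n", $last15);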
SFTP is not capable of partial file transfers. You might have better luck using a full-blown SSH connection and a remote tail operation to get the last lines of the file, e.g.
$lines = shell_exec("ssh user@remote.host 'tail -30 the_file'");
Of course, you might want something a little more robust that can handle things like network glitches that prevent ssh from getting through, but as a basic starting point this should do the trick.
I have to submit all the telephone numbers from my database records to a URL, but I can only send 300 phone numbers at a time.
I need a PHP script that can run through the following scenario:
Retrieve 2,000 records from the database.
Loop over all rows and save each into a variable or something else. (important)
Count that there are 2,000 records in total.
Loop over 300 records at a time and write them into the URL. (very important)
Submit the URL (this part needs no explanation).
Loop over the next 300 records, write them into the URL, and repeat until record 2,000.
I believe in this case that means 2,000 / 300 works out to 7 passes through the loop: 300 records on each of the first 6 passes and only 200 records on the final pass.
As mentioned above, looping over 300 records at a time is very important, and each pass must know to start from record 301 up to 600, and so on.
EDITED
Below is my original code, but it reads all the phone numbers and dumps them all into my URL at once:
$smsno = trim($_REQUEST['string_of_phone_number_eg_0123456;0124357;0198723']);
$message = trim($_REQUEST['message']);
$phoneNo = explode(";", $smsno);

// ----------
//
// Need to count the total of $phoneNo, e.g. 2,000 phone numbers in total
// Loop over the phone numbers 300 at a time, e.g. 001-300, 301-600, 601-900, ..., 1501-1800, 1801-2000
// For every block of 300, e.g. $phoneStr = '0123456;0124357;0198723;...' (300 phone numbers in the string)
// Write it into my URL: $link = "http://smsexample.com/sms.php?destinationnumber=$phoneStr&messagetosms=$message";
//
// ----------
I am looking for a solution here, as I have no idea how to loop over each block of 300 records, write them into a string, and then pass that string to my URL.
I can handle the first 300 records, but how do I get the next 300 after the first 300 have been written into the string and sent to my URL, and then wait to perform the next send?
For example,
first loop for 300 records:
$phoneStr = phoneNumber01;phoneNumber02;phoneNumber03;...;phoneNumber300
$link = "http://smsexample.com/sms.php?destinationnumber=$phoneStr&messagetosms=$message";
second loop for next 300 records
$phoneStr = phoneNumber301;phoneNumber302;phoneNumber303;...;phoneNumber600
$link = "http://smsexample.com/sms.php?destinationnumber=$phoneStr&messagetosms=$message";
and so on.
for ($i = 1; $i <= 2000; $i++)
{
    if ($i % 300 == 0 || $i == 2000)
    {
        // Make URL and send
    }
}
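A slightly more complete sketch of that idea, buffering numbers and flushing at each boundary (this assumes the $phoneNo and $message variables from the question and reuses its URL format; how the request is actually sent is up to you):

$batch = array();
$total = count($phoneNo);

for ($i = 1; $i <= $total; $i++) {
    $batch[] = $phoneNo[$i - 1];

    // Flush every 300 numbers, and also flush whatever remains at the very end
    if ($i % 300 == 0 || $i == $total) {
        $phoneStr = implode(';', $batch);
        $link = "http://smsexample.com/sms.php?destinationnumber=".urlencode($phoneStr)."&messagetosms=".urlencode($message);
        file_get_contents($link); // submit the URL
        $batch = array();         // start the next block
    }
}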
// Per-request limit
$limit = 300;

// Get array of numbers
$numbers = explode(';', $_REQUEST['string_of_phone_number_eg_0123456;0124357;0198723']);

// Get message
$message = trim($_REQUEST['message']);

// Loop until every number has been consumed
while ($numbers) {
    // Get a block of numbers; array_splice() also removes them from $numbers,
    // so the while condition eventually becomes false
    $thisBlock = array_splice($numbers, 0, $limit);

    // Build request URL
    $url = "http://smsexample.com/sms.php?destinationnumber=".urlencode(implode(';', $thisBlock))."&messagetosms=".urlencode($message);

    // Send the request
    $response = file_get_contents($url);
}