I wrote an AJAX function that calls a script which creates a zip archive of about 500 files. A loop adds the files to the archive and keeps a file counter. Now I want to update the status in the browser every 50 files, but the PHP script only sends its output once the script ends and the zip file is complete.
The principle is quite similar to the following post:
Echo 'string' while every long loop iteration (flush() not working)
(Except the solution isn't working on my server.)
I found a lot of possible solutions, but nothing works...
I tried the flush/ob_flush approach (ob_implicit_flush as well), but flushing the content doesn't work for me. I played a little with the server configuration, but it didn't help, and none of the examples worked either. Maybe it's a server problem.
I tried SSE, but the next response also only arrives after the script ends.
I tried to use WebSocket but I had some problems with the handshake.
The code looks roughly like this:
$filecounter = 0;
foreach ($files as $file) {
    // add $file to zip archive
    $filecounter++;
    if ($filecounter % 50 == 0) {
        echo $filecounter;
    }
}
Are there other options to get this working? Or how do I get the output 'flushed'?
Thanks for your help!
You could store the progress in the session and use a second AJAX call to track it:
$filecounter = 0;
foreach ($files as $file) {
    // add $file to zip archive
    $filecounter++;
    if ($filecounter % 50 == 0) {
        session_start();
        $_SESSION['progress'] = $filecounter;
        session_write_close();
    }
}
You need session_write_close(); to make the session variable accessible to the second script.
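For example, the second script that the polling AJAX call requests could look roughly like this (a minimal sketch; the file name and the JSON shape are just examples):

<?php
// progress.php - returns the current file counter to the polling AJAX call
session_start();
$progress = isset($_SESSION['progress']) ? $_SESSION['progress'] : 0;
session_write_close(); // release the session lock immediately

header('Content-Type: application/json');
echo json_encode(['progress' => $progress]);

The browser can then request this script every second or two and update its progress display with the returned value.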
Following the good advice on this link:
How to keep checking for a file until it exists, then provide a link to it
The loop will never end if the file is never created.
In a perfect system that should not happen, but if it does, how would one exit from that loop?
I have a similar case:
/* More code above */

// writing the file
$csvfile = $foldername.$date.$version.".csv";
$csv = fopen($csvfile, 'w+');
foreach ($_POST['lists'] as $pref) {
    fputcsv($csv, $pref, ";");
}
// close and wait for the IO to finish
fclose($csv);
sleep(1);

// running the Java
$exec = shell_exec("/usr/bin/java -jar $app $csvfile");
sleep(3);

$xmlfile = preg_replace('/\.[^.\s]{3,4}$/', '.xml', $csvfile);
if (file_exists("$csvfile") && (file_exists("$xmlfile"))) {
    header("Location:index.php?msg");
    exit;
}
else if (!file_exists("$csvfile")) {
    header("Location:index.php?msgf=".basename($csvfile)." creation failed!");
    exit;
}
else if (!file_exists("$xmlfile")) {
    header("Location:index.php?msgf=".basename($xmlfile)." creation failed!");
    exit;
}
//exit;
} // Just the end
?>
(Yes, I know it's a bad idea to pass variables in the URL; I've got that covered.)
I use sleep(N); because I know the Java program only takes a short time to create the file, and the same goes for the CSV written by the PHP.
How can I improve the check on the file, so it waits only the necessary time and then reports OK, or reports a failure if the file was not created?
After reading your comments, I think asking for "the best loop" isn't the right way to get a better answer.
The linked script gives a good approach when a script expects a file: it will wait until the file is created, possibly forever (the asker there is sure the file will eventually be created).
Better than that, you could wait only for a limited period while checking whether the file exists or not.
If the Java program didn't create the file after the shell_exec (which I think is almost impossible, but it's just a thought), you could use code like the following:
$cycles = 0;
while (!($isFileCreated = file_exists($filename)) && $cycles < 1000) {
    $cycles++;
    usleep(1);
}
if (!$isFileCreated) {
    // some action
    // throw new RuntimeException("File doesn't exist");
}
// another action
The script above will wait until the file is created or until it reaches a particular number of cycles (it's better to talk about cycles than microseconds, because I can't guarantee that each cycle executes in exactly one microsecond). The number of cycles can be increased if you need more time.
I have a directory which can contain CSV files that come in through a service and that I need to import into a database. These CSV files are 1000 rows each, and there can be 10 to 150 of them.
I want to insert the data from all these CSV files into the database. The problem is that PHP dies because of a timeout: even though I use set_time_limit(0), the server (siteground.com) imposes its own restrictions. Here is the code:
// just in case even though console script should not have problem
ini_set('memory_limit', '-1');
ini_set('max_input_time', '-1');
ini_set('max_execution_time', '0');
set_time_limit(0);
ignore_user_abort(1);
///////////////////////////////////////////////////////////////////
function getRow()
{
    $files = glob('someFolder/*.csv');

    foreach ($files as $csvFile) {
        $fh = fopen($csvFile, 'r');
        $count = 0;
        while ($row = fgetcsv($fh)) {
            $count++;
            // skip header
            if ($count === 1) {
                continue;
            }
            // make sure count of header and actual row is same
            if (count($this->headerRow) !== count($row)) {
                continue;
            }
            $rowWithHeader = array_combine($this->headerRow, $row);
            yield $rowWithHeader;
        }
    }
}

foreach (getRow() as $row) {
    // fix row
    // now insert in database
}
This is actually a Command run through Artisan (I am using Laravel). I know that the CLI doesn't have time restrictions, but for some reason not all CSV files get imported and the process ends at a certain point.
So my question is: is there a way to invoke a separate PHP process for each CSV file present in a directory? Or some other way of doing this so that I am able to import all CSV files without any issue, e.g. using PHP's generators, etc.?
You could just do some bash magic. Refactor your script so that it processes only one file. The file to process is an argument to the script; access it by using $argv.
<?php
// just in case, even though a console script should not have this problem
ini_set('memory_limit', '-1');
ini_set('max_input_time', '-1');
ini_set('max_execution_time', '0');
set_time_limit(0);
ignore_user_abort(1);

$file = $argv[1]; // the file is the first and only argument to the script

///////////////////////////////////////////////////////////////////
function getRow($csvFile)
{
    $fh = fopen($csvFile, 'r');

    // read the header from the first line of the file
    // (the original used $this->headerRow, which is not available in a standalone script)
    $headerRow = fgetcsv($fh);

    while ($row = fgetcsv($fh)) {
        // make sure the counts of header and actual row are the same
        if (count($headerRow) !== count($row)) {
            continue;
        }
        yield array_combine($headerRow, $row);
    }
}

foreach (getRow($file) as $row) {
    // fix row
    // now insert in database
}
Now, call your script like this:
for file in `ls /path/to/folder | grep csv`; do php /path/to/your/script.php /path/to/folder/$file; done
This will execute your script for each .csv file in your /path/to/folder
The best approach is to process a limited number of files per PHP process. For example, you can start with 10 files (work out the right number empirically), process them, mark them as processed (move them to a folder for processed files) and stop the process. After that, start a new process to import another 10 files, and so on. In Laravel you can make sure that no more than one process is started for a specific command while another one is still running. The scheduler entry for Laravel is below:
$schedule->command("your job")->everyMinute()->withoutOverlapping();
If you use this approach, you can be sure that all files will be processed within a reasonable time and that the job will not consume so many resources that it gets killed.
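A rough sketch of such a command (the class name, folder paths and the importFile() helper are illustrative, not part of the original code):

<?php
// app/Console/Commands/ImportCsvBatch.php (illustrative location)
namespace App\Console\Commands;

use Illuminate\Console\Command;

class ImportCsvBatch extends Command
{
    protected $signature = 'csv:import-batch';
    protected $description = 'Import a limited batch of CSV files per run';

    public function handle()
    {
        // take at most 10 files per run; tune this number empirically
        $files = array_slice(glob(storage_path('csv/incoming/*.csv')), 0, 10);

        foreach ($files as $csvFile) {
            $this->importFile($csvFile); // your existing import logic goes here

            // mark the file as processed by moving it out of the incoming folder
            rename($csvFile, storage_path('csv/processed/' . basename($csvFile)));
        }
    }
}

Scheduled with withoutOverlapping() as shown above, each run picks up the next batch until the incoming folder is empty.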
If your hosting provider allows cron jobs, they don't have a timeout limit.
They also fit heavy, long-running tasks better than manually calling the function, since calling the method several times could cause huge problems.
Right now I'm using an event stream to echo the upload progress from another script to the user, but no matter how I try to pass the variable from the form-data processing PHP file to my event-stream file, it doesn't work.
The problem is that all changes I make in my running PHP upload script only become visible after the script has fully executed on my server, while it works fine on my local machine.
I've tried writing to files in my web directory, in the tmp directory and using a mysql database, but none of them worked.
Here's roughly what my upload processing PHP script looks like with the SQL solution:
$progress = 0;
foreach ($files as $file) {
    sql("UPDATE `tmp` SET `progress`=? WHERE `ID`=?", $progress, $id);
    imagejpeg($image, "images/gallery/$filename.jpg", 100);
    $progress++;
}
And my event-stream php script:
echo "data: ".sql("SELECT `progress` FROM `tmp` WHERE `ID`=?", $id)."\n\n";
I would appreciate any suggestion on how to solve this problem, preferably without installing any PHP libraries.
Thanks
I've solved my problem by flushing the progress to the user from the upload script itself and handling the output by checking xhr.responseText periodically.
PHP (flushes line break + progress)
$total = count($files);
$progress = 0;
foreach ($files as $drop => $file) {
    echo "\n" . ($progress / $total * 100); // percentage used for the progress bar width
    flush();
    // file processing
    $progress++;
}
JavaScript (gets progress after last line break)
setInterval(function() {
    if (page.xhr.readyState == 3 && page.xhr.status == 200) {
        var response = page.xhr.responseText;
        $('#progress').stop().animate({width: parseFloat(response.substr(response.lastIndexOf("\n") + 1)) + '%'}, 480);
    }
}, 1000);
I have a script that rewrites a file every few hours. This file is inserted into end users' HTML via a PHP include.
How can I check whether my script is, at this exact moment, working on (i.e. rewriting) the file while it is being requested for display to a user? Is it even an issue: what happens if they access the file at the same time, what are the odds of that, and will the user just have to wait until the script has finished its work?
Thanks in advance!
More on the subject...
Is this a way forward using file_put_contents and LOCK_EX?
When the script saves its data every now and then:
file_put_contents("text", $content, LOCK_EX);
and when the user opens the page:
if (file_exists("text")) {
function include_file() {
$file = fopen("text", "r");
if (flock($file, LOCK_EX)) {
include_file();
}
else {
echo file_get_contents("text");
}
}
} else {
echo 'no such file';
}
Could anyone advise me on the syntax? Is this a proper way to call include_file() after the condition, and how can I limit the number of such calls?
I guess this solution is also good, except for the same recursive call to include_file(); would it even work?
function include_file() {
    $time = time();
    $file = filectime("text");
    if ($file + 1 < $time) {
        echo "good to read";
    } else {
        echo "have to wait";
        include_file();
    }
}
To check whether the file is currently being written, you can use the filectime() function to get the time the file was last created or changed.
You can store the current timestamp in a variable at the top of your script, and whenever you need to access the file, compare that timestamp with the file's filectime(). If the file's change time is newer, you are in the scenario where you have to wait for the file to be written, and you can log that in a database or another file.
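A rough sketch of that comparison, reusing the "text" file name from the question (the waiting action is just a placeholder):

<?php
$includedFile = "text";   // the file your page includes
$requestStart = time();   // timestamp taken at the top of the script

clearstatcache(true, $includedFile);
if (file_exists($includedFile) && filectime($includedFile) >= $requestStart) {
    // the file changed after this request started, so it may still be
    // in the middle of being rewritten: wait briefly, or log the event
    usleep(200000);
}
echo file_get_contents($includedFile);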
To prevent this scenario from happening, you can change the script which writes the file so that it first creates a temporary file and, once it's done, replaces (moves or renames) the temporary file with the original one. This replace action takes far less time than writing the file, which makes the scenario a very rare possibility.
Even if a read and a replace operation do occur simultaneously, the time the reading script has to wait will be very short.
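A sketch of that temporary-file approach (the paths and the $newContent variable are just examples):

<?php
// writer script: build the new content in a temporary file, then swap it in
$target = "text";              // the file the page includes
$tmp    = $target . ".tmp";

file_put_contents($tmp, $newContent);  // the slow writing happens on the temp file
rename($tmp, $target);                 // the swap itself is nearly instant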
Depending on the size of the file, this might be a concurrency issue. But you can solve it quite easily: before starting to write the file, create a kind of "lock file", i.e. if your file is named "incfile.php", create an "incfile.php.lock". Once you're done writing, remove this file.
On the include side, you can check for the existence of "incfile.php.lock" and wait until it has disappeared; this needs some looping and sleeping for the unlikely case of concurrent access, as in the sketch below.
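A sketch of that lock-file idea, using the file names from the example (the $content variable and the retry limit are placeholders):

<?php
// writer side
touch("incfile.php.lock");                  // signal that writing is in progress
file_put_contents("incfile.php", $content);
unlink("incfile.php.lock");                 // writing finished

// include side
$tries = 0;
while (file_exists("incfile.php.lock") && $tries < 50) {
    usleep(100000);                         // wait 0.1 s and check again
    $tries++;
}
include "incfile.php";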
Basically, you should consider another solution: write the data that is rendered into that file to a database instead (locks etc. are available) and render it in a module which then gets included in your page. Solutions like yours are hard to maintain in the long run...
This question is old, but I add this answer because the other answers have no code.
function write_to_file(string $fp, string $string) : bool {
    // current timestamp before making any changes to the file
    $timestamp_before_fwrite = time();

    $stream = fopen($fp, "w");
    fwrite($stream, $string);
    fclose($stream);

    // clear the stat cache so filemtime() is not served from a stale cache
    clearstatcache(true, $fp);
    $file_last_changed = filemtime($fp);

    if ($file_last_changed < $timestamp_before_fwrite) {
        // file not changed code
        return false;
    }
    return true;
}
This is the function I use to write to a file. It first takes the current timestamp before making changes to the file, and then compares that timestamp to the last time the file was changed.
I have a PHP script that logs ads (banners) for a website and stores them in a .dat file. Inside this file an ID, a URL and other important information is saved. The problem I am having is that there are 4 ads on the page at any given time, so the .dat file often gets corrupted when the PHP script attempts to write to it while it is already open.
I checked and tried this solution, however it did not help me:
PHP Simultaneous file access / flock() issue
The function I am using at the moment looks like this:
function writeads() {
    global $bannerAdsPath, $ads, $bannerAds;
    $data = fopen($bannerAdsPath, 'w') or die();
    flock($data, 2) or die();
    fputs($data, @join("\n", $ads)."\n");
    while (list($key, $val) = each($bannerAds)) {
        if (($key != '') && ($val != '')) {
            fputs($data, $key.'='.$val."\n");
        }
    }
    flock($data, 3);
    fclose($data);
    reset($bannerAds);
}
Any help would be appreciated as I have been scratching my head over this for a while.
Side note: the client did not want to have their code rewritten to use a database instead of a file, so that option is out.
Thanks!
fopen with 'w' truncates the file before you have the option of flocking it.
You almost never want to use flock to unlock a file; just use fclose; the file will be unlocked when the handle is closed, and that way you know that no buffered writes will happen after you unlock.
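So the function from the question could be rewritten roughly like this (same structure, just a non-truncating open mode and a different locking order):

function writeads() {
    global $bannerAdsPath, $ads, $bannerAds;

    $data = fopen($bannerAdsPath, 'c') or die();  // 'c' opens for writing without truncating
    flock($data, LOCK_EX) or die();               // take the exclusive lock first

    ftruncate($data, 0);                          // truncate only once the lock is held
    rewind($data);

    fputs($data, join("\n", $ads)."\n");
    foreach ($bannerAds as $key => $val) {
        if (($key != '') && ($val != '')) {
            fputs($data, $key.'='.$val."\n");
        }
    }

    fclose($data);                                // closing the handle releases the lock
}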