Right now I'm using an event stream to echo the upload progress from another script to the user, but no matter how I try to pass the variable from the PHP file that processes the form data to my event-stream file, it doesn't work.
The problem is that any changes I make in the running PHP upload script only become visible after the script has finished executing on my server, while everything works fine on my local machine.
I've tried writing to files in my web directory and in the tmp directory, and using a MySQL database, but none of them worked.
Here's roughly what my upload-processing PHP script looks like with the SQL solution:
$progress = 0;
foreach ($files as $file) {
    sql("UPDATE `tmp` SET `progress`=? WHERE `ID`=?", $progress, $id);
    imagejpeg($image, "images/gallery/$filename.jpg", 100);
    $progress++;
}
And my event-stream php script:
echo "data: ".sql("SELECT `progress` FROM `tmp` WHERE `ID`=?", $id)."\n\n";
I would appreciate any suggestions on how to solve this problem, ideally without installing any PHP libraries.
Thanks
I've solved my problem by flushing the progress to the user from the upload script itself and handling the output by checking xhr.responseText periodically.
PHP (flushes line break + progress)
foreach ($files as $drop => $file) {
    echo "\n$progress";
    flush();
    // file processing
}
JavaScript (gets progress after last line break)
setInterval(function() {
    if (page.xhr.readyState == 3 && page.xhr.status == 200) {
        var response = page.xhr.responseText;
        $('#progress').stop().animate({width: parseFloat(response.substr(response.lastIndexOf("\n") + 1)) + '%'}, 480);
    }
}, 1000);
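One thing worth noting: flush() only reaches the browser if nothing upstream buffers the response. Here is a sketch of the setup that usually has to precede the loop; which of these lines is actually needed depends on the server stack, so treat them as assumptions to test:

```php
<?php
// Sketch: disable the buffering layers that commonly swallow flush().
header('X-Accel-Buffering: no');          // nginx: don't buffer this response
ini_set('zlib.output_compression', '0');  // gzip would hold output back
while (ob_get_level() > 0) {              // unwind PHP's own output buffers
    ob_end_flush();
}
ob_implicit_flush(true);

$files = ['a.jpg', 'b.jpg', 'c.jpg'];     // stand-in for the real upload list
$progress = 0;
foreach ($files as $file) {
    // ... process $file here ...
    $progress++;
    echo "\n$progress";
    flush();
}
```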
Related
I wrote an AJAX function to call a script which creates a ZIP archive of about 500 files. A loop adds the files to the archive and keeps a file counter. I want to update the status in the browser every 50 files, but PHP sends the whole output only when the script ends and the ZIP file is complete.
The principle is quite similar to the following post:
Echo 'string' while every long loop iteration (flush() not working)
(Except that the solution there does not work on my server.)
I found a lot of possible solutions, but nothing works...
I tried the flush()/ob_flush() method (and ob_implicit_flush() as well), but flushing the content doesn't work for me. I played a little with the server configuration, but it didn't help, and none of the examples worked either. Maybe it's a server problem.
I tried SSE, but the next response also only arrives after the script ends.
I tried to use WebSocket but I had some problems with the handshake.
The code looks roughly like this:
$filecounter = 0;
foreach ($files as $file) {
    // add $file to zip archive
    $filecounter++;
    if ($filecounter % 50 == 0) {
        echo $filecounter;
    }
}
Are there other options to get this working? Or how do I get the output flushed?
Thanks for your help!
You could store the progress in the session and use a second AJAX call to track it:
$filecounter = 0;
foreach ($files as $file) {
    // add $file to zip archive
    $filecounter++;
    if ($filecounter % 50 == 0) {
        session_start();
        $_SESSION['progress'] = $filecounter;
        session_write_close();
    }
}
You need session_write_close(); to make the session variable accessible to the second script: while one script holds the session open, the other blocks on the session lock.
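The second script the page polls could then look like this (the file name progress.php and the JSON shape are my assumptions, not part of the question):

```php
<?php
// Hypothetical polling endpoint (progress.php): the page requests this
// via AJAX while the zip script runs and gets the counter back as JSON.
session_start();
$progress = isset($_SESSION['progress']) ? (int)$_SESSION['progress'] : 0;
session_write_close();                    // release the session lock right away

header('Content-Type: application/json');
echo json_encode(array('progress' => $progress));
```

The session_write_close() here matters for the same reason as in the loop: whichever script holds the session open blocks the other.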
I have a PHP script that logs ads (banners) for a website and stores them in a .dat file. This file holds an ID, a URL, and other important information. The problem I am having is that there are four ads on the page at any given time, so the .dat file is often corrupted when the PHP script attempts to write to it while it is already open.
I checked and tried this solution however it did not help me:
PHP Simultaneous file access / flock() issue
The function I am using at the moment looks like this:
function writeads() {
    global $bannerAdsPath, $ads, $bannerAds;
    $data = fopen($bannerAdsPath, 'w') or die();
    flock($data, LOCK_EX) or die();
    fputs($data, @join("\n", $ads)."\n");
    while (list($key, $val) = each($bannerAds)) {
        if (($key != '') && ($val != '')) {
            fputs($data, $key.'='.$val."\n");
        }
    }
    flock($data, LOCK_UN);
    fclose($data);
    reset($bannerAds);
}
Any help would be appreciated as I have been scratching my head over this for a while.
Side bit of information: the client did not want their code rewritten to use a database instead of a file, so that option is out.
Thanks!
fopen() with 'w' truncates the file before you have a chance to flock() it.
You almost never want to use flock() to unlock a file; just use fclose(). The file is unlocked when the handle is closed, and that way you know that no buffered writes will happen after you unlock.
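A sketch of the function repaired along those lines: open with 'c' so nothing is truncated before the lock is held, and only truncate once the lock is acquired (writeads_locked and its parameters are stand-ins for the original function and its globals):

```php
<?php
// Sketch: lock first, truncate second. Unlike 'w', mode 'c' creates the
// file if needed but does NOT truncate it on open.
function writeads_locked(string $path, array $lines): bool {
    $fh = fopen($path, 'c');
    if ($fh === false) {
        return false;
    }
    if (!flock($fh, LOCK_EX)) {       // take an exclusive lock before touching data
        fclose($fh);
        return false;
    }
    ftruncate($fh, 0);                // now it is safe to clear the old contents
    fwrite($fh, implode("\n", $lines) . "\n");
    fflush($fh);
    fclose($fh);                      // closing the handle releases the lock
    return true;
}
```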
My PHP/MySQL web application has a function that combines a set of images for each user into a single PDF using ImageMagick. Each PDF is then placed into a ZIP file. Each image can be up to 5 MB in size. My problem is that the browser times out before the user can download the document.
Any ideas on how to best do this?
I was thinking I could create the ZIP file without sending it to the browser (i.e. remove the headers at the end of my code) and then email a link to the file; however, the user would still have to wait a long time for the ZIP file to be created (the browser appears to just hang). Could this be done with AJAX behind the scenes, or directly on the server somehow?
$tmp_path = sys_get_temp_dir().'/';
$archive_name = "myarchive.zip";
$zip = new ZipArchive();
// Overwrite the archive if it already exists, otherwise create it
if ($zip->open($archive_name, file_exists($archive_name) ? ZipArchive::OVERWRITE : ZipArchive::CREATE) !== true) {
    return false;
}
foreach ($rows as $row) {
    $irows = get_images($row['user_id']);
    $images = array();
    foreach ($irows as $irow) {
        $doc = fetch_document_path($irow['id']);
        $output_file_name = $tmp_path.$irow['id'].'.jpg';
        exec('convert '.$doc.' '.$output_file_name);
        $images[] = $irow['id'].'.jpg';
    }
    $images = implode(' ', $images);
    $output_file_name = $tmp_path.$row['name'].'.pdf';
    exec('convert '.$images.' "'.$output_file_name.'"');
    $zip->addFile($output_file_name, basename($output_file_name));
}
$zip->close();
header('Content-type: application/zip');
header('Content-Disposition: attachment; filename="output.zip"');
readfile($archive_name);
IMO you should run a background task to do the job. The background worker can, for example, save the URL of the result file in the DB once it has finished (and can also store some more information, like the current status and job progress); meanwhile the web page periodically asks the server via AJAX whether the job is done yet, and finally displays the link when it is available.
The simplest way to achieve that is to run your script as a background process:
$arg1 = escapeshellarg($somearg1);
$arg2 = escapeshellarg($somearg2);
exec(sprintf('/usr/bin/php archiver.php %s %s > /dev/null 2>&1 & echo $!', $arg1, $arg2));
archiver.php should begin with the following lines:
<?php
ignore_user_abort(true);
set_time_limit(0);
This prevents the script from being stopped when the parent script finishes.
The second idea I have is more complex: you can write a daemon that runs in the background waiting for jobs. To communicate with it you can use queues (like AMQP) or just the database. With a daemon you have more control over what happens and when; with the first approach your application may end up firing too many processes.
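For the status part, a minimal sketch is to have archiver.php write its progress to a small JSON status file that an AJAX endpoint reads back (the file layout and field names here are my assumptions; a DB row works just as well):

```php
<?php
// The worker (archiver.php) calls this as it progresses.
function write_job_status(string $statusFile, string $state, int $done, int $total): void {
    $payload = json_encode(array('state' => $state, 'done' => $done, 'total' => $total));
    file_put_contents($statusFile, $payload, LOCK_EX); // atomic enough for polling
}

// The AJAX endpoint calls this and echoes the result to the page.
function read_job_status(string $statusFile): array {
    $raw = @file_get_contents($statusFile);
    return $raw === false
        ? array('state' => 'pending', 'done' => 0, 'total' => 0)
        : json_decode($raw, true);
}
```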
I'm aware the title of this is a little strange, but I'll get there.
I have a camera tethered to a laptop. Using Remote Shooting, when the photographer takes a photo it is saved into a folder on the laptop's hard drive. An Automator (Mac OS X) action on that folder resizes each new file and pushes it up to an FTP server using Transmit.
Here's where the code comes in.
I have a web page displaying the most recent photo taken. It uses AJAX to repeatedly check whether a new file has been uploaded, and if it has, loads the new photo and crossfades it with the old one. Here is the JavaScript running on the page.
(function() {
    var delay, refreshLoop;

    // Alias to setTimeout that reverses the parameters, making it much cleaner in code (for CoffeeScript)
    delay = function(time, callback) {
        return setTimeout(callback, time);
    };

    // Function that drives the loop of refreshing photos
    refreshLoop = function(currentFolderState, refreshRate) {
        // Get the new folder state
        $.get("ajax/getFolderState.php", function(newFolderState) {
            // If the new folder state is different
            if (newFolderState !== currentFolderState) {
                // Get the newest photo
                $.get("ajax/getNewestPhoto.php", function(photoFilename) {
                    var img;
                    // Create the new image element
                    img = $('<img class="new-photo"/>')
                        // Append the src attribute to it after so it can BG load
                        .attr('src', "/events/mindsmack/booth/cinco-de-mindsmack-2012/" + photoFilename)
                        // When the image is loaded
                        .load(function() {
                            // Append the image to the photos container
                            $('#photos').append(img);
                            // Crossfade it with the old photo
                            $('#photos .current-photo').fadeOut();
                            $('#photos .new-photo').fadeIn().removeClass("new-photo").addClass("current-photo");
                        });
                });
            }
            // Wait for the refresh rate and then repeat
            delay(refreshRate, function() {
                refreshLoop(newFolderState, refreshRate);
            });
        });
    };

    // Document Ready
    $(function() {
        var refreshRate;

        // Load the first photo
        $.get("ajax/getNewestPhoto.php", function(photoFilename) {
            $('#photos').append("<img src='/events/mindsmack/booth/cinco-de-mindsmack-2012/" + photoFilename + "' class='current-photo' />");
        });

        refreshRate = 2000;

        // After the timeout
        delay(refreshRate, function() {
            // Get the initial folder state and kick off the loop
            $.get("ajax/getFolderState.php", function(initialFolderState) {
                refreshLoop(initialFolderState, refreshRate);
            });
        });
    });
}).call(this);
And here are the two PHP files that are called in that Javascript
getFolderState.php
<?php
$path = $_SERVER['DOCUMENT_ROOT'] . "/events/mindsmack/booth/cinco-de-mindsmack-2012/";
// Get a directory listing of the path where the photos are stored
$dirListing = scandir( $path );
// Echo an md5 hash of the directory listing
echo md5(serialize($dirListing));
getNewestPhoto.php
<?php
$path = $_SERVER['DOCUMENT_ROOT'] . "/events/mindsmack/booth/cinco-de-mindsmack-2012/";
// Get a directory listing of the path where the photos are stored
$listing = scandir($path);
$modTime = 0;
$mostRecent = "";
// Find the most recent file
foreach ($listing as $file) {
    if (is_file($path.$file) && $file !== ".DS_Store" && filectime($path.$file) > $modTime) {
        $modTime = filectime($path.$file);
        $mostRecent = $file;
    }
}
// Echo the most recent filename
echo $mostRecent;
All of this works almost flawlessly. The problem, I believe, arises when the loop fires while a file is still being uploaded. Occasionally a photo will be taken and only partly show up on the page. No error is thrown, the script keeps running just fine, and the image file is actually cached in that state, which leads me to believe my code is catching an upload in progress and showing only the part of the file that has been uploaded at that moment.
I don't mind changing my solution if I need to in order to overcome this issue, I'm just not sure exactly what to do.
EDIT
As per one of the suggestions below, I added code to getNewestPhoto.php that checks the file size of the photo, waits a bit, and checks it again; if the sizes differ, it keeps checking until they match. I was hoping this would catch files that are mid-upload, because the file size would change between loops, but even when photos come up partially rendered, the file-size check doesn't catch it.
Here's the code I added
$currFilesize = filesize($path . $mostRecent);
while (true) {
    sleep(.5); // note: sleep() takes whole seconds, so .5 is truncated to 0 and adds no delay
    // note: filesize() results are also cached per filename unless clearstatcache() is called
    $newFilesize = filesize($path . $mostRecent);
    if ($newFilesize == $currFilesize) {
        break;
    } else {
        $currFilesize = $newFilesize;
    }
}
I'm thinking (via another suggestion) that I need some kind of lock file that is created when an upload starts and removed when it is done, so the page stops refreshing the photo in the meantime; but since I'm not running any kind of web server on the computer tethered to the camera, I'm not sure how to accomplish that. I would love suggestions.
There are many routes to solve this, most better than what I am about to suggest. I think the simplest and quickest solution, however, is to FTP to a tmp dir and, when the transfer is complete, trigger a move of the file to the production directory. Does your local Automator job have such a mechanism in its dealings with Transmit?
I would make the PHP script check the file size in a loop with a (small) delay, and if it matches, output the file; otherwise, loop until it does.
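A sketch of that check. Two details matter: sleep() only takes whole seconds (sleep(.5) is truncated to zero delay), so usleep() is needed, and filesize() is cached per filename, so clearstatcache() must be called between reads. The retry cap is my addition to avoid hanging on a stalled upload:

```php
<?php
// Returns true once the file's size stops changing between checks,
// false if it is still growing after $maxTries checks (assumption: a
// stable size means the upload has finished).
function wait_until_stable(string $file, int $delayUs = 500000, int $maxTries = 20): bool {
    $last = filesize($file);
    for ($i = 0; $i < $maxTries; $i++) {
        usleep($delayUs);                  // usleep() takes microseconds
        clearstatcache(true, $file);       // drop the cached stat result
        $now = filesize($file);
        if ($now === $last) {
            return true;                   // size unchanged across the delay
        }
        $last = $now;
    }
    return false;                          // still growing; caller should retry later
}
```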
I have written a PHP script that uploads images from a directory to a remote server using ftp_put(). I have set this up as a cron job using Task Scheduler and wget. Initially it works great, but after a while (I don't know exactly when) the process freezes: Windows Task Scheduler says the job is running, but no photos are being uploaded anymore.
Initially I thought the problem was due to max_execution_time, but I have set that to 24 hours using set_time_limit(3600*24), and I have set max_input_time to 600 seconds (10 minutes).
Why doesn't it complete the task?
Here is the code:
if ($conn) {
    if (is_dir($imagesPath)) {
        if ($files = opendir($imagesPath)) {
            while (($file = readdir($files)) !== false) {
                if ($file != "." && $file != ".."
                    && preg_match("/\.jpg|\.JPG|\.gif|\.GIF|\.png|\.PNG|\.bmp|\.BMP/", $file)
                    && date("Ymd", filemtime($imagesPath.'/'.$file)) >= date("Ymd", strtotime(date("Y-m-d")." -".$days." day"))) {
                    if (ftp_put($conn, $remotePath.$file, $imagesPath.'/'.$file, FTP_BINARY)) {
                        //echo $file;
                        $counter++;
                    } else {
                        echo '<br>'.$imagesPath.'/'.$file;
                    }
                }
            }
            closedir($files);
            echo $counter.' Files Uploaded on '.date("Y-m-d");
        } else {
            echo 'Unable to read '.$imagesPath;
        }
    } else {
        echo $imagesPath.' Does not exist';
    }
    ftp_close($conn);
} else {
    echo "Failed to connect";
}
/* End */
exit;
Added:
/* Settings */
// Set Max Execution time
set_time_limit(3600*24);
at the top of the script.
Thanks.
I am working on something similar, and I found the following helps:
1) Make sure every step is conditional, so if something fails to load (like an image, the FTP connection, etc.) the procedure keeps running (iterating).
2) Add a small sleep() at the end of the loop (or between difficult steps). I use it for images as well, and it helps when the connection to the image or the image-write time lags.
3) Set the scheduler to execute the script often (mine runs 2 times a day, but it could be hourly) as long as you check the box "do not start a new instance if the script is already running".
4) Depending on the setup of your server, check for other tasks which may interrupt the script's run time. It is not always a PHP issue. As long as the task is properly scheduled to re-execute the script, you should be fine.
5) For easy debugging, instead of the echo statements (which most likely just show in your cmd window), use simple file logging (e.g. $log = fopen($myLogFile, 'w'); ) in addition to your "Does not exist" and "Failed to connect" messages, and include more details, so when it fails you can check the log file to see when and why.
6) You may try an infinite loop like ( while (true) { ... your code ... } ) instead of set_time_limit().
I hope this helps :)
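For tip 5, a minimal logging helper might look like this (the function and file names are placeholders; FILE_APPEND is used rather than mode 'w' so each run doesn't wipe the previous log):

```php
<?php
// Append a timestamped line to a log file instead of echoing to the
// scheduler's console window.
function log_line(string $logFile, string $message): void {
    $line = date('Y-m-d H:i:s') . '  ' . $message . "\n";
    file_put_contents($logFile, $line, FILE_APPEND | LOCK_EX);
}
```

Usage would be, e.g., log_line($myLogFile, 'Failed to connect'); wherever the script currently echoes an error.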