PHP long polling makes my server hang

I am trying to build a Facebook Messenger-like web app, using AJAX on the client side and PHP on the server side.
My AJAX code:
function longPoll(timestamp)
{
    var queryString = {'timestamp' : timestamp};
    var shouldDelay = false;
    $.ajax(
    {
        type: 'GET',
        async: true,
        url: 'pollMsg.php',
        data: queryString,
        timeout: 5000,
        cache: false
    }
    ).done(function(data){
        var array = jQuery.parseJSON(data);
        for (var i = 0; i < array.length; i++) {
            $('#msgTable > tbody:last-child').append('<tr><td><b>' + array[i].sender + '</b><br/>' + array[i].timestamp + '</td><td><b>' + array[i].title + '</b><br/>' + array[i].content + '</td></tr>');
        }
        longPoll(obj.timestamp);
    }).fail(function(jqXHR, textStatus, errorThrown) {
        //shouldDelay = textStatus !== "timeout";
    }).always(function() {
        var delay = shouldDelay ? 5000 : 0;
        if (shouldDelay) {
            shouldDelay = false;
            window.setTimeout(longPoll, delay);
        }
    });
}
// initialize jQuery
$(function() {
    longPoll();
});
My PHP code:
//cap how long this request may run (10 seconds)
set_time_limit(10);
while (true) {
    $last_ajax_call = isset($_GET['timestamp']) ? (int)$_GET['timestamp'] : '1970-01-01 00:00:00';
    $sql = "select * from post where (receiver = ? or sender = ?) and postat > str_to_date(?, '%Y/%m/%d %H:%i:%s')";
    $stmt = $conn->prepare($sql);
    $stmt->bind_param("sss", $username, $username, $last_ajax_call);
    $result = $stmt->execute();
    $rows = $stmt->get_result()->fetch_all();
    if (count($rows) > 0) {
        $result = array();
        foreach ($rows as $row) {
            $data = array(
                'sender'    => $row[3],
                'timestamp' => $row[5],
                'title'     => $row[1],
                'content'   => $row[2]
            );
            array_push($result, $data);
        }
        $json = json_encode($result);
        echo $json;
        break;
    } else {
        sleep(1);
        continue;
    }
}
I found that once I have opened the page with this AJAX code, navigating to any other page on the same web server gives me an nginx error (I use nginx).
What is the problem with my code? The long-polling examples I have found look very similar to this. Thank you.
When I restart php-fpm everything is fine again, which suggests I am generating so many requests that the server cannot handle them.
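A common cause of this symptom is that every open tab holds a PHP-FPM worker inside the while (true) loop (on Linux, set_time_limit() does not count time spent in sleep() or database calls), so the worker pool runs out and every other request to the same server hangs until php-fpm is restarted. One usual mitigation, sketched below under the assumption of the same pollMsg.php, $conn and $username as above, is to bound how long each poll may wait and let the client reconnect. The 20-second cap is my own choice, and I have dropped the (int) cast since the query compares the timestamp as a date string. Note also that longPoll(obj.timestamp) in the .done() handler refers to an undefined obj; it presumably should pass on the newest row's timestamp.

// pollMsg.php - bounded long poll: give up after ~20 seconds and let the client re-poll
set_time_limit(30);                     // a little above the polling window
$last_ajax_call = isset($_GET['timestamp']) ? $_GET['timestamp'] : '1970-01-01 00:00:00';
$deadline = time() + 20;                // hold this PHP-FPM worker for at most 20 seconds

$sql = "select * from post where (receiver = ? or sender = ?) and postat > str_to_date(?, '%Y/%m/%d %H:%i:%s')";
$stmt = $conn->prepare($sql);
$stmt->bind_param("sss", $username, $username, $last_ajax_call);

do {
    $stmt->execute();                   // a prepared statement can be re-executed each pass
    $rows = $stmt->get_result()->fetch_all();
    if (count($rows) > 0) {
        $result = array();
        foreach ($rows as $row) {
            $result[] = array(
                'sender'    => $row[3],
                'timestamp' => $row[5],
                'title'     => $row[1],
                'content'   => $row[2]
            );
        }
        echo json_encode($result);
        exit;
    }
    sleep(1);
} while (time() < $deadline);

echo json_encode(array());              // nothing new: return an empty list so the client simply polls again

With this shape the client always gets a response within ~20 seconds, handles an empty array as "nothing new", and starts the next poll, so no worker is held indefinitely.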

Related

Ajax never initiating success: when using xhrFields

I am having trouble getting the success callback to fire in my AJAX request. I know the communication is working fine, but the last call in my PHP script, a return json_encode($array);, fires as if it were part of the onprogress handling. I would like to "break" the onprogress handling and run the success function on the last data sent via return json_encode once the PHP script has terminated...
Here is my AJAX call:
$( document ).ready(function(e) {
    var jsonResponse = '', lastResponseLen = false;
    $("#btn_search").click(function(e){
        var firstname = document.getElementById('firstname').value;
        var lastname = document.getElementById('lastname').value;
        $.ajax({
            type: "POST",
            url: 'search.php',
            data: $('#search_fields').serialize(),
            dataType: "json",
            xhrFields: {
                onprogress: function(e) {
                    var thisResponse, response = e.currentTarget.response;
                    if(lastResponseLen === false) {
                        thisResponse = response;
                        lastResponseLen = response.length;
                    } else {
                        thisResponse = response.substring(lastResponseLen);
                        lastResponseLen = response.length;
                    }
                    jsonResponse = JSON.parse(thisResponse);
                    document.getElementById('progress').innerHTML = 'Progress: ' + jsonResponse.msg;
                }
            },
            success: function(data) {
                console.log('done!');
                document.getElementById('progress').innerHTML = 'Complete!';
                document.getElementById('results').innerHTML = data;
            }
        });
        e.preventDefault();
    });
});
And here is the basic PHP server script:
<?php
function progress_msg($progress, $message){
    echo json_encode(array('progress' => $progress, 'msg' => $message));
    flush();
    ob_flush();
}
$array = array('msg' => 'hello world');
$count = 0;
while($count < 100){
    progress_message($count, "working....");
    $count += 10;
    sleep(2);
}
return json_encode($array);
?>
I made your code work; there were two errors. First, in your while loop the function name is incorrect; try this:
progress_msg($count, "working... ." . $count . "%");
Secondly, the very last line outputs nothing, so technically you never get a "successful" JSON return. Change the last line of your server script from:
return json_encode($array);
to:
echo json_encode($array);
UPDATE: Full working code with hacky solution:
Ajax:
$( document ).ready(function(e) {
    var jsonResponse = '', lastResponseLen = false;
    $("#btn_search").click(function(e){
        var firstname = document.getElementById('firstname').value;
        var lastname = document.getElementById('lastname').value;
        $.ajax({
            type: "POST",
            url: 'search.php',
            data: $('#search_fields').serialize(),
            xhrFields: {
                onprogress: function(e) {
                    var thisResponse, response = e.currentTarget.response;
                    if(lastResponseLen === false) {
                        thisResponse = response;
                        lastResponseLen = response.length;
                    } else {
                        thisResponse = response.substring(lastResponseLen);
                        lastResponseLen = response.length;
                    }
                    jsonResponse = JSON.parse(thisResponse);
                    document.getElementById('progress').innerHTML = 'Progress: ' + jsonResponse.msg;
                }
            },
            success: function(data) {
                console.log('done!');
                dataObjects = data.split("{");
                finalResult = "{" + dataObjects[dataObjects.length - 1];
                jsonResponse = JSON.parse(finalResult);
                document.getElementById('progress').innerHTML = 'Complete!';
                document.getElementById('results').innerHTML = jsonResponse.msg;
            }
        });
        e.preventDefault();
    });
});
Search.php:
<?php
function progress_msg($progress, $message){
    echo json_encode(array('progress' => $progress, 'msg' => $message));
    flush();
    ob_flush();
}
$array = array('msg' => 'hello world');
$count = 0;
while($count <= 100){
    progress_msg($count, "working... " . $count . "%");
    $count += 10;
    sleep(1);
}
ob_flush();
flush();
ob_end_clean();
echo json_encode($array);
?>
The problem with the "success" method of the AJAX call was that it couldn't interpret the returned data as JSON, since the full response was:
{"progress":0,"msg":"working... 0%"}{"progress":10,"msg":"working... 10%"}{"progress":20,"msg":"working... 20%"}{"progress":30,"msg":"working... 30%"}{"progress":40,"msg":"working... 40%"}{"progress":50,"msg":"working... 50%"}{"progress":60,"msg":"working... 60%"}{"progress":70,"msg":"working... 70%"}{"progress":80,"msg":"working... 80%"}{"progress":90,"msg":"working... 90%"}{"progress":100,"msg":"working... 100%"}{"msg":"hello world"}
This is not a valid JSON object, but multiple JSON objects one after another.
I tried removing all previous output with ob_end_clean();, but for a reason I can't figure out it didn't work on my setup. So instead, the hacky solution I came up with was to not treat the response as JSON (by removing the dataType parameter from the AJAX call) and simply split out the final JSON element with string operations...
There has got to be a simpler solution to this, but without a third-party jQuery library for XHR and Ajax I couldn't find one.
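For what it's worth, a simpler pattern (my own sketch, not part of the original answer) is to make the stream parseable by emitting newline-delimited JSON: every progress update and the final payload go out as one JSON object per line, so onprogress can parse the newest complete line and success only has to parse the last line instead of splitting on "{". The server side would then look roughly like this:

<?php
// search.php - newline-delimited JSON sketch: one JSON object per line
header('Content-Type: text/plain');       // not application/json: the body is several JSON lines, not one document

function send_line($data) {
    echo json_encode($data), "\n";        // the trailing newline is the record separator
    if (ob_get_level() > 0) {
        ob_flush();                       // only flush a buffer if one is actually open
    }
    flush();
}

for ($count = 0; $count <= 100; $count += 10) {
    send_line(array('progress' => $count, 'msg' => "working... {$count}%"));
    sleep(1);
}

send_line(array('msg' => 'hello world')); // final payload: the last line of the response
?>

On the client, the success handler would then do something like JSON.parse(data.trim().split("\n").pop()), and dataType: "json" stays off, since the body as a whole is still not a single JSON document.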

Ajax failing multiple times before it works

I have a very annoying bug. My AJAX queries fail anywhere up to 20-30 times with 500 errors on both GET and POST, and then all of a sudden they work.
It seems completely random: sometimes I load the page and it works fine for a full day of use, but the next day it fails and I have to catch and retry up to 30 times to get it to work.
AJAX
function getData(){
    //Get Data and build contributions Table
    $.ajax({
        url: 'assets/processes/dash/support/get-data.php', // URL of php command
        tryCount : 0,
        retryLimit : 3,
        type: 'POST', //TYPE
        data: {'id': id}, //Variables in JSON FORMAT
        success: function(results) { //SUCCESSFUL REQUEST FUNCTION
            var result = $.parseJSON(results);
            console.log(result);
        },
        error : function(xhr, textStatus, errorThrown ) {
            if (textStatus == 'timeout') {
                this.tryCount++;
                if (this.tryCount <= this.retryLimit) {
                    $.ajax(this);
                    return;
                }
                return;
            }
            if (xhr.status == 500) {
                $.ajax(this);
                return;
            } else {
                //handle error
            }
        }
    }); // end ajax call
}
PHP
<?php
include "../../../includes/database.php";
$totalOpen = $db->pdo_query_assoc("xdGetTotalUnresolvedTickets 1,0,0,0,0,0");
$totalWaitState = $db->pdo_query_assoc("xdGetWaitState 1,0,0,0,0,0");
$nonWaitState = $totalOpen['xdGetTotalUnresolvedTickets'] - $totalWaitState['xsGetWaitState'];
$getStaff = $db->query("SELECT * FROM user WHERE role = 'Technical Support'");
$getStaffArray = array();
foreach($getStaff as $key => $value){
    $sqlUsername = $value['sqlName'];
    $pdo_today = $db->pdo_query_assoc("xsGetTotalResolvedTicketsToday '$sqlUsername'");
    $start = date('Y-m-d ', strtotime('-7 days'));
    $end = date('Y-m-d ', strtotime('+1 days'));
    $pdo_last_week = $db->pdo_query_assoc("xsGetTotalResolvedTicketsToday '$sqlUsername','$start','$end'");
    $getCompletionRate = $db->pdo_query_assoc('xStaffTicketCompletionRate "'.$sqlUsername.'"');
    $getWeeklyRate = $db->pdo_query_assoc('xLastSevenPreviousSeven "'.$sqlUsername.'"');
    $getStaffArray[$key]['staffName'] = $value['user'];
    $getStaffArray[$key]['staffFullName'] = $value['name'];
    $getStaffArray[$key]['colourScheme'] = $value['colourScheme'];
    $getStaffArray[$key]['today'] = $pdo_today['xsGetTotalResolvedTicketsToday'];
    $getStaffArray[$key]['thisWeek'] = $pdo_last_week['xsGetTotalResolvedTicketsToday'];
    $getStaffArray[$key]['completionRate'] = $getCompletionRate['Level'];
    $getStaffArray[$key]['thisTimeLastWeek'] = round($getWeeklyRate['Percentage'], 1);
    $getStaffArray[$key]['weeklyRateLevel'] = $getWeeklyRate['Level'];
    $getStaffArray[$key]['weeklyPrevious7'] = $getWeeklyRate['Previous 7 Days'];
    $getStaffArray[$key]['weeklyEarlier7'] = $getWeeklyRate['Earlier 7 Days'];
}
$tickets = array('totalCurrent' => $totalOpen, 'totalWaitState' => $totalWaitState, 'totalNoneWaitState' => $nonWaitState);
$allData = array('totalTickets' => $tickets, 'staff' => $getStaffArray);
echo json_encode($allData, JSON_FORCE_OBJECT);
?>
I know that at the moment the tryCount doesn't work properly: instead of trying 3 times it retries an unlimited number of times until it works. Does anyone have any idea why this could be happening?
They always fail on the PHP file, as if the query can't be executed.
ERROR
GET http://mysite/get-events.php 500 (Internal Server Error)
jquery.min.js:4
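Since the failures surface as bare 500s from the PHP script, one useful first step (my suggestion, nothing from the original post) is to capture the real error instead of retrying blindly. Below is a minimal sketch of a temporary wrapper, assuming PHP 7+ and that the original get-data.php sits next to it; the file name get-data-debug.php is hypothetical.

<?php
// get-data-debug.php - temporary wrapper around the original script to capture the real error
ini_set('display_errors', '0');   // keep notices and warnings out of the JSON body
ini_set('log_errors', '1');       // send failures to the PHP/FPM error log instead

try {
    require __DIR__ . '/get-data.php';          // run the original script unchanged
} catch (Throwable $e) {                        // PHP 7+: catches exceptions and most fatal errors
    error_log('get-data failed: ' . $e->getMessage() . ' @ ' . $e->getFile() . ':' . $e->getLine());
    // if the script already produced output these headers may be too late, but the log entry still tells the story
    http_response_code(500);
    header('Content-Type: application/json');
    echo json_encode(array('error' => $e->getMessage()));
}

Pointing the url in getData() at this wrapper for a while should show whether the include, the stored-procedure calls, or something else is throwing intermittently; the retry loop can then go back to a properly bounded tryCount once the root cause is fixed.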

Get 2 variables from Ajax call in php

I am trying to get 2 variables from AJAX in PHP. With one variable it works fine. I'm new to AJAX, so I am not sure how to include a second variable. As of now I am getting the msg_count without any issues. My AJAX script is below:
function addmsg(type, msg) {
    $('#msg_count').html(msg);
}
function waitForMsg() {
    $.ajax({
        type: "GET",
        url: "notification/select.php",
        async: true,
        cache: false,
        timeout: 50000,
        success: function(data) {
            addmsg("new", data);
            setTimeout(waitForMsg, 1000);
        },
        error: function(XMLHttpRequest, textStatus, errorThrown) {
            addmsg("error", textStatus + " (" + errorThrown + ")");
            setTimeout(waitForMsg, 15000);
        }
    });
}
$(document).ready(function() {
    waitForMsg();
});
select.php script is below:
$sql = "SELECT * from notification where tousername='$tousername' and isread = 0";
$result = $con->query($sql);
$row = $result->fetch_assoc();
$count = $result->num_rows;
echo $count;
$not=$row['notification'];
echo $not;
I am able to pass $count properly. I need to pass $not to the AJAX as well. How do I do that?
My edited PHP script, using a WHILE loop, is as follows:
$result = mysqli_query($con, "SELECT * from notification where tousername='$tousername' and isread = 0");
while($row = mysqli_fetch_array($result)) {
    $count = $result->num_rows;
    $not = $row['notification_msg'];
    $res = [];
    $res['count'] = $count;
    $res['not'] = $not;
    echo json_encode($res);
}
Like #guradio said, set dataType: 'json' inside the ajax options and json_encode the data that you want to pass into the success block, like the following code:
$.ajax({
    ....
    ....
    dataType : 'json', // added here
    success : function ( data ) {
        // access data from response
        // access it using data.count, data.not
        console.log(data)
        // just called like original code
        // passed on result `data`
        addmsg( type, data );
        // the rest of the code
    }
    ...
});
function addmsg(type, msg){
    // access it using msg.count, msg.not
    console.log(msg.count)
    $('#msg_count').html(msg);
}
In PHP:
$sql = "SELECT * from notification where tousername='$tousername' and isread = 0";
$result = $con->query($sql);
$row = $result->fetch_assoc();
$count = $result->num_rows;
$not=$row['notification'];
// added here
echo json_encode( array( 'count' => $count, 'not' => $not ) );
Edited: This depends on how you want to store the data and populate it.
// defined container outside loop
$res = [];
while($row = mysqli_fetch_array($result)) {
    $count = $result->num_rows;
    $not = $row['notification_msg'];
    array_push( $res, array( 'count' => $count, 'not' => $not ) );
}
echo json_encode($res);
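If the goal is one unread count plus the list of messages (rather than repeating the count inside every element), another shape worth considering is sketched below. It assumes the same $con mysqli connection and $tousername as in the question; the prepared statement and the 'notifications' key are my own choices.

// one JSON object: the count once, plus all unread messages in a list
$stmt = $con->prepare("SELECT notification_msg FROM notification WHERE tousername = ? AND isread = 0");
$stmt->bind_param("s", $tousername);
$stmt->execute();
$result = $stmt->get_result();          // requires the mysqlnd driver for get_result()

$messages = array();
while ($row = $result->fetch_assoc()) {
    $messages[] = $row['notification_msg'];
}

header('Content-Type: application/json');
echo json_encode(array(
    'count'         => count($messages),   // derived once, not per row
    'notifications' => $messages
));

With dataType: 'json' on the client this arrives as data.count and data.notifications.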
Suggestion (credited to guradio):
Note that it is not necessary to add async: true inside the ajax options, as AJAX is asynchronous by default (the default value is true); only set it to false if you really need to, which is not recommended.

Ajax call doesn't work from iPhone app and arduino

I've created an Arduino project which sends coordinates to a URL. The URL does some AJAX calls. In the browser it works fine, but when I try it with the Arduino it doesn't work. So I tried to do the same thing with an iOS app and got the same problem. This is the code on the page that the Arduino and the iOS app request.
var directionsService = new google.maps.DirectionsService();
var base_url = window.location;
var received_data = <?php echo json_encode($received_data); ?>;
$.ajax({
    url: 'http://gps-tracker.domain.nl/_api/handler.php',
    data: { action: 'post', device_id: received_data['device_id'] },
    type: 'GET',
    dataType: "jsonp",
    jsonp: "callback",
    success: function (response) {
        var error = [];
        var total = response.length;
        for (var type in response) {
            if (response[type].types == 'area') {
                var x = checkInsideCircle(response[type].longitude, response[type].latitude, received_data['longitude'], received_data['latitude'], response[type].reach / 1000);
                if (x == false) {
                    // Outside
                    error.push(true);
                } else {
                    // Inside
                    error.push(false);
                }
            } else if (response[type].types == 'route') {
                // Check route
                checkOnRoute(response[type].start_latitude, response[type].start_longitude, response[type].end_latitude, response[type].end_longitude, response[type].type, response[type]['reach'], type, function(result) {
                    error.push(result);
                    if (error.length == total) {
                        if (error.indexOf(false) >= 0) {
                            // Device is inside route or area
                            outside = false;
                        } else {
                            // Send data to database
                            $.ajax({
                                url: 'http://gps-tracker.domain.nl/_api/handler.php',
                                data: { action: 'post', device_id: received_data['device_id'], longitude: received_data['longitude'], latitude: received_data['latitude'] },
                                type: 'GET',
                                dataType: 'json',
                                success: function (response) {
                                    console.log('good');
                                },
                                error: function(jq, status, message) {
                                    alert('A jQuery error has occurred. Status: ' + status + ' - Message: ' + message);
                                }
                            });
                        }
                    }
                });
            }
        }
    },
    error: function(jq, status, message) {
        alert('A jQuery error has occurred. Status: ' + status + ' - Message: ' + message);
    }
});
Here is the code from the handler.php file that the AJAX call requests.
$action = isset($_REQUEST['action']) ? $_REQUEST['action'] : false;
// Switch actions
switch($action) {
    case 'get':
        $callback = 'callback';
        if(isset($_GET['callback'])){
            $callback = $_GET['callback'];
        }
        $routes = ORM::for_table('gps_tracker_route')
            ->inner_join('gps_tracker_device', array('gps_tracker_device.device_id', '=', 'gps_tracker_route.device_id'))
            ->where('gps_tracker_route.device_id', $_GET['device_id'])
            ->where('gps_tracker_device.device_id', $_GET['device_id']);
        if($routes = $routes->find_many()){
            foreach($routes as $k => $v){
                $v = $v->as_array();
                if($v['status'] == 'on' or strtotime(date('Y-m-d H:i:s')) > strtotime($v['start_time']) and strtotime(date('Y-m-d H:i:s')) < strtotime($v['end_time'])){
                    $response1[$k] = $v;
                    $response1[$k]['types'] = 'route';
                }
            }
        }
        $area = ORM::for_table('gps_tracker_area')
            ->inner_join('gps_tracker_device', array('gps_tracker_device.device_id', '=', 'gps_tracker_area.device_id'))
            ->where('gps_tracker_area.device_id', $_GET['device_id'])
            ->where('gps_tracker_device.device_id', $_GET['device_id']);
        if($area = $area->find_many()){
            foreach($area as $k => $v){
                $v = $v->as_array();
                if($v['status'] == 'on' or strtotime(date('Y-m-d H:i:s')) > strtotime($v['start_time']) and strtotime(date('Y-m-d H:i:s')) < strtotime($v['end_time'])){
                    $response2[$k] = $v;
                    $response2[$k]['types'] = 'area';
                }
            }
        }
        if(isset($response1) and isset($response2)){
            $response = array_merge($response1, $response2);
        }elseif(isset($response1)){
            $response = $response1;
        }else{
            $response = $response2;
        }
        if ( isset($response) ) {
            if ( is_array($response) ) {
                if (function_exists('json_encode')) {
                    header('Content-Type: application/json');
                    echo $callback.'(' . json_encode($response) . ')';
                } else {
                    include( ABSOLUTE_PATH . '/classes/json.class.php');
                    $json = new Services_JSON();
                    echo $json->encode($response);
                }
            } else {
                echo $response;
            }
            exit(0);
        }else{
            exit();
        }
        break;
    case 'post':
        $_GET['timestamp'] = date("Y-m-d H:i:s");
        $record = ORM::for_table('gps_tracker_device_logging')->create($_GET);
        $record->save();
        $item = ORM::for_table('gps_tracker_device_logging')
            ->where('id', $record->id);
        if($item = $item->find_one()){
            $item = $item->as_array();
            echo json_encode($item);
        }
        break;
    default:
        die('invalid call');
}
Can someone help me?
EDIT
I think it is something to do with JavaScript. I don't know whether it is possible to use JavaScript when a device like an Arduino makes an HTTP request to a server. Does anyone know?
I think it's because you need a web browser that supports JavaScript.
I don't work with Arduino, but from what I know it does not have a "real" web browser: it can only pull/download data, it cannot execute the JS part.
For JS to work you need something to run it. That is why it works in the browser.
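A practical workaround (my own sketch, not something from the answer above) is to move the geofence test server-side, so the Arduino or the iOS app only has to make one plain HTTP request and no JavaScript has to run anywhere. Assuming the same Idiorm-style ORM bootstrap that handler.php already loads, and a hypothetical endpoint check.php, the circle test done by checkInsideCircle() could look roughly like this in PHP:

<?php
// check.php - hypothetical endpoint: the device calls it with device_id, latitude and longitude,
// and the circle test from the client-side checkInsideCircle() happens here instead.
// Assumes the same ORM/DB bootstrap as handler.php has already been included.

function distance_km($lat1, $lon1, $lat2, $lon2) {
    // Haversine great-circle distance in kilometres
    $r = 6371.0;
    $dLat = deg2rad($lat2 - $lat1);
    $dLon = deg2rad($lon2 - $lon1);
    $a = sin($dLat / 2) ** 2 + cos(deg2rad($lat1)) * cos(deg2rad($lat2)) * sin($dLon / 2) ** 2;
    return 2 * $r * asin(sqrt($a));
}

$lat = (float)$_GET['latitude'];
$lon = (float)$_GET['longitude'];

$areas = ORM::for_table('gps_tracker_area')
    ->where('device_id', $_GET['device_id'])
    ->find_many();

$inside = false;
foreach ($areas as $area) {
    $a = $area->as_array();
    // reach is assumed to be stored in metres, since the client code divides it by 1000
    if (distance_km($lat, $lon, $a['latitude'], $a['longitude']) <= $a['reach'] / 1000) {
        $inside = true;
        break;
    }
}

header('Content-Type: application/json');
echo json_encode(array('inside' => $inside));

The route check could be ported the same way; the device then just reads a small JSON reply instead of loading a page full of jQuery it cannot execute.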

How to run 2 PHP scripts simultaneously (non-blocking) to monitor a popen command?

I have a script launching a command like this:
7za a -t7z -mx9 backup.7z "H:\Informatique\*"
And I would like to display the progress of the compression on a page using jQuery and PHP.
The PHP script running this command looks like this:
if( ($fp = popen("7za a -t7z ".$GLOBALS["backup_compression"]." \"".$backuplocation.$backupname."\" \"".$pathtobackup."\"", "r")) ) {
    while( !feof($fp) ){
        $fread = fread($fp, 256);
        $line_array = preg_split('/\n/', $fread);
        $num_lines = count($line_array);
        $_SESSION['job'][$jobid]['currentfile'] = $_SESSION['job'][$jobid]['currentfile'] + $num_lines;
        $num_lines = 0;
        flush();
    }
    pclose($fp);
}
jQuery calls the 7za script, then jQuery calls the listener (listener.php) every 1000 ms. The listener.php page contains the following code:
session_start();
$jobid = $_GET['jobid'];
if(!isset($_SESSION['job'][$jobid])) { $arr = array("error" => "Job not found"); echo json_encode($arr); exit(); }
$arr = array(
    "curfile" => $_SESSION['job'][$jobid]['currentfile'],
    "totalfiles" => $_SESSION['job'][$jobid]['totalfiles'],
);
echo json_encode($arr);
$jobid = null;
$arr = null;
exit();
After the jQuery call to the listener completes and we get a response from the server, we display the information with something straightforward like: $("currentfile").text(data['curfile']);
The problem is that the listener gets stuck waiting for the first script to complete, and that is not the listener's job. It ends up listening only at the end of everything, when the whole point of listening is to know what is happening while it happens. :P
Do you have any idea what is going on and how I can fix this problem?
Or maybe you can suggest a new approach to the problem?
As always, any suggestions are welcome.
Thank you.
EDIT
jQuery script:
function backup_launch(jobid) {
    x('jobid: ' + jobid + ' on state ' + state);
    x('Listener launched');
    listen(jobid);
    timeout = setTimeout("listen('" + jobid + "')", 500);
    $.ajax({
        url: 'backup.manager.php?json&jobid=' + jobid + '&state=' + state,
        dataType: 'json',
        success: function(data)
        {
            state = 3;
        }
    });
}
function listen(jobid) {
    $.ajax({
        url: 'backup.listener.php?json&jobid=' + jobid,
        dataType: 'json',
        success: function(data)
        {
            var curfile = data['curfile'];
            var totalfiles = data['totalfiles'];
            var p = curfile * 100 / totalfiles;
            x('File ' + curfile + ' out of ' + totalfiles + ' progress%: ' + p);
            timeout = setTimeout("listen('" + jobid + "')", 500);
        }
    });
}
EDIT 2
I found Gearman (http://gearman.org/), but I don't know how to implement it at all, and my solution needs to be portable/standalone... I'll try to investigate that.
EDIT 3
Full code for the backup.manager.php page. The script sends the response right away but does the job in the background.
The listen.php page still waits for the command to finish before returning any results.
$jobid = isset($_GET['jobid']) ? $_GET['jobid'] : 0;
//Make sure jobid is specified
if($jobid == 0) { return; }
header("Connection: close");
#ob_end_clean();
ignore_user_abort();
ob_start();
echo 'Launched in background';
$size = ob_get_length();
header("Content-Length: ".$size);
ob_end_flush();
flush();
$_SESSION['job'][$jobid]['currentfile'] = 0;
// 3. When app approves backup,
//    - Write infos to DB
//    - Zip all files into 1 backup file
$datebackup = time();
$bckpstatus = 1; //In progress
$pathtobackup = $_SESSION['job'][$jobid]['path'];
/*
$query = "INSERT INTO backups (watchID, path, datebackup, checksum, sizeori, sizebackup, bckpcomplete)
          VALUES ($watchID, '{$path}', '{$datebackup}', '', '{$files_totalsize}', '', '{$bckpstatus}')";
$sth = $db->prepare($query);
$db->beginTransaction();
$sth->execute();
$db->commit();
$sth->closeCursor();
*/
$backupname = $jobid.".".$GLOBALS["backup_ext"];
$backuplocation = "D:\\";
if( ($fp = popen("7za a -t7z ".$GLOBALS["backup_compression"]." \"".$backuplocation.$backupname."\" \"".$pathtobackup."\"", "r")) ) {
    while( !feof($fp) ){
        $fread = fread($fp, 256);
        $line_array = preg_split('/\n/', $fread);
        $num_lines = count($line_array);
        $_SESSION['job'][$jobid]['currentfile'] = $_SESSION['job'][$jobid]['currentfile'] + $num_lines;
        $num_lines = 0;
        sleep(1);
        flush();
    }
    pclose($fp);
}
Jeremy,
A couple of months ago, I answered a similar question about running server-side batch jobs in a *NIX/PHP environment. If I understand correctly, your requirement is different, but there might be something in that answer which will help.
Run a batch file from my website
EDIT
Here's a modified version of your client-side code. You will see that the main things I have changed are:
to move listen(jobid); inside backup_launch's success handler.
to add error handlers so you can observe errors.
Everything else is just a matter of programming style.
function backup_launch(jobid) {
    x(['jobid: ' + jobid, 'on state', state].join(' '));
    $.ajax({
        url: 'backup.manager.php',
        data: {
            'json': 1,
            'jobid': jobid,
            'state': state
        },
        dataType: 'json',
        success: function(data) {
            state = 3;
            x("Job started: " + jobid);
            listen(jobid);
            x("Listener launched: " + jobid);
        },
        error: function(jqXHR, textStatus, errorThrown) {
            x(["backup.manager error", textStatus, errorThrown, jobid].join(": "));
        }
    });
}
function listen(jobid) {
    $.ajax({
        url: 'backup.listener.php',
        data: {
            'json': 1,
            'jobid': jobid
        },
        dataType: 'json',
        success: function(data) {
            var curfile = data.curfile;
            var totalfiles = data.totalfiles;
            var p = curfile * 100 / totalfiles;
            x(['File', curfile, 'out of', totalfiles, 'progress%:', p].join(' '));
            timeout = setTimeout(function() {
                listen(jobid);
            }, 500);
        },
        error: function(jqXHR, textStatus, errorThrown) {
            x(["Listener error", textStatus, errorThrown, jobid].join(": "));
        }
    });
}
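One server-side detail worth adding to this (my addition, not part of the answer above): with PHP's default file-based session handler, a request keeps an exclusive lock on the session file until it finishes or calls session_write_close(), so backup.listener.php will sit and wait for backup.manager.php simply because both use the same session. A hedged sketch of one way around it is to release the session early in the manager and publish progress through a small per-job temp file instead of $_SESSION; the backupjob_*.json path is my own convention.

<?php
// backup.manager.php (fragment) - release the session lock before the long 7za run
session_start();
$jobid = isset($_GET['jobid']) ? $_GET['jobid'] : 0;
$progressfile = sys_get_temp_dir() . "/backupjob_" . md5($jobid) . ".json";

// copy what we need out of the session, then give the lock back
$pathtobackup   = $_SESSION['job'][$jobid]['path'];
$totalfiles     = $_SESSION['job'][$jobid]['totalfiles'];
$backupname     = $_SESSION['job'][$jobid]['bname'];
$backuplocation = $_SESSION['job'][$jobid]['blocation'];
session_write_close();   // from here on, backup.listener.php is no longer blocked by this request

$currentfile = 0;
if (($fp = popen("7za a -t7z ".$GLOBALS["backup_compression"]." \"".$backuplocation.$backupname."\" \"".$pathtobackup."\"", "r"))) {
    while (!feof($fp)) {
        $chunk = fread($fp, 256);
        $currentfile += count(preg_split('/\n/', $chunk));
        // progress lives in a tiny per-job file instead of $_SESSION
        file_put_contents($progressfile, json_encode(array(
            'curfile'    => $currentfile,
            'totalfiles' => $totalfiles
        )), LOCK_EX);
    }
    pclose($fp);
}

// backup.listener.php (fragment) - no session needed at all
$jobid = isset($_GET['jobid']) ? $_GET['jobid'] : 0;
$progressfile = sys_get_temp_dir() . "/backupjob_" . md5($jobid) . ".json";
if (!is_file($progressfile)) {
    echo json_encode(array("error" => "Job not found"));
    exit();
}
echo file_get_contents($progressfile);
exit();

If the progress really has to stay in $_SESSION, the alternative is to re-open and close the session around each update, but the temp file keeps the listener completely lock-free.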
I've decided to call the command for each individual file.
It will be slower, but it is safer to manage: I can tell which file caused an error and which files were properly added to the archive.
I'll try to find something faster in case the directory to archive contains 10,000 small files (icons, for example); maybe 5 files at a time instead of one (a rough sketch of that idea follows below).
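For the record, here is a hedged sketch of that batching idea, not code from the question: pass several files to a single 7za invocation by chunking the list. It assumes a flat $filepaths array (the per-file paths already collected in state 1), an $archive path built from the session's blocation and bname, and the same $GLOBALS["backup_compression"] and $_SESSION bookkeeping as above.

<?php
// Hypothetical batching sketch: add files to the archive 5 at a time instead of one per 7za call.
$batchsize = 5;
foreach (array_chunk($filepaths, $batchsize) as $i => $batch) {
    // quote every path so spaces and special characters survive the shell
    $args = implode(' ', array_map('escapeshellarg', $batch));
    $cmd  = "7za a -t7z " . $GLOBALS["backup_compression"] . " " . escapeshellarg($archive) . " " . $args;

    exec($cmd . " 2>&1", $output, $exitcode);
    if ($exitcode !== 0) {
        // 7za returns non-zero when something in this batch failed or was skipped; log and keep going
        error_log("7za batch " . $i . " failed (exit " . $exitcode . "): " . implode("\n", $output));
    }

    $_SESSION['job'][$jobid]['currentfile'] += count($batch);   // progress counted in files, not output lines
}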
The code below was intended for testing purposes; it is not optimized for production at all.
index.html jquery script:
var state = 0;
var jobid = '';
var timeout;
var jobfiles;
var jobpaths = [];

$(document).ready(function() {
    $("button").click(function(e) {
        e.preventDefault();
        console.log('Sending');
        var path = $("input").val();
        x('state 1 getting info - infpb');
        state = 1;
        $.ajax({
            url: 'backup.manager.php?json&path=' + encodeURI(path) + '&type=0&state=' + state,
            dataType: 'json',
            success: function(data)
            {
                jobid = data['stats']['jobid'];
                jobfiles = data['stats']['totalfiles'];
                var jobsize = data['stats']['totalsize'];
                for(var i = 0, len = jobfiles; i < len; i++) {
                    jobpaths.push(data[i]['File']);
                }
                state = 2;
                x('state 2 - infpb stop (Information retrieved, launch backup) ' + jobfiles + ' files with ' + jobsize + ' bytes');
                backup_launch(jobid);
            }
        });
    });
});

function x(x) {
    $("#l").append(x + "<br>");
}

var curfileid = 0;

function backup_launch(jobid) {
    x('jobid: ' + jobid + ' on state ' + state);
    $.ajax({
        url: 'backup.manager.php',
        data: {
            'json': 1,
            'jobid': jobid,
            'state': state,
            'path': jobpaths[curfileid]
        },
        dataType: 'json',
        success: function(data)
        {
            if(curfileid < jobfiles) {
                x((curfileid + 1) + ' / ' + jobfiles);
                curfileid++;
                backup_launch(jobid);
            }
        },
        error: function(jqXHR, textStatus, errorThrown) {
            x(["backup.manager error", textStatus, errorThrown, jobid].join(": "));
        }
    });
}

function listen(jobid) {
    $.ajax({
        url: 'backup.listener.php?json&jobid=' + jobid,
        dataType: 'json',
        success: function(data)
        {
            var curfile = data['curfile'];
            var totalfiles = data['totalfiles'];
            var p = curfile * 100 / totalfiles;
            x('File ' + curfile + ' out of ' + totalfiles + ' progress%: ' + p);
            timeout = setTimeout("listen('" + jobid + "')", 500);
        }
    });
}
backup.manager.php
set_time_limit(0);
require('../functions.php');
session_start();

$keepCPUlow_lastlookup = time();
$keepCPUlow_mindelay = 60;
function keepCPUlow() {
    global $keepCPUlow_lastlookup, $keepCPUlow_mindelay;
    if((time() - $keepCPUlow_lastlookup) > $keepCPUlow_mindelay) {
        $keepCPUlow_lastlookup = time();
        getSysload(75, 1000); // Max %, wait time in ms
    }
}

$state = isset($_GET['state']) ? $_GET['state'] : 0;

if($state == '1') {
    $json = isset($_GET['json']) ? true : false; // Result should be in json format
    $path = isset($_GET['path']) ? $_GET['path'] : ''; // Path of the file or folder to backup
    $type = isset($_GET['type']) ? $_GET['type'] : ''; // Type - not very useful, for now

    //0. Assign a jobid for this job, it will help retrieve realtime information about this task
    $jobid = hash('md4', (time().uniqid().session_id()));
    //Store the current status (0) job not started
    $_SESSION['job'][$jobid]['status'] = 0; //Not started... yet

    // 1. Retrieve list of files and stats
    $fileslist = array(); //Will contain the list of files
    $files_totalsize = 0;  // Total size of files

    // Check if file or folder
    if(is_dir($path)) {
        //Path is a folder, get the list of files
        $files = getFilelist($path);
        foreach($files as $file) { //For each file
            if(!is_dir($file['File'])) { //That is not a directory
                $files_totalsize = $files_totalsize + $file['Filesize']; //Increment total size
                $cpumon = keepCPUlow(); //if($cpumon[1]) echo ">CPU BURN".$cpumon[0]."<";
            }
        }
        $files_total = count($files); // Number of files
    } else {
        $files_totalsize = $files_totalsize + getFilesize($path);
        $files_total = 1;
    }

    $files['stats'] = array("totalfiles" => $files_total, "jobid" => $jobid, "totalsize" => $files_totalsize);

    //Store infos in session
    $_SESSION['job'][$jobid]['totalfiles'] = $files_total;
    $_SESSION['job'][$jobid]['totalsize'] = $files_totalsize;
    $_SESSION['job'][$jobid]['path'] = is_dir($path) ? $path.'\\*' : $path;
    $_SESSION['job'][$jobid]['currentfile'] = 0;
    $_SESSION['job'][$jobid]['currentfile_path'] = '';
    $_SESSION['job'][$jobid]['bname'] = "SafeGuard_".$jobid.".".$GLOBALS["backup_ext"];
    $_SESSION['job'][$jobid]['blocation'] = "D:\\";

    // 2. Return to app and wait for ready confirmation
    if(isset($_GET['json'])) {
        echo json_encode($files);
    }
    exit();
}
else if($state == '2') {
    $jobid = isset($_GET['jobid']) ? $_GET['jobid'] : 0;
    $_SESSION['job'][$jobid]['currentfile'] = 0;

    // 3. When app approves backup,
    //    - Write infos to DB
    //    - Zip all files into 1 backup file
    $datebackup = time();
    $bckpstatus = 1; //In progress
    //$pathtobackup = $_SESSION['job'][$jobid]['path'];
    $pathtobackup = isset($_GET['path']) ? $_GET['path'] : "--";
    $backupname = $_SESSION['job'][$jobid]['bname'];
    $backuplocation = $_SESSION['job'][$jobid]['blocation'];
    /*
    $query = "INSERT INTO backups (watchID, path, datebackup, checksum, sizeori, sizebackup, bckpcomplete)
              VALUES ($watchID, '{$path}', '{$datebackup}', '', '{$files_totalsize}', '', '{$bckpstatus}')";
    $sth = $db->prepare($query);
    $db->beginTransaction();
    $sth->execute();
    $db->commit();
    $sth->closeCursor();
    */
    if( ($fp = popen("7za a -t7z ".$GLOBALS["backup_compression"]." \"".$backuplocation.$backupname."\" \"".$pathtobackup."\"", "r")) ) {
        while( !feof($fp) ){
            fread($fp, 256);
            $num_lines = 1;
            $_SESSION['job'][$jobid]['currentfile'] = $_SESSION['job'][$jobid]['currentfile'] + $num_lines;
            $num_lines = 0;
            flush();
            echo '1';
        }
        pclose($fp);
    }
}
