I have written a PHP script to import a large amount of data. The import is triggered through an Ajax call, and the Ajax request then keeps waiting for the server response. Since we are on a dedicated server, timeouts are not an issue.
The problem is that we need a way to terminate the import process, for example a stop button on the client side. We thought that if we killed the waiting Ajax call, the process on the server would also stop, as there would be no request left to serve. Unfortunately that is not the case: the script keeps executing on the server side even after the Ajax request has been aborted on the client.
Secondly, we use PHP sessions in this project. Say the cancel button fires an Ajax call to another script on the server to stop the process: how would that request even reach the server while the first Ajax request is still pending? PHP/Apache will hold the second request until the first request cycle is completed, because the first script holds the session lock.
Note: our project architecture requires session_start() on every page. Any guidance on these issues would be appreciated.
You can write the process ID to a file, the database, or a cache at the start of the script. You can get it with getmypid() (http://php.net/manual/en/function.getmypid.php). This assumes each script runs in its own process.
The kill script (which must not use the locked session) can then read that process ID and try to kill the process.
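A minimal sketch of both sides (the PID file path is an assumption, and posix_kill() requires the posix extension):

<?php
// importer.php: record our process ID so a separate request can stop us
file_put_contents('/tmp/import.pid', getmypid());
// ... long-running import work ...

<?php
// kill.php: must not call session_start(), or it would block on the
// session lock held by the importer
$pid = (int) file_get_contents('/tmp/import.pid');
if ($pid > 0) {
    posix_kill($pid, 15); // 15 = SIGTERM
}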
Be careful with long-running processes in PHP, as the Zend Engine and its garbage collector are not well suited to them.
So I strongly suggest using a proper job manager like Gearman, which has a PHP extension. A job manager gives you full control over each process: you can start and stop processes and tasks.
Another option is to use a message queue, like AMQP, to handle these tasks more cleanly. Which one is more suitable for your use case, I'll let you decide.
How about setting a time limit slightly higher than your AJAX timeout in your PHP script? This should kill the PHP script if it runs over time. Something like this:
<?php
set_time_limit(20); // kill the script after 20 seconds of execution time
$i = 1;
while ($i <= 10) {
    echo "i=$i ";
    sleep(100);
    $i++;
}
?>
Source: http://php.net/manual/en/function.set-time-limit.php
Once a script is running, it can only be stopped by ending the PHP process that is executing it. One possibility is to use the session to store a "continue" condition that another script can update.
For example:
Script 1 is the worker (importer)
Script 2 is called repeatedly via Ajax for as long as the importer should keep working.
Scripts 1 and 2 share, let's say, $_SESSION['lastPing'].
So:
Script 2 sets $_SESSION['lastPing'] = time(); on each call.
Script 1 checks the ping on each iteration and aborts once it goes stale: if (time() - $_SESSION['lastPing'] > 30) { die(); }
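One caveat: PHP locks the session file while a script has it open, so script 1 must release the lock between checks or script 2's pings would block. A minimal sketch of script 1's loop (getNextChunk() and importChunk() are hypothetical stand-ins for your import steps):

<?php
session_start();
$_SESSION['lastPing'] = time();
session_write_close(); // release the session lock so script 2 can run

while ($rows = getNextChunk()) { // hypothetical: read the next batch to import
    importChunk($rows);          // hypothetical: process the batch
    session_start();             // re-open the session to read the latest ping
    $lastPing = $_SESSION['lastPing'];
    session_write_close();
    if (time() - $lastPing > 30) {
        die('No ping received for 30 seconds, aborting import.');
    }
}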
You might be able to handle this using the proc_ functions. A possible outline of the steps:
Create a unique token and pass it to the server along with the order to begin the import.
When the order to import is received (via AJAX):
a) Store a record showing that this token is being processed (in a file, the DB, or memcached).
b) Run the import script using proc_open.
c) Begin a polling loop, checking the record from (2a) to see whether this token is still in "processing" status. If not, call proc_terminate and exit the loop.
If the order to stop the import is given (in a separate AJAX call), update the persisted record to indicate that the particular token should be stopped; this will be picked up in (2c). A rough sketch follows below.
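A rough sketch of steps 2a-2c, assuming a simple status file keyed by the token (descriptor handling and error checks are omitted):

<?php
$token = $_POST['token'];
$statusFile = "/tmp/import-$token.status";
file_put_contents($statusFile, 'running'); // step 2a

$process = proc_open('php import.php', [], $pipes); // step 2b

while (true) { // step 2c: poll until the import ends or a stop order arrives
    $status = proc_get_status($process);
    if (!$status['running']) {
        break; // the import finished on its own
    }
    if (file_get_contents($statusFile) !== 'running') {
        proc_terminate($process); // the stop order was picked up
        break;
    }
    sleep(1);
}

The stop script then only needs to write a different value to the same status file, e.g. file_put_contents('/tmp/import-TOKEN.status', 'stop');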
Go to the following link for the exact solution to your problem:
PHP auto-kill a script if the HTTP request is cancelled/closed
Related
I've got the following scenario: multiple users on a local network access a web application coded in PHP and hosted on an IIS server. On every page load, 3 Ajax calls to 3 separate PHP scripts are performed, and these calls repeat every X minutes (timed jQuery). For every Ajax call of every connected user, a php-cgi process is opened on the server, quickly adding up to 20 or so processes. The problem is that after the Ajax call these processes remain open, using a large amount of memory on the server and degrading performance (at times grinding it to a complete halt).
All the PHP scripts are called via the jQuery $.post function, perform one or more queries on an MSSQL database, and end by echoing a JSON-encoded object or array. Is there a way to make these processes close after the PHP script has finished executing? I would like to avoid making the calls serial instead of parallel.
Any help is strongly appreciated.
Thanks
I don't know if you have already found a solution, but anyway:
Try adding a
die();
?>
at the end of the PHP scripts that are called. That kills each script once it has completed execution. As each call creates its own process, there would be no issues even if things are done in parallel.
Hi, I am new to PHP and have no idea if what I am about to ask is even possible or makes sense, but here goes.
I want to execute a PHP script as if it were a standalone application on the web server. What I am trying to implement is this: when a customer purchases something on the website and sees the payment confirmation notice, they should be able to close the browser window or log off without affecting the big order-generation process that starts once they reach the page confirming that the payment was successful.
Right now I use AJAX to call my after-payment processing PHP script, and I have set that script to ignore any user abort.
This is the page that tells the user that the payment was received successfully.
thankyou.php
This is the page that performs the processing that needs to be done only after successful receipt of payment
FinishCheckoutProcess.inc.php
Now thankyou.php uses AJAX to execute FinishCheckoutProcess.inc.php asynchronously, and FinishCheckoutProcess.inc.php contains this call:
ignore_user_abort(true);
The combination of AJAX and ignore_user_abort(true) allows the after-payment process to run without errors even if the user closes the browser window. But since this script has nothing to do with the user or the browser, I just wanted to know whether it is possible to run it in the background like a standalone application, independent of the browser.
Also, my web server is Apache and the OS is Linux (Ubuntu).
My work is getting done, but I just want to know if there is a better/safer way to do it.
Anyway, thanks in advance to everyone; this site has helped me more than any book could have. All you experts out there who donate your time to newbies like me are awesome. Please keep up the good work.
Again, thanks a lot.
Update, based on the suggestions received:
If I use exec() to execute FinishCheckoutProcess.inc.php, will it be able to run database-related commands, and will it be able to run further PHP scripts?
FinishCheckoutProcess.inc.php in turn executes a series of other PHP scripts, which in turn execute yet other PHP scripts, so will using exec() to run FinishCheckoutProcess.inc.php create any difficulties?
FinishCheckoutProcess.inc.php also interacts with the MySQL database; will it still be able to do so when executed via exec()? Right now I pass the necessary MySQLi connection object to the script. Can I pass it the same way when using exec()?
Also, the process is quite heavy, as it generates a set of 4 image files using Imagick and ImageMagick.
It generates a set of 4 image files for every product ordered, so if the quantity of one product is 10, the total number of files generated will be 1x10x4 = 40.
If there are two products, one with quantity 2 and the other with quantity 4, then the total files generated will be 1x2x4 = 8 plus 1x4x4 = 16, for a total of 24.
So this script might need to run for a long time and must not be stopped for timeout reasons; it needs to finish what it started.
Basically, the FinishCheckoutProcess.inc.php logic and process are quite complex, so I just want to confirm whether exec() can handle them.
Also, I am not sure, but some of the scripts make use of $_SESSION variables. If that is a problem I can modify it; $_SESSION is only used in one place, and yes, the session values are set by an earlier PHP script, in the browser session, before FinishCheckoutProcess.inc.php is executed.
I just want to execute FinishCheckoutProcess.inc.php independently of the parent/calling script, i.e. thankyou.php, so that if the user closes the browser, FinishCheckoutProcess.inc.php will not stop or abort because thankyou.php is no longer running.
FYI, you can run PHP scripts from the command line, like php my/script.php.
A safer way to do it would be a master/worker process workflow. The master process runs on the server, checks a queue of work, and spawns worker processes to handle items on the queue as they arrive.
In your scenario, you add work to the queue when the user pays. Once it is added to the queue, you can send thankyou.php back to the user, and they can continue or leave or whatever. Once the work is on the queue, your master process spawns a worker process to handle it (basically doing everything in FinishCheckoutProcess.inc.php).
You can implement this in PHP by running php master.php:
master.php
while (true) {
    // check the queue
    // if a queue item is found:
    //     shell_exec('php worker.php');
}
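A slightly more concrete sketch of the same idea, using a spool directory as the queue (the paths and worker.php are assumptions):

<?php
// master.php: poll a spool directory and hand each job file to a worker
while (true) {
    foreach (glob('/var/spool/myapp/new/*.job') as $jobFile) {
        $claimed = '/var/spool/myapp/work/' . basename($jobFile);
        rename($jobFile, $claimed); // claim the job so it is not picked up twice
        // the trailing & backgrounds the worker so the master does not block
        shell_exec('php worker.php ' . escapeshellarg($claimed) . ' > /dev/null 2>&1 &');
    }
    sleep(1); // avoid busy-waiting when the queue is empty
}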
From what I understand, you are looking for something like what Laravel offers with its illuminate/queue package:
Queues allow you to defer the processing of a time consuming task, such as sending an e-mail, until a later time which drastically speeds up web requests to your application.
This isn't something only Laravel offers, though it does simplify/ease the implementation of such a mechanism.
In the background you have supervisord executing a "worker" PHP script that processes tasks you put in a common place (a DB table, the filesystem, anything); those tasks are usually references to a certain class/method plus some variables to pass to it.
The following links might give you a better understanding:
http://supervisord.org/index.html
https://laravel.com/docs/5.1/queues
https://laravel.com/docs/5.1/queues#supervisor-configuration
There are many ways to implement a queue system, even without supervisord. But I recently implemented this method myself because it guarantees my tasks get processed, even after a server restart (if configured properly).
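As a sketch of what such a worker might look like without Laravel (the tasks table, its columns, and runTask() are assumptions; supervisord's only job is to restart this script if it ever dies):

<?php
// worker.php: pull pending tasks from a table and execute them, forever
$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'secret');
while (true) {
    $task = $pdo->query("SELECT * FROM tasks WHERE done = 0 ORDER BY id LIMIT 1")->fetch();
    if ($task) {
        runTask($task); // hypothetical dispatcher for the class/method the task references
        $pdo->prepare("UPDATE tasks SET done = 1 WHERE id = ?")->execute([$task['id']]);
    } else {
        sleep(2); // queue is empty; wait before polling again
    }
}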
I use XAMPP 1.7.7 on Windows 7 (PHP version 5.3.8).
I use proc_open() to run a process and want to redirect to another web page, but PHP waits until the process is finished.
I don't want the running process to make my page wait for it.
What should I do?
I also need the pipes and the return value.
What I need:
A user submits something on page A, then the site redirects to page B (and the user can leave page B).
At the same time some processes are started; they produce results and update the database, so when the user refreshes page B, the right results are shown.
What's more, the user can view page B at any time.
I noticed chris's comment in the PHP manual; his method can run a process that is independent of PHP. But I don't know how to use pipes with the hidden process or how to get the return value.
And I have no idea about AJAX. I think Gearman might work, but it may be a little complex.
This should be done using a job queue like Gearman so that you can leave a worker running and then interrogate it for its status later from the page you redirect to.
To install Gearman on Windows please see this previous SO question and answers: How to configure or install GEARMAN in windows OS?
PHP is single-threaded by design. There is no way to leave a PHP process running once the HTTP request has finished.
Having said that, you could exploit AJAX to do what you want. Instead of having one HTTP request, fire two requests at the same time. One of them will contain the long process (along with set_time_limit(0)).
There are a lot of different ways to do that. What I usually do is this: when I receive the initial request, I respond immediately with an HTML page that contains an automatic AJAX call to the second PHP file, which contains the long process. So everybody is happy: the user sees an immediate response, and the long process can take its time as nobody is waiting.
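A minimal sketch of that immediate-response page (the file names are assumptions):

<?php
// respond.php: reply to the user at once; the page then fires
// the long process in the background via an automatic AJAX call
?>
<html>
<body>
<p>Your request was received and is being processed.</p>
<script>
// nobody waits on this request; long-process.php would call
// set_time_limit(0) as described above
var xhr = new XMLHttpRequest();
xhr.open('GET', 'long-process.php');
xhr.send();
</script>
</body>
</html>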
Try seeing if your problem is answered by this: http://nsaunders.wordpress.com/2007/01/12/running-a-background-process-in-php/
I have solved this by starting a new PHP process to handle my request.
http://www.php.net/manual/en/function.proc-open.php#90584
I created a script that gets data from some web services and our database, formats a report, then zips it and makes it available for download. When I first started, I made it a command-line script so I could watch the output as it was produced and get around the script timeout you hit when running it in a browser. But because I don't want my users to have to use the command line or run PHP on their computers, I want to make this run from our web server instead.
Because this script could take minutes to run, I need a way to let it process in the background and then start the download once the file has been created successfully. What's the best way to let this script run without triggering the timeout? I've attempted this before (using backticks to run the script separately, and such) but gave up, so I'm asking here. Ideally, the user would click the submit button on the form to start the request, then be returned to the page instead of staring at a blank browser window. When the zip file exists (meaning the process has finished), it should notify them (via AJAX? a reloaded page? I don't know yet).
This is on Windows Server 2007.
You should run it in a different process. Make a daemon that runs continuously, hits a database, and looks for a flag, like "ShouldProcessData". Then when you hit that website, switch the flag to true. Your daemon process will see the flag on its next iteration and begin the processing. Stick the results into the database. Use the database as the communication mechanism between the website and the long-running process.
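A bare-bones sketch of such a daemon (the table, column, and processData() are assumptions):

<?php
// daemon.php: started once (e.g. `php daemon.php`), runs forever
$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'secret');
while (true) {
    $flag = $pdo->query("SELECT ShouldProcessData FROM control LIMIT 1")->fetchColumn();
    if ($flag) {
        $pdo->exec("UPDATE control SET ShouldProcessData = 0"); // reset the flag first
        processData($pdo); // hypothetical: fetch, process, and store the results
    }
    sleep(5); // poll every few seconds
}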
In PHP you have to specify what timeout you want for your process.
See the PHP manual: set_time_limit().
You may have another problem: the browser's own timeout (which could be around 1-2 minutes). While that timeout should be changeable within the browser (for each browser), you can usually prevent it from triggering by sending some data to the browser every 20 seconds or so (like the download header; you can then send other headers, like encoding, etc.).
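For instance, a long task can emit a byte between work chunks so the connection never looks idle (doWorkChunk() is a hypothetical stand-in for one step of your job):

<?php
set_time_limit(0); // no server-side time limit
do {
    $finished = doWorkChunk(); // hypothetical: returns true once the job is done
    echo ' ';    // a single byte of output counts as activity
    @ob_flush(); // flush PHP's output buffer, if one is active
    flush();     // push the byte out to the client immediately
} while (!$finished);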
Gearman is very handy for this (create a background task, let JavaScript poll for progress). It does of course require having Gearman installed and workers created. See: http://www.php.net/gearman
Why not make an Ajax call from the page where you want to offer the download, wait for the Ajax call to return, and call set_time_limit(0) in the script it calls?
For example, there is a very simple PHP script that updates some database tables, but the process takes a long time (maybe 10 minutes). I want this script to continue processing even if the user closes the browser, because sometimes users do not wait and close the browser or go to another web page.
If the task takes 10 minutes, do not use a browser to execute it directly. You have lots of other options:
Use a cronjob to execute the task periodically.
Have the browser request insert a new row into a database table, so that a regular cronjob can process the new row and execute the PHP script with the appropriate arguments (see the sketch below).
Have the browser request write a message to a queue system, which has a subscriber listening for such events (and which then executes the script).
While some of these suggestions are probably overkill for your situation, the key feature they share is to decouple the browser request from the execution of the job, so that it can be completed asynchronously.
If you need the browser window updated with progress, you will need to use a periodically-executed AJAX request to retrieve the job status.
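A sketch of the second option from the list above, where the browser request only inserts a row and a cron-driven script does the work (table and column names are assumptions):

<?php
// enqueue.php: called by the browser; returns immediately
$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'secret');
$pdo->prepare("INSERT INTO jobs (args, status) VALUES (?, 'pending')")
    ->execute([json_encode($_POST)]);
echo 'Your job has been queued.';

<?php
// process_jobs.php: run by cron (e.g. every minute); does the real work
$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'secret');
foreach ($pdo->query("SELECT * FROM jobs WHERE status = 'pending'") as $job) {
    runJob(json_decode($job['args'], true)); // hypothetical job runner
    $pdo->prepare("UPDATE jobs SET status = 'done' WHERE id = ?")
        ->execute([$job['id']]);
}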
To answer your question directly, see ignore_user_abort
More broadly, you probably have an architecture problem here.
If many users can initiate this stuff, you'll want the web application to add jobs to some kind of queue, and have a set number of background processes that chew through all the work.
The PHP script will keep running after the client terminates the connection (not doing so would be a security risk), but only up to max_execution_time (set in php.ini or through a PHP script; generally 30 seconds by default).
For example:
<?php
// write a counter to a file once per second, echoing it as we go
$fh = fopen("bluh.txt", 'w');
for ($i = 0; $i < 20; $i++) {
    echo $i . "<br/>";
    fwrite($fh, $i . "\n");
    sleep(1);
}
fclose($fh);
?>
Start running that in your browser and close the browser before it completes. You'll find that after 20 seconds the file contains all of the values of $i.
Change the upper bound of the for loop to 100 instead of 20, and you'll find it only runs from 0 to 29. Because of PHP's max_execution_time the script times out and dies.
If the script is completely server-based (no feedback to the user), it will run to completion even if the client is closed.
The general architecture of PHP is that a client sends a request to a script, which returns a reply to the user. If nothing is sent back, the script will still execute even if the user is no longer on the other side. Put more simply: there is no persistent connection between server and client for a regular script.
You can make the PHP script run every 20 minutes using a crontab entry, which specifies the schedule and the command to run; in this case the command is the PHP interpreter with your script.
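For example, a crontab entry along these lines (the interpreter and script paths are assumptions) runs the script every 20 minutes:

*/20 * * * * /usr/bin/php /path/to/script.php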
Yes. The server doesn't know whether the user closed the browser; at least, it doesn't notice immediately.
No: the server probably (depending on how it is configured) won't allow a PHP script to run for 10 minutes. On cheap shared hosting I wouldn't rely on a script running longer than a reasonable response time.
A server-side script will go on what it is doing regardless of what the client is doing.
EDIT: By the way, are you sure you want pages that take 10 minutes to open? I suggest you employ a task queue (whose items are executed by cron on a schedule) and redirect the user to an "OK, I am on it" page.