proc_open(): how do I run a process without making PHP wait for it to finish? - php

I use XAMPP 1.7.7 on Windows 7 (PHP version 5.3.8).
I use proc_open() to run a process and then want to redirect to another web page, but PHP waits until the process has finished.
I don't want the running process to make my page wait for it.
What should I do?
I also need the pipes and the return value.
What I need:
A user submits something on page A, then the site redirects to page B (and the user can leave page B).
At the same time some processes are started; they produce results and update the database, so when the user refreshes page B the correct results are shown.
What's more, the user can view page B at any time.
I noticed chris's comment in the PHP Manual; his method can run a process that is independent of PHP. But I don't know how to use pipes with that hidden process or get its return value.
I have no experience with AJAX. I think Gearman might work, but it seems a little complex.

This should be done using a job queue like Gearman so that you can leave a worker running and then interrogate it for its status later from the page you redirect to.
To install Gearman on Windows please see this previous SO question and answers: How to configure or install GEARMAN in windows OS?
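A minimal sketch of that split, assuming the PECL gearman extension is installed and gearmand is running on its default port (the function name process_submission and the $payload variable are placeholders):

<?php
// Client side (runs during the web request): queue the job and keep its handle.
$client = new GearmanClient();
$client->addServer('127.0.0.1', 4730);
$handle = $client->doBackground('process_submission', json_encode($payload));
// Persist $handle (session, database) so page B can poll the job later:
// list($known, $running, $num, $denom) = $client->jobStatus($handle);

<?php
// Worker side (a long-running CLI process you start separately).
$worker = new GearmanWorker();
$worker->addServer('127.0.0.1', 4730);
$worker->addFunction('process_submission', function (GearmanJob $job) {
    $data = json_decode($job->workload(), true);
    // ...produce the results and update the database here...
});
while ($worker->work()); // block, handling jobs as they arrive

Page B then only has to read the database (or call jobStatus()) to show whatever has been produced so far.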

PHP is single threaded by design. There is no way to leave a php process running when the HTTP request has finished.
Having said that, you could exploit AJAX to do what you want. Instead of having one HTTP request, fire two requests at the same time. One of them will contain the long process (along with set_time_limit(0)).
There are a lot of different ways to do that. What I usually do is this: when I receive the initial request, I respond immediately with an HTML page that contains an automatic AJAX call to the second PHP file, which contains the long process. So everybody is happy: the user sees an immediate response, and the long process can take its time as nobody is waiting.
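As an untested sketch of that pattern (both file names are placeholders):

<?php // respond.php -- replies instantly, then the browser kicks off the slow part ?>
<html><body>
<p>Your request was received; results will appear shortly.</p>
<script>
// Fire-and-forget AJAX call to the script that does the heavy lifting.
var xhr = new XMLHttpRequest();
xhr.open('GET', 'long_process.php');
xhr.send();
</script>
</body></html>

<?php
// long_process.php -- the long process itself
ignore_user_abort(true); // keep running even if the AJAX call is dropped
set_time_limit(0);       // lift the time limit, as suggested above
// ...do the slow work and store the results...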

Try seeing if your problem is answered by this: http://nsaunders.wordpress.com/2007/01/12/running-a-background-process-in-php/

I have solved this by starting a new PHP process to handle my request.
http://www.php.net/manual/en/function.proc-open.php#90584
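For reference, a common pattern along the lines of that manual comment is to launch a fully detached child; on Windows, "start /B" makes popen() return immediately (worker.php is a placeholder):

<?php
pclose(popen('start /B php worker.php', 'r'));
// On Linux the equivalent fire-and-forget is:
// exec('php worker.php > /dev/null 2>&1 &');

The catch, as noted in the question, is that a fully detached child can't hand pipes or an exit code back to the request that spawned it; the usual workaround is to have the child write its results and status to the database, which is exactly what the queue approach above formalizes.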

Related

How to run PHP code without waiting for the response?

I have a website (Site A) where visitors arrive every day. Site A only tracks the user and then redirects them to Site B when it's done.
I would like a PHP-based solution that allows me to start a specific task on the server and redirect the visitor right after it has started. Basically, I don't want my users to wait while my PHP scripts (which take around 3-4 seconds to complete) finish their work.
It is essential to redirect the user as soon as possible. I have thought about cron jobs, but they're not a good fit because they only run at specific times, as far as I know. cURL isn't good either, because it waits for the server to finish loading (as far as I know).
Any solutions for this issue?
You could probably use a cURL call. There is a function in PHP called ignore_user_abort(). Combine that with set_time_limit() and you can create a PHP script that continues running even if the cURL request is cancelled.
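A rough sketch of that combination (URLs and file names are placeholders):

<?php
// Site A: fire the tracking script, abandon the request almost immediately,
// then redirect the visitor.
$ch = curl_init('http://site-a.example/track.php');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_NOSIGNAL, 1);     // allows sub-second timeouts
curl_setopt($ch, CURLOPT_TIMEOUT_MS, 100); // give up waiting after 100 ms
curl_exec($ch);
curl_close($ch);
header('Location: http://site-b.example/');

<?php
// track.php: survives the aborted cURL call and finishes the 3-4 s of work.
ignore_user_abort(true);
set_time_limit(0);
// ...tracking logic here...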

Run a PHP script like an application, without a browser

Hi, I am new to PHP and have no idea if what I am about to ask is even possible or even makes sense, but here goes.
I want to execute a PHP script as if I were executing a standalone application on the web server. What I am trying to implement: when a customer purchases something on the website and sees the payment confirmation notice, they should be able to close the browser window or log off without affecting the big order-generation process that starts once they reach the page confirming that their payment was successful.
Right now I am using AJAX to call my after-payment processing PHP script, and I have set that script to ignore any user abort.
This is the page that tells the user that the payment was received successfully.
thankyou.php
This is the page that performs the processing that needs to be done only after successful receipt of payment
FinishCheckoutProcess.inc.php
Now thankyou.php uses AJAX to execute FinishCheckoutProcess.inc.php asynchronously, and FinishCheckoutProcess.inc.php starts with this call (a runtime function, not a php.ini setting):
ignore_user_abort(true);
The combination of AJAX and ignore_user_abort(true) allows the after-payment process to run without errors even if the user closes the browser window. But since this script has nothing to do with the user or the browser, I just wanted to know whether it is possible to run it in the background like a standalone application, independent of the browser.
Also, my web server is Apache and the OS is Linux (Ubuntu).
My work is getting done but I just want to know if there is a better/safer way to do it.
Anyway, thanks in advance to everyone; this site has helped me more than any book could have. So all you experts out there who donate your time to newbies like me, you guys are awesome. Please keep up the good work.
Again thanks a lot.
Based on suggestions received
If I use exec() to run FinishCheckoutProcess.inc.php, will it be able to execute database-related commands and run further PHP scripts?
FinishCheckoutProcess.inc.php in turn executes a series of other PHP scripts, which in turn execute others, so will running it via exec() create any difficulties?
FinishCheckoutProcess.inc.php also interacts with the MySQL database. Can it still do that when executed via exec()? I am currently passing the necessary MySQLi connection object to this script; can I pass it the same way using exec()?
Also, the process is quite heavy, as it generates a set of 4 image files using Imagick and ImageMagick.
It generates a set of 4 image files for every product ordered, so if the quantity of 1 product is 10 then the total files generated will be 1x10x4 = 40
If there are two products, one with quantity 2 and the other with quantity 4, then the total files generated will be:
1x2x4 = 8
1x4x4 = 16
8 + 16 = 24
So this script might need to run for a long time and cannot be allowed to stop due to a timeout; it needs to finish what it started.
Basically, the FinishCheckoutProcess.inc.php logic and process are quite complex, so I just want to confirm whether exec() can handle it or not.
Also, I am not sure, but some of the scripts make use of $_SESSION variables; if that is a problem I can modify it. $_SESSION variables are used in only one place, and yes, the $_SESSION data gets set in the browser session before FinishCheckoutProcess.inc.php is executed, by a previous PHP script.
I just want to execute the FinishCheckoutProcess.inc.php script independently of the parent/calling script, i.e. thankyou.php, so that if the user closes the browser, FinishCheckoutProcess.inc.php will not stop or abort because the parent/calling script is no longer running.
FYI, you can run PHP scripts from the command line, like this: php my/script.php.
A safer way to do it would be to have a master/worker process workflow. The master process runs on the server, checks a queue of work, and spawns worker processes to handle items on the queue as they arrive.
In your scenario you add stuff to the queue when the user pays. Once it is added to the queue you can send back thankyou.php to the user, and they can continue or leave or whatever. Once the work is on the queue, your master process spawns a worker process that handles it (basically doing everything in FinishCheckoutProcess.inc.php).
You can implement this in php with: php master.php
master.php
while (true) {
    // check the queue
    // if a queue item is found, hand it off:
    //     shell_exec('php worker.php');
    sleep(1); // avoid busy-waiting between polls
}
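A slightly fleshed-out, untested version of that loop, assuming a hypothetical jobs table and placeholder credentials:

<?php
// master.php -- poll a jobs table and hand pending work to worker processes
$db = new PDO('mysql:host=localhost;dbname=shop', 'user', 'pass');
while (true) {
    $stmt = $db->query("SELECT id FROM jobs WHERE status = 'pending' LIMIT 1");
    $job = $stmt->fetch(PDO::FETCH_ASSOC);
    if ($job) {
        $db->exec("UPDATE jobs SET status = 'running' WHERE id = " . (int)$job['id']);
        // Append ' > /dev/null 2>&1 &' if workers should run in parallel.
        shell_exec('php worker.php ' . (int)$job['id']);
    } else {
        sleep(1); // queue empty; wait before polling again
    }
}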
From what I understand, you are looking for something like what Laravel offers with its illuminate/queue package:
Queues allow you to defer the processing of a time consuming task, such as sending an e-mail, until a later time which drastically speeds up web requests to your application.
This isn't something only Laravel offers, though it does simplify the implementation of such a mechanism.
In the background you have supervisord executing a "worker" PHP script that executes tasks you put in a common place (a DB table, the filesystem, anything); those tasks are usually references to a certain class/method along with some variables to pass to it.
The following links might give you a better understanding:
http://supervisord.org/index.html
https://laravel.com/docs/5.1/queues
https://laravel.com/docs/5.1/queues#supervisor-configuration
There are many ways you could implement a queue system, also without the use of supervisord. But I recently implemented this method myself because it guarantees my tasks are processed, even after a server restart (if configured properly).
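For illustration, a minimal supervisord program entry in the spirit of the Laravel docs linked above (paths, names, and user are assumptions for your setup):

[program:queue-worker]
command=php /var/www/app/artisan queue:work --sleep=3 --tries=3
autostart=true
autorestart=true
user=www-data
numprocs=1

supervisord then keeps that worker script alive, restarting it if it dies or the server reboots.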

How do I avoid this PHP Script causing a server standstill?

I'm currently running a Linux based VPS, with 768MB of Ram.
I have an application which collects details of domains and then connects to a service via cURL to retrieve details of the pagerank of these domains.
When I run a check on about 50 domains, it takes the remote page about 3 minutes to load with all the results before the script can parse the details and return them to my script. This causes a problem, as nothing else seems to function until the script has finished executing, so users on the site just get a spinner / 'ball of death' while waiting for pages to load.
(The remote page retrieves the domain details and updates itself via AJAX, but the cURL request (rightly) doesn't return the page until loading is complete.)
Can anyone tell me if I'm doing anything obviously wrong, or if there is a better way of doing it? (There can be anything between 10 and 10,000 domains queued, so I need a process that can run in the background without affecting the rest of the site.)
Thanks
A more sensible approach would be to "batch process" the domain data via a cron-triggered PHP CLI script.
As such, once you'd inserted the relevant domains into a database table with a "processed" flag set as false, the background script would then:
Scan the database for domains that aren't marked as processed.
Carry out the CURL lookup, etc.
Update the database record accordingly and mark it as processed.
...
To ensure no overlap with an already-executing batch processing script, you should only invoke the PHP script from cron every five minutes and (within the PHP script itself) check at the start of the "scan" stage how long the script has been running, exiting if it's been running for four minutes or longer. (You might want to adjust these figures, but hopefully you can see where I'm going with this.)
By using this approach, you'll be able to leave the background script running indefinitely (as it's invoked via cron, it'll automatically start after reboots, etc.) and simply add domains to the database/review the results of processing, etc. via a separate web front end.
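A rough, untested sketch of such a batch script (table and column names are assumptions):

<?php
// pagerank_batch.php -- invoked from cron every five minutes
$started = time();
$db = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');
$update = $db->prepare("UPDATE domains SET processed = 1, pagerank = ? WHERE id = ?");
while (time() - $started < 240) { // stop after four minutes, as described above
    $row = $db->query("SELECT id, name FROM domains WHERE processed = 0 LIMIT 1")
              ->fetch(PDO::FETCH_ASSOC);
    if (!$row) {
        break; // nothing left to process
    }
    $rank = 0; // ...carry out the cURL lookup for $row['name'] here...
    $update->execute(array($rank, $row['id']));
}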
This isn't the ideal solution, but if you need to trigger this process based on a user request, you can add the following at the end of your script.
ignore_user_abort(true); // don't stop when the user disconnects
set_time_limit(0);       // lift the execution time limit
flush();                 // push any buffered output to the client
This will allow the PHP script to continue running after output has been returned to the user. But seriously, you should use batch processing; it will give you much more control over what's going on.
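For completeness, the fuller version of that trick, which actually lets the HTTP response finish while the script keeps working, looks roughly like this under Apache/mod_php (a sketch; behaviour varies with server configuration):

<?php
ignore_user_abort(true); // keep going once the client has its response
set_time_limit(0);
ob_start();
echo 'Your domains have been queued.';
header('Connection: close');
header('Content-Length: ' . ob_get_length());
ob_end_flush();
flush(); // the browser now has a complete response
// ...the long cURL processing continues here...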
Firstly, I'm sorry, but I'm an idiot! :)
I've loaded the site in another browser (FF) and it loads fine.
It seems Chrome puts some sort of lock on a domain when it's waiting for a server response, and I was testing the script manually through a browser.
Thanks for all your help and sorry for wasting your time.
CJ
While I agree with others that you should consider processing these tasks outside of your webserver, in a more controlled manner, I'll offer an explanation for the "server standstill".
If you're using native php sessions, php uses an exclusive locking scheme so only a single php process can deal with a given session id at a time. Having a long running php script which uses sessions can certainly cause this.
You can search for combinations of terms like:
php session concurrency lock session_write_close()
I'm sure it's been discussed many times here. I'm too lazy to search for you. Maybe someone else will come along and write an answer with bulleted lists and pretty hyperlinks in exchange for Stack Overflow reputation :) But not me :)
good luck.
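For what it's worth, the gist of the workaround those search terms point at (the session key is hypothetical):

<?php
session_start();
$userId = $_SESSION['user_id']; // read what you need up front
// Release the exclusive session lock so the user's other requests
// aren't serialized behind this long-running one.
session_write_close();
// ...long-running work goes here...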
I'm not sure how your code is structured but you could try using sleep(). That's what I use when batch processing.

Need to run a long php script from a browser

I created a script that gets data from some web services and our database, formats a report, then zips it and makes it available for download. When I first started, I made it a command-line script so I could watch the output as it was produced and get around the script timeout you hit when running it in a browser. But because I don't want my users to have to use the command line or run PHP on their own computers, I want it to run from our web server instead.
Because this script can take minutes to run, I need a way to let it process in the background and then start the download once the file has been created successfully. What's the best way to let this script run without triggering the timeout? I've attempted this before (using backticks to run the script separately, and such) but gave up, so I'm asking here. Ideally, the user would click the submit button on the form to start the request and then be returned to the page, instead of staring at a blank browser window. When the zip file exists (meaning the process has finished), it should notify them (via AJAX? a reloaded page? I don't know yet).
This is on Windows Server 2007.
You should run it in a different process. Make a daemon that runs continuously, hits a database, and looks for a flag, like "ShouldProcessData". Then, when you hit that page on the website, switch the flag to true. Your daemon process will see the flag on its next iteration and begin the processing. Stick the results into the database. Use the database as the communication mechanism between the website and the long-running process.
In PHP you have to specify what timeout you want for your process.
See PHP manual set_time_limit()
You may have another problem: the timeout of the browser itself (around 1-2 minutes). While that timeout may be changeable within the browser (for each browser), you can usually prevent the client-side timeout from being triggered by sending some data to the browser every 20 seconds or so (such as the download header; you can then send other headers, like encoding, etc.).
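A crude sketch of that keep-alive idea (the step count and timing are arbitrary):

<?php
set_time_limit(0);
for ($step = 0; $step < 10; $step++) { // pretend the job has 10 slow steps
    // ...do roughly 20 seconds' worth of real work here...
    echo ' '; // trickle a byte so the browser doesn't give up
    flush();
}
echo 'Done. Your download is ready.';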
Gearman is very handy for it (create a background task, let javascript poll for progress). It does of course require having gearman installed & workers created. See: http://www.php.net/gearman
Why don't you make an AJAX call from the page where you want to offer the download, then just wait for the AJAX call to return, and also call set_time_limit(0) on the other page?

Is process forking in PHP / Apache a good idea?

I'm writing a simple application in PHP which needs to occasionally carry out a fairly intensive set of MySQL updates. I don't particularly want this to cause a delay for the user, so I'm wondering about using pcntl_fork().
I'm not sure how this really works though: will the child process continue running after the parent process finishes? Will the parent process end, and the user's page load fully complete before the child process completes?
In other words, is this a safe way to have a PHP script (running under Apache) do some time-consuming updates without delaying the user, or should I just ask my users to put up with some delay?
The parent process will end, the user's page will load fully, the child process will continue, and the user will have no feedback as to whether or not the child process finished successfully.
Someone out there can probably tell you in detail what happens when you call that under Apache, but the chances are you will get answers that aren't always true, depending on what versions and combinations of Apache and PHP you are using. You should use AJAX and have two requests: respond once with a page that says what you are doing, and then poll a second request for the status; that second request is where you actually do the work.
If PHP runs under Apache as the mod_php module, forking will not work at all; you'll get an error saying that the function pcntl_fork() is undefined. In that case a good solution is to use exec() instead, to run a separate PHP job from the command line.
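For example (the script path is a placeholder; redirecting output and appending "&" lets exec() return immediately instead of waiting for the job):

<?php
exec('php /path/to/heavy_updates.php > /dev/null 2>&1 &');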
I think it is a bad idea. I have done something similar, and Apache redirected the output of the parent to its child; that is, your browser shows the output from one of the child processes.
Click this for more information.
Hope it helps.
