I have a webform that sends data to a PHP script.
The PHP script may take a while to process the data. What I want to do is send the raw data to the database, then redirect the visitor to a "thank you" page, and then continue processing the data in the background. The important thing is that the script must keep working even if the visitor closes the "thank you" page.
Can you advise which solution I should look into?
P.S. I use nginx + php-fpm if that matters.
UPDATE: I've found info about using ignore_user_abort(true). Could this be the way to go?
What I want to do is send the raw data to the database, then redirect the visitor to a "thank you" page, and then continue processing the data in the background.
That basically describes how I'd do it right there, actually.
Consider two separate applications. One is the web application, which saves the user input to the database and then continues to interact with the user. The other is a scheduled console application (a standalone script invoked by cron most likely) which looks for data in the database to be processed and processes it.
The user uploads the data, receives a "thank you" message, and his/her interaction is complete. The next time the scheduled task runs (every couple minutes, maybe?) it sees the pending data in the database, flags it as being processed (so if another instance of the script runs it doesn't also try to process the same data), processes it, flags it as being done (so it doesn't pick it up again next time), and completes.
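As a rough sketch of what that console script might look like (the submissions table, its status column, and do_the_slow_work() are placeholder names standing in for your real schema and processing):

<?php
// worker.php - invoked by cron, e.g. every couple of minutes:
//   */2 * * * * php /path/to/worker.php

$conn = mysqli_connect('localhost', 'db_user', 'db_pass', 'db_name');

$pending = mysqli_query($conn, "SELECT id, payload FROM submissions WHERE status = 'pending'");

while ($row = mysqli_fetch_assoc($pending)) {
    // Flag the row before processing it. The status check in the WHERE
    // clause makes the claim atomic, so a second instance of this script
    // running concurrently will skip rows we have already taken.
    $claim = mysqli_prepare($conn, "UPDATE submissions SET status = 'processing' WHERE id = ? AND status = 'pending'");
    mysqli_stmt_bind_param($claim, 'i', $row['id']);
    mysqli_stmt_execute($claim);
    if (mysqli_stmt_affected_rows($claim) === 0) {
        continue; // another instance claimed it first
    }

    do_the_slow_work($row['payload']); // your actual processing goes here

    // Flag it as done so it isn't picked up again next time.
    $done = mysqli_prepare($conn, "UPDATE submissions SET status = 'done' WHERE id = ?");
    mysqli_stmt_bind_param($done, 'i', $row['id']);
    mysqli_stmt_execute($done);
}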
You can notify the user of the completed process a couple of different ways. The back-end script can send the user an email (active notification), or perhaps the web application can examine the table for the flagged completed records the next time the user visits the page (passive notification).
Something like this should work:
pclose(popen('php script.php &', 'r'));
http://fr2.php.net/manual/fr/function.popen.php
You can also use more options or other functions to get more control over the execution:
http://fr2.php.net/manual/fr/function.proc-open.php
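For instance, a minimal sketch with proc_open (script.php stands for whatever worker you want to launch; the /dev/null redirection and trailing & are POSIX-specific):

<?php
// Launch script.php in the background without waiting for it.
// The trailing & is what detaches it; the descriptors keep the
// child from inheriting this request's input/output streams.
$descriptors = array(
    0 => array('file', '/dev/null', 'r'), // stdin
    1 => array('file', '/dev/null', 'w'), // stdout
    2 => array('file', '/dev/null', 'w'), // stderr
);
$process = proc_open('php script.php &', $descriptors, $pipes);
if (is_resource($process)) {
    proc_close($process); // returns immediately because of the &
}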
But use this carefully, and make sure this approach is really what you need to solve your problem.
AJAX would be nice.
You need to do this asynchronously. Use AJAX to achieve it.
I am currently creating a stock market simulation and am working on the moment when the user logs into the simulation. I have a PHP script that generates a certain price for a company four times and updates it in my MySQL database while running. I currently have the following code:
PHP:
if (isset($_SESSION['userId']))
{
    $isPlaying = 0;
    while ($isPlaying <= 3)
    {
        $priceTemp = (rand(3300, 3700) / 100);
        $sql = "UPDATE pricestemp SET price = $priceTemp WHERE companyName = 'Bawden';";
        mysqli_query($conn, $sql);
        sleep(1);
        $isPlaying++;
    }
    echo '<h1>Welcome to the simulation</h1>';
}
I am aiming for these updates to happen in the background once the user has logged into the simulation. When I refresh my database every second, the updated prices are shown, which is one of my objectives. However, what I would like it to do is load the HTML onto the page (to say "Welcome to the simulation") while still updating the database every second with a new price.
So far, when I log in, I have to wait 4 seconds before the HTML will load. In the future, I hope to have it consistently updating until a certain condition is met, but when I set up an infinite loop earlier, the HTML never loaded.
What do I have to do to allow the HTML to load once logged in, and have the prices generated and updated in the MySQL database in the background, with no delay in either of these tasks?
You have a fundamental misunderstanding of how web-based requests work.
What you need to understand is that PHP is a server-side language. PHP generates any combination of HTML, CSS, JavaScript, JSON, or any other forms of data you want and sends it to your web browser when it's finished. While it's doing that, it can also manage data within a database or perform any other number of actions, but it will never send what the web browser can make use of until it finishes setting everything up. So if you're within an infinite loop, it will never finish and therefore nothing will be sent back to the web browser.
To remedy this, you need to use something called "asynchronous JavaScript", more commonly referred to as "ajax". Specifically, you first send some initial HTML to the web browser in one request and let the request end immediately. This allows the user to see something without waiting around for an indefinite period of time. Then, on the web browser end, you can use JavaScript to automatically send a second request to the server. During this second request to the server, you can perform your data processing and send back some data when you're finished to display to the user.
If you want to periodically update what you show the user, then you would repeat that second request to refresh what is shown on the user's webpage.
Any time you see some kind of "real-time" updating on a website, it's not coming from a single, persistently open connection to the web server; it's actually a series of repeated, broken-up requests that periodically refresh what you see.
Broken down, standard web request workflows look something like this:
1. Web browser asks the web server for the webpage. Web browser waits for a reply.
2. Web server generates the webpage and sends the webpage to the web browser. Web server is done.
3. Web browser receives the webpage and shows it to the user. Web browser stops waiting for a reply.
4. Web browser runs any JavaScript it needs to run and requests data from the web server. Web browser waits for a reply.
5. Web server processes the request and sends the requested data back to the web browser. Web server is done.
6. Web browser receives the requested data and updates the HTML on the webpage so the user can see it. Web browser stops waiting for a reply.
As you can see, each series of requests is 1) initiated by the web browser, 2) processed by the web server, and 3) any replies from the web server are then handled by the web browser after the web server is finished up. So each and every request goes browser -> server -> browser. If we add steps 7., 8., and 9. to the above, we will see them repeat the exact same pattern.
If you want to avoid adding JavaScript into the mix, preferring to refresh the entire page every time, then keep your data processing short. Optimize your database calls, fix your infrastructure (make sure your server and database have a LAN connection, that your hardware is good enough, etc.), make your code more efficient... do whatever you need to do to keep the processing time to a minimum.
This is all incredibly simplified and not 100% accurate, but should hopefully help you with your specific problem. The short version of all of this is: you can't show your HTML and process your data at the same time the way you're doing things now. You need to fundamentally change your workflow.
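To make that concrete, here is a minimal sketch of the two-request pattern applied to your simulation (the file names simulation.php and price.php are made up for the example).

simulation.php sends the page right away and does no slow work:

<?php
session_start();
if (!isset($_SESSION['userId'])) {
    exit('Not logged in');
}
?>
<h1>Welcome to the simulation</h1>
<p>Current price: <span id="price">loading...</span></p>
<script>
// Poll price.php once a second; each request finishes quickly.
setInterval(function () {
    fetch('price.php')
        .then(function (response) { return response.json(); })
        .then(function (data) {
            document.getElementById('price').textContent = data.price;
        });
}, 1000);
</script>

price.php does one small unit of work per request and returns immediately:

<?php
session_start();
if (!isset($_SESSION['userId'])) {
    exit;
}
session_write_close(); // release the session lock so queued polls don't serialize
$conn = mysqli_connect('localhost', 'db_user', 'db_pass', 'db_name');
$priceTemp = rand(3300, 3700) / 100;
$stmt = mysqli_prepare($conn, "UPDATE pricestemp SET price = ? WHERE companyName = 'Bawden'");
mysqli_stmt_bind_param($stmt, 'd', $priceTemp);
mysqli_stmt_execute($stmt);
header('Content-Type: application/json');
echo json_encode(array('price' => $priceTemp));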
You have to do this in two network calls. The first call should fetch the HTML. Then you use JavaScript to fire another call that updates your data. Once that API call returns, it updates the HTML.
The scheduling model needed to manage the frequency of a background operation based on the frequency of requests at the front end is a very difficult problem. It's also a problem you don't need to solve. The data doesn't need to change when nobody is looking at it. You just need to store when the data was last looked at and apply greater deltas to older data.
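A rough sketch of that lazy approach, assuming a hypothetical last_seen column next to the price: when someone asks for the price, catch it up by one random-walk step per elapsed second instead of looping in the background.

<?php
$conn = mysqli_connect('localhost', 'db_user', 'db_pass', 'db_name');

$res = mysqli_query($conn, "SELECT price, last_seen FROM pricestemp WHERE companyName = 'Bawden'");
$row = mysqli_fetch_assoc($res);

$elapsed = time() - strtotime($row['last_seen']); // seconds since last view
$price   = (float) $row['price'];

// One small random step per elapsed second, capped so that very
// stale data doesn't make this loop run for hours.
for ($i = 0; $i < min($elapsed, 3600); $i++) {
    $price += rand(-5, 5) / 100;
}

$stmt = mysqli_prepare($conn, "UPDATE pricestemp SET price = ?, last_seen = NOW() WHERE companyName = 'Bawden'");
mysqli_stmt_bind_param($stmt, 'd', $price);
mysqli_stmt_execute($stmt);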
I wrote a page (cron.php) that uses the imap library to connect to a mailbox, parse messages, and store them in a database, then echoes the results as JSON. I have a few dozen mailboxes that I need to run this same process for, so I put together a page (mailboxes.php) that lists all these accounts, each with a button that, when clicked, essentially hits cron.php via AJAX and parses the JSON response to update the page when the process is completed.
I've noticed however that if I click each of these boxes, they return as if running serially, not in parallel. Is there a configuration option someplace that might explain this?
Yeah, you need to use session_write_close() in the cron.php file.
Are you using sessions? Every time you run session_start() for a given session, the session is locked until the script finishes or the session is 'detached' with session_write_close().
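In other words, at the top of cron.php, read what you need from the session and then release the lock before the slow IMAP work starts; a minimal sketch:

<?php
// cron.php
session_start();                 // lock is acquired here
$userId = isset($_SESSION['userId']) ? $_SESSION['userId'] : null;
session_write_close();           // lock is released here

// From this point on, other requests that share the same session
// (the other mailbox buttons) are no longer blocked while this
// script spends a long time talking to the IMAP server.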
I would like to find a way to keep my user from having to wait for the output of a PHP script, redirecting them to a page while the script is running on the server.
Basically the user submits a form which takes quite long to process, and I would like to redirect the user to a page notifying him that the form is being processed and that its output will be available later (I thought about opening a tab when the output is ready).
Basically I would like something like this, which I tried without success:
if ($form_valid) {
    process_form(); // this would need to run somewhere other than the current page, so that the user doesn't have to wait for it to finish (timeout problems)
    header('Location: http://example.com/form_submitted_output_coming_soon.html');
}
I hope that it is not too vague.
Thank you in advance for any help / advice on how I could do that.
It really depends on how long the script takes to execute. If it's seconds (under 10), I would do an AJAX request and show a modal progress message.
If it takes extended amounts of time, my approach would be to create, or use an existing, task scheduler / report generator:
A single system scheduled task entry calling a central management script (probably not optimal)
You mark a task/report for execution
Concurrency: count and limit the number currently executing (so you don't overload the server)
Users poll via AJAX for their tasks/reports, or push to the clients with WebSockets
An example of how to fork PHP to the background
Update
I think you would get better performance out of a bot that continuously checks a database or file for work to do and submits the results back to the database, alerting users via AJAX, WebSockets, and/or email when the work they need is done or updated.
Here is a good introduction to building a web crawler in PHP.
The best approach for solving this kind of problem is to use AJAX to make the request to the server in the background and then update the user once it has finished processing.
You may submit the form with an asynchronous request (AJAX) and handle the page forward with JavaScript as well. This way your form is handled asynchronously, and you can even wait for the response to tell the user once you have an answer. The asynchronous request will not block the UI.
Just for completeness: if you really, really want to use PHP only for this:
Run PHP Task Asynchronously
On my site, I have a feature that allows you to follow the updates of other users. When a user makes a change to their info, anyone following him or her will be emailed about the change.
I have it set up so that the php changes the info in the database, then searches through the users contacts to see who is following him/her, and sends emails to those who are following notifying them of the change.
The problem I have run into since adding this notification feature is that instead of the page loading (the form posts to itself) and showing the change almost instantly, it can take a few minutes for the page to load and show the update (depending on how many people are following a particular user), because the PHP sends all the emails before the page reloads.
How can I set it up so that the script that sends the emails runs somewhere in the background, the user does not have to wait for the emails to send before the page reloads, and the emails still go out even if the user exits the website while the script is running?
P.S. All my programming and development skills are self-taught, so I don't know a lot of terminology; you may have to dumb down your responses so that I will understand what you are talking about. Sorry for the inconvenience, and thank you very much for any help.
I'm surprised it takes several minutes just to send some emails, but anyway you can do this in at least four ways:
Use AJAX to send the form data to a separate script that sends the emails, while the form posts normally
Have the form script fork (pcntl required)
Make an asynchronous request to your own page via php (either set a low timeout with cURL, or open a socket)
Use exec('script-that-sends-emails args >> some-other-file 2>&1 &');
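A sketch of the last option, with placeholder paths (send-emails.php would be a separate CLI script that looks up the followers and sends the notifications):

<?php
// After saving the profile change, hand the slow part to the shell.
// The trailing & backgrounds the command and the redirection detaches
// its output, so exec() returns immediately instead of waiting.
$userId = (int) $_SESSION['userId'];
exec('php /path/to/send-emails.php ' . $userId . ' >> /tmp/send-emails.log 2>&1 &');

// The page continues loading here while the emails go out, and they
// keep sending even if the user leaves the site.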
I have a page where users enter their email address, click "Send", and the webserver will email them a ~10mb attachment. Right now, I have the page just display "Sending..." and the user waits about 20 seconds on this page.
Instead, I want to have it say "Your attachment will be emailed in a few minutes," sending that process somewhere else and allowing the user to continue browsing without having to open up a new tab.
I really just need someone to point me in the right direction because I don't even know what to Google at this point.
You could call another PHP file that will process the email sending; make sure to put in this call:
ignore_user_abort(true);
What this does is allow the PHP process to finish even though the browser has gone away. So you could initiate the process via AJAX and then go to another page saying your attachment has been sent.
For more info:
http://www.php.net/manual/en/function.ignore-user-abort.php
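A sketch of what that separate file could look like (send-attachment.php and the mail function are placeholder names):

<?php
// send-attachment.php - requested via AJAX.
ignore_user_abort(true); // keep running even if the user closes the page
set_time_limit(0);       // lift the default execution time limit

// Under PHP-FPM, this ends the HTTP response right away while the
// script keeps running, so the browser isn't stuck for 20 seconds.
if (function_exists('fastcgi_finish_request')) {
    fastcgi_finish_request();
}

// The slow part happens after the response has already been sent.
send_email_with_attachment($_POST['email']); // placeholder for your mail code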
I recommend checking out this question I posted a while ago.
How can I run a PHP script in the background after a form is submitted?
The answer explains how you can run a PHP script in the background after a form is submitted. This, of course, is just one way to accomplish this. Another would be to build a list of addresses and set up a cron to run a script on a timed interval. Both can accomplish the same thing, it just depends on how you wish to tackle the issue and what you can do on your server.