Running a PHP script separately to avoid a timeout for the user - php

I would like to find a way for my user not to have to wait for the output of a PHP script, being redirected to another page while the script keeps running on the server.
Basically the user submits a form which takes quite long to process, and I would like to redirect the user to a page notifying them that the form is being processed and that its output will be available later (I thought about opening a tab when the output is ready).
Basically I would like something like the following, which I tried without success:
if ($form_valid) {
    process_form(); // this must not run in the current request, so the user doesn't have to wait for it to finish (timeout problems)
    header('Location: http://example.com/form_submitted_output_coming_soon.html');
}
I hope that it is not too vague.
Thank you in advance for any help / advice on how I could do that.

It really depends on the time the script takes to execute. If it's seconds, under 10, I would do an AJAX request and show a modal progress message.
If tasks take extended amounts of time, my approach would be to create, or use an existing, task scheduler / report generator:
a single scheduled system task entry calling a central management script (probably not optimal)
you mark a task/report for execution
concurrency: count and limit the number currently executing (so you don't overload the server)
users poll via AJAX for their tasks/reports, or push to the clients with web sockets
Example of how to fork PHP to the background
Update
I think you would get better performance out of a bot continuously checking a database or file for work to do and submitting the results back to the database, alerting users via AJAX, web sockets and/or email when the work they need is done or updated.
Here is a good introduction to building a web crawler in PHP
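As a rough sketch of what such a bot could look like (the jobs table, its columns, and process_job() are all illustrative assumptions, not a definitive implementation):

<?php
// worker.php - run from the command line, e.g. `php worker.php`
$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');

while (true) {
    $stmt = $pdo->query("SELECT id, payload FROM jobs WHERE status = 'pending' ORDER BY id LIMIT 1");
    $job = $stmt->fetch(PDO::FETCH_ASSOC);

    if ($job === false) {
        sleep(5); // nothing to do; wait before checking again
        continue;
    }

    // mark the job as running so a second worker won't pick it up
    $pdo->prepare("UPDATE jobs SET status = 'running' WHERE id = ?")
        ->execute([$job['id']]);

    $result = process_job($job['payload']); // hypothetical long-running work

    $pdo->prepare("UPDATE jobs SET status = 'done', result = ? WHERE id = ?")
        ->execute([$result, $job['id']]);
}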

The best approach for solving this kind of problem is to use AJAX to make the request to the server in the background and then update the user once it has finished processing.

You may submit the form with an asynchronous request (AJAX) and handle the redirect also with JavaScript. This way your form is handled asynchronously; you can even wait for the response to tell the user once you have an answer. This asynchronous request will not block the UI.
Just for completeness, if you really, really want to use PHP only for this:
Run PHP Task Asynchronously

Related

force reload/refresh a second webpage through php

I'm trying to force a (second) page to reload if a criterion is met in PHP.
But if the criterion is met, I want the page to force-reload everywhere, even if 10 people have it open at once, for example.
For simplicity, let's say the code is like this:
in /filelocation/script.php:
if ($data == "ok") {
    // reload/refresh "reload.php" if it's open somewhere
}
I came across a piece of software that basically does this, and I want to understand how it is done.
(It works cross-device somehow, so I assume it's done through PHP somehow.)
Well, in your PHP code, you cannot simply reload/refresh something for all the users connected. This is simply because the PHP code is only executed when your browser requests a page on the server so it's only executed to build the HTML response and then it stops executing. Once the browser has the HTML response it will render the page and then it waits for an action from the user to do something else (such as clicking on a link or posting a form).
I imagine that you would like that when a specific user does something, like posting a comment or buying a product, you would like all the other visitors to be notified that a new comment has been posted or that the number of products available has been reduced.
To do that, you need to implement some JavaScript which is executed in the browser of each visitor. The idea is to keep a connection with the server with the help of web sockets. This way, you can inform the browser that something has changed.
You could google to find some examples of PHP apps using web sockets. The first example I found:
https://www.twilio.com/blog/create-php-websocket-server-build-real-time-even-driven-application
Another solution could be to have some JavaScript doing some polling, meaning that every N seconds it executes an Ajax request to the server to ask if something has changed. This can be done with the help of setInterval(yourFunction, 10000) to call a JavaScript function every 10 seconds (setTimeout works too if the function re-arms itself). This function will do the Ajax request and then update the part of your page that needs to change. Just be careful: if you get a lot of users on your site then you'll produce quite a lot of load on your server. So this wouldn't be a good solution at scale, but it can be an alternative to web sockets.
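To make the polling idea concrete, the server side could be as small as this (status.php, the tasks table and its columns are illustrative assumptions):

<?php
// status.php - answers the periodic Ajax request with the current state
$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');

$stmt = $pdo->prepare('SELECT status, updated_at FROM tasks WHERE id = ?');
$stmt->execute([(int) ($_GET['id'] ?? 0)]);
$task = $stmt->fetch(PDO::FETCH_ASSOC);

header('Content-Type: application/json');
// the JavaScript calling this every N seconds only re-renders the page
// fragment when the returned status has actually changed
echo json_encode($task ?: ['status' => 'unknown']);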

HTML not loading while PHP script running in background

I am currently creating a stock market simulation and am working on the moment that the user logs into the simulation. I have a PHP script that will generate a certain price for a company four times and update it into my MySQL database while running. I currently have the following code:
PHP:
if (isset($_SESSION['userId']))
{
    $isPlaying = 0;
    while ($isPlaying <= 3)
    {
        $priceTemp = (rand(3300, 3700) / 100);
        $sql = "UPDATE pricestemp SET price = $priceTemp WHERE companyName = 'Bawden';";
        mysqli_query($conn, $sql);
        sleep(1);
        $isPlaying++;
    }
    echo '<h1>Welcome to the simulation</h1>';
}
I am aiming for these updates to happen in the background once the user has logged into the simulation. When refreshing my database every second, the updated prices are shown, which is one of my objectives. However, what I would like is for the HTML to still load onto the page (to say "Welcome to the simulation") while the database is updated every second with a new price.
So far, when I log in, I have to wait 4 seconds before the HTML will load. In the future, I hope to have it consistently updating until a certain condition is met, but when I set up an infinite loop earlier, the HTML never loaded.
What do I have to do to allow the HTML to load once logged in and have the prices generated and updated in the MySQL database in the background, with no delay in either of these tasks?
You have a fundamental misunderstanding of how web-based requests work.
What you need to understand is that PHP is a server-side language. PHP generates any combination of HTML, CSS, JavaScript, JSON, or any other forms of data you want and sends it to your web browser when it's finished. While it's doing that, it can also manage data within a database or perform any other number of actions, but it will never send what the web browser can make use of until it finishes setting everything up. So if you're within an infinite loop, it will never finish and therefore nothing will be sent back to the web browser.
To remedy this, you need to use something called "asynchronous JavaScript", more commonly referred to as "ajax". Specifically, you first send some initial HTML to the web browser in one request and let the request end immediately. This allows the user to see something without waiting around for an indefinite period of time. Then, on the web browser end, you can use JavaScript to automatically send a second request to the server. During this second request to the server, you can perform your data processing and send back some data when you're finished to display to the user.
If you want to periodically update what you show the user, then you would repeat that second request to refresh what is shown on the user's webpage.
Any time you see some kind of "real-time" updating on a website, it's typically not coming from a single, persistently open connection to the web server (unless web sockets are involved); it's a series of repeated, broken-up requests that periodically refresh what you see.
Broken down, standard web request workflows look something like this:
1. Web browser asks the web server for the webpage. Web browser waits for a reply.
2. Web server generates the webpage and sends the webpage to the web browser. Web server is done.
3. Web browser receives the webpage and shows it to the user. Web browser stops waiting for a reply.
4. Web browser runs any JavaScript it needs to run and requests data from the web server. Web browser waits for a reply.
5. Web server processes the request and sends the requested data back to the web browser. Web server is done.
6. Web browser receives the requested data and updates the HTML on the webpage so the user can see it. Web browser stops waiting for a reply.
As you can see, each series of requests is 1) initiated by the web browser, 2) processed by the web server, and 3) any replies from the web server are then handled by the web browser after the web server is finished up. So each and every request goes browser -> server -> browser. If we add steps 7., 8., and 9. to the above, we will see them repeat the exact same pattern.
If you want to avoid adding JavaScript into the mix, preferring to refresh the entire page every time, then keep your data processing short. Optimize your database calls, fix your infrastructure (make sure your server and database have a LAN connection, that your hardware is good enough, etc.), make your code more efficient... do whatever you need to do to keep the processing time to a minimum.
This is all incredibly simplified and not 100% accurate, but should hopefully help you with your specific problem. The short version of all of this is: you can't show your HTML and process your data at the same time the way you're doing things now. You need to fundamentally change your workflow.
You have to do this in 2 network calls. The first network call fetches the HTML. Then you use JavaScript to fire another call to update your data. Once that API call returns, it updates the HTML.
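A hedged sketch of that split for the simulation above: the login page echoes the welcome HTML immediately, and JavaScript then calls something like the endpoint below (the file name prices.php and the JSON response are assumptions; $conn is the mysqli connection from the question):

<?php
// prices.php - does the slow updates in a second request, after the
// welcome page has already been shown to the user
session_start();
if (!isset($_SESSION['userId'])) {
    http_response_code(403);
    exit;
}
session_write_close(); // release the session lock for other requests

$stmt = $conn->prepare("UPDATE pricestemp SET price = ? WHERE companyName = 'Bawden'");
$stmt->bind_param('d', $priceTemp); // bound by reference, re-used below

for ($isPlaying = 0; $isPlaying <= 3; $isPlaying++) {
    $priceTemp = rand(3300, 3700) / 100;
    $stmt->execute();
    sleep(1);
}

header('Content-Type: application/json');
echo json_encode(['done' => true]);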
The scheduling model to manage the frequency of a background operation based on the frequency of requests at the front end is a very difficult problem. It's also a problem you don't need to solve. The data doesn't need to be changed when nobody is looking at it. You just need to store when the data was last looked at and apply greater deltas to older data.

PHP script in background without user waiting

I have a webform that sends data to a PHP script.
The PHP script may take a while to process the data. What I want to do is send the raw data to the database, then redirect the visitor to a "thank you" page, and then continue processing the data in the background. The important thing is that the script must continue working even if the visitor closes the "thank you" page.
Can you advise which solution should I look into?
P.S. I use nginx + php-fpm if that matters.
UPDATE. I've found info about using ignore_user_abort(true). Could this be the way to go?
What I want to do is to send raw data to database, then redirect the visitor to "thank you" page and then continue processing the data in background.
That basically describes how I'd do it right there, actually.
Consider two separate applications. One is the web application, which saves the user input to the database and then continues to interact with the user. The other is a scheduled console application (a standalone script invoked by cron most likely) which looks for data in the database to be processed and processes it.
The user uploads the data, receives a "thank you" message, and his/her interaction is complete. The next time the scheduled task runs (every couple minutes, maybe?) it sees the pending data in the database, flags it as being processed (so if another instance of the script runs it doesn't also try to process the same data), processes it, flags it as being done (so it doesn't pick it up again next time), and completes.
You can notify the user of the completed process a couple of different ways. The back-end script can send the user an email (active notification), or perhaps the web application can examine the table for the flagged completed records the next time the user visits the page (passive notification).
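The "flags it as being processed" step is the part worth getting right: doing the flagging in a single UPDATE means two overlapping script instances cannot claim the same row. A minimal sketch, assuming a submissions table with status and worker columns (all names are made up):

<?php
// claim at most one pending row; only the instance whose UPDATE actually
// changed a row gets to process it
$claim = $pdo->prepare(
    "UPDATE submissions SET status = 'processing', worker = ?
     WHERE status = 'pending' ORDER BY id LIMIT 1"
);
$claim->execute([getmypid()]);

if ($claim->rowCount() === 1) {
    $stmt = $pdo->prepare("SELECT * FROM submissions WHERE status = 'processing' AND worker = ?");
    $stmt->execute([getmypid()]);
    $row = $stmt->fetch(PDO::FETCH_ASSOC);
    // ... process $row, then set status = 'done' so it isn't picked up again
}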
Something like this should work:
pclose(popen('php script.php &', 'r'));
http://fr2.php.net/manual/fr/function.popen.php
You can also use more options or other functions to get more control over the execution:
http://fr2.php.net/manual/fr/function.proc-open.php
But use this carefully and be sure you need this way to resolve your problem.
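For instance, a minimal proc_open() sketch that captures the worker's output and exit code (the script name is an illustrative assumption); note that, unlike the popen() one-liner above, this variant waits for the child, which is the price of the extra control:

<?php
$descriptors = [
    1 => ['pipe', 'w'], // child's stdout
    2 => ['pipe', 'w'], // child's stderr
];
$process = proc_open('php script.php', $descriptors, $pipes);
if (is_resource($process)) {
    $output = stream_get_contents($pipes[1]);
    $errors = stream_get_contents($pipes[2]);
    fclose($pipes[1]);
    fclose($pipes[2]);
    $exitCode = proc_close($process); // waits for the child to finish
}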
Ajax would be nice.
You need to do the thing asynchronously. Use AJAX to achieve this.

Terminate connection to jQuery AJAX request, keep processing php on server side?

I have a signup form that calls a PHP script which can interact with our CRM's API, like so:
CRM API <--> PHP script <--> Signup form
1. The signup form passes some information to the PHP script in one AJAX call
2. The PHP script runs a dozen API calls to the CRM to create an account and attach various data
3. The CRM returns the new account id it just created to the PHP script
4. The PHP script passes the account id back to the signup form, at which point the AJAX call is complete and the signup form can continue.
The problem is #2: those dozen calls take about 20 seconds to complete, but the data the signup form needs is generated after the first API call, so in theory the script could return that data much sooner and do the rest of the work server-side without holding the AJAX call open the whole time.
I tried flush() and ob_flush(), which does output the account id to the client before processing is complete, but the jQuery AJAX connection remains open, so I'm still stuck waiting for the connection to be closed on the signup form side before anything happens.
So what's the easiest route for returning that account id to the form as fast as possible?
Maybe break out using curl and exec?
if ($signing_up) {
    // stuff: respond to the client after the first API call,
    // then call this same script again in the background (URL is illustrative)
    exec("curl 'https://example.com/signup.php?notsignup=1' > /dev/null 2>&1 &");
} else {
    // bunch of api calls
}
You should probably think about creating a separate process for the rest of the steps that are needed. One way: once the first API call has completed, respond back to the user right away instead of trying to complete the other 20 seconds of calls while the user waits.
Then create a queue that will finish the rest. You could always create a table in MySQL to store the queue.
Next just create a cronjob that will run in the background, knocking the queue out.
Note: You will not want this cronjob to just start and never stop. Maybe have it run every 5 minutes, but before it starts to run, have it check whether another cron is still in progress (see the lock sketch below); if so, it checks again in another 5 minutes to see if it is OK to run.
Hope this helps!
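One common way to implement that "is another cron still in progress?" check is an exclusive, non-blocking file lock; a minimal sketch, where the lock path and process_queue() are illustrative:

<?php
// cron-worker.php - refuses to start if a previous run is still going
$lock = fopen('/tmp/queue-worker.lock', 'c');
if (!flock($lock, LOCK_EX | LOCK_NB)) {
    exit; // a previous run still holds the lock; the next cron tick retries
}

process_queue(); // hypothetical: drain the MySQL queue table

flock($lock, LOCK_UN);
fclose($lock);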
If you only need the information from the first API call to return to the form, then I would probably try a different workflow:
1. Form calls the PHP script
2. PHP makes the first API call
3. PHP returns to the form
4. Form processes the response
5. Form calls a second PHP script to complete the process
6. PHP finishes the API calls (the form can abandon at this point, since it sounds like you don't care what happens from here on out).
The workflow requires a little more work and co-ordination for the developer, but presents the most responsive interface to the user.
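For completeness, the early-response route the question attempted with flush() can also be made to work by explicitly finishing the response before the remaining calls run. A hedged sketch: $accountId is assumed to come from the first CRM call, run_remaining_crm_calls() is a hypothetical wrapper for the rest, and fastcgi_finish_request() only exists under php-fpm, hence the fallback:

<?php
ignore_user_abort(true); // keep running even if the client disconnects
ob_start();
header('Content-Type: application/json');
echo json_encode(['account_id' => $accountId]); // result of the first API call

if (function_exists('fastcgi_finish_request')) {
    fastcgi_finish_request(); // php-fpm: flush the response and free the client
} else {
    // fallback: tell the client exactly how much to read, then flush
    header('Connection: close');
    header('Content-Length: ' . ob_get_length());
    ob_end_flush();
    flush();
}

run_remaining_crm_calls($accountId); // hypothetical: the other ~11 CRM calls

Once the response is finished, the jQuery success handler should fire immediately while the server keeps working.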

AJAX call to check status of running process in the background, timing out

I've looked all over for an answer on this and haven't been able to find one; hopefully someone can point me in the right direction. I think I'm close.
I have two hosts, let's call them host1.mydomain.com and host2.mydomain.com (to get around the 2-concurrent-connections-per-host/per-browser issue), so they both point to the same content; one is just an alias of the other.
The user goes to host1.mydomain.com, enters some information to register, and clicks Go. This loads an iframe on the same page pointing to a page on host2.mydomain.com, which calls a PHP script via exec("curl"), sending the request to the background to start a website scraper; the process ID is then stored in the database for the user. After the iframe has successfully loaded (it only takes 1 second since it's creating a background process), I have an AJAX request set on an interval to periodically check the status of the cURL process (by its process ID in the database) so that I can display the current step of the scraper (there are 6 steps in total). All good so far.
The problem is that the AJAX requests are timing out after step 4 of the scraper (browser default timeout is 115/120 seconds) even though they shouldn't be, because I'm working with two different hosts... it's almost as if I'm clogging both connections on host1.mydomain.com, which I shouldn't be, since I initiated the scraper from host2.
The iframe loads this URL: http://host2.mydomain.com/page.php
The PHP script calls:
exec("curl -o /dev/null 'http://host2.mydomain.com/page.php?method=process' > /dev/null & echo $!", $op);
Then my AJAX request polls http://host1.mydomain.com/status.php?pid=x, which looks up the status in the database by the process ID,
and once the scraper gets to step 4, my AJAX requests are timing out.
I think I confused myself explaining this, but hopefully someone can help me.
Turns out I was successfully getting around the 2-connections-per-server/browser limitation... however, in doing some research I found that the reason my AJAX request was hanging is that I was trying to access and write to the session data from both of the requests. Digging a little deeper I found session_write_close(), which closes the session for reading/writing. I basically have to call this after each page request of the scraper and then reinitialize the session; this allows my AJAX requests to go through and stops the blocking of the request.
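A minimal sketch of what that looks like in the status endpoint (the session key, the pid parameter and get_scraper_step() are illustrative):

<?php
// status.php - read what we need, then release the session lock so the
// long-running scraper request isn't blocked by (or blocking) this one
session_start();
$userId = $_SESSION['userId'] ?? null; // illustrative session key
session_write_close(); // other requests can now open the session

header('Content-Type: application/json');
echo json_encode(['step' => get_scraper_step((int) ($_GET['pid'] ?? 0))]); // hypothetical helper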
Hopefully someone else finds this useful if you stumble across the same issue
Cheers!
Jeff
Instead of waiting for the request to finish, you should spawn a new process which runs in the background on the server, and use JavaScript to "check back" every few seconds to see whether the execution has finished. Then all you have to do is pick up the result and display it.
Additionally, you might want to make sure that only one PHP process is spawned.
