Terminate connection to jQuery AJAX request, keep processing PHP on server side?

I have a signup form that calls a PHP script which can interact with our CRM's API, like so:
CRM API <--> PHP script <--> Signup form
1. The signup form passes some information to the PHP script in one AJAX call
2. The PHP script runs a dozen API calls to the CRM to create an account and attach various data
3. The CRM returns the new account id it just created to the PHP script
4. The PHP script passes the account id back to the signup form, at which point the AJAX call is complete and the signup form can continue.
The problem is step 2: those dozen calls take about 20 seconds to complete, but the data the signup form needs is generated after the first API call, so in theory the script could return that data much sooner and do the rest of the work server side without holding the AJAX call open the whole time.
I tried flush() and ob_flush(), which does output the account id to the client before processing is complete, but the jQuery AJAX connection remains open, so I'm still stuck waiting for the connection to close before anything happens on the signup form side.
So what's the easiest route for returning that account id to the form as fast as possible?
Maybe break out using curl and exec? Roughly (the function names here are placeholders):
if (isset($_POST['signup'])) {
    $accountId = do_first_api_call($_POST); // the one call whose result the form needs
    // fire a background request back at this same script to finish the rest, then return immediately
    exec('curl -s "https://example.com/signup.php?notsignup=1&account=' . urlencode($accountId) . '" > /dev/null 2>&1 &');
    echo $accountId;
} else {
    run_remaining_api_calls($_GET['account']); // the other dozen CRM calls
}

You should probably think about creating a separate process for the rest of the steps that are needed. One way: once the first API call has completed, respond back to the user right away instead of trying to complete the rest of the calls while the user waits.
Then create a queue that will finish the rest. You could always create a table in mysql to store the queue.
Next just create a cronjob that will run in the background, knocking the queue out.
Note: you will not want this cronjob to just start and never stop. Maybe have it run every 5 minutes, but before it starts, check whether another run is still in progress. If it is, skip this run and check again in another 5 minutes.
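As a rough sketch of that idea (the table, columns, and database credentials below are made up for illustration), the cron-driven worker might look like this, using a lock file so overlapping runs skip each other:
// worker.php -- run from cron, e.g. */5 * * * * php /path/to/worker.php
$lock = fopen('/tmp/crm_queue.lock', 'c');
if (!flock($lock, LOCK_EX | LOCK_NB)) {
    exit; // a previous run is still in progress; try again on the next cron tick
}
$db = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');
foreach ($db->query("SELECT id, account_id FROM crm_queue WHERE status = 'pending'") as $row) {
    $db->prepare("UPDATE crm_queue SET status = 'processing' WHERE id = ?")->execute(array($row['id']));
    // ... run the remaining dozen CRM API calls for this signup here ...
    $db->prepare("UPDATE crm_queue SET status = 'done' WHERE id = ?")->execute(array($row['id']));
}
flock($lock, LOCK_UN);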
Hope this helps!

If you only need the information from the first API call to return to the form, then I would probably try a different workflow:
1. Form calls the PHP script
2. PHP makes the first API call
3. PHP returns to the form
4. Form processes the response
5. Form calls a second PHP script to complete the process
6. PHP finishes the API calls (the form can abandon at this point since it sounds like you don't care what happens from here on out)
This workflow requires a little more work and coordination from the developer, but presents the most responsive interface to the user.
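A minimal sketch of the two server-side endpoints this workflow implies (the file names and helper functions are placeholders, not anything from the original code):
// signup_step1.php -- handles the form's first AJAX call; returns only what the form needs
$accountId = create_crm_account($_POST); // hypothetical: the single API call that yields the account id
echo json_encode(array('account_id' => $accountId));

// signup_step2.php -- handles the form's second AJAX call; the form can fire this and move on
ignore_user_abort(true); // keep running even if the browser gives up on the request
attach_crm_account_data($_POST['account_id']); // hypothetical: the remaining dozen CRM calls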

Related

PHP script in background without user waiting

I have a webform that sends data to a PHP script.
The PHP script may take a while to process the data. What I want to do is send the raw data to the database, then redirect the visitor to a "thank you" page, and then continue processing the data in the background. The important thing is that the script must continue working even if the visitor closes the "thank you" page.
Can you advise which solution I should look into?
P.S. I use nginx + php-fpm if that matters.
UPDATE. I've found info about using ignore_user_abort(true). Could this be the way to go?
What I want to do is to send raw data to database, then redirect the visitor to "thank you" page and then continue processing the data in background.
That basically describes how I'd do it right there, actually.
Consider two separate applications. One is the web application, which saves the user input to the database and then continues to interact with the user. The other is a scheduled console application (a standalone script invoked by cron most likely) which looks for data in the database to be processed and processes it.
The user uploads the data, receives a "thank you" message, and his/her interaction is complete. The next time the scheduled task runs (every couple minutes, maybe?) it sees the pending data in the database, flags it as being processed (so if another instance of the script runs it doesn't also try to process the same data), processes it, flags it as being done (so it doesn't pick it up again next time), and completes.
You can notify the user of the completed process a couple of different ways. The back-end script can send the user an email (active notification), or perhaps the web application can examine the table for the flagged completed records the next time the user visits the page (passive notification).
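A hedged sketch of the web-application half of that split (the table, columns, and credentials are illustrative assumptions):
// upload.php -- saves the raw data as 'pending' and thanks the user immediately
$db = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');
$db->prepare("INSERT INTO jobs (payload, status) VALUES (?, 'pending')")->execute(array(json_encode($_POST)));
header('Location: /thank-you.php');
// A cron-driven console script later selects the 'pending' rows, flags them as
// 'processing', does the heavy work, and finally flags them 'done'.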
Something like this should work:
pclose(popen('php script.php &', 'r'));
http://fr2.php.net/manual/fr/function.popen.php
You can also use more options or other functions to get more control over the execution:
http://fr2.php.net/manual/fr/function.proc-open.php
But use this carefully, and be sure this approach is really what your problem needs.
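For context, here is a rough sketch of the proc_open route; note that, unlike the popen('... &', 'r') one-liner above, this version waits for the child script, so it buys you control (capturing output, checking the exit code) rather than backgrounding:
$descriptors = array(
    0 => array('pipe', 'r'), // child's stdin
    1 => array('pipe', 'w'), // child's stdout
    2 => array('pipe', 'w'), // child's stderr
);
$process = proc_open('php script.php', $descriptors, $pipes);
if (is_resource($process)) {
    fclose($pipes[0]);                        // nothing to send on stdin
    $output = stream_get_contents($pipes[1]); // collect what the script printed
    fclose($pipes[1]);
    fclose($pipes[2]);
    $exitCode = proc_close($process);         // blocks until the script finishes
}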
Ajax would be nice.
You need to do this asynchronously. Use AJAX to achieve it.

call PHP script after form submit

I have two separate processes and I am wondering how I can combine them.
I have a PHP email sign-up form which posts the user's email address and other data into an SQL DB.
I have a sync PHP script API which once run adds all info from the SQL DB to a 3rd party site (mailchimp)
How can I combine them so that once a new user adds their details and the form is submitted, the PHP API script runs? I did this, however it runs the script and the user has to wait until the API call is done:
$appUrl = $_SERVER['HTTP_HOST'];
$path = 'newsletter/mailchimp.php'; // your path here
$appUrl = 'http://' . $appUrl . '/' . $path;
// file_get_contents() blocks until mailchimp.php finishes, so the user waits for the whole sync
if (count($_POST) > 0) echo file_get_contents($appUrl); ?>
Find an HTML element which you can bind your function to, like your form, a button, a checkbox or whatever. There are many different ways, like $("form").submit(function(){});, .click, .change. After that you can use the $.post() function to call the PHP files.
$(document).ready(function(){
    $("#buttonId").click(function(){
        $.post('phpfile1.php');
        $.post('phpfile2.php');
    });
});
If you're getting the content from a remote site, then you can't send it to the client before you've got it.
But in your example you are calling a URL on the local machine - if you include the file instead of calling it in a new HTTP request then you'll get a small improvement in speed.
If you don't need the content from the other URL, then call it from a shutdown function, after your script generates the content, flushes its buffers, and exits.
Calling the second URL via JavaScript is not a good idea if the transaction spans both codesets - you're putting the user in control of the flow in your application.
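A rough sketch of that shutdown-function idea (this assumes PHP-FPM, where fastcgi_finish_request() exists; under other SAPIs the client may still end up waiting for the shutdown function to finish):
register_shutdown_function(function () {
    // runs after the response has been handed off below
    file_get_contents('http://example.com/newsletter/mailchimp.php'); // the slow call
});
echo 'Signup received'; // generate the page content
if (function_exists('fastcgi_finish_request')) {
    fastcgi_finish_request(); // flush the buffers and close the client connection now
}
// the script exits here, and the shutdown function then runs without the user waiting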

I want to run a php script in background

I'm trying to make a messenger-like service, and I'm facing a problem. I'm supposed to reload the page manually, but the problem with running it automatically is that the whole page gets reloaded every second. I used JavaScript to reload the page every 1 second, inside which I call a PHP script, but the problem with that code is that the complete page gets refreshed. Is there a way to reload only a particular part of the whole page, and also run the PHP script in the background?
Use Ajax. Basic w3schools tutorial
Ajax is a group of interrelated web development techniques used on the client-side to create asynchronous web applications. With Ajax, web applications can send data to, and retrieve data from, a server asynchronously (in the background) without interfering with the display and behavior of the existing page. Data can be retrieved using the XMLHttpRequest object. Despite the name, the use of XML is not required (JSON is often used instead), and the requests do not need to be asynchronous.
[Wikipedia]
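To update only part of the page, the JavaScript polls a small PHP endpoint and inserts its response into one element instead of reloading everything. A hedged sketch of what that endpoint might look like (the table, columns, and credentials are made up for illustration):
// messages.php -- polled via XMLHttpRequest / $.get() once a second; returns only new messages as JSON
$db = new PDO('mysql:host=localhost;dbname=chat', 'user', 'pass');
$since = isset($_GET['since']) ? (int)$_GET['since'] : 0;
$stmt = $db->prepare('SELECT id, body FROM messages WHERE id > ? ORDER BY id');
$stmt->execute(array($since));
header('Content-Type: application/json');
echo json_encode($stmt->fetchAll(PDO::FETCH_ASSOC));
// the page-side script appends these messages to a single <div> rather than refreshing the page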

php running separately avoiding time out for user

I would like to find a way for my user not to have to wait for the output of a PHP script, and instead be redirected to a page while the script keeps running on the server.
Basically the user submits a form which takes quite long to process, and I would like to redirect the user to a page notifying him that the form is being processed and that its output will be available later (I thought about opening a tab when the output is ready).
Basically I would like something like the following, which I tried without success:
if ($form_valid) {
    process_form(); // this would need to not run as part of the current request, so the user doesn't have to wait for it to be ready (timeout problems)
    header('Location: http://example.com/form_submitted_output_coming_soon.html');
}
I hope that it is not too vague.
Thank you in advance for any help / advice on how I could do that.
It really depends on how long the script takes to execute. If it's a matter of seconds, under 10, I would do an AJAX request and have a modal progress message.
If it takes extended amounts of time, my approach would be to create, or use an existing, task scheduler / report generator:
A single system scheduled task entry calling a central management script (probably not optimal)
You mark a task/report for execution
Concurrency: count and limit the number currently executing, so you don't overload the server (a small sketch of this check follows this list)
Users poll via AJAX for their tasks / reports, or push to the clients with web sockets
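The concurrency check mentioned above can be as small as this (the table and column names are illustrative assumptions):
// at the top of the task runner: bail out if too many tasks are already running
$db = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');
$running = (int)$db->query("SELECT COUNT(*) FROM tasks WHERE status = 'running'")->fetchColumn();
if ($running >= 3) {
    exit; // at the limit; the next scheduled run will pick the task up instead
}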
Example of how to fork PHP to the background
Update
I think you would get better performance out of a bot continuously checking a database or file for work to do and submitting results back to the database, alerting users via AJAX, web sockets and/or email when the work they need is done / updated.
Here is a good introduction on how to build a web crawler in PHP
The best approach for solving this kind of problem is to use AJAX to make the request to the server in the background and then update the user once it has finished processing.
You may submit the form with an asynchronous request (AJAX) and handle the page forwarding with JavaScript as well. This way your form is handled asynchronously - you may even wait for the response so you can tell the user once you have an answer. This asynchronous request will not block the UI.
Just for completeness, if you really, really want to use PHP only for this:
Run PHP Task Asynchronously

Push update notifications back to PHP while calling a function

I have a PHP script that loads a function which uses cURL to log itself in to another webpage to get some data. This operation takes about 14 seconds altogether, and some users might become impatient. I have a little busy loader to indicate activity.
I have seen this on other websites and want to know what technology I need to implement the following:
How can I send little notification messages back to the main site while the PHP function is running, so the user knows about the progress?
Messages could be in the following form:
Logging in to website
Extracting data
Sorting data
Closing connection
You can do it using AJAX; do the following.
1) Keep your cURL code in a separate PHP file.
2) When the button is clicked, call that PHP file via AJAX using JS or jQuery.
3) Before the AJAX call returns its response, display the div with your message; once you get the response, hide the div.
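To surface the per-step messages listed above (rather than a single busy loader), one possible approach is to have the long-running script record its current step somewhere the page can poll. A rough sketch, assuming the cURL work lives in worker.php and the page polls progress.php about once a second (the file names and the use of the session are assumptions, not part of the original question):
// worker.php -- the long-running cURL job; records its progress as it goes
function report_progress($msg) {
    session_start();              // briefly re-open the session...
    $_SESSION['progress'] = $msg;
    session_write_close();        // ...and release the lock so progress.php can read it
}
report_progress('Logging in to website');
// ... cURL login here ...
report_progress('Extracting data');
// ... cURL data fetch here ...
report_progress('Sorting data');
// ... sorting here ...
report_progress('Closing connection');

// progress.php -- polled by the page; echoes the latest message for the busy loader to display
session_start();
echo isset($_SESSION['progress']) ? $_SESSION['progress'] : '';
session_write_close();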
