Execute PHP after 10s, non-blocking - php

I'm building a Facebook application, and due to Facebook's rules I must wait 10 seconds before posting a user action.
This could also be done with JavaScript, but I have found the PHP SDK more precise.
My question is: what is the best and most accurate way of delaying the execution of this code by 10 seconds?
$response = $facebook->api(
    'me/video.watches',
    'POST',
    array(
        'tv_show' => "$permalink"
    )
);
Using sleep(10); will block the entire page from loading. What is the correct solution here?

This can be done by creating a database table and feeding it records to be sent as posts.
Create a cron job on your server that reads the unprocessed records and creates a Facebook post for the desired user, as in the sketch below.
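A minimal sketch of that approach (assuming the old Facebook PHP SDK and a hypothetical pending_posts table; the table, column, and credential names here are illustrative, not from the answer):
// cron_post.php - run periodically by cron, e.g. * * * * * php /path/to/cron_post.php
require_once 'facebook.php'; // Facebook PHP SDK

$db = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');
$facebook = new Facebook(array('appId' => 'APP_ID', 'secret' => 'APP_SECRET'));

// Only pick up rows that are at least 10 seconds old, per the Facebook rule above.
$rows = $db->query(
    "SELECT id, access_token, permalink FROM pending_posts
     WHERE processed = 0 AND created_at <= NOW() - INTERVAL 10 SECOND"
);

foreach ($rows as $row) {
    $facebook->setAccessToken($row['access_token']);
    $facebook->api('me/video.watches', 'POST', array('tv_show' => $row['permalink']));

    $db->prepare("UPDATE pending_posts SET processed = 1 WHERE id = ?")
       ->execute(array($row['id']));
}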

What you want is some form of asynchronous processing. There are a number of different approaches you could take:
Write 'jobs' to a file, a table in a database, etc. and have a cron job handle them.
Fork your PHP process, serving the response to the user in the 'parent' process and posting to Facebook in the 'child' process.
Start a new process and detach it. (exec('php postToFacebook.php > /dev/null 2>/dev/null &');)
Use a job server system such as Gearman.
I'd rate Gearman the coolest; the 'new process and detach' trick is probably the easiest.
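For the Gearman option, a rough sketch of what the two halves could look like (assuming the pecl/gearman extension and a gearmand server on localhost; the function name postToFacebook and the payload fields are made up for illustration):
// client side (inside the web request): queue the job and return to the user immediately
$client = new GearmanClient();
$client->addServer(); // defaults to 127.0.0.1:4730
$client->doBackground('postToFacebook', json_encode(array(
    'access_token' => $accessToken,
    'tv_show'      => $permalink,
)));

// worker side (separate long-running process, e.g. started by supervisord)
$worker = new GearmanWorker();
$worker->addServer();
$worker->addFunction('postToFacebook', function (GearmanJob $job) use ($facebook) {
    $data = json_decode($job->workload(), true);
    sleep(10); // the 10 second wait now happens off the request path
    $facebook->setAccessToken($data['access_token']);
    $facebook->api('me/video.watches', 'POST', array('tv_show' => $data['tv_show']));
});
while ($worker->work());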

Using sleep(10); will block the entire page from loading. What is the correct solution here?
Shot in the dark: try rendering the page first, then calling sleep()? It's difficult to advise a "simple" solution without context on how you are handling the response and what the user needs to see rendered.

You could trigger the Facebook operation via an AJAX call from the client side. You could use a simple setTimeout() to delay the call once the page has loaded:
setTimeout( function(){
    // Execute your AJAX call
}, 10000 ); // 10,000 milliseconds = 10 seconds
The page that is called via AJAX can then simply execute the request to Facebook as usual.
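The PHP endpoint that AJAX call hits could be as simple as the original code, for example (a sketch; the file name post_watch.php and the permalink parameter are assumptions):
// post_watch.php - hit by the delayed AJAX call; performs the Facebook post right away
require_once 'facebook.php';

$facebook  = new Facebook(array('appId' => 'APP_ID', 'secret' => 'APP_SECRET'));
$permalink = $_POST['permalink'];

$response = $facebook->api(
    'me/video.watches',
    'POST',
    array('tv_show' => $permalink)
);

header('Content-Type: application/json');
echo json_encode($response);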

Related

PHP requests processing in serial not parallel

I wrote a page (cron.php) that uses the imap library to connect to a mailbox, parse messages, and store them in a database, then echoes the results as JSON. I have a few dozen mailboxes that I need to run this same process for, so I put together a page (mailboxes.php) that lists all these accounts, each with a button that, when clicked, hits cron.php via AJAX and parses the JSON response to update the page when the process is completed.
I've noticed, however, that if I click each of these buttons, they return as if running serially, not in parallel. Is there a configuration option someplace that might explain this?
Yeah, you need to use session_write_close() in the cron.php file.
Are you using sessions? Every time you run session_start() for a given session, the session is locked until the script finishes or the session is explicitly released.
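In practice that means releasing the session lock in cron.php as soon as you have read what you need from it, roughly like this (sketch):
// cron.php - release the session lock early so parallel AJAX calls are not serialized
session_start();
$account = $_SESSION['account']; // read whatever the script needs from the session first
session_write_close();           // releases the lock; other requests can now proceed

// ... long-running IMAP parsing and database work continues here ...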

PHP running separately, avoiding timeout for the user

I would like to find a way for my user not to have to wait for the output of a PHP script, and instead be redirected to a page while the script is running on the server.
Basically the user submits a form which takes quite a long time to process, and I would like to redirect the user to a page notifying them that the form is being processed and that its output will be available later (I thought about opening a tab when the output is ready).
Basically I would like something like this, which I tried without success:
if ($form_valid) {
    process_form(); // this needs to run outside the current page so the user doesn't have to wait for it (timeout problems)
    header('Location: http://example.com/form_submitted_output_coming_soon.html');
}
I hope that this is not too vague.
Thank you in advance for any help or advice on how I could do this.
It really depends on how long the script takes to execute. If it's seconds (under 10), I would do an AJAX request and show a modal progress message.
If it takes extended amounts of time, my approach would be to create (or use an existing) task scheduler / report generator:
a single system scheduled-task entry calling a central management script (probably not optimal)
you mark a task/report for execution
concurrency: count and limit the number currently executing (so you don't overload the server)
users poll via AJAX for their tasks/reports, or push to the clients with web sockets
Example on how to fork PHP to the background
Update
I think you would get better performance out of a bot that continuously checks a database or file for work to do and submits results back to the database, alerting users via AJAX, web sockets and/or email when the work they need is done or updated.
Here is a good introduction on how to build a web crawler in PHP.
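A rough sketch of the kind of worker bot described in the update above, assuming a hypothetical jobs table with a status column and a JSON payload (all names are illustrative):
// worker.php - long-running background "bot" that continuously checks for work
$db = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');

while (true) {
    $job = $db->query(
        "SELECT id, payload FROM jobs WHERE status = 'pending' ORDER BY id LIMIT 1"
    )->fetch(PDO::FETCH_ASSOC);

    if ($job) {
        $db->prepare("UPDATE jobs SET status = 'running' WHERE id = ?")
           ->execute(array($job['id']));

        process_form(json_decode($job['payload'], true)); // the slow work from the question, fed its input from the queue

        $db->prepare("UPDATE jobs SET status = 'done' WHERE id = ?")
           ->execute(array($job['id']));
        // an AJAX poll (or web sockets / email) can now tell the user their output is ready
    } else {
        sleep(5); // nothing to do; wait before checking again
    }
}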
The best approach for solving this kind of problem is to use AJAX to make the request to the server in the background and then update the user once it has finished processing.
You may submit the form with an asynchronous request (AJAX) and handle the redirect to the next page also with JavaScript. This way your form is handled asynchronously; you may even wait for the response so you can tell the user once you have an answer. The asynchronous request will not block the UI.
Just for completeness, if you really, really want to use PHP only for this:
Run PHP Task Asynchronously

Terminate connection to jQuery AJAX request, keep processing php on server side?

I have a signup form that calls a PHP script which can interact with our CRM's API, like so:
CRM API <--> PHP script <--> Signup form
1. The signup form passes some information to the PHP script in one AJAX call.
2. The PHP script runs a dozen API calls to the CRM to create an account and attach various data.
3. The CRM returns the new account ID it just created to the PHP script.
4. The PHP script passes the account ID back to the signup form, at which point the AJAX call is complete and the signup form can continue.
The problem is #2: those dozen calls take about 20 seconds to complete, but the data the signup form needs is generated after the first API call, so in theory the script could return that data much sooner and do the rest of the work server side without holding the AJAX call open the whole time.
I tried flush() and ob_flush(), which does output the account ID to the client before processing is complete, but the jQuery AJAX connection remains open, so I'm still stuck waiting for the connection to close before anything happens on the signup form side.
So what's the easiest route for returning that account ID to the form as fast as possible?
Maybe break out using curl and exec?
if (signing up) {
    stuff
    exec(curl myself, notsignup)
} else {
    bunch of api calls
}
You should probably think about creating a separate process for the rest of the steps that are needed. One way: after the first API call (#1) has completed, respond back to the user right away rather than trying to complete the remaining calls while the AJAX request is held open.
Then create a queue that will finish the rest. You could create a table in MySQL to store the queue.
Next, create a cron job that will run in the background, working the queue off.
Note: you will not want this cron job to just start and never stop. Maybe have it run every 5 minutes, but before it starts, check whether another run is still in progress; if it is, skip and check again in another 5 minutes (see the sketch below).
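A simple way to do that overlap check is a lock that the cron script takes before it starts, for example (sketch; the file path is arbitrary):
// queue_worker.php - run by cron every 5 minutes; bails out if a previous run is still going
$lock = fopen('/tmp/signup_queue.lock', 'c');

if (!flock($lock, LOCK_EX | LOCK_NB)) {
    exit; // another run is still in progress; cron will try again in 5 minutes
}

// ... read queued signups from the MySQL table and run the remaining CRM API calls ...

flock($lock, LOCK_UN);
fclose($lock);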
Hope this helps!
If you only need the information from the first API call to return to the form, then I would probably try a different workflow:
1. Form calls the PHP script
2. PHP makes the first API call
3. PHP returns to the form
4. Form processes the response
5. Form calls a second PHP script to complete the process
6. PHP finishes the API calls (the form can abandon at this point, since it sounds like you don't care what happens from here on out)
This workflow requires a little more work and coordination from the developer, but presents the most responsive interface to the user.
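On the PHP side that split could look roughly like this (two endpoints; the file names and the crm_* helper functions are assumptions standing in for the real API calls):
// signup_step1.php - called first by the form; only does the API call whose result the form needs
$accountId = crm_create_account($_POST);              // first CRM API call
echo json_encode(array('account_id' => $accountId));  // form gets its answer quickly

// signup_step2.php - fired by the form right after it receives the account id;
// the form can abandon this request without waiting for a response
ignore_user_abort(true); // keep going even if the client disconnects
set_time_limit(0);
crm_attach_extra_data($_POST['account_id']);          // the remaining dozen CRM API calls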

AJAX call to check status of running process in the background, timing out

I've looked all over for an answer on this and haven't been able to find one; hopefully someone can point me in the right direction. I think I'm close.
I have two hosts, let's call them host1.mydomain.com and host2.mydomain.com (to get around the 2-concurrent-connections-per-host/per-browser issue); they both point to the same content, one is just an alias of the other.
The user goes to host1.mydomain.com, enters some information to register, and clicks Go, which loads an iframe on the same page pointing to a page on host2.mydomain.com. That page calls a PHP script via exec("curl"), sending the request to the background to start a website scraper; the process ID is then stored in the database for the user. After the iframe has successfully loaded (which only takes a second since it's creating a background process), an AJAX request set on an interval periodically checks the status of the cURL process (by its process ID in the database) so that I can display the current step of the scraper (there are 6 steps in total). All good so far.
The problem is that the AJAX requests are timing out after step 4 of the scraper (the browser default timeout is 115/120 seconds), even though they shouldn't be, because I'm working with two different hosts... it's almost as if I'm clogging both connections on host1.mydomain.com, which shouldn't happen because I initiated the scraper from host2.
The iframe loads this URL: http://host2.mydomain.com/page.php
The PHP script calls:
exec("curl -o /dev/null 'http://host2.mydomain.com/page.php?method=process' > /dev/null & echo $!", $op);
Then my AJAX request polls http://host1.mydomain.com/status.php?pid=x, which looks up the status in the database by the process ID.
Once the scraper gets to step 4, my AJAX requests time out.
I think I confused myself explaining this, but hopefully someone can help me.
Turns out I was successfully getting around the 2-connections-per-server/browser limitation... however, in doing some research I found that the reason my AJAX request was hanging is that I was trying to access and write to the session data from both of the requests. Digging a little deeper I found session_write_close(), which closes the session for reading/writing. I basically have to call this after each page request of the scraper and then reinitialize the session; this allows my AJAX requests to go through and stops the blocking of the request.
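The close-and-reopen pattern inside each step of the scraper looks roughly like this (a sketch of the idea, not the actual scraper code):
// at the start of each scraper step: record progress, then give the lock back
session_start();
$_SESSION['scrape_step'] = $currentStep; // progress that other requests may read
session_write_close();                   // unlock so the polling AJAX requests aren't blocked

// ... do the slow work for this step ...

// repeat session_start() / session_write_close() around the next status update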
Hopefully someone else finds this useful if you stumble across the same issue.
Cheers!
Jeff
Instead of waiting for the request to finish, you should spawn a new process which runs in the background on the server, and use JavaScript to "check back" every few seconds to see when the execution has finished. Then all you have to do is pick up the result and display it.
Additionally, you might want to make sure that only one PHP process is spawned.
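To report progress and to know whether the background process is still alive (so you never spawn a second one), the status script can test the stored PID, for example (a sketch; the scrapes table and step column are assumptions):
// status.php?pid=x - polled by the AJAX interval
$pid = (int) $_GET['pid'];

// On Linux, /proc/<pid> exists only while the process is running
$running = file_exists('/proc/' . $pid);

// Look up the current step recorded by the scraper
$db   = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');
$stmt = $db->prepare("SELECT step FROM scrapes WHERE pid = ?");
$stmt->execute(array($pid));
$step = $stmt->fetchColumn();

header('Content-Type: application/json');
echo json_encode(array('running' => $running, 'step' => $step));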

How to make test.php continue with the rest of the code without waiting for a function to complete its task?

This is the case: at test.php, I have a function dotask(a, b, c, d).
This function needs to do a task that takes 2-4 minutes to complete.
The task includes inserting new records into the DB, cURL calls to other URLs to do work, etc.
However, I want test.php to:
Just make sure dotask(a, b, c, d) is called; there is no need to wait until the task is completed before returning and continuing with the remaining code at the bottom.
Reason: test.php is a thank-you page. I can't expect the user to wait a few minutes for the task to be completed, because I need to show the thank-you message etc. on that page.
What can I do?
You can't fork a process in PHP without a lot of hackery. I'd recommend using a queue and worker pattern instead. See this answer: PHP- Need a cron for back site processing on user signup... (or fork process)
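A rough sketch of that queue and worker split for dotask(a, b, c, d), assuming a hypothetical task_queue table (the names are illustrative):
// test.php (the thank-you page) - just enqueue the parameters and keep rendering
$db = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');
$db->prepare("INSERT INTO task_queue (params, status) VALUES (?, 'pending')")
   ->execute(array(json_encode(array($a, $b, $c, $d))));
// ... show the thank-you message; the user never waits for dotask() ...

// worker.php (separate script run by cron) - performs the slow work outside the web request
$db = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');
foreach ($db->query("SELECT id, params FROM task_queue WHERE status = 'pending'") as $row) {
    list($a, $b, $c, $d) = json_decode($row['params'], true);
    dotask($a, $b, $c, $d); // the 2-4 minute job from the question
    $db->prepare("UPDATE task_queue SET status = 'done' WHERE id = ?")
       ->execute(array($row['id']));
}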
I've seen a few solutions here that require you to pass in the page that you want to run, e.g.:
BackgroundProcess::fork('process_user.php');
But in my case, at test.php, I have dotask(a, b, c, d).
I need to pass in parameters from this page to continue the work.
So should I just pass these few parameters into a new DB table called pendingprocess, and then at process_user.php read them from the database and continue the task instead?
I can't embed parameters into taskpage.php, right?
Another solution I can think of: at thankyou.php, I do a body onload AJAX call, passing the parameters to process_user.php to perform the task. Can anyone advise whether this is a good way?
Will the process stop executing when the user clicks STOP in the browser? What if they refresh the browser?
