Ajax/PHP - should I use one long running script or polling?

I have a PHP script that is kicked off via ajax. This PHP script uses exec() to run a separate PHP script via the shell.
The script that is called via exec() may take 30 seconds or so to complete. I need to update the UI once it is finished.
Which of these options is preferred?
a) Leave the HTTP connection open for the 30 seconds and wait for it to finish.
b) Have exec() run the PHP script in the background and then use ajax polling to check for completion (every 5 seconds or so).
c) Something else that I haven't thought of.
Thank you, Brian

Poll the server for updates every few seconds. When you leave a connection open for that long a period of time, there's always the possibility that it will be dropped by the server or the browser (browsers time out if an HTTP request takes too long).
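For option b) to work, the trigger script has to detach the worker so that exec() returns immediately instead of waiting 30 seconds. A minimal sketch of building such a command; the worker.php name and the job-id argument are assumptions, not from the question:

```php
<?php
// Build a shell command that runs the worker detached from the web request.
function background_command(string $script, string $jobId): string
{
    // '> /dev/null 2>&1 &' backgrounds the process so exec() does not wait;
    // 'nohup' keeps the worker alive after the request finishes;
    // 'echo $!' makes exec() return the worker's PID as its last output line.
    return sprintf(
        'nohup php %s %s > /dev/null 2>&1 & echo $!',
        escapeshellarg($script),
        escapeshellarg($jobId)
    );
}

// In the AJAX-triggered script:
// $pid = exec(background_command('worker.php', $jobId));
```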

Option b) feels a little too stateful to me. Does the server need to receive a request once the 30 seconds are done, or does it otherwise get into a bad state (e.g. it doesn't relinquish resources)? If so, definitely go with a), methinks.
As for c), maybe you'll find something on the AJAX Patterns website under Browser-Server Dialogue.

The AJAX option seems good to me. One alternative is the Comet (Ajax push) style, which minimizes traffic: the server sends a signal to the client (browser) only when it has something to say (update the UI).

a)
This could run into timeouts, and it ties up server connections (you usually set a limit on how many connections the server accepts), so you could overload the server if many users send requests at once. In an HTTP environment I would only keep connections open as long as necessary.
b)
If it takes around 30 seconds I would not poll as often as every second; I would increase the polling interval. Is the execution time always about 30 seconds? Example polling style (payload is JSON):
# trigger job/execution
POST /job
=> responds 201 Created with Location: /jobs/{job-id}
# polling
GET /jobs/{job-id}
=> {status:busy}
or
=> {status:completed,result:...}
But in the end it depends on the problem. I like b) more, but it adds more implementation effort. Maybe you have more details? Is it a high-traffic scenario?
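The GET /jobs/{job-id} step above could be served by a small status endpoint. A hypothetical sketch: it assumes the background worker writes its result to a per-job JSON file when finished; the directory layout and field names are assumptions, not part of the answer.

```php
<?php
// Return the polling payload for one job: busy until the result file exists.
function job_status(string $dir, string $jobId): array
{
    $file = $dir . '/' . basename($jobId) . '.json';  // basename() blocks ../ traversal
    if (!is_file($file)) {
        return ['status' => 'busy'];
    }
    return [
        'status' => 'completed',
        'result' => json_decode(file_get_contents($file), true),
    ];
}

if (PHP_SAPI !== 'cli') {  // web entry point
    header('Content-Type: application/json');
    echo json_encode(job_status('/tmp/jobs', $_GET['job_id'] ?? ''));
}
```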

Related

long running php script called via ajax (Don't want to wait for response)

I have a long-running script that can run for a while (it sends an email every 5 seconds to many users). This script is triggered via an ajax request.
If the response is never received, for example because the browser is closed, will the script continue to run? (It appears it does, but are there any conditions under which it won't?)
Does sleep() count towards the max execution time? (It appears it does not.)
1.
The short answer is: it depends.
In fact, it can be configured both in PHP and in the web server you use, and it depends on the mode you run PHP in (module, CGI, or whatever).
You can configure it sometimes, though. There is an option in php.ini:
; If enabled, the request will be allowed to complete even if the user aborts
; the request. Consider enabling it if executing long requests, which may end up
; being interrupted by the user or a browser timing out. PHP's default behavior
; is to disable this feature.
; http://php.net/ignore-user-abort
;ignore_user_abort = On
2. Whether sleep() counts depends on how the time limit is measured. On Windows, max_execution_time is wall-clock time, so sleep() counts towards it. On Linux and most Unix systems, PHP measures the script's own CPU time, so time spent in sleep() (or waiting on I/O) is not counted. IIS counts CPU usage per app pool; I'm not sure how that applies to PHP scripts.
Either way, PHP does not kill a script while it is inside a sleep(); the limit is only enforced once the script is executing again. (Easy test on Windows: add sleep(15); to your script and set the max execution time to 10. You will get a time-limit error, but after 15 seconds, not 10.)
In general, you cannot rely on freely using sleep() in a script, and you should not rely on a script running inside a web server without a user (browser) attached, either. You are apparently solving the problem with the wrong tools: you really should consider using cron jobs or separate tasks.
This depends on the server. Some servers could terminate the script when the socket is closed (which will probably happen when the browser is closed), others could let the script execute until the timeout is reached.
Again, this would depend on the server. I can see an implementation measuring the CPU time the script uses; then again, measuring how long ago the script was started is an equally valid approach. It all depends on what the people making the server software were after.
If you want definite answers, I would suggest sifting through the documentation for the web server and the PHP implementation your script is running on.

push and pull technologies using Ajax or Socket

I have a website that needs to send notifications to online clients in real time, like Facebook. After some googling I found a lot of documentation about push and pull technology, and ways of implementing them using Ajax or sockets. I need to know what is best to use in my case, and how it would be coded using JavaScript or jQuery and PHP.
I cannot say what's best to use in your case without knowing your case in detail.
In most cases it is enough to have the clients check with the server every one or two seconds, asking if something new has happened. I prefer this over sockets most of the time because it works on every web server without any configuration changes and in any browser supporting AJAX, even old ones.
If you have few clients (because every client requires an open socket on the server) and you want real realtime, you can use websockets. There are several PHP implementations, for example this one: http://code.google.com/p/phpwebsocket/
If you can ensure that there will only be a single browser open per logged-in user, then you can apply this long-polling technique easily.
Policy for the Ajax call:
Do not make a request every 2 seconds.
Instead, wait and make the next request only 2 seconds after receiving the response to the previous one.
If a request gets no response within 12 seconds, do not keep waiting; send a fresh request. This is the connection-lost case.
Policy for the server response:
If there is an update, respond immediately. To check whether there is an update, rely on the session (better: have the client send a hint, such as the latest message it received; this second update-checking mechanism also removes the single-browser restriction mentioned above).
Otherwise sleep() for 1 second (do not use a busy loop; use sleep), then check again whether there is an update. If there is, respond; if not, sleep again for 1 second. Repeat this until a total of 10 seconds has elapsed, then respond with "no update".
If you apply this policy (commonly known as long polling), you will find processor usage reduced from 95% to 4% under heavy load.
Hope this explains. Best of luck.
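The server-side half of the policy above can be sketched as a small loop. $hasUpdate and $getUpdate stand in for whatever storage check you actually use (session, database, a client-supplied hint); they are assumptions, not part of the original answer.

```php
<?php
// Hold the request open, checking once per second, for at most $maxSeconds.
function long_poll(callable $hasUpdate, callable $getUpdate, int $maxSeconds = 10): array
{
    // If you use sessions, call session_write_close() before this loop,
    // or concurrent requests from the same user will be serialized.
    for ($elapsed = 0; $elapsed < $maxSeconds; $elapsed++) {
        if ($hasUpdate()) {
            return ['update' => $getUpdate()];  // respond immediately
        }
        sleep(1);  // sleep, don't busy-loop: this is what keeps CPU usage low
    }
    return ['update' => null];  // "no update": the client should reconnect
}
```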
Just apply the long-polling technique using jQuery.
Sockets are not yet supported everywhere, and you would also need to open a listening socket on the server for this to work.

What benefits does long-polling have in php?

Say I am creating a webchat where the chat messages are stored in a SQL database (I don't know how else to do it). What benefit does using AJAX to long-poll have over simply polling every x seconds?
Since PHP runs only when you open the page, the long-polled PHP script will have to check for new messages every second as well. What benefit does long polling have then? Either way I'm going to have a latency of up to x seconds; the only difference is that with long polling the periodic checking happens on the server.
Long polling, in your case, has two advantages:
First, long polling allows clients to receive message updates immediately after they become available on the server, increasing the responsiveness of your webchat.
The second advantage is that almost no change is required in the client application in order to work in this mode. From the client’s point of view, a blocked poll request looks like a network delay, the only difference is that the client doesn't need to wait between sending poll requests, as it would if you were simply polling every x seconds.
However, making the server hold requests increases server load. Typical web servers with synchronous request handling use one thread per request, which means a waiting request blocks the thread that handles it. Thus 100 chat clients using long polling to get message updates from the server will block 100 threads.
Most of these threads will be in a waiting state, but every thread still uses a considerable amount of resources. Comet solves this problem with asynchronous request processing, a technique that allows a request to block without blocking a thread, and which is now supported by several web servers, including Tomcat.
Reference for my answer: oBIX Watch communication engine reference document

PHP simultaneous TCP connections from same browser

I have a PHP script which opens a socket connection (using fsockopen) and takes about 15 seconds to complete and return the result to the browser. Meanwhile, if the browser sends a second request, it is serialized. This gives a bad user experience: if the user clicks 3 times, the third request, which gets sent after 30 seconds, is the one that gets the response; from the browser's perspective the first 2 requests are lost.
I do not use sessions in my script, but I tried putting session_write_close() at the beginning of it, which didn't help.
Also session.auto_start in the php.ini = 0.
Any ideas as to how to make the client requests from the same browser parallel??
Thanks
Gary
1) Download and install Firefox
2) Download and install Firebug
3) Add a sleep(10) to your PHP script so that it pauses for 10 seconds before returning its response
4) Open up your webpage, and watch the outbound connections with Firebug. You should see several that are open and do not yet have a response. They should all return at about the same time, when each one finishes the 10 second delay.
If you do not see multiple connections open at the same time, and return at approximately the same time, then you need to look at your front end code. AJAX requests are asynchronous and can run in parallel. If you are seeing them run serially instead, then it means you need to fix your JavaScript code, not anything on the server.
Parallel asynchronous Ajax requests using jQuery
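For step 3 of the experiment above, a throwaway endpoint like this will do (the idea of a separate test script is from the answer; everything else here is an assumption). Each request sleeps, then reports how long it actually took, so genuinely parallel requests should all finish at about the same time.

```php
<?php
// Sleep, then report the requested and actual delay.
function delayed_response(int $seconds): array
{
    $start = microtime(true);
    sleep($seconds);
    return ['slept' => $seconds, 'elapsed' => round(microtime(true) - $start, 1)];
}

if (PHP_SAPI !== 'cli') {  // only emit HTTP output when run by a web server
    header('Content-Type: application/json');
    echo json_encode(delayed_response(10));
}
```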
If at all possible you should install Redis (*nix). Installing it is as simple as running make in the source directory.
With LPUSH/BRPOP you can handle this asynchronously and keep the ordering intact. If you spawn a couple of workers you can even handle multiple requests simultaneously. The predis client library is pretty solid.
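The LPUSH/BRPOP pattern above can be sketched like this: the web request pushes a job onto a list and returns immediately, while a separate worker process blocks until a job arrives. The method names match predis; the queue name, payload shape, and everything else are assumptions. $redis is any client object exposing lpush/brpop.

```php
<?php
// Producer side (the web request): enqueue and return at once.
function enqueue_job($redis, string $queue, array $job): void
{
    $redis->lpush($queue, json_encode($job));
}

// Consumer side (a long-running worker): block until a job is available.
function next_job($redis, string $queue): array
{
    // timeout 0 means wait forever; brpop pops from the tail, so with
    // lpush at the head the jobs come out in FIFO order
    [, $payload] = $redis->brpop([$queue], 0);
    return json_decode($payload, true);
}
```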

Can PHP scripts continue to run even if the user closed the browser?

For example, there is a very simple PHP script which updates some tables in a database, but this process takes a long time (maybe 10 minutes). I want this script to continue processing even if the user closes the browser, because sometimes users do not wait: they close the browser or go to another webpage.
If the task takes 10 minutes, do not use a browser to execute it directly. You have lots of other options:
Use a cronjob to execute the task periodically.
Have the browser request insert a new row into a database table, so that a regular cronjob can process the new row and execute the PHP script with the appropriate arguments.
Have the browser request write a message to a queue system which has a subscriber listening for such events (and which then executes the script).
While some of these suggestions are probably overkill for your situation, the key common feature is to decouple the browser request from the execution of the job, so that it can be completed asynchronously.
If you need the browser window updated with progress, you will need to use a periodically-executed AJAX request to retrieve the job status.
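The second option above (insert a row, let cron do the work) can be sketched with two small functions. The table and column names are assumptions, not from the answer; the handler callback stands in for your actual slow task.

```php
<?php
// Web request side: only record the job, never do the slow work here.
function enqueue(PDO $db, string $type, array $params): void
{
    $stmt = $db->prepare(
        "INSERT INTO jobs (type, params, status) VALUES (?, ?, 'pending')"
    );
    $stmt->execute([$type, json_encode($params)]);
}

// Cron side (e.g. run every minute): process pending rows, then mark them done.
function work(PDO $db, callable $handler): void
{
    $rows = $db->query("SELECT * FROM jobs WHERE status = 'pending'")->fetchAll();
    foreach ($rows as $row) {
        $handler($row['type'], json_decode($row['params'], true));  // the slow part
        $db->prepare("UPDATE jobs SET status = 'done' WHERE id = ?")
           ->execute([$row['id']]);
    }
}
```

The periodic AJAX progress request mentioned above can then simply SELECT the status column for its job id.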
To answer your question directly, see ignore_user_abort
More broadly, you probably have an architecture problem here.
If many users can initiate this stuff, you'll want the web application to add jobs to some kind of queue, and have a set number of background processes that chew through all the work.
The PHP script will generally keep running after the client terminates the connection, but only up to max_execution_time (set in php.ini or via set_time_limit(), generally 30 seconds by default). Note that by default PHP only notices the disconnect the next time it tries to send output, and whether it aborts at that point is controlled by ignore_user_abort.
For example:
<?php
$fh = fopen("bluh.txt", 'w');
for ($i = 0; $i < 20; $i++) {
    echo $i . "<br/>";
    fwrite($fh, $i . "\n");
    sleep(1);
}
fclose($fh);
?>
Start running that in your browser and close the browser before it completes. You'll find that after 20 seconds the file contains all of the values of $i.
Change the upper bound of the for loop to 100 instead of 20, and you'll find it only runs from 0 to 29: the script hits PHP's max_execution_time, times out, and dies (at least on platforms where the limit is measured in wall-clock time; on Linux, time spent in sleep() is not counted).
If the script is completely server-based (no feedback to the user), it will run to completion even if the client is closed.
The general architecture of PHP is that a client sends a request to a script, and the script sends a reply back. If nothing is sent back to the user, the script will still execute even if the user is no longer on the other side. Put more simply: there is no persistent connection between server and client for a regular script.
You can make the PHP script run every 20 minutes using a crontab entry, which specifies the schedule and the command to run; in this case, the command is the PHP script.
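A crontab entry for the every-20-minutes schedule mentioned above might look like this (the binary and script paths are assumptions; adjust them to your system):

```shell
# m    h dom mon dow  command
*/20 * * * *         /usr/bin/php /path/to/script.php >> /var/log/myscript.log 2>&1
```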
Yes: the server doesn't know that the user closed the browser, at least not immediately.
No: the server probably (depending on how it is configured) won't allow a PHP script to run for 10 minutes. On cheap shared hosting I wouldn't rely on a script running for longer than a reasonable response time.
A server-side script will go on what it is doing regardless of what the client is doing.
EDIT: By the way, are you sure that you want pages that take 10 minutes to open? I suggest you employ a task queue (whose items are executed by cron on a timely basis) and redirect the user to an "OK, I am on it" page.
