Multiple connections to a PHP script on behalf of the same user - php

I have long-running operations in my PHP app. It uses mod_rewrite, so all traffic goes to index.php. While one of those long-running tasks is executing, the app doesn't respond in other browser tabs until the task ends.
Is there any configuration that allows multiple connections to one PHP script on behalf of the same user?

You can use Ajax or WebSockets, but there is no pure server-side way to solve this as long as requests block (operations are executed in order). You can try multi-threading: there is a pthreads extension for PHP, but you have to compile PHP with it.
Another solution is to use some sort of work queue, if you don't need to wait for the tasks to end and return a response from them.
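For illustration, a minimal sketch of the pthreads route (it needs a thread-safe (ZTS) PHP build with the extension compiled in; the task body is a placeholder):

<?php
class LongTask extends Thread
{
    public function run()
    {
        // the long-running work happens in its own thread
        sleep(30); // placeholder for the real task
    }
}

$task = new LongTask();
$task->start(); // returns immediately; the request keeps going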

@mlask's reply about session_write_close() did exactly what I wanted. After calling session_write_close(), subsequent connections were accepted immediately.
more about session_write_close()
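For anyone landing here, a minimal sketch of that fix (run_long_task() is a hypothetical stand-in for the slow work):

<?php
session_start();

// Read whatever the long task needs from the session first...
$userId = isset($_SESSION['user_id']) ? $_SESSION['user_id'] : null;

// ...then release the session lock. PHP's default file-based session
// handler serializes requests that share a session; closing it early
// lets other tabs of the same user be served while the task runs.
session_write_close();

run_long_task($userId); // hypothetical long-running task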

Related

Asynchronously calling a PHP Script via another PHP script

Let's assume I have two PHP scripts, s1.php and s2.php. Let's also assume that s2.php takes about 30 minutes to run.
I would like to use s1.php to call s2.php asynchronously. When s2.php is called, it will run on its own without returning any value to s1.php. s1.php would not wait for s2.php to finish; it will continue with the next command while s2.php runs on its own.
So here is the pseudocode for s1.php:
1. Do something
2. Call s2.php
3. Continue s1.php while s2.php is running (this step does not need to wait for s2.php to return; it starts immediately after s2.php starts)
How can I do that?
IMPORTANT NOTE: I am using a shared hosting environment
Out of the box, PHP does not support async processing or threading. What you're most likely after is Queueing and/or Messaging.
This can be as simple as storing a row in a database in s1.php and running s2.php on a cron as suggested in the comments, which reads that database, pulls out new rows and executes whatever logic you need.
It would be up to you to clean up - making sure you aren't reprocessing the same rows multiple times.
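A minimal sketch of that approach (the jobs table, its columns, and the task function are assumptions, not something given in the question):

<?php
// s1.php - enqueue the job and return immediately
$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');
$stmt = $pdo->prepare("INSERT INTO jobs (payload, status) VALUES (?, 'pending')");
$stmt->execute(array(json_encode($data)));

// worker.php - run from cron; claim the row first so reruns don't reprocess it
$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');
$job = $pdo->query("SELECT id, payload FROM jobs WHERE status = 'pending' LIMIT 1")->fetch();
if ($job) {
    $claim = $pdo->prepare("UPDATE jobs SET status = 'running' WHERE id = ? AND status = 'pending'");
    $claim->execute(array($job['id']));
    if ($claim->rowCount() === 1) { // we won the claim
        run_long_task(json_decode($job['payload'], true)); // hypothetical
        $pdo->prepare("UPDATE jobs SET status = 'done' WHERE id = ?")->execute(array($job['id']));
    }
}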
Other solutions would be using something like RabbitMQ or IronMQ. IronMQ might be a good place to look because it's a cloud-based service, which would work well in your shared-hosting environment, and they offer a free 'dev tier' account with probably far more API calls than you'll ever need.
Another fun thing to look at is ReactPHP, as it allows for non-blocking I/O in PHP.
afaik, there's no good way to do this :/
you can launch a detached process with proc_open / exec:

exec('nohup php5 s2.php > /dev/null 2>&1 &');

or fire the request with curl_multi and simply never wait for it to finish:

$cmh = curl_multi_init();
$ch = curl_init('https://example.org/s2.php');
curl_multi_add_handle($cmh, $ch);
curl_multi_exec($cmh, $running);

(or if you don't have curl, substitute with fopen... or if allow_url_fopen is false, you can even go as far as socket_create ~~ :/ )

Send data over an open connection in PHP

I'm trying to find a solution to my problem of sending data to a client with PHP. The biggest issue is that I'm trying to keep sending data inside a single connection with a PHP script. I'm sure there are other ways, but currently I don't know how to solve this.
What I'm trying to do is: a client connects to a web server and keeps the connection open, so the TCP connection stays established. It will keep making, for example, GET requests every X seconds to keep this connection alive.
Within this open connection, on a certain event, I want to send the client some data without the client making a request! So the event is triggered on the server side, not on the client side.
There is no possibility of using any JavaScript or other client-side technique, as my client is an Arduino module. I can keep the connection open, but I need to pass data to the client over HTTP.
I have a database set up on the server, and PHP should send data to the client when something changes inside the database.
I tried playing with flush() running in a loop in the PHP script, but that doesn't do it the way I want.
Any advice is appreciated.
Thank you.
Edit: it would be perfect if the solution also worked on a Windows machine!
Edit 2: There will be multiple clients, not just one (e.g. hundreds).
As long as you don't have lots of clients, Server-Sent Events sounds like it could work for you.
http://dsheiko.com/weblog/html5-and-server-sent-events
Just read that you will have hundreds of clients; in that case you probably won't want to use PHP, but node.js instead.
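For what it's worth, a minimal sketch of a Server-Sent Events endpoint in PHP (the database-polling helper is hypothetical):

<?php
// sse.php - keeps one connection open and pushes events as they occur
header('Content-Type: text/event-stream');
header('Cache-Control: no-cache');
set_time_limit(0);

while (!connection_aborted()) {
    $change = check_database_for_changes(); // hypothetical helper
    if ($change !== null) {
        // SSE wire format: a "data:" line, terminated by a blank line
        echo 'data: ' . json_encode($change) . "\n\n";
        flush();
    }
    sleep(2);
}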
How about cron jobs?
http://en.wikipedia.org/wiki/Cron
http://www.cyberciti.biz/faq/how-do-i-add-jobs-to-cron-under-linux-or-unix-oses/
I think that might be the solution for your project. If I understand crons correctly, what they do is execute a given script at given intervals. So that is basically what you want: a script executed every X seconds, and inside your script you have your function working with the database.
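For example, a crontab entry that runs a PHP script every minute (cron's smallest interval; the paths are illustrative) looks like:

* * * * * /usr/bin/php /path/to/script.php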
I think what you are looking for is IPC - Inter-Process Communication. In your case I would suggest a message queue (or multiple queues).
(On the client)
1. Open a connection to foo.php
2. When receiving a new line, process it
3. If the connection times out, re-open it
(On the server - foo.php)
1. Open a message queue and register it so that your bar.php knows about it (you will have to register a message queue for each user!)
2. Start a blocking receive
3. When a message is received, send whatever you want to send, FLUSH OUTPUT BUFFERS, and go back to 2
4. If anything times out, go back to 2
(On the server - bar.php)
1. When the database changes, send a message to all active queues
There are a few problems with this approach:
The server side only really works on Linux / Unix (that includes Macs)
There is a limited number of message queues
You may have to do some housekeeping, removing old queues, etc.
The benefits:
This is application independent. Message queues are an operating system feature, so your bar.php could really be, say, a Java application.
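A rough sketch of the foo.php/bar.php pair using PHP's System V message queue functions (sysvmsg extension, Linux/Unix only; the per-user key scheme and key bookkeeping are assumptions):

<?php
// foo.php - the client long-polls this; it blocks until bar.php sends something
$queue = msg_get_queue(1000 + $userId); // hypothetical per-user key

// Blocking receive: waits here until a message of type 1 arrives
if (msg_receive($queue, 1, $msgType, 4096, $message)) {
    echo $message . "\n";
    flush(); // flush output buffers, as step 3 above says
}

// bar.php - on a database change, notify every registered queue
foreach ($activeQueueKeys as $key) { // assumes you track registered keys
    msg_send(msg_get_queue($key), 1, $newData);
}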
Ok, so I think I found the way I want it to work. The reason flush() wasn't working was that I didn't reach the flush buffer limit before flushing. Also, I'm using an Nginx server, and I disabled gzip (just in case).
My test code which absolutely works looks like this:
<?php
// send output as soon as it is echoed
ob_implicit_flush(1);
for ($i = 0; $i < 10; $i++) {
    echo $i;
    // pad the output so the buffer reaches the minimum size needed to flush
    echo str_repeat(' ', 1024 * 64);
    sleep(1);
}
?>
Found my answer here: PHP Flush that works... even in Nginx
I will test it with my Arduinos to see if they can accept such output. Thanks all for your help.
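Side note, not from the thread: depending on the Nginx version and setup, a script can also ask Nginx not to buffer its particular response by emitting a header before any output:

<?php
header('X-Accel-Buffering: no'); // tells Nginx to skip buffering this response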

proc_open() runs a process and makes PHP wait for it to finish?

I use XAMPP 1.7.7 on Windows 7 (PHP 5.3.8).
I use proc_open() to run a process and want to redirect to another web page, but PHP waits until the process is finished.
I don't want the running process to make my page wait for it. What should I do? And I need pipes and the return value.
What I need:
A user submits something on page A, then the web app redirects to page B (and the user can leave page B). At the same time some processes are started, produce results, and update the database, so when the user refreshes page B the right result is shown. What's more, the user can view page B at any time.
I noticed chris's comment in the PHP Manual; his method can run a process that is independent of PHP. But I don't know how to use pipes with the hidden process or get its return value.
I have no experience with AJAX. I think Gearman might work, but it seems a little complex.
This should be done using a job queue like Gearman so that you can leave a worker running and then interrogate it for its status later from the page you redirect to.
To install Gearman on Windows please see this previous SO question and answers: How to configure or install GEARMAN in windows OS?
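A rough sketch with the pecl gearman extension (server address, function name, and payload are illustrative):

<?php
// Page A: hand the work to a Gearman worker, then redirect right away
$client = new GearmanClient();
$client->addServer('127.0.0.1', 4730);
$handle = $client->doBackground('long_task', json_encode($formData));
$_SESSION['job_handle'] = $handle; // keep it so page B can ask about it
header('Location: pageB.php');
exit;

// Page B: interrogate the job's status on each refresh
$status = $client->jobStatus($_SESSION['job_handle']);
// $status is array(is_known, is_running, numerator, denominator)

// worker.php - started once from the CLI, keeps running
$worker = new GearmanWorker();
$worker->addServer('127.0.0.1', 4730);
$worker->addFunction('long_task', function (GearmanJob $job) {
    $data = json_decode($job->workload(), true);
    // ...do the slow work, update the database...
});
while ($worker->work());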
PHP is single-threaded by design. There is no way to leave a PHP process running once the HTTP request has finished.
Having said that, you could use AJAX to do what you want. Instead of one HTTP request, fire two requests at the same time. One of them will contain the long process (along with set_time_limit(0)).
There are a lot of different ways to do that. What I usually do is this: when I receive the initial request, I respond immediately with an HTML page that contains an automatic AJAX call to the second PHP file, which contains the long process. So everybody is happy: the user sees an immediate response, and the long process can take its time since nobody is waiting.
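A bare-bones sketch of that pattern (the file names are made up):

<?php
// a.php - respond at once; the returned page fires the second request
echo '<html><body>Processing... check back shortly.
<script>
  var xhr = new XMLHttpRequest();
  xhr.open("GET", "long_process.php"); // the slow script
  xhr.send();
</script>
</body></html>';

// long_process.php - the second request that does the heavy lifting
set_time_limit(0);       // no time limit, as suggested above
ignore_user_abort(true); // survive the user leaving the page
// ...the long work goes here...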
Try seeing if your problem is answered by this: http://nsaunders.wordpress.com/2007/01/12/running-a-background-process-in-php/
I solved this by starting a new PHP process to handle my request.
http://www.php.net/manual/en/function.proc-open.php#90584
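The linked comment boils down to starting the process detached. A hedged, Windows-flavoured sketch (paths are illustrative; note that you give up the pipes and return value this way, which is why the Gearman answer above fits the full requirements better):

<?php
// 'start /B' launches the worker without blocking; pclose(popen(...))
// returns as soon as the command has been handed to the shell
pclose(popen('start /B php C:\\xampp\\htdocs\\worker.php', 'r'));
header('Location: pageB.php');
exit;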

Forcing chat bot made with JAXL/XMPPHP to reconnect upon disconnection

I'm using the JAXL library to implement a Jabber chat bot written in PHP, which is then run as a background process using the PHP CLI.
Things work quite well, but I've been having a hard time figuring out how to make the chat bot reconnect upon disconnection!
I notice when I leave it running overnight it sometimes drops off and doesn't come back. I've experimented with $jaxl->connect(), $jaxl->startStream(), and $jaxl->startCore() after the jaxl_post_disconnect hook, but I think I'm missing something.
One solution would be to test your connection:
1) make a "ping" request to your page/controller or whatever
2) setTimeout(functionAjaxPing(), 10000);
3) read the Ajax response; if it == "anyStringKey", then your connection works fine
4) else: reconnect() / errorMessage() / whatEver()
This is what IRC chat uses, I think. But this will generate more traffic, since the periodic ping requests are needed.
Hope this helps you a bit. :)
If you are using Jaxl v3.x, all you need is to add a callback for the on_disconnect event.
You should also use XEP-0199 XMPP Ping. What this XEP does is periodically send XMPP pings to the connected Jabber server. It will also receive server pings and send back the required pong packet (for instance, if your client doesn't reply to server pings, jabber.org will drop your connection after some time).
Finally, you MUST also use whitespace pings. A whitespace ping is a single space character sent to the server. This is often enough to make NAT devices consider the connection "alive", and likewise for certain Jabber servers, e.g. Openfire. It may also make the OS detect a lost connection faster; a TCP connection on which no data is sent or received is indistinguishable from a lost connection.
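A sketch of the v3 callback registration (add_cb() is the v3 convention; the constructor options and the reconnect body are assumptions, not verified against the library):

<?php
require_once 'jaxl.php';

$client = new JAXL(array(
    'jid'  => 'bot@example.org', // illustrative credentials
    'pass' => 'secret',
));

// Re-establish the stream when the server drops us (assumed sequence;
// compare the connect()/startStream() experiments in the question)
$client->add_cb('on_disconnect', function () use ($client) {
    $client->connect();
});

$client->start();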
What I ended up doing was creating a crontab that simply executed the PHP script again.
In the PHP script I read a specific file for the PID of the last fork. If it exists, the script attempts to kill it. Then the script uses pcntl_fork() to fork the process (which is useful for daemonizing a PHP script anyway) and writes the new PID to the file. The fork then logs in to Jabber with JAXL as usual.
After talking with the author of JAXL, it became apparent this would be the easiest way to go about it, despite being hacky. The author may have addressed this particular flaw in more recent iterations, however.
One flaw of this method is that it requires pcntl_fork(), which is not compiled into PHP by default.
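A condensed sketch of that crontab + PID-file dance (paths and the bot wrapper are hypothetical):

<?php
// rerun.php - executed by cron; replaces any previous instance of the bot
$pidFile = '/var/run/chatbot.pid';

// Kill the previous fork if its PID file is still around
if (file_exists($pidFile)) {
    posix_kill((int) file_get_contents($pidFile), SIGTERM);
}

$pid = pcntl_fork();
if ($pid === -1) {
    die('fork failed');
} elseif ($pid > 0) {
    // Parent: record the child's PID, then let the cron run finish cleanly
    file_put_contents($pidFile, $pid);
    exit(0);
} else {
    // Child: daemonized process logs in to Jabber with JAXL as usual
    run_jabber_bot(); // hypothetical wrapper around the JAXL login/loop
}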

Does PHP execution stop after a user leaves the page?

I want to run a relatively time-consuming script based on some form input, but I'd rather not resort to cron, so I'm wondering whether a PHP page requested through Ajax will continue to execute until completion, or whether it will halt if the user leaves the page.
It doesn't actually output anything to the browser until a json_encode() at the end of the file, so would everything before that still execute?
It depends.
From http://us3.php.net/manual/en/features.connection-handling.php:
When a PHP script is running normally, the NORMAL state is active. If the remote client disconnects, the ABORTED state flag is turned on. A remote client disconnect is usually caused by the user hitting his STOP button.

You can decide whether or not you want a client disconnect to cause your script to be aborted. Sometimes it is handy to always have your scripts run to completion even if there is no remote browser receiving the output. The default behaviour is however for your script to be aborted when the remote client disconnects. This behaviour can be set via the ignore_user_abort php.ini directive as well as through the corresponding php_value ignore_user_abort Apache httpd.conf directive or with the ignore_user_abort() function.
That would seem to say the answer to your question is "Yes, the script will terminate if the user leaves the page".
However, realize that depending on the backend SAPI being used (e.g. mod_php), PHP cannot detect that the client has aborted the connection until an attempt is made to send information to the client. If your long-running script never issues a flush(), it may keep on running even though the user has closed the connection.
Complicating things, even if you do issue periodic calls to flush(), having output buffering on will trap that output and won't send it down to the client until the script completes anyway!
Further complicating things, if you have installed Apache handlers that buffer the response (for example mod_gzip), then once again PHP will not detect that the connection is closed, and the script will keep on trucking.
Phew.
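To make the manual's two knobs concrete, a short sketch (the work helpers are hypothetical):

<?php
ignore_user_abort(true); // keep running even if the client disconnects
set_time_limit(0);

foreach ($workItems as $item) { // hypothetical batch of work
    process_item($item);        // hypothetical worker function
    echo ' ';
    flush();                    // PHP only notices an abort when it
                                // actually tries to send output...
    if (connection_aborted()) { // ...so now we can check and bail out
        break;
    }
}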
It depends on your settings - usually it will stop but you can use ignore_user_abort() to make it carry on.
Depending on the configuration of the web server and/or PHP, the PHP process may or may not kill the thread when the user terminates the HTTP connection. If an AJAX request is pending when the user walks away from the page, it is dependent on the browser killing the request (not guaranteed) on top of your server config (not guaranteed). Not the answer you want to hear!
I would recommend creating a work queue in a flat file or database that a constantly running PHP daemon can poll for jobs. It doesn't suffer from cron delay but keeps CPU/memory usage at a reasonable level. Once the job is complete, place the results in the flat file/database for AJAX to fetch. Or promise to e-mail the user once the job is finished (my preferred method).
Hope that helps
If the client/user/downloader/viewer aborts or disconnects, the script will keep running until something tries to flush new data to the client. Unless you have used ignore_user_abort(), the script will die there.
By the same token, PHP is unable to determine whether the client is still there without trying to flush data to it.
Found the actual solution for my case of the connection not terminating: the session on my Apache/PHP server needed to be closed before the next request could start.
Browser waits for ajax call to complete after abort.
