I'm trying to find a solution to my problem of sending data to a client with PHP. The biggest issue is that I want to keep sending data over a single connection from a PHP script. I'm sure there are other ways, but currently I don't know how to solve this.
What I'm trying to do is this: a client connects to a web server and keeps the connection open, so the TCP connection is "established". It keeps making, for example, GET requests every X seconds to keep the connection alive.
On a certain event, I want to send the client some data over this connection without it making a request! So the event is triggered on the server side, not on the client side.
There is no possibility of using any JavaScript or any client-side technique, as my client is an Arduino module. I can keep the connection open, but I need to push data to the client over HTTP.
I have a database set up on the server and PHP will send data to the client when something changes inside the database.
I tried playing with PHP's flush() running in a loop inside the script, but that doesn't do it the way I want.
So any advice is appreciated.
Thank you.
edit: it would be perfect if the solution also worked on a Windows machine!
edit 2: there will be multiple clients, not just one (e.g. hundreds).
As long as you don't have lots of clients, Server-Sent Events sounds like it could work for you.
http://dsheiko.com/weblog/html5-and-server-sent-events
I just read that you will have hundreds of clients; in that case you probably won't want to use PHP, but node.js instead.
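For a small number of clients, though, a minimal PHP endpoint for Server-Sent Events could look roughly like this (a sketch; fetchLatestData() is a hypothetical helper standing in for your database check):
<?php
// sketch of a Server-Sent Events endpoint; fetchLatestData() is a
// hypothetical helper that returns new data or null
header('Content-Type: text/event-stream');
header('Cache-Control: no-cache');
set_time_limit(0);
ob_implicit_flush(1);
while (true) {
    if ($row = fetchLatestData()) {
        echo 'data: ' . json_encode($row) . "\n\n";
    }
    sleep(1);
}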
How about CRON jobs?
http://en.wikipedia.org/wiki/Cron
http://www.cyberciti.biz/faq/how-do-i-add-jobs-to-cron-under-linux-or-unix-oses/
I think that might be the solution for your project. If I understand cron correctly, what it does is execute a given script at given intervals. That is basically what you want: a script that executes every X seconds, with your database-checking function inside it.
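For example, a crontab entry along these lines (the script path is a placeholder) runs the check every minute, which is cron's smallest interval; for sub-minute checks you'd loop inside the script:
# run the database check every minute; for sub-minute intervals,
# loop (with sleep) inside the script itself
* * * * * php /path/to/check_database.php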
I think what you are looking for is IPC - Inter-Process Communication. In your case I would suggest a message queue (or several of them).
(On the client)
1. Open a connection to foo.php.
2. When receiving a new line, process it.
3. If the connection times out, re-open it.
(On the server - foo.php)
1. Open a message queue (you will have to register a message queue for each user!).
2. Register it so that your bar.php knows about it.
3. Start a blocking receive.
4. When a message is received, send whatever you want to send, FLUSH OUTPUT BUFFERS, and go back to step 3.
5. If anything times out, go back to step 3.
(On the server - bar.php)
When the database changes, send a message to all active queues.
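A minimal sketch of the two server-side scripts, using PHP's System V message queue functions from the sysvmsg extension ($userId, the queue-key scheme, $activeQueueKeys, and $payload are all assumptions):
<?php
// foo.php - one queue per user; deriving the key from a user id
// is an assumption
$key   = 1000 + $userId;
$queue = msg_get_queue($key);
set_time_limit(0);
// blocking receive: waits here until bar.php sends a message
if (msg_receive($queue, 1, $msgType, 4096, $message)) {
    echo $message . "\n";
    flush();   // flush output buffers before looping back
}

<?php
// bar.php - on a database change, notify every registered queue;
// $activeQueueKeys is an assumed registry of per-user keys
foreach ($activeQueueKeys as $key) {
    msg_send(msg_get_queue($key), 1, $payload);
}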
There are a few problems with this approach:
The server side only really works on Linux / Unix (that includes Macs)
There is a limited number of message queues
You may have to do some housekeeping, removing old queues, etc.
The benefits:
This is application-independent. Message queues are an operating-system feature, so your bar.php could really be, say, a Java application.
OK, so I think I found the way I want it to work. The reason flush() wasn't working is that I didn't fill the buffer to its minimum size before flushing. Also, I'm using an Nginx server, and I disabled gzip (just in case).
My test code which absolutely works looks like this:
<?php
ob_implicit_flush(1);
for ($i = 0; $i < 10; $i++) {
    echo $i;
    // padding so the buffer reaches the minimum size required to flush data
    echo str_repeat(' ', 1024 * 64);
    sleep(1);
}
?>
Found my answer here: PHP Flush that works... even in Nginx
I will test it with my Arduinos if it can accept such output. Thanks all for your help.
This question came to me when I encountered a bug that caused my PHP program to loop infinitely. Here is an example situation:
Suppose I have a PHP webpage that receives picture uploads (the page perhaps is the response page for an image upload form). On the server, the script should store the image in a temporary file. It should then output a confirmation message to the client and stop sending data, so that the client doesn't keep waiting. The script should then continue executing, processing the image (like resizing it) before ending.
I think this "technique" could be useful in that the client does not have to wait during time-consuming processes, thereby preventing timeouts.
Also, could this be solved using HTTP methods?
Yes.
This can easily be done without any asynchronous processing if you correctly utilize HTTP headers.
Under normal conditions PHP will stop processing as soon as the client on the other end closes the connection. If you want to continue processing after this event, you need to do one thing: tell PHP to ignore user aborts. How?
ignore_user_abort()
This will allow your script to keep running even after the client gets the heck out of dodge. But we're also faced with the problem of how to tell the client that the request they made is finished so that it will close the connection. Normally, PHP transparently handles sending these headers for us if we don't specify them. Here, though, we need to do it explicitly or the client won't know when we want them to stop reading the response.
To do this, we have to send the appropriate HTTP headers to tell the client when to close:
Connection: close
Content-Length: 42
This combination of headers tells the client that once it has read 42 bytes of the response body, the message is finished and it should close the connection. There are a couple of consequences to this method:
You have to generate your response BEFORE sending any output, because you have to determine its length in bytes so you can send the correct header.
You have to actually send these headers BEFORE you echo any output.
So your script might look something like this:
<?php
ignore_user_abort();
// do work to determine the response you want to send ($responseBody)
$contentLength = strlen($responseBody);
header('Connection: close');
header("Content-Length: $contentLength");
echo $responseBody;
flush(); // push the complete response out before continuing
// --- client will now disconnect and you can continue processing here ---
The big "gotcha" with this method is that when you're running PHP in a web SAPI you can easily run up against the max_execution_time directive if you do time-consuming processing after the end-user client closes the connection. If this is a problem, you may need to consider an asynchronous processing option using cron, because there is no time limit when PHP runs in a CLI environment. Alternatively, you could just raise the time limit of your scripts in the web environment using set_time_limit().
It's worth mentioning that if you do something like this, you may also want to add a check of connection_aborted() while generating your response body, so that you can avoid the additional processing if the user aborts before completing the transfer.
I faced the same problem when uploading images to Twitter and Facebook from an iPhone through a PHP web service.
If the image upload doesn't take much processing time, the comment by @Musa may help you, but if processing takes too long, try these steps:
1. Store the image in a folder.
2. Fetch the image from the folder using cron.
3. Run the cron every 2 minutes in the background.
This will decrease your processing time.
Hope this helps.
It is advisable to do this asynchronously. That is, make another script which only processes the previously created tmp files, and run it with cron (don't even involve Apache). When PHP runs as a web-server module, it should be dedicated to quickly forming a response and then going away to free up resources for the next request.
You are doing the right thing by thinking this way; just keep going one small architectural step further, and fully decouple the request from the heavy lifting that needs to take place.
You can do it in several ways:
1.
ob_start();
// ... generate your output here ...
header("Content-Length: " . ob_get_length());
header("Connection: close");
ob_end_flush();
flush(); // make sure the buffered response actually reaches the client
// do other stuff
2.
Using PHP's system() or exec(), close the process.
3.
Close the process using a shell script.
You can use ob_implicit_flush(). It turns implicit flushing on or off; implicit flushing results in a flush operation after every output call, so explicit calls to flush() are no longer needed.
Refer to:
How do I implement this scenario using PHP?
OR
You could create a standalone cron job which runs at a specific interval and does the work asynchronously, without letting the user know what processing is going on or making the user wait. This way you will even be able to detect failed cases.
And you should also try to minimize the loading time.
I'm using the JAXL library to implement a Jabber chat bot written in PHP, which is then run as a background process using the PHP CLI.
Things work quite well, but I've been having a hard time figuring out how to make the chat bot reconnect upon disconnection!
I notice when I leave it running overnight it sometimes drops off and doesn't come back. I've experimented with $jaxl->connect(), $jaxl->startStream(), and $jaxl->startCore() after the jaxl_post_disconnect hook, but I think I'm missing something.
One solution would be to test your connection:
1) Make a "ping" request to your page/controller or whatever.
2) setTimeout(functionAjaxPing(), 10000);
3) Then read the Ajax response; if it == "anyStringKey", your connection works fine.
4) Else: reconnect() / errorMessage() / whatEver()
This is what IRC chat uses, I think.
But this will generate more traffic, since the ping/pong requests are needed.
Hope this helps you a bit. :)
If you are using Jaxl v3.x, all you need to do is add a callback for the on_disconnect event.
Also, you should use XEP-0199 XMPP Ping. What this XEP does is periodically send XMPP pings to the connected Jabber server. It will also receive server pings and send back the required pong packet (for instance, if your client doesn't reply to server pings, jabber.org will drop your connection after some time).
Finally, you MUST also use whitespace pings. A whitespace ping is a single space character sent to the server. This is often enough to make NAT devices consider the connection "alive", and likewise for certain Jabber servers, e.g. Openfire. It may also make the OS detect a lost connection faster; a TCP connection on which no data is sent or received is indistinguishable from a lost connection.
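Registering the callback could look roughly like this (a sketch, assuming Jaxl v3.x's add_cb() event registration; the JID/password and the restart strategy are placeholders):
<?php
// sketch; assumes Jaxl v3.x's add_cb() API, credentials are placeholders
require_once 'jaxl.php';
$client = new JAXL(array(
    'jid'  => 'bot@example.org',
    'pass' => 'secret',
));
$client->add_cb('on_disconnect', function() {
    // reconnect here, or simply exit and let cron/a supervisor
    // restart the whole process
    error_log('jabber connection lost');
    exit(1);
});
$client->start();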
What I ended up doing was creating a crontab that simply executed the PHP script again.
In the PHP script I read a specific file for the PID of the last fork. If it exists, the script attempts to kill it. Then the script uses pcntl_fork() to fork the process (which is useful for daemonizing a PHP script anyway) and captures the new PID to a file. The forked process then logs in to Jabber with JAXL as usual.
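A minimal sketch of that PID-file-plus-fork pattern (the PID file path is a placeholder):
<?php
// sketch of the pid-file + fork pattern; the path is a placeholder
$pidFile = '/tmp/jabber-bot.pid';
// attempt to kill the previous instance, if any
if (is_file($pidFile)) {
    posix_kill((int) file_get_contents($pidFile), SIGTERM);
}
$pid = pcntl_fork();
if ($pid === -1) {
    exit(1);          // fork failed
} elseif ($pid > 0) {
    exit(0);          // parent exits; the child lives on as the daemon
}
// child: record our pid, then log in to Jabber with JAXL as usual
file_put_contents($pidFile, posix_getpid());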
After talking with the author of JAXL it became apparent this would be the easiest way to go about this, despite being hacky. The author may have worked on this particular flaw in more recent iterations, however.
One flaw of this particular method is that it requires pcntl_fork(), which is not compiled into PHP by default.
I've finally made a simple chat page that I had wanted to make for a while now, but I'm running into problems with my servers.
I'm not sure if long polling is the correct term, but from what I understand, I think it is. I have an Ajax call to a PHP page that checks a MySQL database for messages with times newer than the time sent in the Ajax request. If there isn't a newer message, it keeps looping and checking until there is; otherwise, it returns the new messages, and the client script sends another Ajax request as soon as it gets them.
Everything is working fine, except for the part where the server on 000webhost stops responding after a few chat messages, and the server on x10 hosting gives me a message about hitting a resource limit.
Maybe this is a dumb way to do a chat system, but it's all I know how to do. If there is a better way please let me know.
edit: Holy hell, it just occurred to me that I didn't put any sleep time in the while loop on the server.
You can find a lot of reading on this, but I doubt that free web hosting will allow you to do what you're thinking of doing. PHP was also not really designed for building chat systems.
I would recommend using WebSockets with, for example, Node.js and Socket.IO, or Tornado with Python. There are a lot of solutions out there, but most of them require you to run your own server, since they involve running a whole program that handles many connections at once instead of simple scripts that just start and finish with a single connection.
What about using the same strategy whether or not there are newer messages on the server? The server would always return a list of newer messages; this list could simply be empty when there are none. The empty list could also be encoded as a special data token.
The client then proceeds in both cases the same way: it processes the received data and requests new messages after a time period.
Make sure you sleep(1) on each loop iteration; otherwise the code will run through the loop many times per second, stressing your database/server.
Still, Node.js or WebSockets are better technologies for real-time chat.
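For reference, a long-poll endpoint with the sleep and a hard timeout might look like this (a sketch; the table/column names and PDO credentials are assumptions):
<?php
// sketch of a long-poll endpoint; schema and credentials are assumptions
$since = (int) $_GET['since'];
$db    = new PDO('mysql:host=localhost;dbname=chat', 'user', 'pass');
$start = time();
do {
    $stmt = $db->prepare('SELECT * FROM messages WHERE created > ?');
    $stmt->execute(array($since));
    $rows = $stmt->fetchAll(PDO::FETCH_ASSOC);
    if ($rows) {
        echo json_encode($rows);
        exit;
    }
    sleep(1);                     // don't hammer the database
} while (time() - $start < 25);  // give up before typical 30-second limits
echo json_encode(array());        // empty list; the client just re-polls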
I'm trying to build a web interface for some python scripts. The thing is I have to use PHP (and not CGI) and some of the scripts I execute take quite some time to finish: 5-10 minutes. Is it possible for PHP to communicate with the scripts and display some sort of progress status? This should allow the user to use the webpage as the task runs and display some status in the meantime or just a message when it's done.
Currently I'm using exec(), and on completion I process the output. The server is running on a Windows machine, so pcntl_fork will not work.
LATER EDIT:
Using another PHP script to feed the main page information via Ajax doesn't seem to work, because the server kills it (it reaches the max execution time, and I don't really want to increase that unless necessary).
I was thinking about socket-based communication, but I don't see how this is useful in my case (some hints, maybe?).
Thank you
You want inter-process communication. Sockets are the first thing that comes to mind; you'd need to set up a socket to listen for a connection (on the same machine) in PHP and set up a socket to connect to the listening socket in Python and send it its status.
Have a look at this socket programming overview from the Python documentation and the Python socket module's documentation (especially the examples at the end). I'm sure PHP has similar resources.
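On the PHP side, a minimal listener might look like this (a sketch; the port is arbitrary):
<?php
// sketch: listen locally and read status lines sent by the Python worker
$server = stream_socket_server('tcp://127.0.0.1:9000', $errno, $errstr);
if (!$server) {
    die("$errstr ($errno)");
}
while ($conn = stream_socket_accept($server)) {
    $status = fgets($conn);   // e.g. "42\n" sent by the Python script
    fclose($conn);
    // store or relay $status to the web front end here
}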
Once you've got a more specific idea of what you want to build and need help, feel free to ask a new question on Stack Overflow (if it isn't already answered).
I think you would have to use a meta refresh, and maybe have the Python script write its status to a file and then have the PHP script read from it.
You could use AJAX as well to make it more dynamic.
Also, you probably shouldn't use exec(); that opens up a world of vulnerabilities.
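For the file approach, the PHP side could be as simple as this (a sketch; the progress-file path is a placeholder that the Python script would write to as it runs):
<?php
// status.php - sketch; the Python script is assumed to write its
// progress (e.g. a percentage) to this placeholder file
$file = '/tmp/task-progress.txt';
echo is_file($file) ? file_get_contents($file) : '0';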
You could use a queuing service like Gearman, with a client in PHP and a worker in Python or vice versa.
Someone has created an example setup here.
https://github.com/dbaltas/gearman-python-worker
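On the PHP side, submitting a background job via the PECL gearman extension looks roughly like this (the 'runscript' worker name is an assumption; the matching worker would be registered by the Python side):
<?php
// sketch using the PECL gearman extension; the worker name is an assumption
$client = new GearmanClient();
$client->addServer('127.0.0.1', 4730);
// fire-and-forget: the worker registered for 'runscript' picks this up
$client->doBackground('runscript', json_encode(array('task' => 42)));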
Unfortunately, my friend, I do believe you'll need to use sockets as you suggested. :( I have little experience working with them, but this Python tutorial on sockets/network programming may help you get the Python socket interaction you need. (Beau Martinez's links seem promising as well.)
You'd also need some PHP socket code, so it can request the status.
Continuing on that, my thought is that your Python script is likely going to run in a loop. Ergo, I'd put a "check for a status request" step at the beginning of that loop. It would reply with one status, while a later iteration of the loop would reply with an increased status, etc.
Good luck!
Edit: I think that the file-writing recommendation from Thomas Schultz is probably the easiest to implement. The only downside is waiting for the file to be opened: you'll need to make sure your PHP and Python scripts don't hang or return failure without retrying.
I'm currently running a Linux based VPS, with 768MB of Ram.
I have an application which collects details of domains and then connects to a service via cURL to retrieve details of the pagerank of these domains.
When I run a check on about 50 domains, it takes the remote page about 3 minutes to load with all the results before the script can parse the details and return them to my script. This causes a problem, as nothing else seems to function until the script has finished executing, so users on the site just get a spinner / 'ball of death' while waiting for pages to load.
(The remote page retrieves the domain details and updates the page via AJAX, but the cURL request doesn't (rightly) return the page until loading is complete.)
Can anyone tell me if I'm doing anything obviously wrong, or if there is a better way of doing it. (There can be anything between 10 and 10,000 domains queued, so I need a process that can run in the background without affecting the rest of the site)
Thanks
A more sensible approach would be to "batch process" the domain data via a cron-triggered PHP CLI script.
As such, once you'd inserted the relevant domains into a database table with a "processed" flag set as false, the background script would then:
Scan the database for domains that aren't marked as processed.
Carry out the cURL lookup, etc.
Update the database record accordingly and mark it as processed.
...
To ensure no overlap with an already-executing batch-processing script, you should only invoke the PHP script every five minutes from cron and (within the PHP script itself) check how long the script has been running at the start of the "scan" stage, exiting if it's been running for four minutes or longer. (You might want to adjust these figures, but hopefully you can see where I'm going with this.)
By using this approach, you'll be able to leave the background script running indefinitely (as it's invoked via cron, it'll automatically start after reboots, etc.) and simply add domains to the database/review the results of processing, etc. via a separate web front end.
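A sketch of that overlap guard (the helper functions are assumptions standing in for your database and cURL code):
<?php
// sketch of the batch script's overlap guard; helpers are hypothetical
$started = time();
foreach (fetchUnprocessedDomains() as $domain) {
    if (time() - $started >= 240) {
        exit;   // ran for 4 minutes: stop before the next cron invocation
    }
    $rank = lookupPagerank($domain);   // the cURL call
    markProcessed($domain, $rank);
}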
This isn't the ideal solution, but if you need to trigger this process based on a user request, you can add the following at the end of your script.
set_time_limit(0);
flush();
This will allow the PHP script to continue running while output is returned to the user. But seriously, you should use batch processing; it will give you much more control over what's going on.
Firstly, I'm sorry, but I'm an idiot! :)
I've loaded the site in another browser (FF) and it loads fine.
It seems Chrome puts some sort of lock on a domain when it's waiting for a server response, and I was testing the script manually through a browser.
Thanks for all your help and sorry for wasting your time.
CJ
While I agree with others that you should consider processing these tasks outside of your webserver, in a more controlled manner, I'll offer an explanation for the "server standstill".
If you're using native PHP sessions, PHP uses an exclusive locking scheme, so only a single PHP process can handle a given session ID at a time. Having a long-running PHP script that uses sessions can certainly cause this.
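The usual workaround is to release the lock as soon as you've read what you need; a minimal sketch:
<?php
session_start();                 // acquires the exclusive session lock
$userId = $_SESSION['user_id'];  // read whatever you need first
session_write_close();           // release the lock before the slow work
// long-running cURL calls go here; other requests for the same
// session are no longer blocked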
You can search for combinations of terms like:
php session concurrency lock session_write_close()
I'm sure it's been discussed many times here. I'm too lazy to search for you. Maybe someone else will come along and write an answer with bulleted lists and pretty hyperlinks in exchange for Stack Overflow reputation :) But not me :)
good luck.
I'm not sure how your code is structured, but you could try using sleep(). That's what I use when batch processing.