I've read lots of answers but I haven't found a clear explanation. I'm trying to make an open port check tool in PHP.
When I use fsockopen, the request blocks while it checks every single port in turn, so I cannot use this method as-is to check the ports' status.
Is there any efficient way of doing the same thing but in a non-blocking or asynchronous way?
This is my (inefficient) code:
$resultList = [];

foreach ($list as $key => $value) {
    $object = new stdClass();
    $object->id = $value["id"];

    if (fsockopen($value["ip"], $value["port"])) {
        $object->status = true;
    } else {
        $object->status = false;
    }

    array_push($resultList, $object);
}
A PHP script is always synchronous: you call a function and wait for it to finish before your script moves on. However, you can use PHP to trigger a second PHP script as a new process via the following generalised command:
shell_exec('/usr/bin/php -f /path/to/script.php &> /dev/null &');
Now, this doesn't really help your problem; I just included it to answer the exact words of your question. You can't have this second PHP script interact with the first in any way without involving AJAX, at which point you may as well just AJAX the entire thing.
To try and answer the spirit of your question I would like to suggest a different approach:
You have your parent page:
<?php
$hostname = '192.168.0.1';
?>
<h1>Ports for server <?php echo $hostname; ?>:</h1>
<iframe src="/path/to/my/post-scanner-script.php?hostname=<?php echo urlencode($hostname); ?>"></iframe>
Then your child page scans each port in sequence, and as soon as it has finished scanning a port it calls flush():
<?php
// Stuff here to interpret $_GET['hostname'];

foreach ($list as $key => $value) {
    $status = !!fsockopen($value["ip"], $value["port"]);
    echo "<div>{$value['id']}: " . ($status ? 'OPEN' : 'CLOSED') . "</div>";
    flush();
}
What flush() does is immediately send all rendered output to the client machine (it will not send the same content twice), so what you will get is a slowly appearing list of each port and its status, appearing as soon as your script knows the status of that port.
This provides some immediate feedback to the end user as well as a less frustrating wait (it's much more bearable when you can actually see progress being made).
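One caveat worth hedging: depending on your stack, flush() on its own may not reach the browser, because output can sit in PHP's output buffers, in zlib compression, or in a reverse proxy. The following is only a sketch of the settings that typically help; which of them you actually need depends on your server configuration, and the X-Accel-Buffering header is only relevant if nginx happens to sit in front (an assumption):

<?php
header('X-Accel-Buffering: no');            // only matters behind nginx (assumption)
ini_set('zlib.output_compression', '0');    // compression would buffer the whole response

while (ob_get_level() > 0) {                // drop any buffers started by ini settings or a framework
    ob_end_flush();
}
ob_implicit_flush(true);                    // flush automatically after every echo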
I have a PHP script that processes data downloaded from multiple REST APIs into a standardized format and builds an array or table of this data. The script currently executes everything synchronously and therefore takes too long.
I have been trying to learn how to execute the function that fetches and processes the data, simultaneously or asynchronously so that the total time is the time of the slowest call. From my research it appears that ReactPHP or Amp are the correct tools.
However, I have been unsuccessful in creating test code that actually executes correctly. A simple example is attached, with mysquare() representing my more complex function. Due to a lack of examples on the net of exactly what I'm trying to achieve, I have resorted to a brute-force approach, with three experiments listed in my code.
Q1: Am I using the right tool for the job?
Q2: Can you fix my example code to execute asynchronously?
NB: I am a real beginner, so the simplest possible code example with a minimum of high level programming lingo would be appreciated.
<?php
require_once("../vendor/autoload.php");

for ($i = 0; $i <= 4; $i++) {
    // Experiment 1
    $deferred[$i] = new React\Promise\Deferred(function () use ($i) {
        echo $x."\n";
        usleep(rand(0, 3000000)); // Simulates long network call
        return array($x => $x * $x);
    });

    // Experiment 2
    $promise[$i] = $deferred[$i]->promise(function () use ($i) {
        echo $x."\n";
        usleep(rand(0, 3000000)); // Simulates long network call
        return array($x => $x * $x);
    });

    // Experiment 3
    $functioncall[$i] = function () use ($i) {
        echo $x."\n";
        usleep(rand(0, 3000000)); // Simulates long network call
        return array($x => $x * $x);
    };
}

$promises = React\Promise\all($deferred);     // Doesn't work
$promises = React\Promise\all($promise);      // Doesn't work
$promises = React\Promise\all($functioncall); // Doesn't work

// print_r($promises); // Doesn't return array of results but a complex object

// This is what I would like to execute simultaneously with a variety of inputs
function mysquare($x)
{
    echo $x."\n";
    usleep(rand(0, 3000000)); // Simulates long network call
    return array($x => $x * $x);
}
Asynchronous doesn't mean multiple threads execute in parallel. Two functions can only really run at the 'same time' if they (for example) do I/O, such as an HTTP request.
usleep() blocks, so you gain nothing. Both ReactPHP and Amp have some kind of 'sleep' function of their own that's built right into the event loop.
For the same reason you will not be able to just use curl, because it will also block out of the box. You need to use the HTTP libraries that React and Amp provide and/or recommend.
Since your end-goal is just doing HTTP requests, you could also not use any of these frameworks and just use the curl_multi functions. They're a bit hard to use though.
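For completeness, here is a rough sketch of what the curl_multi route looks like; the URLs are placeholders and error handling is omitted, so treat it as a starting point rather than production code:

<?php
// Sketch: fire several HTTP requests concurrently with curl_multi.
$urls = ['https://example.com/a', 'https://example.com/b']; // placeholder URLs

$mh = curl_multi_init();
$handles = [];
foreach ($urls as $url) {
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_setopt($ch, CURLOPT_TIMEOUT, 10);
    curl_multi_add_handle($mh, $ch);
    $handles[$url] = $ch;
}

// Drive all transfers until they are done.
do {
    $status = curl_multi_exec($mh, $running);
    if ($running) {
        curl_multi_select($mh); // wait for socket activity instead of busy-looping
    }
} while ($running && $status === CURLM_OK);

$results = [];
foreach ($handles as $url => $ch) {
    $results[$url] = curl_multi_getcontent($ch);
    curl_multi_remove_handle($mh, $ch);
    curl_close($ch);
}
curl_multi_close($mh);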
I'm answering my own question in an attempt to help other users, however this solution was developed alone without the help of an experienced programmer and so I do not know if it is ultimately the best way to do this.
TL;DR
I switched from ReactPHP (which I didn't understand) to amphp/parallel-functions, which offers a simplified end-user interface; sample code using that interface is attached.
<?php
require_once("../vendor/autoload.php");

use function Amp\ParallelFunctions\parallelMap;
use function Amp\Promise\wait;

$start = \microtime(true);

$mysquare = function ($x) {
    sleep($x); // Simulates long network call
    //echo $x."\n";
    return $x * $x;
};

print_r(wait(parallelMap([5,4,3,2,1,6,7,8,9,10], $mysquare)));

print 'Took ' . (\microtime(true) - $start) . ' seconds.' . \PHP_EOL;
The example code executes in 10.2 seconds which is slightly longer than the longest running instance of $mysquare().
In my actual use case I was able to fetch data via HTTP from 90 separate sources in around 5 seconds.
Notes:
The amphp/parallel-functions library appears to be using threads under the hood. From my preliminary experience this requires a lot more memory than a single-threaded PHP script, but I haven't yet ascertained the full impact. The issue became obvious when I passed a large array (about 65 MB) to $mysquare via a "use ($myarray)" expression: it brought the code to a standstill and increased execution time so dramatically that it took orders of magnitude longer than synchronous execution, and memory usage peaked at over 5 GB at one point, leading me to believe that amphp was duplicating $myarray for each worker. Reworking my code to avoid the "use ($myarray)" expression fixed that problem.
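To illustrate that last point, here is a minimal sketch (not from the original answer) of the per-item approach, assuming the same amphp/parallel-functions interface as above; the URLs and field names are placeholders. Each job carries only the small payload it needs, so nothing large is captured by the closure and copied to every worker:

<?php
require_once("../vendor/autoload.php");

use function Amp\ParallelFunctions\parallelMap;
use function Amp\Promise\wait;

// Hypothetical inputs: one small array per job instead of one big captured array.
$jobs = [
    ['url' => 'https://api.example.com/a'], // placeholder URLs
    ['url' => 'https://api.example.com/b'],
];

$fetch = function (array $job) {
    // Simulate the slow network call; in real code this would be the HTTP fetch.
    sleep(1);
    return $job['url'];
};

print_r(wait(parallelMap($jobs, $fetch)));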
I have implemented long polling successfully using a normal Apache server, PHP, AJAX and JavaScript. I don't use jQuery to communicate with the server.
The problem is that Apache's capabilities are limited: the server is not able to serve more than 5 browser tabs.
I wonder if there is any customization for Apache or PHP to make them handle more concurrent connections? Or is there any new/smart technique to do that? How many threads can be handled by a robust web server specialized in long polling?
I am not interested in WebSockets because of browser compatibility. I need something easy and robust in PHP. What is Facebook doing? I wonder how they handle all the dynamic updates for millions of users! What products/techniques do they use?
A sample of my code:
srv_polling.php
<?php
function getResults(){..... return result;}

// recursive function inside the server
function hasResultChanged($old, $timeStart){
    // to avoid server timeout (in seconds) in case no change for results
    if(round(abs(time() - $timeStart) / 60*60, 2) > 50)
        return;

    $new = getResults();
    if($new != $old) // get back to browser
        return true;
    else{
        $old = getResults();
        sleep(2);
        return hasResultChanged($old, $timeStart);
    }
}

$timeStart = time();
$old = getResults();
sleep(2);
hasResultChanged($old, $timeStart);
?>
// Javascript code to be executed at browser end
alert('Result has changed');
// Send AJAX request again to same page(srv_polling.php):
ajax.call({......})
Thank you for your hints! Greatly appreciated.
I am using this in my project
public function getLPollData($user, $handlerName) {
    set_time_limit(600);
    date_default_timezone_set('Europe/Berlin');

    $counterEnd   = (int)$_REQUEST["counterEnd"];
    $counterStart = (int)$_REQUEST["counterStart"];
    $this->expireNotifications($counterStart, $counterEnd);

    $secCount = IDLE_WAIT;
    do {
        sleep(IDLE_TIME);
        $updates = $this->fetchAllNotifications($counterEnd);
    } while (!$updates && ($secCount--) > 0);

    if ($updates) {
    }

    header("HTTP/1.0 200");
    return sprintf('{"time" : "%s", "counter" : "%d", start : %d, data : %s}',
        date('d/m H:i:s'), $counterEnd, $counterStart, json_encode($updates));
}
It's a combination of IDLE_WAIT and IDLE_TIME (10 * 3 = ~30 secs).
But I don't think your problem is on the server side. If you are opening 5-6 connections from one browser, remember that each browser has a limit on how many active connections it can have to a particular domain at a time. Try a different browser, or better, different machines, with a maximum of two tabs per browser.
I am trying to implement a realtime chat application using PHP. Is it possible to do it without using persistent data storage like a database or a file? Basically what I need is a mediator written in PHP that:
Accepts messages from client browsers
Broadcasts the message to the other clients
Forgets the message
You should check out HTML5 Web Sockets. They use a two-way connection, so you will not need any database or file. Any chat message that arrives at the server is sent directly to the other users' browsers without any Ajax call. But you also need to set up a web socket server.
Web sockets are used in many realtime applications as well. I am shortly planning to write a full tutorial on that. I will notify you.
Just tried something I had never done before in response to this question. It seemed to work, but I only tested it once. Instead of using a socket, I had the idea of using a shared session variable. Basically, I forced the session id to be the same value regardless of the user, so they are all sharing the same data. From a quick test it seems to work. Here is what I did:
session_id('12345');
session_start();
$session_id = session_id();
$_SESSION['test'] = $_SESSION['test'] + 1;
echo "session: {$session_id} test: {$_SESSION['test']} <br />";
So my thought process was that you could simply store the chat info in a session variable and force everyone, regardless of who they are, to use a shared session. Then you can simply use AJAX to continually reload the current session variable, and use AJAX to edit it when adding a message. Also, you would probably want to set the session to never expire, or to have a really long maxlifetime.
As I said I just played around with this for a few minutes to see if it would work.
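To make that idea a bit more concrete, here is a minimal sketch of what the two AJAX endpoints could look like. The file names, the fixed session id and the field names are assumptions for illustration; this is a toy, not production code:

<?php
// post_message.php (hypothetical endpoint): append a chat line to the shared session.
session_id('12345');        // force every visitor onto the same session
session_start();

$_SESSION['chat'][] = array(
    'user' => isset($_POST['user']) ? $_POST['user'] : 'anon',
    'text' => isset($_POST['text']) ? $_POST['text'] : '',
    'time' => time(),
);
session_write_close();      // release the session file lock so other clients aren't blocked
echo 'ok';

<?php
// get_messages.php (hypothetical endpoint): return the shared chat log as JSON.
session_id('12345');
session_start();
$messages = isset($_SESSION['chat']) ? $_SESSION['chat'] : array();
session_write_close();

header('Content-Type: application/json');
echo json_encode($messages);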
You will want to use Sockets. This article will cover exactly what you want to do: http://devzone.zend.com/209/writing-socket-servers-in-php/
When I tried to solve the same problem, I went with Nginx's Push Module. I chose to go this way since I had to support older browsers (which usually don't support WebSockets) and had no confidence in setting up an appropriate solution like Socket.io behind a TCP proxy.
The workflow went like this:
The clients connect through long-polling to my /subscriber location, which is open to all.
The /publisher location only accepts connections from my own server.
When a client subscribes and talks, it basically just asks a PHP script to handle whatever data is sent.
This script can do validation, authorization, and such, and then forwards (via curl) the message in JSON format to the /publisher (see the sketch after this list).
Nginx's Push Module handles sending the message back to the subscribers and the client establishes a new long-polling connection.
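As a rough illustration of that forwarding step, the PHP handler's call to the publisher could look something like the following; the local URL, the channel id parameter and the payload fields are assumptions about your nginx Push Module configuration:

<?php
// Sketch: forward a validated chat message to nginx's /publisher location via curl.
$payload = json_encode([
    'user' => 'alice',            // hypothetical, would come from your auth layer
    'text' => 'hello everyone',
]);

$ch = curl_init('http://127.0.0.1/publisher?id=lobby'); // channel id is an assumption
curl_setopt($ch, CURLOPT_POST, true);
curl_setopt($ch, CURLOPT_POSTFIELDS, $payload);
curl_setopt($ch, CURLOPT_HTTPHEADER, ['Content-Type: application/json']);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
$response = curl_exec($ch);
curl_close($ch);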
If I had to do this all over again, then I would definitely go the Socket.io route, as it has proper fallbacks to Comet-style long-polling and has great docs for both Client and Server scripts.
Hope this helps.
If you have a business need for PHP, then adding another language to the mix just means you then have two problems.
It is perfectly possible to run a permanent, constantly-running daemonised PHP IRCd server: I know, because I've done it, to make an online game which ran for years.
The IRC server part I used is a modified version of WaveIRCd:
http://sourceforge.net/projects/waveircd/
I daemonised it using code I made available here:
http://www.thudgame.com/node/254
That code might be overkill: I wrote it to be as rugged as I could, so it tries to daemonise using PHP's pcntl_fork(), then falls back to calling itself recursively in the background, then falls back to Perl, and so on. It also handles the security restrictions of PHP's safe mode in case someone turns that on, and the security restrictions imposed by being called through cron.
You could probably strip it down to just a few lines: the bits with the comments "Daemon Rule..." - follow those rules, and you'll daemonise your process just fine.
In order to handle any unexpected daemon deaths, etc, I then ran that daemoniser every minute through cron, where it checked to see if the daemon was already running, and if so either quietly died, or if the daemon was nonresponsive, killed it and took its place.
Because of the whole distributed nature of IRC, it was nicely rugged, and gave me a multiplayer browser game with no downtime for a good few years until bit-rot ate the site a few months back. I should try to rewrite the front end in Flash and get it back up again someday, when I have time...
(I then ran another daemoniser for a PHP bot to manage the game itself, then had my game connect to it as a Java applet and talk to the bot to play the game, but that's irrelevant here).
Since WaveIRCd is no longer maintained, it's probably worth having a hunt around to find if anyone else has forked the project and is supporting it.
[2012 edit: that said, if you want your front end to be HTML5/Javascript, or if you want to connect through the same port that HTTP connects through, then your options are more limited than when using Flash or Java. In that case, take the advice of others, and use "WebSockets" (poor support in most current browsers) or the "Socket.io" project (which uses WebSockets, but falls back to Flash, or various other methods, depending what the browser has available).
The above is for situations where your host allows you to run a service on another port. In particular, many have explicit rules in their ToS against running an IRCd.]
[2019 edit: WebSockets are now widely supported, you should be fine using them. As a relevant case study, Slack is written in PHP (per https://slack.engineering/taking-php-seriously-cf7a60065329), and for some time supported the IRC protocol, though I believe that that has since been retired. As its main protocol, it uses an API based on JSON over WebSockets (https://api.slack.com/rtm). This all shows that a PHP IRCd can deliver enterprise-level performance and quality, even where the IRC protocol is translated to/from another one, which you'd expect to give poorer performance.]
You need to use some kind of storage as a buffer. It IS plausible not to use a file or DB (which also uses a file). You can try using PHP's shared memory functions, but I don't know of any working solution, so you'll have to do it from scratch.
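If you want to experiment with that route, here is a bare-bones sketch using the shmop extension, assuming it is installed; real code would also need locking (e.g. with sem_acquire()) and a sturdier serialization scheme than this:

<?php
// Sketch: keep a small chat buffer in shared memory instead of a file or DB.
$key  = ftok(__FILE__, 'c');            // derive a System V IPC key from this file
$size = 65536;                          // fixed-size segment, an arbitrary choice
$shm  = shmop_open($key, 'c', 0644, $size);

// Read whatever is currently stored (trailing NUL bytes trimmed off).
$raw      = rtrim(shmop_read($shm, 0, $size), "\0");
$messages = $raw !== '' ? json_decode($raw, true) : [];

// Append a new message and write the buffer back, padded to the segment size.
$messages[] = ['user' => 'anon', 'text' => 'hi', 'time' => time()];
shmop_write($shm, str_pad(json_encode($messages), $size, "\0"), 0);

shmop_close($shm);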
Is it possible to do it without using a persistent data storage like
database or file?
It is possible, but you shouldn't do it. A database or file-based approach doesn't slow the chat down; it gives additional security to your chat application. You can make a web-based chat using AJAX and sockets without persistent data.
You should see following posts:
Is database based chat room bad idea?
Will polling from a SQL DB instead of a file for chat application increase performance?
Using memcached as a database buffer for chat messages
persistent data in php question
https://stackoverflow.com/questions/6569754/how-can-i-develop-social-network-chat-without-using-a-database-for-storing-the-c
File vs database for storage efficiency in chat app
PHP is not a good fit for your requirements (in a normal setup like Apache + mod_php, FastCGI, etc.), because the PHP script gets executed from top to bottom for every request and cannot maintain any state between requests without external services or databases/files. (An exception is e.g. http://php.net/manual/de/book.apc.php, but APC is not intended for implementing a chat and will not scale to multiple servers.)
You should definitely look at Node.js and especially the Node.js module Socket.IO (a WebSocket library). It's incredibly easy to use and rocks. Socket.IO can also scale to multiple chat servers with an optional Redis backend.
Trying to use $_SESSION with a static session id as a communication channel is not a solution, by the way, because PHP saves the session data into files.
One solution is to write a PHP socket server.
<?php
// Set time limit to indefinite execution
set_time_limit(0);

// Set the ip and port we will listen on
$address = '192.168.0.100';
$port = 9000;
$max_clients = 10;

// Array that will hold client information
$clients = array();

// Create a TCP Stream socket
$sock = socket_create(AF_INET, SOCK_STREAM, SOL_TCP);
// Bind the socket to an address/port
socket_bind($sock, $address, $port) or die('Could not bind to address');
// Start listening for connections
socket_listen($sock);

// Loop continuously
while (true) {
    // Set up the listening socket plus every connected client for reading
    $read = array($sock);
    for ($i = 0; $i < $max_clients; $i++) {
        if (isset($clients[$i]['sock'])) {
            $read[$i + 1] = $clients[$i]['sock'];
        }
    }

    // Set up a blocking call to socket_select()
    $write = null;
    $except = null;
    $ready = socket_select($read, $write, $except, null);

    /* If a new connection is being made, add it to the client array */
    if (in_array($sock, $read)) {
        for ($i = 0; $i < $max_clients; $i++) {
            if (!isset($clients[$i]['sock'])) {
                $clients[$i]['sock'] = socket_accept($sock);
                break;
            } elseif ($i == $max_clients - 1) {
                print("too many clients");
            }
        }
        if (--$ready <= 0) {
            continue;
        }
    } // end if in_array

    // If a client is trying to write - handle it now
    for ($i = 0; $i < $max_clients; $i++) { // for each client
        if (isset($clients[$i]['sock']) && in_array($clients[$i]['sock'], $read)) {
            $input = socket_read($clients[$i]['sock'], 1024);
            if ($input == null) {
                // Zero length string means the client disconnected
                socket_close($clients[$i]['sock']);
                unset($clients[$i]);
                continue;
            }

            $input = trim($input);
            if ($input == 'exit') {
                // Requested disconnect
                socket_close($clients[$i]['sock']);
                unset($clients[$i]);
            } elseif ($input) {
                // Strip whitespace and write the message back to the user
                $output = preg_replace("/[ \t\n\r]/", "", $input) . chr(0);
                socket_write($clients[$i]['sock'], $output);
            }
        }
    }
} // end while

// Close the master socket
socket_close($sock);
?>
You would execute this by running it from the command line, and it would always have to be running for your PHP clients to connect to it. You could then write a PHP client that connects to the socket.
<?php
$fp = fsockopen("www.example.com", 80, $errno, $errstr, 30);
if (!$fp) {
    echo "$errstr ($errno)<br />\n";
} else {
    $out = "GET / HTTP/1.1\r\n";
    $out .= "Host: www.example.com\r\n";
    $out .= "Connection: Close\r\n\r\n";
    fwrite($fp, $out);
    while (!feof($fp)) {
        echo fgets($fp, 128);
    }
    fclose($fp);
}
?>
You would then have to use some type of AJAX call, e.g. with jQuery, to post the message to this PHP client.
http://devzone.zend.com/209/writing-socket-servers-in-php/
http://php.net/manual/en/function.fsockopen.php
Better to use a Node.js server for this. WebSockets aren't cross-browser nowadays (except Socket.io for Node.js, which works perfectly).
In short: you can't.
The current HTTP/HTML implementation doesn't support server push, so the algorithm of your chat app has to look like this:
A: sends a message.
B, C, D: repeatedly check whether a new message has been sent and, when it has, fetch it.
So the receivers always have to make a new request and check whether a new message has been sent (an AJAX call or something similar).
So there is always a delay between the send event and the receive event.
Which means the data must be saved in something global, like a DB or the file system.
Take a look at:
http://today.java.net/article/2010/03/31/html5-server-push-technologies-part-1
You didn't say it all had to be written in PHP :)
Install RabbitMQ, and then use this chat implementation built on top of websockets and RabbitMQ.
Your PHP is pretty much just 'chat room chrome'. It's possible most of your site would fit within the 5 MB limit of offline HTML5 content, and you get a very flexible (and likely more robust than if you built it yourself) chat system.
It even has 20 messages of chat history if you leave the room.
https://github.com/videlalvaro/rabbitmq-chat
If you need to use just PHP, then you can store chat messages in session variables; a session can act like an object, storing a lot of information.
If you can use jQuery, then you could just append a paragraph to a div after a message has been sent, but then if the site is refreshed the messages will be gone.
Or combine the two: store messages in the session and update the page with jQuery and AJAX.
Try looking into socket libraries like ZeroMQ. They allow for near-instant transport of messages, are faster than working with raw TCP sockets yourself, and are suited to realtime use. Their infrastructure allows data to be sent between points A and B without the data being stored anywhere first (although you can still choose to store it).
Here's a tutorial for a chat client in ZeroMQ
Here is my problem: I have a script (let's call it comet.php) which is requested by an AJAX client script and waits for a change to happen, like this:
while (no_changes) {
    usleep(100000);
    // check for changes
}
I don't like this too much: it's not very scalable and it's (IMHO) bad practice.
I would like to improve this behaviour with a semaphore(?) or some other concurrent-programming technique.
Can you please give me some tips on how to handle this? (I know it's not a short answer, but a starting point would be enough.)
Edit: what about LibEvent?
You can solve this problem using ZeroMQ.
ZeroMQ is a library that provides supercharged sockets for plugging things (threads, processes and even separate machines) together.
I assume you're trying to push data from the server to the client. Well, a good way to do that is using the EventSource API (polyfills available).
client.js
Connects to stream.php through EventSource.
var stream = new EventSource('stream.php');

stream.addEventListener('debug', function (event) {
    var data = JSON.parse(event.data);
    console.log([event.type, data]);
});

stream.addEventListener('message', function (event) {
    var data = JSON.parse(event.data);
    console.log([event.type, data]);
});
router.php
This is a long-running process that listens for incoming messages and sends them out to anyone listening.
<?php
$context = new ZMQContext();

$pull = $context->getSocket(ZMQ::SOCKET_PULL);
$pull->bind("tcp://*:5555");

$pub = $context->getSocket(ZMQ::SOCKET_PUB);
$pub->bind("tcp://*:5556");

while (true) {
    $msg = $pull->recv();
    echo "publishing received message $msg\n";
    $pub->send($msg);
}
stream.php
Every user connecting to the site gets his own stream.php. This script is long-running and waits for any messages from the router. Once it gets a new message, it will output this message in EventSource format.
<?php
$context = new ZMQContext();

$sock = $context->getSocket(ZMQ::SOCKET_SUB);
$sock->setSockOpt(ZMQ::SOCKOPT_SUBSCRIBE, "");
$sock->connect("tcp://127.0.0.1:5556");

set_time_limit(0);
ini_set('memory_limit', '512M');

header("Content-Type: text/event-stream");
header("Cache-Control: no-cache");

while (true) {
    $msg = $sock->recv();
    $event = json_decode($msg, true);

    if (isset($event['type'])) {
        echo "event: {$event['type']}\n";
    }

    $data = json_encode($event['data']);
    echo "data: $data\n\n";

    ob_flush();
    flush();
}
To send messages to all users, just send them to the router. The router will then distribute that message to all listening streams. Here's an example:
<?php
$context = new ZMQContext();

$sock = $context->getSocket(ZMQ::SOCKET_PUSH);
$sock->connect("tcp://127.0.0.1:5555");

$msg = json_encode(array('type' => 'debug', 'data' => array('foo', 'bar', 'baz')));
$sock->send($msg);

$msg = json_encode(array('data' => array('foo', 'bar', 'baz')));
$sock->send($msg);
This should prove that you do not need node.js to do realtime programming. PHP can handle it just fine.
Apart from that, socket.io is a really nice way of doing this. And you could easily connect socket.io to your PHP code via ZeroMQ.
See also
ZeroMQ
ZeroMQ PHP Bindings
ZeroMQ is the Answer - Ian Barber (Video)
socket.io
It really depends on what you are doing in your server-side script. There are some situations in which you have no option but to do what you are doing above.
However, if you are doing something which involves a call to a function that will block until something happens, you can use that to avoid busy-waiting instead of the usleep() call (which is, IMHO, the part that would be considered "bad practice").
Say you were waiting for data from a file or some other kind of stream that blocks. You could do this:
while (($str = fgets($fp)) === FALSE) continue;
// Handle the event here
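A related option, not part of the original snippet, is stream_select(): it blocks until the stream becomes readable or a timeout elapses, which gives you the same "sleep until something happens" behaviour even if the stream is non-blocking. A minimal sketch, assuming $fp is already an open stream:

// Sketch: block until $fp becomes readable (or 30 seconds pass) without spinning.
$read   = array($fp);
$write  = null;
$except = null;

if (stream_select($read, $write, $except, 30) > 0) {
    $str = fgets($fp);   // data is available now, so this read won't spin
    // Handle the event here
}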
Really, PHP is the wrong language for doing stuff like this. But there are situations (I know because I have dealt with them myself) where PHP is the only option.
As much as I like PHP, I must say that PHP isn't the best choice for this task.
Node.js is much, much better for this kind of thing and it scales really well. It's also pretty simple to implement if you have JS knowledge.
Now, if you don't want to waste CPU cycles, you have to create a PHP script that connects to a server of some sort on a certain port. That server should listen for connections on the chosen port, check every X amount of time for whatever you want to check (DB entries for new posts, for example), and then dispatch a message to every connected client that a new entry is ready.
Now, it's not that difficult to implement this event queue architecture in PHP, but it'd take you literally 5 minutes to do it with Node.js and Socket.IO, without worrying whether it'll work in the majority of browsers.
I agree with the consensus that PHP isn't the best solution here. You really need to be looking at dedicated realtime technologies for the solution to this asynchronous problem of delivering data from your server to your clients. It sounds like you are trying to implement HTTP long polling, which isn't an easy thing to solve cross-browser. It's been tackled numerous times by developers of Comet products, so I'd suggest you look at a Comet solution, or even better a WebSocket solution with fallback support for older browsers.
I'd suggest that you let PHP do the web application functionality that it's good at and choose a dedicated solution for your realtime, evented, asynchronous functionality.
You need a realtime library.
One example is Ratchet http://socketo.me/
The part that takes care of the pub sub is discussed at http://socketo.me/docs/wamp
The limitation here is that PHP also needs to be the one that initiates the data change.
In other words, this won't magically let you subscribe to MySQL updates. But if you can edit the code that writes to MySQL, then you can add the publish step there.
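To make that last point concrete, here is a hedged sketch of what the publishing side could look like: the code that writes to MySQL also pushes a small notification that the long-running Ratchet process can pick up. The ZeroMQ transport mirrors the ZMQ examples earlier in this thread; the DSN, the socket address and the message format are assumptions for illustration, not Ratchet's fixed API:

<?php
// Sketch: after saving a row to MySQL, push a notification to the Ratchet process.
// Assumes the Ratchet server is listening for ZeroMQ PUSH messages on port 5555.
$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass'); // placeholder DSN
$pdo->prepare('INSERT INTO posts (title, body) VALUES (?, ?)')
    ->execute(['Hello', 'World']);

$context = new ZMQContext();
$socket  = $context->getSocket(ZMQ::SOCKET_PUSH);
$socket->connect('tcp://127.0.0.1:5555');

// The topic name "posts" is an assumption; your pusher decides how to route it.
$socket->send(json_encode(['topic' => 'posts', 'title' => 'Hello']));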