Best way to get a Minecraft server's status - query, stream_socket...? - php

Some time ago I used Query to check the status of a Minecraft server via PHP, but I wasn't happy with the results. Sometimes it took more than 10 seconds, or it didn't get the status at all even though the MC server was up and the web server was in the same data center.
Which method do you think would be most stable and perform well: query, stream_socket, or something else?
Should I run the check every 30 seconds via a cronjob, or just cache the results for 30 seconds?

You can create a PHP file with this code and run it periodically to store the status.
<?php
/**
 * @author Kristaps Karlsons <kristaps.karlsons@gmail.com>
 * Licensed under MPL 1.1
 */
function mc_status($host, $port = '25565') {
    $timeInit = microtime(true);
    // TODO: implement a way to store data (memcached or MySQL?) - please don't overload the target server
    $fp = fsockopen($host, $port, $errno, $errstr, $timeout = 10);
    if (!$fp) die($errstr . $errno);

    fputs($fp, "\xFE"); // xFE - request information about the server
    $response = '';
    while (!feof($fp)) $response .= fgets($fp);
    fclose($fp);
    $timeEnd = microtime(true);

    $response = str_replace("\x00", "", $response); // remove NULL bytes
    //$response = explode("\xFF", $response); // xFF - data start (old version, prior to 1.0?)
    $response = explode("\xFF\x16", $response); // data start
    $response = $response[1]; // chop off everything before xFF (could be done with a regex, actually)
    //echo(dechex(ord($response[0])));
    $response = explode("\xA7", $response); // xA7 - field delimiter

    $timeDiff = $timeEnd - $timeInit;
    $response[] = $timeDiff < 0 ? 0 : $timeDiff;

    return $response;
}

$data = mc_status('mc.exs.lv', '25592'); // even better - don't use a hostname, provide the IP instead (the DNS lookup is a waste)
print_r($data); // [0] - motd, [1] - online players, [2] - slots, [3] - request duration in seconds (use this to present latency information)
Credits: skakri (https://gist.github.com/skakri/2134554)
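As for the second part of the question: rather than a cronjob, you can cache on demand. A minimal sketch of a 30-second file cache around mc_status() (the cache path and TTL here are assumptions, adjust to taste):
<?php
// Hypothetical 30-second file cache around mc_status() from above.
function mc_status_cached($host, $port = '25565', $ttl = 30) {
    $cacheFile = sys_get_temp_dir() . "/mc-status-$host-$port.cache";

    // Serve the cached result while it is younger than $ttl seconds.
    if (is_file($cacheFile) && (time() - filemtime($cacheFile)) < $ttl) {
        return unserialize(file_get_contents($cacheFile));
    }

    // Cache is missing or stale: query the server and refresh the cache.
    $status = mc_status($host, $port);
    file_put_contents($cacheFile, serialize($status));
    return $status;
}
This way at most one status query hits the server per 30-second window, no matter how often the page is loaded.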

Related

How to process GuzzleHTTP async requests without blocking?

I need to write a processor that can potentially send out many HTTP requests to an external service. Since I want to maximize performance, I wish to minimize blocking. I'm using PHP 5.6 and GuzzleHTTP.
GuzzleHTTP does have an option for async requests. But since we only have 1 thread available in PHP, I need to allocate some time for them to be processed. Unfortunately I only see one way to do it - calling wait(), which blocks until all the requests are processed. That's not what I want.
Instead I'd like to have some method that handles whatever has arrived, and then returns. So that I can do something along the lines of:
$allRequests = [];
while (!checkIfNeedToEnd()) {
    $newItems = getItemsFromQueue();
    $allRequests = $allRequests + spawnRequests($newItems);
    GuzzleHttp::processWhatYouCan($allRequests);
    removeProcessedRequests($allRequests);
}
Is this possible?
Alright... figured it out myself:
$handler = new \GuzzleHttp\Handler\CurlMultiHandler();
$client = new \GuzzleHttp\Client(['handler' => $handler]);

$promise1 = $client->getAsync("http://www.stackoverflow.com");
$promise2 = $client->getAsync("http://localhost/");

$doneCount = 0;
$promise1->then(function() use (&$doneCount) {
    $doneCount++;
    echo 'Promise 1 done!';
});
$promise2->then(function() use (&$doneCount) {
    $doneCount++;
    echo 'Promise 2 done!';
});

$last = microtime(true);
while ($doneCount < 2) {
    $now = microtime(true);
    $delta = round(($now - $last) * 1000);
    echo "tick($delta) ";
    $last = $now;
    $handler->tick();
}
And the output I get is:
tick(0) tick(6) tick(1) tick(0) tick(1001) tick(10) tick(96) Promise 2 done!tick(97) Promise 1 done!
The magic ingredient is creating the CurlMultiHandler yourself and then calling tick() on it whenever it's convenient. After that, it's promises as usual. And if the queue is empty, tick() returns immediately.
Note that it can still block for up to 1 second (the default) if there is no activity. This can also be changed if needed:
$handler = new \GuzzleHttp\Handler\CurlMultiHandler(['select_timeout' => 0.5]);
The value is in seconds, as a floating-point number.
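To connect this back to the loop sketched in the question, a rough integration; checkIfNeedToEnd() and getItemsFromQueue() are the hypothetical helpers from the question, and handleResponse() is an assumed result handler:
$handler = new \GuzzleHttp\Handler\CurlMultiHandler(['select_timeout' => 0.05]); // short timeout so an idle tick returns quickly
$client = new \GuzzleHttp\Client(['handler' => $handler]);

$pending = [];
while (!checkIfNeedToEnd()) {
    // Spawn async requests for any new work items.
    foreach (getItemsFromQueue() as $id => $url) {
        $pending[$id] = $client->getAsync($url)->then(function ($response) use ($id, &$pending) {
            handleResponse($id, $response); // hypothetical result handler
            unset($pending[$id]);
        });
    }
    // Let cURL make progress on whatever is ready, then go back to the queue.
    $handler->tick();
}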

Measure pagespeed "wait" in PHP

I know I could measure the total site loading time for an external url just with something like:
$start_request = time();
file_get_contents($url);
$end_request = time();
$time_taken = $end_request - $start_request;
But I don't need the total site loading time; I want to measure only the server response time, like it's displayed in the "Wait" part of the result here:
http://www.bytecheck.com/results?resource=https://www.example.com
How can I do this with php?
You can't do this with plain PHP timing. With time() or microtime() you can only get the complete time that one or more commands took.
You need a tool with access to the network-layer data. cURL can do this for you, but you have to enable the PHP cURL extension if that's not already done.
PHP can then take the result and process it.
<?php
// Create a cURL handle
$ch = curl_init('http://www.example.com/');
// Return the body instead of printing it
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
// Execute
curl_exec($ch);
// Check if any error occurred
if (!curl_errno($ch)) {
    $info = curl_getinfo($ch);
    echo 'Took ', $info['total_time'], ' seconds to send a request to ', $info['url'], "\n";
}
// Close handle
curl_close($ch);
You get a bunch of information in $info, such as:
"filetime"
"total_time"
"namelookup_time"
"connect_time"
"pretransfer_time"
"starttransfer_time"
"redirect_time"
The complete list can be found in the curl_getinfo() documentation.
The "Wait" time should be starttransfer_time - pretransfer_time,
so in your case you need:
$wait = $info['starttransfer_time'] - $info['pretransfer_time'];
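Putting the pieces together, a minimal sketch (the URL is a placeholder):
<?php
$ch = curl_init('https://www.example.com/');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true); // keep the body out of the output
curl_exec($ch);

if (!curl_errno($ch)) {
    $info = curl_getinfo($ch);
    // "Wait" = time until the first byte arrives, minus everything up to the request being sent
    $wait = $info['starttransfer_time'] - $info['pretransfer_time'];
    printf("Wait: %.3f s (connect: %.3f s, total: %.3f s)\n",
        $wait, $info['connect_time'], $info['total_time']);
}
curl_close($ch);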

Is there another way to implement Long Polling in PHP

I have read some articles (like this one, or this one), and all of them show the same way to implement long polling in PHP (using usleep() and a loop), like this:
$source; // some data source - db, etc.
$data = null; // our return data
$timeout = 30; // timeout in seconds
$now = time(); // start time

// loop for $timeout seconds from $now until we get $data
while ((time() - $now) < $timeout) {
    // fetch $data
    $data = $source->getData();
    // if we got $data, break the loop
    if (!empty($data)) break;
    // wait 10 ms before checking for new $data again
    usleep(10000);
}

// if there is no $data, tell the client to re-request (arbitrary status message)
if (empty($data)) $data = array('status' => 'no-data');

// send the $data response to the client
echo json_encode($data);
Is there another way? I know that PHP is only a scripting language, but I would like an approach that is event-based rather than checking and waiting until a timeout. Something like Continuations in Java would be perfect.
You could try React: http://reactphp.org/
It's not very mature yet, but it may suit your needs. Instead of doing long polling, you can do it asynchronously.
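For illustration, a minimal event-loop sketch using ReactPHP's periodic timer, assuming react/event-loop is installed and $source is the same hypothetical data source as in the question:
<?php
require 'vendor/autoload.php';

$loop = React\EventLoop\Factory::create();

// Check the source once a second; the event loop stays responsive to other
// timers and streams in the meantime instead of sleeping the whole process.
$loop->addPeriodicTimer(1.0, function ($timer) use ($loop, $source) {
    $data = $source->getData();
    if (!empty($data)) {
        echo json_encode($data);
        $loop->cancelTimer($timer);
    }
});

$loop->run();
This is still polling under the hood, but the loop can multiplex many such checks (and other I/O) in one process instead of one blocked request per client.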
I would recommend http://ape-project.org/: it's mature and scalable.

What would cause fastcgi_finish_request to take several seconds to execute?

I have run into a rather strange issue with a particular part of a large PHP application. The portion of the application in question loads data from MySQL (mostly integer data) and builds a JSON string which gets output to the browser. These requests were taking several seconds (8-10 seconds each) in Chrome's developer tools as well as via curl. However, the PHP shutdown handler I had in place reported that the requests were executing in less than 1 second.
In order to debug I added a call to fastcgi_finish_request(), and suddenly my shutdown handler reported the same time as Chrome / curl.
With some debugging, I narrowed it down to a particular function. I created the following simple test case:
<?php
$start_time = DFStdLib::exec_time();

$product = new ApparelQuoteProduct(19);
$pmatrix = $product->productMatrix();

// This function call is the problem:
$ranges = $pmatrix->ranges();

$end_time = DFStdLib::exec_time();
$duration = $end_time - $start_time;
echo "Output generation duration was: $duration sec";

fastcgi_finish_request();

$fastcgi_finish_request_duration = DFStdLib::exec_time() - $end_time;
DFSkel::log(DFSkel::LOG_INFO, "Output generation duration was: $duration sec; fastcgi_finish_request Duration was: $fastcgi_finish_request_duration sec");
If I call $pmatrix->ranges() (which is a function that executes a number of calls to mysql_query to fetch data and build an in-memory PHP object structure from that data) then I get the output:
Output generation duration was: 0.2563910484314 sec; fastcgi_finish_request Duration was: 7.3854329586029 sec
in my log file. Note that the call to $pmatrix->ranges() does not take long at all, yet somehow it causes the PHP FastCGI handler to take seven seconds to finish the request. (This is true even if I don't call fastcgi_finish_request - the browser takes 7-8 seconds to display the data either way.)
If I comment out the call to $pmatrix->ranges() I get:
Output generation duration was: 0.0016419887542725 sec; fastcgi_finish_request Duration was: 0.00035214424133301 sec
I can post the entire source for the $pmatrix->ranges() function, but it's very long. I'd like some advice on where to even start looking.
What is it about the PHP FastCGI request process which would even cause such behavior? Does it call destructor functions / garbage collection? Does it close open resources? How can I troubleshoot this further?
EDIT: Here's a larger source sample:
<?php
class ApparelQuote_ProductPricingMatrix_TestCase
{
    protected $myProductId;
    protected $myQuantityRanges;
    private $myProduct;

    protected $myColors;
    protected $mySizes;
    protected $myQuantityPricing;

    public function __construct($product)
    {
        $this->myProductId = intval($product);
    }

    /**
     * Return an array of all ranges for this matrix.
     *
     * @return array
     */
    public function ranges()
    {
        $this->myLoadPricing();
        return $this->myQuantityRanges;
    }

    protected function myLoadPricing($force = false)
    {
        if ($force || !$this->myQuantityPricing) {
            $this->myColors = array();
            $this->mySizes = array();

            $priceRec_finder = new ApparelQuote_ProductPricingRecord();
            $priceRec_finder->_link = Module_ApparelQuote::dbLink();
            $found_recs = $priceRec_finder->find(_ALL, "`product_id`={$this->myProductId}", "`qtyrange_id`,`color_id`");

            $qtyFinder = new ApparelQuote_ProductPricingQtyRange();
            $qtyFinder->_link = Module_ApparelQuote::dbLink();
            $this->myQuantityRanges = $qtyFinder->find(_ALL, "`product_id`=$this->myProductId");

            $this->myQuantityPricing = array();
            foreach ($found_recs as &$r) {
                if (false) $r = new ApparelQuote_ProductPricingRecord(); // never runs; IDE type hint only

                if (!isset($this->myColors[$r->color_id]))
                    $this->myColors[$r->color_id] = true;
                if (!isset($this->mySizes[$r->size_id]))
                    $this->mySizes[$r->size_id] = true;

                if (!is_array($this->myQuantityPricing[$r->qtyrange_id]))
                    $this->myQuantityPricing[$r->qtyrange_id] = array();
                if (!is_array($this->myQuantityPricing[$r->qtyrange_id][$r->color_id]))
                    $this->myQuantityPricing[$r->qtyrange_id][$r->color_id] = array();

                $this->myQuantityPricing[$r->qtyrange_id][$r->color_id][$r->size_id] = &$r;
            }

            $this->myColors = array_keys($this->myColors);
            $this->mySizes = array_keys($this->mySizes);
        }
    }
}

$start_time = DFStdLib::exec_time();

$pmatrix = new ApparelQuote_ProductPricingMatrix_TestCase(19);
$ranges = $pmatrix->ranges();

$end_time = DFStdLib::exec_time();
$duration = $end_time - $start_time;
echo "Output generation duration was: $duration sec";

fastcgi_finish_request();

$fastcgi_finish_request_duration = DFStdLib::exec_time() - $end_time;
DFSkel::log(DFSkel::LOG_INFO, "Output generation duration was: $duration sec; fastcgi_finish_request Duration was: $fastcgi_finish_request_duration sec");
Upon continued debugging, I have narrowed it down to the following lines from the above:
if (!is_array($this->myQuantityPricing[$r->qtyrange_id][$r->color_id]))
    $this->myQuantityPricing[$r->qtyrange_id][$r->color_id] = array();
These statements are building an in-memory array structure of all the data loaded from MySQL. If I comment these out, then fastcgi_finish_request takes roughly 0.0001 seconds to run. If I do not comment them out, then fastcgi_finish_request takes 7+ seconds to run.
It's actually the function call to is_array() that's the issue here.
Changing it to:
if (!isset($this->myQuantityPricing[$r->qtyrange_id][$r->color_id]))
resolves the problem. Why is this?
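(For anyone comparing the two constructs: a small sketch of the semantic difference, leaving the performance question itself aside. isset() is a language construct that never fetches the value, while is_array() is an ordinary function call whose argument must actually be read and passed by value.)
<?php
$a = array();

// isset() is a language construct: it never reads the value, so a
// missing offset is simply reported as false, with no notice raised.
var_dump(isset($a['x']['y']));    // bool(false)

// is_array() is an ordinary function call: the nested offsets are actually
// fetched and passed by value, raising an undefined-index notice on the way.
var_dump(is_array($a['x']['y'])); // bool(false), plus E_NOTICE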

Twitter API always saying 400 Bad Request

I am using the following code to retrieve an amount of Tweets from the Twitter API:
$cache_file = "cache/$username-twitter.cache";
$last = filemtime($cache_file);
$now = time();
$interval = $interval * 60; // convert minutes to seconds (e.g. ten minutes)

// Check the cache file age
if (!$last || (($now - $last) > $interval)) {
    // cache file doesn't exist, or is old, so refresh it
    // Get the data from the Twitter JSON API
    //$json = @file_get_contents("http://api.twitter.com/1/statuses/user_timeline.json?screen_name=" . $username . "&count=" . $count, "rb");
    $twitterHandle = fopen("http://api.twitter.com/1/statuses/user_timeline.json?screen_name=$username&count=$count", "rb");
    $json = stream_get_contents($twitterHandle);
    fclose($twitterHandle);

    if ($json) {
        // Decode JSON into an array
        $data = json_decode($json, true);
        $data = serialize($data);

        // Store the data in the cache
        $cacheHandle = fopen($cache_file, 'w');
        fwrite($cacheHandle, $data);
        fclose($cacheHandle);
    }
}

// read from the cache file with either new data or the old cache
$tweets = @unserialize(file_get_contents($cache_file));
return $tweets;
Of course $username and the other variables inside the fopen request are correct, and it produces the correct URL, because I get the error:
Warning: fopen(http://api.twitter.com/1/statuses/user_timeline.json?screen_name=Schodemeiss&count=5) [function.fopen]: failed to open stream: HTTP request failed! HTTP/1.1 400 Bad Request in /home/ellexus1/public_html/settings.php on line 187
That error appears whenever I try to open my page.
Any ideas why this might be? Do I need to use OAuth even just to get my own tweets? Do I have to register my website somewhere that might get posts?
I'm really not sure why this is happening. My host is JustHost.com, but I'm not sure if that makes any difference. All ideas are welcome!
Thanks.
Andrew
PS. This code lies inside a function where $username, $interval and $count are passed in correctly; hence, as the error message shows, it produces a well-formed address.
Chances are you are getting rate-limited:
400 Bad Request: The request was invalid. An accompanying error message will explain why. This is the status code that will be returned during rate limiting.
150 requests per hour for non-authenticated calls (based on IP address)
350 requests per hour for authenticated calls (based on the authenticated user's calls)
You have to authenticate to avoid these errors popping up.
Also, please use cURL when dealing with Twitter. I've used file_get_contents and fopen to call the Twitter API and found them very unreliable; you get hit with errors like that every now and then.
Replace the fopen with
$ch = curl_init("http://api.twitter.com/1/statuses/user_timeline.json?screen_name=$username&count=$count");
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
$it = curl_exec($ch); //content stored in $it
curl_close($ch);
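With cURL you can also check the HTTP status code before caching, so a 400/420 error body never overwrites a good cache. A sketch of the same call with that check added (the fallback handling is an assumption):
$ch = curl_init("http://api.twitter.com/1/statuses/user_timeline.json?screen_name=$username&count=$count");
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
$it = curl_exec($ch); // content stored in $it
// Inspect the HTTP status before trusting the body: 400/420 indicate an
// invalid or rate-limited request, not usable JSON.
$status = curl_getinfo($ch, CURLINFO_HTTP_CODE);
curl_close($ch);
if ($status !== 200) {
    $it = false; // assumed fallback: keep serving the old cache instead of overwriting it
}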
This may help:
https://developer.twitter.com/en/docs/basics/response-codes.html
The error code definitions are given at the above link.
