PHP client calling a .NET web service: keeping session information

I'm building a PHP client (I already have a C# client) as a proof of concept. The PHP client will connect to the same .NET server using SOAP. As an example I'm using the game blackjack.
Now, the C# client works perfectly, but there is an issue with the PHP one. After much debugging I found out that PHP opens a new connection for every remote server call. This makes it impossible to program a game.
For example, I have a PHP file that just sets up the client like this (client.php):
try {
    $client = new SoapClient("http://localhost:8000/BlackJack/Service?wsdl",
        array(
            "trace"      => 1,
            "exceptions" => 0,
            "cache_wsdl" => 0
        )
    );
} catch (Exception $e) {
    print 'Caught exception: ' . $e->getMessage() . "\n";
}
Then in my main file (I'll be using jQuery and AJAX calls to load it dynamically, but for now I'm just testing) I have the following code (blackJackClient.php):
require_once("client.php");
$ini = ini_set("soap.wsdl_cache_enabled","0");
$BetAmountPost = 100;
print_r($result = $client->buttonDeal_Click(array("BetAmount" => (string)$BetAmountPost))->buttonDeal_ClickResult);
print_r($result = $client->PlayerMoney()->PlayerMoneyResult);
The first call will start a new game (Deal), and the player's bet amount (for example 100) gets subtracted from the total amount (1000). So what I get returned in $result is money = 900.
The following command asks what money I currently have; one would expect to get 900 back, but instead I get 1000 (my starting amount).
So my question is: how can I make all the calls part of one session, so that we keep using the same connection?
Thanks!

Since SOAP runs over HTTP, and HTTP is stateless, there is no way to keep your connection open for the duration of the game.
Instead, you should send your user's authentication (or some other session identifier) with every request to the server.
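A sketch of one way to do that, assuming the .NET service keeps its game state in an ASP.NET session (e.g. WebMethod(EnableSession = true)) and therefore sets an ASP.NET_SessionId cookie; the cookie name and the regex here are assumptions:

$client = new SoapClient("http://localhost:8000/BlackJack/Service?wsdl",
    array("trace" => 1));
$client->buttonDeal_Click(array("BetAmount" => "100"));

// Capture the session cookie from the first response and re-send it with
// every subsequent call, so the server can tie the calls to the same game.
if (preg_match('/ASP\.NET_SessionId=([^;]+)/',
        $client->__getLastResponseHeaders(), $m)) {
    $client->__setCookie('ASP.NET_SessionId', $m[1]);
}

echo $client->PlayerMoney()->PlayerMoneyResult; // now reflects the same session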

Related

Rate-limiting Guzzle Requests in Symfony

This actually follows on from a previous question of mine that, unfortunately, did not receive any answers, so I'm not exactly holding my breath for a response, but I understand this can be a bit of a tricky issue to solve.
I am currently trying to implement rate limiting on outgoing requests to an external API to match the limit on their end. I have tried to integrate a token bucket library (https://github.com/bandwidth-throttle/token-bucket) into the class we are using to manage Guzzle requests for this particular API.
Initially, this seemed to be working as intended but we have now started seeing 429 responses from the API as it no longer seems to be correctly rate limiting the requests.
I have a feeling what is happening is that the number of tokens in the bucket is now being reset every time the API is called due to how Symfony handles services.
I am currently setting up the bucket location, rate and starting amount in the service's constructor:
// Imports assumed from the bandwidth-throttle/token-bucket library:
use bandwidthThrottle\tokenBucket\Rate;
use bandwidthThrottle\tokenBucket\TokenBucket;
use bandwidthThrottle\tokenBucket\BlockingConsumer;
use bandwidthThrottle\tokenBucket\storage\FileStorage;

public function __construct()
{
    $storage = new FileStorage(__DIR__ . "/api.bucket");
    $rate = new Rate(50, Rate::MINUTE);
    $bucket = new TokenBucket(50, $rate, $storage);
    $this->consumer = new BlockingConsumer($bucket);
    $bucket->bootstrap(50);
}
I'm then attempting to consume a token before each request:
public function fetch(): array
{
    try {
        $this->consumer->consume(1);
        $response = $this->client->request(
            'GET', $this->buildQuery(), [
                'query' => array_merge($this->params, ['api_key' => $this->apiKey]),
                'headers' => ['Content-type' => 'application/json']
            ]
        );
    } catch (ServerException $e) {
        // Process Server Exception
    } catch (ClientException $e) {
        // Process Client Exception
    }
    return $this->checkResponse($response);
}
I can't see anything obvious in that which would allow it to request more than 50 times per minute, unless the number of available tokens is being reset on each request.
This is being supplied to a set of repository services that handle converting the data from each endpoint into objects used within the system. Consumers use the appropriate repository to request the data needed to complete their process.
If the number of tokens is being reset by the bootstrap call living in the service constructor, where should it be moved within the Symfony framework so that it still works with the consumers?
I assume that it should work, but maybe try to move the ->bootstrap(50) call out of the constructor, so that it doesn't run on every request? Not sure, but that could be the reason.
Anyway, it's better to do that only once, as part of your deployment (every time you deploy a new version). It doesn't really have anything to do with Symfony, because the framework doesn't impose any restrictions on the deployment procedure. So it depends on how you do your deployment.
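A minimal sketch of that idea, assuming a one-off Symfony console command run at deploy time (the class and command name are made up; the bucket setup mirrors the constructor above):

use bandwidthThrottle\tokenBucket\Rate;
use bandwidthThrottle\tokenBucket\TokenBucket;
use bandwidthThrottle\tokenBucket\storage\FileStorage;
use Symfony\Component\Console\Command\Command;
use Symfony\Component\Console\Input\InputInterface;
use Symfony\Component\Console\Output\OutputInterface;

class BootstrapBucketCommand extends Command
{
    protected static $defaultName = 'app:bootstrap-bucket'; // hypothetical name

    protected function execute(InputInterface $input, OutputInterface $output): int
    {
        // Use the same FileStorage path as the service, so the service
        // constructor no longer needs to call bootstrap() itself.
        $storage = new FileStorage(__DIR__ . '/api.bucket');
        $rate = new Rate(50, Rate::MINUTE);
        $bucket = new TokenBucket(50, $rate, $storage);
        $bucket->bootstrap(50); // fill the bucket once per deployment

        return 0;
    }
}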
P.S. Have you considered just handling 429 errors from the server? IMO you can simply wait (that's what BlockingConsumer does internally) when you receive a 429. It's simpler and doesn't require an additional layer in your system.
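For instance, a rough sketch with Guzzle (assuming $client and $url from your service, and http_errors enabled so a 429 surfaces as a ClientException):

use GuzzleHttp\Exception\ClientException;

try {
    $response = $client->request('GET', $url);
} catch (ClientException $e) {
    $res = $e->getResponse();
    if ($res !== null && $res->getStatusCode() === 429) {
        // Wait as long as the server asks (fall back to 5s), then retry once.
        $wait = (int) $res->getHeaderLine('Retry-After') ?: 5;
        sleep($wait);
        $response = $client->request('GET', $url);
    } else {
        throw $e;
    }
}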
BTW, have you considered nginx's ngx_http_limit_req_module as an alternative solution? It usually ships with nginx by default, so no additional installation is needed, only a small amount of configuration.
You can place an nginx proxy between your code and the target web service and enable limits on it. Then in your code you handle 429s as usual, but the requests will be throttled by your local nginx proxy, not by the external web service, so the final destination only ever sees a limited number of requests.
I found a trick using the Guzzle bundle for Symfony.
I had to improve a sequential program sending GET requests to a Google API; in the code example it is a PageSpeed URL.
To respect a rate limit, there is an option to delay the requests before they are sent asynchronously.
The PageSpeed rate limit is 200 requests per minute.
A quick calculation gives 60/200 = 0.3s per request.
Here is the code I tested on 300 URLs, with the fantastic result of no errors, except when the URL passed as a parameter in the GET request returns a 400 HTTP error (Bad Request).
I put a delay of 0.4s, and the average result time is less than 0.2s, whereas it took more than a minute with the sequential program.
use GuzzleHttp;
use GuzzleHttp\Client;
use GuzzleHttp\Promise\EachPromise;
use GuzzleHttp\Exception\ClientException;

// ... Now inside class code ... //

$client = new GuzzleHttp\Client();
$promises = [];

foreach ($requetes as $i => $google_request) {
    // The delay is the trick to not exceed the rate limit (in ms)
    $promises[] = $client->requestAsync('GET', $google_request, ['delay' => 0.4 * $i * 1000]);
}

GuzzleHttp\Promise\each_limit(
    $promises,
    function () { // callable returning the number of concurrent requests
        return 100; // 1 or 100 concurrent requests don't really change execution time
    },
    // Fulfilled function
    function ($response, $index) use ($urls, $fp) { // $urls holds the url passed in the GET request, $fp is a CSV file pointer
        $feed = json_decode($response->getBody(), true); // Get array of results
        $this->write_to_csv($feed, $fp, $urls[$index]); // Write to CSV
    },
    // Rejected function
    function ($reason, $index) {
        if ($reason instanceof GuzzleHttp\Exception\ClientException) {
            $message = $reason->getMessage();
            var_dump(array("error" => "error", "id" => $index, "message" => $message)); // You could write the errors to a file or database too
        }
    }
)->wait();

MQTT Subscribe with PHP to IBM Bluemix

I want to connect to IBM Bluemix through the MQTT protocol using PHP, to subscribe to messages coming from the IoT Foundation.
I use this code:
<?php
require("../phpMQTT.php");

$config = array(
    'org_id' => 't9m318',
    'port' => '1883',
    'app_id' => 'phpmqtt',
    'iotf_api_key' => 'my api key',
    'iotf_api_secret' => 'my api secret',
    'device_id' => 'phpmqtt'
);

$config['server'] = $config['org_id'] . '.messaging.internetofthings.ibmcloud.com';
$config['client_id'] = 'a:' . $config['org_id'] . ':' . $config['app_id'];

$location = array();

// initialize client
$mqtt = new phpMQTT($config['server'], $config['port'], $config['client_id']);
$mqtt->debug = false;

// connect to broker
if (!$mqtt->connect(true, null, $config['iotf_api_key'], $config['iotf_api_secret'])) {
    echo 'ERROR: Could not connect to IoT cloud';
    exit();
}

$topics['iot-2/type/+/id/phpmqtt/evt/+/fmt/json'] =
    array("qos" => 0, "function" => "procmsg");
$mqtt->subscribe($topics, 0);

// process messages
while ($mqtt->proc(true)) {
}

// disconnect
$mqtt->close();

function procmsg($topic, $msg) {
    echo "Msg received: $msg";
}
?>
But the browser shows this message:
Fatal error: Maximum execution time of 30 seconds exceeded in /Library/WebServer/Documents/phpMQTT/phpMQTT.php on line 167
subscribe is not meant to run in the web browser, as it has an infinite loop; it's best run from the command line.
If you are using the subscribe method to receive messages, you can look at persistent messages and at breaking out of the loop on message receipt.
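A rough sketch of breaking out on receipt, using the phpMQTT client from the question and a plain global flag (illustrative only):

$received = false;

function procmsg($topic, $msg) {
    global $received;
    echo "Msg received: $msg";
    $received = true; // tell the loop below to stop
}

// poll until the first message arrives, then fall through and disconnect
while (!$received && $mqtt->proc()) {
}
$mqtt->close();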
There is an example of how to use phpMQTT in the web browser in the file web-app.php of this repository: https://github.com/vvaswani/bluemix-iotf-device-tracker
You don't provide very much information about what you want to achieve by doing this; do you want to keep sending messages to the browser until the page is closed?
Server-Sent Events or WebSockets might be a better bet, and PHP might not be the best choice for this, because it uses up quite a lot of memory per connection (compared to node.js, for example).
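As an illustrative sketch of the Server-Sent Events option (with a dummy counter payload rather than real MQTT messages):

header('Content-Type: text/event-stream');
header('Cache-Control: no-cache');
set_time_limit(0);

for ($i = 0; $i < 10; $i++) {
    echo "data: message $i\n\n"; // each SSE frame ends with a blank line
    @ob_flush();
    flush();
    sleep(1);
}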
However if you just want to remove the 30 second PHP timeout, then you can use this function:
http://php.net/manual/en/function.set-time-limit.php
Or set max_execution_time in php.ini:
http://php.net/manual/en/info.configuration.php
Setting the maximum execution time to 0 should stop it from timing out.
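For example, at the top of the subscriber script:

set_time_limit(0); // 0 removes the limit, so the proc() loop can run indefinitely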
But be warned that PHP and/or your webserver will have a limited number of concurrent HTTP connections.

PHP Azure SDK Service Bus returns Malformed Response

I'm working on a trace logger of sorts that pushes log message requests onto a queue on a Service Bus, to later be picked off by a worker role which inserts them into the table store. While running on my machine, this works just fine (since I'm the only one using it), but once I put it up on a server to test, it produced the following error:
HTTP_Request2_MessageException: Malformed response: in D:\home\site\wwwroot\vendor\pear-pear.php.net\HTTP_Request2\HTTP\Request2\Adapter\Socket.php on line 1013
0 HTTP_Request2_Response->__construct('', true, Object(Net_URL2)) D:\home\site\wwwroot\vendor\pear-pear.php.net\HTTP_Request2\HTTP\Request2\Adapter\Socket.php:1013
1 HTTP_Request2_Adapter_Socket->readResponse() D:\home\site\wwwroot\vendor\pear-pear.php.net\HTTP_Request2\HTTP\Request2\Adapter\Socket.php:139
2 HTTP_Request2_Adapter_Socket->sendRequest(Object(HTTP_Request2)) D:\home\site\wwwroot\vendor\pear-pear.php.net\HTTP_Request2\HTTP\Request2.php:939
3 HTTP_Request2->send() D:\home\site\wwwroot\vendor\microsoft\windowsazure\WindowsAzure\Common\Internal\Http\HttpClient.php:262
4 WindowsAzure\Common\Internal\Http\HttpClient->send(Array, Object(WindowsAzure\Common\Internal\Http\Url)) D:\home\site\wwwroot\vendor\microsoft\windowsazure\WindowsAzure\Common\Internal\RestProxy.php:141
5 WindowsAzure\Common\Internal\RestProxy->sendContext(Object(WindowsAzure\Common\Internal\Http\HttpCallContext)) D:\home\site\wwwroot\vendor\microsoft\windowsazure\WindowsAzure\Common\Internal\ServiceRestProxy.php:86
6 WindowsAzure\Common\Internal\ServiceRestProxy->sendContext(Object(WindowsAzure\Common\Internal\Http\HttpCallContext)) D:\home\site\wwwroot\vendor\microsoft\windowsazure\WindowsAzure\ServiceBus\ServiceBusRestProxy.php:139
7 WindowsAzure\ServiceBus\ServiceBusRestProxy->sendMessage('<queuename>/mes…', Object(WindowsAzure\ServiceBus\Models\BrokeredMessage)) D:\home\site\wwwroot\vendor\microsoft\windowsazure\WindowsAzure\ServiceBus\ServiceBusRestProxy.php:155
⋮
I've seen previous posts that describe similar issues, namely:
Windows Azure PHP Queue REST Proxy Limit (Stack Overflow)
Operations on HTTPS do not work correctly (GitHub)
These imply that this is a known issue with the PHP Azure Storage libraries, where only a limited number of HTTPS connections is allowed. Before the requirements changed, I was accessing the table store directly, ran into this same issue, and fixed it in the way the first link describes.
The problem is that the Service Bus endpoint in the connection string, unlike Table Store (etc.) connection string endpoints, MUST be 'HTTPS'. Trying to use it with 'HTTP' will return a 400 - Bad Request error.
I was wondering if anyone had any ideas on a potential workaround. Any advice would be greatly appreciated.
Thanks!
EDIT (After Gary Liu's Comment):
Here's the code I use to add items to the queue:
private function logToAzureSB($source, $msg, $severity, $machine)
{
    // Gather all relevant information
    $msgInfo = array(
        "Severity" => $severity,
        "Message" => $msg,
        "Machine" => $machine,
        "Source" => $source
    );

    // Encode it to a JSON string, and add it to a BrokeredMessage.
    $encoded = json_encode($msgInfo);
    $message = new BrokeredMessage($encoded);
    $message->setContentType("application/json");

    // Attempt to push the message onto the Queue
    try
    {
        $this->sbRestProxy->sendQueueMessage($this->azureQueueName, $message);
    }
    catch (ServiceException $e)
    {
        throw new \DatabaseException($e->getMessage(), $e->getCode(), $e->getPrevious());
    }
}
Here, $this->sbRestProxy is a Service Bus REST proxy, set up when the logging class initializes.
On the receiving end of things, here's the code on the worker role side:
public override void Run()
{
    // Initiates the message pump; the callback is invoked for each message
    // received. Calling Close on the client will stop the pump.
    Client.OnMessage((receivedMessage) =>
    {
        try
        {
            // Pull the Message from the received object.
            Stream stream = receivedMessage.GetBody<Stream>();
            StreamReader reader = new StreamReader(stream);
            string message = reader.ReadToEnd();
            LoggingMessage mMsg = JsonConvert.DeserializeObject<LoggingMessage>(message);

            // Create an entry with the information given.
            LogEntry entry = new LogEntry(mMsg);

            // Set the Logger to the appropriate table store, and insert the entry into the table.
            Logger.InsertIntoLog(entry, mMsg.Service);
        }
        catch
        {
            // Handle any message processing specific exceptions here
        }
    });

    CompletedEvent.WaitOne();
}
Here, LoggingMessage is a simple object that contains the same fields as the message logged in PHP (used for JSON deserialization), LogEntry is a TableEntity which contains these fields as well, and Logger is an instance of a table store logger, set up during the worker role's OnStart method.
This was a known issue with the Windows Azure PHP SDK, which hadn't been looked at in a long time, nor has it been fixed. In the time between when I posted this and now, we ended up writing a separate logging API web service and having our PHP code send JSON strings to it over cURL, which works well enough as a temporary workaround. We're moving off of PHP now, so this won't be an issue for much longer anyway.
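A minimal sketch of that kind of workaround (the endpoint URL is hypothetical):

// POST the log entry as JSON to a separate logging web service instead
// of pushing it onto the Service Bus queue.
$msgInfo = array(
    "Severity" => "INFO",
    "Message" => "test message",
    "Machine" => php_uname("n"),
    "Source" => "example"
);

$ch = curl_init("https://logging.example.com/api/log"); // hypothetical endpoint
curl_setopt($ch, CURLOPT_POST, true);
curl_setopt($ch, CURLOPT_POSTFIELDS, json_encode($msgInfo));
curl_setopt($ch, CURLOPT_HTTPHEADER, array("Content-Type: application/json"));
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);

if (curl_exec($ch) === false) {
    error_log("Log push failed: " . curl_error($ch));
}
curl_close($ch);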

How to speed up the PHP SOAP client

I'm using a web service to send SMS in PHP. The code is like below:
$options = array(
    'login' => 'yourusername',
    'password' => 'yourpassword'
);

$client = new SoapClient('http://sms.hostiran.net/webservice/?WSDL', $options);

try
{
    // placeholder: pass the destination mobile number as a string
    $messageId = $client->send($destinationMobileNumber, 'test sms');
    sleep(3);
    print ($client->deliveryStatus($messageId));
    var_dump($client->accountInfo());
}
catch (SoapFault $sf)
{
    print $sf->faultcode."\n";
    print $sf->faultstring."\n";
}
The problem is that when I run this code on a WAMP server it runs quickly, but when I run it on an Ubuntu server it is very slow.
Is there any configuration in php.ini to solve this problem?
Thanks!
First, you need to remove the sleep(3); that alone adds an extra 3 seconds.
Second, it looks like the SMS provider is in Iran, so it would be best for you to get a web server in Iran.
As far as I know, there is no reason why an Ubuntu server would be slower at SOAP than a Windows server.
If you want to speed up the page itself, then instead of running the SOAP request on page load, save the request to a database and have a cron job that runs every few minutes, pulls the requests out of the database, and makes them. A rough sketch of that idea follows.
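(The table and column names below are made up; $options is the login array from the question.)

// On page load: just queue the request.
$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');
$pdo->prepare('INSERT INTO sms_queue (destination, body) VALUES (?, ?)')
    ->execute([$destination, 'test sms']);

// In a cron script running every few minutes: drain the queue.
$client = new SoapClient('http://sms.hostiran.net/webservice/?WSDL', $options);
foreach ($pdo->query('SELECT id, destination, body FROM sms_queue') as $row) {
    $client->send($row['destination'], $row['body']);
    $pdo->prepare('DELETE FROM sms_queue WHERE id = ?')->execute([$row['id']]);
}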

Speeding up a soap powered website

We're currently looking into doing some performance tweaking on a website which relies heavily on a SOAP web service. But our servers are located in Belgium and the web service we connect to is located in San Francisco, so it's a long-distance connection, to say the least.
Our website is PHP powered, using PHP's built in SoapClient class.
On average a call to the web service takes 0.7 seconds, and we do about 3-5 requests per page. All possible request/response caching is already implemented, so we are now looking at other ways to improve the connection speed.
This is the code which instantiates the SoapClient. What I'm looking for now is other ways/methods to improve the speed of single requests. Does anyone have ideas or suggestions?
private function _createClient()
{
    try {
        $wsdl = sprintf($this->config->wsUrl.'?wsdl', $this->wsdl);
        $client = new SoapClient($wsdl, array(
            'soap_version' => SOAP_1_1,
            'encoding' => 'utf-8',
            'connection_timeout' => 5,
            'cache_wsdl' => 1,
            'trace' => 1,
            'features' => SOAP_SINGLE_ELEMENT_ARRAYS
        ));

        $header_tags = array(
            'username' => new SOAPVar($this->config->wsUsername, XSD_STRING, null, null, null, $this->ns),
            'password' => new SOAPVar(md5($this->config->wsPassword), XSD_STRING, null, null, null, $this->ns)
        );
        $header_body = new SOAPVar($header_tags, SOAP_ENC_OBJECT);
        $header = new SOAPHeader($this->ns, 'AuthHeaderElement', $header_body);
        $client->__setSoapHeaders($header);
    } catch (SoapFault $e) {
        controller('Error')->error($id.': Webservice connection error '.$e->getCode());
        exit;
    }

    $this->client = $client;
    return $this->client;
}
So, the root problem is the number of requests you have to do. What about creating grouped services?
If you are in charge of the web services, you could create specialized services which perform multiple operations in one call, so your main app only has to do one request per page.
If not, you could relocate your app server near SF.
If relocating the whole server is not possible and you cannot create new specialized web services, you could add a bridge located near the web services server. This bridge would provide the specialized web services and be in charge of calling the atomic ones. Instead of 0.7s * 5 you'd have 0.7s + 5 * 0.1, for example. A sketch of such a bridge follows.
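(Rough sketch; the service URL and method names are hypothetical.)

// Bridge script deployed near the SF web service: it makes the atomic SOAP
// calls over the low-latency local network and returns one combined payload
// to the app in Belgium.
$client = new SoapClient('http://webservice.example.com/service?wsdl');

$combined = array(
    'profile' => $client->getProfile($userId),
    'orders' => $client->getOrders($userId),
    'stats' => $client->getStats($userId)
);

header('Content-Type: application/json');
echo json_encode($combined); // one transatlantic round trip instead of several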
Try enabling output compression in PHP.INI:
output_buffering = On
output_handler = ob_gzhandler
zlib.output_compression = Off
Do you know for sure that it is network latency slowing down each request? 0.7s seems a long round-trip time, as Benoit says. I'd look at doing some benchmarking - you can do this with curl, although I'm not sure how that would work with your SOAP client.
Something like:
$ch = curl_init('http://path/to/sanfrancisco/');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
$output = curl_exec($ch);
$info = curl_getinfo($ch);
curl_close($ch);
$info will return an array including elements for total_time, namelookup_time, connect_time, pretransfer_time, starttransfer_time and redirect_time. From these you should be able to work out whether it's the dns, request, the actual soap server or the response that's taking up the time.
One obvious thing that's just occurred to me: are you requesting the SOAP server via a domain or an IP? If you're using a domain, your DNS might be slowing things down significantly (although it will be cached at several stages). Check your local DNS lookup time (in your SOAP client or php.ini - not sure) and the TTL of your domain (in your DNS zone). Set up a static IP for your SanFran server and reference it that way if you don't already.
Optimize the server's (not the client's!) HTTP response by using caching and HTTP compression. Check out the tips at Yahoo: http://developer.yahoo.com/performance/rules.html
1. Make sure your SOAP server uses gzip compression for HTTP content, as your site output already does. A 0.7s round trip to SF seems a bit long: either the web service is slow to answer, or there is significant network latency.
If you can, try other hosting companies for your Belgian server; in France, some have far better connectivity to the US than others.
I once moved a website from one host to another, and the network latency between Paris and New York almost doubled! That's huge, and my client, who has a lot of US visitors, was unhappy about it.
Relocating the web server to SF can be an option; you would get far better connectivity between the servers, but be careful about latency if your visitors are mainly located in Europe.
2. You can use an opcode cache mechanism, such as XCache or APC. It will not change the SOAP latency, but it will improve PHP execution time.
3. Depending on whether the SOAP requests are repetitive, and on how long a content update may be deferred, you can get a real improvement by caching the SOAP results. I suggest an in-memory caching system (like XCache/Memcached or similar) because they are much faster than file or DB based caches.
From your class, the _createClient method isn't the best example of functionality to cache, but for any read operation it's the best way to gain performance:
private function _createClient()
{
    $xcache_key = 'clientcache';
    if (!xcache_isset($xcache_key)) {
        $ttl = 3600; // one hour cache lifetime
        $client = $this->_getClient(); // private method embedding your soap request
        xcache_set($xcache_key, $client, $ttl);
        return $client;
    }
    // return result from mem cache
    return xcache_get($xcache_key);
}
The example is for the XCache extension, but you can use other systems in a very similar manner.
4. To go further, you can use a similar mechanism to cache your PHP processing results (like template rendering output and other resource-consuming operations). The key to success with this technique is knowing exactly which parts are cached and for how long they may stay without refreshing.
Any chance of using an AJAX interface? If the requests happen in the background, the user won't be left waiting for the response.
