I am working on a project that needs real-time data from the Twitter Streaming API.
To do that I am using ReactPHP.
When I make the request to the Twitter Streaming API, I get a Bad Request response.
The request (with the payload) is the following:
POST /1.1/statuses/filter.json HTTP/1.0
Host: stream.twitter.com
User-Agent: React/alpha
Authorization: OAuth oauth_nonce="0f1add32ee06141004ea4c02d892bdaf",oauth_timestamp="1405130866",oauth_consumer_key="72XsNwUvJjw0xaSinIk1mzHL0",oauth_token="366141931-Zxljl6Ycwgdh9a5IYPqImJT5zeat2EJOAJOarTq6",oauth_signature_method="HMAC-SHA1",oauth_signature="Rs57ywc26IRNQA1l9zQZwoRMV5Q%3D"
Content-Type: application/x-www-form-urlencoded
Content-Length: 11
track=arg
and to do that I am using:
"jacobkiers/oauth": "1.0." => for oauth
"react/http-client": "" => to connect to the streaming API
You can check the code here:
TwitterConnector.php
Thanks.
So the answer is really simple: you're making an HTTP 1.0 request, while the Twitter API requires an HTTP 1.1 request.
The source of the problem resides in the Request object. To work around it, I've quickly created two more classes. RequestData11:
<?php

use React\HttpClient\RequestData;

class RequestData11 extends RequestData
{
    public function setProtocolVersion($version)
    {
        // No-op: keep the stock client from switching the request to
        // HTTP/1.0, so it stays on HTTP/1.1.
    }
}
And Client11:
<?php

use React\HttpClient\Request;
use React\SocketClient\ConnectorInterface;

// Same as React\HttpClient\Client, except it builds RequestData11 so the
// request keeps the HTTP/1.1 protocol version.
class Client11
{
    private $connector;
    private $secureConnector;

    public function __construct(ConnectorInterface $connector, ConnectorInterface $secureConnector)
    {
        $this->connector = $connector;
        $this->secureConnector = $secureConnector;
    }

    public function request($method, $url, array $headers = array())
    {
        $requestData = new RequestData11($method, $url, $headers);
        $connector = $this->getConnectorForScheme($requestData->getScheme());

        return new Request($connector, $requestData);
    }

    private function getConnectorForScheme($scheme)
    {
        return ('https' === $scheme) ? $this->secureConnector : $this->connector;
    }
}
You normally would create a new client like this:
<?php
$client = $httpClientFactory->create($loop, $dnsResolver);
Instead you'll have to create it this way:
<?php

use React\SocketClient\Connector;
use React\SocketClient\SecureConnector;

$connector = new Connector($loop, $dnsResolver);
$secureConnector = new SecureConnector($connector, $loop);

$client = new Client11($connector, $secureConnector);
This forces the client to use HTTP 1.1. Beware though: react/http-client is an HTTP 1.0 client, so forcing it to 1.1 should only be done when you know what you're doing.
Note that you didn't lock the react/http-client version in your composer file, so I have no clue which version you're using, and the above code might not work because of that.
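For completeness, here is a sketch of how the streaming request could then be issued with that client. $oauthHeader stands in for the Authorization header you already build in TwitterConnector.php, and the event names assume the 0.3/0.4-era react/http-client API, so adjust them for your version:
<?php

// $loop, $client and $oauthHeader are assumed to exist already.
$postData = 'track=arg';

$request = $client->request('POST', 'https://stream.twitter.com/1.1/statuses/filter.json', array(
    'Authorization'  => $oauthHeader, // built with jacobkiers/oauth
    'Content-Type'   => 'application/x-www-form-urlencoded',
    'Content-Length' => strlen($postData),
));

$request->on('response', function ($response) {
    $response->on('data', function ($data) {
        echo $data; // one chunk of the streamed statuses
    });
});

$request->on('error', function ($error) {
    echo 'Request error: ' . $error->getMessage() . PHP_EOL;
});

$request->end($postData);
$loop->run();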
The previous answer fixes the issue with the HTTP version.
But I had another issue.
Twitter kept answering Unauthorized.
Debugging a little, I found out that the problem was that React uses the STREAM_CRYPTO_METHOD_TLS_CLIENT encryption method instead of STREAM_CRYPTO_METHOD_SSLv23_CLIENT in
React\SocketClient\StreamEncryption.
I couldn't find an elegant way to fix it, so I just rewrote StreamEncryption to use SSL and SecureConnector to use the rewritten StreamEncryption.
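For reference, the difference boils down to which crypto-method constant ends up being passed to PHP's stream functions. A minimal plain-PHP illustration of that constant swap (not React's actual StreamEncryption code, which performs the handshake asynchronously on non-blocking streams):
<?php

// Blocking illustration only; React does this asynchronously.
$stream = stream_socket_client('tcp://stream.twitter.com:443', $errno, $errstr, 30);

// What React\SocketClient\StreamEncryption used at the time:
// stream_socket_enable_crypto($stream, true, STREAM_CRYPTO_METHOD_TLS_CLIENT);

// What the workaround above switches it to:
stream_socket_enable_crypto($stream, true, STREAM_CRYPTO_METHOD_SSLv23_CLIENT);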
I have an application running on webserver A. I have a second application running on webserver B. Both webservers require a login. What I need to do is have a request to webserver A pass through to webserver B and return a file to the client without having the client log in to webserver B. (In other words, webserver B will be invisible to the client, and I will take care of the auth credentials with my request to B from A.) The code below is built on the Laravel framework, but I don't believe the answer needs to be Laravel specific.
The code works, but it only returns the file's header information to the calling client, not the file itself.
Any help will be greatly appreciated!
Controller:
public function getAudioFile(Request $request)
{
//This is the id we are looking to pull
$uid = $request->uniqueid;
$audioServices = new AudioServices();
return $audioServices->getWavFile($uid);
}
Service:
public function getWavFile(string $uniqueId)
{
    $client = new \GuzzleHttp\Client(['verify' => false]);

    return $client->request('GET', $this->connectString.$uniqueId, ['auth' => ['username', 'password']]);
}
As mentioned by bishop, you can use the sink option from Guzzle to stream the body of a Guzzle response to a file or another stream.
You can pass that stream on to the response from your controller. I'm not sure if Laravel has built-in stream support, but the underlying Symfony HttpFoundation components do. An example of its usage can be found in this tutorial.
If you prefer not to use the sink option from Guzzle, you can also use the response itself, as its body implements the PSR-7 stream interface.
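As a rough sketch of that second approach, the service method could look something like this, using Symfony's StreamedResponse (which Laravel will return to the client as-is). The 'stream' option, the chunk size and the reuse of $this->connectString are assumptions on top of the code in the question:
// Add at the top of the service class file:
// use GuzzleHttp\Client;
// use Symfony\Component\HttpFoundation\StreamedResponse;

public function getWavFile(string $uniqueId)
{
    $client = new Client(['verify' => false]);

    $remote = $client->request('GET', $this->connectString.$uniqueId, [
        'auth'   => ['username', 'password'],
        'stream' => true, // don't buffer the whole file in memory
    ]);

    $body = $remote->getBody(); // PSR-7 stream

    return new StreamedResponse(function () use ($body) {
        while (!$body->eof()) {
            echo $body->read(8192); // push the file to the client in chunks
        }
    }, $remote->getStatusCode(), [
        'Content-Type' => $remote->getHeaderLine('Content-Type'),
    ]);
}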
I'm working on a PHP application using the Slim framework. My application's homepage makes about 20 REST API calls, which is slowing down the page load.
I read that I can use HTTP clients like Guzzle to call these APIs asynchronously, but I couldn't find any article that explains how to use Guzzle with Slim.
Can someone explain how to use Guzzle with Slim?
Or is there any other solution that can speed up the page load?
N.B: I'm a novice in PHP
To use Guzzle with Slim, you need to:
Install it by running Composer:
$ composer require guzzlehttp/guzzle:~6.0
Guzzle installation
Guzzle Quickstart
Create a dependency registration, for example:
<?php
use GuzzleHttp\Client;
$container = $app->getContainer();
$container['httpClient'] = function ($cntr) {
return new Client();
};
and put it somewhere it will be executed when index.php (the main bootstrap file) is loaded.
Then in your code, you can get the Guzzle instance from the container:
$guzzle = $container->httpClient;
For example, if you have the following route:
$app->get('/example', App\Controllers\Example::class);
And the Example controller as follows:
<?php
namespace App\Controllers;
use GuzzleHttp\ClientInterface;
use Psr\Http\Message\ServerRequestInterface as Request;
use Psr\Http\Message\ResponseInterface as Response;
class Example
{
private $httpClient;
public function __construct(ClientInterface $httpClient)
{
$this->httpClient = $httpClient;
}
public function __invoke(Request $request, Response $response, array $args)
{
//call api, etc..etc
$apiResponse = $this->httpClient->get('http://api.blabla.org/get');
//do something with api response
return $response;
}
}
To inject the Guzzle instance into the Example controller, you create its dependency registration:
use App\Controllers\Example;
$container[Example::class] = function ($cntr) {
return new Example($cntr->httpClient);
};
To speed up your page load: if you are the API developer, start there. If you are not the API developer and have no control over the API, see whether you can reduce the number of API calls by removing non-essential ones. Or, as a last resort, cache the API responses in a storage that is faster for your application to retrieve from later.
For example, using Redis:
you calculate a hash of the API URL (including its query string) and use that hash as the key to access the cached API response, as sketched below.
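A minimal sketch of that idea, assuming the phpredis extension and a Guzzle client are available (the function name and key prefix are purely illustrative):
<?php

use GuzzleHttp\Client;

// Illustrative only: cache a GET call in Redis, keyed by a hash of the full
// URL (including the query string), as described above.
function cachedApiGet(\Redis $redis, Client $http, string $url, int $ttl = 60): string
{
    $key = 'api:' . sha1($url);

    $cached = $redis->get($key);
    if ($cached !== false) {
        return $cached; // cache hit: no HTTP round trip
    }

    $body = (string) $http->request('GET', $url)->getBody();
    $redis->setex($key, $ttl, $body); // cache miss: store the body for $ttl seconds

    return $body;
}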
I'm currently working on a PHP application that will be using some websocket connections to talk to another service.
To talk to this websocket service, we are using Ratchet, which is a PHP library based on ReactPHP.
This piece of code needs to send and respond to a couple of requests, and after that, should return the information to the "main thread".
Example flow:
HTTP request -> controller -> Starts a service which opens a websocket client -> websocket client is talking to server -> once its done it should return the outcome to the controller code -> controller outputs to user
The issue I'm having is that I'm not familiar with ReactPHP and am not sure how to handle this.
I've tried:
$service = new WebsocketService();
$startTimer = time();
$service->getList(44);
while($service->getResponse() == null) {
usleep(500);
if (time() > $startTimer + 10) {
break; //Timeout after 10 seconds
}
}
var_dump($service->getResponse());
The service code would set its "response" variable to something other than null once it's done. This obviously fails, because the sleep call blocks the thread. Even without it, the while loop seems to block I/O and the reactive code fails.
A solution would be to open up a new thread and run the websocket code there, but I wouldn't be happy with that.
I feel like I need to implement some sort of "watcher" around the websocket process, but I'm not sure how to do that.
Our websocket service client code looks like this:
private $response = null;
/**
* @return null|object
*/
public function getResponse() {
return $this->response;
}
public function getList($accountId) {
$this->response = null;
\Ratchet\Client\connect('ws://192.168.56.1:8080')->then(function(\Ratchet\Client\WebSocket $conn) use ($accountId) {
$login = new \stdClass();
$login->action = 'login';
$conn->on('message', function($msg) use ($conn, $login, $accountId) {
try {
$response = json_decode($msg);
if ($response->result_id == 100) {
//Succesfully logged in to websocket server
//Do our request now.
$message = new \stdClass();
$message->target = 'test';
$conn->send(json_encode($message));
}
if (isset($response->reply) && $response->reply == 'list') {
$this->response = $response; //This is the content I need returned in the controller
$conn->close(); //Dont need it anymore
}
} catch (\Exception $e) {
echo 'response exception!';
//Do nothing for now
}
});
$conn->send(json_encode($login));
}, function ($e) {
echo "Could not connect: {$e->getMessage()}\n";
});
}
Running the code like this also does not work:
$service = new WebsocketService();
$service->getList(44);
echo 'Test';
var_dump($service->getResponse());
because the "test" echo comes before I even get a response from the websocket server.
Please, enlighten me! I'm not sure what to search for.
PHP and websockets still seem to be a bit experimental. Nevertheless, I have found a great tutorial on medium.com, written by Adam Winnipass, which should be really helpful for solving your problem: https://medium.com/@winni4eva/php-websockets-with-ratchet-5e76bacd7548
The only difference is that they implement their websocket client with JavaScript instead of PHP. But in the end there should not be much of a difference, because as soon as the websocket connection is opened on each end, both applications have to send and also wait to receive notifications.
It seems like one way to create a successful websocket connection is to implement the MessageComponentInterface
use Ratchet\MessageComponentInterface;
which also requires
use Ratchet\ConnectionInterface;
The message component interface defines the following methods:
onOpen
onMessage
onClose
onError
And I think this is how the Ratchet library implements it. This is how they finally start their server:
use Ratchet\Server\IoServer;
use MyApp\MyCustomMessageComponentInterface;
use Ratchet\Http\HttpServer;
use Ratchet\WebSocket\WsServer;
require dirname(__DIR__) . '/vendor/autoload.php';
$server = IoServer::factory(
    new HttpServer(
        new WsServer(
            new MyCustomMessageComponentInterface()
        )
    ),
    8080
);
$server->run();
With this architecture you can already receive messages (onMessage), and sending is also possible with the send() method.
I cannot solve the exact problem with your existing code. But I guess if you use the pre-built classes and interfaces of the library as intended (and demonstrated here), you should be able to achieve what you want by adding your code to the corresponding methods.
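For example, the MyCustomMessageComponentInterface used in the server snippet above could be a bare skeleton along these lines; this only illustrates where your login and list-request handling would go, it is not a drop-in solution for your service:
<?php

use Ratchet\ConnectionInterface;
use Ratchet\MessageComponentInterface;

class MyCustomMessageComponentInterface implements MessageComponentInterface
{
    public function onOpen(ConnectionInterface $conn)
    {
        // a client connected
    }

    public function onMessage(ConnectionInterface $from, $msg)
    {
        // react to the incoming message here, e.g. echo it back
        $from->send($msg);
    }

    public function onClose(ConnectionInterface $conn)
    {
        // a client disconnected
    }

    public function onError(ConnectionInterface $conn, \Exception $e)
    {
        $conn->close();
    }
}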
More information and examples can be found in the docs:
http://socketo.me/docs/server
http://socketo.me/api/namespace-Ratchet.html
Are you extending your class from WsServer? That might be the issue if you are getting fatal errors. I am not sure whether you are getting fatal errors or warnings. Also, I notice the public function onOpen() opens a connection. Please try referring to this document, it might be useful: http://socketo.me/api/class-Ratchet.WebSocket.WsServer.html
(Note: I've intentionally put the not-quite-adequate websocket tag here, as it's the best chance for WebSocket expert folks to know the architecture of Ratchet.)
I want to implement HTML5 server-sent events, and what I need is a server-side solution. Since tying up one Apache process per connection (connection pool limit, memory consumption...) is out of consideration, I was hoping that the Ratchet project could help, since it's the most maintained project and couples an HTTP server with its other components.
My question is: how can I use it? Not for upgrading an HTTP request (the default usage), but for serving dynamically generated content.
What have I tried so far?
installed Ratchet as explained in the tutorial
tested the WebSocket functionality - it works properly
followed the very basic set of instructions given on the page that describes the HTTP server component:
/bin/http-server.php
use Ratchet\Http\HttpServer;
use Ratchet\Server\IoServer;
require dirname(__DIR__) . '/vendor/autoload.php';
$http = new HttpServer(new MyWebPage);
$server = IoServer::factory($http);
$server->run();
One does not need to be an expert to figure out that the MyWebPage class here needs to be declared for the server to work, but how?
The Ratchet documentation does not seem to cover this.
Your MyWebPage class needs to implement HttpServerInterface. Since it's just going to be a simple request/response you need to send a response and then close the connection within the onOpen() method of your class:
<?php
use Guzzle\Http\Message\RequestInterface;
use Guzzle\Http\Message\Response;
use Ratchet\ConnectionInterface;
use Ratchet\Http\HttpServerInterface;
class MyWebPage implements HttpServerInterface
{
protected $response;
public function onOpen(ConnectionInterface $conn, RequestInterface $request = null)
{
$this->response = new Response(200, [
'Content-Type' => 'text/html; charset=utf-8',
]);
$this->response->setBody('Hello World!');
$this->close($conn);
}
public function onClose(ConnectionInterface $conn)
{
}
public function onError(ConnectionInterface $conn, \Exception $e)
{
}
public function onMessage(ConnectionInterface $from, $msg)
{
}
protected function close(ConnectionInterface $conn)
{
$conn->send($this->response);
$conn->close();
}
}
I ended up using the Ratchet\App class instead of Ratchet\Http\HttpServer because it allows you to set up routing among other things, so your /bin/http-server.php would then look like this:
<?php
use Ratchet\App;
require dirname(__DIR__) . '/vendor/autoload.php';
$app = new App('localhost', 8080, '127.0.0.1');
$app->route('/', new MyWebPage(), ['*']);
$app->run();
When you run php bin/http-server.php and visit http://localhost:8080 you should see the Hello World! response in your browser.
This is all you need for a basic request/response system, but it could be extended further by implementing HTML templates and things like that. I've implemented this myself in a little test project which I've uploaded to github along with a lot of other things, including an abstract controller which I can extend for different pages.
Chat server using Ratchet - Basic
Chat server using Ratchet - Advanced
Check the links above. The guy there is using Ratchet to build a real-time chat server. He basically stores usernames initially and then sends/broadcasts to everyone. You can modify it so that, at the time of sending, you check whether a certain username or uid is currently active and send data only to them. You can generate data dynamically and send it to particular users or to all of them. Maybe this will help.
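A rough sketch of that per-user delivery idea; the class name and message format are made up for illustration and are not taken from the linked tutorial:
<?php

use Ratchet\ConnectionInterface;
use Ratchet\MessageComponentInterface;

// Keep a username => connection map and deliver a message only to the
// targeted user if they are currently connected.
class DirectMessageChat implements MessageComponentInterface
{
    /** @var ConnectionInterface[] */
    private $users = [];

    public function onOpen(ConnectionInterface $conn)
    {
    }

    public function onMessage(ConnectionInterface $from, $msg)
    {
        $data = json_decode($msg, true);

        if (isset($data['username'])) {
            // first message: remember which connection belongs to this user
            $this->users[$data['username']] = $from;
        } elseif (isset($data['to'], $this->users[$data['to']])) {
            // deliver only to that user, not to everyone
            $this->users[$data['to']]->send($msg);
        }
    }

    public function onClose(ConnectionInterface $conn)
    {
        $this->users = array_filter($this->users, function ($c) use ($conn) {
            return $c !== $conn;
        });
    }

    public function onError(ConnectionInterface $conn, \Exception $e)
    {
        $conn->close();
    }
}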
I'm using Goutte to make a webscraper.
For development, I've saved a .html document I'd like to traverse (so I'm not constantly making requests to the website). Here's what I have so far:
use Goutte\Client;
$client = new Client();
$html=file_get_contents('test.html');
$crawler = $client->request(null,null,[],[],[],$html);
Which, based on what I know, should call request() in Symfony\Component\BrowserKit and pass in the raw body data. Here's the error message I'm getting:
PHP Fatal error: Uncaught exception 'GuzzleHttp\Exception\ConnectException' with message 'cURL error 7: Failed to connect to localhost port 80: Connection refused (see http://curl.haxx.se/libcurl/c/libcurl-errors.html)' in C:\Users\Ally\Sites\scrape\vendor\guzzlehttp\guzzle\src\Handler\CurlFactory.
If I were to just use DomCrawler, creating a crawler from a string would not be an issue (see: http://symfony.com/doc/current/components/dom_crawler.html). I'm just unsure how to do the equivalent with Goutte.
Thanks in advance.
The tools you've decided to use make real HTTP connections and are not suitable for what you want to do, at least out of the box.
Option 1: Implement your own BrowserKit Client
All Goutte does is extend BrowserKit's Client; it implements the HTTP requests with Guzzle.
All you need to do to implement your own client is extend Symfony\Component\BrowserKit\Client and provide the doRequest() method:
use Symfony\Component\BrowserKit\Client;
use Symfony\Component\BrowserKit\Request;
use Symfony\Component\BrowserKit\Response;
class FilesystemClient extends Client
{
/**
* @param object $request An origin request instance
*
* @return object An origin response instance
*/
protected function doRequest($request)
{
$file = $this->getFilePath($request->getUri());
if (!file_exists($file)) {
return new Response('Page not found', 404, []);
}
$content = file_get_contents($file);
return new Response($content, 200, []);
}
private function getFilePath($uri)
{
// convert an uri to a file path to your saved response
// could be something like this:
return preg_replace('#[^a-zA-Z_\-\.]#', '_', $uri).'.html';
}
}
$client = new FilesystemClient();
$client->request('GET', '/test');
The client's request() needs to accept real URIs, so you need to implement your own logic to convert them to filesystem locations.
Have a look at Goutte's Client for inspiration.
Option 2: Implement a custom Guzzle handler
Since Goutte uses Guzzle, you could provide your own Guzzle handler that loads responses from files instead of making real HTTP requests. Have a look at the handlers and middleware doc.
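A rough sketch of what such a handler could look like; the file-naming scheme mirrors Option 1 and is just an assumption, and setClient() is how Goutte 3.x lets you swap in the Guzzle instance (check your version):
<?php

use Goutte\Client as GoutteClient;
use GuzzleHttp\Client;
use GuzzleHttp\HandlerStack;
use GuzzleHttp\Promise\FulfilledPromise;
use GuzzleHttp\Psr7\Response;
use Psr\Http\Message\RequestInterface;

// Handler that answers every request from a saved file instead of the network.
$handler = function (RequestInterface $request, array $options) {
    $file = preg_replace('#[^a-zA-Z_\-\.]#', '_', (string) $request->getUri()).'.html';

    if (!file_exists($file)) {
        return new FulfilledPromise(new Response(404, [], 'Page not found'));
    }

    return new FulfilledPromise(new Response(200, [], file_get_contents($file)));
};

$guzzle = new Client(['handler' => HandlerStack::create($handler)]);

$goutte = new GoutteClient();
$goutte->setClient($guzzle); // Goutte now "requests" your saved files
$crawler = $goutte->request('GET', 'http://example.com/test');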
If you're just after caching responses so that you make fewer HTTP requests, Guzzle provides support for this already.
Option 3: Use DomCrawler directly
new Crawler(file_get_contents('test.html'))
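Expanded into a runnable form, that one-liner would look roughly like this (the XPath query is just an example):
<?php

use Symfony\Component\DomCrawler\Crawler;

// Traverse the saved document directly; no HTTP request is involved.
$crawler = new Crawler(file_get_contents('test.html'));

// filterXPath() works without the optional symfony/css-selector package.
echo $crawler->filterXPath('//title')->text();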
The only drawback is that you'll lose some of the convenience methods of the BrowserKit client, like click() or selectLink().