Handling multiple outbound API calls in a PHP web application

I'm working on a PHP (Zend Framework) web application that, for each user request, makes multiple calls to external APIs (SOAP and/or REST over HTTP).
At the moment, the API calls are sequential:
Call API A, wait around 1 second for results
Call API B, wait around 1 second for results
Send page back to the user
In this instance there is no dependency or relation between APIs A and B; I simply want to return the page with all the information as quickly as possible.
At the moment I'm thinking of either:
curl_multi_exec() - http://php.net/manual/en/function.curl-multi-exec.php
ZeroMQ - http://www.zeromq.org/
curl_multi_exec() would bind my client code for APIs A and B more tightly than I'd like.
ZeroMQ seems more complex to implement, and I'm not sure how I'd manage the worker processes and sockets.
Has anyone successfully implemented this behaviour in a PHP/Apache application without too much fuss?

Sounds like you need a cache. Caches are pretty easy to make and can be backed by the filesystem or any database extension.
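If the responses genuinely can't be cached, curl_multi can issue both requests concurrently without tightly coupling the client code for A and B. A minimal sketch (the URLs are placeholders):

```php
// Run two independent HTTP requests concurrently (URLs are placeholders)
$urls = ['https://api-a.example.com/data', 'https://api-b.example.com/data'];

$mh = curl_multi_init();
$handles = [];
foreach ($urls as $url) {
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_multi_add_handle($mh, $ch);
    $handles[] = $ch;
}

// Drive both transfers until they complete
do {
    curl_multi_exec($mh, $running);
    curl_multi_select($mh); // wait for socket activity instead of busy-looping
} while ($running > 0);

$results = [];
foreach ($handles as $ch) {
    $results[] = curl_multi_getcontent($ch);
    curl_multi_remove_handle($mh, $ch);
    curl_close($ch);
}
curl_multi_close($mh);
```

The total wait becomes roughly the slower of the two calls rather than their sum, so the two ~1 second waits overlap.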

Related

Creating a new Elasticsearch client with every request

I was checking a part of my application in which I connect to the Elasticsearch host server, and I realized that every time the front-end sends a report request to my back-end, I create an instance of the Elasticsearch client class using the following code:
$elasticClient = ClientBuilder::create()->setHosts($this->setHostsParams())->build();
Since our application sends about 20 requests to the back-end when loading the first page, I was wondering whether PHP's Elasticsearch library might be capable of optimizing the initiation phase, whether anyone has a better solution for this, or whether it might not be a big deal after all and isn't a real overhead.
P.S.: I did some research and didn't find any resources covering this subject.
Sharing an object instance is already discussed here and elsewhere so I'm not going to go into that.
What I'd point out, though, is that there's an Elasticsearch API called _msearch which enables you to send multiple search payloads at the same time; the system responds after all the individual requests have resolved. Here's some sample PHP usage.
This might be useful if you need all your ~20 requests resolved at once -- though it may be useless if you defer some of those requests until, say, the user scrolls down.
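As a rough sketch of _msearch with the official elasticsearch-php client (the index names and queries below are made up): the request body alternates a header line and a query line per search.

```php
// _msearch body alternates a header line and a query line per search
// (index name and queries here are hypothetical)
$params = ['body' => [
    ['index' => 'reports'],
    ['query' => ['match' => ['status' => 'open']]],
    ['index' => 'reports'],
    ['query' => ['range' => ['created_at' => ['gte' => 'now-7d']]]],
]];

$responses = $elasticClient->msearch($params);

// One entry per search, in the same order as the request
foreach ($responses['responses'] as $result) {
    // handle $result['hits']['hits'] ...
}
```

This turns ~20 round trips into one HTTP request, which also amortizes the client-construction cost you're worried about.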

How to call microservice without slowing down the response?

I want to integrate a new functionality with a Laravel based ecommerce solution. At this point the main script takes around 2.7 s to run. The whole site loads in over 6 s, and we've only just started to monitor it. The goal is to get below 2 s for the script and 4 s for everything.
The microservice and its functionality are exposed through gRPC.
There is TLS-based client-server authentication in place (ecommerce instances and my service can prove who they are). This eats a few milliseconds.
When testing a Go client against the Go server, with a pool of 20 connections, it achieved below 35 ms per request.
In PHP each request takes over 200 ms.
Is it possible to:
cache the connection to service between requests?
call RPC methods asynchronously?
Among other solutions I'm considering:
Setting up a local gRPC proxy which will accept only localhost GET requests made by the PHP script and turn them into secure gRPC calls.
Setting up a proxy in front of PHP application to call microservice.
Calling the service directly from the website with JavaScript (puts a burden on the user's browser; the JavaScript needs to be maintained).
Any suggestions?
The connection should be re-used if you are using the same client. On the other hand, there is an option to pre-create a Grpc\Channel object first and then pass it to your service client as an optional 3rd parameter: https://github.com/grpc/grpc/blob/master/src/php/lib/Grpc/BaseStub.php#L58. That way you should be able to re-use the same connection across services.
Currently we don't provide an async API for PHP. We do have a tracking issue, https://github.com/grpc/grpc/issues/6654, which we may consider in the future.
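Putting the channel re-use idea into code, a sketch might look like this (the target address and stub class names are hypothetical; only the Grpc\Channel / BaseStub third-parameter mechanics come from the answer above):

```php
use Grpc\Channel;
use Grpc\ChannelCredentials;

// Pre-create one channel and share it across stubs
// (target and client classes are hypothetical)
$target  = 'microservice.internal:443';
$opts    = ['credentials' => ChannelCredentials::createSsl()];
$channel = new Channel($target, $opts);

$ordersClient  = new Orders\OrdersClient($target, $opts, $channel);
$pricingClient = new Pricing\PricingClient($target, $opts, $channel);
// Both stubs now share the same underlying connection,
// avoiding a fresh TLS handshake per stub
```

Whether the channel survives between PHP-FPM requests depends on the gRPC extension's connection caching, so it's worth measuring before and after.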

About WebJobs in Azure: Need to execute PHP code continuously

I'm quite new to the Azure interface, but I have been working with PHP for a while.
I have been asked to make a routine that runs in the background every so often, whose objective is to send some marketing mail.
And I have been reading about WebJobs. I can't quite get the grasp of it, though.
For me the documentation is a bit overwhelming, to say the least. So what I want to do is understand how WebJobs work and use them to execute PHP code on a schedule, without needing user input.
As I have said before, I have never used Azure and have never been asked to do such things in PHP either, at least not this complex.
There is a walkthrough of how to create a WebJob in the Azure docs - PHP is supported in WebJobs. WebJobs are essentially a means for App Services to run a non-interactive process on a triggered or continuous basis. You don't have to use PHP; you can run another .exe if you like. Personally I write code in C# using the WebJobs SDK and deploy those; it eases the way in which triggers, inputs and outputs are passed to/from your WebJob via a nice simple binding process.
There's a more detailed explanation here. WebJobs are hosted in your App Service plan, which you can look at as a container for the resources used to run and host your web sites, web APIs, and WebJobs.
Last couple of things to say: 1 - via the portal you can see the status of all your WebJobs - when they triggered, what the console output was, whether they succeeded or failed, etc.; and 2 - Azure Functions do the same thing but in a different way - they use the WebJobs API but present a "serverless" experience instead (i.e. no App Service required). So if you don't want to be concerned with a web site or with managing the scaling yourself, see the Functions documentation.
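For the "every so often" part, a triggered WebJob can carry its schedule in a settings.job file placed next to your PHP script. A minimal sketch (the six-field cron expression below, every 15 minutes, is just an example):

```json
{
  "schedule": "0 */15 * * * *"
}
```

The trigger then fires your script on that schedule with no user input, which matches the mailing-routine use case in the question.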

How to talk between an HTTP request & CLI class objects

I have an application running that listens for HTTP requests. Each request is passed to a single page where a framework object $app is instantiated; this takes care of routing / controllers / models etc.
Now I have another class whose object is instantiated via a CLI script; let's call it $cliApp. The problem is how to make the two objects talk to each other. $app is instantiated every time there is a new request.
But $cliApp is instantiated only once, when the script is run. This script runs in a loop via a $loop object from the PHP React event loop.
The CLI app is running websockets. So basically I want HTTP & sockets to communicate via an HTTP API.
P.S.:
Right now I have one solution: use message queueing, e.g. 0mq etc., but that seems overkill, since I'm not looking to scale and want to keep it simple.
Another solution I'm currently trying, and which feels right, is to share a SptStorageObject between the thread created by the $http request and the thread created by the $cli request. Maybe this is a question of dependency injection, and I'm having trouble sharing this $store object.
If I understand correctly, you have (assumptions noted):
a normal PHP web app that communicates over HTTP (presumably on Apache or similar webserver)
a long-running PHP cli app that communicates over websockets.
Presumably both apps are receiving communication from web clients on an ongoing basis. Presumably they also have their own persistent data stores, such as a MySQL database or similar, perhaps even sharing the same one.
I'm going to assume that what you need goes beyond each application accessing the most up-to-date data from the persistent data store (or that the two processes use separate data stores), and you actually need on-demand communication between the two processes.
You're on the right path with message queues, but as you note it's needless complexity to add a third dedicated inter-process communication layer when you've already got two communication layers that work perfectly fine on their own.
What you need is for your cli app to speak HTTP when it needs to initiate communication with your web app, and for your web app to speak web sockets when it needs to initiate communication with your cli app.
What this looks like in practice is fairly simple.
In your cli app, just use cURL to initiate an HTTP connection to your web app. This is fairly simple, there are endless resources out on the web to help you along the way and if you get stuck then coming here with a new question specific to your problem will get you going. All this requires in your web app is the following:
appropriate endpoint(s) which the cli app can send requests to, if the basic client facing pages won't suffice
some method to authenticate the cli app if it needs to access data that should not be available to web clients
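As a sketch of the cli-to-web-app direction, the cli app can push an event over plain HTTP with cURL (the endpoint path and auth header here are hypothetical, standing in for whatever endpoint and authentication method you add to the web app):

```php
// cli app pushing an event to the web app over HTTP
// (endpoint URL and auth header are hypothetical)
$ch = curl_init('http://localhost/internal/events');
curl_setopt_array($ch, [
    CURLOPT_RETURNTRANSFER => true,
    CURLOPT_POST           => true,
    CURLOPT_POSTFIELDS     => json_encode(['event' => 'client_connected']),
    CURLOPT_HTTPHEADER     => [
        'Content-Type: application/json',
        'X-App-Token: shared-secret', // authenticates the cli app, per the note above
    ],
]);
$response = curl_exec($ch);
curl_close($ch);
```

The web app handles this like any other request, so no new communication layer is needed.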
For your web app to initiate a websocket connection to the cli app, it's a bit more complicated because I'm not aware of any native PHP functionality that specifically targets the websocket protocol. However, I did find this (extremely permissive) github project that purports to give you the ability to set up a web socket server, and it also includes a client script that you could use to connect and send/receive data while your web app process lives, and then shut the connection down when you're done. It appears to still have some minimal activity; you could use it directly or as a starting point to write your own websocket client.
In this case, just as in the reverse situation, you need the cli client to recognize and authenticate traffic from your web client so it can serve appropriate data just to it.
If for some reason this scenario won't work for you, then you're back to message queues or shared data stores (someone suggested redis, which can act as a hybrid data store/message queue under some circumstances).

standard method to get notification from database on change/insertion

I am currently trying to make a chat application aimed at 1000-1500 users. What currently happens is that once my webpage loads, I make an AJAX request every second to check if there is anything new in the database. I want to know if this is standard practice or if there is a more efficient way to be notified by the server when an insertion occurs.
Use WebSockets. Or at the very least AJAX polling. Firing a request every second from 1500 clients will most likely kill your server.
Look at http://socket.io/ if you are open to introduce something new to your stack. But there are PHP websocket solutions out there if you are limited to PHP.
Your approach is a standard method called polling. Based on the number of clients, this should be perfectly fine for a server with up-to-date hardware (do HEAD requests via AJAX that indicate the status via the HTTP status code).
The other alternative - as pointed out by Jan - is called Pushing.
Pros: Involves a lot less requests to the server.
Cons: Requires technology that may or may not be provided by your client's browser.
In case you'll opt for the second approach, take a look into Server-Sent Events (W3C draft).
This specification defines an API for opening an HTTP connection for receiving push notifications from a server in the form of DOM events. The API is designed such that it can be extended to work with other push notification schemes such as Push SMS.
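A rough PHP sketch of the server side of Server-Sent Events (fetchNewMessages() is a hypothetical function returning rows newer than $lastId; the point is the text/event-stream framing):

```php
// Minimal SSE endpoint sketch; fetchNewMessages($lastId) is hypothetical
header('Content-Type: text/event-stream');
header('Cache-Control: no-cache');

$lastId = 0;
while (true) {
    foreach (fetchNewMessages($lastId) as $msg) {
        // Each event is "id:"/"data:" lines terminated by a blank line
        echo "id: {$msg['id']}\n";
        echo 'data: ' . json_encode($msg) . "\n\n";
        $lastId = $msg['id'];
    }
    @ob_flush();
    flush();
    sleep(1);
}
```

The browser side then just attaches an EventSource to this URL, and reconnection with the last seen id is handled for you by the browser.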
