Guzzle custom handler that switches to another host on failure - php

I need to create a sort of high-availability Guzzle 6 client which, given a list of 3 hosts, falls back to sending its requests to the next one when the current one is down / does not respond.
This functionality has to be built into the Client object. I was told I needed to use a custom handler, but I can't understand how to do it.
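A minimal sketch of one way to do it, using a custom middleware pushed onto the handler stack; this assumes the three hosts expose the same API and differ only by hostname, and the host list and retry condition here are placeholders, not a definitive implementation:

    use GuzzleHttp\Client;
    use GuzzleHttp\Exception\ConnectException;
    use GuzzleHttp\HandlerStack;
    use Psr\Http\Message\RequestInterface;

    // Hypothetical host list; only the hostname differs between them.
    $hosts = ['api1.example.com', 'api2.example.com', 'api3.example.com'];

    // Middleware that re-sends the request to the next host when the current one is unreachable.
    $failover = function (callable $handler) use ($hosts) {
        return function (RequestInterface $request, array $options) use ($handler, $hosts) {
            $attempt = function (int $i) use (&$attempt, $handler, $request, $options, $hosts) {
                $uri = $request->getUri()->withHost($hosts[$i]);
                return $handler($request->withUri($uri), $options)->otherwise(
                    function ($reason) use (&$attempt, $i, $hosts) {
                        if ($reason instanceof ConnectException && $i + 1 < count($hosts)) {
                            return $attempt($i + 1);   // fall back to the next host
                        }
                        throw $reason;                 // all hosts failed, give up
                    }
                );
            };
            return $attempt(0);
        };
    };

    $stack = HandlerStack::create();
    $stack->push($failover);
    $client = new Client(['handler' => $stack, 'timeout' => 5]);

Because the middleware sits on the handler stack, every request made through this Client gets the failover behaviour without the calling code having to know about it.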

Related

How to log every request to the REST API in the Database?

I have a PHP application built with Symfony. All the infrastructure is on AWS. I also have multiple APIs hosted on an EC2 instance connected to a read-only replica of MySQL Aurora.
My question is: how can I log (store in the DB) every API call (user info, call timestamp, which parameters they are passing, etc.)?
I cannot put the logging (storing in the DB) into the API endpoint itself, because the insert is time consuming and would degrade our API performance.
Thanks
As I understand the question, the main goal is to log all requests (e.g. in the database) without a negative impact on serving the response to the user.
Symfony offers multiple KernelEvents which are triggered at different points in time while serving a request.
The kernel.terminate event is triggered after the response has been sent to the user, when the kernel is about to shut down. This is the perfect time to do clean-up work and perform other tasks that should have no influence on the time needed to create the response.
So, simply create an event subscriber or event listener for kernel.terminate and perform your logging there without influencing performance.
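A minimal sketch of such a subscriber; the repository call is a hypothetical placeholder, and on Symfony versions before 4.3 the event class is PostResponseEvent rather than TerminateEvent:

    use Symfony\Component\EventDispatcher\EventSubscriberInterface;
    use Symfony\Component\HttpKernel\Event\TerminateEvent;
    use Symfony\Component\HttpKernel\KernelEvents;

    class ApiCallLogSubscriber implements EventSubscriberInterface
    {
        public static function getSubscribedEvents(): array
        {
            return [KernelEvents::TERMINATE => 'onTerminate'];
        }

        public function onTerminate(TerminateEvent $event): void
        {
            $request = $event->getRequest();

            // The response has already been sent, so this work does not delay
            // the client. Persist whatever you need here, e.g.:
            // $this->logRepository->save(
            //     $request->getPathInfo(),
            //     $request->query->all(),
            //     new \DateTimeImmutable()
            // );
        }
    }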

Async processing and Laravel

We have API calls going from a Laravel back-end to multiple providers for fetching flight fare/availability data. The responses from these providers arrive after different time periods. One provider may give us a response in 2 seconds, another in 5 seconds and so on. The customer ends up waiting till all providers return data to the back-end.
As a workaround, we are now sending multiple requests from the front-end to Laravel - one for each provider. So the customer starts seeing data as soon as we get a response from one provider. This approach has issues: if we want to add one more provider, we have code changes at the UI level. If we want to enable/disable providers, again a code change at the UI is necessary. We are using Ionic for the UI and Laravel for the back-end.
Which is the best approach to tackle this problem? We want to keep pushing data to the front-end as and when we receive responses at the back-end. The UI layer should be able to keep receiving data till the back-end sort of says 'Done, no more data'. A combination of web sockets and Laravel queues? Just a wild guess based on Google. Switching from Laravel to another technology can be considered.
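One possible shape of the websockets-plus-queues idea, sketched under assumptions: one queued job is dispatched per provider, and each job broadcasts its result on a channel the Ionic client listens to. The class names, event class and config key below are hypothetical, and the constructor uses PHP 8 property promotion:

    use Illuminate\Bus\Queueable;
    use Illuminate\Contracts\Queue\ShouldQueue;
    use Illuminate\Foundation\Bus\Dispatchable;
    use Illuminate\Support\Facades\Http;

    // One job per provider, so a slow provider never blocks a fast one.
    class FetchProviderFares implements ShouldQueue
    {
        use Dispatchable, Queueable;

        public function __construct(
            public string $searchId,
            public string $providerUrl   // hypothetical provider endpoint
        ) {}

        public function handle(): void
        {
            $fares = Http::get($this->providerUrl)->json();

            // Push this provider's partial result to the browser over a
            // websocket channel (e.g. Laravel Echo) as soon as it answers.
            // ProviderFaresReceived is a hypothetical ShouldBroadcast event.
            broadcast(new ProviderFaresReceived($this->searchId, $fares));
        }
    }

    // In the controller: providers come from config or the database, so adding
    // or disabling one needs no front-end change.
    // foreach (config('fares.providers') as $url) {
    //     FetchProviderFares::dispatch($searchId, $url);
    // }

The front-end only subscribes to one channel per search; the back-end can send a final "done" message once all dispatched jobs have reported, so the UI knows there is no more data coming.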

Is there any way to post multiple items in one request?

I'm developing a mobile web application, Tasks (for iPhone), with a local database (using JSON files) so my app is still usable when the user is offline.
This is working perfectly, but I want to save the local data on a server. So I need to synchronize the local DB with a DB on a server (in a REST(ful) way).
What I want to do is:
Collect all tasks and send them to the server. At the moment I see two options to do this:
Send each task to the server: POST /tasks
I actually don't want to do this because I want to limit the number of requests to the server, so option 2:
Collect all the tasks and send them to the server at once.
Is there any way to do this (maybe with the Slim framework for PHP)?
I guess that you want to do some bulk updates on your RESTful application.
In fact, the POST method on the list resource /tasks is generally used to add an element, but it can also be used to add more than one element. In such a case, you need to support a parameter (for example in a dedicated header) to determine which "action" to execute for the POST method.
The PATCH method can also be used for such a use case. It is typically designed for this and can contain a list of operations to perform on elements (add, delete, update).
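As an illustration of the first option, here is a minimal sketch of a Slim (v4) route that accepts either a single task or an array of tasks in one POST; the payload shape and the persistence call are assumptions, not part of Slim itself:

    use Psr\Http\Message\ResponseInterface as Response;
    use Psr\Http\Message\ServerRequestInterface as Request;
    use Slim\Factory\AppFactory;

    require __DIR__ . '/vendor/autoload.php';

    $app = AppFactory::create();

    // POST /tasks accepts either one task object or an array of tasks.
    $app->post('/tasks', function (Request $request, Response $response) {
        $payload = json_decode((string) $request->getBody(), true);

        // Normalise a single object into a one-element list.
        $tasks = isset($payload[0]) ? $payload : [$payload];

        foreach ($tasks as $task) {
            // saveTask($task);  // hypothetical persistence call
        }

        $response->getBody()->write(json_encode(['created' => count($tasks)]));
        return $response->withStatus(201)->withHeader('Content-Type', 'application/json');
    });

    $app->run();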
I think that these two answers could give you some more hints:
REST API - Bulk Create or Update in single request
How to Update a REST Resource Collection
Hope this helps you,
Thierry

Using node with MySql OR using node with PHP (which will handle the MySql operations)

I have a node.js server that serves an interactive HTML page to the client browser. So in this case, the client front-end page is not served from the Apache server but from the node.js server.
When users perform some actions (i.e. create a wall post or comment), I am thinking of using one of the following flows of operations to manage them, but I have no idea which one is better (in terms of performance, scalability for a large number of users, and ease of implementation).
So these are my options:
OPTION 1:
Socket.io will send the message details to the node server (which listens on a particular port). The message will be encoded as JSON.
Node receives the message.
Node will communicate directly with the MySQL database. Input sanitization is performed at this step.
Node broadcasts the details to the other users who are subscribed within the same socket.io room.
OPTION 2:
Socket.io will send the details to the node server. The message will be encoded as JSON.
Node receives the message.
Node calls PHP using HTTP POST. The PHP controller will then handle the HTTP POST message and will save all the corresponding details to the MySQL database. Input sanitization will be performed at this step (a sketch of this step follows the list below).
PHP will notify the node.js server via Redis, let's say using a channel called "mysqldb".
Node will subscribe to the "mysqldb" channel, and will then broadcast the corresponding details to the users who are subscribed within the same socket.io room.
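For completeness, a minimal sketch of the PHP side of Option 2 (save, then notify node via Redis), assuming the phpredis extension and PDO; the table and column names are placeholders, and only the "mysqldb" channel name comes from the question:

    // PHP controller that node.js POSTs to (Option 2, steps 3-4).
    $pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');

    // Sanitize and persist the wall post / comment.
    $userId = (int) $_POST['user_id'];
    $body   = strip_tags($_POST['body']);

    $stmt = $pdo->prepare('INSERT INTO wall_posts (user_id, body) VALUES (?, ?)');
    $stmt->execute([$userId, $body]);

    // Notify the node.js server over the "mysqldb" Redis channel so it can
    // broadcast the new post to the right socket.io room.
    $redis = new Redis();
    $redis->connect('127.0.0.1', 6379);
    $redis->publish('mysqldb', json_encode([
        'id'      => $pdo->lastInsertId(),
        'user_id' => $userId,
        'body'    => $body,
    ]));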
Can you advise me on the advantages and disadvantages of the above 2 approaches? Is there any better approach?
Counter questions:
How many steps are required for the same action in each of your options until everything is through?
Think about all the things that could go wrong / fail / break for each of the options?
How much easier would it be if everything is written in one language?
How much time could be saved if data is not passed around between node, PHP, MySQL and Redis?
How many software packages do you have to install and configure for each of the options?
Unless you have a very, very specific reason to plug PHP into an otherwise fully functional software stack, I do not know why this question was even asked.
Short version: Option 1

Multiple and Growing Outbound CURL Request

Multi Outbound Request - Curl
Here's the problem: I have several clients I have to send outbound cURL requests to. Say, for instance, I currently have 20 clients and I send around 100~1000 requests per minute to each of them. I also use cURL multi; however, it seems to have a limit on how many requests it can make at a time, and the entire routine takes as long as the slowest cURL response to complete.
For instance I have the following clients:
Client 1
url: http://www.exampleclient1.com/process/
Client 2
url: http://www.exampleclient2.com/process/
... and so on
The main issue here is that I have a single script that does the job for every client. Say,
http://localhost/app/send/client1
> will send out the pending outbound queue to client 1's url
http://localhost/app/send/client2
> will send out the pending outbound queue to client 2's url
... and so on
The reason why I separated them is because there should be dedicated connections between clients and their latencies are different from each other. Some clients respond faster and have faster servers while some clients have slow servers or take more hops to reach.
Here's my question: Is there a way to simplify this process? Because it's a hassle that every time I have to add a client to my database I'll also have to add
http://localhost/send/newclient1
http://localhost/send/newclient2
.
.
.
http://localhost/send/newclientn
to the list of cron jobs. Is it possible to put it in a single script instead, so that as my list of clients grows it will not affect the overall performance of the outbound cURL function I have?
By the way, I'm using PHP and cURL. If there's a solution that uses another technology than PHP for this (a Linux queuing manager, etc.), you're welcome to suggest it.
Thanks!
I have a similar problem to the one you face; the common issues are:
- Different response times of the resource URLs (as you call them, clients)
- The need to add new URLs, whether one or ten at a time.
In addition to the common ones, my other observations:
- URLs come from different service providers,
- with different data formats,
- and they require different types of requests for what you want.
I have reviewed a couple of PHP cURL libraries:
Rolling-Curl and Parallel-Curl, covered in this article.
PHP-Multi-curl (epiCurl), the one I prefer.
multi-curl-php, code example.
phpmulticurl, code example.
and there are other libraries and classes...
Now, to manage the database and the clients' send requests, you will need at least a well-known RDBMS such as MySQL or SQLite; other RDBMS options can be considered.
In the RDBMS, create a table that contains the requests, as in { "clientNo" => "URL" }.
You can add your list of clients using phpMyAdmin or any database management interface.
The script: create a script that fetches every entry from the database and, one by one within a foreach loop, creates a cURL handle and adds it to the queue of a multi-cURL call (see the sketch below).
Test your script, and for production add it as a single cron job.
Every time you have to add a client, just add them to the RDBMS (MySQL, SQLite) using the database management interface.
NOTE: if you are doing outbound management, you have to consider security issues and proxies, and possibly create a PHP/HTML form for submitting new clients.
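A minimal sketch of that single script, assuming a clients table with clientNo and url columns and a hypothetical fetchPendingQueue() helper that returns the pending outbound items for one client:

    // Run from a single cron job; one cURL handle per client, all sent in parallel.
    $pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');
    $clients = $pdo->query('SELECT clientNo, url FROM clients')->fetchAll(PDO::FETCH_ASSOC);

    $mh = curl_multi_init();
    $handles = [];

    foreach ($clients as $client) {
        $ch = curl_init($client['url']);
        curl_setopt_array($ch, [
            CURLOPT_RETURNTRANSFER => true,
            CURLOPT_POST           => true,
            CURLOPT_POSTFIELDS     => json_encode(fetchPendingQueue($client['clientNo'])),
            CURLOPT_TIMEOUT        => 10,  // a slow client cannot stall the whole run forever
        ]);
        curl_multi_add_handle($mh, $ch);
        $handles[$client['clientNo']] = $ch;
    }

    // Drive all transfers in parallel.
    do {
        $status = curl_multi_exec($mh, $running);
        if ($running) {
            curl_multi_select($mh);
        }
    } while ($running && $status === CURLM_OK);

    foreach ($handles as $clientNo => $ch) {
        $response = curl_multi_getcontent($ch);
        // markQueueAsSent($clientNo, $response);  // hypothetical bookkeeping
        curl_multi_remove_handle($mh, $ch);
        curl_close($ch);
    }
    curl_multi_close($mh);

Adding a new client then only means inserting a row in the clients table; the cron entry and the script never change.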
