We have API calls going from a Laravel back-end to multiple providers to fetch flight fare/availability data. The responses from these providers arrive after different time periods: one provider may respond in 2 seconds, another in 5 seconds, and so on. The customer ends up waiting until all providers have returned data to the back-end.

As a workaround, we now send multiple requests from the front-end to Laravel - one per provider - so the customer starts seeing data as soon as we get a response from any one provider. This approach has issues: if we want to add one more provider, we have code changes at the UI level, and if we want to enable/disable providers, again a code change in the UI is necessary. We are using Ionic for the UI and Laravel for the back-end.

What is the best approach to tackle this problem? We want to keep pushing data to the front-end as and when we receive responses at the back-end, and the UI layer should be able to keep receiving data until the back-end effectively says 'Done, no more data'. A combination of web sockets and Laravel queues? That is just a wild guess based on Google. Switching from Laravel to another technology can be considered.
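For concreteness, a minimal sketch of what that "Laravel queues + web sockets" guess could look like in a recent Laravel version: the controller dispatches one queued job per configured provider, and each job broadcasts its results the moment they arrive. All class names, the config key, and the FaresReceived event are illustrative assumptions, not part of the original setup.

```php
<?php
// Illustrative sketch only: FetchProviderFares, FaresReceived and the
// 'fares.providers' config key are assumed names, not existing code.

namespace App\Jobs;

use Illuminate\Bus\Queueable;
use Illuminate\Contracts\Queue\ShouldQueue;
use Illuminate\Foundation\Bus\Dispatchable;

class FetchProviderFares implements ShouldQueue
{
    use Dispatchable, Queueable;

    protected $provider;
    protected $searchId;

    public function __construct($provider, $searchId)
    {
        $this->provider = $provider;
        $this->searchId = $searchId;
    }

    public function handle()
    {
        $fares = []; // call this provider's API here

        // Pushed over web sockets to every client subscribed to this search;
        // FaresReceived would implement ShouldBroadcast.
        broadcast(new \App\Events\FaresReceived($this->searchId, $this->provider, $fares));
    }
}
```

The controller would then loop over `config('fares.providers')` and dispatch one job per entry, emitting a final "done" event once all jobs have reported in. Adding, enabling or disabling a provider becomes a back-end configuration change with no UI edits.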
I have a PHP application built in Symfony. All the infrastructure is on AWS. I also have multiple APIs hosted on an EC2 instance connected to a read-only replica of MySQL Aurora.
My question is: how can I log (store in the DB) every API call (user info, call timestamp, which parameters were passed, etc.)?
I cannot put the logging (storing in the DB) inside the API endpoint itself, because the insert is time-consuming and would degrade our API performance.
Thanks
As I understand the question, the main goal is to log all requests (e.g. in the database) without a negative impact on serving the response to the user.
Symfony offers multiple KernelEvents which are triggered at different points in time while serving a request.
The kernel.terminate event is triggered after the response has been sent to the user, when the kernel is about to shut down. This is the perfect time to do clean-up work and perform other tasks which should have no influence on the time needed to create the response.
So, simply create an event subscriber or event listener for kernel.terminate and perform your logging there without influencing performance.
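For example, a minimal subscriber along these lines, assuming Symfony 4.3+ where the event class is TerminateEvent (older versions call it PostResponseEvent); the `api_call_log` table, its columns, and the use of a Doctrine DBAL connection are illustrative assumptions:

```php
<?php
// Sketch of a kernel.terminate subscriber; 'api_call_log' and its
// columns are illustrative, not from the question.

use Doctrine\DBAL\Connection;
use Symfony\Component\EventDispatcher\EventSubscriberInterface;
use Symfony\Component\HttpKernel\Event\TerminateEvent;
use Symfony\Component\HttpKernel\KernelEvents;

class RequestLogSubscriber implements EventSubscriberInterface
{
    private $connection;

    public function __construct(Connection $connection)
    {
        $this->connection = $connection;
    }

    public static function getSubscribedEvents(): array
    {
        // kernel.terminate fires after the response has been sent,
        // so the insert below cannot delay the user.
        return [KernelEvents::TERMINATE => 'onKernelTerminate'];
    }

    public function onKernelTerminate(TerminateEvent $event): void
    {
        $request = $event->getRequest();

        $this->connection->insert('api_call_log', [
            'path'       => $request->getPathInfo(),
            'method'     => $request->getMethod(),
            'params'     => json_encode($request->query->all()),
            'created_at' => date('Y-m-d H:i:s'),
        ]);
    }
}
```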
I have made a website for employers in Laravel 5. Now I want to connect and sync it with a second website on a different domain but the same server. The second website is for jobseekers and should listen for database changes made by the first application. For example, if a candidate is shortlisted by an employer, that corresponding event should be captured and handled in the jobseeker application. Please suggest a way to do that; I am not very experienced in Laravel.
You can have the first website make an API call to the second to let it know something has changed. Alternatively, you can have the first website use Laravel's event system in conjunction with the queue system; your second application can then connect to the first application's database to retrieve and modify its queue.
The API solution provides a bit more flexibility, especially if the sites were ever to be on separate servers. I would choose whichever you are more comfortable with; it wouldn't take long to implement either one if you change your mind later.
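As a rough sketch of the API variant: the employer site fires an event whose queued listener POSTs to the jobseeker site. CandidateShortlisted, NotifyJobseekerSite and the endpoint URL are all illustrative names, not existing code.

```php
<?php
// Illustrative queued listener on the employer site; assumes Guzzle is
// installed and the jobseeker site exposes a matching endpoint.

namespace App\Listeners;

use App\Events\CandidateShortlisted;
use GuzzleHttp\Client;
use Illuminate\Contracts\Queue\ShouldQueue;

class NotifyJobseekerSite implements ShouldQueue
{
    public function handle(CandidateShortlisted $event)
    {
        // Runs on a queue worker, so the employer's request isn't delayed.
        (new Client())->post('https://jobseekers.example.com/api/shortlist-events', [
            'json' => [
                'candidate_id' => $event->candidateId,
                'employer_id'  => $event->employerId,
            ],
        ]);
    }
}
```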
I'm working on an API project that needs to send emails and store statistics after (or before, depending on the implementation) the response has been returned to the client. For both cases I'm considering Symfony's EventDispatcher component (I'm not using Symfony as the framework), so each controller action will dispatch an event to add an email to the queue or insert the data into the statistics database table.
So the thing would look something like this:
Controller
=> Send response to client
=> Dispatch event email => EmailEventListener => Mail queue
=> Dispatch event stats => StatsEventListener => Database
I'm considering this because I want these internal actions to be as asynchronous as they can be. Is this an appropriate solution for this case?
EDIT: As Jovan Perovic suggested, I'm adding more information. The API is a REST API that users communicate with via web or mobile apps, and I want to log, store stats and send notifications (emails primarily) without penalizing the performance of the API. The first idea was to use something that runs after the response is returned to the client, but I don't know if that's possible using the EventDispatcher. Even if I use a queue to process stats or notifications, I need a centralized place where all the controllers can send information so logs get written and stats stored.
I hope my goal is clearer now. Sorry.
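For reference, a bare-bones version of that wiring with the standalone EventDispatcher might look as follows (Symfony 4.3+ dispatch() signature; the 'api.stats' event name and StatsEvent class are made up). Note the listener runs synchronously, so it should only hand work off to a queue:

```php
<?php
// Standalone EventDispatcher sketch; names are illustrative.

require 'vendor/autoload.php';

use Symfony\Component\EventDispatcher\EventDispatcher;
use Symfony\Contracts\EventDispatcher\Event;

class StatsEvent extends Event
{
    public $payload;

    public function __construct(array $payload)
    {
        $this->payload = $payload;
    }
}

$dispatcher = new EventDispatcher();

$dispatcher->addListener('api.stats', function (StatsEvent $event) {
    // In a real app: push onto a queue / insert into the stats table.
    error_log('stats: ' . json_encode($event->payload));
});

// In a controller action, after building the response:
$dispatcher->dispatch(new StatsEvent(['endpoint' => '/users', 'ms' => 12]), 'api.stats');
```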
I think you could use request filters (the After filter would be suitable for you), although I have never attempted to use them outside of the Symfony2 framework.
As for async operations, in general, sockets are your friend. You could externalize the logic by sending data to some socket which will in turn process the data accordingly. If that processing is non-essential (e.g. email and stats), your request can finish successfully even if your external mechanism fails.
I read some time ago about Gearman here (just an example), which might help externalize that work by creating separate jobs.
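A minimal sketch of handing the work to Gearman with PHP's gearman extension (the server address, the 'log_stats' function name and the payload are illustrative):

```php
<?php
// Fire-and-forget background job via the PHP gearman extension; a separate
// worker process registered for 'log_stats' does the actual insert/email.

$client = new GearmanClient();
$client->addServer('127.0.0.1', 4730);

// doBackground() returns immediately, so the response is not delayed
// even if the worker is slow or down.
$client->doBackground('log_stats', json_encode([
    'endpoint' => '/api/orders',
    'user_id'  => 42,
    'ts'       => time(),
]));
```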
Hope this sheds some light here :)
We have a web application built in PHP Laravel which exposes a bunch of schema objects via JSON API calls. We want to tie changes in our schema to AngularJS in such a way that when the database updates, the AngularJS model (and subsequently the view) also updates, in real time.
In terms of the database, it can be anything, such as MySQL, SQL Server, etc. There are a couple of ways we're thinking about this:
MySQL commits fire some sort of event at Laravel, which then fires a call to all relevant/listening models/views in AngularJS.
Whenever data is changed (edited/added), Laravel itself fires an event to AngularJS. In other words, after any successful DB commit, another "thing" is done to notify the front-end.
The second seems the obvious, clean way of doing this, since the database is not involved lower down the stack. Is there any better way of doing it?
This question is related:
How to implement automatic view update as soon as there is change in database in AngularJs?
but I don't quite understand the concept of a "room" in the answer.
What (if any) is the best way to efficiently tie database commits (pushing) to the AngularJS view (to render changes)? We want to avoid polling a JSON API for changes every second, of course.
I've also had a similar requirement on one of my projects. We solved it using node.js and SockJS. The flow is like this:
There is a node.js + SockJS server to which all clients connect.
When the DB is updated, Laravel issues a command to node.js via HTTP (Redis is also a possibility); see the sketch after this list.
node.js broadcasts the event to all interested clients (this depends upon your business logic).
The client either reloads the required data or, if the message is small enough, it can be included in the node.js broadcast itself.
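A sketch of step 2 on the Laravel side using Redis pub/sub; the channel name, payload shape and `$order` model are illustrative:

```php
<?php
// After a successful update, publish a message that the node.js/SockJS
// server (subscribed to the same Redis channel) fans out to clients.
// Assumes $order is the model that just changed.

use Illuminate\Support\Facades\Redis;

Redis::publish('db-updates', json_encode([
    'table' => 'orders',
    'id'    => $order->id,
    'event' => 'updated',
]));
```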
Hope this helps. There is no clean way to do this without using other technologies (node.js / web sockets / SSE, etc.). Much of it depends on the configuration your clients will be using as well.
I'm trying to create an application that will render a chart with data fetched from a financial web service. Users will interact with the chart by clicking on it. There could be millions of views of the application, and hence millions of requests to the web service. What's the best way to handle this? Will I need to call the web service each time, a million times?
The way I see it, you would be better off reading the data from the web service at a fixed interval (every X seconds/minutes/hours, depending on whether your data is real-time or not) and displaying the graph with that information. That way the number of requests to the web service does not vary with traffic, and you can change one part of the equation without affecting the other.
Edit after first comment: my answer stands. You would be better off getting the data at a fixed interval and informing users that the data is supplied with a delay (the usual delay for financial activity data is 15 minutes). That way you know in advance the number of requests you will make, and you greatly speed up the service to your users.
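If the back-end is PHP, the simplest version of this is a shared cache with a fixed TTL, so upstream calls stay bounded regardless of traffic. A hedged, Laravel-flavoured sketch (Laravel 5.8+ TTL semantics; the cache key and the fetchQuotes() helper are illustrative):

```php
<?php
// At most one upstream call every 15 minutes: the first request after
// expiry refreshes the cache, everyone else reads the cached copy.

use Illuminate\Support\Facades\Cache;

$quotes = Cache::remember('chart:quotes', now()->addMinutes(15), function () {
    return fetchQuotes(); // illustrative helper: one call to the web service
});
```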
A word of warning: Yahoo Finance changed its API licensing in 2011 or 2012, and it is now forbidden to use data from the API in public applications without a commercial license.