How to optimize API performance - PHP

I am using a script that fetches user info from an API.
This is how it works: I get the id of the user posted to PHP, then make a cURL request to the API with that user id to get their info.
The problem is that whenever the user refreshes the page, my script does the work again and fetches the info to display it. But this API takes a long time to respond, and that makes my script slow.
Is there any solution to cache the API response for some time? (The server I request sends Cache-Control: no-cache headers.)
I am thinking about creating a directory with a JSON file (the API response) for every user id, or I could go with a MySQL solution.
What's the best solution for me?
N.B.: I'm getting about 200,000 requests/day.
Thanks

It sounds like you actually need to optimize this API, so that up-to-date data is returned as quickly as you expect it to be.
However, if you insist that caching is appropriate for your use, there's no need to reinvent the wheel. Simply put Nginx, Varnish, etc. in front of the API server. Then configure your application to request from this caching proxy rather than the API server directly.
See also:
Nginx Caching Configuration Documentation
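
If a caching proxy isn't an option and you do go with the per-user-id JSON files the question describes, a minimal sketch could look like the following; the API URL, cache directory, and five-minute TTL are assumptions, not part of the original setup:

    <?php
    // Return the user info, refreshing the cached JSON file only when it is older than the TTL.
    function get_user_info(int $userId, int $ttl = 300): array
    {
        $cacheDir  = __DIR__ . '/cache';
        $cacheFile = $cacheDir . '/user_' . $userId . '.json';

        // Serve from cache while the file is fresh enough.
        if (is_file($cacheFile) && (time() - filemtime($cacheFile)) < $ttl) {
            return json_decode(file_get_contents($cacheFile), true);
        }

        // Cache is missing or stale: hit the slow API once and store the raw response.
        $ch = curl_init('https://api.example.com/users/' . $userId); // hypothetical endpoint
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
        $response = curl_exec($ch);
        curl_close($ch);

        if (!is_dir($cacheDir)) {
            mkdir($cacheDir, 0775, true);
        }
        file_put_contents($cacheFile, $response);

        return json_decode($response, true);
    }

At 200,000 requests/day even a short TTL like this keeps repeated refreshes from hitting the slow API.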

Related

Electron desktop app with online server and database?

I am working on a desktop application with Electron and I am considering online storage for the data. I would like to get some ideas on the approach, as I couldn't find reliable answers through a Google search.
Approach 1: Electron app (front end) + PHP (e.g. purchase a hosting package from GoDaddy with a domain such as www.mysite.com).
With this approach I am planning to create API calls in PHP to perform basic CRUD.
Is this a good way?
Will this affect the speed/load time?
Are there better ways for this situation?
Thank you very much in advance for the help.
Well, this is not an easy topic. Your solution could work: your Electron app asks your server for data and stores data to it. In any case, the best solution depends on your application.
The most important questions you have to ask yourself are:
How often do you need to reach your server?
Can your users work without data from the server?
How long does it take to read and store data on your server? (It's different if you store a few KB or many GB of data.)
Must the data stored online be shared with other users, or does every user only access their own data?
If all the information is stored on your server, your app startup has to wait for the requests to complete, but you can show a loader or something similar to mitigate the waiting.
In my opinion you have many choices, from the simplest (and slowest) to the most complex (but which mitigates network lag):
Simple AJAX requests to your server: as you described, you make some HTTP requests to your server to read and write the data displayed in your application. Your users will have to wait for the requests to complete; show them a loading animation to mitigate the wait. (A minimal server-side sketch follows this list.)
There are solutions that save the data locally in your Electron installation and then sync it online; have a look at PouchDB for an example.
Recently I have been looking at GraphQL. GraphQL is a query language for your API. It's not that easy, but it has some interesting features: the client libraries provide an internal cache and it's well suited to optimistic updates, where you update your application immediately on the assumption that your POST will succeed and then correct it if something goes wrong.
I'd also like to suggest trying a backend-as-a-service. You don't have a server already and would have to sign a new contract anyway, so why not check out a dedicated service like Firebase? Google's Firebase Realtime Database lets you work in JavaScript (just one language involved in the project) and syncs your data online automatically and between devices without the need to write any web service. I've only played with it for some prototypes, but it looks very interesting and it's cheap. It also has a free plan that is enough for many users.
Keep in mind that if your users only have access to their own data, the fastest and easiest solution is a database inside your Electron application: an SQLite database, an IndexedDB database, or even serializing to JSON and storing everything in localStorage (if your data fits the size limits).
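For that first option the PHP side can stay very small. A minimal read-only endpoint sketch, assuming a hypothetical notes table and local MySQL credentials, which the Electron app could call with a plain HTTP request:

    <?php
    // api/notes.php - return one record as JSON (hypothetical table and credentials).
    header('Content-Type: application/json');

    $pdo = new PDO('mysql:host=localhost;dbname=app', 'app_user', 'secret', [
        PDO::ATTR_ERRMODE => PDO::ERRMODE_EXCEPTION,
    ]);

    $id = isset($_GET['id']) ? (int) $_GET['id'] : 0;

    $stmt = $pdo->prepare('SELECT id, title, body FROM notes WHERE id = ?');
    $stmt->execute([$id]);
    $note = $stmt->fetch(PDO::FETCH_ASSOC);

    echo json_encode($note ?: ['error' => 'not found']);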
Hope this helps

Request page from same server?

I have a website that is consuming its own API. This means sending a request to itself.
I feel like there might be a way to send a request to a local page that is quicker than just requesting the full API URL.
Right now I'm doing: file_get_contents("http://domain.com/api/recent")
These didn't work when I tried them:
file_get_contents("http://localhost/api/recent")
file_get_contents("http://127.0.0.1/api/recent")
Sorry, I can't comment since I don't have enough reputation, so here are my 5 cents:
You could easily use the local API by including/requiring the API PHP file, after setting up all the POST/GET variables prior to the inclusion. You could stash the user's original values if they matter and re-set them from that copy afterwards.
When you call the API over HTTP it is inherently slower, because the request goes through the web server rather than staying inside the PHP engine. Cheers.
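As an illustration of that include approach, a minimal sketch; the api/recent.php path and its JSON output are assumptions, and a real API script may depend on more of its environment:

    <?php
    // Call the API "in-process" instead of over HTTP.

    // 1. Stash the current request variables, then set the ones the API script expects.
    $originalGet = $_GET;
    $_GET = ['limit' => 10];

    // 2. Capture whatever the API script would normally echo.
    ob_start();
    require __DIR__ . '/api/recent.php';
    $response = ob_get_clean();

    // 3. Restore the original request variables.
    $_GET = $originalGet;

    $data = json_decode($response, true);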

Sending data to a slow API without slowing down page load

I am doing a cURL call to pass information to an API. The issue I am having is that sometimes the API responds slowly. I need to pass the data immediately, but I don't want the user to be stuck on the processing page while the API tries to make the connection.
Is there a good alternative, kind of like multithreading or something, that I could use to still query this API while moving the user on to the next page?
Thanks!
Use fire-and-forget.
I don't know if cURL can do this, but make it so it won't attempt to read from the stream: just send and close.
If the connection to the remote site is slow as well, you'll need to do some proxying.
Fire-and-forget proxying is a poor man's solution to threading.
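One common way to approximate fire-and-forget in plain PHP is to open the socket yourself, write the request, and close without waiting for the reply. A sketch, where the host, path, and payload are placeholders:

    <?php
    // Fire-and-forget HTTP POST: send the request and close without reading the response.
    function fireAndForget(string $host, string $path, string $payload): void
    {
        $fp = fsockopen($host, 80, $errno, $errstr, 1); // 1-second connect timeout
        if (!$fp) {
            return; // could log $errno / $errstr here
        }

        $request  = "POST {$path} HTTP/1.1\r\n";
        $request .= "Host: {$host}\r\n";
        $request .= "Content-Type: application/json\r\n";
        $request .= 'Content-Length: ' . strlen($payload) . "\r\n";
        $request .= "Connection: Close\r\n\r\n";
        $request .= $payload;

        fwrite($fp, $request);
        fclose($fp); // do not wait for or read the API's response
    }

    fireAndForget('api.example.com', '/track', json_encode(['event' => 'signup']));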

Notifications via socket.io on a PHP site

I am building a website in PHP that handles the sessions in Redis in JSON format.
This way the session can be accessed by both the PHP interpreter and a node.js server.
What I am trying to do now is add notifications to said website. The procedure I was thinking of is the following (just picture it as a simple friend request, to keep things simple):
User A sends a friend request.
PHP uses cURL to tell the Node.js service to send a notification.
User B gets a notification because he is connected to Node.js via socket.io.
What are the general guidelines to achieve this? Because it seems clear to me that if user A and user B are on different servers, it will be impossible to scale horizontally.
Thanks in advance.
Sounds like you could make use of WebSockets here with a publish/subscribe protocol and architecture.
You get server-to-client push functionality with WebSockets.
Node is a perfect choice for a WebSocket server: lots of small I/O.
See http://en.wikipedia.org/wiki/Web_sockets
I wouldn't think the shared session is required for PHP-to-Node communication; just have your clients push requests through the socket and handle the responses as needed.
I think the approach you propose sounds quite reasonable. However, instead of making a direct request to the service, you could consider using a message queue (ZeroMQ, RabbitMQ, whatever), which would let you scale more easily since you can add logic to the queue processing to route each message to the correct Node instance.
I managed to get 1400 concurrent connections with socket.io on a cheap VPS with no special configuration so even with no tricks it should scale quite well. In my case most of these connections were also sending and receiving data almost constantly. It could probably have handled more than that, 1400'ish was simply the max number of users I happened to get.
Though I'd worry more about getting all those users first ;)
Use Redis's built-in pub/sub capability. When the PHP server gets a request, it publishes it to a channel set up for that purpose. Each of your Node servers subscribes to that channel and checks whether the user involved is connected to it. If so, it sends the notification via socket.io. As a bonus, you avoid an extra network connection and your overall logic is simplified.
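On the PHP side, publishing with the phpredis extension is only a few lines; the channel name and payload shape below are assumptions, and the Node.js side would subscribe to the same channel:

    <?php
    // Publish a friend-request notification; every subscribed Node.js process receives it
    // and can forward it over socket.io to the recipient if they are connected there.
    $redis = new Redis();                 // phpredis extension
    $redis->connect('127.0.0.1', 6379);

    $senderId    = 42;                    // example values; use the real session user ids
    $recipientId = 7;

    $redis->publish('notifications', json_encode([
        'type' => 'friend_request',
        'from' => $senderId,
        'to'   => $recipientId,
    ]));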
Simply set up your database as required. Then, whenever an activity happens, have Node.js pass the related information through Redis to PHP, process it in PHP, and send the response back from PHP to Node via the channel; keep checking the notifications table and display anything new.

Facebook Graph API Caching

I'm working with the Facebook Graph API from inside CodeIgniter. The only problem is that it's so damn slow, and I was wondering if it would be possible to somehow cache the response from the Graph servers on my own web server.
To give you an indication of what I mean: in my model I'm making one Graph API call for each record from the database (getting the number of likes, for analytics). Each record from the database is a single page on my site, so you can imagine what that does for the performance of my app...
Any help would be appreciated...
Sure, it's just like any other cache file. You save the JSON response as a file and check its filemtime() against time(). Only hit the Graph API if your cache file is old.
There are many ways to cache data. You could use Redis, Mongo, Memcached, you could store it as a file on disk, you could even write it to a SQL database. Facebook's API Terms of Service prohibit the permanent storage of such data, however, so be wary of that and make sure your cache expires or is temporary in some fashion.
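As an example, an expiring cache with the Memcached extension (one of the stores mentioned above) could wrap each likes lookup roughly like this; the key prefix and TTL are assumptions, and $fetchFromGraph stands in for whatever Graph API call the model already makes:

    <?php
    // Cache Graph API likes counts with a short TTL so nothing is stored permanently.
    $mc = new Memcached();
    $mc->addServer('127.0.0.1', 11211);

    function cached_likes(Memcached $mc, string $pageId, callable $fetchFromGraph, int $ttl = 600): int
    {
        $key    = 'fb_likes_' . $pageId;   // hypothetical key prefix
        $cached = $mc->get($key);

        if ($cached !== false) {
            return (int) $cached;          // cache hit: no Graph API call
        }

        $likes = (int) $fetchFromGraph($pageId); // your existing (slow) Graph API call
        $mc->set($key, $likes, $ttl);            // entry expires after $ttl seconds
        return $likes;
    }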
