I'm working with the Facebook Graph API from inside CodeIgniter. The only problem is that it's so damn slow, and I was wondering whether it would be possible to somehow cache the responses from the Graph servers on my own web server?
To give you an indication of what I mean: in my model I'm making one Graph API call for each record from the database (getting the number of likes for analytics), and each record corresponds to a single page on my site, so you can imagine what that does to the performance of my app...
Any help would be appreciated...
Sure, it's just like any other cache file. You save the JSON response as a file and check its filemtime() against time(). Only hit the Graph API if your cache file is old.
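Roughly like this, for example (just a sketch: the cache directory, the 15-minute lifetime and the likes field are placeholders to adapt to your own Graph call):

    <?php
    // Sketch of a file-based cache for a Graph API response.
    // Cache directory, 15-minute lifetime and endpoint are placeholders.
    function get_page_likes($pageId)
    {
        $cacheFile = __DIR__ . '/cache/graph_' . md5($pageId) . '.json';
        $maxAge    = 15 * 60; // seconds

        // Serve from the cache while the file is fresh enough.
        if (is_file($cacheFile) && (time() - filemtime($cacheFile)) < $maxAge) {
            $json = file_get_contents($cacheFile);
        } else {
            // Cache missing or stale: hit the Graph API and rewrite the file.
            $json = file_get_contents('https://graph.facebook.com/' . urlencode($pageId));
            if ($json !== false) {
                file_put_contents($cacheFile, $json);
            } elseif (is_file($cacheFile)) {
                $json = file_get_contents($cacheFile); // request failed, fall back to the stale copy
            } else {
                $json = '{}';
            }
        }

        $data = json_decode($json, true);
        return isset($data['likes']) ? (int) $data['likes'] : 0;
    }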
There are many ways to cache data. You could use Redis, Mongo, Memcached, you could store it as a file on disk, you could even write it to a SQL database. Facebook's API Terms of Service prohibit the permanent storage of such data, however, so be wary of that and make sure your cache expires or is temporary in some fashion.
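If you go the Memcached route, the expiry argument handles the "temporary" part for you. A quick sketch (the key prefix and one-hour lifetime are arbitrary):

    <?php
    // Sketch: store a Graph API response in Memcached with a one-hour expiry,
    // so nothing is kept permanently.
    $memcached = new Memcached();
    $memcached->addServer('127.0.0.1', 11211);

    $key  = 'graph_' . md5($pageId);   // $pageId comes from your own code
    $json = $memcached->get($key);

    if ($json === false) {             // cache miss or expired entry
        $json = file_get_contents('https://graph.facebook.com/' . urlencode($pageId));
        $memcached->set($key, $json, 3600); // third argument is the expiry in seconds
    }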
Related
I am using a script that fetches user info from an API.
This is how it works: I get the id of the user posted to PHP.
Then I make a cURL request to the API with the user id to get their info.
The problem is that whenever the user refreshes the page, my script does the work again and fetches the info to display it. But this API takes a long time to respond, which makes my script slow.
Is there any way to cache the API response for some time? (The server I request forces no-cache in its headers.)
I am thinking about creating a directory with a JSON file (the API response) for every user id, or I could go with a MySQL solution.
What's the best solution for me?
N.B.: I'm getting about 200,000 requests/day.
Thanks
It sounds like you actually need to optimize this API, so that updates will be there as you expect them to be.
However, if you insist that caching is appropriate for your use, there's no need to reinvent the wheel. Simply put Nginx, Varnish, etc. in front of the API server. Then configure your application to request from this caching proxy rather than the API server directly.
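A minimal Nginx sketch of that idea could look like the following (the listen port, cache path, zone name, upstream host and 10-minute lifetime are all assumptions to adapt; proxy_ignore_headers is there because your upstream forces no-cache):

    # Sketch: cache upstream API responses for 10 minutes, ignoring its no-cache headers.
    proxy_cache_path /var/cache/nginx/api levels=1:2 keys_zone=api_cache:10m max_size=1g;

    server {
        listen 8080;

        location / {
            proxy_pass https://api.example.com;          # the slow upstream API
            proxy_cache api_cache;
            proxy_cache_valid 200 10m;                   # keep successful responses for 10 minutes
            proxy_ignore_headers Cache-Control Expires;  # upstream says no-cache, so override it
            add_header X-Cache-Status $upstream_cache_status;
        }
    }

Your PHP script would then point its cURL request at http://localhost:8080/... instead of the API host, and only cache misses would actually reach the slow upstream.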
See also:
Nginx Caching Configuration Documentation
I am working on a desktop application with Electron and I am considering online storage for the data. I would like to get some ideas on the approach, as I couldn't find reliable answers from a Google search.
Approach 1: Electron app (front end) + PHP (e.g. purchase a hosting package from GoDaddy with a domain such as www.mysite.com)
With this approach I am planning to create API calls in PHP to perform basic CRUD.
Is this a good way to do it?
Will this affect the speed/load time?
Are there better ways to handle this situation?
Thank you very much in advance for the help.
Well, this is not an easy topic. Your solution could work: your Electron app asks your server for data and stores data back to it. In any case, the best solution depends on your application.
The most important questions you have to ask yourself are:
How often do you need to reach your server?
Can your users work without data from the server?
How long does it take to read and store data on your server? (It's different if you store a few KB or many GB of data.)
Must the data stored online be shared with other users, or does every user only access their own data?
If all the information is stored on your server, your app's startup has to wait for the request to complete, but you can show a loader or something similar to mitigate the wait.
In my opinion you have many choices, from the simplest (and slowest) to the most complex (which mitigates network lag):
Simple AJAX requests to your server: as you described, you make some HTTP requests to your server and read and write the data to be displayed in your application. Your users will have to wait for the requests to complete, so show them a loading animation to mitigate the wait. (A minimal PHP endpoint for this option is sketched after this list.)
There are some solutions that save the data locally in your Electron installation and then sync it online. Have a look at PouchDB for an example.
Recently I've been looking at GraphQL. GraphQL is a query language for your API. It's not that easy, but it has some interesting features: clients keep an internal cache and are designed around optimistic updates. You update your application immediately, assuming your POST will succeed, and if something goes wrong you roll the change back accordingly.
I'd also like to suggest trying a backend-as-a-service. You don't have a server already and you would have to sign a new contract anyway, so why not check out a dedicated service like Firebase? The Google Firebase Realtime Database lets you work in JavaScript (just one language involved in the project) and syncs your data online automatically and between devices, without the need to write any web service. I've only played with it for some prototypes, but it looks very interesting and it's cheap. It also has a free plan that's enough for many users.
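For the first option, the PHP side can stay very small. A rough sketch of a JSON endpoint, assuming a notes table and PDO credentials you would replace with your own:

    <?php
    // notes.php - minimal JSON endpoint (read + create only, as a sketch).
    // Table name, columns and credentials are placeholders.
    header('Content-Type: application/json');

    $pdo = new PDO('mysql:host=localhost;dbname=myapp;charset=utf8mb4', 'user', 'secret', [
        PDO::ATTR_ERRMODE => PDO::ERRMODE_EXCEPTION,
    ]);

    if ($_SERVER['REQUEST_METHOD'] === 'GET') {
        // Read: return all notes.
        $rows = $pdo->query('SELECT id, body, created_at FROM notes')->fetchAll(PDO::FETCH_ASSOC);
        echo json_encode($rows);
    } elseif ($_SERVER['REQUEST_METHOD'] === 'POST') {
        // Create: insert the posted body and return the new id.
        $input = json_decode(file_get_contents('php://input'), true);
        $stmt  = $pdo->prepare('INSERT INTO notes (body, created_at) VALUES (?, NOW())');
        $stmt->execute([isset($input['body']) ? $input['body'] : '']);
        echo json_encode(['id' => $pdo->lastInsertId()]);
    } else {
        http_response_code(405);
        echo json_encode(['error' => 'method not allowed']);
    }

Your Electron app would call this with an ordinary fetch/AJAX request and show a loader while it waits.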
Keep in mind that if each user only accesses their own data, the fastest and easiest solution is to use a database inside your Electron application: a SQLite database, an IndexedDB database, or even serializing to JSON and storing everything in localStorage (if your data fits the size limits).
Hope this helps
I'm trying to get Node.js and CodeIgniter to share data with each other (sessions & db data). I've googled around quite a bit for a solution to this problem, but I still haven't found the most convenient way to do things. It seems the most appropriate way is to use some sort of caching software, such as Memcached or Redis, since wrappers are available for both Node and PHP.
This is what I've thought of so far:
Client logs in as normal on CodeIgniter powered website. Session is created and added to memcached.
Client connects to a secure socket.io server using SSL.
Client sends raw cookie data to socket.io server.
Server does some string splitting with the cookie data and gets the session id.
Server checks the cache to see if session id exists. If yes, user is logged in!
User logs out on CodeIgniter site. Session data is destroyed.
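On the PHP side I imagine step 1 looking roughly like this (just a sketch; the key prefix, the 2-hour timeout and the $user object are placeholders for whatever my login code actually has):

    <?php
    // Sketch of step 1: after a successful CodeIgniter login, mirror the session
    // into Memcached so the socket.io server can look it up by session id.
    $memcached = new Memcached();
    $memcached->addServer('127.0.0.1', 11211);

    $sessionId = $this->session->userdata('session_id');   // CodeIgniter 2.x style session id
    $payload   = json_encode([
        'user_id'  => $user->id,        // $user comes from my own auth code
        'username' => $user->username,
    ]);

    // The timeout mirrors the session lifetime, so stale entries expire on their own.
    $memcached->set('sess_' . $sessionId, $payload, 7200);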
However, there are a few problems that I can think of with this approach:
1. How would I clean up expired sessions from Memcached? From what I can tell, there is no way of detecting expired sessions in CodeIgniter. Edit: I just realized that I could set a timeout on the Memcached data to solve this.
2. The CodeIgniter docs say that session ids are regenerated every five minutes. Wouldn't this kind of ruin my approach?
Is there a better solution out there? I'd like to hear what other options there are before I start implementing this.
I am making a PHP/MySQL web app. My idea is to install a web server in the customer's home so they can connect with whatever device they want; most of the time they will probably be using an iPad to connect to the app.
Sometimes the client needs to take the app out with them on the iPad, so after discarding other options (like PhoneGap, because I need to maintain a MySQL db for some functions) I realized that the Application Cache may be a good solution: they can use the application with the web server (using db-backed functions like randomizing the content and generating statistics), and when they are offline they can access a local copy of the content, with limited but working functionality.
The problems I have are that the site has images, video and audio, so there are at least 20 MB to cache, and I read that with the Application Cache you can only store 5 MB. The other problem is that my content is dynamic, so I can't list all the files I need in the cache manifest ahead of time. I want something like a wget of the site (saving a static HTML copy) and then use the dynamic content when online. I don't know if I can do something like that.
Thanks
The cache.manifest on the iPad can store more than 5 MB.
The current iOS limit is 50 MB.
If you cache more files, the iPad automatically asks whether you want to increase the storage to 50 MB.
Take a look at this:
It explains how to create and implement the cache.manifest; it's a great tutorial.
Hope this helps.
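Since your content is dynamic, you can also generate the manifest itself with PHP instead of maintaining a static file. A rough sketch (the media/ directory and the versioning scheme are placeholders; point <html manifest="manifest.php"> at it):

    <?php
    // manifest.php - dynamically generated cache manifest (sketch).
    // Lists whatever is currently in media/; directory and extensions are placeholders.
    header('Content-Type: text/cache-manifest');

    echo "CACHE MANIFEST\n";

    $latest = 0;
    foreach (glob(__DIR__ . '/media/*.{jpg,png,mp3,mp4}', GLOB_BRACE) as $file) {
        $latest = max($latest, filemtime($file));
        echo 'media/' . basename($file) . "\n";
    }

    // Changing this comment invalidates the old cache and triggers a re-download.
    echo "# version: " . $latest . "\n";

    // Everything not listed above is fetched from the network when online.
    echo "NETWORK:\n*\n";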
My client has an offline product database for a high street shop that they update fairly frequently for their own purposes. They are now creating an online store for which they want to use the product information from this database.
Migrating the database to a hosted server and abandoning the offline database is not an option due to their current legacy software set up.
So my question is: how can I get the information from their offline database to an online database? Their local server is always connected to the internet so is it possible to create a script on the website that somehow grabs the data from their server and imports it into the online server? If this ran every 24 hours it would be perfect. But is it even possible? And if so how would I do it?
The only other option I can think of is to manually upload the database after every update, but this isn't really a viable idea.
I did something like this with QuickBooks using an ODBC connection, and synced the data to MySQL that way. This synchronization, however, was only one-way. Unless you have keys in the data that indicate when something was changed (an updated date), you will end up syncing a lot of extra data.
Using SQLyog, I set up a scheduled job that connected to the ODBC data source and pushed the changes since the last sync to the MySQL database I was using to generate reports. If you can get the data replicated into MySQL, it should be easy at that point to make use of it in your online store.
The downside is that it won't be real-time. Inventory could become a problem.
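If you end up scripting the sync yourself instead of using SQLyog, it boils down to something like this (a sketch; the ODBC DSN, the products table and the updated_at column are assumptions about your schema):

    <?php
    // Sketch: push rows changed since the last sync from the offline (ODBC) source
    // to the online MySQL database. DSNs, credentials and columns are placeholders.
    $source = new PDO('odbc:OfflineProducts');   // the legacy offline database
    $target = new PDO('mysql:host=db.mysite.com;dbname=shop', 'user', 'secret');

    $lastSync = is_file('last_sync.txt') ? trim(file_get_contents('last_sync.txt')) : '1970-01-01 00:00:00';
    $started  = date('Y-m-d H:i:s');

    $changed = $source->prepare('SELECT sku, name, price, updated_at FROM products WHERE updated_at > ?');
    $changed->execute([$lastSync]);

    $upsert = $target->prepare(
        'INSERT INTO products (sku, name, price, updated_at) VALUES (?, ?, ?, ?)
         ON DUPLICATE KEY UPDATE name = VALUES(name), price = VALUES(price), updated_at = VALUES(updated_at)'
    );

    foreach ($changed as $row) {
        $upsert->execute([$row['sku'], $row['name'], $row['price'], $row['updated_at']]);
    }

    file_put_contents('last_sync.txt', $started);   // remember where this run started

Run it from a scheduled task or cron every 24 hours, as you described.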
In an ideal world I would look at creating a RESTful API that runs on the same server, or at least on the same network, as your offline database. This RESTful API would run as a web server over HTTP and return JSON (or even XML) structures of data from the offline database. Clients running on the internet would be able to connect and fetch any data they need, at any time. A RESTful API like this has a number of advantages.
Firstly, it's secure. You don't have to open up an attack vector by making connections to your offline database public; the only thing you have to expose is the RESTful API itself. In your API's logic you might not even include functionality to write to the database, so even if your API's security is compromised, at very worst all attackers can do is read your data, not corrupt it.
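To make that concrete, the read-only part could be as small as the sketch below (the ODBC DSN, table and columns are assumptions about your offline database):

    <?php
    // products.php - read-only endpoint: GET /products.php or /products.php?sku=ABC123
    // returns JSON from the offline database. DSN and schema are placeholders.
    header('Content-Type: application/json');

    if ($_SERVER['REQUEST_METHOD'] !== 'GET') {
        http_response_code(405);                        // no write functionality at all
        echo json_encode(['error' => 'read-only API']);
        exit;
    }

    $db = new PDO('odbc:OfflineProducts');

    if (isset($_GET['sku'])) {
        $stmt = $db->prepare('SELECT sku, name, price, stock FROM products WHERE sku = ?');
        $stmt->execute([$_GET['sku']]);
        $result = $stmt->fetch(PDO::FETCH_ASSOC);
        echo json_encode($result !== false ? $result : null);
    } else {
        echo json_encode($db->query('SELECT sku, name, price, stock FROM products')->fetchAll(PDO::FETCH_ASSOC));
    }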
Having a RESTful API in this situation also gives you a good separation of concerns. Your client code should not know anything about the database, nor should it know about any internal systems that the offline database uses. What happens when your client wants to update their offline system, or even replace it? In that situation all you would have to do is update the RESTful API; the client that consumes the data cares about nothing but the API, so changing databases would be easy.
Another reason to consider an API is concurrency. I hinted at this before, but having an API would be great if you ever need more than one client accessing the offline database's data. In a web server setup where the API sits waiting for requests, there is no reason why you could not have more than one client connecting to the API at the same time. HTTP is really good at this!
You talked about having to place the old data in a new database. Something like this could be done easily with a RESTful API, as you would just have to map the endpoints of your API to tables in the new database and run that whenever you need. You could even forgo the new database and use the API as your backend. This solution would require some caching, but it would cut down on duplicating the database if you don't feel that's needed.
The drawback to all of this is that writing an API rather than a script is more complex. So in this situation I believe in horses for courses. If this database is the backbone of a long-term project that will keep expanding, an API is the way to go. If it's a small part of your project, then maybe you can swing it with a script that runs every 24 hours; however, I have done this before, and the second I have to change or edit the solution, things start getting a little "hairy". Hope this helps and good luck with it.