Save API data to create a JavaScript chart - PHP

I am wondering how to auto-save API data from http://api.bitcoinaverage.com/ticker/USD and https://crypto-trade.com/api/1/ticker/dvc_btc to create a chart using jqplot.
How can I make automated calls to each site every 10 minutes and save the data, without the data being overwritten by future calls?
Something like the chart here: vircurex.com/

You will need to use a combination of a Database, a Cache and a Cron (scheduled job) on your server to achieve what you want to do.
A high level approach would be:
1) Run a backend cron job every 10 minutes. It makes a call to your data sources, i.e. the HTTP services, and saves the data in the database (see the sketch below).
2) When the front end makes a call, check whether the results are present in the Cache. If they are, return them from the Cache itself so that (expensive) database calls are avoided. If they are not, retrieve the results from the database, put them in the Cache, and return the data.
You might want to design your Cache and the items as per your requirements. Depending on the Caching libraries that you use, you could look for features like auto-expiring the cache items, reloading them automatically, etc.
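A minimal sketch of the cron half in plain PHP, assuming a MySQL table named ticker_history and placeholder PDO credentials (all names here are illustrative):

<?php
// fetch_tickers.php - run from cron, e.g.: */10 * * * * php /path/to/fetch_tickers.php
// Assumed table (illustrative):
//   CREATE TABLE ticker_history (id INT AUTO_INCREMENT PRIMARY KEY,
//     source VARCHAR(64), payload TEXT, fetched_at DATETIME);

$sources = array(
    'bitcoinaverage' => 'http://api.bitcoinaverage.com/ticker/USD',
    'crypto-trade'   => 'https://crypto-trade.com/api/1/ticker/dvc_btc',
);

$pdo  = new PDO('mysql:host=localhost;dbname=charts', 'user', 'pass');
$stmt = $pdo->prepare(
    'INSERT INTO ticker_history (source, payload, fetched_at) VALUES (?, ?, NOW())'
);

foreach ($sources as $name => $url) {
    $json = file_get_contents($url);          // consider cURL with a timeout in production
    if ($json !== false) {
        $stmt->execute(array($name, $json));  // INSERT only, so earlier rows are never overwritten
    }
}

Because every run INSERTs a new row instead of UPDATEing an existing one, old data points survive, and the front end can read the whole series back out for jqplot.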

Pull API data real-time and publish

Info about the website I'm working on:
Website related to soccer live scores
The API provides real-time data for soccer scores (they have no webhooks)
What I want:
I want to deliver the real-time score on the frontend.
I also want to save that data temporarily in Redis and, after the match finishes, push it from Redis to the database.
Preferably without any external JS libraries (http://socket.io), Pusher, etc. Laravel Broadcasting + Redis is my preferred way, since then I won't need Pusher or socket JS code to load.
Parts of this problem:
Part 1: Pulling external API data to database(or Redis).
--> So far, the only way I've managed to pull data from the API into the database is by creating a route that triggers loading the data from the external API. This is fairly useless as it stands, because the live score data in the API is updated almost every second, so I would have to hit the route (or refresh the URL) every second just to pull data from the API. Also, not to forget, it takes a minimum of 2-3 seconds just to completely transfer the API data to the database. This part should not depend on whether the frontend is requesting; it should do its job even if there are 0 users online.
So, my question is: what is the best, most efficient and complete way to pull API data in real time and save it in Redis until the match is finished? (We can tell the status of the match by checking the API data, for example: {match_id{status: finished}}xxxx.) After the match is finished, I will push the data from Redis to the database.
Part 2: Publishing that data real-time from the database(or Redis).
-> Okay, this one for me is fairly easier than part 1: I've already found ways to publish Redis data in real time via Pusher and socket.io. But other than those, what is the best way to do it in my scenario? Also, do I need any JS libraries if I use a combination of Redis + Laravel Broadcasting?
Thank you for your suggestion!
Possible answer for part 1:
I would use Task Scheduling to run an Artisan console command ->daily() or ->hourly() that checks when the next soccer match is and writes a record to the matches table, or updates existing records in case the start/end time changes.
Another console command on ->cron('* * * * *'); (cron's finest granularity is one minute; loop inside the command if you need to poll every few seconds) can check the matches table; if the current time is between the starts_at and ends_at of a match, retrieve the realtime data.
To prevent multiple executions of the command at the same time (if for some reason an API call takes a bit longer), Symfony's LockableTrait might be used.
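A minimal sketch of that second command, assuming a matches table behind an illustrative FootballMatch model and a placeholder fetchLiveScores() helper standing in for the real API client:

<?php
// app/Console/Commands/PullLiveScores.php
namespace App\Console\Commands;

use Illuminate\Console\Command;
use Illuminate\Support\Facades\Redis;
use App\Models\FootballMatch; // illustrative model over the matches table

class PullLiveScores extends Command
{
    protected $signature = 'scores:pull';
    protected $description = 'Pull live scores for matches currently in play';

    public function handle()
    {
        // Only poll while a match is actually running
        $live = FootballMatch::where('starts_at', '<=', now())
            ->where('ends_at', '>=', now())
            ->get();

        foreach ($live as $match) {
            $data = $this->fetchLiveScores($match);
            // Keep the live snapshot in Redis; push to the DB once status is "finished"
            Redis::set('match:'.$match->id, json_encode($data));
        }
    }

    private function fetchLiveScores($match)
    {
        // Placeholder: call your score provider's HTTP API here
        return array('match_id' => $match->id, 'status' => 'in_play');
    }
}

In app/Console/Kernel.php you would then schedule it; ->withoutOverlapping() covers the same concern as the LockableTrait:

$schedule->command('scores:pull')->everyMinute()->withoutOverlapping();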

Show real time data to users with same response time

I have a doubt regarding speed and latency when showing real-time data.
Let's assume that I want to show real-time data to users by firing an AJAX request every second that gets data from a MySQL table with a simple collection query.
For that, currently these two options come to mind:
MySql / Amazon Aurora
File system
Among these options which would be better? Or any other solution?
As I checked practically, if we open one page in the browser, the AJAX request responds in less than 500 ms using the PHP, MySQL, Nginx stack.
But if we open more pages, the same AJAX requests respond in more than 1 second, whereas they should stay under 500 ms for every visitor.
So in this case, as visitors increase, the AJAX responses become very poor.
I also checked with Node.js + MySQL, with the same result.
Is it good to create JSON files for the records and fetch the data from files? Or is there another solution?
Indeed, you have to use a database to store the actual data, but you can easily add a memory cache (it could be an internal dictionary or a separate component) to track when updates actually happened.
Then your typical AJAX request will look something like:
Memcache, do we have anything new for user 123?
Last update was 10 minutes ago.
Aha, so nothing new; let's return null.
When you write data:
Put data into database
Update lastupdated time for clients in memcache
The actual key might be different - e.g. a chat room id. The idea is to read the database only when updates have actually happened.
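A minimal sketch of both paths using the Memcached extension (key names and the scores query are illustrative):

<?php
// Read path: only touch MySQL when the cache says something changed
$mc = new Memcached();
$mc->addServer('127.0.0.1', 11211);

$lastUpdated = $mc->get('lastupdated:user:123');
$clientSeen  = isset($_GET['since']) ? (int) $_GET['since'] : 0; // timestamp the client last saw

if ($lastUpdated !== false && $lastUpdated <= $clientSeen) {
    echo json_encode(null);   // nothing new, no database hit
    exit;
}

// Something changed (or the cache is cold): fall back to the database
$pdo  = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');
$rows = $pdo->query('SELECT * FROM scores ORDER BY updated_at DESC LIMIT 50')->fetchAll();
echo json_encode(array('updated' => time(), 'rows' => $rows));

// Write path (wherever you insert data): bump the marker after the INSERT
// $mc->set('lastupdated:user:123', time());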
Level 2:
You will burn your web server and also the client's internet connection with a high number of calls. You can do something like long polling instead:
DateTime start = DateTime.Now;
// Hold the request open for up to 30 seconds (long polling)
while (DateTime.Now.Subtract(start) < TimeSpan.FromSeconds(30))
{
    if (hasUpdates) return updates;
    Thread.Sleep(100);   // re-check every 100 ms
}
return null;             // timed out, nothing new
Then the client will call the server at most once per 30 seconds, and it will get a response immediately when the server notices new data.

How to update real time data with php and mysql

Currently I have a data file in Dropbox that is uploaded every 15 seconds. I want to take this data, which has several different data types, and graph the real-time data that the user selects on a website. I have a data server, but my data is not on there. Is there any way for me to take the data from the file and graph it, while also having a control panel that selects which data I want to graph?
You can refresh your web page using Ajax. Note that if your refresh is set to every 15 seconds and your data comes in every 15 seconds, worst-case is that you will show data that's almost 30 seconds old if the timing of the data update and the Ajax refresh is unfortunate.
You probably want to check for new data using Ajax more frequently, depending on your specific needs. On the server side, cache the result of the Ajax update to avoid too much duplicate processing.
To create the data that you return from the Ajax query, open and process the data file. No need for MySQL. You can use the timestamp of the file to invalidate the result cache I suggest in the previous paragraph.
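A minimal sketch of that endpoint, assuming the Dropbox file is synced to a local path and parseDataFile() is an illustrative parser for your format:

<?php
// data.php - Ajax endpoint; re-processes the file only when it actually changes
$source = '/path/to/dropbox/data.csv';   // illustrative path
$cache  = '/tmp/data_cache.json';

// The source file's mtime doubles as the cache-invalidation key
if (!file_exists($cache) || filemtime($cache) < filemtime($source)) {
    $parsed = parseDataFile($source);    // hypothetical: parse your format into an array
    file_put_contents($cache, json_encode($parsed));
}

header('Content-Type: application/json');
readfile($cache);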
There are many JavaScript based charting libraries that can update via Ajax. Here's a good starting point:
Graphing JavaScript Library

Zend_Cache vs cronjob

I am working on my bachelor's project and I'm trying to figure out a simple dilemma.
It's a website of a football club. There is some information that will be fetched from the website of national football association (basically league table and matches history). I'm trying to decide the best way to store this fetched data. I'm thinking about two possibilities:
1) I will set up a cron job that will run let's say every hour. It will call a script that will fetch the league table and all other data from the website and store them in a flat file.
2) I will use Zend_Cache object to do the same, except the data will be stored in cached files. The cache will get updated about every hour as well.
Which is the better approach?
I think the answer can be found in why you want to cache the file. Is it to place minimal load on the external server by only updating the cache every so often, or is it to keep pages loading fast because the file takes long to download or process?
If it's only to respect the other server, and fetching/processing the page takes little noticeable time for the end user, I'd just implement Zend_Cache. It's simple; you don't have to worry about one script downloading the page and another script loading the downloaded data (plus the cron job).
If the cache is also needed because fetching/processing the page is significant, I'd still use Zend_Cache; however, I'd set the cache to expire every 2 hours and set up a cron job (or something similar) to manually update the cache every hour. Sure, this adds back the complexity of two scripts (or at least adding a request flag to manually refresh the cache), but should the cron job fail, you're still fine.
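A minimal sketch of the Zend_Cache approach (ZF1 API; fetchLeagueTable() is an illustrative fetch/parse function for the association's site):

<?php
require_once 'Zend/Cache.php';

$cache = Zend_Cache::factory(
    'Core',
    'File',
    array('lifetime' => 3600, 'automatic_serialization' => true), // expire hourly
    array('cache_dir' => '/tmp/cache/')
);

if (($table = $cache->load('league_table')) === false) {
    // Cache miss: fetch and parse the external page, then store the result
    $table = fetchLeagueTable();           // hypothetical fetch/parse function
    $cache->save($table, 'league_table');
}

// $table is now ready for rendering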
Well, if you choose 1 it somewhat adds complexity, because you have to use cron as well (not that cron is overly complex), and then you have to test that the data file is complete before using it, or deal with moving files from a temp location after they have been downloaded and parsed into the proper format.
If you use 2, it eliminates much of 1, except that on the request where the cache is dead you have to wait for the download/parse.
I would say 1 is the better option, but 2 is going to be easier to implement and less prone to error. That said, it's fairly trivial to implement things in the cron script to prevent the negatives I describe. So I would probably go with 1.

What's the best way to use the Twitter API via PHP?

A client would like me to add their Twitter stream to their website homepage, using a custom solution built in PHP.
The Twitter API obviously has a limited number of calls you can make to it per hour, so I can't automatically ping Twitter every time someone refreshes my client's homepage.
The client's website is purely HTML at the moment and so there is no database available. My solution must therefore only require PHP and the local file system (e.g. saving a local XML file with some data in it).
So, given this limited criteria, what's the best way for me to access the Twitter API - via PHP - without hitting my API call limit within a few minutes?
It will be quite easy, once you can pull down a timeline and display it, to then add some file-based caching to it.
check age of cache
Is it more than 5 mins old?
fetch the latest information
regenerate the HTML for output
save the finished HTML to disk
display the cached pre-prepared HTML
PEAR's Cache_Lite will do all you need on the caching layer.
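A minimal sketch with Cache_Lite, where buildTimelineHtml() is an illustrative function that fetches the timeline and renders the finished HTML:

<?php
require_once 'Cache/Lite.php';

$cache = new Cache_Lite(array(
    'cacheDir' => '/tmp/',
    'lifeTime' => 300,   // 5 minutes, as suggested above
));

if (($html = $cache->get('twitter_timeline')) === false) {
    // Cache missing or older than 5 minutes: rebuild and save it
    $html = buildTimelineHtml();   // hypothetical: call the Twitter API, render HTML
    $cache->save($html, 'twitter_timeline');
}

echo $html;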
a cron job (not likely - if there's not even a database, then there are probably no cron jobs either)
Write the microtime() to a file. On a page view, compare the current timestamp to the saved one. If the difference is greater than N minutes, pull the new tweet feed and write the current timestamp to the file.
If the front page is a static HTML file not calling any PHP, include an image <img src="scheduler.php"/> that returns a 1px transparent gif (at least that's how it was done when I was young) and does your Twitter pulling silently, as in the sketch below.
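A minimal sketch of that scheduler image (file names and the fetchTweets() helper are illustrative):

<?php
// scheduler.php - refreshes the cached feed at most once every N minutes
$stampFile = __DIR__.'/last_pull.txt';
$threshold = 5 * 60; // N = 5 minutes, in seconds

$last = file_exists($stampFile) ? (float) file_get_contents($stampFile) : 0;

if (microtime(true) - $last > $threshold) {
    $tweets = fetchTweets();                         // hypothetical Twitter API call
    file_put_contents(__DIR__.'/tweets.xml', $tweets);
    file_put_contents($stampFile, microtime(true));
}

// Serve a 1x1 transparent gif so the <img> tag renders nothing visible
header('Content-Type: image/gif');
echo base64_decode('R0lGODlhAQABAIAAAAAAAP///yH5BAEAAAAALAAAAAABAAEAAAIBRAA7');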
Or do you mean a local-local filesystem, as in "my/the customer's computer, not the server"-local?
In this case:
get some server with a cron job or scheduler and PHP
write a script that reads the feed and saves it to a file
write the file to the customer's server using FTP
display the feed using JavaScript (yes, Ajax also works with static files as data sources); jQuery or some such lib is great for this
or: create the tweet-displaying HTML file locally and upload it (but be careful, because you may overwrite updates on the server)
IMO: for small sites you often just don't need a fully grown SQL database anyway. Filesystems are great. A combination of scandir, preg_match and carefully chosen file names is often good enough.
And you can actually do a lot of front-end processing (like displaying XML) using beautiful JavaScript.
Since we don't know your server config, I suggest you set up a cron job (assuming you're on a Linux box). If you have something like cPanel on a shared hosting environment, then it should not be much of an issue. You need to write a script that is called by cron and that will get the latest tweets and write them to a file (XML?). You can schedule cron to run every 30 minutes or whatever you want.
You may want to use TweetPHP by Tim Davies. http://lab.lostpixel.net/classes/twitter/ - This class has lots of features including the one you want, showing your clients time line.
The page shows good examples on how to use it.
You can then put the output of this in a file or database. If you want a site visitor to trigger the update of the database or the file, say every 5 minutes, you can set a session variable holding a timestamp and only allow another update if the timestamp is at least 5 minutes old.
Hope this helps
My suggestion: create a small, simple object to hold the cache date and an array of tweets. Every time someone visits the page, it performs the following logic:
A) Does the file exist?
Yes: read it into a variable
No: proceed to step D)
B) Unserialize the variable (the PHP pair serialize()/unserialize() will do just fine)
C) Compare the age of the stored cache with the current time (a Unix timestamp will do it)
If they're more than 5 minutes apart:
D) Get the newest tweets from Twitter, update the object, serialize it and write it to the cache again. Store the newest tweets for printing, too.
If not: just read the tweets from the cache.
E) Print the tweets
The simplest and easiest way to serialize the object is the serialize()/unserialize() pair. If you're not willing to put in the effort to make the object, you could just use a 2D array; serialize() will work just fine. Have a look at http://php.net/serialize
Considering you have no cPanel access, it's the best solution, since you won't have access to PEAR packages, cron or any other simpler solutions.
array(
    'lastrequest' => 123,
    'tweets' => array()
)
Now in your code, put a check to see if the timestamp stored in the datastore under lastrequest is more than X seconds old. If it is, it's time to update your data.
Serialize and store the array in a file - pretty simple.
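A minimal sketch of that serialize-to-file approach, where getLatestTweets() is an illustrative stand-in for the actual Twitter API call:

<?php
$cacheFile = __DIR__.'/tweets.cache';
$maxAge    = 300; // X = 300 seconds

$store = file_exists($cacheFile)
    ? unserialize(file_get_contents($cacheFile))
    : array('lastrequest' => 0, 'tweets' => array());

if (time() - $store['lastrequest'] > $maxAge) {
    // Stale (or first run): hit the Twitter API and rewrite the cache
    $store['tweets']      = getLatestTweets();   // hypothetical API call
    $store['lastrequest'] = time();
    file_put_contents($cacheFile, serialize($store));
}

foreach ($store['tweets'] as $tweet) {
    echo '<p>'.htmlspecialchars($tweet).'</p>'; // assumes tweets are plain strings
}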
