PHP: pass JSON to Vue every x seconds

What I would like to do is push JSON data from a PHP script to Vue (in the view of an MVC PHP framework), comparable to what you see on an online exchange where the data updates in near real time, i.e. without a page reload.
Now, I understand you can use fetch to pull data (and probably set an interval so it does that every x seconds), but I would like the UI to update in (near) real time as the PHP script outputs the data. How can I do this? Preferably without a third-party service like pusher.com.
Some details about the UI:
It's about 10 to 30 rows in a <table>
In each row there is one status label and a few numbers that get updated
There are a few other generic elements on the page that need updating (e.g. an overall status label)
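Whichever transport ends up carrying the data (polling, Server-Sent Events, or WebSockets), the Vue side mostly reduces to merging each incoming JSON payload into the component's reactive row array, so only the changed cells re-render. A minimal sketch of that merge, assuming a hypothetical payload shape of `{rows: [...], overall_status: ...}` (not specified in the question):

```javascript
// Merge an incoming update payload into the current row state.
// Rows are matched by id; unknown ids are appended.
// The payload shape ({rows, overall_status}) is an assumption.
function applyUpdate(state, payload) {
  const byId = new Map(state.rows.map(r => [r.id, r]));
  for (const row of payload.rows) {
    const existing = byId.get(row.id);
    if (existing) {
      Object.assign(existing, row);   // update status label and numbers in place
    } else {
      state.rows.push(row);
    }
  }
  if (payload.overall_status !== undefined) {
    state.overallStatus = payload.overall_status;
  }
  return state;
}

// Example: one existing row is updated, one new row is appended.
const state = { rows: [{ id: 1, status: 'open', price: 10 }], overallStatus: 'ok' };
applyUpdate(state, {
  rows: [{ id: 1, price: 11 }, { id: 2, status: 'open', price: 5 }],
  overall_status: 'live',
});
```

In a Vue component this would run inside the fetch/SSE callback, with `state` being the component's reactive data, so the table cells update without any page reload.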

Related

Pull API data real-time and publish

Info about the website I'm working on:
It's a website for live soccer scores
An API provides real-time data for the scores (they offer no webhooks)
What I want:
I want to deliver the real-time score to the frontend
I also want to save that data temporarily in Redis and, after the match finishes, push it from Redis to the database.
Preferably without any external JS libraries (socket.io, Pusher, etc.). Laravel Broadcasting + Redis is my preferred route, since then I won't need Pusher or socket.io client code to load.
Parts of this problem:
Part 1: Pulling external API data into the database (or Redis).
--> So far, the only way I've managed to pull data from the API into the database is by creating a route that triggers the load from the external API. This is useless as it stands, because the live-score data in the API is updated almost every second, so I would have to trigger the route (or refresh the URL) every second just to pull data from the API. Also, it takes a minimum of 2-3 seconds just to transfer the API data to the database completely. This part does not depend on whether a user (the frontend) is requesting; it should do its job even if there are 0 users online.
So, my question is: what is the best, most efficient, and complete way to pull the API data in real time and keep it in Redis until the match finishes? (We can tell a match's status from the API data, e.g. {"match_id": {"status": "finished"}}.) After the match finishes, I will push the Redis data to the database.
Part 2: Publishing that data in real time from the database (or Redis).
-> Okay, this one is fairly easy for me compared to part 1; I've already found ways to publish Redis data in real time via Pusher and socket.io. But beyond those, what is the best way in my scenario? Also, do I need any JS libraries if I use a combination of Redis + Laravel Broadcasting?
Thank you for your suggestion!
Possible answer for part 1:
I would use Task Scheduling to run an Artisan console command ->daily(); or ->hourly(); to check when the next soccer match is and write a record to a matches table, or update existing records in case the start/end time changes.
Another console command on ->cron('* * * * *'); (every minute, which is cron's smallest unit; for sub-minute polling, loop and sleep inside the command) can then check the matches table; if the current time is between a match's starts_at and ends_at, retrieve the realtime data.
To prevent multiple executions of the command at the same time (if for some reason an API call takes a bit longer), Symfony's LockableTrait can be used.
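The gating check in the per-minute command ("is the current time between starts_at and ends_at?") is simple; here it is sketched in JavaScript with hypothetical field names (in Laravel it would live in the command's handle() method, typically as a query on the matches table):

```javascript
// Decide whether a match is live, i.e. whether this run of the
// polling command should hit the external API. Times are Unix seconds.
function matchIsLive(match, now) {
  return now >= match.startsAt && now <= match.endsAt;
}

// Filter the matches table down to the matches that are live right now.
function liveMatches(matches, now) {
  return matches.filter(m => matchIsLive(m, now));
}

const matches = [
  { id: 1, startsAt: 100, endsAt: 200 },
  { id: 2, startsAt: 300, endsAt: 400 },
];
liveMatches(matches, 150); // → only match 1
```

If no match is live, the command ends immediately, so the scheduler costs almost nothing between matches.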

PHP/Drupal: How to detect changes to a table in the database

Current situation: I have a web page that uses AJAX/jQuery to refresh all the content on the page every 17 seconds. Every time this happens, the server queries the database for data from two tables, one of which is large (450 MiB, 11 columns), and processes all the data.
This is too resource-intensive. I want:
The server to query the database only when one of the two tables has changed.
The page to then reload its content through AJAX only when the tables have been updated and the server has re-processed the data.
I think this falls under the category of Comet programming, but I'm not sure.
2 is easy. The web page calls update.php every 17 (or maybe fewer) seconds. The PHP script returns no data if no changes have been made; only if data is returned is the current content replaced with the new data. Please advise me if there is a better way.
As for 1, my googling tells me that every time one of my two tables is updated, I can put a notification in a table (or maybe just a single byte in a file) to indicate that I must query the database again, and then return the data the next time the web page sends an AJAX request.
The problem is that I'm working with a rather large code base I didn't write, and I don't know all the places where either of the two tables may be updated. Is there an easier way to check when the database is modified?
I'm working with PHP, Apache, Drupal and MYSQL.
You can check out Server-Sent Events.
A server-sent event is when a web page automatically gets updates from a server.
There is an excellent article on HTML5rocks.com: Server Sent Events
All you have to do is create an EventSource object:
var source = new EventSource('xyz.php'); // your PHP file, which will return the updates
Once you create the object, you can listen for its events:
source.addEventListener('message', function(e) {
  console.log(e.data);
}, false);
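On the server side, the PHP file (xyz.php in the snippet above) has to send a `Content-Type: text/event-stream` header and emit messages in the SSE wire format: optional `event:` line, one `data:` line per line of payload, and a blank line to terminate each message. The framing itself is trivial; here it is sketched in JavaScript purely for illustration (in practice it would be built with string concatenation and echo in PHP):

```javascript
// Build a Server-Sent Events frame for a payload object.
// Multi-line payloads must become one "data:" line per line.
function sseFrame(payload, eventName) {
  let frame = '';
  if (eventName) frame += 'event: ' + eventName + '\n';
  for (const line of JSON.stringify(payload).split('\n')) {
    frame += 'data: ' + line + '\n';
  }
  return frame + '\n';               // blank line ends the frame
}

sseFrame({ score: '1-0' });          // → 'data: {"score":"1-0"}\n\n'
```

The browser's EventSource reassembles these frames and fires the 'message' (or named) event with `e.data` set to the payload string, which the client can then `JSON.parse`.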

Data-aware PHP components?

Many desktop applications (e.g., those built with Delphi) use "database-aware components", such as a grid, which display the contents of a database (usually the result of a query) and auto-update their display when the database contents change.
Does such a thing exist for PHP (maybe displaying the result of a query in an HTML table, and updating)?
If not, how would we go about creating it?
(Note: This seemingly similar question didn't help me)
It's not technically possible to update an HTML page once rendered with pure PHP, because of the request/response nature of the HTTP protocol, so any solution will have to include JavaScript and AJAX calls.
Emulating it: using AJAX to re-render the table every 5 seconds
It wouldn't be hard to emulate, though: just make a PHP page that fetches the results from the database and puts them in a table, then use jQuery's .load() function to fetch that table and render it in a DIV of your choice on an interval of 5 seconds (or whatever).
Something like:
var url = 'renderTable.php';

function updateTable() {
  $('#tableDiv').load(url);
}

setInterval(updateTable, 5000);
You can put this in any PHP (or HTML) page that has a DIV with id tableDiv, and it will render the output of renderTable.php in that div every 5 seconds without refreshing the page.
Actually monitoring the database
This is possible. You'd have to set up a PHP file on a cron (or an AJAX call every 5 seconds) to run something like SELECT MD5(CONCAT(n_rows, '_', last_update)) AS checksum FROM mysql.innodb_table_stats WHERE database_name='db' AND table_name='some_table'; (assuming InnoDB; note that last_update only changes when the persistent statistics are recalculated, so it can lag behind the actual writes). You'd then compare this to the previous checksum.
If the checksums are identical, you can return 'false' to your modified AJAX call, which tells it not to render anything over the table. If they aren't, you can return the HTML of the new table to render in its place.
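On the client, the modified AJAX callback then only has to distinguish the "nothing changed" response from fresh HTML. A sketch of that branch, assuming (as the answer suggests) the endpoint literally returns the string 'false' when the checksum is unchanged:

```javascript
// Decide what to do with the polling response:
// null means "no change, leave the table alone",
// anything else is the new table HTML to render.
function handlePollResponse(body) {
  if (body === 'false' || body === '') return null;
  return body;
}

handlePollResponse('false');            // → null
handlePollResponse('<table></table>');  // → the new table HTML
```

Skipping the DOM replacement on the 'false' branch is what saves the repeated re-render and the server-side re-processing.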
It could be done, but with a mix of different technologies.
If I wanted to monitor changes made in a database in real time, I would think about triggers and sockets: a trigger in the database calls (on insert or update) a function that adds an event to a queue. Here's an example function for PostgreSQL (plsh is a custom language handler):
CREATE FUNCTION propagate_event(tablename text) RETURNS text AS '
#!/bin/sh
# execute a script that adds the event to a message queue
/var/scripts/change_in_table.sh "$1"
' LANGUAGE plsh;
The client then connects to a socket and receives those events in real time.

Is there a standard way to randomly search my database items with a cron until complete, then reset in PHP?

I already have a screen scraper built using PHP cURL, tied to a MySQL database. I have stored products that need to be updated weekly, rather than what I have now (a form where I input the URL/product and hit go).
My first thought would be to run a standard cron job every 30 minutes against a PHP file.
I would like to randomize two things: first, the delay before the PHP script actually accesses the source site (i.e. 0-20 minutes), so the process timing is random; second, the order in which I access my target items/pages, while making sure I get all of them weekly and/or consistently before cycling through the list again.
The timer is fairly straightforward and needs no storage of data, but how should I keep track of my items/URIs in this fashion? I was thinking of a second cron to clear the data while the first just increments, but I still have to set flags for what was already updated, and I'm just not familiar enough to choose where and how to store this data.
I am using MySQL, with HTML5 options, on CodeIgniter, so I can also hold data in SQLite, along with cookies if that makes sense. A couple of questions on this part: do I query my database (MySQL) for what I need every time, or do I store a JSON file once a week and run from that? This obviously depends on (and/or determines) where I flag what was already processed.
You have a list of items to scrape in your MySQL database. Ensure that there is a field that holds the last time each item was scraped.
Set a cron job to run every minute with this workflow:
Ensure that the previous run of the script has completed (see step #4). If not, end.
Check last time you scraped any item.
Ensure enough time has passed (see step #9). If not, end.
Set a value somewhere to show that you are processing (so step #1 of subsequent runs is aware).
Select an item to scrape at random. (from those that haven't been scraped in n time.)
Delay random interval of seconds to ensure all requests aren't always on the minute.
Scrape it.
Update time last scraped for that item.
Set a random time to wait before next operation (so step #3 of subsequent runs is aware).
Set a value to show that you are not processing (so step #1 of subsequent runs is aware).
End.
Once all items have been scraped, you can set a variable to hold the time the batch was completed and use it for n in step #5.
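Steps 5 and 6 above (pick a random eligible item, then wait a random interval) can be sketched as follows; the field names, the one-week threshold, and the delay bounds are illustrative assumptions, not part of the answer:

```javascript
// Step 5: pick a random item among those not scraped within maxAgeSeconds.
// Returns null when every item is fresh, i.e. the batch is complete.
function pickItem(items, now, maxAgeSeconds) {
  const eligible = items.filter(i => now - i.lastScraped >= maxAgeSeconds);
  if (eligible.length === 0) return null;
  return eligible[Math.floor(Math.random() * eligible.length)];
}

// Steps 6/9: random wait in seconds between bounds, so requests
// never land on a predictable schedule.
function randomDelay(minSeconds, maxSeconds) {
  return minSeconds + Math.floor(Math.random() * (maxSeconds - minSeconds + 1));
}

const WEEK = 7 * 24 * 3600;
const items = [
  { id: 1, lastScraped: 0 },       // stale: due for scraping
  { id: 2, lastScraped: 1000000 }, // scraped recently
];
pickItem(items, 1000500, WEEK);    // → item 1
```

Because eligibility is derived from the last-scraped timestamp rather than a separate flag, there is nothing to reset between batches: once every item is fresh, pickItem returns null until items age past the threshold again.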

How to update real-time data with PHP and MySQL

Currently I have a data file in Dropbox that is updated every 15 seconds. I want to take this data, which has several different data types, and graph in real time whichever data the user selects on a website. I have a data server, but my data is not on there. Is there any way to take this data from the file and graph it, while also having a control panel that selects which data to graph?
You can refresh your web page using Ajax. Note that if your refresh is set to every 15 seconds and your data comes in every 15 seconds, worst-case is that you will show data that's almost 30 seconds old if the timing of the data update and the Ajax refresh is unfortunate.
You probably want to check for new data using Ajax more frequently, depending on your specific needs. On the server side, cache the result of the Ajax update to avoid too much duplicate processing.
To create the data that you return from the Ajax query, open and process the data file; there is no need for MySQL. You can use the timestamp of the file to invalidate the result cache suggested in the previous paragraph.
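The mtime-based cache invalidation suggested above can be sketched as follows; the structure of the cache entry is a hypothetical choice for illustration:

```javascript
// The cache is valid only if it was built from the data file's
// current modification time.
function cacheIsFresh(cache, fileMtime) {
  return cache !== null && cache.builtFromMtime === fileMtime;
}

// Return the cached result, or rebuild it by processing the file.
function getData(cache, fileMtime, processFile) {
  if (cacheIsFresh(cache, fileMtime)) return cache;
  return { builtFromMtime: fileMtime, data: processFile() };
}

let cache = null;
cache = getData(cache, 100, () => 'parsed');  // cache miss: file is processed
// Same mtime: the cached result is served, processFile is never called.
cache = getData(cache, 100, () => { throw new Error('should not reprocess'); });
```

With this in place, the Ajax endpoint can be polled more often than the file updates (as the answer recommends) without re-processing the file on every request.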
There are many JavaScript based charting libraries that can update via Ajax. Here's a good starting point:
Graphing JavaScript Library
