I have a changing JSON file at http://91.205.205.18:8000/info.xsl?mount=/128 (it contains Icecast2 statistics), and I can't figure out how to detect when the JSON data changes so I can parse it and insert the parsed data into the database right away. Do I need to run some PHP script with cron every 3-5 minutes?
P.S. Sorry for my bad english ;)
Write a cron job that fetches the JSON data from http://91.205.205.18:8000/info.xsl?mount=/128 and then checks the received data against your database: if the data is already present in the db, discard it; otherwise, save it.
On the first run, store a timestamp in the database along with the imported JSON data (this time value is provided in the JSON). On each subsequent run you only need to compare the JSON timestamp against the stored one: if it differs, insert/update, otherwise ignore it. For this insert/update process you can create a cron job.
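A minimal sketch of such a cron script, assuming the feed exposes some timestamp field (the key `server_start` below is hypothetical) and a `stats` table with `stamp` and `payload` columns that you would create yourself:

```php
<?php
// cron_poll.php - run from cron, e.g. every 5 minutes:
// */5 * * * * php /path/to/cron_poll.php
// The timestamp key and the table/column names are assumptions.

$url  = 'http://91.205.205.18:8000/info.xsl?mount=/128';
$json = file_get_contents($url);
if ($json === false) {
    exit(1); // feed unreachable, try again on the next run
}

$data = json_decode($json, true);
if ($data === null) {
    exit(1); // not valid JSON
}

// Assume the feed carries a timestamp field; fall back to a content hash.
$stamp = $data['server_start'] ?? md5($json);

$pdo  = new PDO('mysql:host=localhost;dbname=radio', 'user', 'pass');
$last = $pdo->query('SELECT stamp FROM stats ORDER BY id DESC LIMIT 1')
            ->fetchColumn();

if ($last !== $stamp) {
    // Data changed since the last run: store the new snapshot.
    $ins = $pdo->prepare('INSERT INTO stats (stamp, payload) VALUES (?, ?)');
    $ins->execute([$stamp, $json]);
}
```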
You will have to set up a cron job to check at an interval. On a project I work on, we have part of the program polling constantly (every 2 seconds) to see whether some XML data on a website has changed. This is the only option if you don't control the source, as we do not.
If you truly need push notifications and you are in control of the source data, you can use a PubSubHubbub server:
https://code.google.com/p/pubsubhubbub/
I haven't set up a PubSubHubbub server myself, but it is the best solution if you need real-time data updates pushed from one source to a remote consumer.
Related
I have a big-data API (500,000 records), and each record contains a large JSON payload. I wrote code to read that API and store the data in the desired format in a MySQL table, and then send the data in JSON format to another API. Everything works, but the server runs out of memory or returns a 500 error after only about 100 records, and I then have to check my database for the last inserted id and update the code manually each time to run the import again.
My question is: how can I continuously import very large amounts of data without server outages or errors, perhaps in async mode or something else, so that I don't have to check manually after every 100 records? I am using the latest versions of PHP and MySQL on a DigitalOcean VPS. The data is very large and needs to be updated daily, so it is not practical for me to babysit this part every day.
Thanks in advance.
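One way to avoid the manual bookkeeping described above is to process the source in small batches and persist the "last processed" position after each batch, so a crashed run resumes automatically on the next invocation. A minimal sketch, where the source URL, its paging parameters, and the `checkpoint`/`records` table names are all assumptions:

```php
<?php
// import_batches.php - checkpointed batch import sketch (run from cron or CLI).
// Source URL, paging parameters, and table names are placeholders.

$pdo = new PDO('mysql:host=localhost;dbname=import', 'user', 'pass');

// Read the last processed offset (0 if we have never run before).
$offset    = (int) $pdo->query('SELECT last_offset FROM checkpoint WHERE id = 1')
                       ->fetchColumn();
$batchSize = 100;

while (true) {
    $url  = "https://api.example.com/records?offset=$offset&limit=$batchSize";
    $json = file_get_contents($url);
    if ($json === false) {
        break; // source unreachable, resume on the next run
    }

    $rows = json_decode($json, true);
    if (empty($rows)) {
        break; // no more records
    }

    $ins = $pdo->prepare('INSERT INTO records (id, payload) VALUES (?, ?)
                          ON DUPLICATE KEY UPDATE payload = VALUES(payload)');
    foreach ($rows as $row) {
        $ins->execute([$row['id'], json_encode($row)]);
    }

    $offset += count($rows);

    // Persist the checkpoint so a crash or restart picks up from here.
    $pdo->prepare('UPDATE checkpoint SET last_offset = ? WHERE id = 1')
        ->execute([$offset]);

    unset($rows, $json); // free memory between batches
}
```

Run from cron, a failed run simply continues from the stored offset next time, with no manual id lookup.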
I have a database table with 20 rows. Each row has a Boolean column that defaults to zero, and when a row gets viewed its value changes to 1.
I want the database to send some kind of signal so that when 10 rows have changed from zero to 1, a certain PHP file fires up and starts a process that affects only those 10 rows.
How can I do that?
Thanks in advance
I would say: have the PHP file query your database at a set interval.
The other way around, having the database execute a PHP file, is almost impossible.
If you are using MySQL as the database, a trigger could invoke the sys_exec() UDF available here: https://github.com/mysqludf/lib_mysqludf_sys#readme
So there might actually be a possibility, via a UDF that launches the PHP executable/script; not that easy, but it seems possible ;-)
Invoking PHP from MySQL is impossible; all you can do is set up a cron job for it. The cron job checks MySQL after a certain interval of time and runs the respective code.
A database is only storage, and that is its purpose in the system. Don't try to trigger any external process from the storage; communication with the storage should only go one way.
Instead, think about how to trigger your process from the outside. Generally, there are two approaches:
a script that checks your database at some interval, like 1 s, 10 s, 1 min, or whatever fits the particular process
the process that is already updating your data can check the data itself and trigger another process if needed.
You cannot trigger an external file/script from MySQL.
What you can do is create a cron job that runs at a certain interval, checks the database, and performs the required operations.
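A sketch of what such a cron script could look like for the 10-rows case; the `items` table and the `viewed`/`processed` columns are assumptions you would adapt to your schema:

```php
<?php
// check_flags.php - run from cron, e.g. every minute:
// * * * * * php /path/to/check_flags.php
// Table and column names are placeholders.

$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');

// Collect rows whose flag flipped to 1 and that have not been processed yet.
$ids = $pdo->query('SELECT id FROM items WHERE viewed = 1 AND processed = 0')
           ->fetchAll(PDO::FETCH_COLUMN);

if (count($ids) >= 10) {
    // At least 10 rows changed from 0 to 1: run the process on exactly those rows.
    $in = implode(',', array_fill(0, count($ids), '?'));

    // ... perform whatever work is needed on these rows here ...

    // Mark them as handled so the next cron run ignores them.
    $pdo->prepare("UPDATE items SET processed = 1 WHERE id IN ($in)")
        ->execute($ids);
}
```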
Currently I have a data file in Dropbox that is uploaded every 15 seconds. I want to take this data, which contains several different data types, and graph the real-time series that the user selects on a website. I have a data server, but my data is not on it. Is there any way for me to take the data from the file and graph it, while also having a control panel that selects which data to graph?
You can refresh your web page using Ajax. Note that if your refresh is set to every 15 seconds and your data comes in every 15 seconds, the worst case is that you will show data that's almost 30 seconds old if the timing of the data update and the Ajax refresh is unfortunate.
You probably want to check for new data using Ajax more frequently, depending on your specific needs. On the server side, cache the result of the Ajax update to avoid too much duplicate processing.
To create the data that you return from the Ajax query, open and process the data file. No need for MySQL. You can use the timestamp of the file to invalidate the result cache suggested in the previous paragraph.
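A sketch of such an Ajax endpoint; the data-file path, the cache path, and the assumed CSV layout are placeholders:

```php
<?php
// data.php - Ajax endpoint returning the latest data as JSON.
// File paths and the CSV column layout are assumptions.

$dataFile  = '/path/to/dropbox/data.csv';
$cacheFile = '/tmp/data_cache.json';

$mtime = filemtime($dataFile);

// Reuse the cached result if the data file has not changed since we built it.
if (is_file($cacheFile) && filemtime($cacheFile) >= $mtime) {
    header('Content-Type: application/json');
    readfile($cacheFile);
    exit;
}

// Otherwise parse the raw file into the structure the chart expects.
$series = [];
foreach (file($dataFile, FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES) as $line) {
    [$time, $type, $value] = explode(',', $line); // assumed "time,type,value" rows
    $series[$type][] = ['t' => $time, 'v' => (float) $value];
}

$json = json_encode($series);
file_put_contents($cacheFile, $json);

header('Content-Type: application/json');
echo $json;
```

The control panel can then be a simple form whose selected series name is sent as a query parameter and used to pick one key out of `$series`.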
There are many JavaScript-based charting libraries that can update via Ajax. Here's a good starting point:
Graphing JavaScript Library
I need to check for updates at a (max) one-second interval.
I'm now looking for the fastest way to do that, using AJAX for the requests with PHP and MySQL.
Solution 1
Every time new data that needs to be retrieved by other clients is added to the MySQL database, a file.txt is updated with a 1. AJAX makes a request to a PHP file which checks whether file.txt contains a 1 or a 0. If it contains a 1, it gets the data from the MySQL database and returns it to the client.
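For illustration, a sketch of the PHP side of Solution 1; the flag-file path and the query are placeholders, and how the flag gets reset to 0 is left to the original design:

```php
<?php
// poll.php - Solution 1 sketch: consult the flag file before touching MySQL.
// Flag-file path and query are placeholders; resetting the flag is not shown.

$flagFile = __DIR__ . '/file.txt';

if (trim(file_get_contents($flagFile)) !== '1') {
    // Nothing new: answer cheaply without opening a database connection.
    header('Content-Type: application/json');
    echo json_encode(['new' => false]);
    exit;
}

// The flag says there is new data: now hit the database.
$pdo  = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');
$rows = $pdo->query('SELECT * FROM messages ORDER BY id DESC LIMIT 50')
            ->fetchAll(PDO::FETCH_ASSOC);

header('Content-Type: application/json');
echo json_encode(['new' => true, 'data' => $rows]);
```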
Solution 2
Every AJAX request calls a PHP file which checks the MySQL database directly for new data.
Solution ..?
If there is a faster solution I'd be happy to know about it! (Considering I can only use PHP/MySQL and AJAX.)
Avoiding the database will probably not make the process significantly faster, if at all.
You can use a comet-style AJAX request to get near real-time polling. Basically, create an AJAX request as usual to a PHP script, but on the server side you poll the database and sleep for a short interval if there is nothing new. Repeat until there is something of interest for the client. If nothing appears within a timeframe of e.g. 60 seconds, close the connection. On the client side, you only open a new connection once the first has terminated (either with a response or a timeout).
See: https://en.wikipedia.org/wiki/Comet_(programming)
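A minimal long-polling sketch of that idea; the `messages` table, the `last_id` parameter, and the timing values are placeholders:

```php
<?php
// longpoll.php - Comet-style long polling sketch.
// Table name, the last_id parameter, and the timings are placeholders.

set_time_limit(70); // let the request live slightly longer than the loop below

$pdo    = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');
$lastId = (int) ($_GET['last_id'] ?? 0);
$start  = time();

while (time() - $start < 60) { // give up after ~60 seconds
    $stmt = $pdo->prepare('SELECT id, body FROM messages WHERE id > ? ORDER BY id');
    $stmt->execute([$lastId]);
    $rows = $stmt->fetchAll(PDO::FETCH_ASSOC);

    if ($rows) {
        // Something new: answer immediately and let the client reconnect.
        header('Content-Type: application/json');
        echo json_encode($rows);
        exit;
    }

    usleep(500000); // nothing yet: wait half a second and poll again
}

// Timeout: return an empty result so the client simply opens a new request.
header('Content-Type: application/json');
echo json_encode([]);
```

The client remembers the highest id it received and passes it back as `last_id` on the next request, so only genuinely new rows are returned.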
I was just wondering how PHP behaves in the background.
Say I have a PHP script which creates an array and populates it with names.
$names = Array("Anna", "Jackson" .... "Owen");
Then I have an input field which sends its value to the PHP script on every keypress, to check for names containing that value.
Will the array be created on every call? I also sort the array before looping through it, so the output will be alphabetical. Will this take up time in the AJAX call?
If the answer is yes, is there some way around that, so the array is ready to be looped over on every call?
There's no difference between an AJAX request and a "normal" HTTP request, so yes, a new PHP instance will be created for each request. If the time it takes to parse the script(s) is a problem, you can use something like APC.
If those arrays are created at runtime and the time this takes is a problem, you might store and share the values between requests in something like memcache.
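For illustration, a sketch of caching the sorted list between requests with APCu (the user-land successor of the APC cache mentioned above); the cache key and the 300-second TTL are arbitrary choices:

```php
<?php
// names.php - cache the sorted name list between requests with APCu.
// Cache key and TTL are arbitrary; the name list stands in for the real data.

$names = apcu_fetch('sorted_names', $hit);

if (!$hit) {
    // First request (or cache expired): build and sort the array once...
    $names = ["Anna", "Jackson", /* ... */ "Owen"];
    sort($names);

    // ...then keep it in shared memory for subsequent requests.
    apcu_store('sorted_names', $names, 300);
}

$term    = isset($_GET['q']) ? $_GET['q'] : '';
$matches = array_values(array_filter(
    $names,
    function ($n) use ($term) {
        // Keep names containing the typed value (empty term matches everything).
        return $term === '' || stripos($n, $term) !== false;
    }
));

header('Content-Type: application/json');
echo json_encode($matches);
```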
No matter what method you use to create the array, whether it's hard-coded, pulled out of a database, a text file, or any other source, when the web server gets an HTTP request (whether Ajax or not) it starts executing the PHP script, allocates its memory, and the array is created.
There's only one entry point for a PHP script, and it's the first line, reached when an HTTP request points to it (or when another script includes it, which amounts to the same thing).
As far as I know, it will have to create the array each time, since the AJAX call makes a new server request on each keypress in the input field. Each server request will create the array if the script is written to do so.
A better method would be to use a database to store the names.
Yes, it will be created and destroyed every time you run the PHP script.
If this is a problem you could look at persisting the data somewhere (e.g. in a session or in a database), but I would ask whether it is really causing enough of a performance problem to be worth doing.
(This is not a direct answer to your question, but it can help if you are concerned about performance.)
You say this:
Then I have an input field which sends its value to the PHP script on every keypress
In this case, it is common practice not to send the request to the server as soon as a key is pressed: you generally wait a short delay (between 100 ms and 150 ms, I'd say) to see whether another keypress arrives in that interval:
if the user types several keys, they usually type faster than the delay you are waiting, so you only send a request for the last keypress, not for every keypress
if the user types 4 letters, you only make 1 request instead of 4, which is great for the health of your server :-)
as for the time perceived by the user: waiting 100 ms plus the round trip to the server and the script's run time is almost the same as not waiting the 100 ms first, so it's not bad for the user
As a side note: if your list of data is not too big (20 names is definitely OK; 100 names would probably be OK; 1,000 might be too much), you could store it directly as a JavaScript array and skip the Ajax request entirely: it's the fastest way (no client-server round trip), and it won't load your server at all.
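A sketch of that side note, assuming the list is built server-side in PHP and emitted into the page with json_encode (the names and markup are placeholders):

```php
<?php
// page.php - embed the (small) name list directly into the page,
// so the keypress filtering can happen entirely in the browser.
// The $names array stands in for however the list is actually built.

$names = ["Anna", "Jackson", /* ... */ "Owen"];
sort($names);
?>
<input type="text" id="search">
<script>
  // json_encode() produces a valid JavaScript array literal.
  var names = <?= json_encode($names) ?>;
  // ... filter `names` on keypress here, with no server call needed ...
</script>
```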