I currently have a MySQL database which I was hoping to use to store regularly updated data from a temperature sensor connected to the internet.
I currently have a page that, when opened, grabs the current temperature and the current timestamp and adds them as an entry to the database, but I was looking for a way to do that without me refreshing the page every 5 seconds.
Detail:
The data comes from an Arduino Ethernet, posted to an IP address.
Currently, I'm using cURL to grab the data from the IP, add a timestamp and save it to the DB.
Obviously this only updates when the page is refreshed (it uses PHP).
Here is a live feed of the data - http://wetdreams.org.uk/ChrisProject/UI/live_graph_two.html
TL;DR - Basically I need a middle man to grab the data from the IP and post it to a MySQL database.
Edit: Thanks for all the advice. There might be a little bit of confusion: I'm looking for a solution that (ideally) doesn't require a computer to be on at all (other than the server containing the database). Since I'm looking to store data over long periods of time (weeks), I'd like to set it up and leave a script running on the server (or Arduino) that gets the temp and posts it to the database.
In my head I would like to have a page on the server that automatically (without any browser open, or any other prompting other than a timer) calls a PHP script.
Hope that clears things up!
You can post directly to the web server from your Arduino using the Arduino EthernetClient (click the link below for an example).
POST to /insertData.php - in insertData.php, use $_POST["tempCaptured"] to get the temp value and insert it into the db (see the sketch after the link below).
A good article on using the Arduino EthernetClient: http://interactive-matter.eu/how-to/arduino-http-client-library/
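A minimal sketch of what insertData.php could look like on the receiving end; the readings table, its columns and the DB credentials are assumptions to adapt:

<?php
// insertData.php - receives the Arduino's POST and stores the reading
$pdo = new PDO('mysql:host=localhost;dbname=sensors', 'user', 'pass');

if (isset($_POST['tempCaptured'])) {
    $stmt = $pdo->prepare(
        'INSERT INTO readings (temp, created_at) VALUES (?, NOW())'
    );
    $stmt->execute(array(floatval($_POST['tempCaptured'])));
}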
Write a script (ping.php) which pings this URL at fixed intervals.
Now, set up a cronjob which runs this script at fixed intervals.
Your cron entry can be 0 */2 * * * php /path/to/ping.php, which will run every 2 hours.
Your ping.php file will connect to the live feed, grab the data and store the results in the db:
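A rough sketch of ping.php along those lines; the sensor URL, table name and DB credentials are placeholders:

<?php
// ping.php - run by cron: fetch the current temperature and store it
$ch = curl_init('http://192.0.2.1/'); // placeholder: your sensor's IP
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
$temp = curl_exec($ch);
curl_close($ch);

if ($temp !== false) {
    $pdo  = new PDO('mysql:host=localhost;dbname=sensors', 'user', 'pass');
    $stmt = $pdo->prepare('INSERT INTO readings (temp, created_at) VALUES (?, NOW())');
    $stmt->execute(array(floatval($temp)));
}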
If I understand the problem, you just need a replacement for refreshing the web page every 5 seconds, rather than a different way of getting the data?
I would set up an AJAX connection and have the PHP run in a kind of infinite loop, echoing new data back to your JavaScript to update the graph. The PHP loop would have to have a timeout check to eventually close the script (see the sketch below).
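As a rough sketch of that loop (the echoed timestamp is just a stand-in for real data from the database):

<?php
// Long-poll sketch: keep the connection open, emit data every 5 seconds,
// and close the script after roughly a minute.
set_time_limit(70);
$start = time();
while (time() - $start < 60) { // the timeout check that eventually ends the loop
    echo date('H:i:s'), "\n";  // stand-in: fetch and echo the newest reading here
    @ob_flush();               // flush PHP's output buffer (if one exists)...
    flush();                   // ...then push the output on to the client
    sleep(5);                  // 5 seconds between updates
}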
In this case, the best solution was using the Arduino to post a request to a PHP script which did its thing and added the retrieved value to a database. The way I have it running just now (for simplicity's sake) is with a GET request from the Arduino using the EthernetClient.
Code:
char server[] = "www.example.com";
and
if (client.connect(server, 80)) {
  client.println("GET /test.php?input_val=99 HTTP/1.0"); // sensor value sent as a query parameter
  client.println("Host: www.example.com");
  client.println("Connection: close");
  client.println(); // blank line terminates the HTTP request headers
}
However, since I'd already built my website around the fact that the data was posted to an independent server/IP, I opted to use cron to schedule a task. In this case I wanted to update my DB every 5-10 seconds, but I'm piggybacking on somebody else's server, so I contacted the server owner and asked them to set up a cron job calling "/mysubdir/cron_update.php" every minute (the shortest interval cron allows). From there I did a bit of 'ScriptCeption': within the PHP script, I made my calls every 10 seconds for a minute before finishing the script.
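For reference, a sketch of that 'ScriptCeption' script; updateTemperature() is a stand-in for the existing fetch-and-insert code:

<?php
// cron_update.php - called by cron once a minute
function updateTemperature() {
    // stand-in: grab the temperature and INSERT it into the DB
}

for ($i = 0; $i < 6; $i++) { // six passes, roughly one every 10 seconds
    updateTemperature();
    if ($i < 5) {
        sleep(10); // wait between passes, but not after the last one
    }
}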
Thanks to everybody who helped me out, I'm posting this here as a complete answer and explanation, because technically everybody was correct.
You can use JavaScript to auto-refresh the page:
<script type="text/javascript">
function timedRefresh(timeoutPeriod) {
    setTimeout(function () {
        location.reload(true); // true forces a reload from the server
    }, timeoutPeriod);
}
// e.g. call timedRefresh(5000) on page load to reload every 5 seconds
</script>
A better solution would be AJAX, sending the data regularly in the background to the database.
Another way is to use a PHP CLI script, scheduled with a cron job, that gets the values from your sensor automatically.
I have a PHP script to pull user specific data from a 3rd party source and dump it into a local table, which I want to execute every X mins when a user is logged in, but it takes about 30 seconds to run, which I don't want the user to experience. I figured the best way of doing this would be to timestamp each successful pull, then place some code in every page footer that checks the last pull and executes the PHP script in the background if it was more than X minutes ago.
I don't want to use a cron job to do this, as there are global and session variables specific to the user that I need when executing the pull script.
I can see that popen() would allow me to run the PHP script, but I can't seem to find any information relating to whether this would be run as the visitor within their session (and therefore with access to the user specific global or session variables) or as a separate session altogether.
Will popen() solve my problem?
Am I going about this the right way or is there a better method to execute the script at intervals while the user is logged in?
Any help or advice is much appreciated, as always!
Cheers
Andy
Maybe it's an idea to put the session data in a table as well.
That way you can easily access it from your background process. You only have to pass session_id() or the user id as an argument so the process knows which user it is currently processing (see the sketch below).
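For example (a sketch; pull_data.php is a hypothetical worker that reads that table):

<?php
// Kick the pull off in the background, handing over the session id so the
// worker knows which user it is processing.
session_start();
exec('php /path/to/pull_data.php ' . escapeshellarg(session_id())
    . ' > /dev/null 2>&1 &');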
No, because PHP still needs to wait on the process started by popen() before it can exit.
Probably not, because the solution doesn't fit the architectural constraints of the system
Your reasoning for not using a cron job isn't actually sound. You may have given up on exploring that particular path too quickly and backed yourself into a corner trying to fit a square peg into a round hole.
The real problem you're having is figuring out how to do inter-process communication between the web request and the PHP running from your crontab. This can be solved in a number of ways, and more easily than trying to work against PHP's architectural constraints.
For example, you can store the session ids in your database or some other storage, and access the session files from the PHP script running in your crontab, independently of the web request.
Let's say you determine whether a user is logged in based on the last request they made to your server, storing that as a timestamp in the user table along with the current session id.
The cron job can startup every X minutes, look at the database or persistence store for any records that have an active timestamp within the given range, pull those session ids and do what's needed.
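A sketch of that crontab script; the users table with its session_id and last_active columns is an assumption:

<?php
// cron_pull.php - run every X minutes from the crontab
$pdo  = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');
$rows = $pdo->query(
    'SELECT id, session_id FROM users
     WHERE last_active > NOW() - INTERVAL 15 MINUTE'
);
foreach ($rows as $row) {
    // stand-in: run the 30-second third-party pull for this user,
    // using $row['session_id'] to load their session data if needed
    echo 'pulling for user ' . $row['id'] . "\n";
}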
Now the question of how do you actually get this to scale effectively if the processing time takes more than 30 seconds can be solved independently as well. See this answer for more details on how to deal with distributed job managers/queues from PHP.
I would use JavaScript and AJAX requests to call the script from the frontend and handle the result.
You could then set the interval in JavaScript and send an AJAX request on each interval tick.
AJAX is what you need:
http://api.jquery.com/jquery.ajax/
Then use the "success" method to do something after those 30 seconds.
I have a PHP script that scrapes data from a bunch of websites and stores it in a db. What I want to happen is: instead of having the PHP load at every connection, I want to run it on a 10-minute interval and store the data it gets in a DB, so I can instantly retrieve the info instead of having the PHP run every time, which takes up time. I don't know AJAX well and would like to keep it as PHP/MySQL as possible. Any help is appreciated.
TL;DR: Want PHP to save data to a db every 10 minutes, then output that db the same way until it gets overwritten, instead of loading new data on every refresh.
Basic options are as follows. No need (or use) for AJAX here.
1. Make a cron job / scheduled task (Linux / Windows) that calls your script at intervals (see the sketch after this list).
2. Add a timed JavaScript browser refresh to your PHP script. See here for how.
3. Use a browser plugin like "Easy Auto Refresh" (Chrome) or "ReloadEvery" (Firefox).
The first one is the cleanest way and spares you from keeping a browser tab open.
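As a sketch of the first option, with the scraping itself and the cache table as stand-ins: one script run by cron does the slow work, and the page only ever reads the table.

<?php
// scrape.php - crontab entry: */10 * * * * php /path/to/scrape.php
$data = file_get_contents('http://example.com/'); // stand-in for the real scraping
$pdo  = new PDO('mysql:host=localhost;dbname=scraper', 'user', 'pass');
$stmt = $pdo->prepare('REPLACE INTO cache (id, payload, updated_at) VALUES (1, ?, NOW())');
$stmt->execute(array($data));

// The page itself then just runs: SELECT payload FROM cache WHERE id = 1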
I have a basic server in PHP, and the mobile device sends data to it which is saved every 1 minute.
But whenever my mobile phone loses its connection, I would like the server to notice: if the mobile does not insert anything into the database for longer than 2 minutes, I would like to call a function on the server saying that the mobile lost its connection. Every time the mobile phone sends data to the server, the timer should be reset.
I am not familiar with PHP, but I searched and couldn't find anything similar. I am sure there must be an easy way of doing it, like setting a listener or creating a countdown timer.
You can't set a timer in your PHP code as it runs only when a client requests a page.
As you don't have access to cron jobs, you can't do it from the CLI side either.
However, some web services allow you to periodically call a page of your choice, so you can save() on every http://something/save.php call. Take a look at http://www.google.fr/search?ie=UTF-8&q=online+cron for more information.
Note that if someone gets the save() URL, they can easily overload your server. Try to secure it as much as possible, maybe with a user/password combination passed as parameters.
Finally I got it working. What I learned is that the server can't do anything if there is no request :) So I created a timestamp field in the database and update it with the current time on each request. Of course, before updating, the script fetches the field and compares it with the current time to see when the last request was and works out the time difference; if the difference is bigger than 2 minutes, it changes the position. I hope this helps other people as well.
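In code, that idea looks roughly like this (table and column names are assumptions):

<?php
// save.php - called by the mobile on every insert
$pdo  = new PDO('mysql:host=localhost;dbname=tracker', 'user', 'pass');
$last = $pdo->query('SELECT last_seen FROM devices WHERE id = 1')->fetchColumn();

if ($last !== false && time() - strtotime($last) > 120) {
    // more than 2 minutes since the previous request:
    // stand-in for "change the position" / flag the lost connection
}

$pdo->exec('UPDATE devices SET last_seen = NOW() WHERE id = 1');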
I'm wondering if it is possible to push an XML file update from the server to all client browsers?
Basically my proposed situation is that my server holds an XML file; when a user loads a page that uses said XML file, they can request to change it, and if the change is allowed (determined client side by the page) then the XML file is updated on the server side. I'm fine up to this point (well, I have plenty of reading to get me to this point). Then I want all connected pages to refresh all elements of the page reliant on the XML file without refreshing the whole page.
In other words, all those pages using the file should update their copy of the data if the copy on the server is newer than theirs. Is this possible via a server push, or do I have to constantly poll the server to compare files? (That seems sloppy to me.) And if it is possible, what's the best way to go about it?
Thanks for any points in the right direction.
Because a webpage is stateless you cannot push data to it; you need to poll the server for updates. Think about a small AJAX script that polls, say, every 5 minutes; when content is updated, that script calls something to update the page. You will need a fair amount of AJAX to do this; take jQuery or the like to accomplish it.
You may try an AJAX call to some PHP script like:
set_time_limit(3600); // one hour, or as long as your session timeout is

while (true) // keep repeating so PHP doesn't stop the script
{
    sleep(5); // 5 seconds between polling the server

    // do the XML updates here and echo them to the client

    ob_flush(); // flush PHP's output buffer first...
    flush();    // ...then the web server's, so the data reaches the client
}
The connection will stay open, and every 5 seconds the XML will be pulled and the client updated.
If you don't want to spend a lot of time and resources pulling the data when it hasn't changed, you may use APC, memcache or any other server-stored variable which tells you whether the XML was changed.
if (apc_fetch('xml_updated') == 1)
{
    // the XML has changed: do the pull
}
You may want to test, in terms of resources, what happens if you try to pull data every second. In my opinion it's best to have a greater delay.
Hope it helps!
OK, I didn't really know how to formulate this question, and especially not the title. But I'll give it a try and hope I'm being specific enough while trying to keep it relevant to others.
If you want to run a PHP script in the background (via AJAX) every X seconds that returns data from a database, how do you do this the best way without using too much of the server's resources?
My solution looks like this:
A user visits a webpage; every x seconds that page runs a JavaScript. The JavaScript calls a PHP script/file that calls the database, retrieves the data and returns it to the JavaScript. The JavaScript then prints the data to the page. My fear is that this way of solving it will put a lot of pressure on the server if there are a lot (10,000) of simultaneous visitors on the page. Is there another way to do this?
That sounds like the best way, given the spec/requirement you set out.
Another way is to have an intermediary step. If you are going to have a huge amount of traffic (otherwise this does not introduce any benefit, but on the contrary may overcomplicate/slow the process), add another table that records the last time a dataset was pulled, and a hard file (say, XML) which, if the 'last time' was deemed too long ago, is regenerated from a new query; this XML then feeds the result returned to the user.
So:
1. JavaScript calls the PHP script (AJAX).
2. PHP checks the DB table which records the last time the data was fully output.
3. If that time is too long ago, the 'main' query is rerun and the XML file is regenerated from its output; ELSE skip to 4.
4. Fetch the XML file and output it as appropriate for the AJAX response.
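A condensed sketch of steps 2-4, using the cache file's own modification time in place of the separate table (a simplification; the regenerated XML is a stand-in):

<?php
// data.php - the endpoint the AJAX call hits
$cache  = __DIR__ . '/cache.xml';
$maxAge = 60; // seconds before the cache counts as stale (assumption)

if (!file_exists($cache) || time() - filemtime($cache) > $maxAge) {
    // step 3: rerun the 'main' query and regenerate the XML file
    $xml = '<data><!-- stand-in for the main query output --></data>';
    file_put_contents($cache, $xml);
}

// step 4: output the (possibly regenerated) XML for the AJAX response
header('Content-Type: text/xml');
readfile($cache);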
You can do it the other way around, contacting the client only when you need to and wasting fewer resources.
Comet is the way to go for this option:
Comet is a programming technique that enables web servers to send data to the client without any need for the client to request it. This technique will produce more responsive applications than classic AJAX. In classic AJAX applications, the web browser (client) cannot be notified in real time that the server's data model has changed. The user must create a request (for example by clicking on a link), or a periodic AJAX request must happen, in order to get new data from the server.