I have an Android app that needs to receive data from a web server. I already have everything working; my question is: what's the best way to synchronize data from the web server if I'm always adding data to its remote database through an HTML form?
I thought of adding a timestamp to each entry. I could then receive the data with its timestamp, compare the timestamps against each entry in my Android app's database, and update or add the information to my local database accordingly.
What's the best way to do it? Should I add a timestamp to each entry in both the remote and local databases? If so, how do timestamps work in PHP/MySQL and in Android/SQLite?
Thanks in advance! ;D
I'm assuming the data is read-only to the application and only changes on the server.
Here's how we do something similar:
The application passes up a "last_sync_date" to the server. If the last_sync_date is null, the server passes back all data.
The server always passes a last_sync_date back to the application. The last_sync_date is in UTC server time. The application stores the last_sync_date.
Obviously, last_sync_date will be null the first time the app sends it, so the first time, it gets all data. From then on, it only gets new data.
Yes, a timestamp on each entry is used to determine which records are new/modified.
If need be, add a deleted column to each entry so records can be soft-deleted; otherwise the client never learns about deletions.
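A minimal sketch of what the server side of this could look like in PHP; the entries table, its updated_at and deleted columns, and the JSON shape are assumptions, not part of the original answer:

<?php
// sync.php - minimal sketch; table and column names are assumptions.
// The client sends last_sync_date (UTC, "Y-m-d H:i:s"), or nothing on first sync.
$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');

$lastSync = isset($_GET['last_sync_date']) ? $_GET['last_sync_date'] : null;

if ($lastSync === null) {
    // First sync: return everything.
    $stmt = $pdo->query('SELECT id, payload, deleted, updated_at FROM entries');
} else {
    // Incremental sync: only rows touched since the client last synced.
    $stmt = $pdo->prepare(
        'SELECT id, payload, deleted, updated_at FROM entries WHERE updated_at > ?');
    $stmt->execute([$lastSync]);
}

echo json_encode([
    // The client stores this and sends it back on the next sync.
    'last_sync_date' => gmdate('Y-m-d H:i:s'),
    'entries'        => $stmt->fetchAll(PDO::FETCH_ASSOC),
]);

On the MySQL side, declaring updated_at as TIMESTAMP DEFAULT CURRENT_TIMESTAMP ON UPDATE CURRENT_TIMESTAMP keeps it current automatically; on the Android side, the app only needs to persist the returned last_sync_date (e.g. in SharedPreferences or a one-row SQLite table) between syncs.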
I'm trying to send data from iOS to an online MySQL database. PHP is used on the server to receive and insert the data.
The thing is that I have several data packages, and the key is to send them one by one: the second package in the queue must wait until the iOS app receives feedback from the server confirming that the first one has already been stored in the database.
I initially tried creating a serial dispatch queue so the iOS app would execute the uploads in sequence. The iOS side did carry out the work in order, but each task "finished" as soon as it had sent out its data package, without waiting to see whether the data had actually been inserted into the database. There is always some lapse between sending the data and it being fully saved to MySQL on the server, due to things like network latency.
So the data may not be saved in the desired sequence; some later data may end up saved before earlier data.
I guess what is missing is a "feedback" mechanism from the server side to the iOS side.
Can anybody suggest a way to implement this feedback mechanism, so I can control the serial sequence of the upload tasks?
Thank you very much!
Regards,
Paul
If you are sending data to a server, most of the available frameworks offer a completion callback. With AFNetworking (or Alamofire, its Swift successor) it would look like this:
[[ConnectionManager instance] GET:@"link" parameters:nil
    success:^(AFHTTPRequestOperation *operation, id responseObject) {
        // The server has confirmed this package was saved;
        // kick off the next upload from here.
    }
    failure:^(AFHTTPRequestOperation *operation, NSError *error) {
        // Handle the error (e.g. retry this package) before moving on.
    }];
So you can put your code in the given handlers and chain the requests: start the next upload inside the success block, once the server has confirmed the previous one.
You may also want to create NSOperations, set the proper dependencies between them, and put them on an NSOperationQueue, but that is surely more time-consuming.
I currently have a MySQL database which I was hoping to use to store regularly updated data from a temperature sensor connected to the internet.
I currently have a page that, when opened, grabs the current temperature and the current timestamp and adds them as an entry to the database, but I was looking for a way to do that without me refreshing the page every 5 seconds.
Detail:
The data comes from an Arduino Ethernet, posted to an IP address.
Currently, I'm using cURL to grab the data from the IP, add a timestamp and save it to the DB.
Obviously it only updates when the page is refreshed (it uses PHP).
Here is a live feed of the data - http://wetdreams.org.uk/ChrisProject/UI/live_graph_two.html
TL;DR - Basically I need a middle man to grab the data from the IP and post it to a MySQL database.
Edit: Thanks for all the advice. There might be a little bit of confusion: I'm looking for a solution that (ideally) doesn't require a computer to be on at all (other than the server containing the database). Since I'm looking to store data over long periods of time (weeks), I'd like to set it up and leave a script running on the server (or Arduino) that gets the temp and posts it to the database.
In my head, I would like a page on the server that automatically (without any browser open, or any prompting other than a timer) calls a PHP script.
Hope that clears things up!
You can post directly to the web server from your Arduino using the Arduino EthernetClient (see the link below for an example).
POST to /insertData.php - in insertData.php, use $_POST["tempCaptured"] to read the temperature value and insert it into the db.
A good article on using the Arduino HTTP client: http://interactive-matter.eu/how-to/arduino-http-client-library/
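A minimal sketch of what insertData.php might look like; the tempCaptured field comes from the answer above, while the database, table, and column names are assumptions:

<?php
// insertData.php - minimal sketch; table and column names are assumptions.
if (!isset($_POST['tempCaptured'])) {
    http_response_code(400);
    exit('missing tempCaptured');
}

$pdo = new PDO('mysql:host=localhost;dbname=sensors', 'user', 'pass');

// Store the reading together with a server-side timestamp.
$stmt = $pdo->prepare(
    'INSERT INTO readings (temperature, recorded_at) VALUES (?, NOW())');
$stmt->execute([(float) $_POST['tempCaptured']]);

echo 'OK'; // the Arduino can read this to confirm the insert succeeded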
Write a script (ping.php) which pings this URL.
Then set up a cronjob which runs this script at fixed intervals.
Your cron entry could be 0 */2 * * * php /path/to/ping.php (it will run every 2 hours).
Your ping.php file will connect to the live feed, grab the data, and store the results in the db.
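A sketch of what ping.php could look like, using cURL as the question describes; the sensor's URL and the table layout are assumptions:

<?php
// ping.php - run by cron; fetches the sensor feed and stores one reading.
// The feed URL and its response format are assumptions.
$ch = curl_init('http://192.0.2.1/');            // the Arduino's IP address
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);  // return the body as a string
curl_setopt($ch, CURLOPT_TIMEOUT, 10);
$body = curl_exec($ch);
curl_close($ch);

if ($body === false) {
    exit; // sensor unreachable; just wait for the next cron run
}

$pdo  = new PDO('mysql:host=localhost;dbname=sensors', 'user', 'pass');
$stmt = $pdo->prepare(
    'INSERT INTO readings (temperature, recorded_at) VALUES (?, NOW())');
$stmt->execute([(float) trim($body)]); // assumes the feed returns a bare number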
If I understand the problem, you just need a replacement for refreshing the web page every 5 seconds, not for getting the data?
I would set up an AJAX connection and have the PHP run in a kind of infinite loop, echoing new data back to your JavaScript to update the graph. The PHP loop would need a timeout check so the script eventually closes.
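A rough sketch of that loop; fetch_latest_reading() is a hypothetical helper standing in for the database lookup:

<?php
// poll.php - long-polling sketch. fetch_latest_reading() is a hypothetical
// helper that returns the newest reading row, or null if there is none.
set_time_limit(70);                       // hard safety ceiling
$since = isset($_GET['since']) ? $_GET['since'] : null;
$start = time();

while (time() - $start < 60) {            // the timeout check mentioned above
    $reading = fetch_latest_reading();
    if ($reading !== null && $reading['recorded_at'] !== $since) {
        echo json_encode($reading);       // JavaScript redraws the graph with this
        exit;                             // and immediately opens a new request
    }
    sleep(2);                             // don't hammer the database
}
// Nothing new within 60 seconds: end with an empty body; the client retries.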
In this case, the best solution was using the Arduino to post a request to a PHP script which did its thing and added the retrieved value to a database. The way I have it running just now (for simplicity's sake) is with a GET request from the Arduino using the EthernetClient.
Code:
char server[] = "www.example.com";
and
if (client.connect(server, 80)) {
  client.println("GET /test.php?input_val=99 HTTP/1.0");
  client.println("Host: www.example.com");
  client.println(); // a blank line ends the HTTP headers
}
However, since I'd already built my website around the data being posted to an independent server/IP, I opted to use cron to schedule the task. In this case I wanted to update my DB every 5-10 seconds, but I'm piggybacking on somebody else's server, so I contacted the server owner and asked them to set up a cron job calling "/mysubdir/cron_update.php" every minute (the shortest interval cron supports). From there I did a bit of 'ScriptCeption': within the PHP script, I repeat the calls every 10 seconds for a minute before the script finishes.
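A sketch of that 'ScriptCeption' trick: cron fires the script once a minute, and the script itself spreads six updates across the minute. The update_reading() helper stands in for the fetch-and-insert logic and is an assumption:

<?php
// cron_update.php - called by cron once per minute.
// update_reading() is a hypothetical helper that grabs the sensor value
// and inserts it into the database.
for ($i = 0; $i < 6; $i++) {
    update_reading();
    if ($i < 5) {
        sleep(10); // six runs spaced 10 seconds apart fill the minute
    }
}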
Thanks to everybody who helped me out. I'm posting this here as a complete answer and explanation, because technically everybody was correct.
You can use JavaScript to auto-refresh the page:
<script type="text/javascript">
function timedRefresh(timeoutPeriod) {
  // reload from the server after timeoutPeriod ms (true bypasses the cache)
  setTimeout(function () { location.reload(true); }, timeoutPeriod);
}
</script>
A better solution would be AJAX, sending the data regularly in the background to the database.
Another way is to use a PHP CLI script, scheduled with a cron job, that gets the values from your sensor automatically.
Using a PHP script, I need to manage data sent to the script in a variable format.
The URL sent is something like: http://hawkserv.co.uk/heartbeat.php?port=25565&max=32&name=My%20Server&public=True&version=7&salt=wo6kVAHjxoJcInKx&players=&worlds=guest&motd=testtet&lvlcount=1&servversion=67.5.0.1&hash=randomhash&users=0
(clicking the link returns a formatted table of the results)
What is the best method of storing this information for it to be used in a formatted HTML page?
Multiple URLs will be sent to the script, with different values. The script needs to store each response to be used later, and also "time out" responses that haven't been updated in a while.
Example scenario:
3 servers exist: Server 1, Server 2, and Server 3. Each of these servers sends the above URL every 45 seconds, with a few values changed per server. A formatted table displays the information when the page is requested, and on refresh it picks up any new information the servers have sent.
Server 1 goes offline and stops sending requests. The script accounts for this lack of requests and removes Server 1's information from the list, declaring it offline.
Although code would be greatly appreciated, I think I can just go off the best way of doing it. Is it storing each URL as an array in a file and reading the file when needed, or is there some other way?
I would store the variables, plus the time the request was received, in a database. The database can be a SQLite one if you don't want to go through the hassle of setting up a full-blown system. The advantage of using SQLite over dumping arrays to a file is that you can run flexible queries without coding up parsing routines and the like.
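A sketch of that approach with PDO and SQLite; the table layout and the 2-minute timeout are assumptions chosen to fit the 45-second heartbeat in the scenario above:

<?php
// heartbeat.php - sketch; the schema and the timeout are assumptions.
$db = new PDO('sqlite:' . __DIR__ . '/servers.db');
$db->exec('CREATE TABLE IF NOT EXISTS servers (
    name      TEXT PRIMARY KEY,
    port      INTEGER,
    players   TEXT,
    last_seen INTEGER
)');

// Record (or refresh) this server's heartbeat.
$stmt = $db->prepare('REPLACE INTO servers (name, port, players, last_seen)
                      VALUES (?, ?, ?, ?)');
$stmt->execute([$_GET['name'], $_GET['port'], $_GET['players'], time()]);

// Only list servers seen within the last 2 minutes; older ones are "offline".
$live = $db->prepare('SELECT * FROM servers WHERE last_seen > ?');
$live->execute([time() - 120]);
foreach ($live->fetchAll(PDO::FETCH_ASSOC) as $s) {
    echo htmlspecialchars($s['name']), ' (port ', (int) $s['port'], ")<br>\n";
}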
I have a basic server in PHP, and my mobile device sends data to it and saves it every minute.
But whenever my mobile phone loses its connection, I would like to set up a counter on the server: if the mobile doesn't insert anything into the database for longer than 2 minutes, I want to call a function on the server saying the mobile has lost its connection. Every time the mobile phone sends data to the server, the timer should reset.
I am not familiar with PHP, but I searched and couldn't find anything similar. I am sure there must be an easy way of doing it, like setting a listener or creating a countdown timer.
You can't set a timer in your PHP code, as it only runs when a client requests a page.
As you don't have access to cron jobs, you can't do it from the CLI side either.
However, some web services will periodically call a page of your choice for you, so you can run save() on every call to http://something/save.php. Take a look at http://www.google.fr/search?ie=UTF-8&q=online+cron for more information.
Note that if someone gets hold of the save() URL, they can easily overload your server. Secure it as much as possible, for example with a user/password combination passed as parameters.
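A minimal sketch of that protection; the 'token' parameter name and the SECRET_TOKEN value are assumptions:

<?php
// save.php - sketch of the shared-secret check suggested above.
// SECRET_TOKEN and the 'token' parameter name are assumptions.
const SECRET_TOKEN = 'change-me';

if (!isset($_GET['token']) || !hash_equals(SECRET_TOKEN, (string) $_GET['token'])) {
    http_response_code(403); // reject callers who don't know the secret
    exit('forbidden');
}

save(); // the existing save() logic only runs for authorized callers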
Finally I got it working. What I learned is that the server can't do anything if there is no request :) So I created a timestamp field in the database and update it with the current time on every request. Before updating, the script reads the stored value, compares it with the current time to see when the last request was, and works out the difference; if the difference is bigger than 2 minutes, it flags the mobile as having lost connection. I hope this helps other people as well.
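A sketch of that request-driven check; the devices table and the mark_disconnected() helper are assumptions:

<?php
// checkin.php - sketch of the timestamp comparison described above.
// The devices table and mark_disconnected() are assumptions.
$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');

// How long ago was the previous check-in?
$stmt = $pdo->prepare('SELECT last_seen FROM devices WHERE id = ?');
$stmt->execute([$_GET['device_id']]);
$lastSeen = $stmt->fetchColumn();

if ($lastSeen !== false && time() - strtotime($lastSeen) > 120) {
    mark_disconnected($_GET['device_id']); // the "timer" had expired
}

// Reset the "timer" by storing the current time.
$upd = $pdo->prepare('UPDATE devices SET last_seen = NOW() WHERE id = ?');
$upd->execute([$_GET['device_id']]);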
I have developed a web application that uses a web server and database hosted by a web host (on the ground) and a server running on Amazon Web Services EC2. Both servers may be used during a session, and both need to know some session information about the user. I don't want to POST the information that both servers need, because I don't want it to be visible to browsers / Firebug etc. So my session data needs to persist across servers, and I think that means the best option is to store some or all of it in the database rather than in the session. The easiest approach seems to be to keep the sessions, but to POST the session_id between servers and use it as the key to look up the data I need from a 'user_session_data' table in the database.
I have looked at Tony Marston's article "Saving PHP Session Data to a database" - should I use this, or will a table with the session data I need, keyed by session_id, suffice? What would be the downside of creating my own table and set of methods for storing the data in the database?
If transfer speed and response times between EC2 and the database server are good enough, I see no problem with storing the session data as described and passing the session_id along when transferring the user to a different server.
--
Storing session data in a database is pretty common practice. "Storing data on a session" does not imply any actual storage method – creating files on disk is simply PHP's default setting, since it doesn't require any setup.
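For reference, PHP lets you swap the file-based default for a database-backed store via session_set_save_handler(); a condensed sketch (PHP 8 signatures, with the sessions table schema assumed):

<?php
// db_sessions.php - condensed sketch of a database-backed session store.
// Assumed schema: CREATE TABLE sessions (id VARCHAR(128) PRIMARY KEY,
//                                        data TEXT, updated_at INT)
class DbSessionHandler implements SessionHandlerInterface
{
    private $pdo;
    public function __construct(PDO $pdo) { $this->pdo = $pdo; }

    public function open($path, $name): bool { return true; }
    public function close(): bool            { return true; }

    public function read($id): string
    {
        $stmt = $this->pdo->prepare('SELECT data FROM sessions WHERE id = ?');
        $stmt->execute([$id]);
        return (string) $stmt->fetchColumn(); // empty string if no row yet
    }

    public function write($id, $data): bool
    {
        $stmt = $this->pdo->prepare(
            'REPLACE INTO sessions (id, data, updated_at) VALUES (?, ?, ?)');
        return $stmt->execute([$id, $data, time()]);
    }

    public function destroy($id): bool
    {
        $stmt = $this->pdo->prepare('DELETE FROM sessions WHERE id = ?');
        return $stmt->execute([$id]);
    }

    public function gc($max_lifetime): int|false
    {
        $stmt = $this->pdo->prepare('DELETE FROM sessions WHERE updated_at < ?');
        $stmt->execute([time() - $max_lifetime]);
        return $stmt->rowCount(); // number of expired sessions removed
    }
}

session_set_save_handler(new DbSessionHandler(
    new PDO('mysql:host=localhost;dbname=app', 'user', 'pass')), true);
session_start(); // both servers now read and write the same session rows

Whether you use something like this or your own user_session_data table keyed by session_id matters less than making sure both servers point at the same database.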