I have a PHP script that scrapes data from a bunch of websites and stores it in a DB. What I want is, instead of having the PHP run on every connection, to set it on a 10 minute interval that stores the data it gets into the DB, so I can retrieve info instantly instead of having the PHP run every time, which is slow. I don't know AJAX well and would like to keep it as PHP/MySQL as possible. Any help is appreciated.
TL;DR: I want PHP to save data to a DB every 10 minutes and then serve that stored data until it gets overwritten, instead of loading new data on every refresh.
Basic options as follows. No need (or use) for AJAX here.
Make a cron job / scheduled task (Linux / Windows) that calls your script at intervals (see the example crontab entry below the list).
Add a timed javascript browser refresh to your PHP script. See here for how.
Use a browser plugin like "Easy Auto Refresh" (Chrome) or "ReloadEvery" (Firefox).
The first option is the cleanest and spares you from keeping a browser tab open.
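For example, assuming the scraper lives at /var/www/scraper.php and the PHP CLI binary is at /usr/bin/php (both paths are just placeholders), a crontab entry that runs it every 10 minutes would be:

*/10 * * * * /usr/bin/php /var/www/scraper.php >/dev/null 2>&1

Your page then only reads what the scraper has already stored in MySQL, so it responds instantly.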
Related
I have recently updated my site with AJAX calls to improve the end-user experience. Some calls poll the db repeatedly; others are called to alter the database upon user interaction, i.e. completing a task or cancelling a cart item.
Now I am getting server errors resulting from reaching my server's open file limit.
Here is an example of the sort of code I am using: (credit goes to every tutorial found on google...)
function checkForNewData() {
    $.get('checkForNewData.php', false, function(data){
        if (data.length) {
            $('#newData').html(data);
        }
    });
}

$(function(){
    checkForNewData();
    setInterval('checkForNewData()', 10000);
});
I realize that by using setInterval('checkForNewData()', 10000); this file is requested every 10000 ms for every user that has this page open.
Here are my questions regarding my ignorance of ajax:
Does a unix server record each ajax call (of this manner) as a page load or open file?
If the page loads behind the scenes, do I have to close it?
Is there a better way to keep a site up-to-date than repetitiously polling my db?
Thanks for your time and assistance.
Does a unix server record each ajax call (of this manner) as a page load or open file?
Every time a PHP file is run, it is logged; PHP executed in any manner is recorded. That's why you can see errors in your error log if anything goes wrong during AJAX calls.
If the page loads behind the scenes, do I have to close it?
Which page? "checkForNewData.php"? No you don't. The AJAX call waits for the script to execute & finish and then gets the response.
Is there a better way to keep a site up-to-date than repetitiously polling my db?
Yes, there is. I would:
On the server
Use cache (maybe APC cache)
Run a DB check only once every minute / two minutes / five minutes
Store/update the results in an XML file
On the client
Get the timestamp of the most recent update on client-side page load
Get AJAX to check the timestamp (stored in the XML) of the last update
If timestamp of the AJAX response differs from the first-load timestamp, get new HTML from the XML file
Use AJAX headers or AJAX post-data to request a specific function (like asking for timestamp update vs getting HTML data).
Remember to use the correct flags for json_encode.
print_r(json_encode($html,JSON_HEX_QUOT|JSON_HEX_TAG|JSON_HEX_AMP|JSON_HEX_APOS));
Also remember to gzip the output:
ob_start('ob_gzhandler');
It is a best practice to have as few DB calls as possible.
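To tie those pieces together, here is a rough sketch of what the server-side check script could look like, assuming the APC extension is available; the cache key, file name, table and query are placeholders, not a definitive implementation:

<?php
// checkForNewData.php - sketch of the cached-check idea described above.
// Assumes APC; key names, file paths and the query are placeholders.
$interval  = 120; // only hit the DB every two minutes
$lastCheck = apc_fetch('last_db_check');

if ($lastCheck === false || (time() - $lastCheck) > $interval) {
    // Time to refresh: query the DB once and rebuild the XML file.
    $pdo  = new PDO('mysql:host=localhost;dbname=mydb', 'user', 'pass');
    $rows = $pdo->query('SELECT id, body FROM items ORDER BY updated_at DESC')->fetchAll();

    $xml = new SimpleXMLElement('<updates/>');
    $xml->addAttribute('timestamp', time());
    foreach ($rows as $row) {
        $item = $xml->addChild('item', htmlspecialchars($row['body']));
        $item->addAttribute('id', $row['id']);
    }
    $xml->asXML('updates.xml');

    apc_store('last_db_check', time());
}

// Every AJAX call is answered from the XML file, never straight from the DB.
ob_start('ob_gzhandler');
header('Content-Type: application/json');
$doc = simplexml_load_file('updates.xml');

if (isset($_POST['mode']) && $_POST['mode'] === 'timestamp') {
    // Lightweight poll: only return the last-update timestamp.
    echo json_encode(array('timestamp' => (string) $doc['timestamp']));
} else {
    // Full fetch: return the data for the page to render.
    echo json_encode(array('html' => $doc->asXML()),
        JSON_HEX_QUOT | JSON_HEX_TAG | JSON_HEX_AMP | JSON_HEX_APOS);
}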
I need to run a PHP script from another site, without the use of cron, so it will be called whenever anyone visits or refreshes the page.
The script will perform some kind of update to my database. It may take several tens of seconds, so I need to run it in a way that does not hold up the visitor of the page from which it is called.
But I do not want the script to actually start every time someone visits or refreshes the page; I want to limit it to once per minute. So, when someone (whoever happens to be first) arrives or refreshes the page, I would store the current time in the MySQL database. On each subsequent page load I would first compare the current time with the time of the last script call stored in the database: if the difference is less than one minute, the script is not called; if it is more than one minute, the script runs and the database is updated again with the time of this latest execution.
I do not need any response from running the script.
Importantly, it must not affect page loading for the user on the page from which it is called.
Thanks for help
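In rough PHP, the gate I have in mind would look something like this (table and column names are only placeholders):

<?php
// Sketch of the once-per-minute gate described above; 'script_runs' is a placeholder table.
$pdo  = new PDO('mysql:host=localhost;dbname=mydb', 'user', 'pass');
$last = $pdo->query('SELECT MAX(last_run) FROM script_runs')->fetchColumn();

if (!$last || time() - strtotime($last) >= 60) {
    // Record this run first, so other visitors arriving in the same minute skip it.
    $pdo->exec('INSERT INTO script_runs (last_run) VALUES (NOW())');

    // ... start the long-running update here, ideally in the background ...
}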
You could do a jQuery AJAX call in the background after the page has loaded, so the user won't have to wait for the script to finish before the page loads.
http://api.jquery.com/jQuery.ajax/
However, I do not think the way you want to do it is the right approach. It's possible, but I'm not sure it's useful.
Can you split your script into different tasks? Then you could run them before loading the page, and the users wouldn't notice any difference.
OK, if JavaScript is not an option you can do a bit more research on PHP forking. It's basically PHP's way of running work in parallel (a child process rather than a thread, and much more limited), so you can fork a child PHP process to run in the background while the main process keeps doing the usual thing. It won't affect your normal page flow.
http://php.net/manual/en/function.pcntl-fork.php
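A minimal sketch of that idea, assuming the pcntl extension is enabled (it is normally only available in CLI/CGI setups, not under typical mod_php):

<?php
// Fork a child to do the slow database update while the parent answers immediately.
// Requires the pcntl extension.
$pid = pcntl_fork();

if ($pid == -1) {
    die('could not fork');
} elseif ($pid) {
    // Parent process: respond to the visitor right away, don't wait for the child.
    echo 'Page content served';
} else {
    // Child process: do the long-running update in the background.
    // ... heavy work here ...
    exit(0);
}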
I have a form which, when submitted, calls a PHP page (sample.php).
My PHP page does a lot of processing, which takes around 5 minutes. I am also printing "Executed!" on my sample.php page.
This "Executed!" only gets printed after everything has finished (5 mins).
I want my PHP page to print "Executed!" before it does all the processing.
How should I go about this?
There have been several solutions posted that use ignore_user_abort() and flush() to continue background work after a page has been delivered to the client. You should start by reading the documentation on connection handling on the PHP web site.
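The usual shape of that pattern is roughly the following; treat it as a sketch (the header trick is not reliable under every SAPI), not a drop-in solution:

<?php
// sample.php - print "Executed!" immediately, then keep working after the response is sent.
ignore_user_abort(true);   // keep running even if the client disconnects
set_time_limit(0);         // no time limit for the long job

ob_start();
echo 'Executed!';
header('Connection: close');
header('Content-Length: ' . ob_get_length());
ob_end_flush();
flush();                   // the browser has its response at this point

// ... the ~5 minutes of processing continue here, invisible to the visitor ...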
However, if you want a stable solution, I would design your application so that 'sample.php' (the form action) just receives a job, adds it to a queue (maybe a database table) and reports that the job has been added, while another process runs in the background (maybe via cron or whatever) and runs the jobs themselves. I would also create a page like 'progress.php' where the progress of a job can be viewed. The response could be JSON or something like that, so it can be easily integrated into other pages or used as the data feed for a JavaScript progress bar.
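Sketched out, the queue variant could look like this; the jobs table, its columns and the JSON shape are examples, not a fixed design:

<?php
// sample.php (form action) - only enqueue the job and answer at once.
$pdo  = new PDO('mysql:host=localhost;dbname=mydb', 'user', 'pass');
$stmt = $pdo->prepare('INSERT INTO jobs (payload, status) VALUES (?, ?)');
$stmt->execute(array(json_encode($_POST), 'pending'));

echo json_encode(array('job_id' => $pdo->lastInsertId(), 'status' => 'queued'));

// A worker script run by cron then SELECTs pending jobs and processes them,
// and progress.php can SELECT the same row and return its status as JSON.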
I currently have a MySQL database which I was hoping to use to store regularly updated data from a temperature sensor connected to the internet.
I currently have a page that, when opened, will grab the current temperature and the current timestamp and add it as an entry to the database, but I was looking for a way to do that without me refreshing the page every 5 seconds.
Detail:
The data comes from an Arduino Ethernet, posted to an IP address.
Currently, I'm using cURL to grab the data from the IP, add a timestamp and save it to the DB.
Obviously only updates when the page is refreshed (it uses PHP).
Here is a live feed of the data - http://wetdreams.org.uk/ChrisProject/UI/live_graph_two.html
TL;DR - Basically I need a middle man to grab the data from the IP and post it to a MySQL database.
Edit: Thanks for all the advice. There might be a little bit of confusion: I'm looking for a solution that (ideally) doesn't require a computer to be on at all (other than the server containing the database). Since I'm looking to store data over long periods of time (weeks), I'd like to set it up and leave a script running on the server (or Arduino) that gets the temp and posts it to the database.
In my head I would like to have a page on the server that automatically (without any browser open, or any other prompting other than a timer) calls a PHP script.
Hope that clears things up!
You can post directly to the web server from your Arduino using ArduinoEthernetClient (see the linked article below for an example).
POST to /insertData.php - in insertData.php, use $_POST["tempCaptured"] to get the temp value and insert it into the db.
Good article on using ArduinoEthernetClient http://interactive-matter.eu/how-to/arduino-http-client-library/
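A minimal sketch of that insertData.php (the tempCaptured field and the readings table/columns are assumed names):

<?php
// insertData.php - receive the temperature POSTed by the Arduino and store it.
$pdo = new PDO('mysql:host=localhost;dbname=sensors', 'user', 'pass');

if (isset($_POST['tempCaptured'])) {
    $stmt = $pdo->prepare('INSERT INTO readings (temperature, recorded_at) VALUES (?, NOW())');
    $stmt->execute(array((float) $_POST['tempCaptured']));
}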
Write a script (ping.php) which pings this URL.
Now, set up a cron job which runs this script at fixed intervals.
Your cron entry can be 0 */2 * * * PATH_TO_{ping.php} // will run every 2 hours
Your ping.php file will connect to the live feed, grab the data and store the results in the db.
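A rough sketch of such a ping.php using cURL, as in the original setup (the feed URL, table and column names are placeholders):

<?php
// ping.php - run by cron; fetch the current temperature and store it with a timestamp.
$ch = curl_init('http://192.0.2.10/temperature'); // the Arduino's IP / feed URL
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_TIMEOUT, 10);
$temperature = curl_exec($ch);
curl_close($ch);

if ($temperature !== false) {
    $pdo = new PDO('mysql:host=localhost;dbname=sensors', 'user', 'pass');
    $pdo->prepare('INSERT INTO readings (temperature, recorded_at) VALUES (?, NOW())')
        ->execute(array((float) $temperature));
}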
If I understand the problem, you just need a replacement for refreshing the web page every 5 seconds?
Not a different way of getting the data?
I would set up an AJAX connection and have the PHP run in a kind of infinite loop, echoing new data back to your JavaScript to update the graph. The PHP loop would need a timeout check to eventually close the script.
In this case, the best solution was using the Arduino to post a request to a PHP script which did its thing and added the retrieved value to a database. The way I have it running just now (for simplicity's sake) is with a GET request from the Arduino using the EthernetClient.
Code:
char server[] = "www.example.com";
and
if (client.connect(server, 80)) {
    client.println("GET /test.php?input_val=99 HTTP/1.0");
    client.println("Host: www.example.com");
    client.println();   // blank line ends the request headers
}
However, since I'd already built my website around the fact that the data was posted to an independent server/IP*, I opted to use cron to schedule a task. In this case I wanted to update my DB every 5 - 10 seconds, but I'm piggybacking on somebody else's server, so I contacted the server owner and asked them to set up a cron job calling "/mysubdir/cron_update.php" every minute (the shortest interval cron allows). From there I did a bit of 'ScriptCeption': within the PHP script, I repeated my calls every 10 seconds for a minute before finishing the script.
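Roughly, the 'ScriptCeption' part of cron_update.php looks like this (the actual fetch-and-store step is left as a placeholder):

<?php
// cron_update.php - called by cron once a minute; handles the 10-second updates itself.
set_time_limit(70); // a little more than the minute we intend to run

for ($i = 0; $i < 6; $i++) {     // 6 passes x 10 seconds = one minute
    // fetch the latest value from the Arduino and store it in the DB (placeholder step)
    // update_reading_from_sensor();

    if ($i < 5) {
        sleep(10);               // wait 10 seconds before the next pass
    }
}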
Thanks to everybody who helped me out, I'm posting this here as a complete answer and explanation, because technically everybody was correct.
You can use JavaScript to auto-refresh the page:
<script type="text/javascript">
function timedRefresh(timeoutPeriod) {
    setTimeout("location.reload(true);", timeoutPeriod);
}
// call it on load, e.g. <body onload="timedRefresh(5000)">
</script>
A better solution would be AJAX, sending the data to the database regularly in the background.
Another way is to use a PHP CLI script, scheduled with a cron job, that gets the values from your sensor automatically.
I'm wondering if it is possible to push an xml file update from server to all client browsers?
Basically my proposed situation is that my server holds an xml file; when a user loads a page that uses said xml file they can request to change it, and if the change is allowed (determined client side by the page) then the xml file is updated on the server side. I'm fine up to this point (well, I have plenty of reading to get me to this point). Then I want all pages that are connected to refresh all elements of the page reliant on the xml file without refreshing the whole page.
In other words, all pages using the file should update their copy/data if the copy on the server is newer than theirs. Is this possible via a server push, or do I have to constantly poll the server to compare files? (That seems sloppy to me.) And if it is possible, what's the best way to go about it?
Thanks for any points in the right direction.
Because a web page is stateless you cannot push data to it; you need to poll the server for updates. Think about a small AJAX script that polls roughly every 5 minutes and, when the content has been updated, calls something to update the page. You will need a fair amount of AJAX to do this; use jQuery or the like to accomplish it.
You may try an AJAX call to some PHP script like:
set_time_limit(3600); // one hour, or as long as your session timeout is

// Keep repeating this to prevent PHP from stopping the script
while (true)
{
    sleep(5); // 5 seconds between polling the server

    // ... do the XML updates and echo anything new ...

    ob_flush();
    flush();
}
The connection will stay open, and every 5 seconds the XML will be pulled and the client updated.
If you don't want to spend time and resources pulling the data when it hasn't changed, you may use APC, memcache or any other server-stored variable that tells you whether the XML has changed.
if (apc_fetch('xml_updated') == 1)
{
    // do the xml pull
}
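On whichever script actually modifies the XML you would then set that flag, roughly like this (same illustrative key name):

// wherever the XML file gets written or updated:
apc_store('xml_updated', 1);

// and once the polling loop has delivered the new data to the client:
apc_store('xml_updated', 0);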
You may want to test, in terms of resources, what happens if you try to poll every second. In my opinion it's best to use a longer delay.
Hope it helps!