Is it possible to keep checking a MySQL database and then update text? I'm using this to get the number of rows, which is then echoed:
$result = mysql_query("SELECT * FROM socialacc WHERE transid='$transID' AND access='$access'");
$i = 0;
while($row = mysql_fetch_array($result)) {
$i = $i + 1;
}
echo $i;
I want this check to repeat so that if there's ever another row that matches, the text is updated without the page being reloaded. Would I use JavaScript to repeat the check? Any ideas how I would go about it? Thanks for the help.
First of all, you can let the database count for you:
$result = mysql_query("SELECT COUNT(*) FROM socialacc WHERE transid='$transID' AND access='$access'");
$result = mysql_fetch_array($result);
echo $result[0];
Then, you'll probably want to repeat this query every so often through an AJAX call, but not so often that it'll kill your database.
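A minimal sketch of that repeated check, assuming a hypothetical count.php endpoint that runs the COUNT query above and echoes the number (the names and the 5-second interval are placeholders, not anything from your code):

```javascript
// Hypothetical endpoint that runs the COUNT(*) query and echoes the number.
var COUNT_URL = 'count.php';

// Remembers the last count and fires onChange only when it changes.
function makeCountWatcher(onChange) {
  var last = null;
  return function (count) {
    if (count !== last) {
      last = count;
      onChange(count);
    }
  };
}

// Poll every 5 seconds: often enough to feel live, rare enough
// not to hammer the database.
function startPolling(watcher) {
  setInterval(function () {
    var xhr = new XMLHttpRequest();
    xhr.open('GET', COUNT_URL);
    xhr.onload = function () {
      watcher(parseInt(xhr.responseText, 10));
    };
    xhr.send();
  }, 5000);
}
```

Kick it off with startPolling(makeCountWatcher(function (n) { document.getElementById('rowcount').textContent = n; })); where rowcount is whatever element holds your echoed count.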
You have two choices here: server push or AJAX.
Server push is probably simpler to program on the client side, but it's quite complicated to write the server side in a way that won't eat some system resource or other on your server. AJAX is more complicated on the client side, but simpler and easier to implement overall.
I would recommend the AJAX route for this job.
Alright, the problem is that I can't get my database to add a vote after an individual clicks submit on the previous page to cast their vote on a question. I've tried things like $vote = $vote + 1 in my PHP code and then displaying it with SQL through $command, but that doesn't work, along with many other things I've tried, like setting a variable to 1 ($avote = 1), so I'm kind of stuck. Someone told me that I can just add 1 by doing this:
$votes = $row['Votes' . $i] + 1;
and then I'd have to display it somehow, but I couldn't figure it out. Here's the code within my page that retrieves the information and adds it to my database and page (I've taken out unnecessary HTML table tags for a clearer view):
// if $row is false, then the pollid must have been wrong
if ($row) {
// display the poll
echo '<h1>' . $row['Title'] . '</h1>';
echo '<p><b>' . $row['Question'] . '</b></p>';
for ($i = 1; $i <= 4; $i++) {
$answer = $row['Answer' . $i];
$votes = $row['Votes' . $i];
$avote = 1;
$command = "UPDATE FROM polls SET $answer WHERE $answer='$answer+$avote'";
$stmt = $dbh->prepare($command);
$stmt->execute(array($_SESSION['Votes']));
If my friend is right about $votes = $row['Votes' . $i] + 1;, can someone please let me know how I'd use it within my $command SQL code:
$command = "UPDATE FROM polls SET $answer WHERE $answer='$answer+$avote'";
Please and Thank you for any insights.
Do you get error messages?
It sounds like you're trying to show the results of a poll (after the user submits their own vote), but you're having trouble retrieving the results of the poll. Since the poll results need to persist across users and sessions, you have to store it somewhere. I guess that's what $answer is in your database?
Your UPDATE query is a bit broken. First, you should make sure it works properly without the variables; I like to test in the command-line client or a graphical tool like phpMyAdmin. It should look more like:
UPDATE polls SET result = result + 1 WHERE poll_id = 1;
Where polls is your table and result and poll_id are columns in your table.
It appears as if you're trying to ask the user multiple questions, so you would have a different poll_id for each one, and use a hidden form field to get the value for the poll_id. You appear to be using the $row array for that now, but it seems fragile and it won't scale as you build more poll options (well, technically it will scale up, but you won't ever be able to remove a question or get rid of old polls).
You seem to be using PDO (the "object oriented style" mysqli execute() takes no parameters, and none of the PHP libraries that talk to SQL Server seem to have that exact syntax), but then I would expect your prepared statement to have question-mark placeholders for the variables rather than direct substitution. See the PHP manual for details if you're unclear about the proper syntax, but hopefully you've already been through that before coming here.
Once you get those issues cleaned up, if you have further problems it should be a bit easier to trace through what's going on.
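One wrinkle worth knowing here: prepared-statement placeholders can bind values, but not column names, so a column like Votes1 has to be whitelisted and inlined while the row id stays a placeholder. A sketch of that idea (shown in JavaScript for brevity; the buildVoteUpdate helper and the PollID column are assumptions, not your actual schema):

```javascript
// Only these columns may ever be spliced into the SQL string.
var VOTE_COLUMNS = ['Votes1', 'Votes2', 'Votes3', 'Votes4'];

// Returns SQL with the whitelisted column inlined and the row id
// left as a ? placeholder for the prepared statement to bind.
function buildVoteUpdate(column) {
  if (VOTE_COLUMNS.indexOf(column) === -1) {
    throw new Error('Unknown votes column: ' + column);
  }
  return 'UPDATE polls SET ' + column + ' = ' + column + ' + 1 WHERE PollID = ?';
}
```

In PHP this would become $stmt = $dbh->prepare($sql); $stmt->execute(array($pollId));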
I have a bit of a problem when I try to move a huge amount of data from a MySQL table to a Redis database. I'm getting the error "MySQL server has gone away" after a while, and I have no idea why.
EDIT:
Alternatively, when I use the commented-out code that breaks the loop, it just prints "Script finished!" when it isn't actually finished.
This is the PHP code I use (run via php-cli):
<?php
require 'Predis/Autoloader.php';
Predis\Autoloader::register();
mysql_connect('localhost', 'root', 'notcorrect') or die(mysql_error());
mysql_select_db('database_that_i_use') or die(mysql_error());
$redis = new Predis\Client();
//starting on 0 but had to edit this when it crashed :(
for($i = 3410000; $i<999999999999; $i += 50000) {
echo "Query from $i to " . ($i + 50000) . ", please wait...\n";
$query = mysql_unbuffered_query('SELECT * FROM table LIMIT ' . $i . ', 50000')or die(mysql_error());
// This was code I used before, but for some reason it got valid when it wasn't supposed to.
/*if(mysql_num_rows($query) == 0) {
echo "Script finished!\n";
break;
}*/
while($r = mysql_fetch_assoc($query)) {
$a = array('campaign_id' => $r['campaign_id'],
'criteria_id' => $r['criteria_id'],
'date_added' => $r['date_added'],
);
$redis->hmset($r['user_id'], $a);
unset($a);
usleep(10);
}
echo "Query completed for 50000 rows..\n";
sleep(2);
}
unset($redis);
?>
My question is how to do this better; I seriously have no idea why it crashes. My server is pretty old and slow, and maybe it can't handle this large amount of data? This is just a test server before we switch to real production.
Worth noting is that the script ran fine for maybe half an hour. Could it be the LIMIT clause that makes it very slow when the offset gets high? Is there an easier way to do this? I need to transfer all the data today! :)
Thanks in advance.
EDIT: running example:
Query from 3410000 to 3460000, please wait...
Query completed for 50000 rows..
Query from 3460000 to 3510000, please wait...
Query completed for 50000 rows..
Query from 3510000 to 3560000, please wait...
Query completed for 50000 rows..
Query from 3560000 to 3610000, please wait...
MySQL server has gone away
EDIT:
The table consists of ~5 million rows of data and is approx. 800 MB in size.
But I need to do similar things for even larger tables later on...
First, you may want to use another scripting language. Perl, Python, Ruby, anything is better than PHP for running this kind of script.
I cannot comment on why the mysql connection is lost, but to get better performance you need to try to eliminate as many roundtrips as you can with the mysql server and the redis server.
It means:
you should not use unbuffered queries but buffered ones (provided LIMIT is used in the query)
OR
you should not iterate over the MySQL query using LIMIT, since you get quadratic complexity when it should be only linear. I don't know if it can be avoided in PHP, though.
you should pipeline the commands you sent to Redis
Here is an example of pipelining with Predis:
https://github.com/nrk/predis/blob/v0.7/examples/PipelineContext.php
Actually, if I really had to use PHP for this, I would export the MySQL data to a text file (using SELECT ... INTO OUTFILE, for instance), and then read the file and use pipelining to push the data to Redis.
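Independent of Predis, the pipelining idea is just batching: group the rows into fixed-size chunks and send each chunk as one pipeline, so N rows cost roughly N/batchSize round trips instead of N. A sketch of the grouping step (the batch size is an arbitrary assumption):

```javascript
// Split rows into fixed-size batches; each batch would then be
// sent to Redis as a single pipeline rather than one command per row.
function toBatches(rows, batchSize) {
  var batches = [];
  for (var i = 0; i < rows.length; i += batchSize) {
    batches.push(rows.slice(i, i + batchSize));
  }
  return batches;
}
```

With 50,000 rows and a batch size of 1,000, that's 50 round trips to Redis instead of 50,000.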
I have a mysql query that retrieves all my topic results. I then have a pagination system where results are separated into pages and the query's limit #,# changes based on what page you are on.
What I want to do is put all those results into two separate div containers. I want 21 results on each page. The first 9 I will put in one div. The next 12 will go in the other. Does anyone know an efficient way to do this? Should I use two queries, or javascript, or another way? I am just looking for the best most efficient way to do this. Unfortunately the pagination system makes two queries difficult. Any suggestions highly appreciated.
$sql = "SELECT * FROM topics LIMIT ?,?";
$stmt2 = $conn->prepare($sql);
$result=$stmt2->execute(array(somenumber,somenumber2));
I don't see any reason why you can't do a single MySQL query and use JavaScript to sort the results. Bear in mind that I don't know what your data looks like coming back, so any example I provide will have to remain pretty agnostic in that regard.
I will, however, assert as an assumption that you have a JavaScript array of length 21 with some data that is the basis for your display.
Assuming that we're just talking about the first 9, and the last 12, the sorting code is as simple as:
// assume my_array is the array mentioned above
for (var i = 0; i < 9; i += 1) {
var html = code_to_transform_data_from_array(my_array[i]);
$('.div1').append($(html));
}
for (var i = 9; i < 21; i += 1) {
var html = code_to_transform_data_from_array_b(my_array[i]);
$('.div2').append($(html));
}
If your sorting condition is any more complicated, then you'd be better off with something like...
while (my_array.length > 0) {
var item = my_array.pop();
if (sorting_condition) {
$('.div1').append(f1(item));
}
else {
$('.div2').append(f2(item));
}
}
(In the second example, I became a lazy typist and assumed f1 and f2 to be complete transformation functions; sorting_condition is your criteria for determining which bucket something goes in.)
Hope that sets you off on the right track.
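If the split really is just the first 9 and the next 12, slicing up front is a little tidier than two index loops (same 21-item my_array assumed):

```javascript
// Partition a page of results into the two div buckets.
function splitResults(rows) {
  return {
    first: rows.slice(0, 9),   // goes in .div1
    second: rows.slice(9, 21)  // goes in .div2
  };
}
```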
I have a PHP page that queries a MySQL database and returns about 20,000 rows. However, the browser takes over 20 minutes to display them. I have added an index on my database and the query does use it; the query time on the command line is about 1 second for the 20,000 rows, but in the web application it takes far longer. Does anyone know what is causing this problem, and a better way to improve it? Below is my PHP code to retrieve the data:
$query1 = "SELECT * FROM table WHERE Date BETWEEN '2010-01-01' AND '2010-12-31'";
$result1 = mysql_query($query1) or die('Query failed: ' . mysql_error());
while ($line = mysql_fetch_assoc($result1)) {
echo "\t\t<tr>\n";
$Data['Date'] = $line['Date'];
$Data['Time'] = $line['Time'];
$Data['Serial_No'] = $line['Serial_No'];
$Data['Department'] = $line['Department'];
$Data['Team'] = $line['Team'];
foreach ($Data as $col_value) {
echo "\t\t\t<td>$col_value</td>\n";
};
echo "\t\t</tr>\n";
}
Try adding an index to your date column.
Also, it's a good idea to learn about the EXPLAIN command.
As mentioned in the comments above, 1 second is still pretty long for your results.
You might consider putting all your output into a single variable and then echoing the variable once the loop is complete.
Also, browsers wait for tables to be completely formed before showing them, so that will slow your results (at least slow the process of building the results in the browser). A list may work better - or better yet a paged view if possible (as recommended in other answers).
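The single-variable suggestion above, sketched in JavaScript (the column names mirror the $Data array in the question; in PHP the same shape is string concatenation plus one echo):

```javascript
// Build the whole table body as one string instead of writing
// cell by cell; one write instead of tens of thousands.
function buildRows(lines) {
  var out = [];
  for (var i = 0; i < lines.length; i++) {
    var line = lines[i];
    var cells = [line.Date, line.Time, line.Serial_No, line.Department, line.Team]
      .map(function (v) { return '<td>' + v + '</td>'; })
      .join('');
    out.push('<tr>' + cells + '</tr>');
  }
  return out.join('\n');
}
```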
It's not PHP that's causing it to be slow, but the browser itself rendering a huge page. Why do you have to display all that data anyway? You should paginate the results instead.
Try constructing a static HTML page with 20,000 table elements. You'll see how slow it is.
You can also improve that code:
while ($line = mysql_fetch_assoc($result1)) {
echo "\t\t<tr>\n";
foreach ($line as $col_value) {
echo "\t\t\t<td>$col_value</td>\n";
flush(); // optional, but gives your program a sense of responsiveness
}
echo "\t\t</tr>\n";
}
In addition, you should increase your acceptance rate.
You could time each step of the script by echoing the time before and after connecting to the database, running the query, and outputting the code.
This will tell you how long the different steps take. You may find out that it is indeed the traffic causing the delay and not the query.
On the other hand, when you have a table with millions of records, retrieving 20,000 of them can take a long time, even when it is indexed. 20 minutes is extreme, though...
I'm trying to draw a real-time graph as my MySQL table is constantly being inserted with values, like the moving graph referenced from
http://kalanir.blogspot.com/2009/11/how-to-plot-moving-graphs-using-flot.html
The values actually come from a carbon dioxide sensor which updates the table with CO2 values and position ids. I changed her Math.random() to the code below:
<?php $result = mysql_query("SELECT * FROM node1 ORDER BY id DESC LIMIT 1")or die(mysql_error());?>
<?php $row = mysql_fetch_array( $result );?>
var j = "<?php echo $row['co2'];?>";
var next = "<?php echo $row['id'];?>";
for (var i = 0; i < this.xscale - 1; i++)
{
this.array[i] = [i,this.array[i+1][1]]; // (x,y)
}
this.array[this.xscale - 1] = [this.xscale - 1,j];
However, when I run this code, the first value changes, after which it remains constant, even though the last row of the table is being updated.
I heard this is because in PHP the server is only polled once, so I only get a constant reading of the first data point. Is there any way I can make the graph update to the last value of the table? With AJAX?
Thanks for your help
Yes, you can use Periodic refresh (Polling)
or
HTTP Streaming.
Note that both of these options can be quite bandwidth demanding.
You have to do some sort of polling. But even before you do that:
1. create a PHP file that retrieves all the important data from the db.
2. let that file echo/return that data in a formatted way.
3. have a JS function poll that file at intervals (a function that runs in setInterval()).
And yes, there would be some bandwidth issues, but I think it's manageable.
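The shifting loop from the question can be wrapped in one small function that the polling callback (step 3 above) calls with each new reading; points are [x, y] pairs as in flot:

```javascript
// Drop the oldest point, shift everything left one slot, and
// append the new reading at the right edge of the graph.
function shiftAppend(points, value) {
  var next = [];
  for (var i = 0; i < points.length - 1; i++) {
    next.push([i, points[i + 1][1]]);
  }
  next.push([points.length - 1, value]);
  return next;
}
```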