I have a script that adds tweets to a database and another that fetches those tweets and displays them via PHP/AJAX. This all works fine with an empty db, but if I already have a lot of tweets in there it runs into problems. What I'm trying to do now is set the MySQL query to get the first, say, 500 rows and then continue on as tweets are added, ignoring the previous 500 tweets, etc. So start at the latest row, get up to 500 rows max, then the next time it's queried get rows 1001, 1002, 1003 and so on.
Any ideas on how I could do this? Once it's got the latest 500 rows, could I set a variable to change the query?
thanks
EDIT:
$ts = mysql_real_escape_string($_GET['lasttweet']);
$result = mysql_query("SELECT tweet_id, tweet_text, screen_name FROM tweets WHERE tweet_id > $ts LIMIT 100") or die (mysql_error());
This is what I've tried for the query, but every time it just gets another 100 old posts, where I need it to get only new ones from then on.
If you don't want to update the records, use the two-argument form of the LIMIT clause (the offset comes first and is 0-based):
SELECT ... FROM tweets WHERE ... LIMIT 0, 500
SELECT ... FROM tweets WHERE ... LIMIT 500, 500
SELECT ... FROM tweets WHERE ... LIMIT 1000, 500
Of course, this means you need to keep track of which block you last retrieved.
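If it helps, the block bookkeeping can live entirely in PHP. A minimal sketch (the helper name `block_query` is made up for illustration; table and column names are the question's):

```php
<?php
// Build the paging query for the k-th block of tweets.
// LIMIT offsets are 0-based: block 0 -> LIMIT 0, 500, block 1 -> LIMIT 500, 500, ...
function block_query($block, $size = 500) {
    $offset = $block * $size;
    return "SELECT tweet_id, tweet_text, screen_name FROM tweets "
         . "ORDER BY tweet_id LIMIT $offset, $size";
}

// Run the returned SQL with mysql_query() as in the question, and persist
// the block number (e.g. in the session) between requests.
echo block_query(0), "\n"; // first 500 rows
echo block_query(2), "\n"; // third block of 500
```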
You could keep a separate field like 'tweeted' with a default value of 0; once you fetch 500 records, update the tweeted field for those records to 1 and modify your select statement to:
"SELECT tweet_id, tweet_text, screen_name FROM tweets WHERE tweeted != 1 LIMIT 500"
$ts = mysql_real_escape_string($_GET['lasttweet']);
mysql_query('SET CHARACTER SET utf8');

// Query to get number of rows
$numRows = mysql_query("SELECT tweet_id FROM tweets") or die(mysql_error());
$rows = mysql_num_rows($numRows);

// Only show last/latest 500 tweets
$tweetLimit = 500;
$startFrom = 0; // default, so it is defined in every branch
if ($_GET['lasttweet'] == 0 && $rows > $tweetLimit) {
    $startFrom = $rows - $tweetLimit;
}

$result = mysql_query("SELECT tweet_id, tweet_text, screen_name, profile_image_url
                       FROM tweets
                       WHERE tweet_id > $ts
                       ORDER BY tweet_id
                       LIMIT $startFrom, $tweetLimit")
          or die(mysql_error());
I have combined a PHP script to count words in a MySQL text field and update another field accordingly.
It works well with relatively small tables, but when I tried it with a really big table (10M records), of course I got "PHP Fatal error: Allowed memory size of 134217728 bytes exhausted".
Could somebody hint how to modify the script below to process the data row by row?
<?php
$con1 = mysqli_connect('localhost', 'USERNAME', 'PASSWORD', 'DATABASE');
if (!$con1) {
    die('Could not connect: ' . mysqli_connect_error());
}
$sql = "SELECT id FROM TableName";
$result = mysqli_query($con1, $sql);
while ($row = mysqli_fetch_assoc($result)) {
    $to = $row["id"];
    $sql1 = "SELECT textfield FROM TableName WHERE id = '$to'";
    $result1 = mysqli_query($con1, $sql1);
    $row1 = mysqli_fetch_assoc($result1);
    $words = str_word_count($row1['textfield'], 0);
    $sql2 = "UPDATE TableName SET wordcount = '$words' WHERE id = '$to'";
    $result2 = mysqli_query($con1, $sql2);
}
mysqli_close($con1);
?>
MySQL queries have the clause LIMIT o, n, so you can run
SELECT id FROM TableName LIMIT 0, 10
for example, to get only 10 elements from the start. The first number is the offset (the index you start from) and the second is the number of elements you expect to get. These are the ideas you need in order to make this work:
you will need to write a loop
in the loop you always get n elements (n could be 1, as you wanted, or more)
at each step you increment o by n, so the new offset starts where the previous results ended
you should ensure an order, like ORDER BY id, for example
you can wrap this loop around most of your code
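A minimal sketch of that loop, with the mysqli calls left as comments so the offset bookkeeping is visible on its own (batch size is an assumption; table and column names are the question's):

```php
<?php
// Walk the table in blocks of $n rows, bumping the offset $o by $n each pass.
$n = 1000;            // batch size (the "n" above)
$o = 0;               // current offset (the "o" above)
$totalRows = 3500;    // pretend SELECT COUNT(*) FROM TableName returned this
$offsets = [];        // recorded only to show the progression

while ($o < $totalRows) {
    // $res = mysqli_query($con1, "SELECT id, textfield FROM TableName ORDER BY id LIMIT $o, $n");
    // while ($row = mysqli_fetch_assoc($res)) { ...str_word_count() + UPDATE, as in the question... }
    // mysqli_free_result($res);   // release each batch before fetching the next
    $offsets[] = $o;
    $o += $n;         // the next block starts where this one ended
}

print_r($offsets);    // the visited offsets: 0, 1000, 2000, 3000
```

Freeing each result before fetching the next batch is what keeps memory flat, regardless of how many rows the table holds.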
I am making a web application where I want to save and display the 10 most recent pages viewed by each user. All I can come up with is: every time a user views a page, insert a row into a MySQL table called 'recentview', then fetch that data and display it as recently viewed pages in the user interface.
Now the problem is that I want to limit the number of entries for each user to 10 inside the MySQL table, so when a user visits an 11th page, the table should automatically delete the oldest row for that particular user; there must always be a maximum of 10 entries per user in the table.
So far I have been searching the web, and most of what I could find is people asking how to limit the entries for the whole table, while what I'm looking for is limiting the entries per user.
So far here is the PHP code :
<?php
$con = mysqli_connect('localhost', 'root', '', 'my_db');
$email = $_COOKIE['email'];

$recentsql = "INSERT INTO recent (email, page) VALUES ('$email', 'page1')";
$recentquery = mysqli_query($con, $recentsql);

$countsql = "SELECT COUNT(*) FROM recent WHERE email='$email'";
$countquery = mysqli_query($con, $countsql);
$countrow = mysqli_fetch_array($countquery);
$count = $countrow[0];

if ($count > 10) {
    // PHP code here to delete the oldest entries for this particular user ($_COOKIE['email'])
    // I could not figure out the code for this task, please help
}
?>
All help and better suggestions are very much appreciated. Thx
OK, I apologize if my question is not very clear, so here is an example of what the MySQL table looks like:
*I identify my users by their e-mail address
PID email recentpage
1 user1#user1.com page1
2 user1#user1.com some other page
3 user1#user1.com other page
4 user1#user1.com page2
5 user1#user1.com some more page
6 user1#user1.com more page
7 user1#user1.com pages
8 user1#user1.com different page
9 user1#user1.com still different page
10 user1#user1.com page in the web
11 user2#user2.com page1
12 user2#user2.com other page
13 user2#user2.com still different page
So what I want is to set a limit for each user (i.e. user1#user1.com, user2#user2.com, etc.) of only 10 recentpage rows in the table, and if they visit another new page, delete that user's oldest entry (in this case, for user1#user1.com, "page1") and insert the newest visited page. Hope this makes my question clearer. Thx for the help.
I'm not a hundred percent sure I get what you want, but I think this would do it:
<?php
$con = mysqli_connect('localhost', 'root', '', 'my_db');
$email = $_COOKIE['email'];

$countsql = "SELECT COUNT(*) FROM recent WHERE email='$email'";
$countquery = mysqli_query($con, $countsql);
$countrow = mysqli_fetch_array($countquery);
$count = $countrow[0];

if ($count >= 10) {
    // ORDER BY id ASC so the *oldest* row for this user is the one deleted
    $deletesql = "DELETE FROM `recent` WHERE `id` = (SELECT `id` FROM (SELECT id FROM `recent` WHERE `email` = '$email' ORDER BY `id` ASC LIMIT 1) as t)";
    $deletesql_res = mysqli_query($con, $deletesql);
}

$recentsql = "INSERT INTO recent (email, page) VALUES ('$email', 'page1')";
$recentquery = mysqli_query($con, $recentsql);
?>
You can use a subquery to delete everything except the user's 10 most recent rows. Note that MySQL won't let a DELETE reference its own target table directly in a subquery (and doesn't support LIMIT inside IN), so the inner select has to be wrapped in a derived table:
$query = "delete from recent
          where email = '$email'
          and id not in (
              select id from (
                  select id from recent
                  where email = '$email'
                  order by id desc limit 10
              ) as t
          )";
You can use the following query to show the 10 most recent pages without deleting any of the previous records. In the query, id is the primary key of your table:
SELECT * FROM recent WHERE email='$email' ORDER BY id DESC LIMIT 10
If you have $count rows for a user, you can retain the last 10 by deleting the oldest $count - 10:
$limit = $count - 10;
$deletesql = "DELETE FROM recent WHERE email='$email' ORDER BY id LIMIT $limit";
$deletequery = mysqli_query($con, $deletesql);
You would have to run a batch job periodically, and it should be capable of deleting the oldest data.
Currently I'm selecting the most recently updated row using the following PHP block; I have an AJAX loop which runs this block every few seconds to return the feed. I want to echo false to AJAX when the latest timestamp hasn't changed, so that AJAX doesn't duplicate results and fill my div (#feed) with identical content.
<?php
require_once 'db_conx.php';
$Result = mysql_query("SELECT * FROM profiles ORDER BY lastupdated DESC LIMIT 1") or die(mysql_error());
while ($row = mysql_fetch_array($Result)) {
    echo $row['name'];
}
mysql_close($con);
?>
Somewhere in the session, or in the requestor, you need to store the last fetched time. It would be better to store it as a session variable (this I presume is client-specific because different clients will have loaded at different times) and then fetch all records that have their lastupdated time greater than the last_fetched time.
Every time entries are fetched from the DB, just update the last_fetched variable to the current timestamp.
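A rough sketch of that idea (table and column names taken from the question; the query is only built here, not executed, and the initial epoch value is an assumption so the first poll returns everything):

```php
<?php
// Track this client's last fetch time in the session and only ask the DB
// for rows updated since then.
session_start();

if (!isset($_SESSION['last_fetched'])) {
    // First poll from this client: fetch everything.
    $_SESSION['last_fetched'] = '1970-01-01 00:00:00';
}

$since = $_SESSION['last_fetched'];
$sql = "SELECT * FROM profiles WHERE lastupdated > '$since' ORDER BY lastupdated DESC";

// $result = mysql_query($sql);  // echo the rows, or false when none came back

// Remember this poll's time so the next request skips what we just sent.
$_SESSION['last_fetched'] = date('Y-m-d H:i:s');
```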
If you are running this every 5 seconds, I would do something like
$Result = mysql_query("SELECT * FROM profiles WHERE lastupdated > ADDTIME(NOW(), '-00:00:05.00000') ORDER BY lastupdated DESC LIMIT 1") or die(mysql_error());
$num_rows = mysql_num_rows($Result);
if ($num_rows == 0) {
    return false;
} else {
    return true;
}
This will give you any rows that have been updated in the last 5 seconds, if it is older than this it should have been picked up in the last run.
Hope this helps
When you retrieve your results from the server I'm guessing you store the timestamp of the last query in some PHP variable in order to compare the returned results of the next query.
I would concatenate the timestamp into your query as a WHERE clause (quoted, since the column is presumably a DATETIME), so your SQL query becomes something like
$Result = mysql_query("SELECT * FROM profiles WHERE lastupdated > '" . $timestamp . "' ORDER BY lastupdated DESC LIMIT 1") or die (mysql_error());
It's also worth noting that when this executes, if more than one profile has been updated since the last update cycle, it will only return the most recently updated row.
Does this answer your question?
I insert into a table every time a user opens any post on my site; this way I get a real-time "what's happening on the site" feed:
mysql_query("INSERT INTO `just_watched` (`content_id`) VALUES ('{$id}')");
But now I have a problem: with over 100K hits every day, that's 100K new rows in this table every day. Is there any way to limit the table to a max of 100 rows, and if the max is exceeded, delete the oldest 90 and insert again, or something like that? I have no idea what the right way to do this is.
my table just_watched
ID - content_id
ID INT(11) - AUTO_INCREMENT
content_id INT(11)
The easiest way that popped into my head would be to use PHP logic to delete and insert your information. Every time a user opens a new post, you add a row to the database (this you are already doing).
The new stuff comes here:
Add a check before the insertion: before anything is inserted, first count all the rows; if the count does not exceed 100, just add the new row.
If it does exceed 100 rows, run a delete statement first, THEN insert the new row.
Example (pseudo-code):
$sql = "SELECT COUNT(*) FROM yourtable";
$count = $db->prepare($sql);
$count->execute();
if ($count->fetchColumn() >= 100) { // If the count is over 100
    ............... // Delete the oldest 90, leave 10, then insert a new row, leaving you at 11 after the delete.
} else {
    .................. // Keep inserting until you have 100, then repeat the process.
}
Hopefully this helps :)
Good luck.
What I would do: run a cron job at 12:00 AM every night that deletes all rows from the past day. But that's just some advice. Have a good one.
Use this query to delete the old rows while keeping the latest 100 (MySQL won't let a DELETE reference its own table in a subquery, hence the extra derived table):
DELETE FROM just_watched
WHERE ID NOT IN (SELECT ID FROM (SELECT ID FROM just_watched ORDER BY ID DESC LIMIT 100) AS t)
You can run it via cron every n period, where n = hours, minutes, or anything.
$numRows = mysql_num_rows(mysql_query("SELECT ID FROM just_watched"));
if ($numRows > 100) {
    // ORDER BY ID ASC so the *oldest* 90 rows are the ones removed
    mysql_query("DELETE FROM just_watched ORDER BY ID ASC LIMIT 90");
}
mysql_query("INSERT INTO `just_watched` (`content_id`) VALUES ('{$id}')");
I guess this should work fine.
You can get the number of rows in your table with:
$size = mysql_num_rows($result);
With the size of the table, you can check if it's getting too big, and then remove 90 rows:
// Get the 90 oldest rows
$query = "SELECT id FROM just_watched ORDER BY id ASC LIMIT 90";
$result = mysql_query($query);
// Go through them
while ($row = mysql_fetch_object($result)) {
    // Delete the row with this id
    $id = $row->id;
    mysql_query("DELETE FROM just_watched WHERE id = $id");
}
Another way would be to just delete an old row whenever you add a new row to the table. The only problem is that if something gets jammed, the table might get too big.
You may use
DELETE FROM just_watched WHERE ID NOT IN (SELECT ID FROM (SELECT ID FROM just_watched ORDER BY ID DESC LIMIT 100) AS t);
MySQL's DELETE does not accept an offset in its LIMIT clause, so the 100 rows to keep are selected in a derived table and everything else is deleted. If you always run this query before you insert a new row, it'll do the job for you.
MY SQL QUERY:
$q = mysql_query("SELECT * FROM `ads` WHERE keywords LIKE '%$key%' ORDER BY RAND()");
RESULTS: KEYWORD123
This query searches and returns one random row, but I want to show 2 random rows.
How do I do that?
I'm grabbing it using this:
$row = mysql_fetch_array($q);
if ($row <= 0) {
    echo 'Not found';
} else {
    echo $row['tab'];
}
That query (as-is) will return more than one row (assuming more than one row matches LIKE '%$key%'). If you're only seeing one record, you're probably not cycling through the result set, but rather pulling just the first row off it in your PHP code.
To limit the response to 2 records, you would append LIMIT 2 onto the end of the query. Otherwise, you'll get every row that matches the LIKE operator.
// Build our query
$sql = sprintf("SELECT tab
                FROM ads
                WHERE keyword LIKE '%s'
                ORDER BY RAND()
                LIMIT 2", '%' . $key . '%');

// Load the results of the query into a variable
$result = mysql_query($sql);

// Cycle through each returned record
while ($row = mysql_fetch_array($result)) {
    // do something with $row
    echo $row['tab'];
}
The while loop runs once per returned row. Each time through, the $row array represents the current record. The example above echoes the values stored in the tab field of your db table.
Remove your order by and add a LIMIT 2
That happens after the execution of the SQL.
Right now you must be doing something like
$r = mysql_fetch_array($q);
echo $r['keywords'];
What you need to do:
$q = "SELECT * FROM ads WHERE keywords LIKE '%$key%' ORDER BY RAND() LIMIT 2";
$res = mysql_query($q);
while ($r = mysql_fetch_array($res)) {
    echo "<br>" . $r['keywords'];
}
Hope that helps
Hope that helps
This query will return all rows containing $key; if it returns only one right now, that is simply by accident.
You want to add a LIMIT clause to your query, see http://dev.mysql.com/doc/refman/5.0/en/select.html
By the way, both the leading wildcard in LIKE '%... and ORDER BY RAND() are performance killers.
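If performance ever matters, one common workaround is to skip ORDER BY RAND() and instead fetch single rows at random offsets. A sketch only (the helper name is made up; it assumes $key is already escaped and that you've run a COUNT(*) for the same LIKE pattern first):

```php
<?php
// Fetch one matching row at a random 0-based offset instead of shuffling
// the whole result set with ORDER BY RAND().
function random_row_query($key, $matchCount) {
    $offset = mt_rand(0, max(0, $matchCount - 1));
    return "SELECT tab FROM ads WHERE keywords LIKE '%$key%' LIMIT $offset, 1";
}

// Get the count once: SELECT COUNT(*) FROM ads WHERE keywords LIKE '%$key%'
// then run this twice (re-rolling on a duplicate offset) to get 2 random rows.
echo random_row_query('KEYWORD123', 50), "\n";
```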