I have a MySQL database whose largest table has around 600,000 records; the other tables are fairly small in comparison. The data is somewhat normalized, but there is some duplication because the database is for personal use, and when I tried fully normalizing it I found the queries to be unnecessarily complex and slow from all of the joins. I am using PHP to execute the queries.
Now, for example, say that the 600,000-record table contains email addresses. Imagine I have about 10 applications/clients that need to retrieve an email address from this table based on conditions and joins, and no two clients should get the same email address. So I created a query that selects an email address, and then another query that uses the selected email address to update a flag field marking it as "in use", so that no other client can take the same address. The problem is that the select query takes about 25 seconds to execute, and when two clients execute it at the same time, they receive the same email address. The speed is not an issue, because the clients will only run this query once every few hours, but I do need the clients to get unique email addresses.
I'm fairly new to MySQL, so I don't know whether selecting the field and then setting a flag is the proper way to go about this. Is there a way to set the flag before I select the field? Also, I don't know much about transactions, but could this be solved using them?
Thanks!
Yes, transactions are the way to go; use a locking read (SELECT ... FOR UPDATE) so two clients cannot grab the same row:
START TRANSACTION;
SELECT email FROM myemails WHERE flag = 0 LIMIT 1 FOR UPDATE;
UPDATE myemails SET flag = 1 WHERE email = '$email';
COMMIT;
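If you are executing this from PHP, a minimal sketch of the same transaction using PDO might look like the following. It assumes a PDO connection in $pdo, an InnoDB table named myemails as in the question, and exceptions enabled as the error mode; treat it as a sketch rather than a drop-in solution:
function claim_email(PDO $pdo)
{
    $pdo->beginTransaction();
    try {
        // FOR UPDATE locks the selected row until COMMIT, so a concurrent client
        // running the same statement waits and is not handed the same address.
        $email = $pdo->query("SELECT email FROM myemails WHERE flag = 0 LIMIT 1 FOR UPDATE")
                     ->fetchColumn();
        if ($email === false) {
            $pdo->rollBack();
            return null; // no free email address left
        }
        $update = $pdo->prepare("UPDATE myemails SET flag = 1 WHERE email = ?");
        $update->execute([$email]);
        $pdo->commit();
        return $email;
    } catch (Exception $e) {
        $pdo->rollBack();
        throw $e;
    }
}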
Another possible approach is to generate a unique flag in PHP and update first, i.e.
$flag = uniqid();
UPDATE myemails SET flag = '$flag' WHERE flag IS NULL LIMIT 1;
SELECT email FROM myemails WHERE flag = '$flag';
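Again assuming a PDO connection in $pdo, a rough sketch of this claim-first approach could look like this; rowCount() tells you whether a free row was actually claimed:
$flag = uniqid('', true);
$claim = $pdo->prepare("UPDATE myemails SET flag = ? WHERE flag IS NULL LIMIT 1");
$claim->execute([$flag]);
if ($claim->rowCount() === 1) {                        // a row was successfully claimed
    $select = $pdo->prepare("SELECT email FROM myemails WHERE flag = ?");
    $select->execute([$flag]);
    $email = $select->fetchColumn();
}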
I am using the query below to fetch email records from a table, but it's not getting the full email list from the table.
select group_concat(email) as email from table
$temp=$Db1->fetch_array($sql);
$elist['emails']=$temp[email];
It returns only around 50 records, but I have 1400+ email records in the database.
Question:
How do I get the complete list of emails from the database using GROUP_CONCAT, i.e. comma-separated?
You can't. GROUP_CONCAT() has a maximum result length of 1024 bytes by default, and you're hitting that limit. It's possible to raise that limit, but not indefinitely, so that's not a good solution.
You don't need GROUP_CONCAT() here, though. What you want to do is fetch one row per email address, i.e.
SELECT email FROM table
...
while ($row = $db->fetch_array($sql)) {   // using the asker's DB wrapper
    $emails[] = $row["email"];
}
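If you still want a single comma-separated string on the PHP side, which seems to be the original goal, you can join the collected addresses afterwards. A minimal sketch using mysqli (the connection details are placeholders, and the asker's $Db1 wrapper would work the same way):
$mysqli = new mysqli('localhost', 'user', 'pass', 'dbname'); // placeholder credentials
$result = $mysqli->query('SELECT email FROM `table`');
$emails = array();
while ($row = $result->fetch_assoc()) {
    $emails[] = $row['email'];
}
$elist['emails'] = implode(',', $emails); // comma-separated list, with no 1024-byte limit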
You have to raise the default length limit of the GROUP_CONCAT() function by running the query below, and then run your own query.
System variable: group_concat_max_len
SET SESSION group_concat_max_len = 10000;
SELECT GROUP_CONCAT(email) AS email FROM TABLE;
You can set its value globally or per session; adjust the length based on your usage (the result is still capped by max_allowed_packet).
I am trying to create a simple support-request system in which users can submit their email address through a form, via jQuery/Ajax and PHP, into a MySQL database.
After that, I need to send a confirmation email to the owner of the inserted address every time a new request is inserted into the database. I was thinking about using an Ajax call against the database, but I am not sure how to select:
1- the latest inserted row, AND
2- only rows that have not been handled yet (there might be a situation where two inserts happen at the exact same time, and then
SELECT email FROM tbl-request ORDER BY id DESC LIMIT 1;
might return only the last inserted row even though there were at least two new entries).
Can you please let me know if there is a solution to do this through a MySQL trigger or jQuery Ajax?
Suffii, you can add a new column to the table, e.g. status, which contains 0 as its default value.
Now, every time you send an email, update this value to 1.
This way you can select the rows for which an email has not been sent yet, like this:
SELECT email FROM `tbl-request` WHERE status = 0;
It will select only the entries that still have status = 0, i.e. the ones that have not been emailed yet.
There can be many ways, but in my view this can also be a simple and effective one:
you can do this using a cron job.
Run a cron job every 5 minutes and use a flag to track whether the mail has been sent; after sending the mail, set the flag to 1, as sketched below.
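A rough sketch of what such a cron script could look like, assuming the status column described above, an id column as in the question's ORDER BY id, and a hypothetical send_confirmation_mail() helper:
$pdo = new PDO('mysql:host=localhost;dbname=mydb', 'user', 'pass'); // assumed connection details
$requests = $pdo->query("SELECT id, email FROM `tbl-request` WHERE status = 0")->fetchAll(PDO::FETCH_ASSOC);
foreach ($requests as $request) {
    if (send_confirmation_mail($request['email'])) {     // hypothetical mail helper
        $stmt = $pdo->prepare("UPDATE `tbl-request` SET status = 1 WHERE id = ?");
        $stmt->execute([$request['id']]);                 // flag it so it is never mailed twice
    }
}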
We can easily save the time of the last check in a database or a file. Doing it this way lets you keep the emailer system separate from how the record is inserted, which is what I gather you want, given that you're suggesting triggers or AJAX to handle it. This method will work even without write access to the database from the PHP script.
At the end of the email script, run:
$fh = @fopen('last-email', 'w');
if (!empty($fh)) {
    fwrite($fh, time());
    fclose($fh);
}
At the start, run:
$last_email_time = file_get_contents('last-email');
Then add a TIMESTAMP field to your table; it will automatically record the time the record was last edited or added.
Then your query will be:
$last_time_as_sql_date = date('Y-m-d H:i:s', $last_email_time);
$query = "SELECT email FROM tbl-request WHERE timestamp > '$last_time_as_sql_date' ORDER BY timestamp DESC LIMIT 1;";
How you actually run the script depends on your implementation; if it's on a server back end, you could run it every 5 minutes using crontab -e:
*/5 * * * * php /path/to/script.php
You could send the mail from PHP at the moment the request is inserted, but you may want to keep those processes separated.
To do so, an easy fix would be to add a field like 'ConfirmationMailed' to indicate that the mail was sent. That way you can simply query for requests that haven't been emailed yet.
A little bit more flexible would be to create a separate table tblRequestCommunication in which you store the communications about the request.
That table could have:
Id (PK), Date, RequestId
Subject
Content
CommunicationType
The communication type could be an enum, or a reference to a separate type table in which you store the types of communication to send. One of those types could be 'Automated confirmation message', and in the table you can even store the exact date/time, subject and content of that message.
Now, in your query, all you have to do is search for requests without such a confirmation:
SELECT r.email
FROM
tbl-request r
WHERE NOT EXISTS
( SELECT 'x' FROM tblRequestCommunication c
WHERE c.RequestId = r.RequestId
AND c.CommunicationTypeId = 1 /* Automated confirmation */)
This structure will allow you to expand this system for other types as well, for instance an automated mail when the request was closed:
SELECT r.email
FROM
tbl-request r
WHERE
r.State = 'Closed'
AND NOT EXISTS
( SELECT 'x' FROM tblRequestCommunication c
WHERE c.RequestId = r.RequestId
AND c.CommunicationTypeId = 14 /* Close notification */)
Also, you can store 'manual' e-mails and phone reports linked to the request in the same table, so you have a full history of communication.
So it's a bit of work to create one or two extra tables and change the query, but the capabilities of your system will be much greater.
I have an array with the personal info of 100,000 users (ID, name, email, etc.). I need to loop through each row of the array and insert a MySQL record into a table based on the row data. My problem is that I run out of memory after about 70,000 rows.
My code:
if (!empty($users)) {
    $c = 0;
    foreach ($users as $user) {
        $message = '...'; // some code to create the custom email
        queue_mail_to_send($user->user_email, $subject, $message, $db_options, $mail_options, $mail_queue);
    }
}
Background:
I am building an email system which sends out an email to the users of my site. The code above loops through the array of users and executes the function queue_mail_to_send, which inserts a MySQL row into an email queue table. (I am using a PEAR library to stagger the email sending.)
Question:
I know that I am simply exhausting the memory here by trying to do too much in one execution. Does anybody know a better approach than trying to execute everything in one big loop?
Thanks
I think reducing the payload of the script will be cumbersome and will not give you a satisfying result. If you have any possibility to do so, I would advise you to log which rows you have already processed, and have the script handle the next x rows per run. If you can use a cron job, you can stage the mailing and let the cron job add mails to the queue every 5 minutes, until all users are processed.
The easiest way would be to store the highest user id you have processed somewhere. I would not advise storing the number of users processed, because between batches a user can be added or removed, resulting in users not receiving the email. But if you order by user id (assuming you use an auto-incrementing column for the id!), you can be sure every user gets processed.
So your user query would be something like:
SELECT * FROM users WHERE user_id > [highest_processed_user_id] ORDER BY user_id LIMIT 1000
Then process your loop, and store the last user id:
if (!empty($users)) {
    $last_processed_id = null;
    foreach ($users as $user) {
        $message = '...'; // message creation magic
        queue_mail_to_send( /** parameters **/ );
        $last_processed_id = $user->id;
    }
    // batch done! store the processed user id
    $query = 'UPDATE mail_table SET last_processed_user_id = ' . $last_processed_id; // please use parameterized statements here
    // execute the query
}
And on the next execution, do it again until all users have received the mail.
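A minimal sketch of the start of each run, assuming PDO and the mail_table / last_processed_user_id names used in this answer:
$pdo = new PDO('mysql:host=localhost;dbname=mydb', 'user', 'pass');  // assumed connection details
$last_id = (int) $pdo->query("SELECT last_processed_user_id FROM mail_table")->fetchColumn();
$stmt = $pdo->prepare("SELECT * FROM users WHERE user_id > ? ORDER BY user_id LIMIT 1000");
$stmt->execute([$last_id]);
$users = $stmt->fetchAll(PDO::FETCH_OBJ);   // then run the foreach loop above on this batch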
I had exactly the same problem. The answer from @giorgio is the best solution.
But like Java or Python, we have "yield" in PHP; see http://php.net/manual/en/language.generators.syntax.php
Here is my sample code. My case was 50,000 records, and I also tested it successfully with 370,000 records, but it takes time.
// Inside a generator method: each item is yielded one at a time instead of building one big array.
$items = CustomerService::findAll();
foreach ($items as $item) {
    yield (new self())->loadFromResource($item);
}
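For the memory problem in this question, a more self-contained sketch of the generator idea could look like this; it assumes a PDO connection, an auto-incrementing user_id column, and the variables from the question for the call to queue_mail_to_send:
// A generator that fetches users in chunks, so only one chunk is in memory at a time.
function users_in_chunks(PDO $pdo, $chunkSize = 1000)
{
    $lastId = 0;
    do {
        $stmt = $pdo->prepare("SELECT * FROM users WHERE user_id > ? ORDER BY user_id LIMIT " . (int) $chunkSize);
        $stmt->execute([$lastId]);
        $rows = $stmt->fetchAll(PDO::FETCH_OBJ);
        foreach ($rows as $row) {
            $lastId = $row->user_id;
            yield $row;                    // hand one user at a time to the caller
        }
    } while (count($rows) === $chunkSize);
}

// Usage: memory stays flat even for hundreds of thousands of users.
foreach (users_in_chunks($pdo) as $user) {
    // build $message for this user, then queue it as in the question
    queue_mail_to_send($user->user_email, $subject, $message, $db_options, $mail_options, $mail_queue);
}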
You could split that operation into multiple operations, separated in time.
For instance, only allow your routine to process 40 emails per minute, or use an array of arrays to create "pages" of records (use the SQL LIMIT clause).
Then set the arrays to null and unset them when you no longer need that information.
I think you can use the MySQL IN clause rather than doing a foreach for every user.
Like
$user_ids = array(1, 2, 3, 4);
// Do something WHERE user_id IN ($user_ids);
And for sending mails you can use the PHPMailer class, supplying comma-separated email addresses in $to.
Use just one query, like:
INSERT INTO table_name (COL1, Col2,...) SELECT COL1, COL2 FROM other_table;
Is there a way I can select entries with certain data from the database? I have a lot of email addresses in the database, but I want to select only those from one domain. Is that even possible?
Sure - just use the LIKE operator.
SELECT email FROM Persons
WHERE email LIKE '%gmail.com'
A leading-wildcard search like this is not advisable, because MySQL is not able to use an index to speed up the SELECT query, and you mention you have lots of emails in the database.
Alternatively, you can use an additional field, e.g. hostname, to store just the hostname, and of course build an index on it. If you need to search for emails at gmail.com, you can then do a straight string comparison:
SELECT email FROM Persons
WHERE hostname='gmail.com';
Since a straight string comparison works well with a MySQL index, your query will be optimized.
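A rough sketch of setting this up, assuming a PDO connection, the Persons table from the earlier answer, and made-up column/index names; you would also need to keep hostname in sync on insert/update, for example in your application code or with a trigger:
$pdo->exec("ALTER TABLE Persons ADD COLUMN hostname VARCHAR(255)");
$pdo->exec("UPDATE Persons SET hostname = SUBSTRING_INDEX(email, '@', -1)");   // part after the last '@'
$pdo->exec("CREATE INDEX idx_persons_hostname ON Persons (hostname)");

$stmt = $pdo->prepare("SELECT email FROM Persons WHERE hostname = ?");
$stmt->execute(['gmail.com']);
$gmailAddresses = $stmt->fetchAll(PDO::FETCH_COLUMN);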
As ajreal points out, MySQL can't use indexes to optimise a LIKE query in the general case. However in the specific case of a trailing wildcard where the only % is at the very end of the pattern (effectively a "starts with" query), the optimiser can do a good job of speeding up the query using an index.
Therefore, if you were to add an additional indexed column storing the email address in reverse, you could efficiently query for
SELECT email FROM xyz WHERE reverse_email LIKE 'moc.liamg@%'
to find all gmail addresses, or LIKE 'ku.%' for all addresses under .uk domains, etc. You can have the database keep this column up to date for you using triggers, so it doesn't affect your existing update code:
CREATE TRIGGER emailinsert BEFORE INSERT ON xyz
FOR EACH ROW SET NEW.reverse_email = REVERSE(NEW.email);
CREATE TRIGGER emailupdate BEFORE UPDATE ON xyz
FOR EACH ROW SET NEW.reverse_email = REVERSE(NEW.email);
You need to use the MySQL LIKE clause:
SELECT * FROM email_table WHERE email LIKE "%gmail.com"
I'm not even sure if this is possible (I'm new to PHP).
Anyway, what I want to do is this:
mysql_query("SELECT * FROM user_table WHERE concat(username,'@',domain)='$username' LIMIT=1");
OK, so $username is an email address submitted by a user to search the database and check that the user exists. In user_table the addresses are not stored in a single column; the actual username and the domain are stored in separate columns.
For example, the username could be bob and the domain could be website.com.au.
Then, when a user wants to search for that user, they type in bob@website.com.au.
This goes into the query above.
So, should it work or not? If not, how can I make it work, or what suggestions do you have for me?
As BobbyJack has mentioned, this is a slow way of locating a user record.
If you cannot store the email address in a single column and place an index on that column, split the string in PHP and make your query:
SELECT * FROM user_table WHERE `username` = '$username' AND `domain` = '$domain'
You could then create a unique index combining domain + username, so you wouldn't need LIMIT 1.
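A quick sketch of that split-then-query approach, using PDO with bound parameters instead of the mysql_* function from the question ($pdo is an assumed connection):
list($user, $domain) = explode('@', $username, 2);   // $username is the full address the user submitted
$stmt = $pdo->prepare("SELECT * FROM user_table WHERE username = ? AND domain = ?");
$stmt->execute([$user, $domain]);
$row = $stmt->fetch(PDO::FETCH_ASSOC);               // false if no matching user exists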
I probably worded the question slightly wrong.
Anyway, this is what I have done: "SELECT * FROM virtual_user WHERE concat_ws('@',username,domain)='$username'"
I no longer need LIMIT=1; I probably never needed it, since all records in the table are unique, so it will always return either one row or nothing at all.
It isn't slow in my opinion, but then again I'm not really sure what to compare it to. We have about 7,000+ records it sorts through. Is there any way to get it to tell you how long the query took to complete?
I would like to put both the username and domain into a single indexed field, but it's for a Postfix mail server and I'm not allowed, or game, to play with the queries it uses, especially not on a functioning server that actually handles mail.