Report generation based on MySQL query - PHP

I have these 3 tables, namely form, form_responses, and metrics, with the following structure:
form
->id
->phone
->calldatetime
form_responses
->id
->form_id
->metrics_id
->response
metrics
->id
->description
->question
And I want to make a report with a format something like this:
|Metrics Description|Metrics Question|Phone1|Phone2|Phone3|Phone4|
|-------------------|----------------|------|------|------|------|
| Sample            | Sample         | Yes  | Yes  | Yes  | Yes  |
Is it possible to produce this output with a MySQL query alone? Please note that Phone1, Phone2, Phone3... scales horizontally. Originally I need that output in an Excel file; I have already tried this using Laravel PHP and http://www.maatwebsite.nl/laravel-excel/docs
$query = "SELECT id, phone FROM qcv.forms WHERE calldatetime >= '$from' AND calldatetime <= '$to' ORDER BY id ASC LIMIT 250 ;";
$phone = DB::connection('mysql')->select($query);
$metrics = Metric::all();
$metric_start = 10;
$start = "D";
$count = 10;
foreach ($phone as $key => $value2) // Populate Phone Numbers Horizontally
{
$sheet->cell($start.'9', $value2->phone);
// This will fill the responses for each number
foreach ($metrics as $key => $value)
{
$responses = FormResponses::where('form_id', '=', $value2->id)->where('metrics_id', '=', $value->id)->get();
$sheet->cell($start.$count, $responses[0]->response);
$count++;
}
$start++;
$count = 10;
}
foreach ($metrics as $key => $value) // Populate Metrics Vertically
{
$sheet->cell('C'.$metric_start, $value->question);
$sheet->cell('B'.$metric_start, $value->description);
$sheet->cell('A'.$metric_start, $value->metrics_name);
$metric_start++;
}
But this method seems really slow, especially in processing, so I'm wondering if I could produce the output with a MySQL query alone.

To get multiple sub-records per row in a one-to-many relationship using SQL, you would have to use a sub-query:
SELECT
    m.description,
    m.question,
    (SELECT phone FROM form f1 WHERE f1.id = m.id AND ...
        /* some other unique criteria */) AS Phone1,
    (SELECT phone FROM form f2 WHERE f2.id = m.id AND ...
        /* some other unique criteria */) AS Phone2,
    (SELECT phone FROM form f3 WHERE f3.id = m.id AND ...
        /* some other unique criteria */) AS Phone3,
    (SELECT phone FROM form f4 WHERE f4.id = m.id AND ...
        /* some other unique criteria */) AS Phone4
FROM metrics m
However, you may not have any columns to uniquely identify each form in this way, and your SQL engine may not allow a third level of nesting of sub-queries (which is another way to individually select records from the same table).
So here's one other variation that would work. It should be slightly less code and fewer database connections, so it should perform better, even if you find it less intuitive. Here's the SQL portion:
SELECT
    m.description,
    m.question,
    f.phone
FROM metrics m
INNER JOIN form f ON f.id = m.id
And then in PHP:
$lastid = '';
$phone_count = 0;
foreach ($record as $key => $value) {
    $phone[$phone_count] = $value->phone;
    $phone_count++;
    if ($lastid != $value->id) {
        // new record
        $sheet->cell( /* whatever */ );
        $phone_count = 0;
    }
    $lastid = $value->id;
}
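For completeness, here is a minimal sketch (my own, not part of the answer above) of how the single-join idea could be used to fill the question's spreadsheet: one query over form_responses, indexed in PHP by form and metric, then written out with the same $sheet->cell() calls the question already uses. Table and column names are taken from the structure given in the question, so treat the exact query as an assumption.
// Sketch: pivot one joined result set into the horizontal phone layout.
// Assumes the form/form_responses/metrics structure from the question and
// the Laravel Excel $sheet object that is already in scope there.
$rows = DB::connection('mysql')->select(
    "SELECT fr.form_id, f.phone, fr.metrics_id, fr.response
     FROM form_responses fr
     JOIN form f ON f.id = fr.form_id
     WHERE f.calldatetime BETWEEN ? AND ?
     ORDER BY fr.form_id, fr.metrics_id",
    array($from, $to)
);

// Index responses as [form_id][metrics_id] => response in one pass.
$byForm = array();
$phones = array();
foreach ($rows as $r) {
    $byForm[$r->form_id][$r->metrics_id] = $r->response;
    $phones[$r->form_id] = $r->phone;
}

// Write the metrics column once, remembering which row each metric landed on.
$metricRow = array();
$rowNum = 10;
foreach ($metrics as $metric) {
    $sheet->cell('B'.$rowNum, $metric->description);
    $sheet->cell('C'.$rowNum, $metric->question);
    $metricRow[$metric->id] = $rowNum;
    $rowNum++;
}

// One column per form/phone; PHP's string increment turns 'Z' into 'AA'.
$col = 'D';
foreach ($phones as $formId => $phoneNumber) {
    $sheet->cell($col.'9', $phoneNumber);
    foreach ($metricRow as $metricId => $rowNum) {
        $response = isset($byForm[$formId][$metricId]) ? $byForm[$formId][$metricId] : '';
        $sheet->cell($col.$rowNum, $response);
    }
    $col++;
}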

Related

Creating dataset for network graph using data from database

I get data from my database and loop through it, comparing employees of one company with employees of other companies to create an array of nodes and edges. Nodes are companies and edges are employees working for both companies.
Comparing so many variables seems to be slowing the process a lot and is really inefficient, so I am looking for a better way of building the array/JSON object.
Here is what the database looks like: http://imgur.com/8iVJfwW
The final JSON object for d3 should look like:
{"nodes":[{"fullName":"Anglo American plc"},{"fullName":"Associated British Foods plc"},{"fullName":"ARM Holdings plc"},{"fullName":"Dixons Carphone plc"},{"fullName":"Diageo plc"},{"fullName":"Direct Line Insurance Group PLC"},{"fullName":"easyJet plc"},{"fullName":"GKN plc"},{"fullName":"Hammerson plc"},{"fullName":"International Consolidated Airlines Group, S.A."},{"fullName":"Imperial Brands PLC"},{"fullName":"intu properties plc"},{"fullName":"Intertek Group plc"},{"fullName":"ITV plc"},{"fullName":"Johnson Matthey Plc"},{"fullName":"Kingfisher plc"},{"fullName":"Lloyds Banking Group plc"},{"fullName":"Mediclinic International plc"},{"fullName":"Merlin Entertainments plc"},{"fullName":"National Grid plc"},{"fullName":"Next Plc"},{"fullName":"Provident Financial plc"},{"fullName":"Pearson plc"},{"fullName":"Reckitt Benckiser Group plc"},{"fullName":"Royal Dutch Shell plc"},{"fullName":"RELX PLC"},{"fullName":"Rio Tinto plc"},{"fullName":"RSA Insurance Group plc"},{"fullName":"SABMiller plc"},{"fullName":"J Sainsbury plc"},{"fullName":"Sky plc"},{"fullName":"Standard Life plc"},{"fullName":"SSE plc"},{"fullName":"Severn Trent Plc"},{"fullName":"Travis Perkins plc"},{"fullName":"Tesco PLC"},{"fullName":"Taylor Wimpey plc"},{"fullName":"United Utilities Group PLC"},{"fullName":"Worldpay Group plc"},{"fullName":"Whitbread PLC"}],"edges":[{"source":0,"target":12,"officers":["PARKER, Thomas, Sir"]},{"source":0,"target":19,"officers":["STEVENS, Anne"]},{"source":0,"target":47,"officers":["GROTE, Byron"]},{"source":1,"target":14,"officers":["BASON, John"]},{"source":2,"target":13,"officers":["PUSEY, Stephen"]},{"source":2,"target":51,"officers":["KENNEDY, Christopher"]},{"source":3,"target":7,"officers":["BARKER, Glyn"]},{"source":3,"target":13,"officers":["WHEWAY, Jonathan"]},{"source":4,"target":9,"officers":["REYNOLDS, Paula"]}]};
What I am doing is executing this query:
SELECT
CD.Company_ID,
CD.Company_Name,
OD.Officer_Name,
CO.Officer_Role
FROM
Company_Details CD
INNER JOIN Company_Officer CO
ON CD.Company_ID = CO.Company_ID
INNER JOIN Officer_Details OD
ON CO.Officer_ID = OD.Officer_ID
WHERE CD.Company_Index='FTSE 100' AND
CO.Resigned_On='' AND
CO.Officer_ID IN
( SELECT CO2.officer_id
FROM Company_Officer CO2
INNER JOIN Company_Details CD2
ON CO2.Company_ID = CD2.Company_ID
WHERE CO2.Resigned_On='' AND CD2.Company_Index ='FTSE 100'
GROUP BY CO2.officer_id
HAVING Count( DISTINCT CO2.company_id ) > 1
)
ORDER BY `CD`.`Company_ID` ASC;
This gives me the names of officers and companies, only officers that work for more than 1 company (to create edges), and only officers that have not resigned.
First I create $nodes by looping through the query results and keeping only unique companies.
while ($row = mysqli_fetch_array($data)) {
    array_push($Officers_DB, array("name" => $row['Officer_Name'], "company" => $row['Company_Name']));
    if (!valueExists($nodes, 'fullName', $row['Company_Name'])) { // Get rid of duplicates
        array_push($nodes, array("fullName" => $row['Company_Name']));
    }
}
Then I create $edges by comparing each company in my $nodes array with every other company, checking that I am not comparing a company with itself. For each pair, I loop through all officers from the query and compare them with all other officers from the query, again checking that it isn't the same officer; then I check whether the officer from the first loop works for company $i and the officer from the second loop works for company $j. So if there is a person with the same name working for 2 different companies, I create an edge in $edges.
$edges = array();
for ($i = 0; $i < count($nodes); $i++) {
    for ($j = $i + 1; $j < count($nodes); $j++) {
        if ($nodes[$i]['fullName'] != $nodes[$j]['fullName']) {
            foreach ($Officers_DB as $Officer) {
                if ($Officer['company'] == $nodes[$i]['fullName']) {
                    foreach ($Officers_DB as $Officer2) {
                        if ($Officer2['company'] == $nodes[$j]['fullName']) {
                            if ($Officer['name'] == $Officer2['name']) {
                                array_push($edges, array("source" => $i, "target" => $j, "officers" => array($Officer['name'])));
                            }
                        }
                    }
                }
            }
        }
    }
}
foreach ($edges as $i => &$edge) {
    for ($j = $i + 1; $j < count($edges); $j++) {
        if ($edge['source'] == $edges[$j]['source'] && $edge['target'] == $edges[$j]['target']) {
            foreach ($edges[$j]['officers'] as $officer) {
                array_push($edge['officers'], $officer);
            }
            array_splice($edges, $j, 1);
        }
    }
}
This method works but is really slow and inefficient, and I was wondering about other ways of achieving the same result.
Here is what the database looks like. Company_Details: http://i.imgur.com/bzDBIPI.png (companies are unique),
Officer_Details: http://i.imgur.com/xce9DW5.png (officers are unique),
and Company_Officer: http://i.imgur.com/SNYOx0i.png, which is the relational table between the other two (one-to-many and many-to-one).
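As an illustration only (this is not from the original post): since the query already returns one row per officer/company pairing, the pairwise loops can be avoided by indexing the rows by officer name and only pairing the companies each officer actually links. A sketch, assuming the same mysqli result $data and column names as above:
// Sketch: build nodes and edges in roughly one pass over the result set,
// instead of comparing every officer against every other officer for every company pair.
$nodes = array();       // company name => node index
$byOfficer = array();   // officer name => list of node indexes
while ($row = mysqli_fetch_array($data)) {
    $company = $row['Company_Name'];
    if (!isset($nodes[$company])) {
        $nodes[$company] = count($nodes);
    }
    $byOfficer[$row['Officer_Name']][] = $nodes[$company];
}

$edges = array();       // "source-target" => edge
foreach ($byOfficer as $officer => $companies) {
    $companies = array_values(array_unique($companies));
    sort($companies);
    // Every pair of companies this officer links gets (or extends) an edge.
    for ($i = 0; $i < count($companies); $i++) {
        for ($j = $i + 1; $j < count($companies); $j++) {
            $key = $companies[$i] . '-' . $companies[$j];
            if (!isset($edges[$key])) {
                $edges[$key] = array("source" => $companies[$i], "target" => $companies[$j], "officers" => array());
            }
            $edges[$key]['officers'][] = $officer;
        }
    }
}

$result = array(
    "nodes" => array_map(function ($name) { return array("fullName" => $name); }, array_keys($nodes)),
    "edges" => array_values($edges),
);
echo json_encode($result);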

Pull number of rows from a SQL query and put it in PHP as a variable?

This is 4 queries put into one. This is really old code, and once I can make this work we can update it later to PDO for security. What I am trying to do is count the rows from
SELECT COUNT(*)
FROM dialogue_employees d_e,
     dialogue_leaders d_l
WHERE d_l.leader_group_id = d_e.leader_group_id
and use it in a formula where I also count how many rows have dialogue.status = 1.
The formula at the bottom creates a percentage from the results. This is PHP and MySQL, and I wasn't sure of the best way to count the rows and put them into PHP variables to be used in the formula at the bottom.
function calculate_site_score($start_date, $end_date, $status){
    while ($rows = mysql_fetch_array($sqls)) {
        $query = "
            SELECT
                dialogue.cycle_id,
                $completecount = sum(dialogue.status) AS calculation,
                $total_employees = count(dialogue_employees AND dialogue_leaders), dialogue_list.*,
            FROM dialogue,
                (SELECT * FROM dialogue_list WHERE status =1) AS status,
                dialogue_employees d_e,
                u.fname, u.lname, d_e.*
                user u,
                dialogue_list,
                dialogue_leaders d_l
            LEFT JOIN dialogue_list d_list
                ON d_e.employee_id = d_list.employee_id,
            WHERE
                d_l.leader_group_id = d_e.leader_group_id
                AND d_l.cycle_id = dialogue.cycle_id
                AND u.userID = d_e.employee_id
                AND dialogue_list.employee_id
                AND site_id='$_SESSION[siteID]'
                AND start_date >= '$start_date'
                AND start_date <= '$end_date'";
        $sqls = mysql_query($query) or die(mysql_error());
    }
    $sitescore = ($completecount / $total_employees) * 100;
    return round($sitescore, 2);
}
If you separate out your queries you will gain more control over your data. You have to be careful about what you're counting; it's pretty crowded in there.
If you just want to clean up your function, you can stack your queries like this so they make more sense; that function is very crowded.
function calculate_site_score($start_date, $end_date, $status){
    $query = "select * from dialogue;";
    if ($result = $mysqli->query($query)) {
        // iterate your result
        $neededElem = $result['elem'];
        $query = "select * from dialogue_list where status = 1 and otherElem = " . $neededElem . ";";
        // give it a name other than $sqls, something that makes sense.
        $list = $mysqli->query($query);
        // iterate list, and parse results for what you need
        foreach ($list as $k => $v) {
            // go a level deeper, or calculate, rinse and repeat
        }
    }
}
Then do your counts separately.
So it would help to separate the queries out, each on their own.
Here is a count example: How do I count columns of a table
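To make the advice concrete, here is a minimal sketch (mine, not the answerer's) of the two counts done as separate prepared statements with mysqli, followed by the percentage formula from the question. The exact filters (the status = 1 condition and the date range on dialogue) are assumptions based on the original query.
// Sketch: two separate COUNT queries, then the percentage calculation.
// $mysqli is an open mysqli connection; dates come in as 'YYYY-MM-DD' strings.
function calculate_site_score($mysqli, $start_date, $end_date)
{
    // Total employees in scope (the join from the question).
    $stmt = $mysqli->prepare(
        "SELECT COUNT(*) FROM dialogue_employees d_e
         JOIN dialogue_leaders d_l ON d_l.leader_group_id = d_e.leader_group_id"
    );
    $stmt->execute();
    $stmt->bind_result($total_employees);
    $stmt->fetch();
    $stmt->close();

    // Completed dialogues (status = 1) in the date range.
    $stmt = $mysqli->prepare(
        "SELECT COUNT(*) FROM dialogue
         WHERE status = 1 AND start_date >= ? AND start_date <= ?"
    );
    $stmt->bind_param('ss', $start_date, $end_date);
    $stmt->execute();
    $stmt->bind_result($completecount);
    $stmt->fetch();
    $stmt->close();

    if ($total_employees == 0) {
        return 0;   // avoid division by zero
    }
    return round(($completecount / $total_employees) * 100, 2);
}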

Performance: SQL heavy join vs multiple small requests

I have the following MySQL database structure:
[Table - Category1]
[Table Category1 -> Category2 ] (One to N relation)
[Table - Category2]
[Table Category2 -> Item ] (One to N relation)
[Table - Item]
and I want to get everything into an array in PHP with the following structure
$arr[$i]['name'] = 'name of something in category1';
$arr[$i]['data'][$j]['name'] = 'name of something in category2';
$arr[$i]['data'][$j]['data'][$k]['name'] = 'name of something in item';
So basically I don't know whether I should use one "heavy" SQL request with JOINs, like the following one, or use an iterative method.
The join request
SELECT c1.name as c1name, c2.name as c2name, i.name
FROM category1 c1
LEFT JOIN category1_to_category2 c1tc2 ON c1.id = c1tc2.id_category1
LEFT JOIN category2 c2 ON c1tc2.id_category2 = c2.id
LEFT JOIN category2_to_item c2ti ON c2.id = c2ti.id_category2
LEFT JOIN item i ON c2ti.id_item = i.id
The iterative method
$sql = 'SELECT id, name FROM category1';
$result = $mysqli->query($sql);
$arr = array();
$i = 0;
while ($arr[$i] = $result->fetch_assoc()) {
    $join = $mysqli->query('SELECT c2.id, c2.name FROM category2 c2 LEFT JOIN category1_to_category2 c1tc2 ON c2.id = c1tc2.id_category2 WHERE c1tc2.id_category1 = ' . $arr[$i]['id']);
    $j = 0;
    while ($arr[$i]['data'][$j] = $join->fetch_assoc()) {
        /* same request as above but with items */
        $j++;
    }
    $i++;
}
The iterative solution will make around 10 * 20 requests, which seems like a lot to me; that's why I would choose the first solution (a single request with 4 JOINs).
However, with the single-request solution, my array will look like this:
$arr[0]['c1name'];
$arr[0]['c2name'];
$arr[0]['iname'];
And it will require some PHP processing to obtain the desired array, which I need in order to display tabs on an HTML page. So my question is: is it better to have one big SQL request with some PHP array manipulation, or multiple small requests without the PHP array manipulation? I know that in most cases getting all the data from SQL is the better solution, but in this case I'm not sure. By the way, my only consideration is the loading time of my web page.
Thanks in advance for your help =).
It is typically better, and your example is no exception, to have the SQL server do as much of the data formatting and iteration as possible, as SQL servers are typically more efficient at the task than common programming languages.
Add to this that you are cutting down on the query load of the server, and you have a very good reason for using complex joins.
The only downside is that complex SQL queries can be hard to format and debug; if you are not already using a 3rd-party SQL tool, I would recommend getting one.
To go with the answer by Wobbles (that I agree with), I would suggest that you do a single query but you store the last key for each of c1name, c2name and iname. When these change you increment the relevant array subscript and initialise the lower level ones again to build up your array.
Something like this:-
<?php
$sql = "SELECT c1.name AS c1name, c2.name AS c2name, i.name AS iname
        FROM category1 c1
        LEFT JOIN category1_to_category2 c1tc2 ON c1.id = c1tc2.id_category1
        LEFT JOIN category2 c2 ON c1tc2.id_category2 = c2.id
        LEFT JOIN category2_to_item c2ti ON c2.id = c2ti.id_category2
        LEFT JOIN item i ON c2ti.id_item = i.id";
$result = $mysqli->query($sql);

$arr = array();
$i = 0;
$j = 0;
$k = 0;
$c1name = '';
$c2name = '';
$iname = '';
while ($row = $result->fetch_assoc())
{
    switch (true)
    {
        case $row['c1name'] != $c1name :
            $i++;
            $j = 0;
            $k = 0;
            $arr[$i]['name'] = $row['c1name'];
            $arr[$i]['data'][$j]['name'] = $row['c2name'];
            $arr[$i]['data'][$j]['data'][$k]['name'] = $row['iname'];
            break;
        case $row['c2name'] != $c2name :
            $j++;
            $k = 0;
            $arr[$i]['data'][$j]['name'] = $row['c2name'];
            $arr[$i]['data'][$j]['data'][$k]['name'] = $row['iname'];
            break;
        default :
            $k++;
            $arr[$i]['data'][$j]['data'][$k]['name'] = $row['iname'];
            break;
    }
    $c1name = $row['c1name'];
    $c2name = $row['c2name'];
    $iname = $row['iname'];
}
As an aside there is some code at work that is used to generate a menu. Just 2 levels, and it was originally coded as one query for the first level and then one query for each of the records in the first level to get all the items below it. Not complex (there are only ~16 items in the first level, and on average under 10 items below each of those). I rewrote that to a single joined query. Typical time to generate that menu dropped from 0.25 seconds down to 0.004 seconds. It is easy for the time taken sending queries to the database to rapidly become excessive.

Optimizing a while loop that contains PDO queries

Information
Currently building a notification page which lists all of the logged-in user's notifications, each containing information about the notification.
For example, without information:
You have an unread message
With information:
<Sarah> Sent you a message
Problem
Because the notifications require data such as a username (for message notifications) or an article title (say you're following an author and they release a new blog post; one notification would need to pull the username from the users table and also the title of the blog post from the blog table), this causes my page to lag even on localhost, which I'm guessing would get significantly worse once uploaded and tested in the wild.
Current Code
function showNotifications($userid){
    $STH = $this->database->prepare('SELECT * FROM notifications WHERE user_id = :userid ORDER BY timestamp DESC');
    $STH->execute(array(':userid' => $userid));
    while ($row = $STH->fetch(PDO::FETCH_ASSOC)) {
        $this->sortNotif($row);
    }
}
A quick explanation of the function below: because I have different types of notifications, I created a set of IDs for the specific types, for example type 1 = new message, type 2 = new blog post.
function sortNotif($notif){
    switch ($notif['type']) {
        case "1":
            $msg = $this->getMessageData($notif['feature_id']);
            $user = $this->userData($msg['sender']);
            echo '<li><i>'.timeAgo($notif['timestamp']).'</i>'.$user['first_name'].' sent you a message</li>';
            break;
    }
}
As you can see, just showing that a user has a new message creates 2 queries, and once looped through 40 or so notifications, across 100 or so users, this becomes a strain on the server.
Final Words
If anyone needs more information please ask and I'll be sure to update this question asap, thanks!
Edit
Below are the table structures, as requested in the comments.
notifications
id | user_id | feature_id | type | timestamp | read
users
id | username | password | first_name | last_name | email | verify_hash | avatar | type
messages
id | receiver | sender | replying_to | deleted | body | timestamp | read
Change of plan, since I misunderstood the setup.
You will want to pull all of the data from each of the 'type' tables in one go, rather than on a per-notification basis. This means that you will need to loop through your notifications twice: once to grab all the ids and appropriate types, and then a second time to output the results.
function showNotifications($userid){
    $STH = $this->database->prepare('SELECT * FROM notifications WHERE user_id = :userid ORDER BY timestamp DESC');
    $STH->execute(array(':userid' => $userid));

    // Centralized book keeping for the types.
    // When you add a new type to the entire system, add it here as well.
    $types = array();

    // Add the first notification type
    $types["1"] = array();
    // "query" is pulling all the data you need concerning a notification
    $types["1"]["query"] = "SELECT m.id, u.username, u.first_name FROM messages m, users u WHERE m.sender = u.id AND m.id IN ";
    // "ids" will hold the relevant ids that you need to look up.
    $types["1"]["ids"] = array();

    // A second type, just for show.
    // $types["2"] = array();
    // $types["2"]["query"] = "SELECT a.id, u.username, u.first_name FROM articles a, users u WHERE a.sender = u.id AND a.id IN ";
    // $types["2"]["ids"] = array();

    // Use fetchAll to gather all of the notifications into an array
    $notifications = $STH->fetchAll();

    // Walk through the notifications array, placing the notification id into the correct
    // "ids" array in the $types array.
    for ($i = 0; $i < count($notifications); $i++) {
        $types[$notifications[$i]['type']]["ids"][] = $notifications[$i]['feature_id'];
    }

    // Walk through the types array, hit the database once for each type of notification that has ids.
    foreach ($types as $type_id => $type) {
        if (count($type["ids"]) > 0) {
            $STH = $this->database->prepare($type["query"] . "( " . implode(",", $type["ids"]) . " )");
            $STH->execute();
            // Creates a hash table with the primary key as the array key
            $types[$type_id]['details'] = $STH->fetchAll(PDO::FETCH_GROUP|PDO::FETCH_ASSOC);
            $types[$type_id]['details'] = array_map('reset', $types[$type_id]['details']);
            // run array_map to make it easier to work with, otherwise it looks like this:
            // $results = array(
            //     1234 => array(0 => array('username' => 'abc', 'first_name' => '[...]')),
            //     1235 => array(0 => array('username' => 'def', 'first_name' => '[...]')),
            // );
        }
    }

    // Now walk through notifications again and write out based on notification type,
    // referencing $types[<notification type>]["details"][<message id>] for the details
    for ($i = 0; $i < count($notifications); $i++) {
        // check to see if details for the specific notification exist.
        if (isset($types[$notifications[$i]['type']]["details"][$notifications[$i]['feature_id']])) {
            $notification_details = $types[$notifications[$i]['type']]["details"][$notifications[$i]['feature_id']];
            switch ($notifications[$i]['type']) {
                case "1":
                    echo '<li><i>'.timeAgo($notifications[$i]['timestamp']).'</i>' . $notification_details['first_name'].' sent you a message</li>';
                    break;
            }
        }
    }
}
Update: added logic to skip a notification if no details are pulled (e.g. either the message or the user was deleted).
I think you want to run a single query that gathers all the info, either via joins or a more complicated WHERE statement.
Option 1: this might need to be tweaked so as not to produce the Cartesian product of the tables
SELECT n.id, n.type, m.id, m.body, u.username, u.first_name
FROM notifications n, messages m, users u
WHERE n.user_id = :userid AND m.id = n.feature_id AND u.id = m.sender
Option 2: if the table aliases don't work, then you'll need to replace them with the full table names
SELECT n.id, n.type, m.id, m.body, u.username, u.first_name
FROM notifications n
JOIN messages m
ON n.feature_id = m.id
JOIN users u
ON m.sender = u.id
WHERE n.user_id = :userid
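As a usage sketch only (not part of either answer), the second joined query could replace the per-notification lookups in showNotifications() for message notifications; other types would need their own join or a UNION. Column and class names follow the question's code, so treat the exact statement as an assumption.
// Sketch: one prepared statement fetches message notifications with their sender.
$STH = $this->database->prepare(
    'SELECT n.timestamp, u.first_name
     FROM notifications n
     JOIN messages m ON m.id = n.feature_id
     JOIN users u ON u.id = m.sender
     WHERE n.user_id = :userid AND n.type = 1
     ORDER BY n.timestamp DESC'
);
$STH->execute(array(':userid' => $userid));
while ($row = $STH->fetch(PDO::FETCH_ASSOC)) {
    echo '<li><i>' . timeAgo($row['timestamp']) . '</i>' . $row['first_name'] . ' sent you a message</li>';
}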

MySQL Selecting million records to generate urls

I am currently getting 2 million records from different tables to generate URLs for a sitemap. The script eats up resources and uses 100% of the server's capacity.
The query:
SELECT CONCAT("/url/profile/id/",u.id,"/",nickname) as url FROM users AS u
UNION ALL
Select CONCAT("url/city/", c.id, "/paramId/",p.id,"/",Replace(p.title, " ", "+"),"/",r.region_Name,"/",c.city_Name) AS url
From city c
Join region r On r.id = c.id_region
Join country country On country.id = c.id_country
cross join param p
Where country.used = 1
And p.active = 1
I store the result in an array $url_list and then process it to create the sitemap, but it takes time and too many resources.
I tried to get the data in batches using LIMIT 0,50000, but getting the max row count for paging also takes time, and the code doesn't look good since I have to run two queries over a large amount of data.
$url_list = array();
$maxrow = SELECT COUNT(*) AS max FROM (
    SELECT CONCAT("/url/profile/id/", u.id, "/", nickname) AS url
    FROM users AS u
    UNION ALL
    SELECT CONCAT("url/city/", c.id, "/paramId/", p.id, "/", REPLACE(p.title, " ", "+"), "/", r.region_Name, "/", c.city_Name) AS url
    FROM city c
    JOIN region r ON r.id = c.id_region
    JOIN country country ON country.id = c.id_country
    CROSS JOIN param p
    WHERE country.used = 1
      AND p.active = 1
) AS tmp;
$limit = 50000;
$bybatch = ceil($maxrow / $limit);
$start = 0;
for ($i = 0; $i < $bybatch; $i++) {
    // run this query and store the rows in $result
    SELECT CONCAT("/url/profile/id/", u.id, "/", nickname) AS url
    FROM users AS u
    UNION ALL
    SELECT CONCAT("url/city/", c.id, "/paramId/", p.id, "/", REPLACE(p.title, " ", "+"), "/", r.region_Name, "/", c.city_Name) AS url
    FROM city c
    JOIN region r ON r.id = c.id_region
    JOIN country country ON country.id = c.id_country
    CROSS JOIN param p
    WHERE country.used = 1
      AND p.active = 1
    LIMIT $start, $limit;
    $start += $limit;
    // push the batch onto $url_list
    $url_list = array_merge($url_list, $result);
}
// when finished, I use this to create the sitemap
$linkCount = 1;
$fileNomb = 1;
$i = 0;
foreach ($url_list as $ul) {
    $i += 1;
    if ($linkCount == 1) {
        $doc = new DOMDocument('1.0', 'utf-8');
        $doc->formatOutput = true;
        $root = $doc->createElementNS('http://www.sitemaps.org/schemas/sitemap/0.9', 'urlset');
        $doc->appendChild($root);
    }
    $url = $doc->createElement("url");
    $loc = $doc->createElement("loc", $ul['url']);
    $url->appendChild($loc);
    $priority = $doc->createElement("priority", 1);
    $url->appendChild($priority);
    $root->appendChild($url);
    $linkCount += 1;
    if ($linkCount == 49999) {
        $f = fopen($this->siteMapMulti . $fileNomb . '.xml', "w");
        fwrite($f, $doc->saveXML());
        fclose($f);
        $linkCount = 1;
        $fileNomb += 1;
    }
}
Is there any better way to do this, or to speed up the performance?
Added
Why is the following faster than the SQL query above, yet it still consumes 100% of the server's resources?
$this->db->query('SELECT c.id, c.city_name, r.region_name, cr.country_name FROM city AS c, region AS r, country AS cr WHERE r.id = c.id_region AND cr.id = c.id_country AND cr.id IN (SELECT id FROM country WHERE use = 1)');
$arrayCity = $this->db->recordsArray(MYSQL_ASSOC);
$this->db->query('SELECT id, title FROM param WHERE active = 1');
$arrayParam = $this->db->recordsArray(MYSQL_ASSOC);
foreach ($arrayCity as $city) {
    foreach ($arrayParam as $param) {
        $paramTitle = str_replace(' ', '+', $param['title']);
        $url = 'url/city/'. $city['id'] .'/paramId/'. $param['id'] .'/'. $paramTitle .'/'. $city['region_name'] .'/'. $city['city_name'];
        $this->addChild($url);
    }
}
I suggest you not use UNION and just issue two separate queries; that will speed up each query itself.
Also, as you mentioned above, it's a good idea to fetch the data in batches.
And finally, don't collect all the data in memory. Write it to the file immediately inside your loop.
Just open the file at the beginning, write each URL entry in the loop, and close the file at the end.
- open the file for writing
- run a COUNT query on the users table
- do several SELECTs with LIMIT in a loop (as you have already done)
- right there in the loop, while ($row = mysql_fetch_array()), write each row to the file
and then repeat the same algorithm for the other table.
It would be useful to implement a function for writing data to file, so you can call that function and adhere to the DRY principle.
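A minimal sketch of that approach for the users table (mine, not the answerer's): batched SELECTs whose rows are streamed straight into the sitemap files as they are fetched, rolling over to a new file at the question's 50,000-URL limit. It writes the XML directly instead of going through DOMDocument, and assumes a mysqli connection rather than the question's DB wrapper; the city/param query would be handled the same way, ideally through a shared write function.
// Sketch: stream one table's URLs into sitemap files in batches,
// instead of holding 2 million rows in memory.
$limit     = 50000;
$offset    = 0;
$fileNomb  = 1;
$linkCount = 0;

$header = "<?xml version=\"1.0\" encoding=\"UTF-8\"?>\n<urlset xmlns=\"http://www.sitemaps.org/schemas/sitemap/0.9\">\n";
$f = fopen($this->siteMapMulti . $fileNomb . '.xml', 'w');
fwrite($f, $header);

do {
    $result = $mysqli->query(
        "SELECT CONCAT('/url/profile/id/', id, '/', nickname) AS url
         FROM users ORDER BY id LIMIT $offset, $limit"
    );
    $rows = 0;
    while ($row = $result->fetch_assoc()) {
        fwrite($f, '  <url><loc>' . htmlspecialchars($row['url']) . "</loc><priority>1</priority></url>\n");
        $rows++;
        if (++$linkCount >= 49999) {
            // close the current sitemap file and start the next one
            fwrite($f, "</urlset>\n");
            fclose($f);
            $fileNomb++;
            $linkCount = 0;
            $f = fopen($this->siteMapMulti . $fileNomb . '.xml', 'w');
            fwrite($f, $header);
        }
    }
    $result->free();
    $offset += $limit;
} while ($rows === $limit);

fwrite($f, "</urlset>\n");
fclose($f);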
