I hope you are well... I have a problem: when I try to export more than 500 rows from MySQL, the server hits its timeout limit. I would like to know if there is a way to export in the background. I want to download, for example, 2k rows.
<?php
$con = mysqli_connect(DATA TO CONNECT) or die("could not connect to mysql");
mysqli_set_charset($con, 'utf8mb4');
session_start();
if (!$_SESSION['user']) {
    header('Location: pages/login.php');
}
if (isset($_GET['country'])) {
    $query = mysqli_query($con, "SELECT * FROM profiles WHERE username='{$_SESSION['user']}' AND status='0' AND country='{$_GET['country']}'");
    $token = '' . substr(md5("random" . mt_rand()), 0, 10);
    $file = $_GET['country'] . "_" . $token . '.txt';

    $counter = 0;
    while ($data = mysqli_fetch_array($query)) {
        $counter++;
        $current = file_get_contents($file);
        $current .= $data['name'] . ":" . $data['country'] . "\n";
        file_put_contents($file, $current);
        mysqli_query($con, "UPDATE profiles SET status=1 WHERE id='{$data['id']}'");
        if ($counter >= 200) {
            break;
        }
    }
    $content = file_get_contents($file);
    header('Content-Type: application/octet-stream');
    header("Content-Disposition: attachment; filename=\"" . basename($file) . "\"");
    unlink($file);
    echo $content;
}
?>
You're re-reading and re-writing the output file on every row while you build the output, and this is slowing everything down. You don't need to create the file at all, since you can send the data to the browser directly from memory.
You're also performing an individual update query for every row. For 2000 rows that's 2000 update queries when only one is needed.
Lastly, there's a counter that breaks the update loop after 200 rows. I imagine that's there for testing, but a better way to limit data is to add ORDER BY and LIMIT clauses to your queries.
Here's a new version of your code that addresses these issues, and also refactors the queries to use prepared statements. I've taken the precaution of selecting the records FOR UPDATE to prevent someone else updating them while this runs, and wrapping the entire thing in a transaction so that it can be rolled back if things go wrong.
<?php
/**
 * exportText.php
 */
error_reporting(E_ALL);
ini_set('display_errors', 1);

session_start();
$_SESSION['user'] = 'Fred'; // for testing only

if (!$_SESSION['user']) {
    header('Location: pages/login.php');
    exit;
}

if (isset($_GET['country'])) {
    try {
        mysqli_report(MYSQLI_REPORT_ERROR | MYSQLI_REPORT_STRICT);
        $con = new mysqli('host', 'user', 'password', 'schema');
        $con->set_charset('utf8mb4');

        $con->begin_transaction();

        $stmt = $con->prepare("SELECT name, country FROM profiles WHERE username=? AND status='0' AND country=? ORDER BY id LIMIT 200 FOR UPDATE");
        $stmt->bind_param('ss', $_SESSION['user'], $_GET['country']);
        $stmt->execute();
        $stmt->bind_result($name, $country);

        // Store the data until we've read it all and completed the database
        // update. This avoids error messages being written to the data file.
        $output = [];
        while ($stmt->fetch()) {
            $output[] = "$name:$country\n";
        }

        // Update the statuses using the same criteria we had before.
        $stmt = $con->prepare("UPDATE profiles SET status=1 WHERE username=? AND status='0' AND country=? ORDER BY id LIMIT 200");
        $stmt->bind_param('ss', $_SESSION['user'], $_GET['country']);
        $stmt->execute();

        // Send the data
        $token = '' . substr(md5("random" . mt_rand()), 0, 10);
        $file = $_GET['country'] . "_" . $token . '.txt';

        header('Content-Type: application/octet-stream');
        header("Content-Disposition: attachment; filename=\"" . basename($file) . "\"");
        echo implode('', $output);

        $con->commit();
    } catch (Exception $e) {
        $con->rollback();
        echo "Exception: " . $e->getMessage();
    }
}
I have the following code, which I am using to download a CSV of the data to a file called Out of Area.csv:
<?php
// START EXPORT OF LTHT DATA //
exportMysqlToCsv('Out of Area.csv');

// export csv
function exportMysqlToCsv($filename = 'Out of Area.csv')
{
    $conn = dbConnection();
    // Check connection
    if ($conn->connect_error) {
        die("Connection failed: " . $conn->connect_error);
    }

    $sql_query = "SELECT * FROM MDSData WHERE Downloaded IS NULL ORDER BY RegisteredCCG";
    // Gets the data from the database
    $result = $conn->query($sql_query);

    $f = fopen('php://temp', 'wt');
    $first = true;
    while ($row = $result->fetch_assoc()) {
        if ($first) {
            fputcsv($f, array_keys($row));
            $first = false;
        }
        fputcsv($f, $row);
    } // end while

    // Mark the exported rows as downloaded
    $sql_query = "UPDATE MDSData SET Downloaded='YES' WHERE Downloaded IS NULL";
    if ($conn->query($sql_query) === TRUE) {
    } else {
    }
    $conn->close();

    $size = ftell($f);
    rewind($f);

    header("Cache-Control: must-revalidate, post-check=0, pre-check=0");
    header("Content-Length: $size");
    // Output to browser with appropriate mime type, you choose ;)
    header("Content-type: text/x-csv");
    header("Content-type: text/csv");
    header("Content-type: application/csv");
    header("Content-Disposition: attachment; filename=$filename");
    fpassthru($f);
    exit;
}

// db connection function
function dbConnection()
{
    $servername = "servername";
    $username = "username";
    $password = "password";
    $dbname = "databasename";
    // Create connection
    $conn = new mysqli($servername, $username, $password, $dbname);
    return $conn;
}

header('location:csv-upload-mds.php?week=' . $SelectedWeek);
?>
However, I have a column in my table called 'TownName' and I would like to loop through it, creating a CSV named 'Out of Area - The Town Name Here.csv', for example 'Out of Area - Berkshire.csv'.
Currently I've got to the stage where I can make it download one file with all the data in it, but my issue comes when trying to download multiple files. Even if I duplicate the exportMysqlToCsv() call I get an error, as this function can only be called once. So I need to put the loop inside an if statement, I guess...
My thoughts are something like the following:
$sql_query = "SELECT * FROM MDSData WHERE Downloaded IS NULL GROUP BY RegisteredCCG";
$result = $conn->query($sql_query);
while ($row = $result->fetch_assoc()) {
    // Now I've looped through each CCG; for each one I can set the file name and add the data to the file.
    $filename = "Out of Area - " . $row["TownName"] . ".csv";
    $sql_query = "SELECT * FROM MDSData WHERE Downloaded IS NULL ORDER BY RegisteredCCG";
    // Gets the data from the database
    $result = $conn->query($sql_query);
    $f = fopen('php://temp', 'wt');
    $first = true;
    while ($row = $result->fetch_assoc()) {
        if ($first) {
            fputcsv($f, array_keys($row));
            $first = false;
        }
        fputcsv($f, $row);
    } // end while of inside loop
} // end while of 1st SQL query
However, whilst I believe my idea is solid, I'm struggling to put it all together and am looking for a little help.
Thanks
I want to export only those records which a particular user has entered. For example: I have logged in with the user id user_test and have entered a few records; when a record is saved to the database, my login name is saved along with it. When I fetch and display the data for the user in the session it works fine, but when I use the same query for downloading the report I don't get any error; the page downloads with blank fields. Here is the code I am using for downloading the records to an Excel sheet.
<?php
include_once 'classes/admin-class.php';
$admin = new itg_admin();
$admin->_authenticate();
$ten = $_SESSION['admin_login'];
?>
<?php
$host = "localhost";
$username = "root";
$password = "";
$dbname = "kuidfc_project_monitoring";
$con = new mysqli($host, $username, $password, $dbname);

$sql_data = "SELECT * from user_daily_records WHERE users = '" . $admin->get_nicename() . "';";
$result_data = $con->query($sql_data);
$results = array();

$filename = "Daily_Report.xls"; // File Name

// Download file
header("Content-Disposition: attachment; filename=\"$filename\"");
header("Content-Type: application/vnd.ms-excel");

$flag = false;
while ($row = mysqli_fetch_assoc($result_data)) {
    if (!$flag) {
        // display field/column names as first row
        echo implode("\t", array_keys($row)) . "\r\n";
        $flag = true;
    }
    echo implode("\t", array_values($row)) . "\r\n";
}
?>
"SELECT * FROM daily_progress_demo WHERE users = '" . $admin->get_nicename() . "' ORDER BY startdate DESC ;"
I just sort by order in query it worked. Everything is coming perfect.
I export from MySQL to Excel with this code.
I have no problem when executing this code outside of the CMS, but when I use it inside my CMS the template output ends up in the export, and I only want this query result.
$db_name = "test";
$link = mysql_connect("localhost", "root", "") or die("Could not connect to server!");
$table_name = 'users';
$select_db = mysql_select_db($db_name, $link);
mysql_query("SET NAMES 'utf8'");

$query = "SELECT * from users";
$result = mysql_query($query, $link) or die("Could not complete database query");
$num = mysql_num_rows($result);
$num2 = mysql_num_fields($result);

$header = "";
for ($i = 0; $i < $num2; $i++) {
    $header .= mysql_field_name($result, $i) . "\t";
}

if ($num != 0) {
    $_xml  = "<?xml version='1.0' encoding='UTF-8' standalone='yes'?>\r\n";
    $_xml .= "<dataroot xmlns:xsi='http://www.w3.org/2001/XMLSchema-instance'>\r\n";
    while ($row = mysql_fetch_array($result)) {
        $_xml .= "\t<qq>\r\n";
        if ($row[0] <> '') $_xml .= "\t\t<q>" . $row[0] . "</q>\r\n";
        if ($row[1] <> '') $_xml .= "\t\t<a>" . $row[1] . "</a>\r\n";
        $_xml .= "\t</qq>\r";
    }
    $_xml .= "</dataroot>";

    header("Content-Type: application/vnd.ms-excel; charset=utf-8");
    header("Content-Disposition: attachment; filename=filename.xls");
    header("Pragma: no-cache");
    header("Expires: 0");
    header("Lacation: excel.htm?id=yes");
    print($_xml);
} else {
    echo "No Records found";
}
I don't know if I understand your problem, but in the end you are talking about getting a query result while also defining an attachment in the header. You can't render output on your website AND download a file in the same request.
If any output is generated, the download will be corrupted, so you have to decide: output or download. If you really need both, another way would be to first generate your output, then redirect the user to a blank download page which sends the attachment. Done correctly, you visually stay on your main page while the download page delivers the attachment after the output on your main page is done.
Correct me if I got something wrong here.
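Here's a minimal sketch of that two-page idea. The file names page.php and export.php are placeholders I've made up, the credentials and the users query are copied from the example above, and I'm using mysqli rather than the old mysql_* functions and sending plain tab-separated rows just to keep the sketch short; the XML string from the question could be sent the same way.
<?php
// page.php (placeholder name) - render the normal CMS output first...
echo "<p>Your report is ready.</p>";
// ...then point the browser at a separate script that produces nothing but the attachment.
echo '<a href="export.php">Download the Excel file</a>';
?>

<?php
// export.php (placeholder name) - sends ONLY the attachment: no template, no echo before the headers.
$link   = mysqli_connect("localhost", "root", "", "test"); // same credentials as the example above
$result = mysqli_query($link, "SELECT * FROM users");

header("Content-Type: application/vnd.ms-excel; charset=utf-8");
header("Content-Disposition: attachment; filename=filename.xls");

while ($row = mysqli_fetch_row($result)) {
    echo implode("\t", $row) . "\r\n"; // one tab-separated line per row; Excel opens this
}
exit; // make sure nothing else (e.g. the CMS template) is appended after the data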
Ps: Instead of
$var = "";
better use
unset($var);.
I am new to this and I am having problems trying to get my website to count the number of times a file has been downloaded. The website has download buttons linking to the files, which are stored in my database, and I would like to count the number of downloads per file to display as a statistic. Ideally, once someone clicks the link to download, the column in the files table should be incremented, but I cannot wrap my head around it. Please help?
<?php
error_reporting(E_ALL ^ E_NOTICE);
session_start();
$userid = $_SESSION['userid'];
$username = $_SESSION['username'];
?>
<?php
// Make sure an ID was passed
if (isset($_GET['id'])) {
    // Get the ID
    $id = ($_GET['id']);

    // Make sure the ID is in fact a valid ID
    if ($id <= 0) {
        die('The ID is invalid!');
    } else {
        // Connect to the database
        $dbLink = new mysqli('dragon.kent.ac.uk', 'repo', '3tpyril', 'repo');
        if (mysqli_connect_errno()) {
            die("MySQL connection failed: " . mysqli_connect_error());
        }

        // Fetch the file information
        $query = "
            SELECT `mime`, `name`, `size`, `data`
            FROM `files`
            WHERE `id` = {$id}";
        $result = $dbLink->query($query);

        if ($result) {
            // Make sure the result is valid
            if ($result->num_rows == 1) {
                // Get the row
                $row = mysqli_fetch_assoc($result);

                // Print headers
                header("Content-Type: " . $row['mime']);
                header("Content-Length: " . $row['size']);
                header("Content-Disposition: attachment; filename=" . $row['name']);

                // Print data
                echo $row['data'];
            } else {
                echo 'Error! No file exists with that ID.';
            }
        } else {
            echo "Error! Query failed: <pre>{$dbLink->error}</pre>";
        }
        #mysqli_close($dbLink);
    }
} else {
    echo 'Error! No ID was passed.';
}
?>
The download link would point to a php file, such as download.php?id=123 -- this file would then take the ID and check the downloads database. If the ID exists, you run a query such as UPDATE files SET downloads = downloads + 1 WHERE id = 123.
Afterwards, you set the appropriate headers with header() (content type, content disposition) and then send the file, for example with readfile().
See How to force file download with PHP on how to set headers and force a download.
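For example, inside the if ($result->num_rows == 1) branch of the script above, the counter bump is just one extra query before the headers go out. This is only a sketch and assumes the files table has an integer downloads column, which isn't shown in the question (cast $id to an int first if you keep this approach):
// Get the row
$row = mysqli_fetch_assoc($result);

// Record the download: assumes `files` has an integer `downloads` column
$dbLink->query("UPDATE `files` SET `downloads` = `downloads` + 1 WHERE `id` = {$id}");

// Print headers and send the data exactly as before
header("Content-Type: " . $row['mime']);
header("Content-Length: " . $row['size']);
header("Content-Disposition: attachment; filename=" . $row['name']);
echo $row['data'];
The download buttons on the page then just point at download.php?id=123 as described above.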
Cheers!
I don't know the best way to title this question but am trying to accomplish the following goal:
When a client logs into their profile, they are presented with a link to download data from an existing database in CSV format. The process works; however, I would like this data to be 'fresh' each time they click the link. My plan was that once a user has clicked the link and downloaded the CSV file, the database table would 'erase' all of its data and start fresh (be empty) until the next set of data populated it.
My EXISTING CSV creation code:
<?php
$host  = 'localhost';
$user  = 'username';
$pass  = 'password';
$db    = 'database';
$table = 'tablename';
$file  = 'export';

$link = mysql_connect($host, $user, $pass) or die("Can not connect." . mysql_error());
mysql_select_db($db) or die("Can not connect.");

// Build the header row from the column names
$csv_output = "";
$result = mysql_query("SHOW COLUMNS FROM " . $table . "");
$i = 0;
if (mysql_num_rows($result) > 0) {
    while ($row = mysql_fetch_assoc($result)) {
        $csv_output .= $row['Field'] . ", ";
        $i++;
    }
}
$csv_output .= "\n";

// Append one line per row of data
$values = mysql_query("SELECT * FROM " . $table . "");
while ($rowr = mysql_fetch_row($values)) {
    for ($j = 0; $j < $i; $j++) {
        $csv_output .= '"' . $rowr[$j] . '",';
    }
    $csv_output .= "\n";
}

$filename = $file . "_" . date("Y-m-d", time());
header("Content-type: application/vnd.ms-excel");
header("Content-disposition: csv" . date("Y-m-d") . ".csv");
header("Content-disposition: filename=" . $filename . ".csv");
print $csv_output;
exit;
?>
Any ideas?
$query = mysql_query('TRUNCATE TABLE '.$table);
or if you want to archive them
$query = mysql_query('RENAME TABLE db_name.'.$table.' TO db_name.'.$table.time());
$query = mysql_query('CREATE TABLE db_name.'.$table.' ( /* table structure */ )');
I'm extremely confused as to why the data needs to be 'fresh' and why you have a database table that can be cleared out at will.
Can't you just export whatever data you want directly to a CSV and skip the whole writing-to-and-subsequently-deleting-the-db step?
I would think that whatever data you're collecting and storing in this 'temporary table' could be accumulated elsewhere, and then when they want the CSV you can just fetch all the data you need at that time, instead of keeping around a pointless table.