I have a large table with 500,000 records. When a button is clicked the data gets downloaded as a CSV, but it is very slow, especially on a bad internet connection.
I was thinking of zipping the file and then saving it, but again I am sure that will take up extra memory for the whole process.
Is there a better way to optimize this CSV download?
<?php
// mysql database connection details
$host = "localhost";
$username = "admin";
$password = "root";
$dbname = "db_books";
// open connection to mysql database
$connection = mysqli_connect($host, $username, $password, $dbname) or die("Connection Error " . mysqli_error($connection));
// fetch mysql table rows
$sql = "select * from tbl_books";
$result = mysqli_query($connection, $sql) or die("Selection Error " . mysqli_error($connection));
$fp = fopen('books.csv', 'w');
while ($row = mysqli_fetch_assoc($result)) {
    fputcsv($fp, $row);
}
fclose($fp);
//close the db connection
mysqli_close($connection);
?>
I would use MySQL's SELECT ... INTO OUTFILE. It is much faster than looping through the results of your query: you add it to your SELECT statement and MySQL takes care of creating the file for you.
See the MySQL documentation for more on its capabilities.
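A minimal sketch of that approach, reusing the question's connection details; it assumes the MySQL user has the FILE privilege, and note that INTO OUTFILE writes the file on the database server's filesystem and the target path must not already exist:
<?php
// Sketch: let MySQL write the CSV itself (requires the FILE privilege).
$connection = mysqli_connect("localhost", "admin", "root", "db_books");
$sql = "SELECT * FROM tbl_books
        INTO OUTFILE '/tmp/books.csv'
        FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '\"'
        LINES TERMINATED BY '\\n'";
mysqli_query($connection, $sql) or die("Export error: " . mysqli_error($connection));
mysqli_close($connection);
?>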
It sounds like you're using the page-load thread to compile the CSV before sending it to the user. This is why it seems so slow.
If possible, you might want to simply pre-compile the CSV downloads, before the user gets to that point. That way their browser will simply receive the file, not hang while you generate it. If you're concerned about wasting too much time generating files that users never download, perhaps have a background job that generates files when needed, but only if the user has logged on (or into a certain area of your site) within the last X hours.
Alternatively, maybe you could use jQuery/Ajax to display a pop-up dialog that tells the user to wait while their file is being generated, and then disappears once the download is ready.
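With the pre-compiled approach, the download endpoint then only has to stream a file that already exists. A minimal sketch, where exports/books.csv is a hypothetical path the background job would write to:
<?php
// Sketch: serve a CSV that a background job has already generated.
$file = __DIR__ . '/exports/books.csv'; // hypothetical location
if (!is_file($file)) {
    http_response_code(404);
    exit('Export not ready yet, please try again shortly.');
}
header('Content-Type: text/csv');
header('Content-Disposition: attachment; filename="books.csv"');
header('Content-Length: ' . filesize($file));
readfile($file); // streams straight from disk, no CSV generation on this request
?>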
We have multiple masters that are synced into a slave. We have decided to create a database for each master (say MDB0001, MDB0002, MDB0003, etc.). That way, if one replication fails or has corrupted data, it will not corrupt the other databases. The slave is used to show information to the people on the web (the masters are only available on the local network).
The purpose is: we want to have a website (in PHP) on the server (the slave) that shows the content of a different database depending on who is logged in. So if the user MDB0001 is connected, we have to read the data from the database MDB0001.
How can this be done? Is this a good way to do it? Or do I have to duplicate the website for each database?
I hope I'm clear in my explanation. Thanks
Assuming you get a variable from the login, you could put a key->value array together on your db.php page:
$userDBs = array('login1' => 'db1', 'login2' => 'db2');
$dbName = $userDBs[$loggedinID]; // if login1 logs in, $dbName will be 'db1'
$db = new PDO('mysql:host=localhost;dbname=' . $dbName, 'someUser', 'somePass');
Or keep the associations in a separate table:
$sel = "Select dbName from databases where userId='".$loggedInID."'";
$stmt = $db->query($sel);
while($r = $stmt->fetch()){
$dbName = $r['dbName'];
}
$db = new PDO('mysql:host=localhost;dbname='.$dbName, 'someUser', 'somePass');
My first post, because I haven't found an answer to this problem anywhere! And I looked way beyond Google... :)
DESCRIPTION:
So I have a setup where an Arduino device is connected to a laptop via a USB serial cable, and the laptop is connected to the internet.
Like this: http://postimg.org/image/cz1g0q2ib/
arduino ---USB---> laptop (transit.py) ---WWW---> server (insert.php)-> mysql DB
There is a Python script (transit.py) on the PC running continuously, listening to the COM port, analyzing the received data and forwarding it to a file (insert.php) on a remote server (a free hosting site).
See the code to learn how that works...
Then there is the insert.php script that receives this data (still almost every second), analyzes it and stores it in the MySQL database.
This, however, is not the only file that requires a MySQL connection, therefore I include connect.php at the beginning of every such file.
PROBLEM:
Warning: mysqli::mysqli() [mysqli.mysqli]: (42000/1226): User 'user' has exceeded the 'max_connections_per_hour' resource (current value: 1500) in /server/connect.php on line 8
As a result of all this data travel and its frequency (and the cheapness of the hosting), I run into a "maximum connections per hour exceeded" error. The limit is 1500 per hour and I can't change it (it's a remote server). And no, I don't want to pay for hosting to get a bigger allowance; that's not the point. The issue is the inefficiency of my code. Can I have one persistent connection? Like a service?
Sending data from the Python script straight to the remote MySQL server is not an option, because I don't have access to that feature.
CODE:
transit.py:
import serial
import requests

try:
    ser = serial.Serial('COM4', 9600, timeout=4)
except serial.SerialException:
    raise SystemExit('=== COULD NOT CONNECT TO BOARD ===')

# main loop - the script runs continuously, forwarding each reading
while True:
    value = ser.readline()
    strValue = value.decode("utf-8")
    if strValue:
        mylist = strValue.split(',')
        print(mylist[0] + '\t\t' + mylist[1] + '\t\t' + mylist[2])
        path = 'http://a-free-server.com/insert.php'
        dataLine = {"table": mylist[0], "data": mylist[1], "value": mylist[2]}
        toServer = requests.post(path, params=dataLine, timeout=2)
insert.php:
<?php
include 'connect.php';
//some irrelevant code here...
if (empty($_GET['type']) && isset($_GET['data'])) {
$table = $_GET['table'];
$data = $_GET['data'];
$value = $_GET['value'];
if($mysqli->connect_errno > 0){
die('Unable to connect to database [' . $mysqli->connect_error . ']');
}
else
{
date_default_timezone_set("Asia/Hong_Kong");
$clock = date(DATE_W3C);
if (isset($_GET['time'])) {
$time = $_GET['time'];
}
else{
$time = $clock;
}
echo "Received: ";
echo $table;
echo ",";
echo $data;
echo ",";
echo $value;
echo ",";
echo $time;
if ($stmt = $mysqli->prepare("INSERT INTO ".$table." (`id`, `data`, `value`, `time`) VALUES (NULL, ?, ?, ?) ON DUPLICATE KEY UPDATE time='".$time."'"))
{
$stmt->bind_param('sss', $data, $value, $time);
$stmt->execute();
$stmt->free_result();
$stmt->close();
}
else{
echo "Prepare failed: (" . $mysqli->errno . ") " . $mysqli->error;
}
}
}else{
echo " | DATA NOT received!";
}
?>
connect.php:
<?php
define("HOST", "p:a-free-host.com"); // notice the p: for persistence
define("USER", "user");
define("PASSWORD", "strongpassword1"); // my password. don't look!
define("DATABASE", "databass");
$GLOBALS["mysqli"] = new mysqli(HOST, USER, PASSWORD, DATABASE, 3306);
$count = intval(file_get_contents('conns.txt'));
file_put_contents('conns.txt', ++$count); //just something i added to monitor connections
?>
P.S. Everything works fine and all data is handled in a rather desirable manner, except that I exceed the connection limit (and perhaps hit some other hidden caveats).
Any suggestions on how to decrease the connection count but still receive data every second?
If I have understood your issue correctly, your web host sucks. If you are limited to 1500 connections per hour, and each page requires a connection, you can never exceed 1500 page views per hour; that's not very much.
Many programming languages support connection pooling; in this model, the server opens one or more connection at start-up, and individual page requests get one of those connections when they need them. This reduces the overhead of opening and closing connections. See here for a discussion of connection pooling and PHP. You may be able to use one of the answers without too much trouble.
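For what it's worth, the closest built-in analogue in PHP is a persistent connection; the asker's p: host prefix already does this for mysqli, and a PDO sketch (reusing the question's credentials) would look like this:
<?php
// Sketch: a persistent PDO connection. PHP keeps the underlying MySQL
// connection alive between requests and hands it back out instead of
// opening a brand-new one every time.
$pdo = new PDO(
    'mysql:host=a-free-host.com;dbname=databass',
    'user',
    'strongpassword1',
    array(PDO::ATTR_PERSISTENT => true)
);
?>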
The alternative - and probably better - solution is to batch up data in your Python script so you don't have to connect to the web server so often. The classic way to do this for applications that aren't time critical is to use a message bus. I'm not a Pythonist, but this should do the job...
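Even without a message bus, simple batching cuts the connection count dramatically: if the Python script buffers, say, 30 readings and POSTs them as one JSON array, a single connection covers 30 inserts. A sketch of the receiving side, where the readings field and the sensor_data table name are illustrative assumptions:
<?php
// Sketch: accept a whole batch of readings in one request, so a single
// database connection covers many inserts. Assumes the Python script
// buffers readings and POSTs them as a JSON array named 'readings'.
include 'connect.php';
$batch = json_decode($_POST['readings'], true);
if (!is_array($batch)) {
    die(' | DATA NOT received!');
}
$stmt = $mysqli->prepare("INSERT INTO sensor_data (`id`, `data`, `value`, `time`) VALUES (NULL, ?, ?, ?) ON DUPLICATE KEY UPDATE time = ?");
$stmt->bind_param('ssss', $data, $value, $time, $time);
foreach ($batch as $reading) {
    $data = $reading['data'];
    $value = $reading['value'];
    $time = $reading['time'];
    $stmt->execute(); // one prepared statement, executed once per reading
}
$stmt->close();
?>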
Have you tried creating a script that stays alive the whole time and makes the connection there (S1), and then doing the rest separately (S2)?
In the script where you do the operations, first check whether the connection is still alive, and if it is not, reconnect.
Close the connection in S1 at the end of the script.
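A sketch of that alive-check for S2, using mysqli's ping() and the $mysqli handle from the asker's connect.php:
<?php
// Sketch: before running queries, check that the shared connection is
// still alive and reconnect if it has gone away.
include 'connect.php';
if (!$mysqli->ping()) {
    $mysqli = new mysqli(HOST, USER, PASSWORD, DATABASE, 3306);
}
?>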
I'm a starter.
I want to know what will happen if we don't close the MySQL connection.
1- Is it possible to open more than one database if we don't close them? I mean, can we open more than one database at the same time?
2- Does closing the database increase the speed?
3- Is it necessary to close the database, or is it optional?
Look at this code. I don't use "mysql_close()", so I don't close the database after each request. There are a lot of requests for this PHP page, maybe 50,000 per minute. I want to know whether closing the database is necessary for this code or not.
<?php
//Include the file that lets us to connect to the database.
include("database/connection.php");
//Call "connect" function to connect to the database.
connect("database", "localhost", "root", "", "user");
//The GPRS module send a string to this site by GET method. The GPRS user a variable named variable to send the string with.
$received_string = $_GET["variable"];
//Seprates data in an array.
$array_GPRS_data = explode(",", $received_string);
//we need to remove the first letter.
$array_GPRS_data[9] = substr($array_GPRS_data[9], 1);
$array_GPRS_data[13] = substr($array_GPRS_data[13], 4, 2).substr($array_GPRS_data[13], 2, 2).substr($array_GPRS_data[13], 0, 2);
//Query statement.
$query = "INSERT INTO $array_GPRS_data[17](signal_quality, balance, satellite_derived_time, satellite_fix_status, latitude_decimal_degrees,
latitude_hemisphere, longitude_decimal_degrees, longitude_hemisphere, speed, bearing, UTCdate, theChecksum)
VALUES('$array_GPRS_data[0]', '$array_GPRS_data[1]', '$array_GPRS_data[5]', '$array_GPRS_data[6]', '$array_GPRS_data[7]',
'$array_GPRS_data[8]', '$array_GPRS_data[9]', '$array_GPRS_data[10]', '$array_GPRS_data[11]', '$array_GPRS_data[12]', '$array_GPRS_data[13]',
'$array_GPRS_data[16]')";
//Run query.
$result = mysqli_query($query);
//Check if data are inserted in the database correctly.
if($result)
{
echo("*#01");
}
else
{
echo("Error: 001");
echo (mysqli_error());
}
?>
1. Yes, you can have multiple database connections. You are not opening a database, you are opening a database connection. The database is 'open' (i.e. running) all of the time, generally speaking, whether you are connected to it or not.
2. Depends... if you only have one open connection on a page, then you don't need to close it, because it closes automatically when PHP is done. If you have many, then you could potentially make the database server slower, or make it run out of available connections (it can only have a certain number of connections open at the same time). That said, most modern database servers can handle hundreds of concurrent connections.
3. Optional, but recommended. It's not a big deal for small to medium projects (if you have fewer than 100 concurrent visitors at any given time, you probably won't have any issues regardless). Since you have many thousands of visitors per minute, you should actively close the database connection as soon as you are done with it, to free it up as soon as possible (see the sketch below).
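A sketch of that close-as-soon-as-done idea, with hypothetical connection details; the point is that the close happens before the slow output work rather than at the very end of the script:
<?php
// Sketch: free the connection as soon as the last query is done,
// instead of holding it while the page output is generated.
$connection = mysqli_connect("localhost", "root", "", "mydb"); // hypothetical details
$result = mysqli_query($connection, $query); // last query of the request
mysqli_close($connection); // connection is freed here...
// ...and the slower formatting/echo work happens afterwards.
echo $result ? "*#01" : "Error: 001";
?>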
Once you connect to the database, it is not necessary to close the connection explicitly, as non-persistent connections are closed automatically at the end of script execution.
See the PHP manual for more information.
I have an issue that has only cropped up now. I am on a shared web hosting plan with a maximum of 10 concurrent database connections. The web app has dozens of queries, some PDO, some mysql_*.
Loading one page in particular peaks at 5-6 concurrent connections, meaning it takes a minimum of 2 users loading it at the same time to spit an error at one or both of them.
I know this is inefficient and I'm sure I can cut that down quite a bit, but my idea at the moment is to move the PDO code into a function and just pass in a query string and an array of variables, then have it return an array (partly to tidy my code).
THE ACTUAL QUESTION:
How can I get this function to keep retrying until it manages to execute, and hold up the script that called it (and any script that might have called that one) until it manages to execute and return its data? I don't want things executing out of order; I am happy with code being delayed for a second or so during peak times.
Since someone will ask for code, here's what I do at the moment. I have this in a file of its own so I have a central place to change connection parameters. The if statement merely removes the need to continuously change the parameters when I switch between my test server and the live server:
$dbtype = "mysql";
$server_addr = $_SERVER['SERVER_ADDR'];
if ($server_addr == '192.168.1.10') {
$dbhost = "localhost";
} else {
$dbhost = "xxxxx.xxxxx.xxxxx.co.nz";
}
$dbname = "mydatabase";
$dbuser = "user";
$dbpass = "supersecretpassword";
I 'include' that file at the top of a function:
include 'db_connection_params.php';
$pdo_conn = new PDO("mysql:host=$dbhost;dbname=$dbname", $dbuser, $dbpass);
then run commands like this, all on the one connection:
$sql = "select * from tbl_sub_cargo_cap where sub_model_sk = ?";
$capq = $pdo_conn->prepare($sql);
$capq->execute(array($sk_to_load));
while ($caprow = $capq->fetch(PDO::FETCH_ASSOC)) {
    //stuff
}
You shouldn't need 5-6 concurrent connections for a single page; each page should really only ever use one connection. I'd try to re-architect whatever part of your application is opening multiple connections on a single page.
However, you should be able to catch a PDOException when the connection fails (see the documentation on connection management), and then retry some number of times.
A quick example:
<?php
$retries = 3;
while ($retries > 0) {
    try {
        $dbh = new PDO("mysql:host=localhost;dbname=blahblah", $user, $pass);
        // Do query, etc.
        $retries = 0;
    } catch (PDOException $e) {
        // Should probably check $e is a connection error, could be a query error!
        echo "Something went wrong, retrying...";
        $retries--;
        usleep(500000); // Wait 0.5s between retries (usleep takes microseconds).
    }
}
10 concurrent connections is A LOT. It can serve 10-15 online users easily.
Heavy effort would be needed to exhaust them.
So there is something wrong with your code.
There are 2 main reasons for it:
slow queries take too much time and thus serving one hit uses one mysql connection for too long.
multiple connections opened from every script.
The former has to be investigated, but for the latter it's simple:
Do not mix mysql_ and PDO in one script: you are opening 2 connections at a time.
When using PDO, open the connection only once and then use it throughout your code.
Reducing the number of connections in one script is the only way to go.
If you have multiple instances of the PDO class in your code, you will need to add the timeout-handling code you want to every call. So, heavy code rewriting is required anyway.
Replace these new instances with global $pdo; instead. It will take the same amount of work, but it will be a permanent solution, not the temporary patch you are asking for.
Please be sensible.
PHP automatically closes all the connections at the end of the script; you don't have to care about closing them manually.
Having only one connection throughout one script is a common practice, used by developers all around the world. You can use it without any doubts. Just use it.
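A minimal sketch of that single-connection pattern, assuming a db_connection_params.php like the asker's:
<?php
// db.php - create the one PDO instance for the whole request.
include 'db_connection_params.php';
$pdo = new PDO("mysql:host=$dbhost;dbname=$dbname", $dbuser, $dbpass);
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

// anywhere else, reuse that instance instead of opening a new one:
function load_cargo_cap($sk_to_load)
{
    global $pdo; // the single shared connection
    $capq = $pdo->prepare("select * from tbl_sub_cargo_cap where sub_model_sk = ?");
    $capq->execute(array($sk_to_load));
    return $capq->fetchAll(PDO::FETCH_ASSOC);
}
?>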
If you have a transaction open and want to log something in the database, you sometimes need 2 connections in one script.
I am using WAMP server on Windows. Fetching even a little bit of data from my database hangs my page badly. It's just a simple post with 1 image, 1 title and a short description, yet when I trigger the command the page hangs. Here is what my code looks like:
<?php
//1. Create a connection
$connection = mysql_connect("localhost", "root", "");
if (!$connection) {
    die("Database Connection Failed: " . mysql_error());
}
//2 Select a database to use
$db_select = mysql_select_db("gat", $connection);
if (!$db_select) {
    die("Database selection failed: " . mysql_error());
}
?>
<html>
<head>
<title>Database Check</title>
</head>
<body>
<?php
//3 perform database query
$result = mysql_query("SELECT * FROM recent_works", $connection);
if (!$result) {
    die("Database query failed: " . mysql_error());
}
//4 use returned data
while ($row = mysql_fetch_assoc($result)) {
    echo "<div class='work_item'>";
    echo "<img src='{$row['image']}' alt=''>";
    echo "<h2>{$row['title']}</h2>";
    echo "<p>{$row['short_discription']}</p>";
    echo "</div>";
}
?>
</body>
</html>
<?php
//5 close connection
mysql_close($connection);
?>
Fetching data from a database will always involve some level of blocking. The question is how much data you are fetching. Your example indicates you are selecting everything from the table and fetching all of the data to print out onto the page. So how many rows are in the table, how much data is stored in each column, and how much of that data gets transferred to the client are all factors in the speed here. Additionally, you have to consider that connecting to the database also has a cost.
Here are a few suggestions I can make to the above code:
Don't use the old mysql extension (the mysql_* functions); consider using the newer MySQLi extension instead, which can do things the old extension can't, like asynchronous queries. It's also highly discouraged to use the old mysql extension in new development, since it is currently slated for deprecation. See MySQL: choosing an API in the PHP manual for more information.
Check phpinfo() to make sure you aren't using output buffering (which holds data back until a certain amount is ready to be sent). That can leave the client waiting around until there's data ready to be sent; pushing some HTML content out to the client as soon as possible can help improve the user experience.
Don't use SELECT * FROM table in your queries; instead, explicitly select only the fields you need for each query: SELECT image,title,short_discription FROM recent_works
If there's a lot of data (more than, say, a hundred rows) consider using pagination and LIMIT the query to a certain number of rows per page view (see the sketch after this list). This can greatly reduce the amount of traffic between your DBMS and PHP on a per-request basis.
If it's a high load site consider using a persistent database connection.
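A minimal sketch of the pagination idea using MySQLi, as recommended above; the 10-rows-per-page size and the ?page= parameter are illustrative assumptions:
<?php
// Sketch: paginate recent_works with LIMIT/OFFSET via MySQLi.
$mysqli = new mysqli("localhost", "root", "", "gat");
$perPage = 10;
$page = isset($_GET['page']) ? max(1, (int)$_GET['page']) : 1;
$offset = ($page - 1) * $perPage;
// Select only the columns the page actually needs.
$stmt = $mysqli->prepare("SELECT image, title, short_discription FROM recent_works LIMIT ? OFFSET ?");
$stmt->bind_param('ii', $perPage, $offset);
$stmt->execute();
$result = $stmt->get_result(); // requires the mysqlnd driver
while ($row = $result->fetch_assoc()) {
    echo "<div class='work_item'>";
    echo "<img src='{$row['image']}' alt=''>";
    echo "<h2>{$row['title']}</h2>";
    echo "<p>{$row['short_discription']}</p>";
    echo "</div>";
}
?>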