I have all my files stored in a MySQL database as BLOBs. I am trying to add a speed limit to the rate at which a user can download them through our PHP website. I have tried the sleep(1); method, but it does not seem to work, or I am not doing it right. So if anyone knows a way to limit the speed, I would love your help.
Here is my download code:
$query=mysql_query("SELECT * FROM file_servers WHERE id='$file_server_id'");
$fetch=mysql_fetch_assoc($query);
$file_server_ip=$fetch['ip'];
$file_server_port=$fetch['port'];
$file_server_username=$fetch['username'];
$file_server_password=$fetch['password'];
$file_server_db=$fetch['database_name'];
$connectto=$file_server_ip.":".$file_server_port;
if (!$linkid = @mysql_connect($connectto, $file_server_username, $file_server_password, true))
{
die("Unable to connect to storage server!");
}
if (!mysql_select_db($file_server_db, $linkid))
{
die("Unable to connect to storage database!");
}
$nodelist = array();
// Pull the list of file inodes
$SQL = "SELECT id FROM file_data WHERE file_id='$file_id' order by id";
if (!$RES = mysql_query($SQL, $linkid))
{
die("Failure to retrive list of file inodes");
}
while ($CUR = mysql_fetch_object($RES))
{
$nodelist[] = $CUR->id;
}
// Send down the header to the client
header("Content-Type: $data_type");
header("Content-Length: $size");
header("Content-Disposition: attachment; filename=$name");
// Loop thru and stream the nodes 1 by 1
for ($Z = 0 ; $Z < count($nodelist) ; $Z++)
{
$SQL = "select file_data from file_data where id = " . $nodelist[$Z];
if (!$RESX = mysql_query($SQL, $linkid))
{
die("Failure to retrive file node data");
}
$DataObj = mysql_fetch_object($RESX);
echo $DataObj->file_data;
}
One way of doing this may be the combination of flush and sleep (a rough sketch follows the steps below):
read part of what you get from database
output some bytes
flush the output to the user
sleep for 1 second
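A minimal sketch of those steps applied to the node loop from the question; the 1-second pause per node is the only addition, and the "chunk size" per second is simply whatever one file_data node happens to hold:
for ($Z = 0 ; $Z < count($nodelist) ; $Z++)
{
    $SQL = "select file_data from file_data where id = " . $nodelist[$Z];
    if (!$RESX = mysql_query($SQL, $linkid))
    {
        die("Failure to retrieve file node data");
    }
    // read part of the file from the database
    $DataObj = mysql_fetch_object($RESX);
    // output some bytes
    echo $DataObj->file_data;
    // flush the output to the user
    flush();
    // sleep for 1 second before sending the next node
    sleep(1);
}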
But also take a look at the throttle function:
http://php.net/manual/en/function.http-throttle.php
It also has an example there. I think it is better suited.
It is in the very last echo line of your code where you would want to implement throttling. I'm not familiar with whether PHP supports throttling output.
If not, you can try to split up the content ($DataObj->file_data) you wish to echo, and echo it piece by piece with small pauses in between.
And be sure to disable output buffering; otherwise nothing you echo will be output until the entire script is done.
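For example, a rough sketch of splitting one node's data into fixed-size pieces; the 64 KB piece size and the 1-second pause are assumptions to tune:
// make sure no output buffering sits between echo and the client
while (ob_get_level() > 0)
{
    ob_end_clean();
}
// send the blob in small pieces with a pause between each
foreach (str_split($DataObj->file_data, 64 * 1024) as $piece)
{
    echo $piece;
    flush();
    usleep(1000000); // ~1 second
}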
I am trying to upload two images with PHP and add them to the database. Somehow it only uploads one image, and the records in the database always have the same values.
This is the code I use:
<?php
include "../connect.php";
$name1 = $_FILES['pic1']['name'];
$size1 = $_FILES['pic1']['size'];
$name2 = $_FILES['pic2']['name'];
$size3 = $_FILES['pic2']['size'];
if(isset($_POST['name']))
{
$extension1 = pathinfo($name1,PATHINFO_EXTENSION);
$array = array('png','gif','jpeg','jpg');
if (!in_array($extension1,$array)){
echo "<div class='faild'>".$array[0]."-".$array[1]."-".$array[2]."-".$array[3]." --> (".$name.")</div>";
}else if ($size>10000000){
echo "<div class='faild'>Size</div>";
}else {
$new_image1 = time().'.'.$extension1;
$file1 = "images/upload";
$pic1 = "$file1/".$new_image1;
move_uploaded_file($_FILES["pic1"]["tmp_name"],"../".$pic1."");
$insert = mysql_query("update temp set pic='$pic1' ") or die("error ins");
}
$extension2 = pathinfo($name2,PATHINFO_EXTENSION);
$array = array('png','gif','jpeg','jpg');
if (!in_array($extension2,$array)){
echo "<div class='faild'>".$array[0]."-".$array[1]."-".$array[2]."-".$array[3]." --> (".$name.")</div>";
}else if ($size>10000000){
echo "<div class='faild'>Size</div>";
}else {
$new_image2 = time().'.'.$extension2;
$file2 = "images/upload";
$pic2 = "$file2/".$new_image2;
move_uploaded_file($_FILES["pic2"]["tmp_name"],"../".$pic2."");
$insert = mysql_query("update temp set passport='$pic2'") or die("error ins");
}
}
?>
One of the problems you have is with your update statement. There is no WHERE clause saying which record in the database should be updated, so this query updates them all. That's why you only have the last image in all the database rows.
Besides that, your code is not very good from a security point of view. You should take a look at mysqli or PDO for your database connection and queries, because the old mysql_ extension is deprecated and has been removed from PHP. Also take a look at SQL injection and data validation. Besides some very basic extension and size validation, there is nothing there to keep things safe. Try escaping and validating all user inputs.
And another point would be to take a look at functions. You're running almost the exact same piece of code at least twice, and every code change has to be made twice. Perfect for a function, something like:
function storeImage($image){
// write the uploading and storing PHP here
}
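A hedged sketch of what such a function might look like, following the mysqli suggestion above; the $conn handle, the WHERE id clause, $rowId and the uniqid()-based file name are assumptions, not part of the original code:
function storeImage($conn, $field, $column, $id)
{
    // $field is the file input name ('pic1' or 'pic2'), $column the DB column ('pic' or 'passport')
    // $column must be hard-coded by your own code, never taken from user input
    $name = $_FILES[$field]['name'];
    $size = $_FILES[$field]['size'];
    $extension = strtolower(pathinfo($name, PATHINFO_EXTENSION));
    $allowed = array('png', 'gif', 'jpeg', 'jpg');
    if (!in_array($extension, $allowed)) {
        return "<div class='faild'>" . implode("-", $allowed) . " --> (" . $name . ")</div>";
    }
    if ($size > 10000000) {
        return "<div class='faild'>Size</div>";
    }
    // uniqid() avoids two uploads in the same second getting the same name (time() would collide)
    $newName = uniqid() . '.' . $extension;
    $path = "images/upload/" . $newName;
    move_uploaded_file($_FILES[$field]['tmp_name'], "../" . $path);
    // escape the path and update only the intended row
    $path = mysqli_real_escape_string($conn, $path);
    mysqli_query($conn, "UPDATE temp SET `$column` = '$path' WHERE id = " . (int)$id) or die("error ins");
    return "";
}
// usage, assuming $rowId identifies the record being edited
echo storeImage($conn, 'pic1', 'pic', $rowId);
echo storeImage($conn, 'pic2', 'passport', $rowId);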
In one of my applications, users can upload a CSV file (|-separated fields). After uploading, I store all the content of the file in a temporary table (I truncate this table for every new upload so that it contains only the current file's data). After that, I iterate over every row of that table and perform some database operations as per the business logic.
The following code will illustrate this:
if(isset($_POST['btn_uploadcsv']))
{
$filename = $_FILES["csvupload"]["name"];
$uploads_dir = 'csvs'; //csv files...
$tmp_name = $_FILES["csvupload"]["tmp_name"];
$name = time();
move_uploaded_file($tmp_name, "$uploads_dir/$name");
$csvpath = "$uploads_dir/$name";
$row = 0;
$emptysql = "TRUNCATE TABLE `temp`";
$connector->query($emptysql);
if (($handle = fopen($csvpath, "r")) !== FALSE) {
$str_ins = "";
while (($data = fgetcsv($handle, 1000, "|")) !== FALSE) {
/*
* Here I am getting the column values to be store in the
* the table, using INSERT command
*/
unset($data);
}
fclose($handle);
}
/*Here I am selecting above stored data using SELECT statement */
for($j=0;$j<count($allrecords);$j++)
{
echo "In the loop";
/*If I use echo statement for debugging it is working fine*/
//set_time_limit(300);
/* I have tried this also but it is not working*/
if(!empty($allrecords[$j]['catid']))
{
// Here is my business logic which mailny deals with
// conditional DB operation
}
echo "Iteration done.";
/*If I use echo statement for debugging it is working fine*/
}
}
The problem is that when I execute the above script on the server, it gives a server timeout error. But when I test the script on my localhost, it works fine.
Also, as mentioned in the code, if I use echo statements for debugging then it works fine, and when I remove them it starts giving a connection timeout problem.
I have tried set_time_limit(300) and set_time_limit(0), but neither of them seems to work.
Any idea how I can resolve the above problem?
-- Many thanks for your time.
Edit:
I have checked that the files are being uploaded to the server.
Change set_time_limit to:
ini_set("max_execution_time", 300);
set_time_limit is only valid when max_execution_time is not set in php.ini.
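For example, it could be placed at the top of the upload handler before the heavy processing starts; the 300 seconds here is just an assumed limit:
<?php
ini_set("max_execution_time", 300); // 5 minutes; adjust to your needs
if(isset($_POST['btn_uploadcsv']))
{
    // ... the CSV upload and processing code from the question
}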
I have resolved the issue by using flush to send intermediate output to the browser while the query executes in the background.
This is how I modified the code:
for($j=0;$j<count($allrecords);$j++)
{
/*At the end of each iteration, I have added the following code*/
echo " ";
flush();
}
Thanks to the contributors on this thread: PHP: Possible to trickle-output to browser while waiting for database query to execute?, which is where I got the inspiration.
I read a lot on this forum but couldn't find what I need.
I'm trying to use a PHP script to clean a desknow database. I managed to make it loop through all my records and send the command to an iframe, but my issue is that I need it to wait until the iframe has finished loading before it runs the next loop iteration, since it's sending a database command to the desknow server. I tried adding a sleep in the loop, but when I do that it just waits for the specified time multiplied by the number of loops and then outputs all the iframes at the same time. It does not have to open a new iframe for each record like it's doing here; if there is a way to make it just change the src of the iframe on each loop, that would be even better.
<?php
echo "start";
@$db = new mysqli('localhost', 'desknow', 'xxxxxxxx', 'desknow');
if (mysqli_connect_errno()) {
echo "fail";
exit;
}
echo "pass";
for ($i=1; $i <4; $i++) {
flush();
sleep(2);
$query = "select email from compte where no = $i";
$result = $db->query($query);
$row= $result->fetch_assoc();
$email = htmlspecialchars(stripslashes($row['email']));
echo "<br>";
echo "<iframe onload=\"load()\" src=\"http://0.0.0.0:81/desknow/admin?pwd=xxxxxx&action=mail_deleteemails&username=$email&domain=mydomain.com&path=inbox&before=20121231_1300/\"></iframe>";
}
$result->free();
$db->close();
?>
Try using ob_flush(); before flush();, and echo 'start';
Your URL contains the IP, username and password of your database server; this is very unsafe and really bad practice.
If you really want to update things like that, you need to do it client-side. It would be quite simple to have a JavaScript function that reloads the HTML content of an existing DOM element.
You could also use a page refresh or a form post to complete another round trip to the server, which would update the view.
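A hedged sketch of that client-side approach, reusing the question's query and admin URL: the PHP part builds the list of URLs, and a small script loads them into a single iframe, moving on to the next record only when the previous request has finished loading.
<?php
$db = new mysqli('localhost', 'desknow', 'xxxxxxxx', 'desknow');
if (mysqli_connect_errno()) {
    die("fail");
}
$urls = array();
for ($i = 1; $i < 4; $i++) {
    $result = $db->query("select email from compte where no = $i");
    $row = $result->fetch_assoc();
    $email = urlencode($row['email']);
    $urls[] = "http://0.0.0.0:81/desknow/admin?pwd=xxxxxx&action=mail_deleteemails&username=$email&domain=mydomain.com&path=inbox&before=20121231_1300/";
}
$db->close();
?>
<iframe id="worker" src="about:blank"></iframe>
<script>
// one iframe reused for every record: the next URL is loaded only after
// the previous one has finished loading
var urls = <?php echo json_encode($urls); ?>;
var i = 0;
var frame = document.getElementById('worker');
frame.onload = function () {
    if (i < urls.length) {
        frame.src = urls[i++];
    }
};
frame.onload(); // start the first request
</script>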
I have a small problem with the PHP Content-Disposition header. I kind of understand where the problem lies, but I have no idea how to solve it (I'm new to using databases). Calling this PHP page results in none of the echoes being shown and only the download box appearing, which I intended for the "cv" only (I'm not sure it's even working that way, because the downloaded file I receive cannot be opened).
Removing the header(Content-... line results in the echoes being shown, but then I won't be able to download the specified file. I want it to show as a link that downloads its contents when clicked.
$newEmployeeName = $_POST['name'];
$newEmployeeArea = $_POST['area'];
$newEmployeeCV = $_POST['cv'];
include('databaseConnection.php');
$result = mysql_query("SELECT * FROM participants");
while($row = mysql_fetch_array($result))
{
$download_me = $row['cv'];
header("Content-Disposition: attachment; filename=$download_me");
echo $row['name'] . " " . $row['area_of_exp'] . " " . $download_me;
echo "<br />";
}
The Content-Disposition header tells the browser to treat anything echoed after it as a download. You would normally use this when reading a file from the file system, so you can offer that file as a download. In your case, if you're storing CVs on your server then you may offer them as a download as follows:
<?php
$sql = "SELECT * FROM table WHERE id = :id LIMIT 1";
$stmt = $db->prepare($sql);
$stmt->bindParam(':id', $id, PDO::PARAM_INT);
$stmt->execute();
$row = $stmt->fetchObject();
if ($row) {
header('Content-Disposition: attachment; filename=' . $row->filename);
readfile($uploads_dir . $row->filename);
exit;
}
else {
die('Invalid CV requested.');
}
Obviously the above is a simplified version of the process and you will need to tweak it to fit your application, but that’s the gist of it.
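If the cv column actually holds the file contents rather than a path on disk (the question doesn't make this clear), a hedged variant of the same pattern could echo the BLOB instead of calling readfile; the cv_filename column here is an assumption:
<?php
$sql = "SELECT * FROM participants WHERE id = :id LIMIT 1";
$stmt = $db->prepare($sql);
$stmt->bindParam(':id', $id, PDO::PARAM_INT);
$stmt->execute();
$row = $stmt->fetchObject();
if ($row) {
    header('Content-Type: application/octet-stream');
    header('Content-Disposition: attachment; filename=' . $row->cv_filename);
    header('Content-Length: ' . strlen($row->cv));
    echo $row->cv; // the BLOB column holds the file contents
    exit;
}
die('Invalid CV requested.');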
Also, don’t use the mysql_ functions. They’re deprecated (as per the warning on this page). Use either PDO or the new MySQLi (MySQL improved) extension.
And by best I mean most efficient. Right now, placing this in my post.php file is the only thing I can think of:
$query = mysql_query(" UPDATE posts SET views = views + 1 WHERE id = '$id' ");
Is there a better way, a method that would consume fewer server resources? I ask because if this were a small app I would have no problem with the above, but I am trying to build something that will be used by a lot of people, and I want to be as query-conscious as possible.
If you're interested in conserving resources while still using SQL for reporting, and a precise number doesn't matter, you could try sampling like this (modify the sample rate to suit your scale):
$sample_rate = 100;
if(mt_rand(1,$sample_rate) == 1) {
$query = mysql_query(" UPDATE posts SET views = views + {$sample_rate} WHERE id = '{$id}' ");
// execute query, etc
}
If memcache is an option in your server environment, here's another cool way to sample that also keeps up with the precise number (unlike my other answer):
function recordPostPageView($page_id) {
$memcache = new Memcached(); // you could also pull this instance from somewhere else, if you want a bit more efficiency*
$memcache->addServer('127.0.0.1', 11211); // a server must be added before the instance can be used
$key = "Counter for Post {$page_id}";
if(!$memcache->get($key)) {
$memcache->set($key, 0);
}
$new_count = $memcache->increment($key);
// you could uncomment the following if you still want to notify mysql of the value occasionally
/*
$notify_mysql_interval = 100;
if($new_count % $notify_mysql_interval == 0) {
$query = mysql_query("UPDATE posts SET views = {$new_count} WHERE id = '{$page_id}' ");
// execute query, etc
}
*/
return $new_count;
}
And don't mind purists crying foul about Singletons. Or you could pass it into this function, if you're more purist than pragmatist :)
IMHO the best solution is to have views_count stored in memory (memcached, whatever) and do the updates in memory. (Of course the updates have to be synchronized.)
Then you can use a cron script which pushes those values to the db after some time (seconds, minutes, whatever); see the sketch below.
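A hedged sketch of that cron side, assuming the counters are kept in Memcached under keys like the ones in the earlier answer; the server address, database credentials and the list of post ids are assumptions:
<?php
// cron_push_views.php - run every few minutes to persist the in-memory counters
$memcache = new Memcached();
$memcache->addServer('127.0.0.1', 11211);
$db = new mysqli('localhost', 'user', 'pass', 'blog');
// in practice this would be the ids of posts viewed since the last run
$post_ids = array(1, 2, 3);
foreach ($post_ids as $post_id) {
    $key = "Counter for Post {$post_id}";
    $count = $memcache->get($key);
    if ($count !== false && $count > 0) {
        // push the buffered views into MySQL, then subtract what was just written
        $db->query("UPDATE posts SET views = views + " . (int)$count . " WHERE id = " . (int)$post_id);
        $memcache->decrement($key, (int)$count);
    }
}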
In the database there is only one column, ip, with a primary key defined; the IP is then stored in the database using the PHP code below:
Connection file:
<?php
$conn = mysqli_connect("localhost","root","");
if (!$conn) {
die("Connection failed: " . mysqli_connect_error());
}
$db=mysqli_select_db($conn,"DB_NAME");
if(!$db)
{
echo "Connection failed";
}
?>
PHP file:
<?php
$ip=$_SERVER['REMOTE_ADDR'];
$insert="INSERT INTO `id928751_photography`.`ip` (`ip`)VALUES ('$ip');";
$result = mysqli_query($conn,$insert);
?>
Show count:
<?php
$select="SELECT COUNT(ip) as count from ip;";
$run= mysqli_query($conn,$select);
$res=mysqli_fetch_array($run);
echo $res['count'];
?>
Using this method, the database stores all server IPs.
NOTE: only the server IP can be stored or counted, not the device IP.
You could keep a counter array in a cache (like APC or Memcache) and increase the counter for certain posts in it. Then store the updates once in a while. You might lose some views if a cache reset occurs.
Another solution would be to keep a separate table for visits only (fields: postid, visits). That is the fastest you can get from MySQL. Try to use the InnoDB engine, since it provides row-level locking!
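A hedged sketch of that separate table (table and column names are assumptions); with a primary key on postid, a single INSERT ... ON DUPLICATE KEY UPDATE per page view is enough:
// one-time setup:
// CREATE TABLE post_visits (
//     postid INT NOT NULL PRIMARY KEY,
//     visits INT NOT NULL DEFAULT 0
// ) ENGINE=InnoDB;

// per page view: create the row if missing, otherwise bump the counter
mysql_query("INSERT INTO post_visits (postid, visits) VALUES (" . (int)$id . ", 1)
             ON DUPLICATE KEY UPDATE visits = visits + 1");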
You can also check these lines of code. I think they will be helpful, because you can achieve your goal with just a text file; it does not require any database activity.
<?php
session_start();
$counter_name = "counter.txt";
// Check if a text file exists. If not create one and initialize it to zero.
if (!file_exists($counter_name)) {
$f = fopen($counter_name, "w");
fwrite($f,"0");
fclose($f);
}
// Read the current value of our counter file
$f = fopen($counter_name,"r");
$counterVal = fread($f, filesize($counter_name));
fclose($f);
// Has visitor been counted in this session?
// If not, increase counter value by one
if(!isset($_SESSION['hasVisited'])){
$_SESSION['hasVisited']="yes";
$counterVal++;
$f = fopen($counter_name, "w");
fwrite($f, $counterVal);
fclose($f);
}
echo "You are visitor number $counterVal to this site";
This way shows how many actual people viewed your website, not just how many times they viewed it.
Step 1: Connecting to MySQL
dbconfig.php
try
{
// Returns DB instance or create initial connection
$pdo = new PDO("mysql:host={$DB_host};port={$DB_port};dbname={$DB_name};charset=utf8mb4",$DB_user,$DB_pass);
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);
}
catch(PDOException $e)
{
echo $e->getMessage();
}
Step 2: Creating the MySQL table
--
-- Table structure for table `unique_visitors`
--
CREATE TABLE `unique_visitors` (
`date` date NOT NULL,
`ip` text COLLATE utf8_unicode_ci NOT NULL,
`views` int(1) NOT NULL
) ENGINE=InnoDB DEFAULT CHARSET=utf8 COLLATE=utf8_unicode_ci;
Step 3: Create a visitor counter by using the IP address.
<?php
require_once("dbconfig.php");
// Returns current date in YYYY-MM-DD format
$date = date("Y-m-d");
// Stores remote user ip address
$userIP = $_SERVER['REMOTE_ADDR'];
// Query for selecting record of current date from the table
$stmt = $pdo->prepare("SELECT * FROM unique_visitors WHERE date=:date");
$stmt->execute(['date' => $date]);
$row = $stmt->fetch(PDO::FETCH_ASSOC);
if($row === false){
// Block will execute when there is no record of current date in the database table
$data = [
'date' => $date,
'ip' => $userIP,
];
// SQL query for inserting new record into the database table with current date and user IP address
$sql = "INSERT INTO unique_visitors (date, ip) VALUES (:date, :ip)";
$pdo->prepare($sql)->execute($data);
}else{
// Will execute when current IP is not in database
if(!preg_match('/'.$userIP.'/i',$row['ip'])){
// Combines previous and current user IP address with a separator for updating in the database
$newIP = "$row[ip] $userIP";
$data = [
'ip' => $newIP,
'date' => $date,
];
$sql = "UPDATE unique_visitors SET ip=:ip, views=views+1 WHERE date=:date";
$pdo->prepare($sql)->execute($data);
}
}
?>
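To display the totals collected by the script above, a hedged query along these lines could be used (reusing the $pdo handle from dbconfig.php):
<?php
require_once("dbconfig.php");
// sum the per-day unique-visitor counters stored by the script above
$stmt = $pdo->query("SELECT SUM(views) AS total_views FROM unique_visitors");
$total = $stmt->fetch(PDO::FETCH_ASSOC);
echo "Total unique visits counted: " . (int)$total['total_views'];
?>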