PHP Connection Timeout Issue

In one of my applications, users can upload a CSV file (with pipe-separated fields). After the upload I store all of the file's contents in a temporary table (I truncate this table on every new upload so that it only contains the current file's data). I then iterate over every row of that table and perform some database operations according to the business logic.
The following code illustrates this:
if (isset($_POST['btn_uploadcsv']))
{
    $filename    = $_FILES["csvupload"]["name"];
    $uploads_dir = 'csvs'; // csv files...
    $tmp_name    = $_FILES["csvupload"]["tmp_name"];
    $name        = time();
    move_uploaded_file($tmp_name, "$uploads_dir/$name");
    $csvpath = "$uploads_dir/$name";

    $row = 0;
    $emptysql = "TRUNCATE TABLE `temp`";
    $connector->query($emptysql);

    if (($handle = fopen($csvpath, "r")) !== FALSE) {
        $str_ins = "";
        while (($data = fgetcsv($handle, 1000, "|")) !== FALSE) {
            /*
             * Here I get the column values to be stored in the
             * temporary table, using an INSERT command
             */
            unset($data);
        }
        fclose($handle);
    }

    /* Here I select the data stored above using a SELECT statement */
    for ($j = 0; $j < count($allrecords); $j++)
    {
        echo "In the loop";
        /* If I use an echo statement for debugging it works fine */

        //set_time_limit(300);
        /* I have tried this as well, but it does not work */

        if (!empty($allrecords[$j]['catid']))
        {
            // Here is my business logic, which mainly deals with
            // conditional DB operations
        }

        echo "Iteration done.";
        /* If I use an echo statement for debugging it works fine */
    }
}
The problem is that when I execute the above script on the server, it gives a server timeout error, but when I test the same script on my localhost it works fine.
Also, as mentioned in the code, if I use echo statements for debugging it works fine; as soon as I remove them it starts giving the connection timeout problem again.
I have tried set_time_limit(300) and set_time_limit(0), but neither of them seems to work.
Any idea how I can resolve this problem?
-- Many thanks for your time.
Edit:
I have checked that the files are uploading to the server.

Change set_time_limit to:
ini_set("max_execution_time", 300);
set_time_limit only takes effect when max_execution_time is not already set in php.ini.

I have resolved the issue by using flush to send intermediate output to the browser while the query is executing in the background.
This is how I modified the code:
for ($j = 0; $j < count($allrecords); $j++)
{
    /* At the end of each iteration, I added the following code */
    echo " ";
    flush();
}
Thanks to the contributors on this link: PHP: Possible to trickle-output to browser while waiting for database query to execute?, which is where I got the inspiration.
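A side note for anyone applying the same workaround: depending on PHP's and the web server's output-buffering configuration, flush() alone may not actually push the bytes to the client. The sketch below adds padding and ob_flush() as assumptions on top of the original fix, not as part of it:

// A sketch only; the 1 KB padding and the ob_* calls are assumptions that
// may need tuning for the server's buffering configuration.
for ($j = 0; $j < count($allrecords); $j++) {
    // ... per-row business logic / DB operations ...

    echo str_repeat(' ', 1024);   // padding helps get past some server-side buffers
    if (ob_get_level() > 0) {
        ob_flush();               // flush PHP's own output buffer, if one is active
    }
    flush();                      // then ask the SAPI / web server to send it
}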

Related

Error uploading two images with PHP

I am trying to upload two images with PHP and add them to the database. Somehow it only uploads one image, and the records in the database always end up with the same values.
This is the code I use:
<?php
include "../connect.php";

$name1 = $_FILES['pic1']['name'];
$size1 = $_FILES['pic1']['size'];
$name2 = $_FILES['pic2']['name'];
$size3 = $_FILES['pic2']['size'];

if (isset($_POST['name']))
{
    $extension1 = pathinfo($name1, PATHINFO_EXTENSION);
    $array = array('png','gif','jpeg','jpg');
    if (!in_array($extension1, $array)) {
        echo "<div class='faild'>".$array[0]."-".$array[1]."-".$array[2]."-".$array[3]." --> (".$name.")</div>";
    } else if ($size > 10000000) {
        echo "<div class='faild'>Size</div>";
    } else {
        $new_image1 = time().'.'.$extension1;
        $file1 = "images/upload";
        $pic1 = "$file1/".$new_image1;
        move_uploaded_file($_FILES["pic1"]["tmp_name"], "../".$pic1."");
        $insert = mysql_query("update temp set pic='$pic1' ") or die("error ins");
    }

    $extension2 = pathinfo($name2, PATHINFO_EXTENSION);
    $array = array('png','gif','jpeg','jpg');
    if (!in_array($extension2, $array)) {
        echo "<div class='faild'>".$array[0]."-".$array[1]."-".$array[2]."-".$array[3]." --> (".$name.")</div>";
    } else if ($size > 10000000) {
        echo "<div class='faild'>Size</div>";
    } else {
        $new_image2 = time().'.'.$extension2;
        $file2 = "images/upload";
        $pic2 = "$file2/".$new_image2;
        move_uploaded_file($_FILES["pic2"]["tmp_name"], "../".$pic2."");
        $insert = mysql_query("update temp set passport='$pic2'") or die("error ins");
    }
}
?>
One of the problems you have is with your UPDATE statement: there is no WHERE clause saying which record in the database should be updated, so the query updates them all. That's why you only have the last image in all the database rows.
Besides that, your code is not very good from a security point of view. You should take a look at mysqli or PDO for your database connection and queries, because the old mysql_* extension is deprecated and has been removed from PHP. Also take a look at SQL injection and data validation: besides some very basic extension and size validation there is nothing there to keep things safe. Try escaping and validating all user input.
Another point would be to take a look at functions. You're running almost exactly the same piece of code at least twice, and every code change has to be made twice. That's perfect for a function call, something like:
function storeImage($image) {
    // write the uploading and storing PHP here
}
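A rough sketch of how such a helper could look, using PDO prepared statements and a WHERE clause so only one row is touched; the $pdo connection, the $column/$recordId parameters, and the directory layout are assumptions for illustration, not part of the original code:

// Hypothetical helper; $pdo, $column and $recordId are assumed to exist in the caller.
function storeImage(PDO $pdo, array $file, $column, $recordId)
{
    $allowed   = array('png', 'gif', 'jpeg', 'jpg');
    $extension = strtolower(pathinfo($file['name'], PATHINFO_EXTENSION));

    if (!in_array($extension, $allowed)) {
        return "Only " . implode('-', $allowed) . " are allowed (" . $file['name'] . ")";
    }
    if ($file['size'] > 10000000) {
        return "File is too large";
    }

    $newName = uniqid('', true) . '.' . $extension;   // avoids time() collisions
    $target  = "../images/upload/" . $newName;

    if (!move_uploaded_file($file['tmp_name'], $target)) {
        return "Could not move the uploaded file";
    }

    // $column must come from a whitelist ('pic' or 'passport'), never from user input.
    $stmt = $pdo->prepare("UPDATE temp SET $column = :path WHERE id = :id");
    $stmt->execute(array(':path' => "images/upload/" . $newName, ':id' => $recordId));

    return null; // null means success in this sketch
}

// Usage (assumed record id):
// storeImage($pdo, $_FILES['pic1'], 'pic', $recordId);
// storeImage($pdo, $_FILES['pic2'], 'passport', $recordId);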

MySQL File Download Speed Limit

I have all my files stored in a MySQL database as blobs. I am trying to add a speed limit to the rate at which a user can download them through our PHP website. I have tried the sleep(1) method, but it does not seem to work, or I am not doing it right. So if anyone knows a way to limit the speed, I would love your help.
Here is my download code:
$query = mysql_query("SELECT * FROM file_servers WHERE id='$file_server_id'");
$fetch = mysql_fetch_assoc($query);
$file_server_ip       = $fetch['ip'];
$file_server_port     = $fetch['port'];
$file_server_username = $fetch['username'];
$file_server_password = $fetch['password'];
$file_server_db       = $fetch['database_name'];
$connectto = $file_server_ip.":".$file_server_port;

if (!$linkid = @mysql_connect($connectto, $file_server_username, $file_server_password, true))
{
    die("Unable to connect to storage server!");
}
if (!mysql_select_db($file_server_db, $linkid))
{
    die("Unable to connect to storage database!");
}

$nodelist = array();

// Pull the list of file inodes
$SQL = "SELECT id FROM file_data WHERE file_id='$file_id' order by id";
if (!$RES = mysql_query($SQL, $linkid))
{
    die("Failure to retrieve list of file inodes");
}
while ($CUR = mysql_fetch_object($RES))
{
    $nodelist[] = $CUR->id;
}

// Send down the headers to the client
header("Content-Type: $data_type");
header("Content-Length: $size");
header("Content-Disposition: attachment; filename=$name");

// Loop through and stream the nodes one by one
for ($Z = 0; $Z < count($nodelist); $Z++)
{
    $SQL = "select file_data from file_data where id = " . $nodelist[$Z];
    if (!$RESX = mysql_query($SQL, $linkid))
    {
        die("Failure to retrieve file node data");
    }
    $DataObj = mysql_fetch_object($RESX);
    echo $DataObj->file_data;
}
One way of doing this might be a combination of flush and sleep (a rough sketch follows below):
- read part of what you get from the database
- output some bytes
- flush the output to the user
- sleep for one second
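A minimal sketch of that idea applied to the last echo in the question's loop; the 64 KB chunk size and one-second pause are assumptions to be tuned to the bandwidth limit you actually want:

// Stream one node's blob in throttled chunks (roughly 64 KB per second).
$chunkSize = 64 * 1024;
$data      = $DataObj->file_data;           // one node's data, as in the question
$length    = strlen($data);

for ($offset = 0; $offset < $length; $offset += $chunkSize) {
    echo substr($data, $offset, $chunkSize);

    // Push the chunk out instead of letting it sit in a buffer.
    if (ob_get_level() > 0) {
        ob_flush();
    }
    flush();

    sleep(1);   // wait before sending the next chunk
}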
But also take a look at the throttle function:
http://php.net/manual/en/function.http-throttle.php
It also has an example there. I think it is better suited.
It is the very last echo line in your code where you would want to implement the throttling. I'm not familiar with whether PHP supports throttling output natively.
If not, you can try to split up the content you wish to echo ($DataObj->file_data) and echo it piece by piece with small pauses in between.
Also be sure to disable output buffering; otherwise nothing you echo will be sent until the entire script is done.

Mystery echo - can't find why

So I've made this upload script, and to make it more secure I'm finding out the type of each file.
However, for some reason, the file type is being echoed back to me!
For example:
image/jpeg; charset=binary Please upload only SWF files!
The echoed string looks the same when the upload is successful.
The code:
<?php
session_start();
defined('IN_SCRIPT') ? NULL : define('IN_SCRIPT', NULL);

require_once 'inc/db_connect.php';
require_once 'styles/import.php';

$style = new style_class(NULL);

if (!isset($_FILES['file']['tmp_name']) || empty($_FILES['file']['tmp_name']))
    die($style->upload_no_parameter());

$filetype = system('file -bi '.$_FILES['file']['tmp_name']);
$filetype = explode(';', $filetype, 1);
if ($filetype[0] != 'application/x-shockwave-flash; charset=binary')
    die($style->upload_wrong_format());

$sha256 = hash_file("sha256", $_FILES['file']['tmp_name']);

$query  = $db->prepare('SELECT id FROM swf WHERE hash = :hash');
$result = $query->execute(array(':hash' => $sha256));
if ($query->rowCount() != 0)
    die($style->upload_duplicate());

$query = $db->query('SELECT * FROM swf ORDER BY id DESC LIMIT 1;');
$name  = $query->fetch(PDO::FETCH_ASSOC);
$new_name = 'uploads/'.($name['id'] + 1).'.swf';

if (move_uploaded_file($_FILES['file']['tmp_name'], $new_name)) {
    $query = $db->prepare('INSERT INTO swf (uploader, upload_time, hash) VALUES (:id, NOW(), :hash);');
    $query->execute(array(':id' => $_SESSION['id'], ':hash' => $sha256));
    echo $style->upload_success();
}
else
    echo $style->upload_fail();
?>
I can't see why the script would echo this...
Thank you!
EDIT:
The style_class was the first place I looked. This class contains functions returning mainly HTML text. The whole class is auto-generated from the database.
I'm copying the upload_* functions from the generated file here, so you can see:
class style_class {
    function upload_no_parameter() {
        echo "<b>All parameters must be set!</b>";
    }
    function upload_fail() {
        echo "<b>There was an error, please try again.</b>";
    }
    function upload_success() {
        echo "<b>Your SWF has been uploaded!</b>";
    }
    function upload_duplicate() {
        echo "<b>File already exists!</b>";
    }
    function upload_wrong_format() {
        echo "<b>Please upload only SWF files!</b>";
    }
}
Thank you!
I'd bet die($style->upload_wrong_format()) is causing the issue. Check that function.
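It is also worth noting that, in the class pasted above, the upload_* methods echo their message rather than return it, so the text appears as soon as the method is called, regardless of what die() or echo does with the return value. A small sketch of a return-based variant (only the echo-to-return change is mine; the rest is from the question):

class style_class {
    // Returning the markup lets the caller decide when (and whether) to print it.
    function upload_wrong_format() {
        return "<b>Please upload only SWF files!</b>";
    }
    // ... apply the same change to the other upload_* methods ...
}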
You've got some very nasty logic bugs in your code:
1) You are assuming the file upload succeeded. Proper error handling goes like this:
if ($_FILES['file']['error'] !== UPLOAD_ERR_OK) {
    die("File upload failed with error code " . $_FILES['file']['error']);
}
Checking any of the other fields of a file upload is not reliable: those fields can still be present and populated even for a failed upload. The error codes are documented here: http://php.net/manual/en/features.file-upload.errors.php
2) You're shelling out to the file command to determine MIME types. Why? PHP has the fileinfo (finfo) extension for exactly this purpose: http://php.net/manual/en/book.fileinfo.php It uses the same magic-numbers library as file and doesn't require an external call to work.
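A minimal sketch of the finfo approach, assuming the same SWF check as in the question (variable names taken from the posted code):

// Detect the MIME type without shelling out to `file`.
$finfo = new finfo(FILEINFO_MIME_TYPE);
$mime  = $finfo->file($_FILES['file']['tmp_name']);

if ($mime !== 'application/x-shockwave-flash') {
    die($style->upload_wrong_format());
}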
3) You have a very racy, error-prone method of getting an ID number for your SWF:
$query = $db->query('SELECT * FROM swf ORDER BY id DESC LIMIT 1;');
$name  = $query->fetch(PDO::FETCH_ASSOC);
$new_name = 'uploads/'.($name['id'] + 1).'.swf';
Nothing says that another script cannot execute AND complete between the time you fetch this ID number and the time you finish up here. A proper method is to start a transaction, insert a skeleton record into the DB, retrieve its auto_increment primary key, then update the record and do your file moves using that ID. It is guaranteed to be unique, whereas at some point your code WILL fail and stomp on another upload.
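A rough sketch of that flow with PDO, reusing names from the question; it is a simplification in that it inserts the full record up front instead of a skeleton row, and real code would also want error handling around the file move:

// Reserve an ID first, then name the file after it.
$db->beginTransaction();

$stmt = $db->prepare('INSERT INTO swf (uploader, upload_time, hash) VALUES (:id, NOW(), :hash)');
$stmt->execute(array(':id' => $_SESSION['id'], ':hash' => $sha256));

$swfId    = $db->lastInsertId();        // auto_increment primary key, unique to this row
$new_name = 'uploads/' . $swfId . '.swf';

if (move_uploaded_file($_FILES['file']['tmp_name'], $new_name)) {
    $db->commit();
    echo $style->upload_success();
} else {
    $db->rollBack();                    // nothing half-written is left behind
    echo $style->upload_fail();
}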

CSV into MySQL with PHP

Hello, I'm trying to import data from a .csv file into a MySQL table. Below is the script I'm working with. After running it, only the print_r($_FILES) was executed, and it didn't insert anything into the database.
<?php session_start(); ?>
<?php require('includes/dbconnect.php'); ?>
<?php require 'includes/header.inc.php'; ?>
<?php
if (isset($_POST['SUBMIT']))
{
    $fname = $_FILES['csv_file']['name'];       // acquire the name of the file
    $chk_ext = explode(".", $fname);
    $filename = $_FILES['csv_file']['tmp_name'];
    $handle = fopen($filename, "r");            // open the file for reading

    while (($data = fgetcsv($handle, 1000, ",")) !== FALSE)
    {
        $sql = "INSERT into biodata (student_number, fname, lname, level) values('$data[0]','$data[1]','$data[2]')";
        mysql_query($sql) or die(mysql_error());
    }
    fclose($handle);
    echo "Successfully Imported";
}
else
{
    echo "Invalid File";
}
print_r($_FILES);
?>
Your query has a problem.
It specifies 4 columns in the column list, but you supplied only 3 values:
$sql = "INSERT into biodata (student_number, fname, lname, level) values('$data[0]','$data[1]','$data[2]')";
First of all, check whether the file was opened successfully:
$handle = fopen($filename, "r");
if (!$handle) {
    die('Cannot open file for reading');
}
That's actually the only place you're not handling correctly (I hope you have error reporting turned on, because the problem could also lie with fgetcsv(), and an error report would be crucial).
Once you've worked out how to access the uploaded file, you could also look into using LOAD DATA INFILE. If you use the LOCAL keyword (LOAD DATA LOCAL INFILE), that should work even on shared hosting. For some examples, see: http://dev.mysql.com/doc/refman/5.0/en/load-data.html
This has the benefit of being much, much faster than large numbers of INSERTs, which is especially relevant for large CSV files. Plus, it's quite easy to use.
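A sketch of what such a statement could look like for the biodata table from the question; the field list, header row, and line terminator are assumptions you would adjust to the actual CSV:

// Assumed CSV layout: student_number,fname,lname,level with one header row.
$csv = mysql_real_escape_string($_FILES['csv_file']['tmp_name']);

$sql = "LOAD DATA LOCAL INFILE '$csv'
        INTO TABLE biodata
        FIELDS TERMINATED BY ','
        LINES TERMINATED BY '\\n'
        IGNORE 1 LINES
        (student_number, fname, lname, level)";

mysql_query($sql) or die(mysql_error());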

PHP script is still running on server, but the webpage display has stopped

I wrote a PHP script to push notifications using APNS. I added a PHP progress bar to monitor how many users have been pushed to; the progress bar is displayed in the PHP page. I also keep updating a MySQL database to record the number. This script is expected to run for a very long time. After running for about 3 hours, the PHP page (with the progress bar) stopped, but when I check the database, the number of pushed users is still increasing. This means the script is still running in the server's memory, so why has the page display stopped?
Here is some code:
$count = 1;
while ($row = mysql_fetch_array($result)) {
    $pushToken = $row['pushToken'];
    $result2 = mysql_query("SELECT COUNT(*) FROM deactivated_pushtokens WHERE pushToken LIKE '$pushToken'");
    list($token_deactivated) = mysql_fetch_row($result2);

    if ($token_deactivated == 0) {
        if ($row['pushToken'] != "") {
            if (strlen($row['pushToken']) == 64) { // all valid push tokens have a 32x2 = 64 length
                //echo "<br>$count. Sending push to ".$row['deviceID']." Device token = ".$row['pushToken'];
                //echo "<br><br>";
                if ($count > $sendThreshold)
                {
                    $pushMessage->sendMessage($row['pushToken'], $count, $appVersion, $mode, $message, $push_id);
                }
                if ($count >= $push_update_count * $push_update_interval)
                {
                    $pushlog_update = mysql_query("UPDATE pushlogs SET num_push_sent = '".$count."' WHERE push_id = '".$push_id."'");
                    if (!$pushlog_update)
                    {
                        // echo "pushlog table update error: ".mysql_error."<br />";
                    }
                    /* if($count<=$maxBar) // if it fails again, comment out and use the block below
                    {
                        $prb->moveStep($count);
                    }
                    */
                    $push_update_count++;
                }
                if ($count >= $update_progressbar_count * $update_progressbar_interval)
                {
                    if ($count <= $maxBar)
                    {
                        $prb->moveStep($count);
                    }
                    $update_progressbar_interval++;
                }
                $count++;
                // move the Bar
Perhaps the page display stopped because of the Apache configuration in httpd.conf:
KeepAliveTimeout 300
PHP keeps running because of the max_execution_time setting in php.ini.
Just to note, you are not actually calling the mysql_error function at all. Replace the line:
echo "pushlog table update error: ".mysql_error."";
with this one:
echo "pushlog table update error: ".mysql_error()."<br />";
Furthermore, what you are doing is very bad practice. Try making an updater: keep your position in a session, then update/refresh the page and continue from where you left off. Even if you do not have a timeout limit in your .htaccess, that doesn't mean anything, and sometimes you simply cannot raise the time limit.
Try refreshing the page first, to see if it helps. You can use an HTML meta tag for that, or:
header('Location: thispage.php');
Then make each step of your program its own request (a sketch follows below).
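A minimal sketch of that "continue from where you left off" idea; the batch size, the session key, and the users query are assumptions for illustration only:

<?php
// Each request processes one batch, remembers its position, then reloads itself.
session_start();

$batchSize = 500;                                   // assumed batch size
$offset    = isset($_SESSION['push_offset']) ? (int)$_SESSION['push_offset'] : 0;

$result = mysql_query("SELECT pushToken FROM users LIMIT $offset, $batchSize");

$sent = 0;
while ($row = mysql_fetch_array($result)) {
    // ... send the push notification for $row['pushToken'] here ...
    $sent++;
}

if ($sent > 0) {
    $_SESSION['push_offset'] = $offset + $sent;     // checkpoint the position
    header('Location: ' . $_SERVER['PHP_SELF']);    // continue in a fresh request
    exit;
}

echo "All notifications sent.";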
