Hello, I'm trying to import data from a .csv file into a MySQL table. Below is the script I'm working with. After running it, only print_r($_FILES) was executed, and it didn't insert into the database.
<?php session_start(); ?>
<?php require('includes/dbconnect.php'); ?>
<?php require 'includes/header.inc.php'; ?>
<?php
if (isset($_POST['SUBMIT']))
{
    $fname = $_FILES['csv_file']['name']; // Acquire the name of the file
    $chk_ext = explode(".", $fname);
    $filename = $_FILES['csv_file']['tmp_name'];
    $handle = fopen($filename, "r"); // Open the file for reading
    while (($data = fgetcsv($handle, 1000, ",")) !== FALSE)
    {
        $sql = "INSERT into biodata (student_number, fname, lname, level) values('$data[0]','$data[1]','$data[2]')";
        mysql_query($sql) or die(mysql_error());
    }
    fclose($handle);
    echo "Successfully Imported";
}
else
{
    echo "Invalid File";
}
print_r($_FILES);
?>
Your query has a problem.
It expects 4 columns (as you specified in the column list) but you supplied only 3 values:
$sql = "INSERT into biodata (student_number, fname, lname, level) values('$data[0]','$data[1]','$data[2]')";
First of all, check whether the file was opened successfully:
$handle = fopen($filename, "r");
if (!$handle) {
    die('Cannot open file for reading');
}
That's actually the only place you're not handling errors correctly (I hope you have error reporting turned on, because the problem could also lie with fgetcsv(), and an error report would be crucial).
Once you've worked out how to access the uploaded file, you could also look into using LOAD DATA INFILE. If you use the LOCAL keyword (LOAD DATA LOCAL INFILE), that should work even on shared hosting. For some examples, see: http://dev.mysql.com/doc/refman/5.0/en/load-data.html
This has the benefit of being much, much faster than large numbers of INSERTs, which is especially relevant for large CSV files. Plus, it's quite easy to use.
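As a hedged sketch of that approach, using the biodata table and columns from the question (the file path and the helper function are my own illustration, not from the original post), the statement could be built like this:

```php
<?php
// Sketch only: buildLoadDataSql() is a hypothetical helper. The table and
// column names (biodata, student_number, fname, lname, level) follow the
// question; the CSV path is an assumption.
function buildLoadDataSql($csvPath, $table, array $columns)
{
    $path = addslashes($csvPath); // escape the path for the SQL string literal
    $cols = implode(", ", $columns);
    return "LOAD DATA LOCAL INFILE '$path' "
         . "INTO TABLE $table "
         . "FIELDS TERMINATED BY ',' "
         . "LINES TERMINATED BY '\\n' "
         . "($cols)";
}

$sql = buildLoadDataSql('/tmp/students.csv', 'biodata',
                        array('student_number', 'fname', 'lname', 'level'));
// Then run it, e.g. mysqli_query($link, $sql); note the connection must be
// created with local-infile enabled on both client and server.
```

One statement replaces the whole fgetcsv() loop, which is where the speed advantage comes from.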
I am working in PHP. I want to browse and upload an image file.
This is my PHP code:
<?php
if (isset($_POST['submit']))
{
    $link = mysql_connect('localhost', 'root', '');
    mysql_select_db('bawa');
    if (isset($_FILES['image']) && $_FILES['image']['size'] > 0)
    {
        // Temporary file name stored on the server
        $tmpname = $_FILES['image']['tmp_name'];
        // read the file
        $fp = fopen($tmpname, 'r');
        $data = fread($fp, filesize($tmpname));
        $data = addslashes($data);
        fclose($fp);
        $query = ("UPDATE user_summary SET image='$data' where user_id=2");
        $query .= "(image) VALUES ('$data)";
        $results = mysql_query($query, $link);
        echo "Working code";
    }
    else {
        echo mysql_error();
    }
}
?>
When I click on the submit button my image should be updated in my database, but it isn't updating. Any help?
The main problem at the moment is the line...
$query .= "(image) VALUES ('$data)";
This looks more like part of an INSERT statement. Commenting this line out should leave the UPDATE correct.
Although, as pointed out, you should work towards updating this to use either the PDO or mysqli library, with prepared statements and bind variables.
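As a hedged sketch of that suggestion (the helper function and the DSN are my own illustration; the table and column names user_summary, image and user_id come from the question), the UPDATE could look like this with PDO and bound parameters:

```php
<?php
// Sketch only: updateUserImage() is a hypothetical helper.
function updateUserImage(PDO $pdo, $userId, $imageData)
{
    $stmt = $pdo->prepare(
        "UPDATE user_summary SET image = :image WHERE user_id = :id");
    // Binding the blob as a parameter makes addslashes() unnecessary
    $stmt->bindValue(':image', $imageData, PDO::PARAM_LOB);
    $stmt->bindValue(':id', $userId, PDO::PARAM_INT);
    return $stmt->execute();
}

// Usage (MySQL DSN shown for illustration):
// $pdo = new PDO('mysql:host=localhost;dbname=bawa', 'root', '');
// updateUserImage($pdo, 2, file_get_contents($_FILES['image']['tmp_name']));
```

Besides closing the injection hole, this also avoids the quoting bug in the original concatenated query.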
I am working on a PHP/MySQL project; it must handle the following tasks:
-> The user uploads multiple large CSV files at a time with the same column names (X,Y,Z) into MySQL tables
-> The web app must perform an arithmetic operation between each CSV file's columns
-> The user can download the CSV files after the operation as Excel files
For the upload part, I need to find a way to auto-generate a table in the database for each CSV file uploaded -instead of creating it in advance-, because the user should be able to upload as many files as they want.
I tried to set up a while loop that contains a CREATE TABLE; the loop goes from 0 to $var, which is the number of CSV files the user wishes to upload. However, it doesn't add any table. Here's the code for that part:
$con= getdb();
$var=$_GET["quantity"];
mysql_query("set $i=0;
while $i<`".$var."` do
create table `couche".$var."` ( X float NOT NULL,Y float NOT NULL,Z float NOT NULL);
set $i= $i+1;
end while");
}
Hi, you can use the following approach to achieve it:
<?php
// database connection details
$link = mysqli_connect('localhost', 'root', 'password', 'db_name') or die('Could not connect to MySQL');
// path where your CSV file is located
define('CSV_PATH', '/var/www/html/skerp_scripts/');
// Name of your CSV file
$csv_file = CSV_PATH . "importItems.csv";
if (($handle = fopen($csv_file, "r")) !== FALSE) {
    $header = true; // If there is a header in the first row then set it to true
    while (($data = fgetcsv($handle, 100000, ",")) !== FALSE) {
        if ($header == true) {
            /* Here you can check whether all the columns are as expected,
               e.g. in CSV1: id, firstname, lastname, age
                    in CSV2: firstname, age, id
               Then you can tell the user that there is a mismatch
            */
            $header = false;
            continue;
        }
        $column1 = $data[0];
        $column2 = $data[1];
        $column3 = $data[2];
        $calculation = $column1 * $column3;
        $result = mysqli_query($link, "INSERT INTO table_name (column1, column2, column3) VALUES ('$column1', '$column2', '$calculation')");
    }
    fclose($handle);
}
echo "File data successfully imported to database!!";
mysqli_close($link);
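On the auto-generated tables the question actually asked about: the SET/WHILE syntax in the question only works inside MySQL stored programs, not in a plain mysql_query() call, and it also names every table couche$var, so the index never varies. A hedged sketch of doing the loop in PHP instead (the helper function is my own illustration; getdb() and the couche* naming come from the question):

```php
<?php
// Sketch only: buildCreateTableSql() is a hypothetical helper producing
// one CREATE TABLE statement per uploaded file: couche0, couche1, ...
function buildCreateTableSql($i)
{
    return "CREATE TABLE couche$i ("
         . "X float NOT NULL, Y float NOT NULL, Z float NOT NULL)";
}

$quantity = isset($_GET['quantity']) ? (int) $_GET['quantity'] : 0; // cast to int to avoid injection
for ($i = 0; $i < $quantity; $i++) {
    $sql = buildCreateTableSql($i);
    // mysqli_query($con, $sql); // run one CREATE TABLE per file
}
```

The loop runs in PHP, so each iteration issues an ordinary single-statement query.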
My project requires external CSV files to be uploaded to a database every week, and the required outputs are extracted whenever needed. Currently I am using the code below to browse via an HTML form and upload to the table. It works fine!
if (isset($_POST['submit']))
{
    $file = $_FILES['file']['tmp_name'];
    $handle = fopen($file, "r");
    while (($fileop = fgetcsv($handle, 1000, ",")) !== false)
    {
        $placement_name = $fileop[2];
        $statistics_date = $fileop[0];
        $impressions = $fileop[5];
        $clicks = $fileop[6];
        $sql = mysql_query("INSERT INTO `table` (source,advertiser,campaign,placement_name,statistics_date,impressions,clicks) VALUES ('xxx','yyy','zzz','$placement_name','$statistics_date','$impressions','$clicks')");
    }
}
But now we might need to do this on a daily basis and automated (which we hope to do with scheduled cron jobs), so the decision was to use LOAD DATA LOCAL INFILE. As you can see, the values inserted for source, advertiser and campaign are custom ones. How could we use LOAD DATA... instead of the browse function?
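A hedged sketch of what that could look like: LOAD DATA's SET clause can supply the constant columns, and unused CSV columns can be routed into @dummy user variables. The column positions follow the indices used in the question's loop ($fileop[0], [2], [5], [6]); the file path is an assumption for the cron job, and 'xxx'/'yyy'/'zzz' are the question's own placeholders:

```php
<?php
// Sketch only: the path is hypothetical; `table` and the column names
// follow the question's INSERT.
$csvPath = '/path/to/daily_report.csv';
$sql = "LOAD DATA LOCAL INFILE '" . addslashes($csvPath) . "' "
     . "INTO TABLE `table` "
     . "FIELDS TERMINATED BY ',' "
     . "LINES TERMINATED BY '\\n' "
     // CSV columns 0..6; unused ones go to @dummy
     . "(statistics_date, @dummy, placement_name, @dummy, @dummy, impressions, clicks) "
     // constant values that were hard-coded in the INSERT
     . "SET source = 'xxx', advertiser = 'yyy', campaign = 'zzz'";
// mysql_query($sql); // or mysqli_query($link, $sql) on newer code
```

Run from a cron-driven script, this removes the need for the HTML browse form entirely.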
In one of my applications, users can upload a CSV file (|-separated fields). After uploading, I store all the content of the file in a temporary table (I truncate this table on every new upload so that it contains only the current file's data). After that I iterate over each row of that table and perform some database operations as per the business logic.
The following code will illustrate this:
if (isset($_POST['btn_uploadcsv']))
{
    $filename = $_FILES["csvupload"]["name"];
    $uploads_dir = 'csvs'; // csv files...
    $tmp_name = $_FILES["csvupload"]["tmp_name"];
    $name = time();
    move_uploaded_file($tmp_name, "$uploads_dir/$name");
    $csvpath = "$uploads_dir/$name";
    $row = 0;
    $emptysql = "TRUNCATE TABLE `temp`";
    $connector->query($emptysql);
    if (($handle = fopen($csvpath, "r")) !== FALSE) {
        $str_ins = "";
        while (($data = fgetcsv($handle, 1000, "|")) !== FALSE) {
            /*
             * Here I am getting the column values to be stored in
             * the table, using the INSERT command
             */
            unset($data);
        }
        fclose($handle);
    }
    /* Here I am selecting the above stored data using a SELECT statement */
    for ($j = 0; $j < count($allrecords); $j++)
    {
        echo "In the loop";
        /* If I use an echo statement for debugging it is working fine */
        //set_time_limit(300);
        /* I have tried this also but it is not working */
        if (!empty($allrecords[$j]['catid']))
        {
            // Here is my business logic which mainly deals with
            // conditional DB operations
        }
        echo "Iteration done.";
        /* If I use an echo statement for debugging it is working fine */
    }
}
The problem is that when I execute the above script on the server it gives a server timeout error, but when I test it on my localhost it works fine.
Also, as mentioned in the code, if I use echo statements for debugging it works fine, and when I remove them it starts giving the connection timeout problem.
I have tried set_time_limit(300) and set_time_limit(0), but neither seems to work.
Any idea how I can resolve this problem?
-- Many thanks for your time.
Edit:
I have checked that the files are uploading to the server.
Change
set_time_limit
to
ini_set("max_execution_time", 300);
set_time_limit() only takes effect when max_execution_time has not been set in php.ini.
I resolved the issue by using flush() to send intermediate output to the browser while the query executes in the background.
This is how I modified the code:
for ($j = 0; $j < count($allrecords); $j++)
{
    /* At the end of each iteration, I added the following code */
    echo " ";
    flush();
}
Thanks to the contributors over this link PHP: Possible to trickle-output to browser while waiting for database query to execute?, from where I got inspiration.
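Another thing worth trying, since the question's unused $str_ins variable suggests batching was intended: collapsing the per-row INSERTs into one multi-row statement cuts the number of round trips dramatically, which often fixes this kind of timeout at the source. A sketch, assuming hypothetical column names col1 and col2 on the temp table:

```php
<?php
// Sketch only: buildBatchInsert() is a hypothetical helper; the `temp`
// table comes from the question, the column names are assumptions.
function buildBatchInsert($table, array $rows)
{
    $values = array();
    foreach ($rows as $row) {
        // In real code, escape each value with mysqli_real_escape_string()
        $values[] = "('" . implode("','", $row) . "')";
    }
    return "INSERT INTO `$table` (col1, col2) VALUES " . implode(",", $values);
}

$rows = array(array('a', 'b'), array('c', 'd'));
$sql = buildBatchInsert('temp', $rows);
// $connector->query($sql); // one round trip for the whole batch
```

For very large files, send a batch every few hundred rows rather than one giant statement, to stay under max_allowed_packet.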
I have some code to upload and download a sound recording from Android. The problem I am having is that an extra blank line appears in the binary; when this is removed, the file plays. I would like to know how to stop this line appearing. Below are my upload and download code, as well as a screenshot of the blank line.
Upload code
mysql_select_db($database);
// Make sure the user actually
// selected and uploaded a file
if (isset($_FILES['image']) && $_FILES['image']['size'] > 0) {
    $size = $_FILES['image']['size'];
    $type = $_FILES['image']['type'];
    // Temporary file name stored on the server
    $tmpName = $_FILES['image']['tmp_name'];
    // Read the file
    $fp = fopen($tmpName, 'r');
    $data = fread($fp, filesize($tmpName));
    fclose($fp);
    $data = trim(addslashes($data));
    // Create the query and insert
    // into our database.
    $query = "INSERT INTO media";
    $query .= "(file, file_size, file_type) VALUES ('$data','$size','$type')";
    $results = mysql_query($query, $link);
    $mediaid = mysql_insert_id();
    $gender = $_POST['gender'];
    $cat_id = $_POST['cat'];
    $name = $_POST['name'];
    $lat = $_POST['lat'];
    $lon = $_POST['lon'];
    $user = $_POST['user'];
    $query = "INSERT INTO instance (name, gender, cat_id, lon, lat, user_id) VALUES ('$name', '$gender', '$cat_id', '$lon', '$lat', '$user')";
    $result = mysql_query($query);
    $instanceid = mysql_insert_id();
    $query4 = "INSERT INTO media_link";
    $query4 .= "(media_id, instance_id) VALUES ('$mediaid','$instanceid')";
    $results4 = mysql_query($query4, $link);
}
// Close our MySQL link
mysql_close($link);
?>
download code
$test2 = @mysql_query("select * from media where media_id = '$media'");
$result2 = mysql_fetch_array($test2);
header('Content-Type: audio/AMR');
header('Content-Disposition: attachment; filename="ifound.amr"');
print $result2['file'];
exit;
?>
Blank line that is appearing
Check if your download code has a blank line before the first <?php . Remember to check any files it includes as well.
Also change addslashes to mysql_real_escape_string. It might not cause a problem here, but it is a security hole.
If you can't find the root of your problem, you could always try base64_encode / base64_decode. It takes 30% more storage space, but it's a bulletproof way to store binary data in strings.
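To show why the base64 round trip is safe for this: the encoded form is plain ASCII, so nothing in the transport or the database can mangle it, and decoding restores the exact bytes (the sample bytes below just stand in for the recording):

```php
<?php
// Minimal demonstration of the base64 round trip for binary data.
$binary  = "\x00\x01\xFFAMR data\n\x00"; // arbitrary sample bytes
$encoded = base64_encode($binary);        // safe to store in a TEXT column
$decoded = base64_decode($encoded);       // restores the original bytes exactly
var_dump($decoded === $binary);           // bool(true)
```

Encode before the INSERT, decode in the download script before the print.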
Just a tip:
$fp = fopen($tmpName, 'r');
$data = fread($fp, filesize($tmpName));
fclose($fp);
could be replaced with
$data = file_get_contents($tmpName);
I was also having the same problem in my code, but then I found out that one of the included files had an empty line before the opening tag, like below:
tool.php
line 1
line 2 <?php
line 3 .......
line 4 ?>
Line 1 is causing the problem when I include the file in:
<?php
if ($_SERVER['REQUEST_METHOD'] == "GET") {
    if (isset($_GET["ImageID"])) {
        /* the require below is the file causing the problem */
        require_once($_SERVER['DOCUMENT_ROOT'] . "/model/Game/Tools.php");
        $image = new ClsGameImage();
        $image->Select($_GET["ImageID"]);
        header("Content-type: " . $image->MIMEType);
        header("Content-length: " . $image->ImageSize);
        header("Content-Disposition:attachment;filename=" . $image->Name0);
        echo $image->Image0;
    }
}
?>
It's possible ltrim() could help in this situation if the line is being introduced by PHP. There is also an LTRIM() function in MySQL if it's being introduced in the database.
Also, use mysql_real_escape_string instead of addslashes.
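A small sketch of the ltrim() idea on the PHP side (the helper is my own illustration; the 'file' column follows the question's media table). It strips only leading whitespace and newlines, leaving the rest of the binary untouched:

```php
<?php
// Sketch only: stripLeadingWhitespace() is a hypothetical helper.
function stripLeadingWhitespace($blob)
{
    // Remove leading spaces, tabs, \r and \n; stop at the first real byte
    return ltrim($blob, " \t\r\n");
}

// In the download script, e.g.:
// print stripLeadingWhitespace($result2['file']);
```

This treats the symptom rather than the cause, so it's best used while you hunt for the stray blank line in an included file.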
You may want to consider serving media from a media directory instead of storing it in a database. I know this does nothing for replication purposes, but there are things you can do to propagate filesystem changes to multiple computers, if necessary.
This is obviously a preferential choice.