PHP script to export new entries from SQL table and overwrite file - php

I am trying to create a php script that will be run by a cron job on a nightly basis.
The database holds entries from a contact form.
The idea is to run the PHP script (via cron) and have it export any new entries from the database from that day into a CSV file on the server, overwriting that file each time for housekeeping. It needs to export only the entries added to the database that day.
Here is what I have at the moment:
<?php
// Connect and query the database for the users
$conn = new PDO("mysql:host=localhost;dbname=dbname", 'username', 'password');
$sql = "SELECT * FROM enquiries ORDER BY firstname";
$results = $conn->query($sql);
// Pick a filename and destination directory for the file
// Remember that the folder where you want to write the file has to be writable
//$filename = "/tmp/db_user_export_".time().".csv";
$filename = "/home/domain/public_html/csv/db_user_export.csv";
// Actually create the file
// Opening with 'wb' will wipe out and overwrite any existing file with the same name
$handle = fopen($filename, 'wb');
// Write the spreadsheet column titles / labels
fputcsv($handle, array('FirstName','Email'));
// Write all the user records to the spreadsheet
foreach($results as $row)
{
fputcsv($handle, array($row['firstname'], $row['email']));
}
// Finish writing the file
fclose($handle);
?>

Your SQL should be:
SELECT * FROM enquiries where DATE(DateField) = CURDATE() ORDER BY firstname
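Putting it together, a minimal sketch of the nightly script with that filter applied (assuming the date column really is called DateField; swap in your own column name and credentials):
<?php
// Connect and select only the entries added today
$conn = new PDO("mysql:host=localhost;dbname=dbname", 'username', 'password');
$conn->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);
$results = $conn->query("SELECT firstname, email FROM enquiries WHERE DATE(DateField) = CURDATE() ORDER BY firstname");
// 'w' truncates the file, so every nightly run overwrites the previous export
$handle = fopen("/home/domain/public_html/csv/db_user_export.csv", 'w');
fputcsv($handle, array('FirstName', 'Email'));
foreach ($results as $row) {
    fputcsv($handle, array($row['firstname'], $row['email']));
}
fclose($handle);
?>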

Related

create unique folder of each user from input and save images inside that folder using php

I'm working on a project for creating an Online Exam for college entrance. I am looking for a solution to upload an image, create a folder named with the user's serial ID (the MySQL auto-increment primary key), and save the images inside that folder.
Here is my current solution, where images are uploaded to an already created folder called uploads. How can I modify it to do what I need?
<?php
session_start();
include('../connect.php');
$a = $_POST['name'];
// query
$file_name = strtolower($_FILES['file']['name']);
$file_ext = substr($file_name, strrpos($file_name, '.'));
$prefix = 'your_site_name_'.md5(time()*rand(1, 9999));
$file_name_new = $prefix.$file_ext;
$path = '../uploads/'.$file_name_new;
/* check if the file uploaded successfully */
if(#move_uploaded_file($_FILES['file']['tmp_name'], $path)) {
//do your write to the database filename and other details
$sql = "INSERT INTO student (name,file) VALUES (:a,:h)";
$q = $db->prepare($sql);
$q->execute(array(':a'=>$a,':h'=>$file_name_new));
header("location: students.php");
}
?>
First: Be careful uploading files without correctly checking their types (see the sketch below);
Second: As Sean mentioned, this approach may get out of control.
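On the first point, a rough sketch of a type check before moving the file, assuming the fileinfo extension is available (the allowed list is just an example, adjust it to your needs):
$allowed = array('jpg' => 'image/jpeg', 'jpeg' => 'image/jpeg', 'png' => 'image/png');
$ext = ltrim(strtolower(strrchr($_FILES['file']['name'], '.')), '.');
// Check the real MIME type of the uploaded file, not just its extension
$finfo = new finfo(FILEINFO_MIME_TYPE);
$mime = $finfo->file($_FILES['file']['tmp_name']);
if (!isset($allowed[$ext]) || $allowed[$ext] !== $mime) {
    die('Invalid file type');
}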
--
The following example changes the order of the steps you tried.
First insert the user to get its ID, and then create the folder under '../uploads' with it.
This way you do not need to move the file twice (move the file to ../uploads, insert the user, and then move the file again to ../uploads/userID).
<?php
...
$sql = "INSERT INTO student (name,file) VALUES (:a,:h)";
$q = $db->prepare($sql);
$q->execute(array(':a'=>$a,':h'=>$file_name_new));
// Get user id
$studentId = $db->lastInsertId();
// Create path with new userId just inserted
$path = "../uploads/$studentId/";
if(!file_exists($path)) { // maybe "&& !is_dir($path)" ?
// IMPORTANT NOTE: the second argument is the permission of the folder. 0777 is the default and may cause security flaws
mkdir($path, 0777, true);
}
// Move the uploaded file to the new path with the userId
if (!move_uploaded_file($_FILES['file']['tmp_name'], $path.$file_name_new)) {
    // handle the failure here instead of hiding it with the # suppression
}
I think you need to check if the user already exists before inserting (but I don't know the details of your project)

Automatically generate MySQL table after csv file upload

I am working on a PHP/MySQL project; it must perform the following tasks:
-> The user uploads multiple large CSV files at a time, all with the same column names (X,Y,Z), into MySQL tables
-> The web app must perform an arithmetic operation between each CSV file's columns
-> The user can download the CSV files after the operation as Excel files
For the upload part, I need to find a way to auto-generate a table in the database for each CSV file uploaded (instead of creating it in advance), because the user should be able to upload as many files as they want.
I tried to set up a while loop that contains a CREATE TABLE; the loop goes from 0 to $var, which is the number of CSV files the user wishes to upload, however it doesn't add any table. Here's the code for that part:
$con= getdb();
$var=$_GET["quantity"];
mysql_query("set $i=0;
while $i<`".$var."` do
create table `couche".$var."` ( X float NOT NULL,Y float NOT NULL,Z float NOT NULL);
set $i= $i+1;
end while");
}
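Note that WHILE ... END WHILE only works inside stored programs, so mysql_query() will reject it as a single statement; the loop belongs in PHP, with one CREATE TABLE per iteration. A rough sketch, assuming getdb() can be switched to return a mysqli connection and that table names like couche0, couche1, ... are what you want:
$con = getdb(); // assumed here to return a mysqli connection
$var = (int) $_GET["quantity"]; // cast to int so the user-supplied value is safe to use
for ($i = 0; $i < $var; $i++) {
    $sql = "CREATE TABLE IF NOT EXISTS `couche$i` (
                X float NOT NULL,
                Y float NOT NULL,
                Z float NOT NULL
            )";
    if (!mysqli_query($con, $sql)) {
        die('Could not create table couche' . $i . ': ' . mysqli_error($con));
    }
}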
You can use the following approach to achieve it:
<?php
//database connection details
$link = mysqli_connect('localhost','root','password','db_name') or die('Could not connect to MySQL');
// path where your CSV file is located
define('CSV_PATH','/var/www/html/skerp_scripts/');
// Name of your CSV file
$csv_file = CSV_PATH . "importItems.csv";
if (($handle = fopen($csv_file, "r")) !== FALSE) {
$header = true; //If there is a header in first row then set it to true
while (($data = fgetcsv($handle, 100000, ",")) !== FALSE) {
if($header == true){
/* Here you can perform checks whether all the column are as expected
for Eg: in CSV1 : id, firstname, lastname, age
in CSV2 : firstname, age ,id
Then you can tell the user that there is a mismatch
*/
$header = false;
continue;
}
$column1 = $data[0];
$column2 = $data[1];
$column3 = $data[2];
$calculation = $column1 * $column3;
$result = mysqli_query($link, "INSERT INTO table_name (column1, column2, column3) VALUES ('$column1', '$column2', '$calculation')");
}
}
echo "File data successfully imported to database!!";
mysqli_close($link);
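Since the inserted values come straight from the CSV, a prepared statement is safer than interpolating them into the query string; a rough sketch of the insert part (the table and column names are placeholders):
// Prepare once, before the while loop; the variables are bound by reference,
// so reassigning them and calling execute() inside the loop is enough
$stmt = mysqli_prepare($link, "INSERT INTO table_name (column1, column2, column3) VALUES (?, ?, ?)");
mysqli_stmt_bind_param($stmt, "ddd", $column1, $column2, $calculation);
// ... inside the loop, after $column1, $column2 and $calculation are set:
mysqli_stmt_execute($stmt);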

load data local infile user provided VALUES

My project requires external CSV files to be uploaded to a database every week, and required outputs are extracted whenever needed. Currently I am using the code below to browse to the file with an HTML form and upload it to the table. It works fine!
if(isset($_POST['submit']))
{
$file = $_FILES['file']['tmp_name'];
$handle = fopen($file,"r");
while(($fileop = fgetcsv($handle,1000,",")) !==false)
{
$placement_name = $fileop[2];
$statistics_date = $fileop[0];
$impressions = $fileop[5];
$clicks = $fileop[6];
$sql = mysql_query("INSERT INTO 'table'(source,advertiser,campaign,placement_name,statistics_date,impressions,clicks) VALUES ('xxx','yyy','zzz','$placement_name','$statistics_date','$impressions',' $clicks')");}
But now we might need to do this on a daily basis and automated (which we hope to do with scheduled cron jobs), so the decision was to use LOAD DATA LOCAL INFILE. As you can see, the values inserted for source, advertiser and campaign are custom ones. How could we use LOAD DATA... instead of the browse function?
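One option (a sketch, not tested against your exact schema): LOAD DATA LOCAL INFILE can read the CSV columns into user variables, and the fixed source/advertiser/campaign values can be supplied with a SET clause. Local infile must be enabled on both the client and the server; the file path below is a placeholder and the column positions mirror the fgetcsv indexes above:
$link = mysqli_init();
mysqli_options($link, MYSQLI_OPT_LOCAL_INFILE, true); // must be set before connecting
mysqli_real_connect($link, 'localhost', 'username', 'password', 'dbname');
$sql = "LOAD DATA LOCAL INFILE '/path/to/report.csv'
        INTO TABLE `table`
        FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '\"'
        LINES TERMINATED BY '\\n'
        (@c1, @c2, @c3, @c4, @c5, @c6, @c7)
        SET source          = 'xxx',
            advertiser      = 'yyy',
            campaign        = 'zzz',
            statistics_date = @c1,
            placement_name  = @c3,
            impressions     = @c6,
            clicks          = @c7";
if (!mysqli_query($link, $sql)) {
    die('LOAD DATA failed: ' . mysqli_error($link));
}
If the CSV date format differs from YYYY-MM-DD, wrap @c1 in STR_TO_DATE() in the SET clause.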

PHP Connection Timeout Issue

In one of my applications, users can upload a CSV file (| separated fields). After uploading, I store the entire contents of the file in a temporary table (I truncate this table on every new upload so that it only contains the current file's data). After that I iterate over each row of that table and perform some database operations as per the business logic.
The following code will illustrate this:
if(isset($_POST['btn_uploadcsv']))
{
$filename = $_FILES["csvupload"]["name"];
$uploads_dir = 'csvs'; //csv files...
$tmp_name = $_FILES["csvupload"]["tmp_name"];
$name = time();
move_uploaded_file($tmp_name, "$uploads_dir/$name");
$csvpath = "$uploads_dir/$name";
$row = 0;
$emptysql = "TRUNCATE TABLE `temp`";
$connector->query($emptysql);
if (($handle = fopen($csvpath, "r")) !== FALSE) {
$str_ins = "";
while (($data = fgetcsv($handle, 1000, "|")) !== FALSE) {
/*
* Here I am getting the column values to be store in the
* the table, using INSERT command
*/
unset($data);
}
fclose($handle);
}
/*Here I am selecting above stored data using SELECT statement */
for($j=0;$j<count($allrecords);$j++)
{
echo "In the loop";
/*If I use echo statement for debugging it is working fine*/
//set_time_limit(300);
/* I have tried this also but it is not working*/
if(!empty($allrecords[$j]['catid']))
{
// Here is my business logic which mailny deals with
// conditional DB operation
}
echo "Iteration done.";
/*If I use echo statement for debugging it is working fine*/
}
}
The problem is that when I execute the above script on the server it gives a connection timeout error, but when I test the same script on my localhost it works fine.
Also, as mentioned in the code, if I use echo statements for debugging it works fine, and when I remove them it starts giving the connection timeout problem again.
I have tried set_time_limit(300) and set_time_limit(0), but neither of them seems to work.
Any idea how I can resolve the above problem?
-- Many thanks for your time.
Edit:
I have checked that the files are uploading to the server.
Change set_time_limit to:
ini_set("max_execution_time", 300);
set_time_limit is only valid when max_execution_time is not set in php.ini.
I resolved the issue by using flush to send intermediate output to the browser while the query is executing in the background.
This is how I modified the code:
for($j=0;$j<count($allrecords);$j++)
{
/*At the end of each iteration, I have added the following code*/
echo " ";
flush();
}
Thanks to the contributors over this link PHP: Possible to trickle-output to browser while waiting for database query to execute?, from where I got inspiration.

What is the best way to count page views in PHP/MySQL?

And by best I mean most efficient, right now placing this on my post.php file is the only thing I can think of:
$query = mysql_query(" UPDATE posts SET views + 1 WHERE id = '$id' ");
Is there a better way, a method that would consume less server resources? I ask because if this were a small app I would have no problem with the above, but I am trying to build something that will be used by a lot of people, and I want to be as query-conscious as possible.
If you're interested in conserving resources and still using SQL for reporting, and precise # doesn't matter, you could try sampling like this (modify sample rate to suit your scale):
$sample_rate = 100;
if(mt_rand(1,$sample_rate) == 1) {
$query = mysql_query(" UPDATE posts SET views = views + {$sample_rate} WHERE id = '{$id}' ");
// execute query, etc
}
If memcache is an option in your server environment, here's another cool way to sample, but also keep up with the precise number (unlike my other answer):
function recordPostPageView($page_id) {
$memcache = new Memcached(); // you could also pull this instance from somewhere else, if you want a bit more efficiency*
$memcache->addServer('localhost', 11211); // the instance needs at least one server added before it can be used
$key = "Counter for Post {$page_id}";
if(!$memcache->get($key)) {
$memcache->set($key, 0);
}
$new_count = $memcache->increment($key);
// you could uncomment the following if you still want to notify mysql of the value occasionally
/*
$notify_mysql_interval = 100;
if($new_count % $notify_mysql_interval == 0) {
$query = mysql_query("UPDATE posts SET views = {$new_count} WHERE id = '{$page_id}' ");
// execute query, etc
}
*/
return $new_count;
}
And don't mind purists crying foul about Singletons. Or you could pass it into this function, if you're more purist than pragmatist :)
IMHO the best solution is to keep views_count in memory (memcached, whatever)
and do the updates in memory. (Of course the updates have to be synchronized.)
Then you can use a cron script which will push those values to the DB after some time - seconds, minutes, whatever.
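For the cron side of that idea, a rough sketch, assuming the counters were stored under keys like post_views_<id> and that you have some way to enumerate post ids (getAllPostIds() below is just a placeholder):
$memcache = new Memcached();
$memcache->addServer('localhost', 11211);
$pdo = new PDO('mysql:host=localhost;dbname=dbname', 'username', 'password');
$stmt = $pdo->prepare('UPDATE posts SET views = views + :delta WHERE id = :id');
foreach (getAllPostIds() as $id) { // placeholder: fetch the ids however suits your schema
    $key = "post_views_$id";
    $delta = (int) $memcache->get($key);
    if ($delta > 0) {
        $stmt->execute(['delta' => $delta, 'id' => $id]);
        // subtract what was flushed, so views counted during the flush are kept
        $memcache->decrement($key, $delta);
    }
}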
In the database there is only one column, ip, with a primary key defined; then store the IP in the database using the PHP code below:
Connection file :
<?php
$conn = mysqli_connect("localhost","root","");
if (!$conn) {
die("Connection failed: " . mysqli_connect_error());
}
$db=mysqli_select_db($conn,"DB_NAME");
if(!$db)
{
echo "Connection failed";
}
?>
PHP file:
<?php
$ip=$_SERVER['REMOTE_ADDR'];
$insert="INSERT INTO `id928751_photography`.`ip` (`ip`)VALUES ('$ip');";
$result = mysqli_query($conn,$insert);
?>
show count :
<?php
$select="SELECT COUNT(ip) as count from ip;";
$run= mysqli_query($conn,$select);
$res=mysqli_fetch_array($run);
echo $res['count'];
?>
Using this method, the database stores the IP of every visitor as seen by the server.
NOTE: only the IP address the server sees can be stored and counted, not the device's local IP.
You could keep a counter array in cache (like APC or Memcache) and increase the counter for certain posts in that, then store the updates once in a while. You might lose some views if a cache reset occurs.
Another solution would be to keep a separate table for visits only (fields: postid, visits). That is the fastest you can get from MySQL. Try to use the InnoDB engine, since it provides row-level locking!
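A rough sketch of that separate-table idea (the post_visits table name and the PDO connection are just examples); with a primary key on postid, a single INSERT ... ON DUPLICATE KEY UPDATE per view is enough:
// Assumed table: CREATE TABLE post_visits (postid INT PRIMARY KEY, visits INT NOT NULL) ENGINE=InnoDB;
$pdo = new PDO('mysql:host=localhost;dbname=dbname', 'username', 'password');
$stmt = $pdo->prepare("INSERT INTO post_visits (postid, visits) VALUES (:id, 1)
                       ON DUPLICATE KEY UPDATE visits = visits + 1");
$stmt->execute(['id' => $id]);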
You can also check these lines of code. I think they will be helpful, because you can achieve your goal with just a text file; it does not require any database activity.
<?php
session_start();
$counter_name = "counter.txt";
// Check if a text file exists. If not create one and initialize it to zero.
if (!file_exists($counter_name)) {
$f = fopen($counter_name, "w");
fwrite($f,"0");
fclose($f);
}
// Read the current value of our counter file
$f = fopen($counter_name,"r");
$counterVal = fread($f, filesize($counter_name));
fclose($f);
// Has visitor been counted in this session?
// If not, increase counter value by one
if(!isset($_SESSION['hasVisited'])){
$_SESSION['hasVisited']="yes";
$counterVal++;
$f = fopen($counter_name, "w");
fwrite($f, $counterVal);
fclose($f);
}
echo "You are visitor number $counterVal to this site";
This way shows how many actual people viewed your website, not just how many times they viewed it.
Step1: Connecting to MySQL
dbconfig.php
try
{
// Returns DB instance or create initial connection
$pdo = new PDO("mysql:host={$DB_host};port={$DB_port};dbname={$DB_name};charset=utf8mb4",$DB_user,$DB_pass);
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);
}
catch(PDOException $e)
{
echo $e->getMessage();
}
Step2: Creating MySQL table
--
-- Table structure for table `unique_visitors`
--
CREATE TABLE `unique_visitors` (
`date` date NOT NULL,
`ip` text COLLATE utf8_unicode_ci NOT NULL,
`views` int(1) NOT NULL
) ENGINE=InnoDB DEFAULT CHARSET=utf8 COLLATE=utf8_unicode_ci;
Step3: Create a visitor counter by using IP address.
<?php
require_once("dbconfig.php");
// Returns current date in YYYY-MM-DD format
$date = date("Y-m-d");
// Stores remote user ip address
$userIP = $_SERVER['REMOTE_ADDR'];
// Query for selecting record of current date from the table
$stmt = $pdo->prepare("SELECT * FROM unique_visitors WHERE date=:date");
$stmt->execute(['date' => $date]);
$row = $stmt->fetch(PDO::FETCH_ASSOC);
if($row === false){
// Block will execute when there is no record of current date in the database table
$data = [
'date' => $date,
'ip' => $userIP,
'views' => 1,
];
// SQL query for inserting a new record with the current date, user IP address and an initial view count
$sql = "INSERT INTO unique_visitors (date, ip, views) VALUES (:date, :ip, :views)";
$pdo->prepare($sql)->execute($data);
}else{
// Will execute when the current IP is not yet stored for today (preg_quote avoids treating the IP as a regex)
if(!preg_match('/'.preg_quote($userIP, '/').'/', $row['ip'])){
// Combines previous and current user IP address with a separator for updating in the database
$newIP = "$row[ip] $userIP";
$data = [
'ip' => $newIP,
'date' => $date,
];
$sql = "UPDATE unique_visitors SET ip=:ip, views=views+1 WHERE date=:date";
$pdo->prepare($sql)->execute($data);
}
}
?>