For better or worse, I am storing binary information in a database table and am having a problem retrieving it. Each BLOB has a newline prepended to it upon retrieval (at least, I believe it happens on retrieval, since the binary object in the table is exactly the same size as the source file).
I've searched for a similar problem to mine, and the closest I have found is this question. However, I am using PDO instead of mysql_*, and I have checked for empty lines prior to the opening <?php tag.
Here's the retrieval function stored in a separate file that I'm including in my test:
(in raw.php):
function return_raw_rawid($raw_id) {
    $data = array();
    $aggregate_data = array();
    $sql = "SELECT * FROM `raw` WHERE `raw_id` = :rawid";
    try {
        $db_obj = dbCore::getInstance();
        $query = $db_obj->dbh->prepare($sql);
        $query->bindValue(':rawid', $raw_id);
        if ($query->execute()) {
            while ($results = $query->fetch(PDO::FETCH_ASSOC)) {
                $data['raw_id'] = $results['raw_id'];
                $data['filename'] = $results['filename'];
                $data['mime_type'] = $results['mime_type'];
                $data['file_size'] = $results['file_size'];
                $data['file_data'] = $results['file_data'];
                $data['test_id'] = $results['test_id'];
                $data['user_id'] = $results['user_id'];
                $data['time'] = date('Y-m-d H:i:s', $results['time']);
                $aggregate_data[] = $data;
            } // while
        } // if
        $query->closeCursor();
        return $aggregate_data;
    } catch (PDOException $ex) {
        $errors[] = $ex;
    } // catch
}
Here's the code I'm testing it with in a separate file:
<?php
include 'core/init.php'; // Contains protect_page() and includes for return_raw_rawid
protect_page();
$blob_id = 20;
$blob = return_raw_rawid($blob_id);
$data = ltrim($blob[0]['file_data']);
$name = ltrim($blob[0]['filename']);
$size = ltrim($blob[0]['file_size']);
$type = ltrim($blob[0]['mime_type']);
header("Content-type: $type");
header("Content-length: $size");
header("Content-disposition: attachment; filename=$name");
header("Content-Description: PHP Generated Data");
echo $data;
When I load this page in my browser, it prompts me to download the file identified by $blob_id, with the correct filename and type. However, upon downloading it and opening it in GHex, I see that the first byte is 0x0A. Using cmp original_file downloaded_file, I determined that the only difference is this first byte. Googling led me to the ltrim() function, which I've (perhaps too) liberally applied above.
I can't tell for sure whether this problem is being caused during upload, though as I said before, I don't believe it is, since the file_size value shown in phpMyAdmin exactly matches the source file. I'm not sure if the use of the $aggregate_data array in the retrieval function could be to blame or what.
Any help is greatly appreciated!
Are you sure those four header lines are being properly executed? 0x0A is the newline character. You could have a newline in your core/init.php triggering output, in which case the headers are never sent. With display_errors/error_reporting off, you'd never see the "headers already sent - output started at line X..." warnings.
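A quick way to confirm this (a sketch; headers_sent() is standard PHP, and the rest matches the question's test script) is to ask PHP whether, and where, output has already started, just before sending the headers:

// Hypothetical debug check, placed just before the header() calls:
if (headers_sent($file, $line)) {
    // Something (often a stray newline before or after a ?> tag in an
    // included file) has already started the response body.
    die("Output already started at $file:$line");
}
header("Content-type: $type");

If that fires, trim the whitespace at the file and line it reports, or simply drop the closing ?> tag from core/init.php and the other includes.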
Related
Today I'm battling with SSH, and it would be going well if it weren't for the ssh2_scp_send() method. This is the first time I've integrated SSH with PHP, so you may find multiple errors in the code that I have yet to discover; I am currently stuck at this part.
I am using a dummy array imitating the one I will eventually get from a couple of forms, and the auth is working fine, although I had a small problem there: I had to use 'root' as the user, instead of my actual user, for the public key to be recognised as correct.
Once I solved that, I discovered the code still isn't working properly; this time the console says:
ssh2_scp_send(tmp/sshPm/prueba1.csv): failed to open stream: No such file or directory in /home/josemaria/Desktop/API UBcloud/CSV/provisioning.php on line 32
Here is my code:
function arrayToCsv($array, $filename, $delimiter = ';'){
    header('Content-Type: application/csv');
    header('Content-Disposition: attachment; filename="'.$filename.'";');
    $f = fopen('/tmp/sshPm/'.$filename, 'r');
    foreach ($array as $line) {
        fputcsv($f, $line, $delimiter);
    }
}

function connectSsh($filename){
    $sshConnection = ssh2_connect($host, $port);
    if (!$sshConnection) {
        die('Connection failed');
    }
    ssh2_auth_pubkey_file($sshConnection, $user, $pubKey, $priKey);
    ssh2_scp_send($sshConnection, "/tmp/sshPm/$filename", '/home/scripts/CSV', 0644);
    $stream = ssh2_exec($sshConnection, "/home/scripts/sc_prov_ovpn_firm.sh $filename");
    ssh2_exec($sshConnection, 'exit');
    return $stream;
}

for ($i = 0; $i < 10; $i++) {
    $array[] = array(
        "VAAAAAGH",
        "THE WAGH IS HERE",
        200 + $i,
        564451 + $i,
        "sip",
        "",
        "",
        "",
        "8.8.8.8",
        "8.8.4.4",
        20048,
        "Modelo Terminal",
        "677shdG3"
    );
}
$filename = 'test.csv';
arrayToCsv($array, $filename);
$stream = connectSsh($filename);
print_r($stream);
As you can see, I intend the CSVs to be created and stored in /tmp. Even though the CSV is created and placed in the right directory, whenever I reach ssh2_scp_send(), the method proves incapable of finding it. I don't know if this could be related to the fact that I am using root to verify my public key, as I've seen that it should be the user you are logged in with.
I also get the following warning immediately afterwards, but I guess this is a consequence of the first one... In any case, here it is:
PHP Warning: ssh2_scp_send(): Unable to read source file in /home/josemaria/Desktop/API UBcloud/CSV/provisioning.php on line 32.
I have tried using a wrapper instead of fopen(), but with no success. As I said, this is the first time I've worked with SSH and PHP, so I would ask you to explain things a little, at least!
Thank you so much for the help!
UPDATE
I managed to partially solve the issue by following ArSeN's advice: creating a directory under my desktop and changing all the paths to point there instead of /tmp. Now the problem I face is that I am not sure where to place the files once created. So my next question related to this issue is:
Where should I store all the CSVs generated locally? As you can see, I am doing it in /Documents, since I have no restrictions there to read/modify, but I would say the answer lies in /, maybe /var? I really have no clue about much of this stuff...
Thank you again for the help provided!
This is what my code looks like now:
function arrayToCsv($array, $filename, $delimiter = ','){
    header('Content-Type: application/csv');
    header('Content-Disposition: attachment; filename="'.$filename.'";');
    $f = fopen("/home/josemaria/Documents/sshPm/$filename", 'w');
    foreach ($array as $line) {
        fputcsv($f, $line, $delimiter);
    }
    fclose($f);
}

function connectSsh($filename){
    $sshConnection = ssh2_connect($host, $port);
    if (!$sshConnection) {
        die('Connection failed');
    }
    ssh2_auth_pubkey_file($sshConnection, $user, $pubKey, $priKey);
    ssh2_scp_send($sshConnection, "/home/josemaria/Documents/sshPm/$filename", "/home/scripts/CSV/$filename", 0644);
    $stream = ssh2_exec($sshConnection, "/home/scripts/sc_prov_ovpn_firm.sh $filename");
    ssh2_exec($sshConnection, 'exit');
    return $stream;
}

for ($i = 0; $i < 10; $i++) {
    $array[] = array(
        "VAAAAAGH",
        "THE WAGH IS HERE",
        200 + $i,
        564451 + $i,
        "sip",
        "",
        "",
        "",
        "8.8.8.8",
        "8.8.4.4",
        20048,
        "Modelo Terminal",
        "677shdG3"
    );
}
$filename = 'test.csv';
arrayToCsv($array, $filename);
$stream = connectSsh($filename);
print_r($stream);
You should close the file handle so the file actually gets written and is not left sitting in some I/O buffer.
function arrayToCsv($array, $filename, $delimiter = ';'){
    // all your existing code here ...
    fclose($f);
}
Also, with your copy target it seems like you are passing a directory path where a file path is expected, meaning:
ssh2_scp_send($sshConnection, "/tmp/sshPm/$filename", '/home/scripts/CSV', 0644);
should probably be:
ssh2_scp_send($sshConnection, "/tmp/sshPm/$filename", "/home/scripts/CSV/$filename", 0644);
The answer to my issue had to do with permissions. I had some other errors, as @ArSeN pointed out, but the warning I got, and which prevented the code from working, arose because I was trying to store and read the files in /tmp without the permissions required to do so. So give yourself full permissions on the directory, or switch to another directory. I still have some problems and issues with this code, but I feel they belong to a separate question, which I will link here: How can I generate a CSV on Win and Linux using PHP and ProcessMaker? Which paths are recommended to store the files locally?
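For anyone hitting the same thing, a minimal pre-flight check along these lines would have surfaced the problem immediately (a sketch; the path and mode are just the ones from my setup):

// Hypothetical sanity check before writing the CSV:
$dir = '/tmp/sshPm';
if (!is_dir($dir) && !mkdir($dir, 0775, true)) {
    die("Could not create $dir");
}
if (!is_writable($dir)) {
    die("$dir exists but is not writable by the PHP user");
}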
I'm trying to create a loop that, when executed, creates multiple CSV files and downloads them. This is my code:
session_start();
require '../connect.php'; // connect.php has connection info for my database
                          // and uses the variable $connect

$sqldept = "SELECT department_name from department;";
$departments = mysqli_query($connect, $sqldept);

while ($department = mysqli_fetch_array($departments)) {
    $department = $department[0];

    // output headers so that the file is downloaded rather than displayed
    header('Content-Type: text/csv; charset=utf-8');
    header("Content-Transfer-Encoding: UTF-8");
    header('Content-Disposition: attachment; filename=summary-' . $department . '.csv');
    header("Cache-Control: no-cache, no-store, must-revalidate"); // HTTP 1.1
    header("Pragma: no-cache"); // HTTP 1.0
    header("Expires: 0"); // Proxies

    $date = date("Y-m-d", strtotime("-28 days" . date("Y-m-d")));
    $edate = date("Y-m-d");
    $startdate = "(time.dateadded BETWEEN '$date' AND '$edate') AND";
    $department = " and department_name = '$department'";

    // create a file pointer connected to the output stream
    $output = fopen('php://output', 'w');

    $sql2 = "SELECT time.id as timeid, time.staff_id, SUM(time.timein), COUNT(NULLIF(time.reasonforabsence,'')) AS count_reasonforabsence, GROUP_CONCAT(CONCAT(NULLIF(time.reasonforabsence,''),' ', date_format(time.dateadded, '%d-%m-%Y'),' ')) AS reasonforabsence, time.dateadded, staff.id AS staffid, department.id AS departmentid, department.department_name, staff.staff_name, staff.department_id, SUM(staff.workhoursperday), staff.payrollnum FROM time, staff, department WHERE $startdate staff.id = time.staff_id AND staff.department_id = department.id $department $staffsearch GROUP BY staff.id ORDER BY `time`.`dateadded` ASC;";

    // output the column headings
    fputcsv($output, array(
        'Payroll Number',
        'Name',
        'Department',
        'Hours Worked',
        'Days Absent',
        'Overtime',
        'Reasons for Absence'
    ));

    $rows = mysqli_query($connect, $sql2);
    while ($rowcsv = mysqli_fetch_assoc($rows)) {
        $reasonforabsence = $rowcsv['reasonforabsence'];
        //$reasonforabsence = explode( ',', $rowcsv['reasonforabsence'] );
        $overtime = 0;
        if (empty($rowcsv['SUM(time.timein)']) == true) {
            $rowcsv['SUM(time.timein)'] = 0;
        }
        if ($rowcsv['SUM(time.timein)'] > $rowcsv['SUM(staff.workhoursperday)']) {
            $overtime = $rowcsv['SUM(time.timein)'] - $rowcsv['SUM(staff.workhoursperday)'];
        }
        fputcsv($output, array(
            $rowcsv['payrollnum'],
            $rowcsv['staff_name'],
            $rowcsv['department_name'],
            $rowcsv['SUM(time.timein)'],
            $rowcsv['count_reasonforabsence'],
            $overtime,
            $reasonforabsence
        ));
    }
    readfile("php://output");
    fclose($output);
}
Currently the loop creates one CSV with a new heading row and the department details below it for each department, like this:
I want the loop to create a new CSV for each department, but it's just not working for me. Any help is appreciated.
Thanks
Unfortunately you can't: one PHP request results in one file download, and there isn't really a way around this. You can, however, try to download them all as a single ZIP file. Take a look at this question, for example.
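If you go the ZIP route, a rough sketch (reusing the $connect handle and the per-department query from your question; ZipArchive ships with PHP):

$zip = new ZipArchive();
$zipPath = tempnam(sys_get_temp_dir(), 'csv');
$zip->open($zipPath, ZipArchive::OVERWRITE);

$departments = mysqli_query($connect, "SELECT department_name FROM department");
while ($department = mysqli_fetch_array($departments)) {
    $csv = fopen('php://temp', 'r+'); // build each CSV in memory
    fputcsv($csv, array('Payroll Number', 'Name', 'Department')); // headings as in your code
    // ... run your per-department query here and fputcsv() each row ...
    rewind($csv);
    $zip->addFromString('summary-' . $department[0] . '.csv', stream_get_contents($csv));
    fclose($csv);
}
$zip->close();

header('Content-Type: application/zip');
header('Content-Disposition: attachment; filename=summaries.zip');
readfile($zipPath);
unlink($zipPath);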
Below are some workaround ideas, which might be useful in certain scenarios (and might be dangerous in others). Use at your own risk!
Workaround A: Loop by redirect
Output a single file normally
Do a redirect to the same URL that created the CSV file in step #1, but append a GET flag to it, like http://www.example.net/output_csv?i=1
Make sure to add a loop-breaker in step#1, like if($i==10) { exit; }
Workaround B: Loop by cronjob
Output a single file normally
Make 2nd file output be handled by a separate cronjob call.
Make sure to add a loop-breaker in step#1, like if($mycron==10) { exit; }
You cannot do this with a for loop.
However, you can make a PHP file which serves one download per request:
<a onclick="getcsv()" href="php_file_location.php?table_name=test"> Download </a>
<script>
function getcsv() {
    window.open('php_file_location.php?table_name=test');
}
</script>
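On the PHP side, php_file_location.php would then serve exactly one CSV per request, something like this (a sketch; the table and column names are borrowed from the question, and the request parameter is an assumption):

<?php
require '../connect.php'; // assumed to provide $connect, as in the question

if (!isset($_GET['department'])) {
    exit('missing department');
}
$department = $_GET['department'];

header('Content-Type: text/csv; charset=utf-8');
header('Content-Disposition: attachment; filename=summary-' . basename($department) . '.csv');

$output = fopen('php://output', 'w');
fputcsv($output, array('Payroll Number', 'Name', 'Department'));

$stmt = mysqli_prepare($connect,
    'SELECT staff.payrollnum, staff.staff_name, department.department_name
     FROM staff JOIN department ON staff.department_id = department.id
     WHERE department.department_name = ?');
mysqli_stmt_bind_param($stmt, 's', $department);
mysqli_stmt_execute($stmt);
$result = mysqli_stmt_get_result($stmt);
while ($row = mysqli_fetch_row($result)) {
    fputcsv($output, $row);
}
fclose($output);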
I was in the same situation as mentioned, but in my case I was not trying to download multiple CSVs; I was uploading them to an sFTP server. While creating the file, instead of using
$output = fopen('php://output', 'w');
I used
$output = fopen($path_and_name, 'w');
where $path_and_name = $path_to_sftp_folder.'/'.$file_name;
After the execution, the correct files were uploaded to their respective folders, exactly the way I wanted. But yes, the wrong file was also downloaded, with the same issue as described above.
So if you are looking to upload files to a server, it can be done (even if they all have the same name).
The following is part of my PHP program, written to fetch rows from a MySQL table for a set of input IDs. But I want to send the results directly to a .csv file. I know PHP has a built-in function for that, but I could not work out how to use it effectively. Can anyone give me a direction for exporting to CSV using the appropriate PHP functions?
$file = fopen("fetched.csv", "w");
for ($i = 0; $i <= $len; $i++) {
    $lo = $locus[$i];
    mysqli_select_db($conn, "microarray");
    $query = mysqli_query($conn, "SELECT * FROM anatomy WHERE locus_id = '$lo'");
    while ($row = mysqli_fetch_row($query)) {
    }
}
You don't necessarily need an "advanced PHP function". A CSV file is just a sequence of comma-separated columns. Try this out.
function addRowToCsv(&$csvString, $cols) {
    $csvString .= implode(',', $cols) . PHP_EOL; // append, don't overwrite
}
$csvString = '';
$first = true;

while ($row = mysqli_fetch_assoc($query)) {
    if ($first === true) {
        $first = false;
        addRowToCsv($csvString, array_keys($row));
    }
    addRowToCsv($csvString, $row);
}

header('Content-type: text/csv');
header('Content-disposition: attachment;filename=MyCsvFile.csv');
echo $csvString;
Notice that the first argument to addRowToCsv is passed by reference. This is not required and you could easily use a return value, but this is just how I would do it.
-- Edit --
I just noticed you are saving the output to a file rather than serving it as a download. If that is what you want to do, then use the above, but replace
header('Content-type: text/csv');
header('Content-disposition: attachment;filename=MyCsvFile.csv');
echo $csvString;
With..
file_put_contents('MyCsvFile.csv', $csvString);
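For what it's worth, fputcsv() would also handle the quoting and escaping for you; here is a sketch of the file-writing variant under the same assumptions ($query already executed, as in your code):

$fp = fopen('fetched.csv', 'w');
$first = true;
while ($row = mysqli_fetch_assoc($query)) {
    if ($first) {
        fputcsv($fp, array_keys($row)); // column headings once
        $first = false;
    }
    fputcsv($fp, $row); // quotes/escapes each field as needed
}
fclose($fp);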
I have an issue with my code. This script writes the variables to a CSV file.
I'm receiving the parameters via HTTP GET; the problem is that the records come in one by one, very slowly.
It should be able to take a batch of thousands of records.
I also noticed the file is incomplete: it's missing about half the records when compared to the full report downloaded from my vendor.
Here is the script:
<?php
error_reporting(E_ALL ^ E_NOTICE);

// setting the default timezone to use.
date_default_timezone_set('America/New_York');

// setting up the CSV file
$fileDate = date("m_d_Y");
$filename = "./csv_archive/" . $fileDate . "_SmsReport.csv";

// These are the main data fields
$item1 = $_REQUEST['item1'];
$item2 = $_REQUEST['item2'];
$item3 = $_REQUEST['item3'];
$mydate = date("Y-m-d H:i:s");
$csvRow = $item2 . "," . $item1 . "," . $item3 . "," . $mydate . "\n";

// writing to the CSV file
// just making sure we can write to it
if (!$handle = fopen($filename, 'a')) {
    echo "Cannot open file ($filename)";
    exit;
}
// writing the data
if (fwrite($handle, $csvRow) === FALSE) {
    echo "Cannot write to file ($filename)";
    exit;
}
fclose($handle);
?>
I rewrote it twice but the issue still persists. This goes beyond the scope of my knowledge, so I am hoping someone can suggest a better approach.
My boss is blaming PHP; help me prove him wrong!
I think there's a better way of doing this. Try putting all your data into an array first, with each row in the CSV file being an array in itself, and then outputting it. Here's an example of some code I wrote a while back:
class CSV_Output {
    public $data = array();
    public $delimiter;

    function __construct($data, $delimiter = ",") {
        if (!is_array($data)) {
            throw new Exception('CSV_Output only accepts data as arrays');
        }
        $this->data = $data;
        $this->delimiter = $delimiter;
    }

    public function output() {
        foreach ($this->data as $row) {
            $quoted_data = array_map(array($this, 'add_quotes'), $row);
            echo sprintf("%s\n", implode($this->delimiter, $quoted_data));
        }
    }

    public function headers($name) {
        header('Content-Type: application/csv');
        header("Content-disposition: attachment; filename={$name}.csv");
    }

    private function add_quotes($data) {
        // double any embedded quotes, then wrap the whole field in quotes
        $data = str_replace('"', '""', $data);
        return sprintf('"%s"', $data);
    }
}
// CONSTRUCT OUTPUT ARRAY
$CSV_Data = array(array(
    "Item 1",
    "Item 2",
    "Item 3",
    "Date"
));

// Needs to loop through all your data..
for ($i = 1; $i < (ARGUMENT_TO_STOP_LOOP); $i++) {
    $CSV_Data[] = array($_REQUEST['item1'], $_REQUEST['item2'], $_REQUEST['item3'], $_REQUEST['itemdate']);
}

$b = new CSV_Output($CSV_Data);
$b->headers("NAME_YOUR_FILE_HERE"); // headers must be sent before any output
$b->output();
As requests come in to your server from BulkSMS, each request is trying to open and write to the same file.
These requests are not queued, and do not wait for the previous one to finish before starting another, meaning many will fail as the server finds the file is already in use by the previous request.
For this application, you'd be much better off storing the data from each request in a database such as SQLite and writing a separate script to generate the CSV file on demand.
I'm not particularly familiar with SQLite, but I understand it's fairly easy to implement and seems to be well documented.
Because multiple requests arrive at the same time, concurrent requests will try to access the same output file, each blocking the others' access.
As I pointed out in my comment, you should be using a decent database. PostgreSQL or MySQL are open-source databases and have good support for PHP.
In my experience, PostgreSQL is a more solid database and performs better with many simultaneous users (especially when 'writing' to the database), although it is harder to learn (it's more 'strict').
MySQL is easier to learn and may be sufficient, depending on the total number of request/traffic.
PostgreSQL:
http://www.postgresql.org
MySQL:
http://www.mysql.com
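To make the idea concrete, here is a minimal sketch of the MySQL variant using PDO (the DSN, credentials, and table layout are made up for illustration): store each incoming report row, then generate the CSV later, on demand.

// Hypothetical insert handler for each incoming GET request:
$db = new PDO('mysql:host=localhost;dbname=reports', 'user', 'password');
$db->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

$stmt = $db->prepare('INSERT INTO sms_report (item1, item2, item3, added)
                      VALUES (?, ?, ?, NOW())');
$stmt->execute(array($_REQUEST['item1'], $_REQUEST['item2'], $_REQUEST['item3']));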
Do not use SQLite as a database for this, because SQLite is a file-based database designed as a single-user database, not for client/server purposes. Trying to use it for multiple requests at the same time will give you the same kind of problems you're currently having:
http://www.sqlite.org/whentouse.html
How Scalable is SQLite?
My telecom vendor sends me a report each time a message goes out. I have written a very simple PHP script that receives values via HTTP GET. Using fwrite, I write the query parameters to a CSV file. The filename is report.csv, with the current date as a prefix.
Here is the code :
<?php
error_reporting(E_ALL ^ E_NOTICE);
date_default_timezone_set('America/New_York');

// setting up the CSV file
$fileDate = date("m-d-Y");
$filename = $fileDate . "_Report.csv";
$directory = "./csv_archive/";

// Creating handle
$handle = fopen($filename, "a");

// These are the main data fields
$item1 = $_GET['item1'];
$item2 = $_GET['item2'];
$item3 = $_GET['item3'];
$mydate = date("Y-m-d H:i:s");
$pass = $_GET['pass'];

// testing the pass
if (isset($_GET['pass']) AND $_GET['pass'] == "password") {
    echo 'Login successful';
    // just making sure the function could write to it
    if (!$handle = fopen($directory . $filename, 'a')) {
        echo "Cannot open file ($filename)";
        exit;
    }
    // writing the data I receive through the query string
    if (fwrite($handle, "$item1,$item2,$item3,$mydate \n") === FALSE) {
        echo "Cannot write to file ($filename)";
        exit;
    }
    fclose($handle);
}
else {
    echo 'Login Failure please add the right pass to URL';
}
?>
The script does what I want, but the only problem is inconsistency, meaning that a good portion of the records is missing (about half the report). When I log in to my account, I can get the complete report.
I have no clue what I need to do to fix this; please advise.
I have a couple of suggestions for this script.
To address Andrew Rhyne's suggestion, change your code that reads from each $_GET variable to:
$item1 = (isset($_GET['item1']) && $_GET['item1']) ? $_GET['item1'] : 'empty';
This will tell you if all your fields are being populated.
I suspect your problem is something else. It sounds like you are getting a separate request for each record that you want to save. Perhaps some of these requests are happening too close together and are messing up each other's ability to open and write to the file. To check if this is happening, you might try using the following code to check whether you opened the file correctly. (Note that the first use of 'fopen' in your script does nothing, because you are overwriting $handle with your second use of 'fopen'; it is also opening the wrong file...)
if (!$handle = fopen($directory . $filename, 'a')) {
    $handle = fopen($directory . date("Y-m-d H:i:s:u") . '_Record_Error.txt', 'a');
    exit;
}
This will make sure that you don't ever lose data because of concurrent write attempts. If you find that this is indeed your issue, you can delay subsequent write attempts until the file is no longer busy.
$tries = 0;
while ($tries < 50 && !$handle = fopen($directory . $filename, 'a')) {
    usleep(500000); // wait half a second (sleep() only accepts whole seconds)
    $tries++;
}
if ($handle) {
    flock($handle, LOCK_EX); // lock the file to prevent other requests writing until you are done.
} else {
    $handle = fopen($directory . date("Y-m-d H:i:s:u") . '_Record_Error.txt', 'a'); // the 'u' is for microseconds
    exit;
}
This will spend up to 25 seconds trying to open the file, once every half second, and will still output your record to a unique file each time you are unable to open the main file. You can then safely fwrite() and fclose() $handle as you were.
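Worth noting: flock() with LOCK_EX blocks by default until the lock is granted, so a simpler variant (a sketch, using the same $directory/$filename and $csvRow-style data as above) can skip the retry loop entirely:

$handle = fopen($directory . $filename, 'a');
if ($handle && flock($handle, LOCK_EX)) { // blocks until any other writer releases the lock
    fwrite($handle, "$item1,$item2,$item3,$mydate \n");
    fflush($handle);          // push the row out of PHP's buffer
    flock($handle, LOCK_UN);  // release the lock before closing
    fclose($handle);
}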