PHP MongoDB: Write a CSV File

I have searched the internet and could not find any specific details about this.
The environment is Windows 8, WAMP, and MongoDB.
I am designing a web page which has 4 fields: Name, Contact, Device, Email. After a user hits the submit button, the data is inserted into MongoDB. All of this works fine.
The problem starts when I try to write the inserted data to a CSV file, as this is a requirement. The mongoexport command works fine from cmd, but calling the same command through PHP's exec function is proving futile.
I have also tried storing the command in a .bat file and then calling the .bat file with PHP's exec function, still with no effect:
<?php
echo '<pre>';
// Run the .bat file that contains the mongoexport command.
// Note: in a double-quoted string the backslashes would need escaping,
// so a single-quoted string is used here to pass the path through unchanged.
exec('c:\WINDOWS\system32\cmd.exe /c START C:\wamp\bin\mongodb\mongodb-win32-x86_64-2008plus-2.4.3\conf\export.bat');
?>
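Capturing exec()'s output array and exit code usually makes the failure visible; a minimal diagnostic variant of the call above (START is dropped here because it launches a separate window whose output exec() cannot capture):
<?php
// Run the .bat directly and capture stdout/stderr plus the exit code for debugging.
exec('c:\WINDOWS\system32\cmd.exe /c C:\wamp\bin\mongodb\mongodb-win32-x86_64-2008plus-2.4.3\conf\export.bat 2>&1',
    $output, $returnVar);
echo 'exit code: ' . $returnVar . "\n";
echo implode("\n", $output);
?>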
I have enabled the "Allow service to interact with desktop" checkbox for my WAMP service.
I don't need specific help with the coding; all I need is some direction on how to proceed, as I know I am missing something. Again, I did not find anything specific on the internet, hence this question.
Kindly let me know how to achieve this.
Thanks to everyone.

The following may work:
header('Content-Type: application/csv');
header('Content-Disposition: attachment; filename=example.csv');
header('Pragma: no-cache');

$database = "DATABASE";
$colName  = "COLLECTION";

$connection = new MongoClient();
$collection = $connection->$database->$colName; // database first, then collection

$cursor = $collection->find();
foreach ($cursor as $cur) {
    echo '"' . $cur['field_1'] . '","' . $cur['field_2'] . "\"\n";
}

This code will dump your selected collection to a JSON file:
$mongoExport = 'c:\mongodb\mongoexport'; // full path to the mongoexport binary
$database    = 'test';
$collection  = 'foo';
$file        = "c:\\temp\\foo.json";

exec(sprintf("%s -d %s -c %s -o %s",
    $mongoExport,
    $database,
    $collection,
    $file
));
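Since the requirement in the question is a CSV file rather than JSON, the same binary can be asked for CSV output: the 2.4-era mongoexport used in the question supports --csv, which also requires an explicit field list via -f. A hedged variant of the snippet above (the field names are assumptions taken from the question's form):
<?php
// Sketch: mongoexport CSV output (legacy --csv / -f flags of the 2.4-era tool).
$mongoExport = 'c:\mongodb\mongoexport';      // full path to the mongoexport binary
$database    = 'test';
$collection  = 'foo';
$fields      = 'name,contact,device,email';   // assumed field names from the question's form
$file        = 'c:\\temp\\foo.csv';

exec(sprintf('%s -d %s -c %s --csv -f %s -o %s 2>&1',
    $mongoExport, $database, $collection, $fields, $file), $output, $exitCode);

if ($exitCode !== 0) {
    // Surface the error instead of failing silently.
    echo "mongoexport failed:\n" . implode("\n", $output);
}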
And this is using pure PHP; with large collections, the mongoexport option will certainly be faster:
$database   = 'test';
$collection = 'foo';

$m    = new MongoClient();
$col  = $m->selectDB($database)->$collection;
$json = json_encode(iterator_to_array($col->find()));

set_time_limit(0);
ob_start();
error_reporting(E_ALL);
ini_set("display_errors", 1);

$conn = new MongoClient("mongodb://Hostname:27017", array("replicaSet" => "rs0"));
if (!$conn) {
    die("Unable to connect with mongodb");
}

$db   = $conn-><DB NAME>;   // DB name
$col1 = $db-><colname>;
$col2 = $db-><colname>;     // collection name which you want

$filterCountry = array("status" => "1"); // "where" query
$records = $col1->find($filterCountry);

$fp = fopen('exampleTest11.csv', 'w'); // open the CSV file you want to write the data to

$headings = array("Code", "Status", "EMP CODE");
fputcsv($fp, $headings); // put the headings

$cnt = 0;
foreach ($records as $val) {
    $csvarr = array();
    $csvarr['code']     = $val['code'];    // fetch data from the database
    $csvarr['Status']   = $val['status'];
    $csvarr['emp_code'] = $val['emp_code'];
    fputcsv($fp, $csvarr);
    $cnt++;
}
fclose($fp);
echo "Completed Successfully.." . $cnt;

Related

Trying to write Php fwrite XML file from mysql query

I am trying to write some code that grabs a CSV file, pulls the relevant postcode column, looks up each postcode in a MySQL database that has longitude and latitude fields, and then saves them to an XML file so they can be used in a different program.
I think this piece of code is mostly working, but for some reason it only outputs the last row of the query:
//Open the file.
$fileHandle = fopen("test.csv", "r");
$postcode = array();
while (($row = fgetcsv($fileHandle, 0, ",")) !== FALSE) {
    array_push($postcode, $row[40]);
}
$postcodefil = array_unique($postcode);
$postcodefil = str_replace(' ', '', $postcodefil);
$postcodefil = preg_replace('/\s+/', '', $postcodefil);
//print_r($postcodefil);

foreach ($postcodefil as $value) {
    // Create connection
    $conn = new mysqli($servername, $username, $password, $dbname);
    // Check connection
    if ($conn->connect_error) {
        die("Connection failed: " . $conn->connect_error);
    }
    $sql = "SELECT postcode, latitude, longitude FROM postcode WHERE postcode='$value' ";
    $result = $conn->query($sql);
    if ($result->num_rows > 0) {
        // output data of each row
        while ($row = $result->fetch_assoc()) {
            $myfile = fopen("test.xml", "w") or die("Unable to open file!");
            $lat = $row["latitude"];
            $lng = $row["longitude"];
            fwrite($myfile, $lat . "testss" . $lng . "\n");
            echo $lat;
            echo $lng;
            echo "<br />";
        }
    }
} // end of foreach
$conn->close();
However, when I run it, it echoes correctly:
50.822398-0.139938
51.444908-1.295341
50.841951-0.842508
51.308504-0.551835
etc.... etc...
but the fwrite just outputs the last line:
51.120916testss-0.599545
I'm totally confused by this. Please forgive me if it's something basic that I've overlooked, and thanks in advance.
The problem is that you open the file on each iteration of the loop, which overwrites the previous data...
$myfile = fopen("test.xml", "w") or die("Unable to open file!");
while($row = $result->fetch_assoc()) {
So move the open outside the loop.
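A minimal sketch of the fix, keeping the question's variable names (note that if the outer foreach over postcodes stays, the fopen has to sit above that loop as well, or append mode should be used, as the next answer explains):
// Open the file once, before the loop, so every row is written to the same handle.
$myfile = fopen("test.xml", "w") or die("Unable to open file!");
while ($row = $result->fetch_assoc()) {
    fwrite($myfile, $row["latitude"] . "testss" . $row["longitude"] . "\n");
}
fclose($myfile);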
The second issue is that you aren't writing XML at all. You need to do something like...
$xml = simplexml_load_string("<coords />");
while ($row = $result->fetch_assoc()) {
    $newCoord = $xml->addChild("coord");
    $newCoord->addChild("latitude", $row["latitude"]);
    $newCoord->addChild("longitude", $row["longitude"]);
}
$xml->saveXML("test.xml");
This will generate a simple XML file; you will need to set the element names as appropriate.
First, put the connection outside of the foreach loop and the fopen outside the while loop.
You open the XML file in 'w' mode, which according to the docs means:
Open for writing only; place the file pointer at the beginning of the
file and truncate the file to zero length. If the file does not exist,
attempt to create it.
You need append mode 'a':
Open for writing only; place the file pointer at the end of the file.
If the file does not exist, attempt to create it. In this mode,
fseek() has no effect, writes are always appended.
This will work for you. But you are still making a database request per postcode; I would suggest collecting all the postcodes you need to query and making one database request with the SQL IN operator, as sketched below.
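A rough sketch of that batching idea, using the question's $postcodefil array and mysqli connection variables; a prepared statement with one placeholder per postcode avoids dropping the values straight into the SQL:
// One database request for all postcodes instead of one request per postcode.
$conn = new mysqli($servername, $username, $password, $dbname);

$postcodefil  = array_values($postcodefil); // reindex so the values can be unpacked below
$placeholders = implode(',', array_fill(0, count($postcodefil), '?'));

$stmt = $conn->prepare("SELECT postcode, latitude, longitude FROM postcode WHERE postcode IN ($placeholders)");
$stmt->bind_param(str_repeat('s', count($postcodefil)), ...$postcodefil);
$stmt->execute();
$result = $stmt->get_result(); // requires mysqlnd; otherwise fall back to bind_result()

while ($row = $result->fetch_assoc()) {
    // build the XML element for $row["latitude"] / $row["longitude"] here
}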

creating multiple csv files from php loop

I'm trying to create a loop that, when executed, creates multiple CSV files and downloads them. This is my code:
session_start();
require '../connect.php'; // connect.php has connection info for my database
                          // and uses the variable $connect

$sqldept = "SELECT department_name from department;";
$departments = mysqli_query($connect, $sqldept);

while ($department = mysqli_fetch_array($departments)) {
    $department = $department[0];

    // output headers so that the file is downloaded rather than displayed
    header('Content-Type: text/csv; charset=utf-8');
    header("Content-Transfer-Encoding: UTF-8");
    header('Content-Disposition: attachment; filename=summary-' . $department . '.csv');
    header("Cache-Control: no-cache, no-store, must-revalidate"); // HTTP 1.1
    header("Pragma: no-cache"); // HTTP 1.0
    header("Expires: 0"); // Proxies

    $date = date("Y-m-d", strtotime("-28 days" . date("Y-m-d")));
    $edate = date("Y-m-d");
    $startdate = "(time.dateadded BETWEEN '$date' AND '$edate') AND";
    $department = " and department_name = '$department'";

    // create a file pointer connected to the output stream
    $output = fopen('php://output', 'w');

    $sql2 = "SELECT time.id as timeid, time.staff_id, SUM(time.timein), COUNT(NULLIF(time.reasonforabsence,'')) AS count_reasonforabsence, GROUP_CONCAT(CONCAT(NULLIF(time.reasonforabsence,''),' ', date_format(time.dateadded, '%d-%m-%Y'),' ')) AS reasonforabsence, time.dateadded, staff.id AS staffid, department.id AS departmentid, department.department_name, staff.staff_name, staff.department_id, SUM(staff.workhoursperday), staff.payrollnum FROM time, staff, department WHERE $startdate staff.id = time.staff_id AND staff.department_id = department.id $department $staffsearch GROUP BY staff.id ORDER BY `time`.`dateadded` ASC;";

    // output the column headings
    fputcsv($output, array(
        'Payroll Number',
        'Name',
        'Department',
        'Hours Worked',
        'Days Absent',
        'Overtime',
        'Reasons for Absence'
    ));

    $rows = mysqli_query($connect, $sql2);
    while ($rowcsv = mysqli_fetch_assoc($rows)) {
        $reasonforabsence = $rowcsv['reasonforabsence'];
        //$reasonforabsence = explode( ',', $rowcsv['reasonforabsence'] );
        $overtime = 0;
        if (empty($rowcsv['SUM(time.timein)']) == true) {
            $rowcsv['SUM(time.timein)'] = 0;
        }
        if ($rowcsv['SUM(time.timein)'] > $rowcsv['SUM(staff.workhoursperday)']) {
            $overtime = $rowcsv['SUM(time.timein)'] - $rowcsv['SUM(staff.workhoursperday)'];
        }
        fputcsv($output, array(
            $rowcsv['payrollnum'],
            $rowcsv['staff_name'],
            $rowcsv['department_name'],
            $rowcsv['SUM(time.timein)'],
            $rowcsv['count_reasonforabsence'],
            $overtime,
            $reasonforabsence
        ));
    }
    readfile("php://output");
    fclose($output);
}
Currently the loop creates one CSV with a new header and the department details below it, like this:
I want the loop to create a new CSV for each department, but it's just not working for me. Any help is appreciated.
Thanks
Unfortunately you can't: one PHP request results in one file download, and there isn't really a way around this. You can, however, try to download them all as a ZIP file; take a look at this question, for example. A rough sketch of the ZIP approach follows.
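A minimal sketch of that ZIP idea, reusing the question's per-department loop; each department's CSV is built in memory and added to a single ZipArchive that is sent as the one allowed response (the shortened heading list and file names are placeholders):
<?php
// Sketch: bundle one CSV per department into a single ZIP download.
// $connect and the per-department query are assumed from the question.
$zipPath = tempnam(sys_get_temp_dir(), 'csv');
$zip = new ZipArchive();
$zip->open($zipPath, ZipArchive::OVERWRITE);

$departments = mysqli_query($connect, "SELECT department_name FROM department");
while ($department = mysqli_fetch_array($departments)) {
    $name = $department[0];

    // Build this department's CSV in memory.
    $csv = fopen('php://temp', 'w+');
    fputcsv($csv, array('Payroll Number', 'Name', 'Department')); // headings (shortened here)
    // ... run the per-department query and fputcsv() each row, as in the question ...
    rewind($csv);
    $zip->addFromString("summary-$name.csv", stream_get_contents($csv));
    fclose($csv);
}
$zip->close();

// Send the ZIP as the single HTTP response.
header('Content-Type: application/zip');
header('Content-Disposition: attachment; filename=summaries.zip');
header('Content-Length: ' . filesize($zipPath));
readfile($zipPath);
unlink($zipPath);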
The below are some workaround ideas, which might be useful in certain scenarios (and might be dangerous in others). Use at your own risk!
Workaround A: Loop by redirect
1. Output a single file normally.
2. Do a redirect to the same URL that's creating the CSV file in step #1, but append a GET flag to it, like http://www.example.net/output_csv?i=1
3. Make sure to add a loop-breaker in step #1, like if($i==10) { exit; }
Workaround B: Loop by cronjob
1. Output a single file normally.
2. Have the 2nd file's output handled by a separate cronjob call.
3. Make sure to add a loop-breaker in step #1, like if($mycron==10) { exit; }
You cannot do this with a for loop. However, you can make a PHP file which serves this purpose:
<a onclick="getcsv()" href="php_file_location.php?table_name=test"> Download </a>
<script>
function getcsv() {
    window.open('php_file_location.php?table_name=test');
}
</script>
I had the same problem as mentioned, but in my case I was not trying to download multiple CSVs; I was uploading them to an sFTP server. While creating the file, instead of using
$output = fopen('php://output', 'w');
I used
$output = fopen($path_and_name, 'w');
where $path_and_name = $path_to_sftp_folder.'/'.$file_name;
After execution, the correct files were uploaded to their respective folders, exactly the way I wanted. But yes, the wrong file was also downloaded, with the same issue as shown above.
So if you are looking to write the files on the server, it can be done (even if they all have the same name). A sketch applied to the original question follows.
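Applied to the original question, that server-side variant looks roughly like this (the exports directory is an assumption; the per-department query stays as in the question):
// Sketch: write one CSV per department to a folder on the server instead of php://output.
$targetDir   = __DIR__ . '/exports'; // assumed target folder
$departments = mysqli_query($connect, "SELECT department_name FROM department");
while ($department = mysqli_fetch_array($departments)) {
    $name   = $department[0];
    $output = fopen("$targetDir/summary-$name.csv", 'w'); // a distinct file per department
    fputcsv($output, array('Payroll Number', 'Name', 'Department')); // headings (shortened here)
    // ... fputcsv() each row of the per-department query, as in the question ...
    fclose($output);
}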

Load image from MongoDB fails

I am trying to view an image in the browser that I fetch from MongoDB. The image is saved correctly and I can download it perfectly using genghis.php, but whenever I try to load it with my own code, whether by using getBytes() or getResource(), the result is only raw byte data such as this:
HDR¿£Ðß$iUßoÛT>‰oR¤? XG‡ŠÅ¯US[¹­ÆI“¥íJ¥éØ*$ä:7‰©Û鶪O{7ü#ÙH§kk?ì<Ê»øÎí¾kktüqóÝ
Here is the code that I use to retrieve the image:
<?php
// Config
$dbhost = 'localhost';
$dbname = 'dbzuhra';
$colname = 'testData';
// Connect to test database
$m = new Mongo("mongodb://$dbhost");
$db = $m->$dbname;
$getGrid = $db->getGridFS();
$image = $getGrid->findOne(array('filename'=>'final_design.png'));
header('Content-type: image/png;');
$stream = $image->getResource();
while (!feof($stream)) {
    echo fread($stream, 8192);
}
?>
Is there any explanation as to why this happens?
Incorrect header:
header('Content-type: image/png;');
                               ^--- don't put a semicolon here
There is no image type image/png;. It's just image/png, and HTTP headers are delimited by line breaks, not semicolons.
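So the corrected version of the question's output code is roughly the following; getBytes() is the simpler of the two GridFS read methods the question mentions:
header('Content-type: image/png'); // no trailing semicolon inside the header value
echo $image->getBytes();           // or keep the getResource()/fread() loop from the question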

storing live data in Mongodb

I am pushing data to a server using an API from a website. Every time my Arduino detects a pulse, it sends it to COSM, and via a trigger the data goes to my server. I was initially writing the data into a .txt file and a JSON object, but now I want to start using Mongo to have different collections.
For some reason, the data stopped being stored after I added the Mongo connection and tried to write it to a Mongo collection. I want to be able to write the information to Mongo directly and avoid creating files.
Any suggestion is more than welcome; here is the code:
<?php
// Creating a data base in Mongodb to store my information from my $jsonFile
$connection = new MongoClient(); //Connect to mongo
$db = $connection -> qandm; // select my DB which is called qandM
$collection = $db -> pulseData;
//Gets all the information from Cosm
$pulse = $_POST['body'];
//access and open a file
//echo $pulse;
//Converts the data into PHP arrays
$pulseObj = json_decode($pulse, true);
//Parse through the specific information from the array and gets each piece of information in an array
$userName = $pulseObj["triggering_datastream"]["id"];
$dataTime= $pulseObj["triggering_datastream"]["at"];
$dataValue= $pulseObj["triggering_datastream"]["value"]["value"];
//Writes all the data coming from COSM
$file = fopen("data.txt", "a+");//a+ give the option to have the cursor at the end to access the file read and write it
/* $pulse.="\r\n"; */
fwrite($file, $pulse);//takes incoming data and writes it in the file
fclose($file);
//Opens a new .txt file and writes the values that we selected before into our file
$string = $userName." ".$dataTime." ".$dataValue." \r\n";
//error_log allows me to see in my Apache log server the information that I'm printing
error_log($string);
//Write all the information I parsed in my three variables in a new file
$file2 = fopen("rawData.txt", "a+");
fwrite($file2,$string);
fclose($file2);
//json sample
//Inputs the data from the time and the pulse value into a json object
$json = array("User" => $userName, "timestamp"=> $dataTime, "value"=> $dataValue);
//Opens a new json object
$jsonFile = fopen("data.json", "a+");
//Writes the data of our new arrayed information into the open json object
fwrite($jsonFile, json_encode($json));
fclose($jsonFile);
//A loop to populate
foreach ($json as $data) {
    $collection->insert($data);
}
//find the data I just stored
$cursor = $collection->find();
//Output it in a UL
echo "<p> My Pulse </p>";
echo '<ul>';
foreach ($cursor as $doc) {
    echo '<li> My pulse is: ' . $doc['value'] . '</li>';
}
echo '</ul>';
/*$data = "data.txt";
$fh = fopen($data, "w") or die ("can't open file");
$data = json_encode($_POST);
fwrite($fh, $data);
fclose($fh);*/
//print_r($file);
?>
This is likely the source of your problems:
//A loop to populate
foreach ($json as $data) {
    $collection->insert($data);
}
You are iterating over your $json array and passing the values (but not the keys) to the insert method of MongoCollection. According to the docs, this method expects an object or an array. Based on what I understand your code to be trying to do, this foreach loop should be replaced with the following:
$collection->insert($json);
This will create a Mongo document resembling your array. Your code is currently attempting to insert the value of each array element as an individual entry.

records come one by one very slowly and incomplete

I have an issue with my code.
This script writes the variables to a CSV file.
I'm getting the parameters via HTTP GET; the problem is that each record comes in one by one, very slowly.
It should be able to take a batch of thousands of records.
I also noticed the file is incomplete: it's missing about half the records when compared to the full report downloaded from my vendor.
Here is the script:
<?php
error_reporting(E_ALL ^ E_NOTICE);
// setting the default timezone to use.
date_default_timezone_set('America/New_York');
// setting up the CSV file
$fileDate = date("m_d_Y");
$filename = "./csv_archive/" . $fileDate . "_SmsReport.csv";
//Creating handle
$handle = fopen($filename, "a");
//$handle = fopen($directory.$filename, 'a')
// These are the main data fields
$item1 = $_REQUEST['item1'];
$item2 = $_REQUEST['item2'];
$item3 = $_REQUEST['item3'];
$mydate = date("Y-m-d H:i:s");
$csvRow = $item2 . "," . $item1 . "," . $item3 . "," . $mydate . "\n";
//writing to csv file
// just making sure we can write to it
if (!$handle = fopen($filename, 'a')) {
echo "Cannot open file ($filename)";
exit;
}
//writing the data
if (fwrite($handle, $csvRow) === FALSE) {
echo "Cannot write to file ($filename)";
exit;
}
fclose($handle);
?>
I rewrote it twice but the issue still persists. This goes beyond my knowledge, so I am hoping someone can suggest a better approach.
My boss is blaming PHP; help me prove him wrong!
I think there's a better way of doing this. Try putting all your data into an array first, with each row in the CSV file being an array in itself, and then outputting it. Here's an example of some code I wrote a while back:
class CSV_Output {
    public $data = array();
    public $deliminator;

    function __construct($data, $deliminator = ",") {
        if (!is_array($data)) {
            throw new Exception('CSV_Output only accepts data as arrays');
        }
        $this->data = $data;
        $this->deliminator = $deliminator;
    }

    public function output() {
        foreach ($this->data as $row) {
            $quoted_data = array_map(array('CSV_Output', 'add_quotes'), $row);
            echo sprintf("%s\n", implode($this->deliminator, $quoted_data));
        }
    }

    public function headers($name) {
        header('Content-Type: application/csv');
        header("Content-disposition: attachment; filename={$name}.csv");
    }

    private function add_quotes($data) {
        $data = preg_replace('/"(.+)"/', '""$1""', $data);
        return sprintf('"%s"', $data);
    }
}

// CONSTRUCT OUTPUT ARRAY
$CSV_Data = array(array(
    "Item 1",
    "Item 2",
    "Item 3",
    "Date"
));

// Needs to loop through all your data..
for ($i = 1; $i < (ARGUMENT_TO_STOP_LOOP); $i++) {
    $CSV_Data[] = array($_REQUEST['item1'], $_REQUEST['item2'], $_REQUEST['item3'], $_REQUEST['itemdate']);
}

$b = new CSV_Output($CSV_Data);
$b->headers("NAME_YOUR_FILE_HERE"); // headers must be sent before any output
$b->output();
As requests come in to your server from BulkSMS, each request is trying to open and write to the same file.
These requests are not queued, and do not wait for the previous one to finish before starting another, meaning many will fail as the server finds the file is already in use by the previous request.
For this application, you'd be much better off storing the data from each request in a database such as SQLite and writing a separate script to generate the CSV file on demand.
I'm not particularly familiar with SQLite, but I understand it's fairly easy to implement and seems to be well documented.
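A minimal sketch of that split, using PDO's SQLite driver; the table and column names are assumptions chosen to match the question's fields:
<?php
// receive.php -- store each incoming request in SQLite instead of appending to a CSV.
$db = new PDO('sqlite:' . __DIR__ . '/sms_reports.sqlite');
$db->exec("CREATE TABLE IF NOT EXISTS report (item1 TEXT, item2 TEXT, item3 TEXT, received_at TEXT)");

$stmt = $db->prepare("INSERT INTO report (item1, item2, item3, received_at) VALUES (?, ?, ?, ?)");
$stmt->execute(array($_REQUEST['item1'], $_REQUEST['item2'], $_REQUEST['item3'], date('Y-m-d H:i:s')));
And a separate script generates the CSV on demand:
<?php
// export.php -- build the CSV from the stored rows only when it is actually needed.
$db = new PDO('sqlite:' . __DIR__ . '/sms_reports.sqlite');
header('Content-Type: text/csv');
$out = fopen('php://output', 'w');
foreach ($db->query("SELECT item2, item1, item3, received_at FROM report") as $row) {
    fputcsv($out, array($row['item2'], $row['item1'], $row['item3'], $row['received_at']));
}
fclose($out);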
Because multiple requests will arrive at the same time, concurrent requests will try to access the same output file and block each other.
As I pointed out in my comment, you should be using a decent database. PostgreSQL or MySQL are open-source databases and have good support for PHP.
In my experience, PostgreSQL is a more solid database and performs better with many simultaneous users (especially when 'writing' to the database), although it is harder to learn (it's more 'strict').
MySQL is easier to learn and may be sufficient, depending on the total number of request/traffic.
PostgreSQL:
http://www.postgresql.org
MySQL:
http://www.mysql.com
Do not use SQLite as the database for this, because SQLite is a file-based database designed for single-user use, not for client/server purposes. Trying to use it for multiple requests at the same time will give you the same kind of problems you're currently having:
http://www.sqlite.org/whentouse.html
How Scalable is SQLite?
