I have an XML file that grows over time. When I call a PHP function to add a new child, it loops through an array of strings, queries a database, adds a new child to the document, and saves the file named after the current string in the array. However, it is not appending; it is overwriting everything. Do I need to load the file first and check whether it exists?
function createUnitsXML($units, $wcccanumber, $mysqli) {
    // Delete whitespace and create an array of units assigned to the call
    $unit = preg_replace('/\s+/', '', $units);
    $unitsarray = explode(",", $unit);
    for ($i = 0; $i < count($unitsarray); $i++) {
        $xml = new SimpleXMLElement('<xml/>');
        $query = "SELECT * FROM calls WHERE wcccanumber = '$wcccanumber'";
        $result = $mysqli->query($query);
        while ($row = mysqli_fetch_assoc($result)) {
            $draw = $xml->addChild('call');
            $draw->addChild('wcccanumber', $row['wcccanumber']);
            $draw->addChild('currentcall', $row['call']);
            $draw->addChild('county', $row['county']);
            $draw->addChild('id', $row['id']);
            $draw->addChild('location', $row['location']);
            $draw->addChild('callcreated', $row['callcreated']);
            $draw->addChild('station', $row['station']);
            $draw->addChild('units', $row['units']);
            $draw->addChild('calltype', $row['calltype']);
            $draw->addChild('lat', $row['lat']);
            $draw->addChild('lng', $row['lng']);
            $draw->addChild('inputtime', $row['inputtime']);
        }
        $fp = fopen("xml/units/$unitsarray[$i].xml", "wb");
        fwrite($fp, $xml->asXML());
        fclose($fp);
    }
    echo "--- Created units XML document for call: $wcccanumber";
    echo "</br>";
}
$fp = fopen("xml/units/$unitsarray[$i].xml","wb");
By opening the file with "wb", you truncate it before writing. Use "ab" (write-only, appends to the end of the file) or "ab+" (read or write, appends to the end of the file) instead.
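A minimal sketch of the difference (the filename here is illustrative):

```php
<?php
// Demonstrates the append behaviour described above.
// "units.xml" is an illustrative filename; start fresh so the demo is repeatable.
$file = 'units.xml';
if (file_exists($file)) {
    unlink($file);
}

$fp = fopen($file, 'ab');        // 'ab' appends; 'wb' would truncate the file first
fwrite($fp, "<call>1</call>\n");
fclose($fp);

$fp = fopen($file, 'ab');        // reopening with 'ab' keeps the earlier content
fwrite($fp, "<call>2</call>\n");
fclose($fp);

echo file_get_contents($file);   // both <call> elements are still present
```

Note that appending raw fragments this way will not keep the file a single well-formed XML document (each `asXML()` call includes its own declaration and root element); the alternative the question hints at, loading the existing file with `simplexml_load_file()` when it exists and re-saving the whole document, avoids that.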
I can't seem to find an answer. Is there an easy way to change the array in my code to an associative array?
NOTE:
- All the variables being assigned are already declared.
- The code creates an array from a CSV file called StaffData, loops through it to find the next empty line, adds that row of data, then saves it back to the CSV file.
My problem is that I cannot figure out how to change this array to an associative array that will still work with my code. Any direction would be greatly appreciated. I hope my problem is clear.
// Grab the CSV file (and its existing data) and turn it into an array so the new data can be added to it.
$StaffDetails = array();
$lines = file('data/StaffData.csv', FILE_IGNORE_NEW_LINES);
foreach ($lines as $key => $value) {
    $StaffDetails[$key] = str_getcsv($value);
}
// This for loop checks for an available line to put the data into. It loops over
// the lines until there is an empty spot for the new data to go into.
// It will not run if the full name is empty.
for ($i = 0; $i < 200; $i++) {
    if (empty($StaffDetails[$i][0]) == false) { // If it's full
    } else if (empty($StaffDetails[$i][0]) == true) { // If it is empty
        // This writes the details of this staff member to that line in the data file.
        // $i is the available line found above.
        $StaffDetails[$i][0] = $UserFullName;
        $StaffDetails[$i][1] = $UserPhoneNumber;
        $StaffDetails[$i][2] = $Gender;
        $StaffDetails[$i][3] = $Birthday;
        $StaffDetails[$i][4] = $TypeOfWork;
        $StaffDetails[$i][5] = $StartingDate;
        $StaffDetails[$i][6] = $AnnualLeaveLeft;
        $StaffDetails[$i][7] = $SickLeaveLeft;
        $StaffDetails[$i][8] = $ID;
        $StaffDetails[$i][9] = $NumberOfWorkDaysAWeek;
        $StaffDetails[$i][10] = $Monday;
        $StaffDetails[$i][11] = $Tuesday;
        $StaffDetails[$i][12] = $Wednesday;
        $StaffDetails[$i][13] = $Thursday;
        $StaffDetails[$i][14] = $Friday;
        $StaffDetails[$i][15] = $Saturday;
        $StaffDetails[$i][16] = $Sunday;
        $StaffDetails[$i][17] = $UserUsername;
        $StaffDetails[$i][18] = $UserPassword;
        $StaffDetails[$i][19] = $Em1FullName;
        $StaffDetails[$i][20] = $Em1PhoneNumber;
        $StaffDetails[$i][21] = $Em2FullName;
        $StaffDetails[$i][22] = $Em2PhoneNumber;
        $i = 201;
    }
    // Below saves the previous data and new changes to the CSV file.
    // This opens the CSV file as a write file.
    $MyCsvFile = fopen('data/StaffData.csv', 'w');
    // This takes the array that has had data added to it and writes it to the file.
    foreach ($StaffDetails as $fields) {
        fputcsv($MyCsvFile, $fields);
    }
    // This closes the file.
    fclose($MyCsvFile);
}
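One way to sketch the associative-array conversion the question asks about is `array_combine`, which pairs a list of field names with each parsed CSV row. The field names and sample data below are illustrative, not from the real StaffData file:

```php
<?php
// A hedged sketch: build associative rows from CSV lines with array_combine.
// The field names are illustrative; match them to the real column order.
$fields = ['FullName', 'PhoneNumber', 'Gender'];
$lines  = ['Alice Smith,555-1234,F', 'Bob Jones,555-9876,M'];

$StaffDetails = [];
foreach ($lines as $line) {
    // array_combine pairs each field name with the value in the same position
    $StaffDetails[] = array_combine($fields, str_getcsv($line));
}

echo $StaffDetails[0]['FullName']; // rows are now addressed by name, not index
```

The rest of the code would then assign `$StaffDetails[$i]['FullName'] = $UserFullName;` and so on, and `fputcsv` still works because it writes the values in array order.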
Hopefully I won't be a novice for long. I have a two-dimensional array (simplified) that I'm trying to work with: simply pulling a record out of a file, adding to it, and uploading it again. Can somebody forgive my ignorance and explain what I'm doing wrong?
<?php
// Updating the current number of vendors
$vendorcount = @file_get_contents('../Keys/VendorCount/$v');
if (isset($vendorcount)) {
    $new_vendor_number = ($vendorcount + 1);
    $n = $new_vendor_number;
} else {
    $vendorcount = 0;
    $new_vendor_number = 1;
    $file = '../Keys/VendorCount/$v';
    file_put_contents($file, $vendorcount);
}
// Getting the record from the file
$record = file_get_contents('../Vendors/$vendorlist');
// Adding new information to the record array
$record[$n] = array($new_vendor_number, $catname);
// Uploading the updated record
$file = '../Vendors/$vendorlist';
file_put_contents($file, $record);
?>
You need to use serialize() and unserialize() in order to store an array to a file or restore it from a file. Like this:
$array = array(1,2,3);
// writing to file
file_put_contents('file.txt', serialize($array));
// restoring from file
$array = unserialize(file_get_contents('file.txt'));
This will work with arrays of any dimension.
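To illustrate the round trip with a two-dimensional array like the vendor list above (filename and data are illustrative):

```php
<?php
// Round-tripping a two-dimensional array through a file, as described above.
$records = [
    ['1', 'Books'],
    ['2', 'Music'],
];
file_put_contents('vendors.txt', serialize($records));

$restored = unserialize(file_get_contents('vendors.txt'));
var_dump($restored === $records); // the nested structure survives intact
```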
I have a log file like this:
2013-04-08-03-17-52: Cleaning Up all data for restart operation
2013-04-08-03-18-02: Creating new instance of app before closing
2013-04-08-03-18-03: New instance created and running
2013-04-08-03-18-03: Application started
Currently I am loading the full log file every second to show it to the user via jQuery AJAX. As this is very inefficient, I am trying to figure out some way to load only the updated lines from the log file.
Is there any way to get only the lines after a particular timestamp, e.g. 2013-04-08-03-18-03? For this I will keep a variable with the last timestamp and update it every time I get new lines.
I am kind of new to PHP and know only the basics of reading and writing files.
You might want to check the file modification time first to see whether or not you need to reload the log file. You can use the filemtime-function for that.
Furthermore, you can use the file_get_contents-function using an offset to read the file from a certain point.
Edit: So, how does it work?
Suppose you have stored the last modification time in a session variable $_SESSION['log_lastmod'] and the most recent offset in $_SESSION['log_offset'].
session_start();
// First, check if the variables exist. If not, create them so that the entire log is read.
if (!isset($_SESSION['log_lastmod']) && !isset($_SESSION['log_offset'])) {
    $_SESSION['log_lastmod'] = 0;
    $_SESSION['log_offset'] = 0;
}
if (filemtime('log.txt') > $_SESSION['log_lastmod']) {
    // Read the file from the stored offset
    $newcontent = file_get_contents('log.txt', false, null, $_SESSION['log_offset']);
    // Set the new offset (add the newly read characters to the last offset)
    $_SESSION['log_offset'] += strlen($newcontent);
    // Set the new last-modified time
    $_SESSION['log_lastmod'] = filemtime('log.txt');
    // Manipulate $newcontent here to what you want it to show
} else {
    // Put whatever should be returned to the client if there are no updates here
}
You can try
session_start();
if (!isset($_SESSION['log'])) {
    $log = new stdClass();
    $log->timestamp = "2013-04-08-03-18-03";
    $log->position = 0;
    $log->max = 2;
    $_SESSION['log'] = $log;
} else {
    $log = $_SESSION['log'];
}
$format = "Y-m-d-H-i-s"; // matches timestamps like 2013-04-08-03-18-03
$filename = "log.txt";
// Get the last seen date
$dateLast = DateTime::createFromFormat($format, $log->timestamp);
$fp = fopen($filename, "r");
fseek($fp, $log->position); // prevents loading the whole file into memory
$output = array();
$i = 0;
while ($i < $log->max && !feof($fp)) {
    $content = fgets($fp);
    $lineStamp = strstr($content, ":", true); // timestamp portion of the line
    // Skip lines that are not newer than the last seen date
    if (DateTime::createFromFormat($format, $lineStamp) <= $dateLast) {
        continue;
    }
    $log->position = ftell($fp); // save the current position
    $log->timestamp = $lineStamp; // save the current time
    $output[] = $content;
    $i++;
}
fclose($fp);
echo json_encode($output); // send to ajax
Try something like this:
<?php
session_start();
$lines = file(/* Insert logfile path here */);
if (isset($_SESSION["last_date_stamp"])) {
    $new_lines = array();
    foreach ($lines as $line) {
        // If the date stamp is newer than the session variable, add it to $new_lines
    }
    $returnLines = array_reverse($new_lines); // the reverse puts the lines in the right order
} else {
    $returnLines = $lines;
}
$_SESSION['last_date_stamp'] = /* Insert the most recent date stamp here */;
// return the $returnLines variable
?>
This reads the file line by line and, if a session variable already exists, adds every line whose date stamp is newer than the one in the session variable to a return variable $returnLines; if no session variable exists, it puts all of the lines in the file into $returnLines.
After this, it creates/updates the session variable for use the next time the code is executed. Finally, it returns the log file data via the $returnLines variable.
Hope this helps.
I want to query data from the database and add it to an XML file.
For example, I have a table_persons with a name and an age. I create a MySQL query to get each name and age, then simply put the data (name and age of each person) into an XML file.
How would you do that? Or is it possible?
I suggest you use DomDocument and file_put_contents to create your XML file.
Something like this:
// Create XML document
$doc = new DomDocument('1.0', 'UTF-8');
// Create root node
$root = $doc->createElement('persons');
$root = $doc->appendChild($root);
while ($row = mysql_fetch_assoc($result)) {
    // Add a node for each row
    $node = $doc->createElement('person');
    $node = $root->appendChild($node);
    foreach ($row as $column => $value) {
        $columnElement = $doc->createElement($column);
        $columnElement = $node->appendChild($columnElement);
        $columnValue = $doc->createTextNode($value);
        $columnValue = $columnElement->appendChild($columnValue);
    }
}
// Complete XML document
$doc->formatOutput = true;
$xmlContent = $doc->saveXML();
// Save to file
file_put_contents('persons.xml', $xmlContent);
<?php
[snip] // database code here
$f = fopen('myxml.xml', 'a+');
while ($row = mysqli_fetch_assoc($resultFromQuery)) {
    $str = "<person>
    <name>{$row['name']}</name>
    <age>{$row['age']}</age>
</person>\n";
    fwrite($f, $str);
}
fclose($f);
?>
Assuming you use mysqli, this code works; if not, adjust to fit. In the fopen call, a+ opens the file for reading and writing, placing the pointer at the end of the file.
Best of luck.
I am using PHPExcel to create an Excel document, using data from a MySQL database. My script must execute in under 512MB of RAM, and I am running into trouble as my export reaches 200k records:
PHP Fatal error: Allowed memory size of...
How can I use PHPExcel to create large documents in as little amount of RAM as possible?
My current code:
// Autoload classes
ProjectConfiguration::registerPHPExcel();
$xls = new PHPExcel();
$xls->setActiveSheetIndex(0);
$i = 0;
$j = 2;
// Write the column names
foreach ($columnas_excel as $columna) {
    $xls->getActiveSheet()->setCellValueByColumnAndRow($i, 1, $columna);
    $xls->getActiveSheet()->getColumnDimensionByColumn($i)->setAutoSize(true);
    $i++;
}
// Paginate the result from the database
$pager = new sfPropelPager('Antecedentes', 50);
$pager->setCriteria($query_personas);
$pager->init();
$last_page = $pager->getLastPage();
// Write the data to the Excel object
for ($pagina = 1; $pagina <= $last_page; $pagina++) {
    $pager->setPage($pagina);
    $pager->init();
    foreach ($pager->getResults() as $persona) {
        $i = 0;
        foreach ($columnas_excel as $key_col => $columnas) {
            $xls->getActiveSheet()->setCellValueByColumnAndRow($i, $j, $persona->getByName($key_col, BasePeer::TYPE_PHPNAME));
            $i++;
        }
        $j++;
    }
}
// Write the file to disk
$writer = new PHPExcel_Writer_Excel2007($xls);
$filename = sfConfig::get('sf_upload_dir') . DIRECTORY_SEPARATOR . "$cache.listado_personas.xlsx";
if (file_exists($filename)) {
    unlink($filename);
}
$writer->save($filename);
CSV version:
// Write the column names to the file
$columnas_key = array_keys($columnas_excel);
file_put_contents($filename, implode(",", $columnas_excel) . "\n");
// Write data to the file
for ($pagina = 1; $pagina <= $last_page; $pagina++) {
    $pager->setPage($pagina);
    $pager->init();
    foreach ($pager->getResults() as $persona) {
        $persona_arr = array();
        // Build an array for the row
        foreach ($columnas_excel as $key_col => $columnas) {
            $persona_arr[] = $persona->getByName($key_col, BasePeer::TYPE_PHPNAME);
        }
        // Append the row to the file
        file_put_contents($filename, implode(",", $persona_arr) . "\n", FILE_APPEND | LOCK_EX);
    }
}
I still have the RAM problem when Propel queries the database; it seems as if Propel does not release memory on each new request. I even tried creating and deleting the pager object in each iteration.
Propel has formatters in its Query API; you'll be able to write this kind of code:
<?php
$query = AntecedentesQuery::create()
// Some ->filter()
;
$csv = $query->toCSV();
$csv contains CSV content you'll be able to render by setting the correct mime-type.
Since it appears you can use a CSV, try pulling 1 record at a time and appending it to your CSV. Don't try to get all 200k records at the same time.
$cursor = mysql_query($sqlToFetchData); // get a MySQL result resource for your query
$fileHandle = fopen('data.csv', 'a');   // use 'a' for append mode
while ($row = mysql_fetch_row($cursor)) { // pull your data one record at a time
    fputcsv($fileHandle, $row);         // append the record to the CSV file
}
fclose($fileHandle);        // clean up
mysql_free_result($cursor); // free the result resource
I'm not sure how to transform the CSV into an XLS file, but hopefully this will get you on your way.