I have a PHP function that parses an Excel file. It reads the total number of rows, but it doesn't return all the data in the object.
The file has 3247 rows, but the function returns only 1023 of them.
Here is the parsing function:
public function parseEquipement($filePath = null) {
    set_time_limit(0);
    $listEquipement = [];
    $count = 0;
    $chunkSize = 8192;
    $objReader = PHPExcel_IOFactory::createReader(PHPExcel_IOFactory::identify($filePath));
    $spreadsheetInfo = $objReader->listWorksheetInfo($filePath);
    $chunkFilter = new \Floose\Parse\ChunkReadFilter();
    $objReader->setReadFilter($chunkFilter);
    $objReader->setReadDataOnly(true);
    $chunkFilter->setRows(0, 1);
    $objPHPExcel = $objReader->load($filePath);
    $totalRows = $spreadsheetInfo[0]['totalRows'];
    for ($startRow = 1; $startRow <= $totalRows; $startRow += $chunkSize) {
        $chunkFilter->setRows($startRow, $chunkSize);
        $objPHPExcel = $objReader->load($filePath);
        $sheetData = $objPHPExcel->getActiveSheet()->toArray(null, null, true, false);
        $startIndex = ($startRow == 1) ? $startRow : $startRow - 1;
        if (!empty($sheetData) && $startRow < $totalRows) {
            $dataToAnalyse = array_slice($sheetData, $startIndex, $chunkSize);
            if ($dataToAnalyse[0][0] == NULL) {
                break;
            }
            for ($i = 0; $i < $chunkSize; $i++) {
                if ($dataToAnalyse[$i]['0'] != NULL) {
                    $listEquipement[] = new Article($dataToAnalyse[$i]['0'], '', $dataToAnalyse[$i]['1']);
                    $count++;
                }
            }
        }
        //echo($totalRows); // this value is correct
        //echo($count);     // this value is wrong
        //print_r($listEquipement);
        $objPHPExcel->disconnectWorksheets();
        unset($objPHPExcel, $sheetData);
    }
    return $listEquipement;
}
I replaced all of the code with the following, but it doesn't work either:
public function parseEquipment($filePath = null) {
    $objReader = PHPExcel_IOFactory::createReader(PHPExcel_IOFactory::identify($filePath));
    $objReader->setReadDataOnly(true);
    $objPHPExcel = $objReader->load($filePath);
    $sheet = $objPHPExcel->getSheet(0);
    $highestRow = $sheet->getHighestRow();
    for ($row = 2; $row <= $highestRow; $row++) {
        echo $sheet->getCellByColumnAndRow(3, $row)->getCalculatedValue();
        echo $sheet->getCellByColumnAndRow(4, $row)->getCalculatedValue();
        echo $sheet->getCellByColumnAndRow(2, $row)->getCalculatedValue();
        $listEquipement[] = new Article(
            $sheet->getCellByColumnAndRow(3, $row)->getCalculatedValue(),
            $sheet->getCellByColumnAndRow(4, $row)->getCalculatedValue(),
            $sheet->getCellByColumnAndRow(2, $row)->getCalculatedValue()
        );
    }
}
When I run this code it always displays a memory-size error, even though my file is only 81K, and at the same time it still displays the number of rows:
Fatal error: Allowed memory size of 134217728 bytes exhausted (tried to allocate 54 bytes)
Could anyone be kind enough to guide me on how to fix my code, or suggest another way to parse an Excel file?
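As an aside, PHPExcel also supports cell caching, which is often the first thing to try for this kind of memory error. Below is a minimal sketch, assuming the same $filePath as in the function above; cache_to_phpTemp and the 8MB threshold are just example settings, not a recommendation from the original post:
// Minimal sketch: enable cell caching before the reader loads the file.
// cache_to_phpTemp spills cell data to php://temp once it grows past the
// configured memory size, instead of holding every cell object in memory.
$cacheMethod = PHPExcel_CachedObjectStorageFactory::cache_to_phpTemp;
PHPExcel_Settings::setCacheStorageMethod($cacheMethod, array('memoryCacheSize' => '8MB'));

// ...then build the reader exactly as before
$objReader = PHPExcel_IOFactory::createReader(PHPExcel_IOFactory::identify($filePath));
$objReader->setReadDataOnly(true);
$objPHPExcel = $objReader->load($filePath);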
Related
I need to read an xlsx file with 10 sheets, each sheet with about 3K rows.
Is there a way to loop through each sheet and chunk its rows?
Following the examples, I'm at this point:
public function import($file)
{
    $inputFileType = IOFactory::identify($file);
    $reader = IOFactory::createReader($inputFileType);
    // My ChunkReadFilter is exactly the same as the PhpSpreadsheet examples
    $chunkFilter = new ChunkReadFilter();
    $reader->setReadFilter($chunkFilter);
    $chunkSize = 100;
    $spreadsheet = $reader->load($file);
    $loadedSheetNames = $spreadsheet->getSheetNames();
    foreach ($loadedSheetNames as $sheetIndex => $loadedSheetName) {
        $sheet = $spreadsheet->getSheet($sheetIndex);
        //$highestRow = $sheet->getHighestRow(); // Is returning 1 as result
        $highestRow = 3000;
        for ($startRow = 1; $startRow <= $highestRow; $startRow += $chunkSize) {
            /** Tell the Read Filter which rows we want this iteration **/
            $chunkFilter->setRows($startRow, $chunkSize);
            $sheetData = $sheet->toArray(null, true, false, true);
            var_dump($sheetData);
        }
    }
}
The var_dump($sheetData); prints all of the sheet data, not just the current chunk.
So, how can I read each sheet's data and chunk its rows?
I'm using "phpoffice/phpspreadsheet": "^1.4"
I completely missed your goal (the question was not so clear), so I have completely changed my answer.
Assuming that you can loop through multiple sheets with the code below:
// .... add helper here....
$helper->log('Loading file ' . pathinfo($inputFileName, PATHINFO_BASENAME) . ' using IOFactory with a defined reader type of ' . $inputFileType);
$reader = IOFactory::createReader($inputFileType);
// Define how many rows we want for each "chunk"
$chunkSize = 10;
// Loop to read our worksheet in "chunk size" blocks
for ($startRow = 2; $startRow <= 50; $startRow += $chunkSize) {
    // ..... use the helper ...
    $helper->log('Loading WorkSheet using configurable filter for headings row 1 and for rows ' . $startRow . ' to ' . ($startRow + $chunkSize - 1));
    // Create a new Instance of our Read Filter, passing in the limits on which rows we want to read
    $chunkFilter = new ChunkReadFilter($startRow, $chunkSize);
    // Tell the Reader that we want to use the new Read Filter that we've just Instantiated
    $reader->setReadFilter($chunkFilter);
    // Load only the rows that match our filter from $inputFileName to a PhpSpreadsheet Object
    $spreadsheet = $reader->load($inputFileName);
    $sheetCount = $spreadsheet->getSheetCount();
    for ($i = 0; $i < $sheetCount; $i++) {
        $sheet = $spreadsheet->getSheet($i);
        // ...not what you want, but I leave this here
        $highestRow = $sheet->getHighestRow();
        echo "<p> Sheet n. " . $i . " highest row is:" . ($highestRow) . "</p>";
        $sheetData = $sheet->toArray(null, true, true, true);
        var_dump($sheetData);
    }
}
...to reach your goal I guess you need to add use PhpOffice\PhpSpreadsheet\Reader\IReadFilter; and build your own filter, so that you can set the highest row inside the for loop according to your needs.
This code is taken from the documentation; the public function setRows() is, I guess, where you need to put your own code, and then you call the filter in the for loop:
namespace Samples\Sample12;

use PhpOffice\PhpSpreadsheet\IOFactory;
use PhpOffice\PhpSpreadsheet\Reader\IReadFilter;

require __DIR__ . '/../Header.php';

$inputFileType = 'Xls';
$inputFileName = __DIR__ . '/sampleData/example2.xls';

/** Define a Read Filter class implementing IReadFilter */
class ChunkReadFilter implements IReadFilter
{
    private $startRow = 0;
    private $endRow = 0;

    /**
     * Set the list of rows that we want to read.
     *
     * @param mixed $startRow
     * @param mixed $chunkSize
     */
    public function setRows($startRow, $chunkSize)
    {
        $this->startRow = $startRow;
        $this->endRow = $startRow + $chunkSize;
    }

    public function readCell($column, $row, $worksheetName = '')
    {
        // Only read the heading row, and the rows that are configured in $this->startRow and $this->endRow
        if (($row == 1) || ($row >= $this->startRow && $row < $this->endRow)) {
            return true;
        }
        return false;
    }
}

$helper->log('Loading file ' . pathinfo($inputFileName, PATHINFO_BASENAME) . ' using IOFactory with a defined reader type of ' . $inputFileType);
// Create a new Reader of the type defined in $inputFileType
$reader = IOFactory::createReader($inputFileType);
// Define how many rows we want to read for each "chunk"
$chunkSize = 10;
// Create a new Instance of our Read Filter
$chunkFilter = new ChunkReadFilter();
// Tell the Reader that we want to use the Read Filter that we've Instantiated
$reader->setReadFilter($chunkFilter);
$spreadsheet = $reader->load($inputFileName);
$sheetCount = $spreadsheet->getSheetCount();
for ($i = 0; $i < $sheetCount; $i++) {
    $sheet = $spreadsheet->getSheet($i);
    // ...we get the highest row here, now
    $highestRow = $sheet->getHighestRow();
    for ($startRow = 2; $startRow <= $highestRow; $startRow += $chunkSize) {
        // ..just to check the output
        echo "<p> Sheet n. " . $i . " highest row is:" . ($highestRow) . "</p>";
        $helper->log('Loading WorkSheet using configurable filter for headings row 1 and for rows ' . $startRow . ' to ' . ($startRow + $chunkSize - 1));
        // Tell the Read Filter, the limits on which rows we want to read this iteration
        $chunkFilter->setRows($startRow, $chunkSize);
        // Load only the rows that match our filter from $inputFileName to a PhpSpreadsheet Object
        $spreadsheet = $reader->load($inputFileName);
        // Do some processing here
        $sheetData = $spreadsheet->getActiveSheet()->toArray(null, true, true, true);
        var_dump($sheetData);
    }
}
I am still new to this, but I tried out a solution that helps here:
We can read the file in chunks through the Excel sheet as mentioned in the comments above, but to save memory we can create the reader inside the loop and release it at the end of the loop, as shown below:
// Define how many rows we want to read for each "chunk"
$chunkSize = 1000;
// Loop to read our worksheet in "chunk size" blocks
for ($startRow = 1; $startRow <= $rawRows; $startRow += $chunkSize) {
    // Create a new Reader of the type defined in $inputFileType
    $reader = IOFactory::createReader($inputFileType);
    // Create a new Instance of our Read Filter
    $chunkFilter = new Chunk();
    // Tell the Reader that we want to use the Read Filter that we've Instantiated
    $reader->setReadFilter($chunkFilter);
    // Tell the Read Filter, the limits on which rows we want to read this iteration
    $chunkFilter->setRows($startRow, $chunkSize);
    // Load only the rows that match our filter from $inputFileName to a PhpSpreadsheet Object
    $spreadsheet = $reader->load($inputFileName);
    .....
    // process the file
    .....
    // then release the memory
    $spreadsheet->__destruct();
    $spreadsheet = null;
    unset($spreadsheet);
    $reader->__destruct();
    $reader = null;
    unset($reader);
}
For large sheets this helps the script use only the memory needed for one chunk and never exceed the memory limit.
Please let me know if this is helpful.
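A small note on the "release the memory" step: PhpSpreadsheet's documentation also suggests breaking the worksheet cross-references before dropping the object, since the cyclic references can otherwise keep it alive. A minimal sketch of that variant, using the same variables as above:
// Sketch: let the library break the spreadsheet <-> worksheet cycle,
// then drop the references instead of calling __destruct() directly.
$spreadsheet->disconnectWorksheets();
unset($spreadsheet, $reader);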
I've got an array with specifications. I want each specification to become a column, but I am having trouble working this out.
$specifications = new Specifications();
$columnCounter = 1;
foreach ($specifications as $specificationId => $specification) {
    $column = PHPExcel_Cell::getColumnByNumber($columnCounter);
    $objPHPExcel
        ->getActiveSheet()
        ->getColumnDimension($column)
        ->setAutoSize(true)
    ;
    $objPHPExcel->setActiveSheetIndex(0)
        ->setCellValue($column.'1', $specification['value'])
    ;
    $columnCounter++;
}
The PHPExcel_Cell::getColumnByNumber() call is of course an imaginary function, but I am wondering how others do this and how best to address it.
$book = new PHPExcel();
$book->setActiveSheetIndex(0);
$sheet = $book->getActiveSheet();
$sheet->setTitle('Sets');

$xls_row = 5;
$xls_col = 3;
foreach ($specifications as $specificationId => &$specification)
{
    $adr = coord($xls_row, $xls_col);
    $sheet->setCellValueExplicit($adr, $specification->title, PHPExcel_Cell_DataType::TYPE_STRING);
    $sheet->getColumnDimension(coord_x($xls_col))->setAutoSize(true);
    $xls_col++;
}

// convert a 0-based coordinate value into EXCEL B1-format
function coord_x($x)
{
    if ($x < 26) {
        $x = chr(ord('A') + $x);
    }
    else
    {
        $x -= 26;
        $c1 = $x % 26;
        $c2 = intval(($x - $c1) / 26);
        $x = chr(ord('A') + $c2) . chr(ord('A') + $c1);
    }
    return $x;
}

// convert X,Y 0-based cell address into EXCEL B1-format pair
function coord($y, $x)
{
    return coord_x($x) . ($y + 1);
}
I have 2 functions, each of which parses an Excel file; the file sizes are 673K and 131K. The functions have the same code, only their names differ.
One function reads the data from its Excel file, while the other one returns:
Fatal error: Allowed memory size of 134217728 bytes exhausted (tried to allocate 72 bytes)
I have other Excel files that are much bigger than these two, and their parsing functions work well.
I want to create a log file that records every action they perform inside the system, but I have no idea how to do it. I found this solution on Stack Overflow from @Lawrence Cherone:
enter link description here
The problem is that this is the first time I am creating a log file, and I don't know how to do it. Do I create a new file and put it in my project? How do I execute it, and how can I see the reason for the error in my function? Or do I put this code inside the function where I have the error, like the solution proposed by @Lawrence Cherone?
That solution seems to have worked fine there and the problem was resolved.
The following is the small code of my function; can you guide me on how to create this log file to debug it?
public function parseEquipment($filePath = null) {
    set_time_limit(0);
    $listEquipement = [];
    $count = 0;
    $chunkSize = 1024;
    $objReader = PHPExcel_IOFactory::createReader(PHPExcel_IOFactory::identify($filePath));
    $spreadsheetInfo = $objReader->listWorksheetInfo($filePath);
    $chunkFilter = new \Floose\Parse\ChunkReadFilter();
    $objReader->setReadFilter($chunkFilter);
    $objReader->setReadDataOnly(true);
    $chunkFilter->setRows(0, 1);
    $objPHPExcel = $objReader->load($filePath);
    $totalRows = $spreadsheetInfo[0]['totalRows'];
    for ($startRow = 1; $startRow <= $totalRows; $startRow += $chunkSize) {
        $chunkFilter->setRows($startRow, $chunkSize);
        $objPHPExcel = $objReader->load($filePath);
        $sheetData = $objPHPExcel->getActiveSheet()->toArray(null, null, true, false);
        $startIndex = ($startRow == 1) ? $startRow : $startRow - 1;
        if (!empty($sheetData) && $startRow < $totalRows) {
            $dataToAnalyse = array_slice($sheetData, $startIndex, $chunkSize);
            //echo 'test1';
            if ($dataToAnalyse[1][0] == NULL) {
                //echo 'test2';
                break;
            }
            //echo 'test3';
            //var_dump($sheetData);
            for ($i = 0; $i < $chunkSize; $i++) {
                if ($dataToAnalyse[$i]['0'] != NULL) {
                    //echo 'OK';
                    $listEquipement[] = new Article($dataToAnalyse[$i]['3'], $dataToAnalyse[$i]['4'], $dataToAnalyse[$i]['2']);
                    // echo 'test4';
                    $count++;
                }
            }
        }
        $objPHPExcel->disconnectWorksheets();
        unset($objPHPExcel, $sheetData);
    }
    //var_dump(array_slice($sheetData, $startIndex, $chunkSize));
    return $listEquipement;
}
error_log("You messed up!".$my_message, 3, "/var/tmp/custom-errors.log");
I use the mysqli API to query a large table, 1000 rows at a time, but my server's memory usage grows very fast; free memory drops to 0, and even the swap fills up. I don't know how to fix it.
The table has 4 million rows, so I query it in batches of 1000.
Here is my code:
<?php
ini_set('memory_limit','32M');
$config = require_once('config.php');
$attachmentRoot = $config['attachment_root'];
$mysqli = new mysqli($config['DB_HOST'],$config['DB_USER'],$config['DB_PASSWORD'],$config['DB_NAME']);
$mysqli->set_charset('gbk');
if(!$mysqli)
    throw new Exception('DB connnect faild:'.mysqli_connect_errno().mysqli_connect_error());

echo "\nRename The Dup Files With Suffix: .es201704111728es \n";
$startTime = microtime(true);

/**
 *
 * Move dup file to $name + .es201704111728es
 */
$suffix = ".es201704111728es";
$fileLinesLimit = 100000;
$listSuffix = 0;
$lines = 0;

/**
 * Create File List.
 */
$fileList = '/tmp/Dupfilelist.txt';
$baseListName = $fileList.$listSuffix;
//$fs = fopen($baseListName,'w');
$totalSize = 0;
$start = 0;
$step = 10000;
$sql = "SELECT id,filepath,ids,duplicatefile,filesize FROM duplicate_attachment WHERE id> $start AND duplicatefile IS NOT NULL LIMIT $step";
$result = $mysqli->query($sql);
while($result->num_rows > 0)
{
    while($result->fetch_row())
    {
        /*$fiepath = $row[1];
        $uniqueIdsArray = array_unique(explode(',',$row[2]));if(empty($row[3]))throw new \Exception("\n".'ERROR:'.$row[0]."\n".var_export($row[3],true)."\n");
        $uniqueFilesArray = array_unique(explode(',',$row[3]));
        $hasFile = array_search($fiepath,$uniqueFilesArray);
        if($hasFile !== false)
            unset($uniqueFilesArray[$hasFile]);
        $num = count($uniqueIdsArray);
        $fileNum = count($uniqueFilesArray);
        $ids = implode(',',$uniqueIdsArray);
        if($num>1 && $fileNum>0){
            //echo "\nID: $row[0] . File Need To Rename:".var_export($uniqueFilesArray,true)."\n";
            $size = intval($row[4]);
            if($lines >= $fileLinesLimit){
                $lines = 0;
                $listSuffix++;
                //$fileList .= $listSuffix;
            }
            array_map(function($file) use ($attachmentRoot,$suffix,$fiepath,$totalSize,$size,$fileLinesLimit,&$listSuffix,&$lines,$fileList){
                //$fs = fopen($fileList.$listSuffix,'a');
                if($file === $fiepath)
                    return -1;
                $source = $file;
                $target = $source.$suffix;
                //rename($source,$target);
                //fwrite($fs,$source.','.$target."\n");
                //file_put_contents($fileList.$listSuffix, $source.','.$target."\n",FILE_APPEND);
                //$totalSize += intval($size);
                $lines ++;
                //echo memory_get_usage()."\n";
                //fclose($fs);
                //unset($fs);
                //try to write file without amount memory cost
                //$ts = fopen('/tmp/tempfile-0412','w');
            },$uniqueFilesArray);
            //echo "Test Just One Attachment Record.\n";
            //echo "Ids:$ids\n";
            //exit();
        }*/
    }
    echo memory_get_peak_usage(true)."\n";
    if(!$mysqli->ping())
    {
        echo "Mysql Conncect Failed.Reconnecting.\n";
        $mysqli = new mysqli($config['DB_HOST'],$config['DB_USER'],$config['DB_PASSWORD'],$config['DB_NAME']);
        $mysqli->set_charset('gbk');
        if(!$mysqli)
            throw new Exception('DB connnect faild:'.mysqli_connect_errno().mysqli_connect_error());
    }
    //mysqli_free_result($result);
    $result->close();
    unset($result);
    $start += $step;
    $sql = "SELECT id,filepath,ids,duplicatefile,filesize FROM duplicate_attachment WHERE id> $start AND duplicatefile IS NOT NULL LIMIT $step";
    $result = $mysqli->query($sql);
}
echo "Dup File Total Size: $totalSize\n";
echo "Script cost time :".(microtime(true)-$startTime)." ms\n";
sleep(1000*10);
mysqli_close($mysqli);
exit();
I had enabled the XDEBUG extension. Sorry for that.
I disabled the extension and now everything works well.
I ran into this issue with PHP version 7.3.26 on CentOS 7. I worked around it by using an unbuffered query (instead of a buffered one). In the above example, replace
$result = $mysqli->query($sql)
with
$result = $mysqli->query($sql, MYSQLI_USE_RESULT)
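One caveat worth adding, as a sketch using the same $mysqli and $sql as in the question: with MYSQLI_USE_RESULT the rows are streamed from the server rather than buffered in PHP, so the result set has to be fully fetched or freed before the next query is issued on that connection.
$result = $mysqli->query($sql, MYSQLI_USE_RESULT);   // unbuffered
while ($row = $result->fetch_row()) {
    // process $row here without keeping the whole batch in PHP memory
}
$result->free();   // required before the connection can run the next query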
I am trying to use the TCPDF library to produce a PDF file that supports Arabic letters. I have created a table and I get the data from a MySQL database.
<?php
// Include the main TCPDF library (search for installation path).
require_once('TCPDF/config/tcpdf_config.php');
require_once('TCPDF/tcpdf.php');
class PDF extends TCPDF{
var $tablewidths;
var $headerset;
var $footerset;
function morepagestable($lineheight=8) {
// some things to set and 'remember'
$l = $this->lMargin*2;
$startheight = $h = $this->GetY();
$startpage = $currpage = $this->page;
// calculate the whole width
foreach($this->tablewidths as $width) {
$fullwidth += $width;
}
// Now let's start to write the table
$row = 0;
while($data=mysql_fetch_row($this->results)) {
$this->page = $currpage;
// write the horizontal borders
$this->Line($l,$h,$fullwidth+$l,$h);
// write the content and remember the height of the highest col
foreach($data as $col => $txt) {
$this->page = $currpage;
$this->SetXY($l,$h);
$this->MultiCell($this->tablewidths[$col],$lineheight,$txt,0,$this->colAlign[$col]);
$l += $this->tablewidths[$col];
if($tmpheight[$row.'-'.$this->page] < $this->GetY()) {
$tmpheight[$row.'-'.$this->page] = $this->GetY();
}
if($this->page > $maxpage)
$maxpage = $this->page;
unset($data[$col]);
}
// get the height we were in the last used page
$h = $tmpheight[$row.'-'.$maxpage];
// set the "pointer" to the left margin
$l = $this->lMargin*2;
// set the $currpage to the last page
$currpage = $maxpage;
unset($datas[$row]);
$row++ ;
}
// draw the borders
// we start adding a horizontal line on the last page
$this->page = $maxpage;
$this->Line($l,$h,$fullwidth+$l,$h);
// now we start at the top of the document and walk down
for($i = $startpage; $i <= $maxpage; $i++) {
$this->page = $i;
$l = $this->lMargin*2;
$t = ($i == $startpage) ? $startheight : $this->tMargin;
$lh = ($i == $maxpage) ? $h : $this->h-$this->bMargin;
$this->Line($l,$t,$l,$lh);
foreach($this->tablewidths as $width) {
$l += $width;
$this->Line($l,$t,$l,$lh);
}
}
// set it to the last page, if not it'll cause some problems
$this->page = $maxpage;
}
function connect($host='xxxx',$username='xxxx',$passwd='xxxx',$db='xxxxx')
{
$this->conn = mysql_connect($host,$username,$passwd) or die( mysql_error() );
mysql_select_db($db,$this->conn) or die( mysql_error() );
return true;
}
function query($query){
$this->results = mysql_query($query,$this->conn);
$this->numFields = mysql_num_fields($this->results);
}
function mysql_report($query,$dump=false,$attr=array()){
foreach($attr as $key=>$val){
$this->$key = $val ;
}
$this->query($query);
// if column widths not set
if(!isset($this->tablewidths)){
// starting col width
$this->sColWidth = (($this->w-$this->lMargin*2-$this->rMargin))/$this->numFields;
// loop through results header and set initial col widths/ titles/ alignment
// if a col title is less than the starting col width / reduce that column size
for($i=0;$i<$this->numFields;$i++){
$stringWidth = $this->getstringwidth(mysql_field_name($this->results,$i)) + 8 ;
if( ($stringWidth) < $this->sColWidth){
$colFits[$i] = $stringWidth ;
// set any column titles less than the start width to the column title width
}
$this->colTitles[$i] = mysql_field_name($this->results,$i) ;
switch (mysql_field_type($this->results,$i)){
case 'int':
$this->colAlign[$i] = 'L';
break;
default:
$this->colAlign[$i] = 'L';
}
}
// loop through the data, any column whose contents is bigger that the col size is
// resized
while($row=mysql_fetch_row($this->results)){
foreach($colFits as $key=>$val){
$stringWidth = $this->getstringwidth($row[$key]) + 6 ;
if( ($stringWidth) > $this->sColWidth ){
// any col where row is bigger than the start width is now discarded
unset($colFits[$key]);
}else{
// if text is not bigger than the current column width setting enlarge the column
if( ($stringWidth) > $val ){
$colFits[$key] = ($stringWidth) ;
}
}
}
}
foreach($colFits as $key=>$val){
// set fitted columns to smallest size
$this->tablewidths[$key] = $val;
// to work out how much (if any) space has been freed up
$totAlreadyFitted += $val;
}
$surplus = (sizeof($colFits)*$this->sColWidth) - ($totAlreadyFitted);
for($i=0;$i<$this->numFields;$i++){
if(!in_array($i,array_keys($colFits))){
$this->tablewidths[$i] = $this->sColWidth + ($surplus/(($this->numFields)-sizeof($colFits)));
}
}
ksort($this->tablewidths);
if($dump){
Header('Content-type: text/plain');
for($i=0;$i<$this->numFields;$i++){
if(strlen(mysql_field_name($this->results,$i))>$flength){
$flength = strlen(mysql_field_name($this->results,$i));
}
}
switch($this->k){
case 72/25.4:
$unit = 'millimeters';
break;
case 72/2.54:
$unit = 'centimeters';
break;
case 72:
$unit = 'inches';
break;
default:
$unit = 'points';
}
print "All measurements in $unit\n\n";
for($i=0;$i<$this->numFields;$i++){
printf("%-{$flength}s : %-10s : %10f\n",
mysql_field_name($this->results,$i),
mysql_field_type($this->results,$i),
$this->tablewidths[$i] );
}
print "\n\n";
print "\$pdf->tablewidths=\n\tarray(\n\t\t";
for($i=0;$i<$this->numFields;$i++){
($i<($this->numFields-1)) ?
print $this->tablewidths[$i].", /* ".mysql_field_name($this->results,$i)." */ \n\t\t":
print $this->tablewidths[$i]." /* ".mysql_field_name($this->results,$i)." */\n\t\t";
}
print "\n\t);\n";
exit;
}
} else { // end of if tablewidths not defined
for($i=0;$i<$this->numFields;$i++){
$this->colTitles[$i] = mysql_field_name($this->results,$i) ;
switch (mysql_field_type($this->results,$i)){
case 'int':
$this->colAlign[$i] = 'R';
break;
default:
$this->colAlign[$i] = 'L';
}
}
}
mysql_data_seek($this->results,0);
$this->Open();
$this->setY($this->tMargin);
$this->AddPage();
$this->morepagestable($this->FontSizePt);
$this->Output();
}
}
$pdf = new PDF(PDF_PAGE_ORIENTATION,PDF_UNIT,PDF_PAGE_FORMAT,true, 'UTF-8', false);
$pdf->SetFont('aealarabiya', '', 14, '', false);
$pdf->connect('xxxxxxxxxx','xxxxxxxxxxxxxx','xxxxxxxxxxxxx','xxxxxxxxxxxxx');
$attr=array('titleFontSize'=>24,'titleText'=>'THIS IS MY PDF FILE');
$pdf->mysql_report("SELECT * FROM Student_Table",false,$attr);
?>
I am OK with the table that is created by my code, but the Arabic data retrieved is displayed incorrectly. How can I solve this?
Your problem is very simple, bro: you need to choose the correct database encoding and run the query with it. But before going for that, I think it is my duty to tell you that YOU SHOULD NOT USE "mysql", YOU SHOULD USE "mysqli". Now let us solve the problem:
1. Set your database encoding to utf8_general_ci. Take a backup first; don't forget that all the old data will be destroyed if you change the charset of the database, so only do this if it's new or empty.
2. After the mysqli connect, call $db->set_charset("utf8"); or, if you still want to work with mysql, use mysql_set_charset("utf8");.
Here is the edited code:
function connect($host='xxxx',$username='xxxx',$passwd='xxxx',$db='xxxxx')
{
    $this->conn = mysql_connect($host,$username,$passwd) or die( mysql_error() );
    mysql_select_db($db,$this->conn) or die( mysql_error() );
    mysql_set_charset("utf8");
    return true;
}
But, as I told you, you did it the wrong way: use mysqli and do it with full OOP consistently, not only some of the time!
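For completeness, a minimal OOP-style mysqli sketch of the connection and query; the host, credentials and the Student_Table query are placeholders taken from the question:
// Sketch: mysqli in OOP style with the charset set once on the connection.
$db = new mysqli('localhost', 'user', 'password', 'database');
if ($db->connect_errno) {
    die('Connect failed: ' . $db->connect_error);
}
$db->set_charset('utf8');   // so the Arabic text is transferred as UTF-8

$result = $db->query('SELECT * FROM Student_Table');
while ($row = $result->fetch_assoc()) {
    // pass $row values to the TCPDF table here
}
$result->free();
$db->close();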