Importing a CSV file via MySQL and PHP

Due to an NDA at work I can't really go into much of the other code, but this is the snippet of code I'm questioning:
while (($data = fgetcsv($handle, 0, ',')) !== FALSE) {
    $csvstuff->execute($data);
    $affectedRows += 1;
    //echo "Test Rows:".$affectedRows."\n";
}
My code inserts into the database properly, as it should. But is there any way to protect against the CSV content that comes in? If a line contains an erroneous extra comma in the CSV string, it will break my code. Is there a way to check for this and prevent it from happening?
EDIT 1: I should also note that the vendor sends this CSV over via FTP every night, so it's coming from an automated system; I don't control their output format for the file.
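One common guard (a rough sketch, assuming $csvstuff is a prepared statement with a fixed number of placeholders and that EXPECTED_COLUMNS is the column count your table actually expects) is to validate the field count of each parsed row and skip anything that doesn't match, so a stray comma can never feed the wrong number of parameters to the insert:

define('EXPECTED_COLUMNS', 5); // hypothetical column count for your table

$affectedRows = 0;
$skippedRows  = 0;

while (($data = fgetcsv($handle, 0, ',')) !== FALSE) {
    if (count($data) !== EXPECTED_COLUMNS) {
        // A stray delimiter (or a blank line) produces the wrong field count:
        // log the raw fields and move on instead of breaking the insert.
        error_log('Skipping malformed CSV line: ' . implode(',', $data));
        $skippedRows += 1;
        continue;
    }
    $csvstuff->execute($data);
    $affectedRows += 1;
}

If the vendor's fields can themselves contain commas, they should quote those fields; fgetcsv handles quoted fields, and the count check then only catches genuinely broken lines.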

Related

FTP listing and downloading files for the current date

I have a case:
I have a remote server that contains a great many generated transaction files (.txt) from 2015 until now. I must download them every day, in near real time. For now I use PHP to download them all, but I don't think that method is effective. First I list all the files, and then I read each file's attributes such as the date modified, but this approach is painful: it makes my program run slowly and takes a lot of time.
This is my code (I'm using PHP with Yii2):
public function actionDownloadfile(){
    // Much time is needed while executing this line
    $contents = Yii::$app->ftpFs->listContents('/backup', ['timestamp','path','basename']);
    var_dump($contents);
    foreach ($contents as $value) {
        if (date('Y-m-d', $value['timestamp']) == date('Y-m-d')){
            echo "[".date('Y-m-d H:i:s')."] : Downloading file ".$value['basename']."\n";
            $isi = Yii::$app->ftpFs->read($value['path']);
            $dirOut = Yii::$app->params['out'];
            $fileoutgoing = $dirOut."/".$value['basename'];
            $file = fopen($fileoutgoing, "w");
            fwrite($file, $isi);
            fclose($file);
        }
    }
}
I have a question:
Is it possible to list and download only the files on the FTP server from the current date, without listing them all first?
Any solution, either in PHP or in a shell script, is OK.
Thank you so much!
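Not an answer to the Yii part specifically, but here is a minimal sketch using PHP's native FTP extension, assuming the server supports the MLSD command (ftp_mlsd needs PHP 7.2+ and reports a 'modify' fact per entry); the host, credentials and output directory are placeholders:

$conn = ftp_connect('ftp.example.com') or die('connect failed'); // placeholder host
ftp_login($conn, 'user', 'pass') or die('login failed');         // placeholder credentials
ftp_pasv($conn, true);

$today  = date('Ymd');
$dirOut = '/path/to/out'; // placeholder output directory

// ftp_mlsd returns one array per entry; the 'modify' fact is formatted
// as YYYYMMDDHHMMSS when the server supports MLSD.
$entries = ftp_mlsd($conn, '/backup');
if ($entries === false) {
    die('listing failed or server does not support MLSD');
}
foreach ($entries as $entry) {
    if (!isset($entry['type']) || $entry['type'] !== 'file') {
        continue; // skip directories and the . / .. entries
    }
    if (isset($entry['modify']) && substr($entry['modify'], 0, 8) === $today) {
        echo "[".date('Y-m-d H:i:s')."] : Downloading file ".$entry['name']."\n";
        ftp_get($conn, $dirOut.'/'.$entry['name'], '/backup/'.$entry['name'], FTP_BINARY);
    }
}
ftp_close($conn);

The listing is still one round trip, but the heavy per-file metadata reads go away and only today's files are transferred.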

Codeigniter Allowed memory size exhausted while processing large files

I'm posting this in case someone else is looking for the same solution, seeing as I just wasted two days on this bullshit.
I have a cron job that updates the database using a very large file once a day, using the following code:
if (($handle = fopen(dirname(__FILE__) . '/uncompressed', "r")) !== FALSE)
{
    while (($data = fgets($handle)) !== FALSE)
    {
        $thisline = json_decode($data, true);
        $this->regen($thisline);
    }
    fclose($handle);
}
This is in a Codeigniter controller that's only used for cron jobs. The $this->regen function runs through a bunch of different checks and stores the right information from the line in the database. The file itself is over 300MB of JSONs separated by newlines.
The problem: it would only process about 20,000 lines before the whole thing ran out of memory.
I spent hours troubleshooting this and got nothing obvious. I'm using fgets, and I have $query->free_result() in the right places; it didn't help. So then I ran a loop over about 100 lines and watched the output of memory_get_usage(). I finally narrowed it down to the CodeIgniter Active Record class - every call to the class caused the memory usage to increase by a small amount.
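Roughly, the diagnostic loop looked like this (a sketch of the idea rather than the exact cron code):

$count = 0;
while (($data = fgets($handle)) !== FALSE && $count < 100)
{
    $thisline = json_decode($data, true);
    $this->regen($thisline);
    // Print memory after every line; a steady climb points at whatever
    // the loop body is calling.
    echo $count . ': ' . memory_get_usage() . "\n";
    $count++;
}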
Then I found this thread on Ellislabs and I got the answer. CI Active Record saves queries so that if you want to, you can build a query in multiple functions. (I am not even going to go into how dumb it is to have that switched on by default.)
Go to /config/database.php and add
$db['default']['save_queries'] = FALSE;
to the end of the file. Then make sure you build and execute queries using Active Record in a single function. If you need to switch it off just for one case, use
$this->db->save_queries = FALSE;
in the constructor or wherever you need to put it.

PHP server file download cutoff unexpectedly

I have a web interface that I built into the admin section of a WordPress site. It scrapes a few tables in my database and just displays a big list of data row by row. There are about 30,000 rows of this data, displayed with a basic echo in a for loop. Displaying all 30,000 rows on a page works fine.
Additionally, I include an option to download a CSV file of the complete rows of data. I use fopen and then fputcsv to build the CSV file for download from the result of the data query. This feature used to work, but now that the dataset is at 30,000 rows, the CSV no longer generates correctly. What happens is that only the first 200~1000 rows are written to the CSV file, leaving out the majority of the data. I estimate that the properly generated CSV in my case would be about 10 MB. The file then downloads with just those first 200~1000 rows as though everything were working correctly.
Here is the code:
// This gets a huge list of data from a SP I built. This data is well formed
$data = $this->run_stats_stored_procedure($job_to_report);

// This is where the data is converted into a csv file. This part is broken
// the file may already exist at that location burn it down if it does
if (file_exists(ABSPATH . "some/path/to/my/file/csv_export.csv")) {
    unlink(ABSPATH . "some/path/to/my/file/csv_export.csv");
}
$csv_file_handler = fopen(ABSPATH . "some/path/to/my/file/candidate_export.csv", 'w');
if (!empty($csv_file_handler)) {
    $title_array = array(
        "ID",
        "other_field"
    );
    fputcsv($csv_file_handler, $title_array, ",");
    if (!empty($data)) {
        foreach ($data as $data_piece) {
            $array_as_csv_line = array();
            foreach ($data_piece as $object_property) {
                $array_as_csv_line[] = (string)$object_property;
            }
            fputcsv($csv_file_handler, $array_as_csv_line, ",");
            unset($array_as_csv_line);
        }
    } else {
        fputcsv($csv_file_handler, array("empty"), ",");
    }
    // pros clean everything up when they are done
    fclose($csv_file_handler);
}
I'm not sure what I need to change to get the entire CSV file to download. I believe this could be a configuration issue, but I'm not certain. I am led to believe this because the function used to work with even 20,000 CSV rows; it is now at 30,000 and breaking. Please let me know if additional info would help. Has anyone bumped into issues with huge CSV files before? Thank you to anyone who can help.
Is the "download" taking more than say a minute, two minutes, or three minutes? If so, the webserver could be closing the connection. For example, if you're using the Apache FCGI module, it has this directive:
FcgidBusyTimeout
which defaults to 300 seconds.
This is the maximum time limit for request handling. If a FastCGI request does not complete within FcgidBusyTimeout seconds, it will be subject to termination.
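If that turns out to be the limit you're hitting, raising it in the Apache configuration is one option; a sketch, assuming mod_fcgid is what serves your PHP and that 3600 seconds is a value you're comfortable with:

FcgidBusyTimeout 3600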
Hope this helps you solve your problem.
The answer that I am currently implementing is to allow the script to use more time. To do this, I am simply running the following code before the script runs:
set_time_limit(3600);
I am doing further research because this is not a sustainable solution. Any further advice would be greatly appreciated.

PHP Flock Writing to Open File

I have a PHP script that logs ads (banners) for a website and stores them in a .dat file. Inside this file an ID, a URL, and other important information is saved. The problem I am having is that there are, at any given time, 4 ads on the page, and so the .dat file often gets corrupted when the PHP script attempts to write to it while it is already open.
I checked and tried this solution however it did not help me:
PHP Simultaneous file access / flock() issue
The function I am using at the moment looks like this:
function writeads(){
    global $bannerAdsPath, $ads, $bannerAds;
    $data = fopen($bannerAdsPath, 'w') or die();
    flock($data, 2) or die();
    fputs($data, @join("\n", $ads)."\n");
    while (list($key, $val) = each($bannerAds)) {
        if (($key != '') && ($val != '')) {
            fputs($data, $key.'='.$val."\n");
        }
    }
    flock($data, 3);
    fclose($data);
    reset($bannerAds);
}
Any help would be appreciated as I have been scratching my head over this for a while.
Side note: the client did not want to have their code rewritten to use a database instead of a file, so that option is out.
Thanks!
fopen with 'w' truncates the file before you have the option of flocking it.
You almost never want to use flock to unlock a file; just use fclose; the file will be unlocked when the handle is closed, and that way you know that no buffered writes will happen after you unlock.
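A minimal sketch of the pattern that answer describes: open without truncating, take the lock, and only then empty and rewrite the file, leaving the unlock to fclose. (The loop is rewritten with foreach, since each() is deprecated in current PHP.)

function writeads() {
    global $bannerAdsPath, $ads, $bannerAds;
    $data = fopen($bannerAdsPath, 'c') or die(); // 'c' creates the file if needed but does not truncate it
    flock($data, LOCK_EX) or die();              // wait for an exclusive lock before touching the contents
    ftruncate($data, 0);                         // safe to empty the file now that we hold the lock
    fputs($data, join("\n", $ads) . "\n");
    foreach ($bannerAds as $key => $val) {
        if (($key != '') && ($val != '')) {
            fputs($data, $key . '=' . $val . "\n");
        }
    }
    fclose($data); // flushes buffered writes and releases the lock
}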

Load big file into database

I have a big file of about 11 MB. It is a CSV file and I need to load its contents into a Postgres database.
I use a PHP script to do this job, but it always stops at some point.
I increased the PHP memory limit and other settings, and I could load more data, but not all of it.
How can I solve this? Is there some cache I need to clear? Is there some secret to managing big files in PHP?
Thanks in advance.
UPDATE: Adding some code
$handler = fopen($fileName, "r");
$dbHandler = pg_connect($databaseConfig);
while (($line = fgetcsv($handler, 0, ";")) !== false) {
    // Algorithms to transform data
    // Adding sql sentences in a variable
    // I am using a "batch" idea that executes all the sql formed after 5000 read lines
    // When I reach 5000 read lines, execute my sql
    $results = pg_query($dbHandler, $sql);
}
In case you have direct access to the server (and you aren't restricted by some intermediate layer), Postgres has a far better option that is far less demanding in terms of resources. Keep in mind that PHP is a slow and resource-consuming language:
COPY my_table_name FROM '/home/myfile.csv' DELIMITER ',' CSV;
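If the file cannot live on the database server (COPY FROM a server-side path needs the file on that host and the right privileges), a rough sketch that pushes the rows over the client connection with pg_copy_from, batching to keep memory flat; the table name and batch size are placeholders, and it assumes plain semicolon-delimited lines with no quoting or escaping:

$dbHandler = pg_connect($databaseConfig);
$handler = fopen($fileName, "r");

$batch = array();
while (($line = fgets($handler)) !== false) {
    // pg_copy_from expects one delimited, linefeed-terminated string per row
    $batch[] = rtrim($line, "\r\n") . "\n";
    if (count($batch) >= 5000) {
        pg_copy_from($dbHandler, 'my_table_name', $batch, ';');
        $batch = array(); // drop the chunk before reading the next one
    }
}
if (!empty($batch)) {
    pg_copy_from($dbHandler, 'my_table_name', $batch, ';');
}
fclose($handler);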
