Strange issue I'm having: when I perform a file check with file_exists() or is_file(), it only checks half the files. What my script does is process a CSV file and insert the data into the table only if the file exists on the server. If I remove the file check, everything processes fine. I've double-checked to make sure all the files exist on the server; it just stops halfway through for some reason.
$column_headers = array();
$row_count = 0;
if (mysql_result(
        mysql_query("SELECT count(*) FROM load_test WHERE batch_id='" . $batchid . "'"), 0
    ) > 0) {
    die("Error batch already present");
}
while (($data = fgetcsv($handle, 0, ",")) !== FALSE) {
    if ($row_count == 0) {
        $column_headers = $data;
    } else {
        $dirchk1 = "/temp/files/" . $batchid . "/" . $data[0] . ".wav";
        $dirchk2 = "/files/" . $batchid . "/" . $data[1] . ".wav";
        if (file_exists($dirchk1)) {
            $importword = "INSERT into load_test SET
                word     = '" . $data[2] . "',
                batch_id = UCASE('" . $batchid . "'),
                accent   = '" . $data[15] . "'
            ";
            mysql_query($importword);
            $word_id = mysql_insert_id();
            echo $word_id . "\n";
        }
    }
    ++$row_count;
}
Try it using the "-e" test condition.
For example:
if (-e $dirchk1) {
    print "File exists\n";
}
(Note: -e is Perl's file-test operator; the PHP equivalent is file_exists().)
Also make sure your variables ($dirchk1, etc.) are being populated correctly.
Please check whether it works or not.
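Staying in PHP, a quick way to verify that each path is populated correctly is to log the result of file_exists() for every row. A minimal debugging sketch, assuming $handle and $batchid from the question's code:
while (($data = fgetcsv($handle, 0, ",")) !== FALSE) {
    $dirchk1 = "/temp/files/" . $batchid . "/" . $data[0] . ".wav";
    // Print the exact path tested and whether the check passes
    echo (file_exists($dirchk1) ? "OK:      " : "MISSING: ") . $dirchk1 . "\n";
}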
The script processed correctly; it was human error when verifying on my part.
I have a PHP script which downloads a CSV file and updates stock for products on PrestaShop.
The problem is that this script can only be executed through a GET request, and there is a timeout on the server side which causes an error.
I can't increase the timeout, so I have to figure out a workaround.
My idea is to have this script run the import snippet in the background (in another process) so the script itself ends almost immediately (i.e. no timeout) while the import runs in the background.
Is it possible?
My script:
<?php
#ini_set('max_execution_time', 0);
include(dirname(__FILE__) . '/config/config.inc.php');
include(dirname(__FILE__) . '/init.php');

$url = 'URL';
$file_name = basename($url);

if (file_put_contents($file_name, file_get_contents($url))) {
    echo "File downloaded successfully";
} else {
    echo "File downloading failed.";
    // die('error');
}
echo "\n";
echo "<br>";

$row = 1;
if (($handle = fopen($file_name, "r")) !== FALSE) {
    while (($data = fgetcsv($handle, 1000, ";")) !== FALSE) {
        $num = count($data);
        $row++;
        for ($c = 0; $c < $num; $c++) {
            // skip the first lines (CSV header)
            if ($row == 1 || $row == 2) {
                continue;
            }
            if (!($data[5] == 'Suk' || $data[5] == 'plus size')) {
                continue;
            }
            // get the attribute from the PrestaShop database
            if (empty($data[9]))
                continue;
            $productAttribut = findProductAttributByReference($data[9]);
            // if the product attribute exists
            if (!empty($productAttribut)) {
                echo $productAttribut['id_product_attribute'];
                // update quantity
                StockAvailable::setQuantity((int)$productAttribut['id_product'], (int)$productAttribut['id_product_attribute'], (int)$data[10], Context::getContext()->shop->id);
                echo "product " . $productAttribut['id_product_attribute'] . " quantity \n updated";
                echo "\n";
                echo "<br>";
            }
        }
    }
    fclose($handle);
    echo "\n";
    echo "<br>";
    echo "end script ";
}

function findProductAttributByReference($reference) {
    $sql = '
        SELECT *
        FROM `' . _DB_PREFIX_ . 'product_attribute`
        WHERE `reference` = "' . $reference . '"';
    $result = Db::getInstance()->getRow($sql);
    return $result;
}
?>
No, you can't.
As @Martin Paucot said in a comment, PHP is blocking: it runs the instructions one by one, and each instruction waits for the previous one to finish.
So your main script will be waiting for the import script to finish its execution.
Possible Way
Use a cron job to process the CSV in the background and push a notification once the job is completed, so the user is notified when it's done and can download the file using a link. (You will need to store some records in the database.)
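For instance, assuming the import script above is saved as update_stock.php in the shop root (the file name and paths are assumptions), a minimal crontab sketch that runs it through the PHP CLI every 30 minutes, outside the web server's timeout, and logs its output:
*/30 * * * * /usr/bin/php /var/www/shop/update_stock.php >> /var/log/stock_update.log 2>&1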
I am able to save the results as JSON. The problem I am having is saving it to a network drive.
I confirmed the path is good by using the following:
if ($handle = opendir('//network/IS/folder1/folder2/targetdirectory')) {
    echo "Directory handle: $handle\n";
    echo "Entries:\n";

    // This is the correct way to loop over the directory.
    while (false !== ($entry = readdir($handle))) {
        echo "$entry\n";
    }

    // This is the WRONG way to loop over the directory.
    while ($entry = readdir($handle)) {
        echo "$entry\n";
    }

    closedir($handle);
}
Using the above, I am able to see all of the files in the targetdirectory, so I know the path is good. The problem is writing/creating the text file that is to be saved in the targetdirectory.
The whole process looks like this:
<?php
include("../include/sessions.php");

if (isset($_POST['selectedOption'])) {
    $svc1 = mysqli_real_escape_string($dbc, $_POST['selectedOption']);
    $sql = "SELECT column1, column2, column3 FROM table WHERE column1 = '$svc1'";
    $query = mysqli_query($dbc, $sql);
    if ($query) {
        $out = array();
        while ($row = $query->fetch_assoc()) {
            $out[] = $row;
        }
        $json = json_encode($out); // I can change $json to echo and see $out in the console as JSON
        $file = fopen(__DIR__ . "//network/IS/folder1/folder2/targetdirectory", 'w');
        fwrite($file, $json);
        fclose($file);
    } else {
        echo "Error: " . mysqli_error($dbc);
    }
}
?>
Using the above process, nothing is being saved into the targetdirectory.
What can I do to fix this problem?
I solved my problem by removing __DIR__ and adding the filename at the end of the path:
$file = fopen("//network/IS/folder1/folder2/targetdirectory/newFile.txt", 'w');
fwrite($file, $json);
fclose($file);
The PHP function file_put_contents(), although it succeeded in creating a file, only continued to append to the same file in my case. Using the above will overwrite the target file in the targetdirectory.
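For comparison, a minimal file_put_contents() sketch using the same (hypothetical) file name; by default it overwrites, and it only appends when the FILE_APPEND flag is passed:
// Overwrites newFile.txt on each run; pass FILE_APPEND as a third argument to append instead.
file_put_contents("//network/IS/folder1/folder2/targetdirectory/newFile.txt", $json);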
I have a script which goes through every filename in a directory and removes unwanted files. Among other things, it matches filenames against a CSV file and then removes the file named in an adjacent cell.
<?php
$gameList = trim(shell_exec("ls -a -1 -I . -I .. -I Removed"));
$gameArray = explode("\n", $gameList);
shell_exec('mkdir -p Removed');
echo "\033[01;32m\n\n<<<<<<<<<<Starting Scan>>>>>>>>>>\n\n";

// Do the magic for every file
foreach ($gameArray as $thisGame) {
    // Ensure continue if already removed
    if (!$thisGame) continue;
    if (!file_exists($thisGame)) continue;

    // Manipulate names to look and play nice
    $niceName = trim(preg_replace('%[\(|\[].*%', '', $thisGame));
    $thisGameNoExtension = preg_replace('/\.' . preg_quote(pathinfo($thisGame, PATHINFO_EXTENSION), '/') . '$/', '', $thisGame);

    // Let the user know what game is being evaluated
    echo "\033[00;35m{$thisGameNoExtension} ... ";

    $f = fopen("ManualRegionDupes.csv", "r");
    while ($row = fgetcsv($f)) {
        if ($row[1] == $thisGameNoExtension) {
            $primaryFile = $row[0];
            $ext = pathinfo($thisGame, PATHINFO_EXTENSION);
            $fCheck = trim(shell_exec("ls -1 " . escapeshellarg($primaryFile) . "." . escapeshellarg($ext) . " 2>/dev/null"));
            if ($fCheck) {
                echo "ECHO LION";
                shell_exec("mv " . escapeshellarg($thisGame) . " Removed/");
                continue;
            } else {
                echo "ECHO ZEBRA";
                continue;
            }
            break;
        }
    }
    fclose($f);
    echo "Scanned and kept";
}
echo "\033[01;32m\n<<<<<<<<<<Process Complete>>>>>>>>>>\n\n";
?>
It's working; however, I don't understand why I am seeing the final echo "Scanned and kept" straight after either "ECHO ZEBRA" or "ECHO LION", as I have "continue" calls straight after them, which should restart the loop and move on to the next file. I'm sure I just need to re-jig something somewhere, but I've been fiddling for hours and I'm completely stumped. I'd be super grateful for any help! Many thanks!
The continue is only working on the inner loop, which is reading the individual lines. If you want to skip to the end of the outer loop, you will need to use...
continue 2;
This allows you to say continue for two levels of loops.
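As a generic sketch of the semantics (variable names and loop bodies here are hypothetical):
foreach ($games as $game) {            // outer loop
    while ($row = fgetcsv($f)) {       // inner loop
        if ($row[1] == $game) {
            continue 2;                // resumes the foreach with the next $game
        }
        // a plain "continue" here would only resume this inner while loop
    }
}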
You have nothing but a break after your continue calls; what else would it run? Edit: if you want the continue to apply to the outer loop, move it there. The continue is simply in the wrong loop.
I'm using the ua-parser library to identify the device family for a number of user-agent strings in a spreadsheet column. The problem I'm running into is that my function doesn't seem to really be running: the value output by detectAgent($data[2]) is not always accurate.
Here's a code sample. I feel like I must be missing something related to the limitations of creating objects over and over again.
Thanks in advance for any help.
<?php
require_once 'vendor/autoload.php';

use UAParser\Parser;

function detectAgent($ua) {
    $parser = Parser::create();
    $result = $parser->parse($ua);
    return $result->os->family;
}

$input_file = "input.csv";
$output_file = "output.csv";

if (($handle1 = fopen($input_file, "r")) !== FALSE) {
    if (($handle2 = fopen($output_file, "w")) !== FALSE) {
        while (($data = fgetcsv($handle1, 5000000, ",")) !== FALSE) {
            // Alter your data
            #print $data . "<br />";
            $data[2] = detectAgent($data[2]); // identify OS family
            // Write back in CSV format
            fputcsv($handle2, $data);
        }
        fclose($handle2);
    }
    fclose($handle1);
}
?>
This was a silly mistake. I was writing to the wrong column in $data[2] = detectAgent($data[2]);.
If anyone else runs into the same problem, the code is working now and I've posted an example here.
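One side note on the code above: Parser::create() rebuilds the parser (re-reading its regex definitions) on each call, so with many rows it is cheaper to create it once and reuse it. A minimal sketch of that change, reusing the file handles from the question:
$parser = Parser::create(); // build once, outside the loop
while (($data = fgetcsv($handle1, 5000000, ",")) !== FALSE) {
    $data[2] = $parser->parse($data[2])->os->family; // OS family, e.g. "Windows"
    fputcsv($handle2, $data);
}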
I have a PHP script that steps through a folder containing tab-delimited files, parsing them line by line and inserting the data into a MySQL database. I cannot use LOAD DATA INFILE because of security restrictions on my server, and I do not have access to the configuration files. The script works just fine parsing 1 or 2 smaller files, but when working with several large files I get a 500 error. There do not appear to be any error logs containing messages pertaining to the error, at least none that my hosting provider gives me access to. Below is the code; I am also open to suggestions for alternate ways of doing what I need to do. Ultimately I want this script to fire off every 30 minutes or so, inserting new data and deleting the files when finished.
EDIT: After making the changes Phil suggested, the script still fails, but I now have the following message in my error log: "mod_fcgid: read data timeout in 120 seconds". It looks like the script is timing out; any idea where I can change the timeout setting?
$folder = opendir($dir);
while (($file = readdir($folder)) !== false) {
    $filepath = $dir . "/" . $file;
    // If it is a file and ends in txt, parse it and insert the records into the db
    if (is_file($filepath) && substr($filepath, strlen($filepath) - 3) == "txt") {
        uploadDataToDB($filepath, $connection);
    }
}
function uploadDataToDB($filepath, $connection) {
    ini_set('display_errors', 'On');
    error_reporting(E_ALL);
    ini_set('max_execution_time', 300);

    $insertString = "INSERT INTO dirty_products values(";
    $count = 1;
    $file = @fopen($filepath, "r");
    while (($line = fgets($file)) !== false) {
        $values = "";
        $valueArray = explode("\t", $line);
        foreach ($valueArray as $value) {
            // Escape single quotes
            $value = str_replace("'", "\'", $value);
            if ($values != "")
                $values = $values . ",'" . $value . "'";
            else
                $values = "'" . $value . "'";
        }
        mysql_query($insertString . $values . ")", $connection);
        $count++;
    }
    fclose($file);
    echo "Count: " . $count . "</p>";
}
First thing I'd do is use prepared statements (using PDO).
Using the mysql_query() function, you're creating and parsing a new statement for every insert, and you may be exceeding a server-side limit.
If you use a prepared statement, only one statement is created and compiled on the database server.
Example
function uploadDataToDB($filepath, $connection) {
    ini_set('display_errors', 'On');
    error_reporting(E_ALL);
    ini_set('max_execution_time', 300);

    $db = new PDO(/* DB connection parameters */);
    // match the number of placeholders to the number of TSV fields
    $stmt = $db->prepare('INSERT INTO dirty_products VALUES (?, ?, ?, ?, ?, ?)');

    $count = 1;
    $file = @fopen($filepath, "r");
    while (($line = fgets($file)) !== false) {
        $valueArray = explode("\t", $line);
        $stmt->execute($valueArray);
        $count++;
    }
    fclose($file);
    $db = null;
    echo "Count: " . $count . "</p>";
}
Considering you want to run this script on a schedule, I'd avoid the web server entirely and run the script via the CLI using cron or whatever scheduling service your host provides. This will help you avoid any timeout configured in the web server.
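For example, a crontab sketch matching the 30-minute schedule mentioned in the question (the PHP binary and script paths are assumptions):
*/30 * * * * /usr/bin/php /home/user/scripts/import_tsv.php >> /home/user/logs/import.log 2>&1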