Run PHP script in background using GET request to avoid timeout

I have a PHP script which downloads a CSV file and updates stock for products on PrestaShop.
The problem is that this script can only be executed through a GET request, and there is a timeout on the server side which causes an error.
I can't increase the timeout, so I have to figure out a workaround.
My idea is to make this script run the import snippet in the background (in another process), so the script itself ends almost immediately (i.e. no timeout) while the import keeps running in the background.
Is it possible?
My script:
<?php
#ini_set('max_execution_time', 0);
include(dirname(__FILE__) . '/config/config.inc.php');
include(dirname(__FILE__) . '/init.php');

$url = 'URL';
$file_name = basename($url);

if (file_put_contents($file_name, file_get_contents($url))) {
    echo "File downloaded successfully";
} else {
    echo "File downloading failed.";
    // die('error');
}
echo "\n";
echo "<br>";

$row = 0;
if (($handle = fopen($file_name, "r")) !== FALSE) {
    while (($data = fgetcsv($handle, 1000, ";")) !== FALSE) {
        $row++;
        // skip the first line (CSV header)
        if ($row == 1) {
            continue;
        }
        // only rows flagged 'Suk' or 'plus size' are of interest
        if (!($data[5] == 'Suk' || $data[5] == 'plus size')) {
            continue;
        }
        // skip rows without a reference
        if (empty($data[9])) {
            continue;
        }
        // get the attribute from the PrestaShop database
        $productAttribut = findProductAttributByReference($data[9]);
        // if the product attribute exists, update its quantity
        if (!empty($productAttribut)) {
            StockAvailable::setQuantity(
                (int)$productAttribut['id_product'],
                (int)$productAttribut['id_product_attribute'],
                (int)$data[10],
                Context::getContext()->shop->id
            );
            echo "product " . $productAttribut['id_product_attribute'] . " quantity updated";
            echo "\n";
            echo "<br>";
        }
    }
    fclose($handle);
    echo "\n";
    echo "<br>";
    echo "end script ";
}

function findProductAttributByReference($reference)
{
    // pSQL() escapes the value before it is embedded in the query
    $sql = '
        SELECT *
        FROM `' . _DB_PREFIX_ . 'product_attribute`
        WHERE `reference` = "' . pSQL($reference) . '"';

    return Db::getInstance()->getRow($sql);
}
?>

No, you can't.
As Martin Paucot said in a comment, PHP is blocking: it runs instructions one by one, and each instruction waits for the previous one to finish.
So your main script will be waiting for the import script to finish its execution.
Possible Way
Use a CRON job to run the CSV job in the background and push a notification once the job has completed. The user is then notified when the job is done and can download the file using a link. (You will need to store some records in the database.)
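A minimal sketch of that approach, assuming a hypothetical import_jobs table (id, csv_url, status) and PrestaShop 1.5+'s Db insert/update helpers; the names are illustrative, not part of the original question:
<?php
// queue.php -- the GET endpoint only records the job, so it returns immediately
include(dirname(__FILE__) . '/config/config.inc.php');
include(dirname(__FILE__) . '/init.php');

Db::getInstance()->insert('import_jobs', array(
    'csv_url' => pSQL('URL'),
    'status'  => 'pending',
));
echo "Import queued";

<?php
// worker.php -- run from cron (e.g. */5 * * * * php /path/to/worker.php);
// CLI runs are not subject to the web server's timeout
include(dirname(__FILE__) . '/config/config.inc.php');
include(dirname(__FILE__) . '/init.php');
ini_set('max_execution_time', 0);

$job = Db::getInstance()->getRow(
    'SELECT * FROM `' . _DB_PREFIX_ . 'import_jobs` WHERE `status` = "pending"'
);
if ($job) {
    Db::getInstance()->update('import_jobs', array('status' => 'running'), 'id = ' . (int)$job['id']);
    // ... download the CSV and update stock here, exactly as in the question's script ...
    Db::getInstance()->update('import_jobs', array('status' => 'done'), 'id = ' . (int)$job['id']);
}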

Related

Checking if multiple files exist, and if so create another

I'm having a problem trying to figure out a way to check for a file and, if it exists, check the next one, and so on.
I'm not familiar enough with loops, so I'm looking for help.
I'm putting together a small set of PHP forms to help me do rental inspections. The form will create a page for each area/item in the inspection, with a photo and a description of the problem, if any. The form for the photos is already working and not part of this.
I'm matching the paper form they will have me use. This could save me an hour or so on the Windows spreadsheet they would want me to put everything in and then print to PDF.
My laptop/PC is Linux. It would also save me having to get a Windows machine or tablet.
I have everything else working; just the page creation is giving me fits. I understand a loop should be the easiest way to avoid writing the file_exists check for each page, up to 40 pages.
Here is a snip of where I'm at. The location this is sitting in is not publicly accessible.
Thanks in advance
Bob
<?php
// This will be accessed for each area/item inspected
$dirPath = "/localhost/rent.inspect/";

// Get POST info from form page
$a1 = $_POST["a1"];
$a2 = $_POST["a2"];
$a3 = $_POST["a3"];
$a4 = $_POST["a4"];
// ...
$a40 = $_POST["a40"];

// File names we write to can be any name
$FileName1 = "$dirPath/page1.php";
$FileName2 = "$dirPath/page2.php";
$FileName3 = "$dirPath/page3.php";
$FileName4 = "$dirPath/page4.php";
// ...
$FileName39 = "$dirPath/page39.php";
$FileName40 = "$dirPath/page40.php";

// Check if the first file is already created.
// If not, create it and write; if it does exist, check the
// next file. Keep checking until one not yet created is found.
// Should never get to the 40th file at this time.

// Check if the first file has already been created.
if (file_exists($FileName1)) {
    if (file_exists($FileName2)) {
        if (file_exists($FileName3)) {
            // Check for the next one... Should never see 40,
            // but keep checking up to it just in case something is added.
        } else {
            $myfile = fopen($FileName3, "w") or die("Unable to open file!");
            $txt = "<font size=\"2\">$a1 $a2 $a3</font><br /><br />";
            fwrite($myfile, $txt);
            fclose($myfile);
        }
    } else {
        $myfile = fopen($FileName2, "w") or die("Unable to open file!");
        $txt = "<font size=\"2\">$a1 $a2 $a3</font><br /><br />";
        fwrite($myfile, $txt);
        fclose($myfile);
    }
} else {
    $myfile = fopen($FileName1, "w") or die("Unable to open file!");
    $txt = "<font size=\"2\">$a1 $a2 $a3</font><br /><br />";
    fwrite($myfile, $txt);
    fclose($myfile);
}
?>
Several ways to do so; an easy one fitting your current problem could be:
for ($i = 1; $i < 41; ++$i) {
    if (!file_exists($dirPath . '/page' . $i . '.php')) {
        // fopen, fwrite, fclose ...
        break;
    }
}
You could also improve your variable initialization by using an array to store your values, all the more since the only thing changing is an incrementing integer.
Here is an example, not really useful on its own, but showing how you could do it:
for ($i = 1; $i <= 40; ++$i) {
    $myVar['a' . $i] = $_POST['a' . $i];
}
You can check until the file exists and increment the counter:
$filepath = "/somepath/";
$filename = "FileNam";
$i = 1;
$pathtocheck = $filepath . $filename . $i;
while (file_exists($pathtocheck)) {
    $i++;
    $pathtocheck = $filepath . $filename . $i;
}
// your code for the file write goes here
// this checks whether the file exists; if not, the while loop ends,
// otherwise it keeps going as long as files like FileNam1, FileNam2, and so on exist
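Putting both ideas together, a minimal sketch of the whole flow under the question's assumptions ($dirPath and the a1..a40 POST fields come from the original post):
<?php
$dirPath = "/localhost/rent.inspect/";

// Collect the posted fields into an array instead of 40 separate variables
$a = array();
for ($i = 1; $i <= 40; ++$i) {
    $a[$i] = isset($_POST['a' . $i]) ? $_POST['a' . $i] : '';
}

// Create and write the first page file that does not exist yet
for ($i = 1; $i <= 40; ++$i) {
    $file = $dirPath . '/page' . $i . '.php';
    if (!file_exists($file)) {
        $myfile = fopen($file, 'w') or die('Unable to open file!');
        $txt = '<font size="2">' . $a[1] . ' ' . $a[2] . ' ' . $a[3] . '</font><br /><br />';
        fwrite($myfile, $txt);
        fclose($myfile);
        break; // stop after creating one page
    }
}
?>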

Doctrine2 Batch processing

I'm trying to import a text file with 10000+ lines into the DB. I've found a manual here. But my script ends just before reaching the final flush, without any error. The boolean parameter in my custom flush method decides whether the clear method is called after flushing.
Code:
$handle = fopen($file, 'r');
if ($handle != FALSE) {
    // Clear entities
    $clear = 1;
    // Read line
    while (($data = fgets($handle)) !== FALSE) {
        $entity = $this->_createEntity($data);
        echo $clear . '<br>';
        $this->getPresenter()->getService('mapService')->persist($entity);
        if ($clear % 100 == 0) {
            echo 'saving...<br>';
            $this->getPresenter()->getService('mapService')->flush(TRUE); // Flush and clear
        }
        $clear++;
    }
    echo 'end...'; // Script ends before reaching this line
    $this->getPresenter()->getService('mapService')->flush(); // Final flush
    echo '...ed';
    fclose($handle);
}
Custom Flush method:
public function flush($clear = FALSE) {
    $this->db->flush();
    if ($clear) {
        $this->db->clear();
    }
}
Echo output:
1
...
9998
9999
10000
saving...
But neither end... nor ...ed is printed.
Thanks a lot in advance.
EDIT
I've changed the number of lines to process in one batch from 10k to 5000. It's OK now. But I still wonder why 10k is "too much" for PHP or Doctrine.
Try using feof:
$handle = fopen($file, 'r');
if ($handle != FALSE) {
    // Clear entities
    $clear = 1;
    // Read line
    //while (($data = fgets($handle)) !== FALSE) {
    while (!feof($handle)) {
        $data = fgets($handle);
        $entity = $this->_createEntity($data);
        echo $clear . '<br>';
        $this->getPresenter()->getService('mapService')->persist($entity);
        if ($clear % 100 == 0) {
            echo 'saving...<br>';
            $this->getPresenter()->getService('mapService')->flush(TRUE); // Flush and clear
        }
        $clear++;
    }
    echo 'end...';
    $this->getPresenter()->getService('mapService')->flush(); // Final flush
    echo '...ed';
    fclose($handle);
}
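If the script really does die silently around the final flush, one plausible culprit (an assumption, since the question reports no error message) is the memory limit: Doctrine's DBAL keeps every executed query in its SQL logger by default, which grows with each of the 10000+ inserts. Disabling it is a one-liner, assuming you can reach the EntityManager behind 'mapService':
// $em is assumed to be the Doctrine EntityManager used by 'mapService'.
// Stop DBAL from accumulating every executed query in memory during the batch run:
$em->getConnection()->getConfiguration()->setSQLLogger(null);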

Execute the scripts at the same time if-else problem

I can't use flock at the moment (server restrictions), so I am creating an alternative file lock system. Here is my code.
$dir = "C:\\wamp\\www\\test\\";
$files = scandir($dir);
for ($i = 0; $i < count($files); $i++) {
    if (substr(strrchr($files[$i], '.'), -4) == '.csv') {
        echo "File " . $files[$i] . " is a csv" . "<br>";
        if (file_exists("$dir$files[$i].lock")) {
            // locked: skip this file; the loop's own $i++ moves on to the next one
            echo $files[$i] . " has lock in place" . "<br>";
        } else {
            if ($file_handle = fopen("$dir$files[$i]", "rb")) {
                $file_lock_handle = fopen("$dir$files[$i].lock", "w");
                echo "Setting Lock" . "<br>";
                // Do logic
                fclose($file_handle);
                fclose($file_lock_handle);
                sleep(3);
                unlink("$dir$files[$i].lock");
            }
        }
    } else {
        // Do nothing
    }
}
If I run these scripts side by side, the second waits for the first script to finish before it executes. How can I run them concurrently? I.e. if a lock exists, I want it to skip that file and go to the next one.
There is a good example of this here: http://www.php.net/manual/en/function.flock.php#92731
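Since flock() itself is off the table, here is a minimal sketch of the lock-file idea done atomically, reusing the $dir and $files loop from the question: fopen() with mode 'x' creates the file and fails if it already exists, so the check and the creation happen in one step and two concurrent scripts cannot both acquire the lock.
$lockName = "$dir$files[$i].lock";
// 'x' mode: create the file, fail if it already exists (atomic check-and-create)
$lock = @fopen($lockName, 'x');
if ($lock === false) {
    // Another process holds the lock; skip this file, the loop moves on
    echo $files[$i] . " has lock in place<br>";
} else {
    // ... open and process the CSV here ...
    fclose($lock);
    unlink($lockName);
}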

While statement with file_exists / is_file doesn't process completely

Strange issue I'm having: when I perform a file check with file_exists or is_file, it only checks half the files... What my script does is process a CSV file and insert the data into the table, but only if the corresponding file exists on the server. If I remove the file check, everything processes fine. I've double-checked to make sure all the files exist on the server; it just stops halfway through for some reason.
$column_headers = array();
$row_count = 0;
if (mysql_result(
    mysql_query(
        "SELECT count(*) FROM load_test WHERE batch_id='" . $batchid . "'"
    ), 0
) > 0) {
    die("Error batch already present");
}
while (($data = fgetcsv($handle, 0, ",")) !== FALSE) {
    if ($row_count == 0) {
        $column_headers = $data;
    } else {
        $dirchk1 = "/temp/files/" . $batchid . "/" . $data[0] . ".wav";
        $dirchk2 = "/files/" . $batchid . "/" . $data[1] . ".wav";
        if (file_exists($dirchk1)) {
            $importword = "INSERT into load_test SET
                word = '" . $data[2] . "',
                batch_id = UCASE('" . $batchid . "'),
                accent = '" . $data[15] . "'
            ";
            mysql_query($importword);
            $word_id = mysql_insert_id();
            echo $word_id . "\n";
        }
    }
    ++$row_count;
}
Try it using the "-e" test condition.
For example:
if (-e $dirchk1) {
    print "File exists\n";
}
Also make sure your variables ($dirchk1, etc.) are getting populated correctly.
Please check whether it works or not.
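For what it's worth, -e is a Perl/shell file-test operator rather than PHP; in PHP the same check is spelled with file_exists(), as the question's code already does:
if (file_exists($dirchk1)) {
    print "File exists\n";
}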
The script processed correctly; human error when verifying on my part.

PHP Memory Exhaustion error, poor code or just increase memory limit?

I am trying to read 738627 records from a flat file into MySQL. The script appears to run fine, but is giving me the above memory errors.
A sample of the file is below (the fields are separated by chr(1), which is invisible in the original paste; shown here as spaces):
#export_date genre_id application_id is_primary
#primaryKey: genre_id application_id
#dbTypes: BIGINT INTEGER INTEGER BOOLEAN
#exportMode: FULL
1276678802857 6000 281731735 0
1276678802857 6000 281826146 1
1276678802857 6000 282537230 1
1276678802857 6000 282778557 0
1276678802857 6000 282793024 1
1276678802857 6000 282798786 1
1276678802857 6000 282808979 1
1276678802857 6000 282816836 1
1276678802857 6000 282819204 1
1276678802857 6000 282914454 1
1276678802857 6000 282935151 1
I have tried increasing the allowed memory using
ini_set("memory_limit","80M");
and it still fails. Do I keep upping this until it runs?
The code in full is:
<?php
ini_set("memory_limit", "80M");
$db = mysql_connect("localhost", "uname", "pword");
// test connection
if (!$db) {
    echo "Couldn't make a connection!";
    exit;
}
// select database
if (!mysql_select_db("dbname", $db)) {
    echo "Couldn't select database!";
    exit;
}
mysql_set_charset('utf8', $db);
$delimiter = chr(1);
$eoldelimiter = chr(2) . "\n";
$fp = fopen('genre_application', 'r');
if (!$fp) {
    echo 'ERROR: Unable to open file.</table></body></html>';
    exit;
}
$loop = 0;
while (!feof($fp)) {
    $loop++;
    $line = stream_get_line($fp, 128, $eoldelimiter); // use 2048 if very long lines
    if ($line[0] === '#') continue; // skip lines that start with #
    $field[$loop] = explode($delimiter, $line);
    $export_date = $field[$loop][0];
    $genre_id = $field[$loop][1];
    $application_id = $field[$loop][2];
    $query = "REPLACE into genre_apps
              (export_date, genre_id, application_id)
              VALUES ('$export_date','$genre_id','$application_id')";
    print "SQL-Query: " . $query . "<br>";
    if (mysql_query($query, $db)) {
        echo " OK !\n";
    } else {
        echo "Error<br><br>";
        echo mysql_errno() . ":" . mysql_error() . "</font></center><br>\n";
    }
}
fclose($fp);
?>
Your loop fills the variable $field for no reason (it writes to a different cell on every loop iteration), thereby using up more memory with every line.
You can replace:
$field[$loop] = explode ($delimiter, $line);
$export_date = $field[$loop][0];
$genre_id = $field[$loop][1];
$application_id = $field[$loop][2];
With:
list($export_date, $genre_id, $application_id) = explode($delimiter, $line);
For improved performance, you could take advantage of the ability to insert several lines using REPLACE INTO by grouping N lines into a single query.
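A rough sketch of that grouping, staying with the question's mysql_* API; the batch size is illustrative and should respect MySQL's max_allowed_packet:
$batch = array();
$batchSize = 500; // illustrative; tune to your server

while (!feof($fp)) {
    $line = stream_get_line($fp, 128, $eoldelimiter);
    if ($line === FALSE || $line === '' || $line[0] === '#') continue;
    list($export_date, $genre_id, $application_id) = explode($delimiter, $line);
    // Accumulate escaped value tuples instead of issuing one query per row
    $batch[] = "('" . mysql_real_escape_string($export_date, $db) . "','"
             . mysql_real_escape_string($genre_id, $db) . "','"
             . mysql_real_escape_string($application_id, $db) . "')";
    if (count($batch) >= $batchSize) {
        mysql_query("REPLACE INTO genre_apps (export_date, genre_id, application_id)
                     VALUES " . implode(',', $batch), $db);
        $batch = array(); // release the accumulated rows
    }
}
if ($batch) { // flush the remainder
    mysql_query("REPLACE INTO genre_apps (export_date, genre_id, application_id)
                 VALUES " . implode(',', $batch), $db);
}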
