I can't use flock at the moment (server restrictions), so I am creating an alternative file lock system. Here is my code:
$dir = "C:\\wamp\\www\\test\\";
$files = scandir($dir);
for($i=0; $i<count($files); $i++)
{
if(substr(strrchr($files[$i],'.csv'),-4) == '.csv')
{
echo "File ".$files[$i]." is a csv"."</br>";
if (file_exists("$dir$files[$i].lock"))
{
echo $files[$i]." has lock in place"."</br>";
$i++;
}
else
{
if($file_handle = fopen("$dir$files[$i]", "rb"))
{
$file_lock_handle = fopen("$dir$files[$i].lock", "w");
echo "Setting Lock"."</br>";
//Do Logic
fclose($file_handle);
fclose($file_lock_handle);
sleep(3);
unlink("$dir$files[$i].lock");
}
}
}
else
{
//Do nothing
}
}
If I run these scripts side by side, the second waits for the first to finish before it executes. How can I run them concurrently? i.e. if a lock exists I want it to skip that file and go to the next one.
There is a good example of this here: http://www.php.net/manual/en/function.flock.php#92731
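For comparison, here is a minimal sketch of a skip-if-locked loop that does not rely on flock(). It uses the same lock-file idea: fopen() in 'x' mode creates the lock file only if it does not already exist, so a second copy of the script fails to create it and simply moves on to the next CSV (illustrative only; only the directory comes from the question):

$dir = "C:\\wamp\\www\\test\\";

foreach (scandir($dir) as $file) {
    if (substr($file, -4) !== '.csv') {
        continue; // not a CSV file
    }

    $lock = $dir . $file . '.lock';
    $lockHandle = @fopen($lock, 'x'); // 'x' mode fails if the lock file already exists

    if ($lockHandle === false) {
        echo "Skipping $file, lock in place<br>";
        continue; // another process is working on this file
    }

    // ... Do Logic on $dir . $file ...

    fclose($lockHandle);
    unlink($lock); // release the lock
}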
I have a PHP script which downloads a CSV file and updates stock for products on PrestaShop.
The problem is that this script can only be executed through a GET request, and there is a timeout on the server side which causes an error.
I can't increase the timeout, so I have to figure out a workaround.
My idea is to have this script run the import snippet in the background (in another process), so the script itself ends almost immediately (i.e. no timeout) while the import keeps running in the background.
Is it possible?
My script:
<?php
#ini_set('max_execution_time', 0);
include(dirname(__FILE__) . '/config/config.inc.php');
include(dirname(__FILE__) . '/init.php');

$url = 'URL';
$file_name = basename($url);

if (file_put_contents($file_name, file_get_contents($url))) {
    echo "File downloaded successfully";
} else {
    echo "File downloading failed.";
    // die('error');
}
echo "\n";
echo "<br>";

$row = 1;
if (($handle = fopen($file_name, "r")) !== FALSE) {
    while (($data = fgetcsv($handle, 1000, ";")) !== FALSE) {
        $num = count($data);
        $row++;
        for ($c = 0; $c < $num; $c++) {
            // skip the first line (CSV header)
            if ($row == 1 || $row == 2) {
                continue;
            }
            if (!($data[5] == 'Suk' || $data[5] == 'plus size')) {
                continue;
            }
            // get the attribute from the PrestaShop database
            if (empty($data[9])) {
                continue;
            }
            $productAttribut = findProductAttributByReference($data[9]);
            // if the product attribute exists
            if (!empty($productAttribut)) {
                echo $productAttribut['id_product_attribute'];
                // update quantity
                StockAvailable::setQuantity((int)$productAttribut['id_product'], (int)$productAttribut['id_product_attribute'], (int)$data[10], Context::getContext()->shop->id);
                echo "product " . $productAttribut['id_product_attribute'] . " quantity \n updated";
                echo "\n";
                echo "<br>";
            }
        }
    }
    fclose($handle);
    echo "\n";
    echo "<br>";
    echo "end script ";
}

function findProductAttributByReference($reference)
{
    $sql = '
        SELECT *
        FROM `' . _DB_PREFIX_ . 'product_attribute`
        WHERE `reference` = "' . $reference . '"';

    $result = Db::getInstance()->getRow($sql);
    return $result;
}
?>
No, you can't.
As @Martin Paucot said in the comments, PHP is blocking: it runs instructions one by one, and each instruction waits for the previous one to finish.
So your main script will wait for the import script to finish its execution.
Possible Way
Use a CRON job to process the CSV in the background and push a notification once the job is completed. The user is then notified when the job is done and can download the file using a link. (You will need to store some records in the database.)
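As a hedged sketch of that CRON approach (the paths, directory names and schedule below are assumptions for illustration, not part of the original setup): the web request only saves the CSV into a queue directory, and a crontab entry such as

*/5 * * * * php /var/www/shop/cli/import_stock.php >> /var/log/stock_import.log 2>&1

runs a CLI script that does the actual work:

<?php
// import_stock.php - cron-driven importer (sketch; paths are assumptions)
$queueDir = '/var/www/shop/import_queue/';
$doneDir  = '/var/www/shop/import_done/';

foreach (glob($queueDir . '*.csv') as $csv) {
    // ... reuse the fgetcsv()/StockAvailable::setQuantity() loop from the question here ...

    // move the file so the next cron run does not process it again
    rename($csv, $doneDir . basename($csv));

    // optionally record completion in the database so the user can be notified
}

Because the CLI script runs outside the web server, the GET request returns immediately and the server-side timeout no longer applies.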
Here is my code, along with the data file it works on.
It works without problems if, let's say, I just call
update.php?pokemon=pikachu
It updates the pikachu value in my found.txt by +0.0001.
But here is my problem: when I have multiple threads running and, randomly, two requests come in at the same time, for example
update.php?pokemon=pikachu
and
update.php?pokemon=zaptos
then I see that found.txt is empty, so nothing is written to it anymore.
So I guess it's a bug when the PHP file is open and another request is posted to the server.
How can I solve this problem? It occurs often.
found.txt
pikachu:2.2122
arktos:0
zaptos:0
lavados:9.2814
blabla:0
update.php
<?php
$file = "found.txt";
$fh = fopen($file, 'r+');
$gotPokemon = $_GET['pokemon'];
$users = '';

while (!feof($fh)) {
    $user = explode(':', fgets($fh));
    $pokename = trim($user[0]);
    $infound = trim($user[1]);

    // check for empty indexes
    if (!empty($pokename)) {
        if ($pokename == $gotPokemon) {
            if ($gotPokemon == "Pikachu") {
                $infound += 0.0001;
            }
            if ($gotPokemon == "Arktos") {
                $infound += 0.0001;
            }
            if ($gotPokemon == "Zaptos") {
                $infound += 0.0001;
            }
            if ($gotPokemon == "Lavados") {
                $infound += 0.0001;
            }
        }
        $users .= $pokename . ':' . $infound;
        $users .= "\r\n";
    }
}

file_put_contents('found.txt', $users);
fclose($fh);
?>
I would create an exclusive lock after opening the file and then release the lock before closing the file.
For creating an exclusive lock over the file:
flock($fh, LOCK_EX);
To release it:
flock($fh, LOCK_UN);
Anyway, you will need to check whether another thread already holds the lock, so the first idea that comes up is to make a few attempts to get the lock and, if it is still not possible, to inform the user, throw an exception or take whatever other action avoids an infinite loop:
$fh = fopen("found.txt", "w+");
$attempts = 0;
do {
$attempts++;
if ($attempts > 5) {
// throw exception or return response with http status code = 500
}
if ($attempts != 1) {
sleep(1);
}
} while (!flock($fh, LOCK_EX));
// rest of your code
file_put_contents('found.txt', $users);
flock($fh, LOCK_UN); // release the lock
fclose($fh);
Update
Probably the issue still remains because of the reading part, so let's also take a shared lock before starting to read, and let's simplify the code:
$file = "found.txt";
$fh = fopen($file,'r+');
$gotPokemon = $_GET['pokemon'];
$users = '';
$wouldblock = true;
// we add a shared lock for reading
$locked = flock($fh, LOCK_SH, $wouldblock); // it will wait if locked ($wouldblock = true)
while(!feof($fh)) {
// your code inside while loop
}
// we add an exclusive lock for writing
flock($fh, LOCK_EX, $wouldblock);
file_put_contents('found.txt', $users);
flock($fh, LOCK_UN); // release the locks
fclose($fh);
Let's see if it works
Let's consider a sample PHP script which deletes a line specified by user input:
$DELETE_LINE = $_GET['line'];
$out = array();
$data = @file("foo.txt");

if ($data) {
    foreach ($data as $line) {
        if (trim($line) != $DELETE_LINE) {
            $out[] = $line;
        }
    }
}

$fp = fopen("foo.txt", "w+");
flock($fp, LOCK_EX);
foreach ($out as $line) {
    fwrite($fp, $line);
}
flock($fp, LOCK_UN);
fclose($fp);
I want to know: if one user is currently executing this script and the file "foo.txt" is locked, and at the same time (or before that execution completes) another user calls this script, what will happen?
Will the second user's process wait for the first user to unlock the file, or will the line deletion requested by the second user fail?
If you try to acquire an exclusive lock while another process has the file locked, your attempt will wait until the file is unlocked. This is the whole point of locking.
See the Linux documentation of flock(), which describes how it works in general across operating systems. PHP uses fcntl() under the hood so NFS shares are generally supported.
There's no timeout. If you want to implement a timeout yourself, you can do something like this:
$count = 0;
$timeout_secs = 10; // number of seconds of timeout
$got_lock = true;

while (!flock($fp, LOCK_EX | LOCK_NB, $wouldblock)) {
    if ($wouldblock && $count++ < $timeout_secs) {
        sleep(1);
    } else {
        $got_lock = false;
        break;
    }
}

if ($got_lock) {
    // Do stuff with file
}
After running the command copy($uploadedFile, "pdf/".$fullFileName);, what would be the quickest and most efficient way to verify that the file copied successfully?
This would be enough, no?
if (!copy($file, $newfile)) {
echo "failed to copy $file...\n";
}
ref: http://php.net/manual/en/function.copy.php
If you look at the copy function in the PHP documentation, you'll see:
Returns TRUE on success or FALSE on failure.
So, something as simple as:
if(!copy($uploadedFile, "pdf/".$fullFileName)) {
// Failure code
}
Or:
$returnCode = copy($uploadedFile, "pdf/".$fullFileName);
if(!$returnCode) {
// Failure code
}
would be sufficient.
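If you want an extra check beyond the boolean return value (purely illustrative; the answers above rely on the return value alone), you can compare the source and destination after the copy:

$source = $uploadedFile;
$dest   = "pdf/" . $fullFileName;

if (!copy($source, $dest)) {
    // Failure: copy() itself reported an error
} else {
    clearstatcache(); // make sure filesize() is not served from the stat cache
    if (filesize($source) !== filesize($dest)) {
        // Sizes differ: the copy is incomplete or truncated
    } elseif (hash_file('sha256', $source) !== hash_file('sha256', $dest)) {
        // Contents differ (stricter but slower check)
    } else {
        // File copied successfully
    }
}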
You can compare sizes whilst copying - if the sizes are equal we can assume the copying is done.
$fs1 = $fs = '';
$filename = 'test.zip'; // copy from ftp or a slow copy..

if (ob_get_level() == 0) ob_start();

for ($i = 0; $i < 25; $i++) {
    echo "<hr> Compare \n";
    echo "<br>fs1: $fs1";

    $fs = '';
    $fs = filesize($filename);
    echo "<br>fs: $fs";

    if ($i > 0) {
        if ($fs1 === $fs) break;
    }

    $fs1 = $fs;

    ob_flush();
    flush();
    sleep(2);
    clearstatcache();
}

echo "<br>Done copying.";
ob_end_flush();
When a user uploads a file (users can upload multiple files), I run
exec('nohup php /main/apache2/work/upload/run.php &');
I am using nohup as it needs to be executed in the background.
In my original design, run.php scans the directory using scandir every time it is executed, gets an exclusive lock (LOCK_EX) on each file using flock, and uses LOCK_NB to skip a file if it already has a lock and go to the next one. If a lock is acquired on a file, //Do logic. The problem is that the server is missing the fcntl() library, and since flock uses that library for its locking mechanism, flock won't work at the moment. It's going to take a month or two to get that installed (I have no control over that).
So my workaround is to have a temporary file, lock.txt, that acts as a lock. If a filename appears in lock.txt, the script skips that file and goes to the next one.
$dir = "/main/apache2/work/upload/files/";
$files = scandir($dir);
$fileName = "lock.txt";
for($i=0; $i<count($files); $i++)
{
if(substr(strrchr($files[$i],'.csv'),-4) == '.csv')
{
if($file_handle = fopen("$fileName", "rb"))
{
while(!feof($file_handle))
{
$line = fgets($file_handle);
$line = rtrim($line);
if($line == "")
{
break;
}
else
{
if($files[$i] == $line)
{
echo "Reading from lock: ".$line."</br>";
$i++; //Go to next file
}
}
}
fclose($file_handle);
}
if($i >= count($files))
{
die("$i End of file");
}
if($file_handle = fopen("$fileName", "a+"))
{
if(is_writable($fileName))
{
$write = fputs($file_handle, "$files[$i]"."\n");
//Do logic
//Delete the file name - Stuck here
fclose($file_handle);
}
}
}
else
{
//Do nothing
}
}
How can I delete the filename from lock.txt?
More importantly, is there a better way to lock a file in PHP without using flock?
Having a shared lock database simply moves the locking problem to that file; it doesn't solve it.
A much better solution is to use one lock file per real file. If you want to lock access to myFile.csv, check file_exists('myFile.csv.lock'), touch('myFile.csv.lock') if it doesn't exist, and unlink('myFile.csv.lock') when you are done.
Now, there is a possible race-condition between file_exists() and touch(), which can be mitigated by storing the PID in the file and checking if getmypid() is indeed the process holding the lock.
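A minimal sketch of that per-file lock with the PID check (the helper names are made up for illustration, and posix_kill() assumes the POSIX extension is available; it also treats any unkillable PID as stale, which is acceptable when every request runs as the same web-server user):

// One ".lock" file per CSV, holding the owner's PID.
function acquireLock($csvPath)
{
    $lockPath = $csvPath . '.lock';

    // 'x' mode creates the file only if it does not already exist,
    // which shrinks the file_exists()/touch() race to a single call
    $fh = @fopen($lockPath, 'x');
    if ($fh === false) {
        // Lock exists: check whether the owning process is still alive
        $ownerPid = (int) @file_get_contents($lockPath);
        if ($ownerPid > 0 && !posix_kill($ownerPid, 0)) {
            unlink($lockPath);            // stale lock left by a dead process
            return acquireLock($csvPath); // retry once after cleaning up
        }
        return false; // someone else really holds the lock, skip this file
    }

    fwrite($fh, (string) getmypid()); // record who owns the lock
    fclose($fh);
    return true;
}

function releaseLock($csvPath)
{
    @unlink($csvPath . '.lock');
}

// usage inside the scandir() loop from the question:
// if (acquireLock($dir . $files[$i])) { /* Do logic */ releaseLock($dir . $files[$i]); }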