Check continuously if a file exists with PHP

I am working on a project that requires me to check for the existence of a file with PHP every 5 seconds, and if the file exists, redirect to it.
If it doesn't exist, the script should keep checking every 5 seconds instead.
How would I go about doing that? I know about file_exists(), but how do I make it check continuously rather than just once?

You can try something like this (assuming $file holds the path you are checking):
<?php
$x = 0;
$count = 5;
do {
    if (!file_exists($file)) {
        $x++;
        echo 'file loading';
        sleep(5); // delay execution for 5 seconds before the next check
    } else {
        header('Location: ' . $file);
        exit();
    }
} while ($x < $count); // caps how long the loop runs, to avoid a maximum-execution-timeout error

if ($x == $count) {
    echo 'file does not exist';
}
?>
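The same pattern can be factored into a reusable helper. A minimal sketch; the function name and parameters are placeholders, not part of the original question:

```php
<?php
// Hypothetical helper: poll for a file every $interval seconds,
// giving up after $maxAttempts checks so the script cannot loop forever.
function waitForFile(string $file, int $interval = 5, int $maxAttempts = 5): bool
{
    for ($i = 0; $i < $maxAttempts; $i++) {
        if (file_exists($file)) {
            return true;
        }
        sleep($interval); // pause before the next check
    }
    return false;
}

// Usage (before any output is sent, so header() still works):
// if (waitForFile($file)) { header('Location: ' . $file); exit(); }
// echo 'file does not exist';
```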

Related

Pass variable from within PHP function [duplicate]

This question already has answers here:
Reference: What is variable scope, which variables are accessible from where and what are "undefined variable" errors?
(3 answers)
Closed 4 years ago.
I'd like to report how many files get deleted by a function that I'm running within PHP via a cron task.
My current code is as follows:
<?php
function deleteAll($dir) {
    $counter = 0;
    foreach (glob($dir . '/*') as $file) {
        if (is_dir($file)) {
            deleteAll($file);
        } else if (is_file($file)) {
            // check if file is older than 14 days
            if ((time() - filemtime($file)) > (60 * 60 * 24 * 14)) {
                $counter = $counter + 1;
                unlink($file);
            }
        }
    }
}
deleteAll("directory_name");
// Write to log file to confirm completion
$fp = fopen("logthis.txt", "a");
fwrite($fp, $counter . " files deleted." . "\n");
fclose($fp);
?>
That makes sense to me with a VBA background, but the counter comes out null, I think, when written to my custom log file at the end. I presume there is some restriction on a shared hosting site on declaring the variable globally, or similar?
Appreciate any help! It's not the end of world if I can't count the deleted files, but it would be nice to log the output in the format I've chosen.
This doesn't work because of variable scope: in your example, $counter only exists inside your function.
function deleteAll($dir): int {
    $counter = 0; // start with zero
    /* Some code here */
    if (is_dir($file)) {
        $counter += deleteAll($file); // also add the recursive count
    }
    /* Some more code here */
    return $counter; // return the counter (at the end of the function)
}
$filesRemoved = deleteAll("directory_name");
Alternatively, if you want to send back more info, e.g. 'totalCheck', you can return an array of info:
function deleteAll($dir): array {
    // All code here
    return [
        'counter' => $counter,
        'totalFiles' => $allFilesCount
    ];
}
$removalStats = deleteAll("directory_name");
echo $removalStats['counter'] . ' files removed, total: ' . $removalStats['totalFiles'];
There are other solutions, like pass-by-reference, but you don't want those here.
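Putting the pieces together, the full function from the question could look like the sketch below. This is an illustration only; the 14-day cutoff and log file name are taken from the question, and the commented-out usage mirrors its logging:

```php
<?php
// Recursive delete of files older than 14 days, returning the count
// so the caller can log it ($counter stays inside the function's scope).
function deleteAll(string $dir): int
{
    $counter = 0;
    foreach (glob($dir . '/*') as $file) {
        if (is_dir($file)) {
            $counter += deleteAll($file);       // add the recursive count
        } elseif (is_file($file)) {
            if ((time() - filemtime($file)) > (60 * 60 * 24 * 14)) {
                if (unlink($file)) {
                    $counter++;                 // count only successful deletes
                }
            }
        }
    }
    return $counter;
}

// $deleted = deleteAll('directory_name');
// file_put_contents('logthis.txt', $deleted . " files deleted.\n", FILE_APPEND);
```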

Why in Windows might attempting to delete a file twice work?

I've been pulling my hair out trying to write a continuous integration script with PHP for the Windows machine I develop on.
Having cloned a Git repository, I was unable to make a script that deleted it all (the .git folder and everything in it). I was getting "Permission denied" errors.
It seemed intermittent. I tried Phing, but that failed too; it led me to this Phing ticket, so I'm not alone, but the attrib solution there didn't work for me.
I finally realised that it was just taking two attempts to delete some folders and/or files within it. So my PHP code that finally worked, was this:
<?php
function delTree($dir, $ignore = array()) {
    // no need to continue if $dir doesn't exist
    if (!file_exists($dir))
        return true;
    // must not continue if it's a link; trigger an error
    if (is_link($dir)) {
        trigger_error("Cannot delete $dir: it's a link.", E_ERROR);
        return false;
    }
    // if it's a file, delete it and return
    if (is_file($dir)) {
        return tryUnlink($dir, 2);
    }
    // it's a directory, so...
    // build an array of files/directories within it to delete
    $files = array_diff(
        scandir($dir), array('.', '..'), $ignore
    );
    // delete each directory within $dir
    foreach ($files as $file) {
        delTree("$dir/$file", $ignore);
    }
    // delete $dir itself
    return tryRmdir($dir, 2);
}

function tryUnlink($file, $attempts = 2) {
    $result = unlink($file);
    if (!$result) {
        if ($attempts > 1) {
            return tryUnlink($file, $attempts--);
        } else {
            trigger_error("Cannot delete file $file", E_ERROR);
            return false;
        }
    }
    return true;
}

function tryRmdir($dir, $attempts = 2) {
    $result = rmdir($dir);
    if (!$result) {
        if ($attempts > 1) {
            return tryRmdir($dir, $attempts--);
        } else {
            trigger_error("Cannot delete directory $dir", E_ERROR);
            return false;
        }
    }
    return true;
}
And calling them with the $attempts argument set to 2 solved everything (12 hours later).
I'd tried things like chmoding the file to 0666, closing the IDE, closing SourceTree, any open explorer windows, wearing a tin foil hat, and even calling exec() with commands like:
rm -r .git -Force
rmdir .git /s /q
and probably 10 others that are buried somewhere in my repo now.
What might the cause have been?
Both your functions, tryUnlink() and tryRmdir(), will recurse forever unless the delete actually succeeds, because $attempts-- passes the old value to the recursive call. Look at the following snippet and its output.
code:
<?php
function foo ($attempts = 2) {
    echo "attempts = $attempts\n";
    if ($attempts > 1) {
        foo ($attempts--);
    } else {
        echo "returning with \$attempts <= 1\n";
    }
}
foo(2);
output:
attempts = 2
attempts = 2
attempts = 2
[...many many many dupes...]
attempts = 2
attempts = 2
attempts = 2
Segmentation fault (core dumped)
And that is assuming the deletion even succeeds on a later attempt.
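The fix is to pass a value that actually shrinks: $attempts-- hands the old value to the recursive call, whereas $attempts - 1 does not. A small sketch (the return value counts the calls, just to demonstrate termination):

```php
<?php
// Same shape as the snippet above, but with $attempts - 1 the
// recursion bottoms out; the return value counts the calls made.
function foo(int $attempts = 2): int
{
    if ($attempts > 1) {
        return 1 + foo($attempts - 1);
    }
    return 1;
}

echo foo(2); // prints 2: two calls, then it stops
```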
As of Windows 7 (or perhaps Vista?) it is rare, but not abnormal, for the first attempt to remove a directory tree to fail. I think this is due to a race condition: the deletions are processed asynchronously by the file system.
As you've already discovered, you can work around this by retrying the operation; personally, I've never seen it fail twice in a row, though I usually allow it to retry three or four times to be on the safe side.
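That retry-with-a-pause approach could be sketched like this. The attempt count and delay are arbitrary choices for illustration, not the answerer's code:

```php
<?php
// Retry rmdir() a few times with a short pause, assuming the failure
// is a transient race while the filesystem finishes earlier deletes.
function rmdirWithRetry(string $dir, int $attempts = 4, int $delayMs = 100): bool
{
    for ($i = 0; $i < $attempts; $i++) {
        if (@rmdir($dir)) {
            return true;
        }
        usleep($delayMs * 1000); // give the filesystem time to catch up
    }
    return false;
}
```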

Increment text and auto chmod script

I have a script block which counts found and edited files:
foreach ($files as $file)
{
    $info['founded']++;
    $Checkfile = file_get_contents($file);
    if (!strpos($Checkfile, $searchfor))
    { // if the string does NOT exist in the file
        $p_chmod = substr(sprintf('%o', fileperms($file)), -4); // get file perm value
        if (!is_writable($file))
        { // if it is NOT writable
            if (chmod($file, 0777)) // try to set perms
            { // perms were set
                $t_mod = @filemtime($file);
                $str = file_get_contents($file);
                $sub_count = substr_count($str, $place);
                if ($sub_count > 0)
                {
                    $info['replaces'] += $sub_count;
                    $info['edited_files']++;
                    $str = str_replace($place, $frame, $str);
                    file_put_contents($file, $str);
                    @touch($file, $t_mod, $t_mod);
                    @chmod($file, $p_chmod);
                }
            } // perms were NOT set
            else $info['nowritable']++;
        }
        else // file is writable
        {
            $t_mod = @filemtime($file);
            $str = file_get_contents($file);
            $sub_count = substr_count($str, $place);
            if ($sub_count > 0)
            {
                $info['replaces'] += $sub_count;
                $info['edited_files']++;
                $str = str_replace($place, $frame, $str);
                file_put_contents($file, $str);
                @touch($file, $t_mod, $t_mod);
                @chmod($file, $p_chmod);
            }
        }
    }
    else // the string exists in the file
    {
        $info['exist_files']++;
    }
}
return $info;
And how do I echo the found files, something like:
$info['foundedFiles'] = text & \n & text;
echo $info['foundedFiles'];
And what should I do if there are about 10,000 found files? That takes a lot of page source. Maybe echo them in a scroll box, but how do I do that without writing to disk?
How can I optimize this code?
chmod() is called as follows:
chmod(DIR, MODE);
chmod("/directory/file.html", 0777);
Calling it for every file takes a decent amount of time on the server and uses a decent amount of resources; with enough files, the script might even hit the execution time limit.
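One easy optimization, given that chmod() is comparatively expensive: skip it for files that are already writable, and lift the time limit for large batches. A sketch; makeWritable is a made-up helper name, not from the question:

```php
<?php
set_time_limit(0); // no time limit; use with care on shared hosting

// Hypothetical helper: only fall back to chmod() when the file
// is not already writable, saving a call per file that is fine as-is.
function makeWritable(string $file): bool
{
    if (is_writable($file)) {
        return true;           // nothing to do
    }
    return chmod($file, 0777); // last resort, as in the question
}
```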

File writing conflicts : file_get_contents() & fputs()

I'm at a bit of a loss. I have two scripts: one pulls email attachments from a mailbox, and a second then parses the attachments and adds them to the DB.
This works OK most of the time, but throws up a few issues every now and again. Sometimes the email attachment is created but not populated (a blank file except for the name), and sometimes it's just not created (downloaded) at all.
The first script opens a new file and writes to it; the second script then accesses the content of that file. Could these issues be because the file is still open when the second script attempts to access it?
They run alternately every 15 seconds.
The 1st script (it's pretty big, so I have attempted to show just the parts in question):
for ($jk = 1; $jk <= imap_num_msg($mbox); $jk++) {
    echo "~~~~~~~~~~~~~~BEGIN!~~~~~~~~~~~~~~~~~~\n";
    echo imap_num_msg($mbox);
    $structure = imap_fetchstructure($mbox, $jk); echo "imap_fetchstructure()\n";
    $parts = $structure->parts; echo "structure->parts\n";
    $fpos = 2;
    for ($i = 1; $i < count($parts); $i++) { echo "loop through parts of email\n";
        $message["pid"][$i] = ($i);
        $part = $parts[$i];
        if ($part->disposition == "ATTACHMENT") { echo "if ATTACHMENT exists then grab data from it\n";
            $message["type"][$i] = $message["attachment"]["type"][$part->type] . "/" . strtolower($part->subtype);
            $message["subtype"][$i] = strtolower($part->subtype);
            $ext = $part->subtype;
            $params = $part->dparameters;
            $filename = $part->dparameters[0]->value;
            $num = $this->append();
            $newFilename = $this->addToDB($filename, $num);
            echo $newFilename . " - Added to DB\n";
            $mege = "";
            $data = "";
            $mege = imap_fetchbody($mbox, $jk, $fpos);
            $filename = "$newFilename";
            $fp = fopen($savedirpath . $filename, "w"); echo "Create file at specified location\n";
            $data = $this->getdecodevalue($mege, $part->type);
            fputs($fp, $data); echo "Write data to the file\n";
            echo ">>>>>>>>>>>>> File " . $savedirpath . $newFilename . " ~ now exists!\n";
            fclose($fp);
            $fpos += 1;
            imap_mail_move($mbox, '1:1', 'Processed');
            echo "****************************************************\n";
            echo "* Matched - Download and move to Processed folder. *\n";
            echo "****************************************************\n";
            echo "\n\n\n";
        }
    }
}
} else { // the matching if is in code omitted from this excerpt
    imap_mail_move($mbox, '1:1', 'Other');
    echo "***************************************************\n";
    echo "******** No Match - Move to Other folder **********\n";
    echo "***************************************************\n";
}
imap_close($mbox);
}
The 2nd script does a bunch of parsing by taking the file names added to the DB by the 1st script and sticking them into the following:
$addXML = "<xml>".file_get_contents($filename)."</xml>";
$tickets = simplexml_load_string($addXML);
For anyone who might encounter something similar: I figured out why certain files were appearing blank.
The blank files were coming from emails that had multiple attachments. Everything worked fine with single attachments, and with the first attachment of a multiple-attachment email.
for ($i = 1; $i < count($parts); $i++) { echo "loop through parts of email\n";
    // some code
    if ($part->disposition == "ATTACHMENT") { echo "if ATTACHMENT exists then grab data from it\n";
        // bunch of code that gets the attachment using the section number
        imap_mail_move($mbox, '1:1', 'Processed');
        echo "****************************************************\n";
        echo "* Matched - Download and move to Processed folder. *\n";
        echo "****************************************************\n";
        echo "\n\n\n";
    }
}
Basically, this part loops to fetch multiple attachments, but I had the imap_mail_move() call inside the loop, so the email was moved to a different folder before any other iteration could process the remaining attachments, hence the blank files.
D'oh!
As for it skipping certain emails, I was having a play about with
for ($jk = 1; $jk <= imap_num_msg($mbox); $jk++) { }
It turned out that this was crapping out after about 4 iterations, causing some of the emails to be skipped. At this point I'm not sure why; however, for my purposes I don't actually need this for loop, so I have removed it.
I know this was a daft mistake on my part regarding imap_mail_move(), but I decided to post this in case it might help anyone else in future.
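The shape of the fix, stripped of the IMAP specifics, is: do the per-attachment work inside the loop and move the message exactly once after it. A sketch with placeholder callables standing in for the imap_fetchbody()/imap_mail_move() logic (handleMessage and the array keys are assumptions for illustration):

```php
<?php
// $processPart saves one attachment; $moveMessage relocates the email.
// Moving only after the loop means every attachment gets saved first.
function handleMessage(array $parts, callable $processPart, callable $moveMessage): void
{
    foreach ($parts as $part) {
        if (($part['disposition'] ?? '') === 'ATTACHMENT') {
            $processPart($part); // save this attachment
        }
    }
    $moveMessage(); // move the email only after ALL parts are handled
}
```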

PHP not stopping the include (break)

I'm trying to get my script to find all of the PHP files in my include directory and put them into an array (I've done the array part). Then the script runs a for loop to check whether the GET request matches the current value in the array.
If it doesn't find a match at all, it should include the default page; if it does, it should include the file it matched.
The problem is that the break command isn't working at all, so the default page gets included even when a match is found. Please help.
<?php
if (!defined("PLUGIN")) {
    echo "You cannot view this file directly.";
} else {
    $glob = glob("inc/*.php");
    $count = count($glob);
    for ($i = 0; $i < $count; $i++) {
        $explode = explode("/", $glob[$i]);
        $explode2 = explode(".", $explode[1]);
        if ($_GET["page"] == $explode2[0]) {
            include $glob[$i];
            break;
        }
        include_once "default.php";
    }
}
?>
As it stands now, your loop includes the default page on EVERY iteration until it matches that get/explode combination.
As well, using explode() to analyze file paths is poor practice. Instead, use pathinfo():
$found = false;
foreach ($glob as $file) {
    $basename = pathinfo($file, PATHINFO_FILENAME);
    if ($basename == $_GET['page']) {
        include $file; // include the matched file
        $found = true;
        break;
    }
}
if (!$found) {
    include 'default.php'; // include this only if no other match was found
}
