I'm on Windows and looking for the location of the cache generated by FilesystemAdapter. I thought it would be in var/cache of the application directory, but it doesn't look like it: when I clear that directory, the cache is still being used.
Any idea where it could be?
Filesystem Cache Adapter
use Symfony\Component\Cache\Adapter\FilesystemAdapter;
$cache = new FilesystemAdapter(
    // the subdirectory of the main cache directory where cache items are stored
    $namespace = '',
    // in seconds; applied to cache items that don't define their own lifetime
    // 0 means to store the cache items indefinitely (i.e. until the files are deleted)
    $defaultLifetime = 0,
    // the main cache directory (the application needs read-write permissions on it)
    // if none is specified, a directory is created inside the system temporary directory
    $directory = null
);
Note: if no directory is specified, one is created inside the system temporary directory.
Also see: Removing Cache Items
Checking the source code of the class, it uses a trait, and if you don't specify a directory, it calls the sys_get_temp_dir() function, which returns the directory path used for temporary files.
On a Windows-based system this could be:
C:\Windows\Temp
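To see exactly where the files end up, you can either print sys_get_temp_dir() or pass an explicit directory as the third constructor argument. A minimal sketch (the 'app' namespace and the var/cache path are only illustrative values):
use Symfony\Component\Cache\Adapter\FilesystemAdapter;

// Pass an explicit cache directory so you always know where the files are stored
$cache = new FilesystemAdapter('app', 0, __DIR__ . '/var/cache');

// Without the third argument, the default lives somewhere under the system temp dir:
echo sys_get_temp_dir(); // e.g. C:\Windows\Temp or C:\Users\<user>\AppData\Local\Temp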
Hope this helps.
Related
I have created a simple cache mechanism through which all dynamic pages get stored as an HTML file in the root directory, and that HTML file is loaded if it exists.
I also wanted to create a mechanism to delete the cache files when a PHP file is loaded.
I looked around and tried some things, but I can't manage to delete the files named in the fashion 'cache-variable.php'.
I want to delete all such files which fall in this pattern in the root directory of the domain. How do I do that? Any tip is most welcome.
I got the answer!
<?php array_map('unlink', glob("cache-*.php")); ?>
This can be loaded through a password mechanism to effect cache clearing.
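A minimal sketch of such a password-protected clearing script (the file name clear_cache.php and the 'change-me' key are placeholders, not anything standard):
<?php
// clear_cache.php (illustrative name): deletes all cache-*.php files in this directory
// when the correct key is supplied, e.g. clear_cache.php?key=change-me
if (isset($_GET['key']) && hash_equals('change-me', $_GET['key'])) {
    array_map('unlink', glob(__DIR__ . '/cache-*.php'));
    echo 'Cache cleared';
} else {
    http_response_code(403);
    echo 'Forbidden';
}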
Source: http://php.net/manual/en/function.unlink.php
I've been using Laravel for nearly all my projects and I've come across a problem when trying to list all files in a directory.
Here is my code below:
$directory = ("upload/"."$username->username");
$scanned_directory = storage::allfiles($directory);
So as you can see, the first line is the directory, which is upload followed by the username of the account.
The second line is meant to scan this directory and list only files. Sadly, it always gives me back an empty array. Using scandir() here works, but it also gives me folders, which I don't want.
So... Any tips? I checked the API and I really can't see what is wrong here :(
The reason you're getting no results when using the Storage::allFiles('directory') method is that it only looks inside your application's storage folder, usually storage/app. That's why modifying the storage path in the configuration works, as mentioned by #yassine (you shouldn't do this).
If you wish to use files outside of the storage folder, simply use the File facade instead, e.g. File::allFiles('directory').
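For example, a sketch assuming the upload folder lives under public/ (adjust the base path to wherever your uploads actually are):
use Illuminate\Support\Facades\File;

// Hypothetical location: an 'upload/<username>' folder under public/
$directory = public_path('upload/' . $username->username);

// Returns an array of SplFileInfo objects for files only (no directories)
$scanned_directory = File::allFiles($directory);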
I had the same issue with local storage. I fixed it by:
project/config/filesystems.php
line 44
changing the root folder from storage_path('app') to base_path(),
then providing a path relative to that root folder in Storage::files() or Storage::allFiles().
For example, Storage::allFiles("app") if I want to fetch files within the subfolder "app" relative to the base path given in the config file.
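A sketch of that change in config/filesystems.php (note that the answer above advises against modifying the disk root like this):
'disks' => [
    'local' => [
        'driver' => 'local',
        'root'   => base_path(), // changed from storage_path('app')
    ],
],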
Good luck
$directory has to be an absolute path from your Laravel root folder.
If your upload folder is in the root folder, try this:
$directory = base_path("upload/" . $username->username);
$scanned_directory = Storage::allFiles($directory);
My development and production sites, both written in PHP, need to use DirectoryIterator to get at some files. DirectoryIterator starts at the base directory of the drive, i.e. C:/. However, on the dev and prod servers the webroot folder is located in a different place.
Is there a way I can get DirectoryIterator to start at the webroot? Or some similar method, so that I can use the same code on dev and prod without having to worry about where on the disk the application is stored?
A DirectoryIterator is instantiated with a $path. Just change that to the webroot.
DirectoryIterator::__construct( string $path )
path: The path of the directory to traverse.
You can store the path to the webroot in a config file per environment, or determine it at runtime and save it as a constant, in a registry, or in some other accessible place during bootstrap. For instance, if all direct access to your application goes through a front controller that resides in index.php in the webroot, you can do:
$root = dirname(__FILE__);
and store that in a constant or a Registry or something.
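A minimal sketch of that approach (the 'uploads' subfolder is only an illustration):
<?php
// In index.php in the webroot (the front controller), record the webroot once:
define('WEBROOT', dirname(__FILE__)); // or simply __DIR__ on PHP 5.3+

// Elsewhere, iterate relative to the webroot instead of an absolute drive path:
foreach (new DirectoryIterator(WEBROOT . '/uploads') as $fileInfo) {
    if ($fileInfo->isFile()) {
        echo $fileInfo->getFilename(), PHP_EOL;
    }
}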
Starting the path with getenv("DOCUMENT_ROOT") also works.
I have a folder named "repository" in my admin folders. This folder holds two files: index.html and content.php. When a user creates a new page, the PHP creates a new folder specified by the user and then needs to copy the two files into that folder while leaving them in the repository.
copy(file,dest) does not work.
rename(file,dest) moves the files to the new folder but I lose them in the repository.
How do I copy the files in one folder to the new folder without losing the files in the original folder?
$dest = '../'.$menuLocation.'/'.$pageName;
$file1= "repository/index.html";
$file2= "repository/content.php";
mkdir($dest,0777);
rename($file1,$dest.'/index.html');
rename($file2,$dest.'/content.php');
$menuLocation and $pageName are supplied by the user. The files are there; file_exists() returns true. Also, the directory is created with no issues. rename() also works; I just lose the files in the repository.
For anyone hunting for the solution to this:
When using copy() in PHP, you also need to include the destination file name.
copy(original_file, destination_with_filename);
for example:
wrong:
copy('/temp/sports/basketball.jpg','/images/sports/')
Correct:
copy('/temp/sports/basketball.jpg','/images/sports/basketball.jpg')
Use copy(). Make sure you capture the return value so that your program knows whether it worked or not. Also check the permissions of the files and the directory to confirm that the user executing the PHP script is able to create the new file in the place you specified. You might want to use is_writable() on the directory to confirm these assumptions.
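Putting those two answers together, here is a sketch of the snippet from the question using copy() with full destination file names and return-value checks:
$dest  = '../'.$menuLocation.'/'.$pageName;
$file1 = "repository/index.html";
$file2 = "repository/content.php";

if (!is_dir($dest)) {
    mkdir($dest, 0777, true); // create the target folder if it doesn't exist yet
}

// copy() needs the destination file name, not just the folder; check the return values
$ok = copy($file1, $dest.'/index.html') && copy($file2, $dest.'/content.php');
if (!$ok) {
    error_log("Copying repository files to $dest failed");
}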
I'm writing a PHP function that will delete all the files in the directory that's passed to it (and eventually, possibly expand it to a recursive delete). To be safe, I want to make sure that, through a bug or something, I don't try to delete anything if the directory passed in is the root directory.
File permissions should protect me to a large extent, but just in case, and especially if I expand it to a recursive delete I just want to take that extra step.
As a complicating factor, this code may be run on a Windows machine or a Linux machine, so the root directory may look like 'C:\' or '/'. I assume there are other ways that really refer to the root as well, possibly 'C:\temp\..'.
So, is there a reliable way in PHP to recognize that a dir spec resolves to the root of the file system?
Elaboration...
I'm writing PHPUnit tests for a web app and I'm trying to create a framework where the state of the app is backed up before the tests are run and restored afterwards. The app allows users to upload files. Depending on what the file is it is copied to one of several possible directories.
To save and restore the state of the app, those directories need to be copied somewhere, the tests run, and then the directories need to have their files deleted and retrieved from the backup.
The location of these directories can vary from one machine to another and I know that some people put them outside of the web app. There is a configuration file that can be read by the test that gives the location of those directories for the given machine.
If I don't restrict all these directories to a specific dir tree it's difficult to do the jailing. If I do restrict these directories to a specific dir tree then some people will have to reconfigure their machines.
You should have a defined root folder, which you never go above, a.k.a. jailing. The root folder is not the only folder where severe damage can be done.
Edit.
Although I still advocate using some sort of jailing, I suppose you could recognize the root folder by stripping out any drive-letters and translating \ to /. The root folder would then always be a single /.
function isRootFolder($dirpath) {
    // Normalize backslashes to forward slashes and strip an optional drive letter ("C:")
    $parts = explode(':', str_replace('\\', '/', $dirpath), 2);
    $path  = array_pop($parts); // the part after the drive letter, or the whole path if there is none
    return $path === '/';
}
Try this function:
function is_root_dir($path)
{
    $clean_path = realpath($path); // false if the path does not exist
    if ($clean_path === false) {
        return false;
    }
    // '/' on Unix, or a drive root such as 'C:\' on Windows
    return $clean_path === '/' || preg_match('~^[a-z]:\\\\$~i', $clean_path) === 1;
}
It's not tested; I just wrote it in the editor here. realpath() resolves the path, following symbolic links and resolving things like C:\temp\.. == C:\
Edit: In the end you should follow the advice that nikc gave you: define a list of directories that are safe to delete.
I use this:
if (dirname($target) == $target) { // you're at the root dir
(it's portable between Windows and everything else)
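Wrapped in a small guard function, a sketch (the function name and the exception are only illustrative):
function isFilesystemRoot($target)
{
    // At the filesystem root, dirname() returns the path unchanged
    // ('/' on Unix, 'C:\' on Windows); realpath() normalizes things like C:\temp\..
    $resolved = realpath($target);
    return $resolved !== false && dirname($resolved) === $resolved;
}

// Example guard before a recursive delete:
if (isFilesystemRoot($directory)) {
    throw new RuntimeException("Refusing to delete the filesystem root: $directory");
}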
C.