I am building a PHP application that uses a select menu to build email templates. The templates are broken into reusable parts (each is a separate html file). Is there an easy way to require multiple files with one expression? (my PHP is really rusty...)
Essentially I want to do something like:
function require_multi() {
    require_once($File1);
    require_once($File2);
    require_once($File3);
    require_once($File4);
}
Well, you could turn it into a function:
function require_multi() {
    $files = func_get_args();
    foreach ($files as $file) {
        require_once($file);
    }
}
Use like this:
require_multi("one.php", "two.php", ..);
However, if you're including classes, a better solution would be to use autoloading.
Credit to Tom Haigh from how to require all files in a folder?:
$dir = '/path/to/includes'; // set this to the folder holding your template parts
$files = glob($dir . '/*.php');
foreach ($files as $file) {
    require($file);
}
Store all your required files in $dir and the above code will do the rest.
EDIT:
Because you want to require or include multiple files, you could use this recursive algorithm to include all files in a specified folder. The folder is the root where the iterator starts. Because the algorithm is recursive, it automatically traverses every subfolder and includes those files as well.
function include_all_files($root) {
    $d = new RecursiveDirectoryIterator($root);
    foreach (new RecursiveIteratorIterator($d) as $file => $f) {
        $ext = pathinfo($f, PATHINFO_EXTENSION);
        if ($ext == 'php' || $ext == 'inc') {
            include_once($file); // or require(), require_once(), include()
        }
    }
}
include_all_files('./lib');
I'm new to this forum and also new to PHP. I'm building some very basic functions on a test site while I learn a little more about how to use PHP. One of the current projects I'm experimenting with combines two directories of CSV files.
I was hoping to use glob() as a sort of wildcard to gather up the files in each directory and then combine them. I know the approach below isn't very memory efficient, but this is just to learn with. The issue I'm having is getting glob() to pick up all my CSV files and then getting that variable into file_get_contents().
Here's my code..
$files = glob("http://www.website.com/1/*.csv");
foreach ($files as $filepath) {
    if ($handle = fopen($filepath, "r")) {
        // ...
    }
}

$files2 = glob("http://www.website.com/35/*.csv");
foreach ($files2 as $filepath2) {
    if ($handle2 = fopen($filepath2, "r")) {
        // ...
    }
}

file_put_contents('final_data.csv',
    file_get_contents($files) .
    file_get_contents($files2)
);
When you use glob(), each entry in the resulting array already contains the path prefix from the pattern, so with an absolute pattern you can use the results directly:
$basePath = '/path/to/csv';
foreach ($files = glob("$basePath/dir1/*.csv") as $filePath)
{
    echo $filePath; // already includes /path/to/csv/dir1/
    //
}
It would also make sense to read from a local path instead of a remote URL; glob() does not work on http:// URLs.
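To illustrate the whole combining step, here is a minimal sketch that assumes the two CSV folders live on the local filesystem (the paths are placeholders):
$allFiles = array_merge(
    glob('/var/www/site/1/*.csv'),
    glob('/var/www/site/35/*.csv')
);

$output = fopen('final_data.csv', 'w');
foreach ($allFiles as $filePath) {
    // file_get_contents() takes a single path, not an array of paths
    fwrite($output, file_get_contents($filePath));
}
fclose($output);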
I am trying to list files in subdirectories and write these lists into separate text files.
I managed to get the directory and subdirectory listings and even to write all the files into a text file.
I just can't seem to break out of the loops I am creating. I either end up with a single text file, or the second and subsequent files include the content of all preceding subdirectories as well.
What I need to achieve is:
dir A/AA/a1.txt,a2.txt >> AA.log
dir A/BB/b1.txt,b2.txt >> BB.log
etc.
Hope this makes sense.
I've found the RecursiveDirectoryIterator approach described in PHP SPL RecursiveDirectoryIterator RecursiveIteratorIterator retrieving the full tree to be a great help. I then use a for and a foreach loop to iterate through the directories and write the text files, but I cannot break the output into multiple files.
Most likely you are not filtering out the . and .. directory entries.
$maindir = opendir('A');
if (!$maindir) die('Cannot open directory A');
while (true) {
    $dir = readdir($maindir);
    if ($dir === false) break;
    if ($dir == '.') continue;
    if ($dir == '..') continue;
    if (!is_dir("A/$dir")) continue;
    $subdir = opendir("A/$dir");
    if (!$subdir) continue;
    $fd = fopen("$dir.log", 'wb');
    if (!$fd) continue;
    while (true) {
        $file = readdir($subdir);
        if ($file === false) break;
        if (!is_file("A/$dir/$file")) continue; // check the full path, not just the name
        fwrite($fd, file_get_contents("A/$dir/$file"));
    }
    closedir($subdir);
    fclose($fd);
}
I thought I'd demonstrate a different way, as this seems like a nice place to use glob.
// Where to start recursing, no trailing slash
$start_folder = './test';
// Where to output files
$output_folder = $start_folder;
chdir($start_folder);
function glob_each_dir($start_folder, $callback) {
    $search_pattern = $start_folder . DIRECTORY_SEPARATOR . '*';
    // Get just the folders in an array
    $folders = glob($search_pattern, GLOB_ONLYDIR);
    // Get just the files: there isn't an ONLYFILES option yet, so just diff the
    // entire folder contents against the previous array of folders
    $files = array_diff(glob($search_pattern), $folders);
    // Apply the callback function to the array of files
    $callback($start_folder, $files);
    if (!empty($folders)) {
        // Call this function for every folder found
        foreach ($folders as $folder) {
            glob_each_dir($folder, $callback);
        }
    }
}

glob_each_dir('.', function ($folder_name, array $filelist) {
    // Generate a filename from the folder, changing / or \ into _
    $output_filename = $GLOBALS['output_folder'] . DIRECTORY_SEPARATOR
        . trim(strtr(str_replace(__DIR__, '', realpath($folder_name)), DIRECTORY_SEPARATOR, '_'), '_')
        . '.txt';
    file_put_contents($output_filename, implode(PHP_EOL, $filelist));
});
I have the following code snippet. I'm trying to list all the files in a directory and make them available for users to download. This script works fine with directories that don't have sub-directories, but if I wanted to get the files in a sub-directory, it doesn't work. It only lists the directory name. I'm not sure why the is_dir is failing on me... I'm a bit baffled on that. I'm sure that there is a better way to list all the files recursively, so I'm open to any suggestions!
function getLinks($folderName, $folderID) {
    $fileArray = array();
    foreach (new DirectoryIterator(<some base directory> . $folderName) as $file) {
        // if it's not "." or ".." continue
        if (!$file->isDot()) {
            if (is_dir($file)) {
                $tempArray = getLinks($file . "/", $folderID);
                array_merge($fileArray, $tempArray);
            } else {
                $fileName = $file->getFilename();
                $url = getDownloadLink($folderID, $fileName);
                $fileArray[] = $url;
            }
        }
    }
    return $fileArray;
}
Instead of using DirectoryIterator, you can use RecursiveDirectoryIterator, which provides functionality for iterating over a file structure recursively. Example from documentation:
$path = realpath('/etc');
$objects = new RecursiveIteratorIterator(new RecursiveDirectoryIterator($path), RecursiveIteratorIterator::SELF_FIRST);
foreach ($objects as $name => $object) {
    echo "$name\n";
}
This prints a list of all files and directories under $path (including $path itself). If you want to omit directories, remove the RecursiveIteratorIterator::SELF_FIRST part.
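Applied to the question's getLinks() function, a minimal sketch could look like this (getDownloadLink() is the helper from the question; FilesystemIterator::SKIP_DOTS filters out . and ..):
function getLinks($folderName, $folderID) {
    $fileArray = array();
    $it = new RecursiveIteratorIterator(
        new RecursiveDirectoryIterator($folderName, FilesystemIterator::SKIP_DOTS)
    );
    foreach ($it as $file) {
        // $file is an SplFileInfo; the default LEAVES_ONLY mode yields files only
        $fileArray[] = getDownloadLink($folderID, $file->getFilename());
    }
    return $fileArray;
}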
You should use RecursiveDirectoryIterator, but you might also want to consider using the Finder component from Symfony2. It allows for easy on the fly filtering (by size, date, ..), including dirs or files, excluding dirs or dot-files, etc. Look at the docblocks inside the Finder.php file for instructions.
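For illustration, a small sketch of typical Finder usage (assuming the component is installed and autoloaded via Composer; the directory is a placeholder):
use Symfony\Component\Finder\Finder;

require __DIR__ . '/vendor/autoload.php'; // Composer autoloader

$finder = new Finder();
// All PHP files under ./src, sorted by name; dot-files are excluded by default
$finder->files()->in(__DIR__ . '/src')->name('*.php')->sortByName();

foreach ($finder as $file) {
    echo $file->getRealPath(), PHP_EOL;
}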
I am using a lot of include statements to pull in my 191 files containing my collection of functions, classes, etc.
A problem for me is that I really dislike editing the include files; it gets to be a big mess, and sometimes I just forget to include something.
Therefore I was wondering: is there an include function for PHP, or a home-made library, that includes all the PHP files in a folder, or even better in a folder plus all its sub-folders?
These things would make life much easier and more flexible.
If you have a naming standard that maps class names to files, you can use the __autoload() function (or, in current PHP, spl_autoload_register()). It will save you a lot of includes.
http://php.net/manual/en/language.oop5.autoload.php
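A minimal sketch (the classes/ClassName.php convention here is just an assumption; adapt it to your own naming standard):
// Register an autoloader: PHP calls it the first time an unknown class is used
spl_autoload_register(function ($class) {
    // Assumed convention: class Foo lives in classes/Foo.php
    $file = __DIR__ . '/classes/' . $class . '.php';
    if (is_file($file)) {
        require_once $file;
    }
});

$obj = new Foo(); // classes/Foo.php is loaded on demand, no include needed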
You can use the following function I just created:
function load_folder($folder, $ext = '.php') {
    foreach (glob("$folder*$ext") as $file) {
        if (file_exists($file)) {
            require_once($file);
        }
    }
}
START EDIT
This is the new version of the same function. It now allows you to specify the folder either as folder or folder/ without breaking, and it loads all files in all folders and subfolders.
function load_folder($dir, $ext = '.php') {
    if (substr($dir, -1) != '/') { $dir = "$dir/"; }
    if ($dh = opendir($dir)) {
        $files = array();
        $inner_files = array();
        while (($file = readdir($dh)) !== false) {
            if ($file != "." and $file != ".." and $file[0] != '.') {
                if (is_dir($dir . $file)) {
                    // recurse into the subfolder, passing the extension through
                    $inner_files = load_folder($dir . $file, $ext);
                    if (is_array($inner_files)) $files = array_merge($files, $inner_files);
                } else {
                    array_push($files, $dir . $file);
                }
            }
        }
        closedir($dh);
        foreach ($files as $file) {
            if (is_file($file) and file_exists($file)) {
                $length = strlen($ext);
                if (substr($file, -$length) == $ext) { require_once($file); }
            }
        }
        // return the list so recursive calls can merge it
        return $files;
    }
}
END EDIT
You can also specify a different extension: if you want to load, for example, only .txt files in a folder, you can call it like this: load_folder('folder/', '.txt');.
Keep in mind that some consider this kind of blanket inclusion insecure. Before using this function on a production or business site, look for more opinions about the topic.
Notice also that if some of your files define classes, you could use PHP's native __autoload() mechanism to let PHP load each class only where it is really needed (lazy loading).
References:
Autoloading classes
You could just have a 'meta-include' file which has the individual include statements, and then you only include that one single file in your scripts.
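For instance, a sketch of such a meta-include file (all the file names here are hypothetical):
// includes.php -- the single file every script includes
require_once __DIR__ . '/functions/strings.php';
require_once __DIR__ . '/functions/arrays.php';
require_once __DIR__ . '/classes/Database.php';
// ... one line per file, maintained in one place
Each script then only needs require_once 'includes.php';.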
Of course, the auto-loading versions in the other answers here would be more efficient. While PHP's pretty fast at loading/parsing, 191 individual files to load for every request would add up pretty quick.
I usually require an application_top.php first, which contains this:
...
function requireAll($folder) {
    // open the folder
    $libs = opendir($folder);
    // loop inside to include each file, excluding the special entries . and ..
    while (($lib = readdir($libs)) !== false) {
        // require_once to be sure to require only one time; skip subfolders
        if ($lib != "." && $lib != ".." && is_file($folder . $lib))
            require_once $folder . $lib;
    }
    // close the dir for cleaning stuff
    closedir($libs);
}
//Require all helpers
requireAll(DIR_HELPERS);
//Require all model classes
requireAll(DIR_MODEL);
//Require all mappers
requireAll(DIR_MAPPERS);
...
I'm not sure how simple this would be, but I'm using a script which displays the files from a specific folder, however I'd like them to be displayed in alphabetical order, would it be hard to do this? Here's the code I'm using:
if ($handle = opendir($mainframe->getCfg('absolute_path') . "/images/store/")) {
    while (false !== ($file = readdir($handle))) {
        if ($file != "." && $file != "..") {
            if (($file != "index.html") && ($file != "index.php") && ($file != "Thumbs.db")) {
                $strExt = end(explode(".", $file));
                if ($strExt == 'jpg') {
                    $Link = 'index.php?option=com_shop&task=deleteFile&file[]=' . $file;
                    $thelist .= '<tr class="row0"><td nowrap="nowrap">' . $file . '</td>' . "\n";
                    $thelist .= '<td align="center" class="order"><img src="/administrator/images/publish_x.png" width="16" height="16" alt="delete"></td></tr>' . "\n";
                }
            }
        }
    }
    closedir($handle);
}
echo $thelist;
:)
Instead of using readdir() you could simply use scandir() (documentation), which sorts alphabetically by default.
The return value of scandir() is an array instead of a string, so your code would have to be adjusted slightly to iterate over the array instead of checking for the final false return value. Also, scandir() takes a string with the directory path instead of a file handle as input. The new version would look something like this:
foreach (scandir($mainframe->getCfg('absolute_path') . "/images/store/") as $file) {
    // rest of the loop could remain unchanged
}
That code looks pretty messy. You can separate the directory-traversing logic from the presentation. A much more concise version (in my opinion):
<?php
// Head of page
$files = array();
$it = new DirectoryIterator($mainframe->getCfg('absolute_path') . '/images/store/');
foreach ($it as $file) {
    if (preg_match('#\.jpe?g$#', $file->getFilename())) {
        $files[] = $file->getFilename();
    }
}
sort($files);

// Further down
foreach ($files as $file) {
    // display links to delete file.
}
?>
You don't even need to worry about opening or closing the handle, and since you're checking the filename with a regular expression, you don't need any of the explode or conditional checks.
I like glob().
It makes directory reading a snap as it returns an array that's easily sortable:
<?php
$files = glob("*.txt");
sort($files);
foreach ($files as $filename) {
    echo "$filename size " . filesize($filename) . "\n";
}
?>
If you're using Joomla 1.5 you should be using the defined constant JPATH_BASE instead of
$mainframe->getCfg( 'absolute_path' )
If this is a Joomla extension that you will distribute, don't use scandir(), as it is PHP5-only.
The best thing to do is to use the Joomla API. It has classes for directory and file access that are layered so this works over different networks and protocols. The file system can be over FTP, for example, and the classes can be extended for any network/protocol.
jimport('joomla.filesystem.folder');
$files = JFolder::files(JPATH_BASE . "/images/store/");
sort($files);
foreach ($files as $file) {
    // do your filtering and other tasks
}
You can also pass a regular expression as the second parameter to JFolder::files() that filters the files you receive.
You also don't want to use URL literals like /administrator/ since they can be changed. Use the JURI methods instead, like:
JURI::base();
If you want to make use of the Joomla CSS classes in the tables, instead of:
'<tr class="row0">'
use:
'<tr class="row'.($i&1).'">'
where $i is the number of iterations. This gives you a sequence of alternating 0s and 1s.
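A short sketch of how that might look in the display loop (variable names are illustrative):
$i = 0;
foreach ($files as $file) {
    // ($i & 1) alternates 0, 1, 0, 1, ... producing row0/row1 classes
    echo '<tr class="row' . ($i & 1) . '"><td>' . $file . '</td></tr>' . "\n";
    $i++;
}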
If PHP has a built-in function for the job, always use it; the built-ins are faster.
Use glob() instead of traversing folders yourself, if it fits your needs:
$folder_names = glob('*', GLOB_ONLYDIR | GLOB_MARK | GLOB_NOSORT);
This returns everything in the current directory (use chdir() before calling it).
Remove GLOB_ONLYDIR to include files too.
GLOB_MARK adds a slash to folder names.
Remove GLOB_NOSORT to have the array sorted alphabetically.