I'm making a WordPress theme, mainly for portfolios.
Building plugin functionality directly into a theme is bad practice, since switching themes then becomes a problem for both the user and the developer. So I'm cooking up a script that takes a plugins folder I ship inside my theme, containing the plugins I would otherwise have baked into the theme, and installs them automatically when you activate the theme. That way the plugins stay updatable through the dashboard and get auto-installed (if not already installed) into the site. Good idea, no? (I got it from a forum post, but as far as I know it hasn't been done.)
So I have a plugins folder in my theme containing the plugins I want to auto-install. I want to copy the plugins (single files or directories) into the wp-content/plugins folder and then install/activate them.
The problem is that when I try to copy, I get this error:
Warning: copy(http://127.0.0.1/inside-theme/wordpress/wp-content/plugins): failed to open stream: HTTP wrapper does not support writeable connections in C:\**path-to-www-**\www\inside-theme\wordpress\wp-content\themes\Inside Theme\header.php on line 105
If you're wondering why it's in header.php: I'm just doing this for testing, to see whether it copies. I'll move it into a hook afterwards.
Here is the code I'm using to copy the plugins:
$dir = get_template_directory() . '/plugins/'; // the plugins folder in the theme
$plugins_in_theme = scandir($dir);             // $dir's contents
$plugins_dir = plugins_url();                  // url to the wp-content/plugins/
print_r($plugins_in_theme);                    // just to check the output, not important

foreach ($plugins_in_theme as $plugin) {
    if ($plugin != '.' && $plugin != '..') {
        if (!file_exists($plugins_dir . $plugin)) {
            if (is_dir($dir . $plugin)) {
                recurse_copy($dir . $plugin, $plugins_dir);
            } else {
                copy($dir . $plugin, $plugins_dir);
            }
        }
    }
}
recurse_copy() is a function I picked up from another Stack Overflow question, for copying directories, since copy() only copies files, not folders. Also note that the page shows multiple errors, most of them mentioning my theme's functions.php, which is where I put the recurse_copy() function. (Is that OK? It's my first theme...)
function recurse_copy($src, $dst) { // for copying directories
    $dir = opendir($src);
    mkdir($dst); // create the destination directory before copying into it
    while (false !== ($file = readdir($dir))) {
        if (($file != '.') && ($file != '..')) {
            if (is_dir($src . '/' . $file)) {
                recurse_copy($src . '/' . $file, $dst . '/' . $file);
            } else {
                copy($src . '/' . $file, $dst . '/' . $file);
            }
        }
    }
    closedir($dir);
}
So how can I remove this error and get it to work?
Extra details:
I'm on Windows XP, using the 'handcrafted wp' parent theme, and I AM RUNNING THIS LOCALLY (on localhost).
Hope I was clear.
You are using $plugins_dir = plugins_url(); in your code, which returns http://yoursite/wp-content/plugins/. However, copy() works with filesystem paths, not URLs, so it's better to use ABSPATH, dirname( __FILE__ ) . '/blablabla.php', and other functions or constants that return directories rather than URLs. See the PHP manual on copy() to learn more.
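For instance, WordPress defines the WP_PLUGIN_DIR constant with the filesystem path to wp-content/plugins. A rough sketch of the copy loop using it (untested; it reuses the recurse_copy() helper from the question):

$src = get_template_directory() . '/plugins/'; // plugins bundled with the theme
$dest = WP_PLUGIN_DIR . '/';                   // filesystem path, not a URL

foreach (scandir($src) as $plugin) {
    if ($plugin == '.' || $plugin == '..') {
        continue;
    }
    if (!file_exists($dest . $plugin)) {
        if (is_dir($src . $plugin)) {
            recurse_copy($src . $plugin, $dest . $plugin);
        } else {
            copy($src . $plugin, $dest . $plugin);
        }
    }
}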
Related
How can I detect whether a folder contains a WordPress installation?
WP-CLI does this: it gives the error "This does not seem to be a WP installation", and it correctly detects a WP install even when run from one of its subfolders (wp-includes/images or wp-includes/js).
I went through the code: it searches for index.php and compares its content with the original index.php, and it also checks for the presence of wp-includes/version.php. I get the idea, but how it works from subfolders like those mentioned above is still not clear to me. Does anybody have an idea how to do this? Thanks in advance.
Look for the wp-config.php file. If you find it, require it, then try to use its constants DB_HOST, DB_USER, DB_PASSWORD and DB_NAME to connect to the WordPress database associated with that instance. If that works, you very likely have a working WordPress instance.
If your current working directory doesn't have wp-config.php, look at parent directories recursively until you (a) find it or (b) reach the top-level directory.
wp-cli does more elaborate things, but this should work for you.
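A rough sketch of that upward search (find_wp_config() is just an illustrative name):

function find_wp_config($dir) {
    while (true) {
        $candidate = $dir . DIRECTORY_SEPARATOR . 'wp-config.php';
        if (file_exists($candidate)) {
            return $candidate;
        }
        $parent = dirname($dir);
        if ($parent === $dir) { // dirname() of the root returns the root itself
            return false;       // reached the top without finding it
        }
        $dir = $parent;
    }
}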
So I have scribbled a script that detects a WP install. It works as expected inside a WP install folder, but when it is not inside a WordPress install it runs into an infinite loop. Can somebody show me how to stop at the top-level directory, as mentioned by @O.Jones? Here is my code.
<?php
function get_wp_index($dir = null) {
    if (is_null($dir)) {
        $dir = getcwd();
    }
    echo "Currently looking in:\n";
    echo $dir;
    $name = $dir . DIRECTORY_SEPARATOR . "index.php";
    if (file_exists($name)) {
        $index_code = file_get_contents($name);
        if (preg_match('|^\s*require\s*\(?\s*(.+?)/wp-blog-header\.php([\'"])|m', $index_code, $matches)) {
            echo "Is a WP install";
            return;
        } else {
            echo "Has an index file, but not one with wp-blog-header";
            echo "\n\n";
            // go one directory up
            $up_path = realpath($dir . DIRECTORY_SEPARATOR . '..');
            get_wp_index($up_path);
        }
    } else {
        echo 'No index file found';
        echo "\n\n";
        // go one directory up
        $up_path = realpath($dir . DIRECTORY_SEPARATOR . '..');
        echo $up_path;
        get_wp_index($up_path);
    }
}

get_wp_index();
?>
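For what it's worth, a sketch of the same function with a stop condition added: at the filesystem root, realpath($dir . DIRECTORY_SEPARATOR . '..') resolves to $dir itself, so comparing the two before recursing ends the loop (untested beyond that idea):

function get_wp_index($dir = null) {
    if (is_null($dir)) {
        $dir = getcwd();
    }
    $name = $dir . DIRECTORY_SEPARATOR . 'index.php';
    if (file_exists($name)
        && preg_match('|^\s*require\s*\(?\s*(.+?)/wp-blog-header\.php([\'"])|m', file_get_contents($name))) {
        echo "Is a WP install\n";
        return;
    }
    // Go one directory up, but stop if we're already at the top.
    $up_path = realpath($dir . DIRECTORY_SEPARATOR . '..');
    if ($up_path === $dir) {
        echo "Reached the top-level directory; no WP install found\n";
        return;
    }
    get_wp_index($up_path);
}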
I'm working on a client's website that I didn't write the code for, and I'm having trouble making files downloadable.
It's a subdomain where users can download course files.
The website files are contained in the folder "courses" (at the root level).
The file that displays the downloadable course files is "courses/displayfiles.php".
The downloadable files are contained in "courses/downloadfolder"; inside this folder, each user has their own files folder, whose name is their user id.
displayfiles.php: The following code successfully displays all files that can be downloaded by the logged-in user:
$path = "downloadfolder/" . $_SESSION['userId'] . "/";
$files = array();
$output = #opendir($path) or die("$path could not be found");
while ($file = readdir($output)) {
if (($file != "..") and ($file != ".")) {
array_push($files, $file);
}
}
closedir($output);
sort($files);
foreach ($files as $file) {
echo '<a class="imtext" href="downloadfolder/' . $_SESSION['userId'] . '/' . $file . '/">' . $file . '</a><br/>';
}
So here's what doesn't work about this code: when a user clicks on a file, I get a "404 Not Found" error saying the file was not found. How can this be?
Why does displaying the files work completely fine, while clicking a file gives a 404 error? The file path ($path) must be correct, or not? What further investigation should I do to solve this problem?
* UPDATE *
I decided to modify the files loop as follows (changing the href):
foreach ($files as $file) {
    echo '<a class="imtext" href="http://' . $_SERVER['HTTP_HOST'] . '/downloadfolder/' . $_SESSION['courseId'] . '/' . $file . '/">' . $file . '</a><br/>';
}
Still, when I click on a file, I get a 404 Not Found error. How can this be?
You have to look at where the webroot of your page is, where the PHP file generating the list is located, and where the files are.
Your generated link is relative to the PHP file generating it, which might not correspond to the URL in the browser. I'd try making the link relative to the webroot (note the leading slash!):
echo '<a class="imtext" href="/courses/downloadfolder/' . $_SESSION['userId'] . '/' . $file . '/">' . $file . '</a><br/>';
If that guessed solution doesn't work, please provide the current URL of the page where these links are generated and one generated link, so we can help you better.
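If the webroot is the issue, one way (untested) to avoid hardcoding /courses is to derive the base URL path from the script's own location: dirname($_SERVER['SCRIPT_NAME']) gives the URL directory containing displayfiles.php. This sketch also drops the trailing slash after the filename, since a URL like file.pdf/ will 404 on many servers, and URL-encodes the filename:

$base = rtrim(dirname($_SERVER['SCRIPT_NAME']), '/'); // e.g. "/courses"
foreach ($files as $file) {
    echo '<a class="imtext" href="' . $base . '/downloadfolder/'
        . $_SESSION['userId'] . '/' . rawurlencode($file) . '">' . $file . '</a><br/>';
}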
I'm trying to recursively list every file in my bucket. There aren't too many files, but I'd like to list them to test a few things. This code works on a normal file system, but it's not working on Google Cloud Storage.
Does anyone have any suggestions?
function recurse_look($src) {
$dir = opendir($src);
while(false !== ( $file = readdir($dir)) ) {
if (( $file != '.' ) && ( $file != '..' )) {
if ( is_dir($src . '/' . $file) ) {
recurse_look($src . '/' . $file);
}
else {
echo $src . '/' . $file;
echo "<br />";
}
}
}
closedir($dir);
}
recurse_look("gs://<BUCKET>");
Personally, I would recommend not using a filesystem-impersonation abstraction layer on top of Google Cloud Storage for a task such as listing everything inside a bucket -- rather, just reach for the underlying functionality.
In particular, see https://cloud.google.com/storage/docs/json_api/v1/json-api-php-samples for everything about authentication etc., and, once that's taken care of, focus on just one line in the example:
$objects = $storageService->objects->listObjects(DEFAULT_BUCKET);
This is all you need to list all objects in a bucket (which is not the same thing as "files in a directory"; the "filesystem simulations" on top of buckets and objects, in my personal opinion, end up hurting rather than helping despite their excellent intentions :-).
Now, if the objects' names contain e.g. slashes and you want to treat those as symbolically signifying something or other, go right ahead -- but at least this way you're sure you're getting all the objects that actually exist in the bucket, and nothing but those.
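For completeness, a sketch of such a listing with the google/apiclient package (class and method names per its Storage service; the bucket name and credentials file below are placeholders, and pagination is included since listObjects returns results page by page):

require_once 'vendor/autoload.php';

$client = new Google_Client();
$client->setAuthConfig('service-account.json'); // placeholder credentials file
$client->addScope('https://www.googleapis.com/auth/devstorage.read_only');
$storageService = new Google_Service_Storage($client);

$pageToken = null;
do {
    $params = $pageToken ? array('pageToken' => $pageToken) : array();
    $objects = $storageService->objects->listObjects('my-bucket', $params);
    foreach ($objects->getItems() as $object) {
        echo $object->getName(), "<br />";
    }
    $pageToken = $objects->getNextPageToken();
} while ($pageToken);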
Now that glob is working, you can try something like this:

function lstree($dir) {
    foreach (glob($dir . '/*') as $path) {
        if (is_dir($path)) {
            echo $path;
            lstree($path);
        } else {
            echo $path;
        }
    }
}

lstree('gs://{bucket}/');
Beginner here: I can't seem to get my head around the logic of this. I have searched, but everything that comes up is about listing files and folders from an actual directory, i.e. with opendir().
My problem is:
I'm trying to work out (in PHP) how to list files and subfolders from a path stored in a database -- without any access to the actual file or directory, so purely from the path name.
For example, the database shows:
main/home/television.jpg
main/home/sofa.jpg
main/home/bedroom/bed.jpg
main/home/bedroom/lamp.jpg
So if I specify main/home, it should show television.jpg and sofa.jpg, plus the name of the subfolder: bedroom.
scanFolder('main/home');

function scanFolder($dir) {
    foreach (scandir($dir) as $file) {
        if (!in_array($file, array('.', '..'))) {
            if (is_dir($dir . '/' . $file)) {
                scanFolder($dir . '/' . $file);
            } else {
                echo $dir . '/' . $file . "\n";
            }
        }
    }
}
You would probably want to check on each iteration if the filename is a directory or not. If it is, open it up and read its contents and output them. A recursive function would work best in this situation.
http://php.net/manual/en/function.is-dir.php
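Since the paths exist only in the database, another option is to skip the filesystem entirely and split the stored strings: everything up to the next slash after your prefix is either a file name or the name of a direct subfolder. A sketch (listFolder() is a made-up name, and $paths stands in for the rows from your query):

function listFolder(array $paths, $prefix) {
    $prefix = rtrim($prefix, '/') . '/';
    $files = array();
    $folders = array();
    foreach ($paths as $path) {
        if (strpos($path, $prefix) !== 0) {
            continue; // not under the requested folder
        }
        $rest = substr($path, strlen($prefix));
        $slash = strpos($rest, '/');
        if ($slash === false) {
            $files[] = $rest; // direct child file
        } else {
            $folders[substr($rest, 0, $slash)] = true; // first segment = subfolder
        }
    }
    return array('files' => $files, 'folders' => array_keys($folders));
}

$paths = array(
    'main/home/television.jpg',
    'main/home/sofa.jpg',
    'main/home/bedroom/bed.jpg',
    'main/home/bedroom/lamp.jpg',
);
print_r(listFolder($paths, 'main/home'));
// files: television.jpg, sofa.jpg -- folders: bedroom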
Does anyone know a solution to this problem? I'm unable to open a subdirectory within a symlinked directory. I've confirmed that the paths are correct (I even copy-pasted the path into Explorer, which resolved it fine). This is a strange, annoying bug :|.
Example:
C:\folder\symbolic_link\dir1\dir2 - opening dir2 fails.
C:\folder\symbolic_link\dir1 - works
C:\folder\real_directory\dir1\dir2 - works
C:\folder\real_directory\dir1 - works
Alright, I finally found a hack to work around this bug in PHP's handling of symlinks on Windows. The bug occurs when recursively iterating through files/directories using opendir(): if a symlink to a directory exists in the current directory, opendir() will fail to read the directories inside that symlinked directory. It's caused by something funky in PHP's stat cache, and can be resolved by calling clearstatcache() before calling opendir() on the symlinked directory (also, the parent directory's file handle must be closed first).
Here is an example of the fix:
<?php
class Filesystem
{
    public static function files($path)
    {
        clearstatcache();
        $ret = array();
        $handle = opendir($path);
        $files = array();
        // Store the entries first; subdirectories of a symlink can't be read
        // until the current handle is closed and the stat cache is cleared.
        while (FALSE !== ($file = readdir($handle)))
        {
            if ($file != '.' && $file != '..')
            {
                $files[] = $file;
            }
        }
        // The handle _must_ be closed before the stat cache is cleared;
        // cache entries from open handles won't be cleared!
        closedir($handle);
        foreach ($files as $file)
        {
            // Note: a non-empty first argument acts like clearstatcache(true),
            // i.e. it also clears the realpath cache.
            clearstatcache($path);
            if (is_dir($path . '/' . $file))
            {
                $dir_files = self::files($path . '/' . $file);
                foreach ($dir_files as $dir_file)
                {
                    $ret[] = $file . '/' . $dir_file;
                }
            }
            else if (is_file($path . '/' . $file))
            {
                $ret[] = $file;
            }
        }
        return $ret;
    }
}

var_dump(Filesystem::files('c:\\some_path'));
Edit: It seems that clearstatcache($path) must be called before any file-handling functions on the symlinked dir. PHP isn't caching symlinked dirs properly.