For some reason the server changes random (?) .php files to .ph.
I need to rename any .ph file back to .php.
I've tried just about every rename / rename-extension snippet I found on Stack Overflow.
Nothing has worked so far.
No root access (shared hosting).
The working directory will be /.
You can try the code below (tested on localhost):
<?php
$file = 'test.ph';
$newfile = 'test.php';
if (!copy($file, $newfile)) {
    echo "failed to copy $file...\n";
} else {
    unlink($file); // remove the original only after a successful copy
}
?>
If you are looking to rename all the .ph files in the current directory to .php, you can try this:
<?php
$old_extension = '.ph';
$new_extension = '.php';

// Collect every .ph file in the current directory and rename it to .php
$files = glob("*$old_extension");
foreach ($files as $file) {
    $filename = pathinfo($file, PATHINFO_FILENAME);
    rename($file, $filename . $new_extension);
}
?>
A word of caution: the script will rename the existing files in the current directory without warning.
I highly advise running the above script in a development environment or in a test directory beforehand so you know what to expect.
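If you want to see what the script would do before touching anything, here is a minimal dry-run sketch (same glob/pathinfo logic as above, just echoing instead of renaming):
<?php
// Dry run: list the renames the script above would perform, without changing anything
foreach (glob('*.ph') as $file) {
    $new_name = pathinfo($file, PATHINFO_FILENAME) . '.php';
    echo "$file -> $new_name\n";
}
?>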
Best-
So this one is pretty straightforward: I want to delete a file on the server using PHP. I have:
$myfile = 'theone.png';
unlink($myfile);
This code deletes the file; however, if the path to the file is /images/theone.png, it doesn't work. I have tried images\theone.png with no luck.
If I try to connect over FTP, I get an error message saying that cURL does not support the unlink function... Any help would be great.
Thanks guys!
What about:
$root = realpath($_SERVER['DOCUMENT_ROOT']);
$myfile = "$root/images/theone.png"; // double quotes so $root is interpolated
unlink($myfile);
Although, to my knowledge, your original approach should also work, unless I'm missing something or there is code you haven't included here that interferes with the unlink.
__DIR__ - this magic constant contains the directory of the current script. If the file is in the same directory as your PHP script, you can use:
unlink(__DIR__ . "/$myfile");
If the file is, for example, one directory above your PHP script, you can use:
unlink(__DIR__ . "/../$myfile");
If the directory has the correct access rights, it should work.
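Putting the two suggestions together, a minimal sketch might look like this (assuming the image really sits at <document root>/images/theone.png):
<?php
// Build an absolute path from the document root, check the file exists, then delete it
$myfile = realpath($_SERVER['DOCUMENT_ROOT']) . '/images/theone.png';
if (is_file($myfile) && unlink($myfile)) {
    echo "Deleted $myfile";
} else {
    echo "Could not delete $myfile";
}
?>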
I'm working on a script that uploads all of my zPanel backups to my Amazon S3 account. It works if, instead of the zPanel backups, I create and upload my own backups via a cron job.
The reason I cannot upload the zPanel backups is that I cannot figure out a way to reach the "backups" directory with PHP's opendir function.
I do know the absolute path of the backups folder, but it seems this absolute path won't work with opendir().
I've used: opendir('var/zpanel/hostdata/my_username/backups/')
And it doesn't work. If I try a relative path, I can't reach it either.
So, is there a way to move the zPanel backups directory somewhere inside the public_html folder? Or a web URL that can reach the backups folder? Ajaxplorer can reach it, so why can't I?
If none of that is possible, I'd greatly appreciate it if someone could show me a way to create backups exactly like zPanel does (all databases plus everything inside the public_html folder).
The code looks like this:
require_once('S3.php');

// Enter your Amazon S3 credentials
$s3 = new S3('xxxxx', 'xxxxx');

if ($handle = opendir('var/zpanel/hostdata/my_username/backups/')) {
    while (false !== ($file = readdir($handle))) {
        if ($file != "." && $file != "..") {
            //echo "okay";
            echo $file; exit;
            if ($s3->putObjectFile("backups/$file", "my_basket", "$file", S3::ACL_PUBLIC_READ)) {
                echo "<strong>We successfully uploaded your file.<br /></strong>";
                // this will delete the file from your server after upload
                //if (file_exists($baseurl . '/' . $file)) { unlink ($baseurl . '/' . $file); }
            } else {
                echo "<strong>Something went wrong while uploading your file... sorry.</strong>";
            }
        } else {
            echo "No file found here " . $root;
        }
    }
    closedir($handle);
}
Thanks a lot in advance!
You have a typo in your code. If you want to use an absolute path with opendir, make sure it starts with a slash (/). So the correct version would be:
opendir('/var/zpanel/hostdata/my_username/backups/')
If you still can't get this to work, verify that the path you pass to opendir is indeed a directory and also verify the permissions on that directory to make sure your PHP process can read from it.
From the PHP docs:
If path is not a valid directory or the directory can not be opened
due to permission restrictions or filesystem errors, opendir() returns
FALSE ...
Also, if you use any open_basedir restrictions in your php.ini or in your code, make sure the path you pass to opendir is not blocked by them.
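As a minimal sketch of that sanity check (using the path from the question; adjust it to your own setup):
<?php
// Verify the directory exists and is readable before trying to open it
$dir = '/var/zpanel/hostdata/my_username/backups/';
if (!is_dir($dir)) {
    die("$dir is not a directory");
}
if (!is_readable($dir)) {
    die("$dir is not readable by the PHP user (check permissions and open_basedir)");
}
$handle = opendir($dir);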
I'm making a content management system for a website I built. I want the system to be discrete, so I made it exist in only one PHP file, called '_admin.php'. All the content displayed in this file comes from includes that I store in a sub-folder called 'admin' (out of the way).
The photos used on the website are stored in an 'assets' folder that also sits in the root dir. The admin page has direct access to the assets folder, as it is also in the root. But the upload file script sits a few directories into the 'admin' folder and I want the uploaded files to be stored in the assets folder.
The move_uploaded_file() method takes the destination path for the file, but it requires a direct path. I tried using $_SERVER['DOCUMENT_ROOT'], but the resulting directory doesn't seem to contain any of my files. If I use getcwd() in a file in the root, it returns the actual file structure that I can use, and the same goes for echoing __FILE__. But I've experimented with this SERVER constant a lot and I can't locate my website with it.
Since the script that uploads the images is called as a form action, I can't pass the root directory as a variable.
Not really sure what I'm doing wrong, anyone have any ideas?
Thanks
Edit:
// See if the Files array contains new files
if (!empty($_FILES['file'])){
    foreach ($_FILES['file']['name'] as $key => $name){
        $error = $_FILES['file']['error'][$key];
        $temp_name = $_FILES['file']['tmp_name'][$key];
        $dir = getcwd();
        $move_file = move_uploaded_file($temp_name, "$dir/temp/$name");
        if (($error == 0) && ($move_file)){
            $uploaded[] = $name;
        } else {
            die($error);
        }
    }
    echo __FILE__;
    echo "<br/>";
    echo __DIR__;
    echo "<br/>";
    echo $_SERVER['DOCUMENT_ROOT'];
    exit();
}
This is the upload script I'm currently using. It works fine because I'm storing the images in the same directory as the script; I just don't know how to store them in my root.
I would specify the full absolute file path if you can. I set this via define() in a config file for my CMS. On install you figure out what that path is and set it.
define("BASEFILEPATH", "/home/.../[webroot]"); // The base file path for the website
You may be looking for something more general, but you could have some sort of install script where the user can enter basic info into a form, such as username, pwd, etc. and you could have them enter this path as well.
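Once that constant is defined, the upload script can build its destination from it instead of getcwd(). A rough sketch, reusing $temp_name and $name from the loop above and assuming an 'assets' folder in the web root (the config.php filename is just an example):
require_once 'config.php'; // assumed config file that defines BASEFILEPATH as above

// Move the uploaded file into the assets folder under the web root
$destination = BASEFILEPATH . "/assets/$name";
if (move_uploaded_file($temp_name, $destination)) {
    $uploaded[] = $name;
}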
I'm trying to use the copy() function in PHP to store two copies of the same file on the server at one time. I have specified the directory for the copied file, but it doesn't go to the directory I specified (the "edituploads" folder); instead it goes to the current directory where the upload PHP script is located. I have also used the copy() function three times; is that a problem?
Can anyone tell me what's wrong? Thanks a lot.
Here is my PHP code:
if (!empty($_FILES))
{
    $a = uniqid();
    $tempFile = $_FILES['Filedata']['tmp_name'];
    $targetpath4 = $_SERVER['DOCUMENT_ROOT'] . "/example/upload/edituploads/";
    $targetFile = str_replace('//', '/', $targetPath) . $a . ".jpg";
    $targetFile4 = str_replace('//', '/', $targetPath4) . $a . ".jpg";
    move_uploaded_file($tempFile, $targetFile);
    copy($targetFile, $targetFile4);
}
PHP's copy/move functions work purely on a filename basis. You can't specify a directory as a source or a target, because they don't operate on directories. It's not like a shell, where you can do
$ cp sourcefile /some/destination/directory/
and the system will happily create 'sourcefile' in that directory for you. You have to specify a filename for the target, e.g.:
$ cp sourcefile /some/destination/directory/sourcefile
Beyond that, your move command is using $targetPath, which your code snippet doesn't define, so it's just going to create an $a.jpg file in the current working directory.
And your copy() command is using $targetFile4, which is built from $targetPath4 (note the capital P); you only defined $targetpath4 with a lowercase p, and since PHP variable names are case-sensitive, that variable is also undefined.
You need to copy the temporary file to one location first, and then move it to the other directory:
copy($tempFile,'somePlace_1');
move_uploaded_file($tempFile, 'somePlace_2');
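Putting that together with the paths from the question, a minimal sketch with full target filenames might look like this (the upload directory is only an example; the point is that both targets end in a filename):
$a = uniqid();
$uploadDir = $_SERVER['DOCUMENT_ROOT'] . '/example/upload/';              // illustrative path
$editDir   = $_SERVER['DOCUMENT_ROOT'] . '/example/upload/edituploads/';  // from the question

// Copy the temporary file first, then move it; both targets are full filenames
copy($_FILES['Filedata']['tmp_name'], $editDir . $a . '.jpg');
move_uploaded_file($_FILES['Filedata']['tmp_name'], $uploadDir . $a . '.jpg');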
My project files are located on a remote server, all in one folder. I access a file in this folder like this:
http://www.example.com/searchtest.html
This opens a page with an input box where the user types keywords to search for. The search script is a .php file located in the root itself. The script has to search for .html files with names similar to the keywords entered. These .html files are also located in the same root folder where all the .php files reside.
My application runs well on my local machine and searches fine, but when I host it on my site, it gives this error:
Warning: file_get_contents(//.rnd) [function.file-get-contents]: failed to open stream: Permission denied in /home/myServer/public_html/example/search1.php on line 40
.rnd
It gives the error on a line where I am trying to access files in my directory. Below is my code snippet. And yes, I am the administrator and I have full privileges on my site.
$matchingFiles = array();
$dirName = "\\";
$dh = opendir($dirName);
while (($file = readdir($dh)) !== false)
{
    $fullPath = $dirName . "/" . $file;
    if (is_dir($fullPath)) continue; // Skip directories
    similar_text($lcSearch, strtolower($file), $percentSimilar);
    if ($percentSimilar >= $percentMatch)
    {
        $matchingFiles[] = $fullPath;
    }
}
The error probably comes from the opendir call.
I also want to know whether $dirName holds the correct path or not. It should be the same path the script is run from.
Assuming you did not change the $dirName assignment to omit server detail on a public posting, you should be setting that to the path to search. Since your use is pretty static, I'd suggest using the full path to the directory to open, e.g.
$dirName = '/home/myServer/public_html';
And of course make sure the permissions on all the files it will search allow reading by the user your web server runs as.
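A rough sketch of the search loop with an absolute path and a readability check, reusing $lcSearch and $percentMatch from the question (the path is only an example; point it at the folder that actually holds your .html files):
$matchingFiles = array();
$dirName = '/home/myServer/public_html';   // adjust to where your .html files actually live
$dh = opendir($dirName);
while (($file = readdir($dh)) !== false) {
    $fullPath = $dirName . '/' . $file;
    if (is_dir($fullPath)) continue;        // skip directories
    if (!is_readable($fullPath)) continue;  // skip files the PHP user cannot read
    similar_text($lcSearch, strtolower($file), $percentSimilar);
    if ($percentSimilar >= $percentMatch) {
        $matchingFiles[] = $fullPath;
    }
}
closedir($dh);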