I have an Apache dedicated server with lots of websites.
I also have a Red5 installation on the server.
What I want to know is how to perform file functions - specifically unlink() - on files held in the Red5 directory within the root server dir.
I can move files with this:
copy ("http://www.parttimepornstar.com:5080/echo/streams/".$strFilename, $strDestination);
but
unlink("http://www.parttimepornstar.com:5080/echo/streams/".$strFilename);
...won't work...
Any ideas what I'm doing wrong?
Thanks.
You need to check the permissions of the file that you are trying to delete. Apache should (hopefully) not be running as root and therefore cannot delete any files that it does not have permission for.
You should also be very wary of security. Allowing an unchecked variable to be used at the end of a copy() or unlink() call could potentially give a user access to your entire filesystem. Take a look at basename().
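For example, a minimal sketch (the streams path here is an assumption; substitute your actual Red5 directory):

$strFilename = basename($strFilename); // strips any directory components such as ../
unlink('/usr/local/red5/webapps/echo/streams/' . $strFilename);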
I suspect you need to use absolute filesystem paths rather than URLs or relative paths. Also, if you want to delete from the document root, you need to specify that in the path too. Try something like this:
unlink($_SERVER['DOCUMENT_ROOT'] . '/RED5/' . $yourfiles);
You are addressing the files via HTTP, which can't be used to delete files.
You need to specify a filesystem path such as /etc/httpd/sitename/file.php
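For example, assuming Red5 keeps its streams under /usr/local/red5/webapps/echo/streams (a hypothetical install path; adjust it to your server):

// copy() happens to work with a URL because PHP only needs to read the source,
// but unlink() must be given a local filesystem path.
$strSource = '/usr/local/red5/webapps/echo/streams/' . $strFilename;
copy($strSource, $strDestination);
unlink($strSource);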
I placed files in my root folder and copy them when a user tries to download them,
like
/root/filelocation/file.mp3
When the user is on the download page I run:
copy("/root/filelocation/file.mp3","download/file.mp3");
but this command takes too much load time.
I also have ffmpeg installed on the server.
Don't copy it. Make a symlink. This is almost instantaneous.
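A sketch using the paths from the question (note that Apache usually cannot read into /root at all, which is one more reason to store the files elsewhere):

// Create a link in the download folder instead of duplicating the file.
symlink('/root/filelocation/file.mp3', 'download/file.mp3');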
Really though, you should consider why you're making a copy in the first place. A simple script can call the appropriate sendfile function on your web server. Or, you can get crafty with your rewrite rules and you might not have to copy or symlink anything.
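For instance, if the mod_xsendfile module is installed and enabled, a small PHP script can hand the transfer over to Apache (the path is hypothetical):

header('X-Sendfile: /home/media/file.mp3');
header('Content-Type: audio/mpeg');
header('Content-Disposition: attachment; filename="file.mp3"');
// Send no body of your own: Apache replaces it with the file contents.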
Also, I don't recommend using the root user's home directory.
Is it possible to arrange file permissions/group ownership/etc. in such a way that a file can be read by readfile() for a forced download, but cannot be downloaded by navigating to the literal URL of the file?
Maybe you could add the user that is running Apache/PHP to the group that owns the file, and set the permissions to read and write for the owner and the owning group, with no permissions at all for others (-rwxrw---- or 0760).
Never tested it, but it should work.
The Apache user will need read permissions. To prevent it from being navigated to, the best (and easiest) solution is to store the file outside of the web folder.
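A minimal sketch of such a forced download, assuming the file lives outside the document root (both the path and the filename are made up for illustration):

$file = '/home/user/private/file.mp3'; // not reachable by any URL

header('Content-Type: audio/mpeg');
header('Content-Length: ' . filesize($file));
header('Content-Disposition: attachment; filename="file.mp3"');
readfile($file); // PHP streams the file to the client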
GoDaddy does not give me FTP root access to my account, meaning I can only access the public_html folder and not the includes folder.
Is there any way I can include the config files in that public folder but somehow make it so only the server can access them in a secure way? How does WordPress do it?
You could use a .htaccess file to restrict website access.
Take a look at this article.
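For instance, on Apache 2.4 a .htaccess file placed in the folder holding the config files need only contain:

Require all denied

(On Apache 2.2 the equivalent is the pair of directives Order allow,deny and Deny from all.)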
Just make sure they have a .php extension
(and actually contain PHP code, of course).
WordPress keeps the config file in the main folder. Just make sure it has a .php extension and you don't echo anything from it. (I know you won't.)
People really can't get at the details inside your PHP file unless you echo something, or the chmod of the file is set wrong so that people may be able to actually download the file.
As xdazz said, you can also restrict access to your config files, but I think that's just for MORE protection; you are still safe without it.
I have a project where Red5 is recording videos. I need PHP to be able to access the videos and move them so they can be accessed by HTML.
How can I do this?
I found this post: Accessing files outside the document root with Apache
But it involves updating some file that was never specified. And I'm not sure it is a viable solution in this case anyway.
lee
PHP by default can already access files outside the web root, unless restricted with an open_basedir directive (or safe mode, but I hope you're not in that cage).
It's normally good practice to add an open_basedir restriction within the VirtualHost configuration. You can specify multiple directories, separated by : on Linux and ; on Windows.
php_admin_value open_basedir /var/www/s/stage:/usr/share/php:/your/dir
To access those files, either use an absolute path or a path relative to the location of the PHP file being run (so you'll have to ../ to reach levels above).
Also be sure that the directories you want to write to are owned by the webserver user and have write permission.
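Putting that together, here is a sketch that moves a finished Red5 recording into a web-accessible folder; both paths are assumptions, so substitute your real ones:

$source = '/usr/local/red5/webapps/oflaDemo/streams/recording.flv';
$dest   = '/var/www/html/videos/recording.flv';

// rename() moves the file in one step, no copy-then-delete needed.
if (!rename($source, $dest)) {
    die('Move failed: check open_basedir and directory permissions.');
}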
I'm using XAMPP. When I tried to do this earlier it worked, but now it is not working.
I'm trying to make a directory in my www folder to hide it from baddies who steal files.
Each user gets their own folder in uploads to put their files in.
XAMPP uses Apache; it is a local web server that allows me to design websites without needing an online host. The www folder is in C:\program files\xampp\php\www\ and I need to make a directory there. I know it's possible because I've done this before; I have just forgotten how to make it happen.
When I make a directory I use:
$uploaddir1 = "xampp/php/www/uploads/".$esclcusername."/";
mkdir($uploaddir1,0777);
Do I need to include C:\program files\ before xampp?
And finally, how would this be possible on a real online web host?
I saw your question here and searched some on Google. This is what I found:
mkdir("D:/hshome/rubygirl58/gameparody.com/clansites/".$sitename."/lib", 0777)
So yes, I think you have to include the complete path.
Greetings,
Younes
You need to make sure that you give permissions to the parent folder so directories can be created in it (0777).
To get the full path you can use dirname(__FILE__), which will return the path of the directory containing the file in which it is run.
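For example (a sketch; the uploads layout is taken from the question above):

// Build an absolute path from the directory of the running script,
// so the same code works on XAMPP and on a live host.
$uploaddir1 = dirname(__FILE__) . '/uploads/' . $esclcusername . '/';

if (!is_dir($uploaddir1)) {
    mkdir($uploaddir1, 0777, true); // true = also create missing parent folders
}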