Why are the directories I create getting rwx---rwx permissions, and the files I create getting rw----r--?
I can't access a folder externally via URL when it has rwx---rwx permissions, but I can access a file that has rw----r-- permissions.
Even if I change the permissions to rwxrwxrwx I still cannot access the folder.
eg:
http://www.example.com/folder/
Gives me 403 Forbidden
A 403 Forbidden on a folder URL usually means that your server has directory browsing disabled and there's no default document (index.html, index.php, etc.) in the directory. Since there's no default content to serve, and the configuration doesn't allow browsing, you get the Forbidden error.
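As a quick test, dropping a minimal default document into the folder usually makes the URL resolve (this assumes index.php is in your server's DirectoryIndex list, which it typically is):
<?php
// index.php — a minimal hypothetical default document.
// With this file in place, http://www.example.com/folder/ serves it
// instead of returning 403, since no directory listing is needed.
echo 'Folder is reachable: ' . htmlspecialchars(basename(__DIR__));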
That's probably a precaution of your webserver, which refuses to serve files that have the executable bit (x) set.
Check your server's documentation for the features it offers and which file permissions need to be set for them to work with your webserver.
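If that's the case, you can normalize the permissions from PHP itself. A small sketch (the folder path is illustrative); note that directories, unlike files, do need the execute bit to be traversable:
<?php
// Hypothetical cleanup: clear the execute bits on regular files and
// set a conventional 0644 (rw-r--r--) so the webserver will serve them.
foreach (glob('/path/to/folder/*') as $f) {
    if (is_file($f)) {
        chmod($f, 0644);
    }
}
// Directories need x to be entered, so 0755 (rwxr-xr-x) is the usual choice.
chmod('/path/to/folder', 0755);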
Related
I am working on a project where users can upload files to my server. I want to store the files in a folder without execute permissions, and I want to force the user to access the files via a PHP file. I am on shared hosting.
If I call the file by its filesystem path, i.e. '/home/user/Website.com/Files/MyFile.ext', the file needs execute permissions, because it executes before readfile() sends it to the user. If I call it via 'https://Website.com/MyFile.ext' I don't need execute permissions, but the resource is either blocked by my .htaccess file or visible to the user via the URL.
My current solution is to set the folder permissions to 601 and to set my .htaccess in 'https://Website.com/Files' to:
Order Deny,Allow
Deny from all
Allow from xxx.ServerIpAddress
and my PHP file to:
readfile('https://Website.com/Files/MyFile.ext');
exit();
This produces the desired behavior, and I think it would be good enough if I were on my own server. But since I am on shared hosting, I suspect that someone who shares my IP address would be able to access my files, so I am looking for an alternative method.
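One common alternative is to read the file from the local filesystem instead of looping back over HTTP; then the Files directory can stay completely blocked from web access ('deny from all') and no loopback request is involved. A sketch, assuming a gatekeeper script that sits next to the Files folder (download.php and the 'file' parameter are illustrative names):
<?php
// download.php — hypothetical gatekeeper: serves files from a
// directory that is denied all direct HTTP access.
$name = basename($_GET['file'] ?? '');        // strip any path components
$path = __DIR__ . '/Files/' . $name;

if ($name === '' || !is_file($path)) {
    http_response_code(404);
    exit('Not found');
}

// Authenticate/authorize the user here before sending anything.

header('Content-Type: application/octet-stream');
header('Content-Length: ' . filesize($path));
header('Content-Disposition: attachment; filename="' . $name . '"');
readfile($path);
exit;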
So this is what's happening; I don't quite know how to explain it.
I made a piece of software that uploads files to my website's FTP location. It's working as it should.
As you can see, inside public_html/phphostrot.rviews.xyz/user_rot/ there are 2 PHP files. But when I visit the URL http://phphostrot.rviews.xyz/ I get a blank page with 'Index of /' and nothing else. And if I try to access http://phphostrot.rviews.xyz/user_rot/ I get a 404 error. Even the direct link to the file doesn't work. I don't know what the problem is.
I think it's an issue with the file permissions. Have you configured your server to serve files from your public directory?
Try changing the ownership of the files to www or apache.
app, img, wp-admin, wp-content, wp-includes?
Check the file permissions (+x) and, as #atefth said, check the owner.
Also check the webserver config, and check the apache/nginx logs.
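If at least one script on the host will run, a quick way to gather that information is from PHP itself (check.php is a hypothetical helper; drop it into the directory you're debugging):
<?php
// check.php — hypothetical: print mode and owner UID of the PHP files
// in this directory, to compare against what the webserver user needs.
foreach (glob(__DIR__ . '/*.php') as $f) {
    printf("%s mode=%o owner-uid=%d\n", basename($f), fileperms($f) & 0777, fileowner($f));
}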
My dedicated server runs Debian Wheezy with Apache2, PHP and MySQL. ISPConfig 3 is installed as the control panel.
After creating a site in ISPConfig, along with a shell user account for it, I uploaded the tar file into the web directory and extracted the files.
The default ISPConfig index.html file would display; however, after the extraction the index.php is not getting picked up, and after deleting the default index.php I get the error message:
Forbidden
You don't have permission to access / on this server.
Additionally, a 403 Forbidden error was encountered while trying to use an ErrorDocument to handle the request.
I tried going directly to www.domain.com/index.php and that doesn't work either.
Do I have to modify anything in Apache? Any help would be appreciated.
That error is telling you that the server can't read the file (or files) specified in your DirectoryIndex directive, and that it is trying to display a 403 (security) error document instead, which it doesn't have permission to read either.
You need to check which user your httpd daemon runs as, and make sure that user has at least read permission on all the files in your DocumentRoot. The best way to do this is to make that user the owner of the files.
On a RedHat/CentOS system, httpd runs as the user "apache":
chown -R apache:apache /var/www/html
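If you're not sure which user the webserver runs as, here's a quick sketch you can run from any PHP file that already loads (the posix_* functions require PHP's POSIX extension):
<?php
// whoami.php — hypothetical helper: show which user PHP runs as,
// i.e. who needs read access to the files in the DocumentRoot.
if (function_exists('posix_geteuid')) {
    $info = posix_getpwuid(posix_geteuid());
    echo 'PHP runs as: ' . $info['name'];
} else {
    // Fallback: get_current_user() reports the owner of this script,
    // not the process user, but it narrows things down.
    echo 'Script owner: ' . get_current_user();
}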
I have different folders on the site I am making.
What if a user tries to enter one of those folders? How can I keep them from seeing what's inside?
Or can I redirect them to another page saying that they don't have permission to access the folder, or that the URL is invalid?
I read something about .htaccess, but I don't know how it works.
I am currently using a trick (adding an index.php to every folder with a message saying they don't have permission to access it).
But it's kind of a pain, and I believe there's an easier method.
It depends on the contents of the folders:
If it contains PHP or configuration files that should never be opened directly (or anything else that never needs to be requested directly by the browser), you should not put them in the web root;
If it contains assets that are included in HTML but you don't want visitors to browse the directory, you should configure your web server so that directory browsing is disabled;
If only certain logged-in users should be able to open certain files, you should handle that in the file itself, not at the directory level (a sketch follows this list).
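For that last case, a minimal in-file check might look like this (it assumes login state is tracked in $_SESSION['user_id']; the key name is illustrative):
<?php
// protected.php — hypothetical per-file access check.
session_start();

// Only logged-in users may continue; everyone else gets a 403.
if (empty($_SESSION['user_id'])) {
    http_response_code(403);
    exit('You do not have permission to access this file.');
}

// ... render or serve the protected content here ...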
If you cannot move your directory out of the web root but nothing in it needs to be accessible by the browser, you can put an .htaccess file in that directory with just the following contents:
deny from all
This is something you should handle at the server level.
Basically, restricting access to these folders, for example with chmod on UNIX, will take care of this for you.
Sorry if this is a trivial question.
I am kind of new to PHP and I'm creating a project from scratch. I need to store my application logs (generated using log4php) as files, and I don't want them to be public.
They are currently stored in a subfolder under my PHP application folder (/myAppFolder/logs), so they are served by Apache.
Where should I store them, or what should I do to keep them from being served as content by Apache?
You can either keep them in a directory above the web root or, if you're on a shared host or can't place files above the root for whatever reason, keep them in a directory that denies all HTTP access.
So you could have a folder called "secret_files" with a .htaccess file sitting inside it:
.htaccess:
deny from all
This will prevent HTTP access to the files and subfolders in that folder.
Somewhere not under the public root!?
This is more of a server-config question, as it depends on your server, but in Apache you could use the custom log directives to set the location. So if you have:
/www/myapp
Create
/www/log
and put them there instead. You need control over the config to do this, so look up your web host's docs to find out how.
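Whichever location you choose, log4php also needs to be pointed at it explicitly. A minimal sketch using log4php's PHP array configuration (the /www/log path follows the layout above; adjust for your host):
<?php
// Hypothetical bootstrap: route all log4php output to a file
// outside the web root, so Apache never serves it.
require_once 'log4php/Logger.php';

Logger::configure(array(
    'appenders' => array(
        'default' => array(
            'class'  => 'LoggerAppenderFile',
            'layout' => array('class' => 'LoggerLayoutSimple'),
            'params' => array(
                'file'   => '/www/log/app.log', // not under /www/myapp
                'append' => true,
            ),
        ),
    ),
    'rootLogger' => array('appenders' => array('default')),
));

Logger::getLogger('main')->info('Logging outside the web root');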