I have a production server with Apache 2, PHP, and MySQL.
I have just one site right now (mysite.com) as a virtual host. I want to put phpMyAdmin, Webalizer, and maybe Webmin on there. So far I have installed phpMyAdmin, and it works, but the whole internet can go to mysite.com/phpmyadmin.
How can I reduce the visibility to say 192.168.0.0/16 so it's just accessible to machines behind my firewall?
1) You can do it at the web server level.
Use allow/deny rules in Apache. If you don't have direct access to your Apache configuration file, you can use a .htaccess file instead.
<Directory /docroot>
Order Deny,Allow
Deny from all
Allow from 10.1.2.3
</Directory>
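If your server runs Apache 2.4 or newer, note that Order/Deny/Allow only keep working through the compatibility module (mod_access_compat); a rough 2.4-style equivalent for the 192.168.0.0/16 range from the question would be:

<Directory /docroot>
# Apache 2.4 syntax: only the internal 192.168.0.0/16 range is allowed, everyone else is refused
Require ip 192.168.0.0/16
</Directory>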
2) You can do it at the application level, using the phpMyAdmin config file.
The configuration parameter is: $cfg['Servers'][$i]['AllowDeny']['rules']
Examples of rules are:
'all' -> 0.0.0.0/0
'localhost' -> 127.0.0.1/8
'localnetA' -> SERVER_ADDRESS/8
'localnetB' -> SERVER_ADDRESS/16
'localnetC' -> SERVER_ADDRESS/24
You can see this in the official phpMyAdmin configuration documentation:
http://www.phpmyadmin.net/documentation/#servers_allowdeny_order
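To tie that to the question, here is a sketch of what the relevant lines in config.inc.php could look like (the exact location of that file depends on your install; the rule syntax follows the documentation linked above):

// With 'allow,deny' order, access is denied unless an allow rule matches
$cfg['Servers'][$i]['AllowDeny']['order'] = 'allow,deny';
// '%' means any user; only clients from 192.168.0.0/16 get through
$cfg['Servers'][$i]['AllowDeny']['rules'] = array('allow % from 192.168.0.0/16');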
You would use an Apache module called mod_access (the same Order/Allow/Deny functionality is provided by mod_authz_host in Apache 2.2 and by mod_access_compat in 2.4).
You can configure it either in your Apache config file or in a .htaccess file in the directory you want to protect.
Here's a short example:
<Directory /your_folder/location>
Order Deny,Allow
Deny from all
Allow from 123.123.123.123
</Directory>
Use the <Location> directive in the server configuration (it is not allowed in .htaccess). Inside it, combine Deny from all with Allow from so that everyone is blocked except certain sources, for example:
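A sketch for the phpMyAdmin URL from the question, restricted to the 192.168.0.0/16 range (this goes in the server or virtual host configuration, using the same 2.2-style directives as the examples above):

<Location /phpmyadmin>
# Refuse everyone, then allow the internal network back in
Order Deny,Allow
Deny from all
Allow from 192.168.0.0/16
</Location>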
Actually my project is in WordPress. I'm moving my server from Windows to Linux.
After the change I'm facing this error!
Why isn't my .htaccess working on the Linux server?
If you're using Apache, you need to configure it to allow .htaccess files to be processed.
To do so, you can:
Create a virtual host that configures one specific site. In a virtual host you can set a domain name, document root, server alias, etc. You can also set AllowOverride to All (or another value). See AllowOverride.
Set your configuration in /etc/apache2/apache2.conf (or /etc/apache2/httpd.conf in some versions). In these files there is a <Directory "/var/www/"> block that covers your /var/www/ directory. Inside this Directory block you can set AllowOverride to All. With this configuration, every site on your server will be allowed to use .htaccess.
I recommend the virtual host approach, which gives an easier and cleaner configuration; a sketch follows below.
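A minimal sketch of such a virtual host, assuming the WordPress files live in /var/www/mysite and the domain is mysite.com (both are placeholders, adjust them to your setup):

<VirtualHost *:80>
ServerName mysite.com
DocumentRoot /var/www/mysite
<Directory /var/www/mysite>
# Let WordPress's .htaccess (rewrite rules etc.) take effect
AllowOverride All
# Apache 2.4 syntax; on 2.2 use "Order allow,deny" plus "Allow from all" instead
Require all granted
</Directory>
</VirtualHost>

On Debian/Ubuntu, remember to enable the site with a2ensite and reload Apache afterwards.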
Check the comments in your .htaccess file: if they start with // then change them to # (Apache's comment character).
For me this solved the problem; I hope it does for you too.
I have a home server running Apache with its document root at /var/www/html. I also created a self-signed SSL certificate, with DocumentRoot /var/www/html set in /etc/apache2/sites-available/default-ssl.conf.
How can I exclude a file inside html from SSL? For example, my current server is https://myserver.com, but I want to use http://myserver.com/nossl/api.php.
What should I add to the .htaccess of that nossl folder?
This is my current .htaccess to exclude authentication:
<Files main.php>
AuthType none
Satisfy any
Order Allow,Deny
Allow from all
</Files>
Since I couldn't find any direct solution to this, I went the other way and set up free SSL using Let's Encrypt.
It's working great on multiple servers.
I have a PHP script that is run from cron to send out reminder emails.
To prevent unauthorised use of this script, I have the following .htaccess file, which I switched to the Require directive after upgrading Apache from 2.2 to 2.4.
<Files "reminder.php">
Require all denied
Require host localhost
Require ip 127.0.0.1
Require ip xxx.yyy.zzz.aaa
</Files>
xxx.yyy.zzz.aaa is the address of the webserver, equivalent to localhost.
Whereas the old .htaccess file used to work, this one isn't preventing access from remote browsers. I've read and reread all the directive documentation and can't see what is wrong. Any clues? Is this the best way to protect a PHP script designed to run from cron?
The old .htaccess file was:
<Files "reminder.php">
Order Deny,Allow
Deny from all
Allow from localhost
Allow from 127.0.0.1
Allow from xxx.yyy.zzz.aaa
</Files>
I found the problem. When I set up the 2.4 server, I explicitly used
AllowOverride None
and didn't override this in specific directories. By the way, AllowOverride defaulted to All in 2.2 and defaults to None in 2.4, so without the directive I would still have had the same problem.
So replacing this with
AllowOverride All
within the <Directory> block fixed the problem. The .htaccess file is now allowed to do its job.
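For reference, a sketch of what that looks like in the 2.4 configuration (the path is a placeholder; use the directory that actually holds reminder.php):

<Directory /var/www/mysite>
# Re-enable .htaccess processing for this tree
AllowOverride All
</Directory>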
I have a Django site running on an EC2 instance (Ubuntu) under Apache 2 using mod_wsgi. I have placed it in /var/www/django_project. It is up and running without any issues. Now I want to host another site (PHP, MySQL) on this same EC2 instance. I tried to configure my httpd.conf and added the PHP directory with proper permissions, but I believe that, because of the mod_wsgi alias mapped to '/', any request under '/' is taken up by Django.
I DO NOT have any domain name. I access my Django site with the IP of the machine (i.e. w.x.y.z/django_app).
Correct me if I am wrong: since I do not have a server name, I cannot have both sites running on port 80 using virtual hosts.
And I do not mind running them on different ports either. Please suggest a way to host the PHP site on this server: which file to configure and how to configure it.
My httpd.conf file:
Alias /static /var/www/resumerepo/static
<Directory /var/www/resumerepo/static>
Order deny,allow
Allow from all
</Directory>
WSGIScriptAlias / /var/www/resumerepo/resumerepo/wsgi.py
WSGIPythonPath /var/www/resumerepo
<Directory /var/www/resumerepo>
<Files wsgi.py>
Order deny,allow
Allow from all
</Files>
</Directory>
It works fine and my app is accessible. However, if I put it in one virtual host and make another virtual host, the Apache restart throws an error saying WSGIPythonPath cannot be used inside a virtual host.
You can't do this easily if your Django project is accessible via the root of your domain (or your IP), for example: http://1.2.3.4 -> leads to your Django project.
I think one solution could be to move your Django project to a subdirectory, like http://1.2.3.4/django, and make your PHP project accessible in a subdirectory as well, like http://1.2.3.4/php (a sketch follows below).
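A rough sketch of that first option, reusing the directives already in your httpd.conf (the /var/www/php path is a placeholder for wherever the PHP project lives):

# Django mounted under /django instead of /
WSGIScriptAlias /django /var/www/resumerepo/resumerepo/wsgi.py
WSGIPythonPath /var/www/resumerepo

# PHP project under /php
Alias /php /var/www/php
<Directory /var/www/php>
Order allow,deny
Allow from all
</Directory>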
Or you can create a new virtual host for your PHP project, listening on port 8080 for example. This way:
http://1.2.3.4 leads to your Django project
http://1.2.3.4:8080 leads to your PHP project
The second option may be easier to set up, as you won't have to change the config for your Django project.
Your vhost file could look like:
<VirtualHost *:8080>
ServerAdmin contact@yourdomain
DocumentRoot /var/www/php
<Directory /var/www/php>
AllowOverride All
Order allow,deny
Allow from all
</Directory>
</VirtualHost>
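One detail worth checking: Apache only answers on ports it has been told to listen on, so port 8080 must also appear in the main config (ports.conf on Debian/Ubuntu, or httpd.conf):

# Keep the existing Listen 80 line and add:
Listen 8080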
Maybe this won't work: according to the Apache documentation, you should not use virtual hosts without a ServerName.
I'm loading my files (pdf, doc, flv, etc.) into a buffer and serving them to my users with a script. I need my script to be able to access the files but not allow direct access to them. What's the best way to achieve this? Should I be doing something with my permissions, or locking down the directory with .htaccess?
The safest way is to put the files you want kept to yourself outside of the web root directory, like Damien suggested. This works because the web server only serves what sits under the document root (or is explicitly aliased), while your script can read anything the file system permissions allow.
However, a lot of hosting companies only give you access to the web root. To still prevent HTTP requests to the files, put them into a directory by themselves with a .htaccess file that blocks all communication. For example:
Order deny,allow
Deny from all
Your web server, and therefore your server-side language, will still be able to read the files, because the .htaccess rules only block HTTP requests; the directory's file system permissions still allow the server process to read them.
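As a sketch of the serving script itself (file names and paths here are hypothetical, not taken from the question), the idea is that PHP reads the protected file through the file system and streams it to the user:

<?php
// download.php - hypothetical example of handing out a protected file
$file = '/var/www/protected/report.pdf'; // stored outside the web root, or in the blocked directory

// ... your own authorisation check goes here ...

header('Content-Type: application/pdf');
header('Content-Length: ' . filesize($file));
header('Content-Disposition: attachment; filename="report.pdf"');
readfile($file); // PHP reads via the file system, so the HTTP block does not apply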
This is how I prevented direct URL access to my .ini files. Paste the following code into a .htaccess file in the document root (no need to create an extra folder):
<Files ~ "\.ini$">
Order allow,deny
Deny from all
</Files>
My settings.ini file is in the root, and without this code it is accessible at www.mydomain.com/settings.ini.
Put this in httpd.conf to block browser and wget access to include files, especially ones like db.inc or config.inc. Note that you cannot simply list several file types in one <Files> directive; create multiple <Files> blocks instead (or see the regex alternative sketched after the example).
<Files ~ "\.inc$">
Order allow,deny
Deny from all
</Files>
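The regex alternative mentioned above (my addition, not part of the original note): since the ~ form already takes a regular expression, a single FilesMatch with alternation can cover several extensions at once:

<FilesMatch "\.(inc|ini)$">
Order allow,deny
Deny from all
</FilesMatch>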
To test your config before restarting Apache:
service httpd configtest
then do a graceful restart:
service httpd graceful
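Those are the Red Hat style service commands; on Debian/Ubuntu (where the config lives under /etc/apache2, as elsewhere in this thread) the equivalents should be:

apache2ctl configtest
apache2ctl graceful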
Are the files on the same server as the PHP script? If so, just keep the files out of the web root and make sure your PHP script has read permissions for wherever they're stored.
If you have access to your httpd.conf file (on Ubuntu it is in the /etc/apache2 directory), you should add the same lines that you would put in the .htaccess file, but inside a <Directory> block for the specific directory. That is (for example):
ServerName YOURSERVERNAMEHERE
<Directory /var/www/>
AllowOverride None
Order deny,allow
Options -Indexes +FollowSymLinks
</Directory>
Do this for every directory whose information you want to control, and you will have one file in one spot to manage all access. In the example above, I did it for the root directory, /var/www.
This option may not be available with outsourced hosting, especially shared hosting. But it is a better option than adding many .htaccess files.
To prevent web access to .ini files, put the following into apache2.conf:
<Files ~ "\.ini$">
Order allow,deny
Deny from all
</Files>
How about a module-check based .htaccess snippet (like the one used in CodeIgniter)? I tried it and it worked well in CodeIgniter apps. Any ideas for using it in other apps?
<IfModule authz_core_module>
# Apache 2.4: the authz_core module is present, so use the Require syntax
Require all denied
</IfModule>
<IfModule !authz_core_module>
# Apache 2.2 and older: fall back to the legacy Deny syntax
Deny from all
</IfModule>