I set up my IX Webhosting to conduct the following cronjob every 15 minutes.
/usr/bin/wget -O- http://xxx.com/php/xxx.php
I want the php folder to be blocked off from all outside requests for security reasons, so I set the .htaccess file to deny from all. But when the .htaccess file is present the cron job is denied with a 403 error.
I thought that server-side cronjobs are not blocked by .htaccess? Is there any way to get around this?
This is still a web request from localhost, since wget is a regular HTTP client requesting the page, and the web server is serving it (even though it may produce no output for humans to read). Instead of denying all, allow localhost:
Order deny,allow
Allow from 127.0.0.1
Deny from all
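If the host runs Apache 2.4, where Order/Allow/Deny are deprecated, the rough equivalent (assuming mod_authz_host) is:
Require local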
You can add something like:
Order deny,allow
Allow from 192.168.0.1/24
Allow from 127
Deny from all
in your .htaccess to allow access from the local intranet range (192.168.0.1/24) and from localhost (127.0.0.1).
Change the cron job to:
/usr/bin/php /path/to/web/root/php/greader_forceupdate.php xxxx
/usr/bin/php may need to be adjusted if PHP is installed elsewhere.
/path/to/web/root is the path on the filesystem.
To access the parameter, use $_SERVER['argv'][1].
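For example, a minimal sketch of how the script might read that argument when run from the CLI (the filename and the meaning of xxxx are just taken from the cron line above):
<?php
// greader_forceupdate.php -- invoked as: /usr/bin/php .../greader_forceupdate.php xxxx
$arg = isset($_SERVER['argv'][1]) ? $_SERVER['argv'][1] : null;
if ($arg === null) {
    die("Missing argument\n");
}
// ... run the update using $arg ...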
In a situation where you have a shared hosting plan with a provider like GoDaddy, without full access to the server, is there anything I can do to disallow outside HTTP requests for *.php files?
If anyone has experience with shared hosting, or with GoDaddy specifically, that would be appreciated. I'm with GoDaddy, and the only thing I can try is to adjust the User/Group/World permissions of a PHP file, but no combination enables server-only access to a file. And, obviously, I don't have access to the Apache server's config file, which would be the easiest solution.
outside HTTP requests for *.php files
I'll interpret this as requests from outside a set of people you're willing to share the pages with. You want to limit access to your site.
Easiest approach -
Use an .htaccess file in DocumentRoot that limits access by IP address (if you are willing to force all the people who use the files to work from a limited set of IP addresses)
https://httpd.apache.org/docs/current/howto/htaccess.html
For Apache 2.2, in the .htaccess file, put
Order Deny,Allow
Deny from all
Allow from 1.2.3.4
For Apache 2.4, use
Require ip 1.2.3.4
Another quick solution would be to password protect the directory with the .php files. The configuration would look something like this:
AuthType Basic
AuthName "Restricted Files"
# (Following line optional)
AuthBasicProvider file
AuthUserFile "/usr/local/apache/passwd/passwords"
Require user goodguy
https://httpd.apache.org/docs/current/howto/auth.html#lettingmorethanonepersonin
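The password file referenced by AuthUserFile is created with the htpasswd utility that ships with Apache, for example:
htpasswd -c /usr/local/apache/passwd/passwords goodguy
(-c creates the file; omit it when adding further users.)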
You may want to find hosting with SSH access to give you more control.
I am making a PHP program to run on a local network and I want to give it to some other people to use, but I don't want to ask them to install XAMPP or similar programs (XAMPP is about 100 MB and they can't download it). Is there any other way to set up an Apache and MySQL server more easily on a system? For example, they run a batch file and it sets up a server on port 80 and copies my scripts into its htdocs folder (or something like that).
My second problem with XAMPP is phpMyAdmin, which allows every device on the local network to manage my program's databases and change them. I need a way to disable phpMyAdmin on client devices.
Sorry for bad English. :)
XAMPP on Windows is set up to allow phpMyAdmin from your local network. To disable access from the network, open httpd-xampp.conf, located at xampp-folder\apache\conf\extra\httpd-xampp.conf. Near the bottom of this configuration file is the LocationMatch block. Even though you might expect the default Order deny,allow with Deny from all to block everything, the configuration by default also allows a set of local and private address ranges. Remove the ranges you don't want (or the whole Allow from line to lock phpMyAdmin down completely) and you are set. Change this:
<LocationMatch "^/(?i:(?:xampp|security|licenses|phpmyadmin|webalizer|server-status|server-info))">
Order deny,allow
Deny from all
Allow from ::1 127.0.0.0/8 \
fc00::/7 10.0.0.0/8 172.16.0.0/12 \
fe80::/10 169.254.0.0/16
ErrorDocument 403 /error/XAMPP_FORBIDDEN.html.var
</LocationMatch>
Don't forget to restart Apache. If you removed the Allow line entirely, http://localhost/phpmyadmin will now also result in a 403 Forbidden error.
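Newer XAMPP releases ship Apache 2.4, where the same block usually looks more like this (a sketch; your file may differ):
<LocationMatch "^/(?i:(?:xampp|licenses|phpmyadmin|webalizer|server-status|server-info))">
Require local
ErrorDocument 403 /error/XAMPP_FORBIDDEN.html.var
</LocationMatch>
Removing or tightening the Require line there has the same effect.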
If your computer and the other computers (the ones that want access) are on the same network, the other computers can access the web application from their browsers by changing localhost/yourproject to [your-ip-address]/yourproject. This works not only from computers but also from mobile devices.
The only other way is to host it on the web.
I have a cron job that is located in a folder inside the normal web directory.
I am trying to not allow access to this file by anyone else EXCEPT my server for running the cron job.
I tried:
order deny,allow
deny from all
allow from 127.0.0.1
But no luck. I have gone down the route of putting the cron job outside the web root, but I could not get it to run no matter what my host and I tried.
Thanks.
Two things here: (a) getting the cronjob to run, (b) access restriction.
Cronjob
New crontab entry:
*/10 * * * * /usr/bin/php /somewhere/www/cronjob.php
Set correct permissions on cronjob.php:
Execute flag: chmod +x /somewhere/www/cronjob.php
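Note: the execute flag only matters if you invoke the script directly instead of via /usr/bin/php; in that case the file needs a shebang line. A minimal sketch (hypothetical contents):
#!/usr/bin/php
<?php
// runs under the cron user's CLI, not the webserver
echo "cron ran at " . date('c') . "\n";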
Access Restriction
In general, it is good practice to place the script files for cronjobs outside of the www path.
If you really need to place them in www, then you might protect them with an access restriction. For the webserver Apache, this would work via .htaccess, like so:
.htaccess inside /somewhere/www/:
<Files "cronjob.php">
Order Allow,Deny
Deny from all
</Files>
This protects the file cronjob.php from outside access, but allows cron to execute the file.
If nothing works, follow my step by step guide: https://stackoverflow.com/a/22744360/1163786
You can restrict the access by setting an environment variable in the crontab file:
SCRIPT_RUN_ENV=mycronenv
and validate the environment string within the script:
if (getenv('SCRIPT_RUN_ENV') != 'mycronenv') {
die('Access denied');
}
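Put together, the crontab might look something like this (path and schedule are placeholders):
SCRIPT_RUN_ENV=mycronenv
*/15 * * * * /usr/bin/php /path/to/script.php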
Or you can restrict access by IP:
if ($_SERVER['REMOTE_ADDR'] != $_SERVER['SERVER_ADDR'] && $_SERVER['REMOTE_ADDR'] != "127.0.0.1") {
die('Access denied!');
}
And you can restrict access to your script file through .htaccess like:
Order deny,allow
Allow from THIS_SERVER_IP
Allow from 127.0.0.1
Deny from all
I have several folders in my website directory which contain text files called "secure.txt".
For security reasons the URLs of these files are never shown in the web browser, but my website reads these files (via PHP code), and they contain sensitive information.
How can I let the PHP code read these files but block access through the URL, so a potential hacker wouldn't be able to read their content?
Put them outside your document root folder and place the following .htaccess file in the folder you want to protect. If you don't want to allow access from a particular IP range, remove the last line.
order deny,allow
deny from all
allow from 192.168.0
[EDIT:]
To allow PHP scripts, etc., allow localhost (127.0.0.1):
order deny,allow
deny from all
allow from 127.0.0.1
You should put them in another folder and make the .htaccess deny from all, allow from 127.0.0.1
Old trick for that: Prefix the files with <?php die("I'm afraid I can't do that, Jim"); ?>, and call them *.php. On parsing, ignore the prefix.
Edit
Why do this? The rationale behind it is that you avoid a dependency on some special webserver configuration, which can be forgotten (on moving to a new server), unavailable (many cheap hosts don't give you .htaccess), not applicable to some webserver software (IIS), etc.
So the reasoning is to trade some computational overhead against flexibility, portability and platform independence.
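A minimal sketch of the "ignore the prefix" part, assuming the data file was renamed to secure.txt.php with the die() line prepended:
<?php
// read the protected file and strip the leading PHP die() guard
$raw  = file_get_contents(__DIR__ . '/secure.txt.php');
$data = preg_replace('/^<\?php.*?\?>\s*/s', '', $raw, 1);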
Can you move them out of your website directory altogether? If so, then make sure PHP has access to that directory! (The open_basedir value will need to include it.)
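For example (the path is hypothetical; use whichever directory you moved the files to):
<?php
// this directory is outside the document root, so it can't be reached via a URL
$secret = file_get_contents('/home/user/private/secure.txt');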
I'd suggest moving the files out of the webroot to be on the safe side.
If you use Apache, deny access to all files named secure.txt from within httpd.conf:
<Files secure.txt>
deny from all
</Files>
You may do the same via .htaccess files as well (if your server is configured to allow override access rights from htaccess).
But a better solution would be to include the sensitive information into your PHP scripts themselves, or to store it in a database.
I don't know what is the exact term for this, so my title could be incorrect.
Basically what I what to do is to write a PHP script that has an input field:
Domain Name: <input type='text' name='dname' id='dname' value='http://example.com' />
<input type='submit' name='addname' value='Add A Domain' />
When users type their own domain into the text field and press submit, the script will automatically make a directory, copy some PHP scripts there, and map the domain name to it. (The domain name points to our server, of course.)
I have already figured out the mkdir() and copy() part, but I couldn't figure out the mapping part. How to add an entry to map http://example.com to /home/user/public_html/copy1/ automatically, using PHP?
While you could do that directly from your PHP page, I suggest not doing so, for many reasons, ranging from a high risk of failure (if page execution gets interrupted suddenly, for example) to security risks (your httpd user would have write access to its own configuration plus parts of the filesystem where it shouldn't).
Some time ago I wrote a similar "website creation control panel" that works pretty much this way:
The PHP script receives the website creation request and stores it somewhere (e.g. in a database; see the sketch after this list).
Another script, running as root via cron every, let's say, five minutes, checks the website creation request queue. If there is any pending request, it will:
Mark the site creation task as "locked"
Create the directory at appropriate location, populate with scripts etc.
Change all the permissions as needed
Create new virtualhost configuration, and enable it
Make the webserver reload its own configuration
Mark the site creation task as "done"
The second script can be written in whatever language you like, PHP, Python, Bash, ...
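As a rough sketch of the first step (the user-facing PHP page queuing the request), assuming a PDO connection in $pdo, an already-sanitized domain in $sanitizedDomain, and a hypothetical pending_sites table:
<?php
// store the request; the cron-driven script will pick it up later
$stmt = $pdo->prepare('INSERT INTO pending_sites (domain, status) VALUES (?, ?)');
$stmt->execute([$sanitizedDomain, 'pending']);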
About apache configuration
To create a directory "mapped" to your domain, you could use something like this:
First of all, "slugify" your domain name. I usually take the (sanitized!) domain name and convert all dots to a double dash (which is not a valid part of a domain name). I don't like dots in file/directory names apart from the file extension separator.
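A tiny PHP helper along those lines (just a sketch; sanitize and validate the domain before this point):
<?php
// 'www.example.com' -> 'www--example--com'
function slugifyDomain($domain) {
    return str_replace('.', '--', strtolower($domain));
}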
Then, you can create something like this (assuming domain is www.example.com):
<VirtualHost *:80>
ServerName www.example.com
DocumentRoot /my-sites/wwwroot/www--example--com
<Directory "/my-sites/wwwroot/www--example--com">
Options -Indexes +FollowSymLinks
Order allow,deny
Allow from all
AllowOverride All
</Directory>
ErrorLog /var/log/apache2/www--example--com_error.log
CustomLog /var/log/apache2/www--example--com_access.log vhost_combined
</VirtualHost>
<VirtualHost *:80>
## redirect non-www to www
ServerName example.com
RedirectMatch permanent ^(.*) http://www.example.com$1
</VirtualHost>
Assuming you are placing your site files in directories like /my-sites/wwwroot/www--example--com.
Some security-related improvements
In my case, I preferred not to run the second script as root either; to do so you have to make some changes so that a less privileged user can do certain things on your system:
Create a directory with write access for your user, let's say /my-sites/conf/vhosts
Create an apache virtualhost containing the following line: Include "/my-sites/conf/vhosts/*.vhost", and enable it
Then, let your user reload the Apache configuration by installing sudo and adding this to your /etc/sudoers:
Cmnd_Alias A2RELOAD = /usr/sbin/apache2ctl graceful
youruser ALL=NOPASSWD: A2RELOAD
%yourgroup ALL = NOPASSWD: A2RELOAD
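The unprivileged script can then reload Apache without a password prompt:
sudo /usr/sbin/apache2ctl graceful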
And, of course, also give write permissions to your websites base directory to the user that will be used to run the script.
I think you need to add a VirtualHost to the Apache config (such as httpd.conf),
like this:
<VirtualHost *:80>
ServerName example.com
DocumentRoot /home/user/public_html/copy1/
</VirtualHost>
Apache documents for virtual host config:
http://httpd.apache.org/docs/2.2/vhosts/name-based.html
http://httpd.apache.org/docs/2.2/vhosts/mass.html
It has been over 10 months now, so I'll just share what I have found.
While writing directly to httpd.conf seemed the only way, we recently changed servers. It caused us so much trouble with file/folder permissions, and the hosting company refused to help us due to security concerns.
So I took a second look and discovered that, since I am using cPanel for hosting, I can use the Addon Domain feature to create and add new domains.
But it has its limits. Since it is an addon domain, we can no longer limit the bandwidth and disk usage per domain. Everything is shared. But that doesn't matter to us anyway.
So far it works. You can do it either by using the cPanel API library available on the official cPanel website, or by making direct URL requests to create the domain. Since we are making a new WordPress install, we create a new database as well.
http://{username}:{password}@{mysite.com}:2082/frontend/x3/sql/addb.html?db={dbname}
http://{username}:{password}@{mysite.com}:2082/frontend/x3/sql/adduser.html?user={dbuser}&pass={dbpass}
http://{username}:{password}@{mysite.com}:2082/frontend/x3/sql/addusertodb.html?db={dbname}&user={dbuser}&ALL=ALL
http://{username}:{password}@{mysite.com}:2082/frontend/x3/addon/doadddomain.html?domain={newsite.com}&user={ftp_user}&dir={ftp_dir}&pass={ftp_pass}&pass2={ftp_pass}
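A rough sketch of issuing one of those requests from PHP with cURL (credentials, hostname, and database name are placeholders taken from the URLs above):
<?php
// call the cPanel "add database" endpoint over HTTP Basic auth
$ch = curl_init('http://mysite.com:2082/frontend/x3/sql/addb.html?db=dbname');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_USERPWD, 'username:password');
$response = curl_exec($ch);
curl_close($ch);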
It doesn't have to be WordPress. You can use this to install Joomla, Drupal, phpBB, or even your own custom script.
Hope it helps anyone who is reading this. Thanks.