I've been learning to set up servers for my web apps, and have found that my favourite (fastest and easiest to get going) setup is CentOS 5.5 / Lighttpd / FastCGI and SQL. I don't, however, know how secure these are out of the box. I installed them using Yum and have modified some settings to encourage PHP to play ball; is there anything I should be doing to increase my security levels and prevent tampering with my scripts?
The server doesn't have FTP, mail, any users besides root, or anything else installed at all, and all directories are owned by lighttpd:lighttpd with no world permissions. The wider world won't ever be using the apps I'm writing; they are for me and for my employees/partners to keep track of money and clients (hence my wish for them to be secure).
Thanks guys!
If you are talking about servers (plural) and you have the budget and ability, I would encourage you to make only servers that serve static content client-facing. Move your PHP and SQL back to internal-only hosts.
Web server with 80/443 open to the world and the SSH port open only to trusted IPs, or listening only on an internal interface you can access.
Application server with port 80 listening only to requests from the front-end web server, through a private IP address if possible. Otherwise, restrict its access to the public IPs of the front-end web servers and consider using HTTPS (443) for communication between the two.
Your SQL server/instance should follow the same concept, being accessible only from the application server.
This gives you multiple levels of security and dedicated resources to process specific tasks (front-end web serving / middleware application serving / back-end data services).
In addition, if your front end is compromised, the attacker won't have immediate access to your PHP source or the database content.
If it is a single server, ensure only 80/443 are open to the world, and make sure you have a firewall (or firewall concepts) in place to restrict or deny access to all other ports except from trusted sources. Consider moving SSH from the default port 22 to an alternate port.
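As a sketch of that single-server policy with iptables (2222 as the relocated SSH port and 203.0.113.10 as a trusted admin address are placeholders to adjust):

```shell
# drop inbound by default, but keep loopback and replies working
iptables -P INPUT DROP
iptables -A INPUT -i lo -j ACCEPT
iptables -A INPUT -m state --state ESTABLISHED,RELATED -j ACCEPT

# web ports open to the world
iptables -A INPUT -p tcp --dport 80 -j ACCEPT
iptables -A INPUT -p tcp --dport 443 -j ACCEPT

# SSH (moved off port 22) only from a trusted source
iptables -A INPUT -p tcp -s 203.0.113.10 --dport 2222 -j ACCEPT
```

Test these rules from a second machine before logging out, so a mistake doesn't lock you out of your own server.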
I am currently working on releasing my project management software to the internet. It is the first tool I am going to deploy on the web, and I am concerned about some security factors. At the moment the tool is running on Apache on port 80 (HTTPS is still to be done) with MySQL and PHP. I have a domain name which points to the public IP of my Windows server, on which port 80 is open for access.
I am now thinking about deploying a letsencrypt certificate in combination with the win-acme letsencrypt client.
Can the procedure be considered as safe? I would be happy if you could provide me some feedback or improvements.
I have used Let's encrypt for many websites. I have never faced any problems with it. Just make sure you install it correctly.
I suggest using a firewall and an SSH connection with strong passwords. Set up your firewall to allow incoming HTTPS connections (port 443). You can deny plain HTTP if you want, and allow only specific IP addresses and port ranges.
Beyond that, it will depend on how secure your code is. One of the most common web hacking techniques is SQL injection; I suggest using PDO with prepared statements.
Hope it helps!
My office network provides internet access to my employees when they connect through the office's router. I want to make a web application that only computers connecting to the internet through my office router can access, so that my employees have to be in my office area before they can log in to the PHP web application.
If they are connected to the internet, but not through my office network, they should not be able to log in. (I know I could have deployed the PHP app on a local server in my office, but I want the app on a remote server on the internet for personal reasons.)
What hardware do I need to set up my office network, and how do I make PHP identify my network's hardware so it can determine that a request is coming from my network?
Some options to recognise your private office from a public website:
IP address
This will only work if you know what IP address your allowed clients are using at any given time. If you use a NAT gateway, this has to be the outside address.
It is pretty easy to do if all your allowed clients have static IP addresses; if they change, keeping the list right at all times quickly becomes a nightmare.
Security: since HTTP is based on TCP, it's not trivial for others to use your IP address through spoofing, but it's far from foolproof either. Consider it a poor man's solution at the very best.
Caveat: if any of your staff can reach their office machine remotely, they can access the application through it (so e.g. a time-registration system will quickly get circumvented this way).
VPN
VPN stands for Virtual Private Network.
This is the go-to solution from a security perspective. Essentially you build tunnels between the VPN server and either individual clients or networks as a whole.
On the central end of those tunnel(s), your webserver answers to web requests (but not to the internet at large).
There is a whole range of VPN products out there, and there are equally easy-to-build solutions using free software (e.g. OpenVPN).
How the clients (networks or individual computers) authenticate to the server, which traffic is routed through the VPN, and much more are all parameters you can set.
Security: this depends a bit on the choices made, but unless unproven or outdated solutions are picked, it can be done "top notch". The skill level required is probably just above your typical IT shop around the corner (but you might be in luck).
Same remark as above: staff who can gain access to the VPN might tunnel into their machine at work, or might use the credentials and settings from an office machine at home as well.
DNS
Reverse mapping of IPs to names is far too easy to spoof; don't try this.
Login/Password
This is a relatively easy solution: allow access from anywhere, but give authorised users a login and password and let them have access after being logged in properly.
Security: it's non-trivial to get this fully secure; there is plenty of opportunity to make errors in how the application works, and those become a problem.
But if you have to have a zero footprint on the clients, this is your best option.
Add two-factor authentication to increase password security and make passing on passwords a bit more difficult.
TL;DR
I'd set up an OpenVPN-based VPN: they are relatively easy to set up, clients exist for most OSes (check availability for the platforms you need), and it will give you better-than-average protection without you having to delve deeply into the details of encryption protocols and the like.
There is still a learning curve, but there are plenty of tutorials out there that don't assume much prior knowledge.
For your clients, you set up certificate-based authentication using EasyRSA (included with OpenVPN). It's a habit you need to build, but once set up properly, adding and removing users becomes relatively painless.
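Day-to-day, the EasyRSA side is only a couple of commands; a sketch, with `alice` as a placeholder client name:

```shell
# issue a certificate/key pair for a new user
./easyrsa build-client-full alice nopass

# when someone leaves: revoke their certificate and refresh the CRL
./easyrsa revoke alice
./easyrsa gen-crl
```

Point OpenVPN at the generated CRL so revoked users are actually rejected at connect time.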
On your server, all you need to do is make sure the HTTP server binds only to the IP address of the tunnel interface.
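With Apache, for instance, that binding is a one-line change (10.8.0.1 is OpenVPN's default server-side tunnel address; adjust to yours):

```apache
# httpd.conf — answer only on the VPN tunnel interface, not on 0.0.0.0
Listen 10.8.0.1:80
```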
So this is the situation: I have a bunch of Arduinos and Raspberry Pis, along with an Ubuntu server, on a local network. The Arduinos and Pis routinely communicate with that local server using HTTP GET and POST requests to PHP scripts.
This local server sometimes "fetches" something from a remote server in the cloud (also via GET requests) in order to respond to local requests from the Arduinos and Pis.
Now here's the problem: the local server has no issues reaching the remote server, but what if I want the server in the cloud to send a GET request to the local server?
This part is confusing to me, as the local server is on a regular LAN and connects to the internet via a router through a commercial ISP that issues dynamic IPs.
How can I send GET requests from an "online" server to a local server?
Please note that both servers are running Apache/PHP/MySQL on Ubuntu 14.04.
Thanks a ton in advance!
You will need two steps to accomplish that.
step 1 - make router forward external requests to LAN server
step 2 - make external server know the current dynamic WAN ip
step 1:
The router has to be configured to forward WAN requests to your LAN server. Assuming you use a normal home router, you typically point your browser at the router's IP address and log in. Then you have to find where to configure forwarding (unfortunately, the naming of this feature varies from router to router).
While you can typically define an "exposed host" to which all external requests go, you are better off, in terms of security, forwarding only specific ports to your server. As you are going to use HTTP, the standard ports here are 80 (HTTP) and 443 (HTTPS). So, assuming you use HTTPS on the default port, a typical forwarding would be:
router WAN ip, port 443 --> server LAN ip, port 443
This forwards any external request to the router on port 443 to your internal server on port 443.
Now your server should be able to receive those requests, but you still would need to know your router's current dynamic WAN ip.
step 2:
As your router's WAN ip changes from time to time, you need to somehow announce that ip to your external server.
One easy way of doing this is to use an external service that provides you with a hostname that resolves to your current IP. This is often referred to as DDNS or dynamic DNS. One quite well-known DDNS provider is https://dyn.com/dns/, but there are plenty of others, and you will even find free ones. After registering with such a provider, you will be given a hostname which your external server can use instead of the IP.
Now you still have to let the DDNS provider know your current dynamic WAN IP. The easiest way to do this again involves your router: check its configuration for DDNS settings. Routers typically support this feature, often with specific providers pre-configured. Set up your router with the credentials you got from the DDNS provider.
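If your router happens not to support DDNS, a small update client on the LAN server does the same job; a ddclient sketch (provider, hostname and credentials below are placeholders):

```
# /etc/ddclient.conf
protocol=dyndns2
use=web, web=checkip.dyndns.org
server=members.dyndns.org
login=your-login
password='your-password'
yourhost.example.com
```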
Now everything is set. You should be able to send requests to your internal server by using the URL you got from your DDNS provider, while your router both forwards such requests and notifies the DDNS provider about any ip changes.
A word of warning: you have just exposed your local server to the internet. You will have to treat it like any other server on the internet to keep it safe, including careful configuration, installing security updates and so on.
You have to open a port on your router and specify where the router should direct the request. Let's assume your external IP is 80.82.71.24; going to this address (e.g. http://80.82.71.24) leads to your router. The router then decides what to do with the request; normally it would time out or be refused. But if you specify on the router that requests to a certain port (TCP or UDP) should be forwarded to a certain internal IP (the local server's), then it's possible to do what you want.
But to do this you need to read up on your router; first of all, see if you can log in to it. Could you specify what router you use, and whether your internet connection is your own or shared (e.g. campus, school, etc.)?
By the way, it would not be a good idea to open the port to the whole world, so consider allowing only your cloud server's IP to access that specific port.
I'm new to PHP, so I don't know how to explain it. I'm running WAMP on my computer and I would like to be able to access my localhost from another computer.
Is it possible? How can I do this?
This is provided that all machines are on the same network and that you have administrative privileges on them (you'll have to edit some system files).
You can easily do this, but it has to be a manual process.
You have to create an entry in the hosts file -
On Windows machines it is located at %SystemRoot%\system32\drivers\etc\hosts
On UNIX like systems it is located in /etc/hosts
http://en.wikipedia.org/wiki/Hosts_(file)#Location_in_the_file_system.
See the link for details on where your hosts file is located. It depends on the operating system.
The following has to be done on every machine that should have access to your localhost machine.
Add a line at the very end of your hosts file similar to this :
10.0.0.42 prathyash-localhost.com
The IP address (10.0.0.42 in the example above) is the address of your localhost machine, i.e. your computer's IP address. The domain name (prathyash-localhost.com) is what gets mapped to that IP address.
After you save that file, whenever that computer requests prathyash-localhost.com, it will be directed to your IP address. Firewalls are still a barrier, but the other answers covered that, so I won't repeat their contribution.
Depending on your situation, manually editing tens or maybe hundreds of files might not be feasible. In that case you might want to consult your network administrator (he probably hangs around on Server Fault), who may have a better solution for you.
This problem can be fixed as follows (this is for anyone using WAMP or a similar local server).
First ensure that you have modified httpd.conf: scroll until you find this section and make sure it reads:
# onlineoffline tag - don't remove
Order Allow,Deny
Allow from all
If you have a smartphone, turn on its Wi-Fi hotspot and connect both your PC and the one you want to connect from.
Open the command prompt on your PC and type ipconfig. Note down your PC's IPv4 address (e.g. 192.168.43.47) under "Wireless LAN adapter Wireless Network Connection".
On the PC you want to connect from, set "Obtain IP address automatically".
Before you connect, ensure your WAMP server is online.
Open the browser on the client PC and type in the IP address noted down earlier. This should work just fine; in some cases you may need to switch off your antivirus.
Yes, if they are on the same network: simply target the computer's IP address and ensure that nothing on either computer (e.g. a firewall) blocks access to port 80.
Shaun Hare explained it pretty well; however, if the computers are not on the same network (my case, when a remote presentation is needed), you also need to set up port forwarding on your router, and the remote side needs the router's public IP address.
Basically, remote side would enter http://123.123.123.123/index.php in their browser and router would point that request (via port forwarding) to WAMP server installed at 192.168.10.10 (for instance).
You can't. Bind the appropriate daemon to 0.0.0.0/:: or an external interface and use the machine's IP address.
If it's for testing you could use a service like http://localhost.run/ or https://ngrok.com/ to temporarily put localhost on the internet.
Port forward port 80 in your router configuration and start WAMP. Now when your IP address is accessed from any external machine, it will go to the "www" folder and show the index file. If you are not able to do so, it means your firewall is blocking the request: disable it and try again.
You could just tinker with the firewall. I found that the inbound and outbound rules were blocking all public network traffic (that is, all traffic to my router, which is classed as a public network even though it has a password), so I checked the box to allow traffic on a public network, both inbound and outbound, for all the rules bearing the Apache name. I also turned on the MySQL server, though that shouldn't matter here (life has surprised me before, where something insignificant turned out to be quite significant, so try this only as a last resort). This should work at least over the same Wi-Fi network, since that's how I tested it with my Android phone. Hope this is of use to someone!
I have a classifieds website...
As you might imagine, as a webmaster (administrator) I need to sometimes remove classifieds, edit them etc etc.
I have my own Linux server, with root access of course.
Currently I have a section of my website containing all the administrative PHP scripts I use to remove classifieds, edit them, etc.:
/www/adm/ //Location of administrative tools
This section is currently protected by simple authentication via the apache2.conf file:
<Directory /var/www/adm>
AuthType Basic
AuthName "Adm"
AuthUserFile /path/to/password
Require user username
</Directory>
My question is: is this enough to prevent outsiders from accessing my administrative tools?
It would be devastating if somebody with the wrong intentions got their hands on these tools: they would be able to delete all records from my databases. I do have backups, but restoring would mean tons of work.
What is usually done in cases like this?
The only thing I can think of is to upload the administrative scripts whenever I plan to use them, and remove them from the server afterwards.
Other information which may help you decide what solution I should use:
I manage the website and server from one and the same computer
The IP address of that computer is dynamic
I use secure FTP to transfer files to the server
The administrative tools are PHP scripts which communicate with the databases
I have an iptables firewall set up to only allow connections to the database from my own server/website
I back up all files every day
Thanks
If anybody else has shell access to the server, you should be very careful with permissions.
Otherwise, basic Apache auth is OK, but keep in mind that over an unencrypted connection (no SSL) your password is sent as clear text across the web, so there's always the possibility of it being sniffed.
To enable SSL you need:
mod_ssl enabled in your Apache
a self-signed (free) certificate
your Apache configuration changed to include the SSL port
You can refer to this tutorial on how to enable SSL on Debian.
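For the certificate step, a self-signed certificate can be generated with a single openssl command (the key size, lifetime and domain name below are placeholders to adjust):

```shell
# create a 2048-bit RSA key and a self-signed certificate valid for one year
openssl req -x509 -newkey rsa:2048 -nodes \
  -keyout server.key -out server.crt -days 365 \
  -subj "/CN=admin.example.com"

# inspect what was generated
openssl x509 -noout -subject -in server.crt
```

Point SSLCertificateFile and SSLCertificateKeyFile in your Apache SSL config at the two generated files.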
A better option, on top of the usual password protection, IP restrictions, SSL, etc., is to host the tools on a completely separate domain. Someone might guess that you have example.com/admin and try to brute-force their way in, but hosting a simple login page on somecompletelydifferentdomain.com, with no branding or markings relating it to example.com, is a better defence yet.
Apache auth can also restrict by IP address, so if you have a static IP, using that plus a password should be pretty safe. I would also use AuthDigestFile instead of AuthUserFile if you're worried about attacks.
This page explains it well:
Unlike basic authentication, digest authentication always sends the password from the client browser to the server as an MD5-encrypted string, making it impossible for a packet sniffer to see the raw password.
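Concretely, switching the earlier <Directory> block from basic to digest auth looks roughly like this (paths are placeholders; exact directive names vary between Apache versions):

```apache
# first create the digest file (the realm must match AuthName):
#   htdigest -c /path/to/digest "Adm" username
<Directory /var/www/adm>
    AuthType Digest
    AuthName "Adm"
    AuthDigestProvider file
    AuthUserFile /path/to/digest
    Require user username
</Directory>
```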
If you must have direct remote access to the administrative tools, find an out-of-band way to prevent the web server from running them at all when they're not needed. You might, for example, do a chmod 000 /var/www/adm under normal circumstances, change it to something usable (say, 500) when you need to use them and back to 000 when you're done.
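That lock/unlock toggle is easy to script; a sketch using a temporary stand-in directory for /var/www/adm:

```shell
adm=/tmp/adm-demo              # stand-in for /var/www/adm
mkdir -p "$adm"

chmod 000 "$adm"               # locked: no permissions at all
stat -c '%a' "$adm"            # prints 0

chmod 500 "$adm"               # unlocked: owner may read and traverse
stat -c '%a' "$adm"            # prints 500

chmod 700 "$adm" && rm -r "$adm"   # clean up the demo directory
```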
Better would be to secure the entire path between you and the administrative tools:
Use port knocking to enable SSH on some port other than 22 (e.g., 2222).
Lock down the sshd on that port to whatever your requirements.
Run a separate instance of your web server that listens on a port other than 80 (e.g., 8080) that can't be seen from the outside and has configuration to allow access to /var/www/adm but restrict access to the local host only.
When it comes time to use the administrative tools:
Knock to open the SSH port.
SSH into port 2222 and establish a tunnel from port 8080 on your local machine to port 8080 on the server.
Use your browser to visit localhost:8080 and access your tools. The server will see the connection as coming from the local system.
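The tunnel step of that sequence is a single ssh invocation (user and host are placeholders):

```shell
# forward local port 8080 through SSH on 2222 to port 8080 on the server,
# then browse to http://localhost:8080/adm/
ssh -p 2222 -L 8080:localhost:8080 admin@yourserver
```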