LAN Architecture - Looking for Input - php

I currently have a web server and a file server on my LAN; both have static IPs on the LAN, and the router forwards inbound port 80 traffic to the web server, so the file server is only accessible internally.
I'm toying with the idea of writing a small PHP app that will let me interact with the file server via the web server. So, I wouldn't change anything at the router level - anyone scanning my public IP would still only find port 80 on the webserver.
My questions are pretty high level and don't necessarily have a "right answer". I can see a few ways of doing this, and I'm looking to open it up for input and ideas.
One is to exec ssh from the web server to the file server, and just run shell commands via exec(ssh) that return directory listings, scp files that I upload, etc. This is no better than giving someone root on my file server if they compromise my PHP front end, so I'm not too keen on that idea, but am considering it.
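For concreteness, here's a rough sketch of what I mean by the exec(ssh) approach (the hostname, user, key path, and share path are just placeholders, and arguments would need escaping):

<?php
// Hypothetical sketch: list a directory on the file server over SSH from the web server.
$dir = isset($_GET['dir']) ? basename($_GET['dir']) : '.';   // basename() to limit path traversal
$cmd = 'ssh -i /var/www/.ssh/id_rsa fileuser@fileserver.lan '
     . 'ls -l ' . escapeshellarg('/srv/share/' . $dir);
exec($cmd, $output, $status);
echo '<pre>' . htmlspecialchars(implode("\n", $output)) . '</pre>';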
Another way would be to just duplicate the contents of the fileserver to the webserver drive, and sync them up periodically, but that's impractical and defeats the purpose of having a central file server, so I've pretty much already dismissed that idea.
That's about all I have so far. Anyone have thoughts or opinions?

You can set up an SFTP chroot jail. What this means is that an authorised account can SFTP in but never have access to a shell. You can also keep an eye on access via lastlog as well as the sshd logs.
Do configure your sshd for key-pair logins as well, for better security. The advantage is that you can give multiple people access to a 'common' SFTP account if the need arises: just add all of their public keys to the target account's .ssh/authorized_keys2.
This should help:
http://ubuntuforums.org/showthread.php?t=858475
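If the PHP front end then needs to talk to that chrooted account, one possible sketch using the PECL ssh2 extension (host, user, key paths, and directories are placeholders, not a tested setup):

<?php
// Hypothetical sketch: key-based SFTP into the chroot jail from the web server.
$conn = ssh2_connect('fileserver.lan', 22);
ssh2_auth_pubkey_file($conn, 'sftpuser', '/var/www/.ssh/id_rsa.pub', '/var/www/.ssh/id_rsa');
$sftp = ssh2_sftp($conn);

// List the account's (chrooted) home directory via the sftp stream wrapper.
$handle = opendir('ssh2.sftp://' . intval($sftp) . '/');
while (false !== ($entry = readdir($handle))) {
    echo $entry, "\n";
}
closedir($handle);

// Upload a local file into the jail over SFTP (scp won't work if the account is SFTP-only).
copy('/tmp/localfile.txt', 'ssh2.sftp://' . intval($sftp) . '/incoming/localfile.txt');

Since the account never gets a shell, a compromised front end is limited to whatever the jail exposes.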


Is someone trying to Hack? Receiving Suspicious Requests on my Apache2 Ubuntu 18.04 server

Today I was checking my server logs and noticed some requests which make me think someone is trying to get into my server.
I am hosting a PHP Laravel (6) based admin panel and APIs on it. I have also checked my public routes and the permissions of the files.
Can someone suggest what else I should do to prevent something disastrous from happening? Thanks in advance.
Here are some of the suspicious requests:
/hudson
/cgi-bin/mainfunction.cgi
/?XDEBUG_SESSION_START=phpstorm
/solr/admin/info/system?wt=json
/?-a=fetch&content=%3Cphp%3Edie%28%40md5%28HelloThinkCMF%29%29%3C%2Fphp%3E
/api/jsonws/invoke
/azenv.php?a=PSCMN&auth=159175997367&i=2650084793&p=80
?function=call_user_func_array&s=%2FIndex%2F%5Cthink%5Capp%2Finvokefunction&vars%5B0%5D=md5&vars%5B1%5D%5B0%5D=HelloThinkPHP
/.well-known/security.txt
/sitemap.xml
/TP/index.php
/TP/public/index.php
/ip.ws.126.net:443
/nmaplowercheck1591708572
/evox/about
/MAPI/API
/evox/about
/owa/auth/logon.aspx?url=https%3A%2F%2F1%2Fecp%2F
/vendor/phpunit/phpunit/src/Util/PHP/eval-stdin.php
These are among the many bots that constantly try to break into servers or gain unauthorized access to your web app. This happens to all servers, regardless of which service provider you're using: AWS, DigitalOcean, Linode, or any other option.
Most commonly, they'll try generic login URLs and brute-force them with default or common username/password combinations. They're always there, but you probably did not notice until you started checking the log files.
While we're on this topic, there are also SSH worms that constantly try to brute-force SSH access to your server. This is why it's important to use good passwords, or better yet, disable password login to your server and only allow SSH key authentication. That will greatly improve security but still will not stop their efforts.
What you can do to protect your server:
As mentioned above, disable password login and only allow SSH key authentication
Enable a firewall and set up the firewall rules accordingly
Ensure the packages that you use always have the latest security patches
Use tools like Fail2Ban, which will ban an IP if SSH attempts fail more than a set number of times. You can configure Fail2Ban to do more; do explore the docs
Welcome to the internet.
The last time I bothered to look, it took around 10 minutes from plugging a device into a public IP address that had been unused for over 6 months to the first attack. What you can do about it is:
Ensure your OS and any third-party libs/applications are patched and up to date
Ensure that the uid running your PHP code can only write to specific locations outside the document root (preferably nowhere on the filesystem)
Ensure that the uid running your PHP code cannot read your weblogs (a quick self-check for these two points is sketched after this list)
Write secure code
Take regular backups
Run a host based IDS
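A quick, hedged way to sanity-check the two uid-related points from PHP itself (the Apache log path is an assumption; adjust for your distro):

<?php
// Throwaway self-check, run once from the web and then delete:
// the uid running PHP ideally cannot write inside the document root or read the web logs.
var_dump(is_writable($_SERVER['DOCUMENT_ROOT']));      // ideally bool(false)
var_dump(is_readable('/var/log/apache2/access.log'));  // assumed log path; ideally bool(false)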
This might just be an automated bot searching for certain files/URLs on the webserver. Make sure all your environment files are not accessible (using .htaccess) and that you have the latest Laravel security patches.

Running a custom webserver in a Cloud9 environment and reaching it from the outside world

Is it possible to serve my web application from another server than the one provided in Cloud9?
For example: I would like to run different applications (PHP, Node.js - not sure what's possible yet) with nginx as the backend server (i) and/or a reverse proxy (ii) (to try different scenarios and configuration options).
Is it possible to run nginx and serve content to the outside world in cloud9?
Is it possible to have nginx as a reverse proxy in cloud9?
EDIT:
In the Cloud9 docs they write:
$PORT is exposed to the outside: When you run an application which listens on the port specified in the environment variable $PORT, you can access this application using the http://projectname.username.c9.io URL scheme. The proxy expects the server on that port to be a HTTP server. Other protocols are not supported.
This leads me to believe that if I started nginx on port=$PORT it would be accessible via the specified URL scheme - can anyone confirm? Maybe someone has tried this and can share some time-saving tips. Thanks.
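For what it's worth, a quick way to test the $PORT behaviour without involving nginx yet (assuming PHP is available in the workspace) would be PHP's built-in web server:

php -S 0.0.0.0:$PORT

If that is reachable on the project URL, nginx listening on the same port should presumably work too - but confirmation would still be appreciated.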
I know this might be a late reply, but it might be helpful for those who are wondering how to do the same.
Short answer
I've created a repository that holds all the configuration needed for the process. Just run a command and NGINX and PHP-FPM will be serving and accessible from the internet.
GitHub repo: https://github.com/GabrielGil/c9-lemp
Explanation
Basically, to run NGINX in a c9 environment, as you noted, you just have to make it listen on port 8080. You can either edit the default site in /etc/nginx/sites-available or create and enable your own (that's what the script above does).
Then, in order to run PHP scripts through PHP-FPM with NGINX, you need to configure some permissions and the socket on the webserver. By default, c9 runs as ubuntu:ubuntu and the webserver as www-data:www-data.
The script above also makes these changes for you.
Hope this helps you, or other users in similar situations.
You can run nginx on a normal Cloud9 workspace, as long as it listens on port 8080 (the value of $PORT). The URL scheme to reach your server would be http://projectname-username.c9.io, however. Please refer to docs.c9.io for more up-to-date help on running applications.
One other thing you can do, if you have another server where you would like to host your software, is to create an SSH workspace (https://docs.c9.io/ssh_workspaces.html). That way, you can connect Cloud9 to an external server directly.

Web Server Interrupt Driven File Transfer

I have a webpage that currently takes an upload from a user and stores it into a directory (/upload). [Linux-based server]
Instead of storing the file on the server in that directory, I am looking for a way to transfer it onto a local machine. [Running Ubuntu 12.04]
Assuming I already have public/private keys setup how might I go about doing this?
Current Ideas:
ftp transfer
rsync
Ideas:
1) Stop running anything on the server, and forward every byte to your local box. Just run ssh -N -R :8080:localhost:3000 remote.host.com and anyone can then hit http://remote.host.com:8080 and get your port 3000. (If you do port 80, you'll need to SSH in as root.) Performance will be kinda bad, and it won't be that reliable, but it might be fine for real-time transfer where you're both online at once.
2) Use inotifywait to watch the upload dir on the server, and trigger rsync from the server to your local box. (Requires exposing the SSH port of your box to the world.) If you sometimes delete files, use unison bidirectional file sync instead. (Although unison doesn't work on long filenames or with lots of files.)
3) Leave the system as-is, and just run rsync from cron on your local box. (Ok, not realtime.)
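If you'd rather push from the PHP upload handler itself (like idea 2, this requires your box's SSH port to be reachable from the server), a minimal sketch with placeholder host and paths:

<?php
// Hypothetical upload handler that immediately pushes the file to your local box via scp.
// me@mybox.example.com and the paths are placeholders; key-based SSH is assumed.
$target = '/upload/' . basename($_FILES['file']['name']);
if (move_uploaded_file($_FILES['file']['tmp_name'], $target)) {
    $cmd = 'scp -q ' . escapeshellarg($target) . ' me@mybox.example.com:incoming/';
    exec($cmd, $out, $status);   // $status === 0 means the copy succeeded
}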
Of course, most people just use Dropbox or similar.

Move File in Server Cluster From ServerA to ServerB

** Preface: I don't know much about networking. If I described the setup wrong, I'll try again. **
I have a server cluster of serverA, serverB, and serverC, all behind a firewall and on a switch. I want to move a file from serverA to serverB programmatically. In the past, when I had to move a file on serverA to another location on serverA, I just called exec("sudo mv file1 /home/user/file1"); Can I still do this when multiple servers are involved?
EDIT: All great responses, guys. I'll look into how the servers are clustered and find out if it's a mount or what's going on. Thank you EVERYONE! You guys are my heroes!
If you use a common share like NFS that is mounted on all the servers, you can use mv on a file.
If you don't have that option, you can transfer the file to another server using scp or rsync.
Well, first of all, you should use the native functions to move files around. See rename(): http://us2.php.net/rename. It just means that you need to make sure the permissions are correct in both locations (likely they need to be owned by the Apache user).
But in answer to your actual question, it really depends on the setup. Generally, another server you could move files to would have a mount point, and it would look like any other directory, so you wouldn't need any changes to your code at all. This is probably the best way to do it.
If you have to use FTP or something like that, you'll need to use the appropriate libraries for whatever protocol is required.
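To illustrate the mount-point case, a hedged sketch (the mount path is a placeholder for wherever serverB's share is mounted on serverA):

<?php
// If serverB's storage is mounted on serverA (e.g. via NFS), a move is just rename().
$src = '/home/user/file1';
$dst = '/mnt/serverB/incoming/file1';   // placeholder mount point
if (!rename($src, $dst)) {
    // If rename() fails (permissions, cross-device quirks), fall back to copy + unlink.
    if (copy($src, $dst)) {
        unlink($src);
    }
}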
While this option is probably a bit too complicated to set up, let me point to UDP hole punching.
If the addresses of all servers are known and fixed, it can traverse firewalls and NATed networks.
In principle, hole punching works like this:
Let A and B be the two hosts, each in its own private network; N1 and N2 are the two NAT devices:
A and B try to create a UDP connection to each other
Most likely both attempts fail, since no holes are prepared yet
But: The NAT devices N1 and N2 create UDP translation states and assign temporary external port numbers
A and B contact each others' NAT devices directly on the translated ports; the NAT devices use the previously created translation states and send the packets to A and B
This even works if the addresses of A and B are unknown to each other; in that case, one needs a publicly known intermediate system S. See the Wikipedia article to learn more.
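To give a feel for the steps above in code, here is a very rough sketch using PHP's sockets extension (addresses and ports are placeholders; each host runs it after the two public, NAT-translated endpoints have been exchanged out of band):

<?php
// Illustrative only - real hole punching needs retries, timeouts and usually a rendezvous server S.
$peerIp    = '203.0.113.10'; // placeholder: the other host's public (translated) address
$peerPort  = 40000;          // placeholder: the other host's translated port
$localPort = 40000;

$sock = socket_create(AF_INET, SOCK_DGRAM, SOL_UDP);
socket_bind($sock, '0.0.0.0', $localPort);

// The outgoing packet makes our NAT create a translation state ("punches the hole").
socket_sendto($sock, 'punch', 5, 0, $peerIp, $peerPort);

// The peer's packet can now come back in through that translation.
socket_recvfrom($sock, $buf, 1024, 0, $fromIp, $fromPort);
echo "Received '$buf' from $fromIp:$fromPort\n";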
You can use the Linux command-line tool scp to copy files over a network via SSH.
Make sure SSH keys are configured on the servers.
Example:
exec("scp -Cr /path/to/file user@serverB:/path/to/destination");
(-C enables compression and -r copies directories recursively; both flags are optional.)

How to upload a site to a server where folders are writable for php

This problem has occurred to me multiple times now, and it's time for me to do it the right way!
How can I upload a website to the server in such a way that PHP has write access to the folders it needs?
Usually I use an FTP program, but I can't upload as root, so there are restriction problems all over the place...
How do you do stuff like this?
Thanks!
EDIT
I'm sorry, I accidentally added rails to the tags instead of php.
Probably I need to clarify my problem, since the answers didn't really help me out here:
I already have a server running apache, DirectAdmin and some other stuff like rails.
And the problem is that when I upload a website like Joomla or WordPress via FTP, the permissions always need to be set to 777/775 or these sites can't write to the folders.
So what I need to know is:
How can I upload these sites (via FTP/SSH) as a user that is the same as the one PHP runs as, so that PHP can create files in all the folders it needs to write to?
Hope I'm being more clear now, thanks for the help so far!
Use a server with ssh access and full write access to wherever your Rails app is hosted (and usually ssh access is as the user that Rails runs as).
For me this usually means a VPS-type server. I like Rackspace Cloud, which works out to around $11 - $15 per month for a low-traffic, low-spec server. I've also heard good things about Linode.
The solution
Upload your site with FTP
SSH to the server and go to the public_html folder
chown -R [user_name]:[group_name] [folder_name]
For me the right user was apache.
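If you're not sure which user that is on your box, a quick throwaway PHP page will tell you who PHP runs as (a sketch; delete it afterwards):

<?php
// Print the user the PHP process runs as, then chown the upload folders to that user.
echo exec('whoami');   // e.g. "apache", "www-data", or your DirectAdmin account
// Alternative without exec(), if the posix extension is loaded:
// $u = posix_getpwuid(posix_geteuid()); echo $u['name'];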
