I am developing a server application that mostly serves registered users and provides services upon request.
It was built from scratch, and I am not fundamentally a PHP developer.
I must say that this is the first time I am deploying a production to the web.
The end of phase 1 is drawing near and I would like to know what you think is the best approach to deploying this server.
How do I debug on a production server? How do I log all my actions to log files?
How should I handle resources and traffic? How would I know if the server is reaching its limits?
Are there any approaches to evening out the load? I have one DB; is it possible to let another server use the same DB to take load off the first one?
(Of course these are general and long shot questions, but I'd like to have a bigger picture of what I am doing here)
Is there anything I should know about security measures beyond my own code? For example, can I trace a hacking attempt?
If you are asking all these basic questions, then you should not be deploying a production server. But clearly you are anyway.
It's almost never a good idea to debug on a production server; that is what QA, development, or testing servers are for. If you are deploying a combination of Apache, PHP, and MySQL, PHP usually writes to an error log file you can look at. Its location is set by the error_log directive in your php.ini (or by ErrorLog in your httpd.conf for Apache's own log).
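For the logging question: PHP's built-in error_log() function can append to any file the web server user can write to. A minimal sketch of application-level logging (the log path here is an assumption; adjust it to your setup):

<?php
// Minimal application logging sketch; /var/log/myapp.log is an assumed
// path that the web server user must be able to write to.
function app_log($message)
{
    $line = sprintf("[%s] %s\n", date('Y-m-d H:i:s'), $message);
    error_log($line, 3, '/var/log/myapp.log'); // type 3 = append to the given file
}

app_log('user 42 requested /api/profile');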
Handling resources and traffic depends on your volume, so you have to answer that question yourself. Google "MySQL configuration optimization" and you'll find many helpful tips on configuring and optimizing its speed; when you hit your limit, you'll know.
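Purely as an illustration of the kind of knobs you will encounter, a my.cnf sketch might look like this; every value below is an assumption and must be sized to your actual hardware and workload:

# my.cnf sketch -- illustrative values only, not recommendations
[mysqld]
max_connections         = 200
innodb_buffer_pool_size = 1G
slow_query_log          = 1
slow_query_log_file     = /var/log/mysql/slow.log
long_query_time         = 1   # log queries slower than one second

The slow query log in particular is a good early-warning signal that you are approaching your limits.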
Security is another very vague question; security differs based on your needs, such as a bank vs. a mom-and-pop website. Research network security, and always keep this in mind: don't overkill. Security is good enough when it accomplishes the task.
As mentioned, you shouldn't debug on a production server; all testing should have been completed in your development environment. However, you should of course thoroughly test everything once it is on the production server and fix any issues that appear. By making sure your development environment is as close as possible to the live one in terms of settings, etc., you can mitigate the risks, but you can never totally eradicate potential issues.
Depending on your server, you can try running a command such as "top" (or "topaz" on some systems) at the command line; this works on a Unix box with the right tools installed and will tell you how much CPU is free and how much is being used. That gives a rough idea of whether it can handle the traffic you are throwing at it. Handling traffic and managing resources is a huge area in itself; there is a lot that can be done, such as load balancing if you have multiple servers, and you may find VMware helpful here too. There are also call-gapping techniques that can be employed depending on what your application is for and who is using it. And yes, you can have one database shared between more than one server.
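Sharing one database is mostly a configuration matter: each application server connects to the database host over the network instead of to localhost. A rough PHP sketch (the hostname and credentials are placeholders):

<?php
// Both web servers use the same remote MySQL host instead of localhost;
// the hostname and credentials below are placeholders.
$db = new mysqli('db.internal.example.com', 'appuser', 'secret', 'appdb');
if ($db->connect_error) {
    error_log('DB connection failed: ' . $db->connect_error);
    exit('Service temporarily unavailable.');
}

Moving the database to its own box takes that load off the web server, though the database itself can then become the bottleneck.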
There is professional monitoring software you can buy that shows how busy your servers are; just search for "monitoring software", for example.
Security is another huge area, and the solutions depend on what you are deploying and who is going to use it. You should be aware of the methods of attack your application is likely to be subjected to and plan your code to cope with them, for example SQL injection, session hijacking, etc.
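For the SQL injection case specifically, parameterised queries are the standard defence. A minimal PDO sketch (the DSN, credentials, and table are placeholders):

<?php
// Parameterised query sketch with PDO; DSN/credentials are placeholders.
$pdo = new PDO('mysql:host=localhost;dbname=app', 'appuser', 'secret');
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

$email = isset($_POST['email']) ? $_POST['email'] : '';
$stmt  = $pdo->prepare('SELECT id, name FROM users WHERE email = ?');
$stmt->execute(array($email)); // the value is bound, never concatenated into SQL
$user  = $stmt->fetch(PDO::FETCH_ASSOC);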
You need contingency plans in case hackers compromise your site and ideally disaster recovery plans as well, depending on how critical your application is.
Best advice is to plan everything thoroughly before you start and get your bosses to approve the plans to cover yourself.
If you have more precise questions I can give more precise answers; I used to work for a global bank and have experience releasing critical code to production servers, with all the red tape that goes with that ;-)
I'm hosting a couple of sites for my friends, which they can edit using SFTP. But I recently stumbled upon something quite alarming.
Using: <? echo $realIP = file_get_contents("http://ipecho.net/plain"); ?>
They are able to get the real IP of the server. I'm using CloudFlare to "mask" the IP from the outside, so that's quite safe. I know that I can use a VPN for this, but that's quite an expensive option. Is there any way to prevent them from using methods like this to obtain the real server IP?
I would just generally secure the server in a way that is similar to typical shared hosting accounts.
While this is on the edge of what I know, and I am by no means an expert on server security, I do know a few things I have run into over the years.
Disable stream file wrappers for remote files. This is controlled by the allow_url_fopen setting in PHP, and turning it off prevents opening remote files with fopen, file_get_contents, and the like.
Disable some of the more dangerous PHP functions; these include, but are not limited to, shell_exec, exec, and popen. They can likewise be disabled in the PHP ini file by adding them to the disable_functions list (note that eval is a language construct, not a function, so disable_functions cannot block it). See the php.ini sketch after this list.
Remove shell access for the users. They will still be able to authenticate over SFTP for file transfers, but they will not be able to log in via SSH through something like PuTTY. You can modify a user like this: usermod -s /sbin/nologin myuser. For more details see this post on Unix.StackExchange.
Set up a test account with the same access your clients (the people you provide hosting for) have, and test what works and what doesn't. This will give you an idea of what they can and can't do, and it gives you a place to test configuration changes before applying them globally.
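A php.ini sketch covering the first two items above; the exact function list is a judgement call on my part, not a canonical set:

; php.ini hardening sketch -- the function list is an assumption;
; note that eval is a language construct and cannot be listed here.
allow_url_fopen   = Off
disable_functions = shell_exec,exec,popen,system,passthru,proc_open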
I am sure there are many more things you can do, and I can't really go into a whole article on server security. As I said I am by no means an authority on the subject. So the last thing I would say is do as much research as you can and see what others are doing for shared hosting servers as this is basically what you have.
I did find this post on 14 best practices for server security.
http://www.hostingadvice.com/how-to/web-hosting-security-best-practices/
This just gives a high-level overview of some of the concerns and doesn't get overly technical.
This is a pretty big topic, with many pitfalls, but I hope my limited knowledge at least gets you started down the road of securing your server. And remember, it's your server you get to say what the policies are on it.
That said, it is very important to communicate with your users about any policy changes. They have pretty much had free rein up to this point. But if you explain that it's in their interest, because it not only protects the server but also protects their data, it may go over a bit easier. They have a right to know, and you have an obligation to tell them; this way they can make any necessary changes to their code. But again, it is ultimately your server, and it's your responsibility to make it as secure as you can.
Good Luck!
Currently I'm working with PHP, and I find that I can serve a web page just by using the PHP CLI's built-in server, so I don't understand exactly why we have to install an additional server like Apache or Nginx.
I don't know why your question was voted down. I see it as an opportunity to focus on a slightly broader but highly related question: Why should we be extremely careful about what software we allow onto public-facing infrastructure? And, even more generally, what sort of software is okay to place onto public-facing infrastructure? And its corollary, what does good server software look like?
First off, there is no such thing as secure software. This means you should always hold a very skeptical view of anything that opens a single port on a computer to enable network connections (in either direction). However, there is a very small set of software that has had enough eyeballs on it to guarantee a certain minimum level of assurance that things will probably not go horribly wrong. Apache is the most battle-tested server out there and Nginx comes in at a close second as far as modern web servers are concerned. The built-in PHP HTTP server is not a good choice for a public-facing system let alone testing production software as it lacks the qualities of good network server design and may have undiscovered security vulnerabilities in it. For those and other reasons, the developers include a warning against using the built-in PHP server. It was added because users kept asking for it but that doesn't mean it should be used.
It is also a good idea to not trust network servers written by someone who doesn't know what they are doing. I frequently see ill-conceived network servers written in Node or Go, typically WebSocket-based solutions or just used to work around some issue with another piece of software, that implicitly opens security holes in the infrastructure even if the author didn't intend to do so. Just because someone can do something doesn't mean that they should and, when it comes to writing network servers, they shouldn't. Frequently those servers are proxied behind Apache or Nginx, which affords some defense against standard attacks. However, once an attacker gets past the defenses of Apache or Nginx, it's up to the software to provide its own defenses, which, sadly, is almost always significantly lacking. As a result, any time I see a proxied service running on a host, I brace myself for the inevitable security disaster that awaits - Ruby, Node, and Go developers being the biggest offenders. The moment a developer decides to write a network server is the moment they've probably chosen the wrong strategy unless they have a very specific reason to do so AND must be aware of and prepared to defend against a wide range of attack scenarios. A developer needs to be well-versed in a wide variety of disciplines before taking on the extremely difficult task of writing a network server, scalable or otherwise. It is my experience that few developers out there are actually capable of that task without introducing major security holes into their own or their users' infrastructure. While the PHP core developers generally know what they are doing elsewhere, I have personally found several critical bugs in their core networking logic, which shows that they are collectively lacking in that department. Therefore their built-in web server should be used sparingly, if at all.
Beyond security, Apache and Nginx are designed to handle "load" more so than the built-in PHP server. What load means is the answer to the question of, "How many requests per second can be serviced?" The answer is actually extremely complicated. Depending on code complexity, what is being hosted, what hardware is in use, and what is running at any point in time, a single host can handle anywhere from 20 to 20,000 requests per second and that number can vary greatly from moment to moment. Apache comes with a tool called Apache Bench (ab) that can be used to benchmark performance of a web server. However, benchmarks should always be taken with a grain of salt and viewed from the perspective of "Can we get this application to go any faster?" rather than "My application is faster than yours."
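For example, a typical Apache Bench invocation looks like this (the numbers are arbitrary):

ab -n 1000 -c 50 http://localhost/

Here -n is the total number of requests and -c is the concurrency level; the requests-per-second figure in the output is the headline number, but read it with the caveats above.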
As far as developing software in PHP goes (since SO is a programming question site), I recommend trying to mirror your production environment as best as possible. If Apache will be running remotely, then running Apache locally provides the best simulation of the real thing so that there aren't a bunch of last-minute surprises. PHP code running under the Apache module may have significantly different behavior than PHP code running under the built-in PHP server (e.g. $_SERVER differences)!
If you are like me and don't like setting up Apache and PHP and don't need Apache running all the time, I maintain a set of scripts for setting up portable versions of Apache, PHP, and MariaDB (roughly equivalent to MySQL) for Windows over here:
https://github.com/cubiclesoft/portable-apache-maria-db-php-for-windows/
If your software application is actually intended to be run using the built-in PHP server (e.g. a localhost only server), then I highly recommend introducing a buffer layer such as the CubicleSoft WebServer class:
https://github.com/cubiclesoft/ultimate-web-scraper/
By using a PHP userland class like that one, you can gain certain assurances that the built-in PHP server cannot provide while still having a pure PHP solution (i.e. no extra dependencies): fewer, if any, buffer overflow opportunities; the server is interpreted through the Zend Engine, resulting in fewer rogue code execution opportunities; and it has more features than the built-in server, including complete customization of the request/response cycle itself. PHP itself can start such a server during OS boot by utilizing a tool similar to Service Manager:
https://github.com/cubiclesoft/service-manager/
Of course, that all means that a user has to trust your application's code that opened a port to run on their computer. For example, what happens if a website starts port scanning localhost ports via the user's web browser? And, if they do find the port that your software is running on, can that website start deleting files or run code that installs malware? It's the unusual exploits that will really trip you up. A "zero open ports" with "disconnected network cable/disabled WiFi" strategy is the only known way to truly secure a device. Every open port and established connection carries risk.
Good network-enabled software will have been battle-tested and hardened against a wide range of attacks. Writing such software is a responsibility that takes a lot of time to get right and it will generally show if it is done wrong. PHP's built-in server feels sloppy and lacks basic configuration options. I can't recommend its use for any reasonable purpose.
If you refer to the PHP documentation:
Warning: This web server was designed to aid application development. It may also be useful for testing purposes or for application demonstrations that are run in controlled environments. It is not intended to be a full-featured web server. It should not be used on a public network.
http://php.net/manual/en/features.commandline.webserver.php
So yes, as it states, this is a good tool for testing purposes. You can quickly start a server and test your scripts in your browser. But that does not mean it provides all of the features you get with a production-level server like Apache or Nginx :)
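For completeness, starting the built-in server for local testing looks like this (the port and document root are whatever you choose):

php -S localhost:8000 -t public/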
You can use the built-in server in your local development environment, but you should use a more secure, feature-rich web server in your production environment, which demands much more in terms of security, handling large numbers of requests, etc.
We have developed a web application for office management; the plan is to provide a hosted server.
However, due to certain circumstances, an important part of our targeted potential clients wouldn't want their work online on the web (even if many security measures were implemented), so we offer a LAN installation for these clients.
This is causing real headaches for the team, since we don't want someone with enough skill to access our source code; if they do, they can try to find a way to hack our online system.
So the question is: how can we protect our work from being accessible while only allowing its execution? Obfuscating the PHP code is one way, but it's not enough. While searching I've come across the idea of shipping a virtual machine, but this is not the most optimal solution since it requires some hardware configuration (memory and other resources).
Does anybody know a way out of our misery? It would be wonderful if it were a free solution, but if not, a cheap one would be okay :)
This is a FAQ here and the received wisdom is that you can't do anything. This makes sense if you think about it as ultimately there is nothing you can do to stop someone reading what their computer is doing. If your computer is going to run code, it must be in a computer-acceptable format which means that no matter how you try to hide it, it can be decoded and displayed as in order for your computer to understand it, it must meet certain inviolable standards which are published and well known.
You can make it difficult for people who would not know what to do with the code anyway, but anyone who could actually use your code will be able to get it if they want to.
Is your stuff so remarkable and innovative that you really think it is worth stealing?
For instance, it took me almost 10 minutes to work out how Google did its nifty suggest thing and another 20 to replicate it. By SO standards that is extremely slow. I use the idea along with almost everyone else, but I have never seen their code, and it would be boring if they showed it.
Why not just bind users with a non-reverse-engineering contract? This, after all, is how MS protects its IP. Windows is easy enough to copy if you want to; MS makes it worth paying for their product by providing updates only to licensed users. Perhaps you could do the same.
Ask yourself if it is REALLY necessary to have the code hosted locally. When did the internet last fail you? 3 years ago I experienced a major earthquake. No power, water or sewage for 2 weeks but the internet both wired and mobile kept working. My computer didn't as I had no power but 3g was just fine. The infrastructure is incredibly robust and there is really very little need for local data duplication. My experience has been that anything that knocks out the internet more than transiently is more than likely going to knock out any local solution too.
Finally, if your clients want a locally hosted solution, ask yourself if they are worth the trouble. The best way to help them to mature is to let them see what they are missing.
Caveat - I do actually duplicate some data on some local systems but this is a useful feature of backup - I allow clients a view of the backed up data in the event of catastrophic internet failure, but I don't allow them to modify the locally held data as it negates the 'one true record' principle which is why we use the cloud in the first place.
In the next few weeks I'll be taking my site from localhost (WAMP) and putting it on a new server. This will be the first site on my first server, so basically... I'm a noob!
This must be an important moment for any independent web developer / small business, so I'd love to hear about some experiences, mistakes, and system-default security holes that one should fix straight away...
I'm using php, mysql, cpanel and WHM, and looking for tips like "Turn off error reporting in PHP"
First and foremost, if you are worried about security then you should use LAMP. As long as the Linux platform is using AppArmor or SELinux (Ubuntu and Fedora respectively), you are much better off than with any version of Windows. I know this from first-hand experience of developing exploit code for the two platforms.
Before you lock your system down, test your code for vulnerabilities using Wapiti. Acunetix is also good, but expensive. This type of testing, especially SQL injection testing, must be done with display_errors=On set in your php.ini.
There is a lot that can go wrong with PHP configuration that makes your system less secure. You should run PHPSecInfo and remove all red. display_errors=Off is what you want in production, and PHPSecInfo tests for it.
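Concretely, the relevant php.ini settings for production look like this (the log path is an assumption):

; php.ini for production -- log errors instead of displaying them
display_errors = Off
log_errors     = On
error_log      = /var/log/php_errors.log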
You should also use a web application firewall like ModSecurity.
It's actually quite a huge undertaking, but well worth the experience. Here are just one or two suggestions...
Site security also means being heavily involved in managing your sometimes scarce resources. Just as important is obeying any limits your host has and anticipating all the possible ways your site's users can push you over those limits, leaving you responsible for a hefty bill: downloading or uploading large files over and over, spamming email lists, repeatedly requesting pages that use too many database connections and queries, etc. Get overusage limits and fees in writing from your host before you begin, and have response plans ready. Really, this part is like buying cellphone service.
A lot would also depend on what features you'll have on your site. File uploads? Forum? Logins? Email? Etc? For example - If you're running a file-sharing site: along with upload/download rate limiting, I suggest you first check available disk space before permitting any file to be uploaded, or do regular audits so you're prepared to archive or delete old and unused files. It's a quick check just to make sure you're not caught by surprise a year down the road when you suddenly start getting disk full errors or get shafted by your host with a large bill.
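As a sketch of that pre-upload disk check (the threshold and path are assumptions):

<?php
// Refuse uploads when free space on the upload partition drops below
// a threshold; the 1 GiB figure and the path are assumptions.
$minFree = 1024 * 1024 * 1024; // 1 GiB
if (disk_free_space('/var/www/uploads') < $minFree) {
    http_response_code(507); // 507 Insufficient Storage
    exit('Uploads are temporarily disabled.');
}
// ...otherwise proceed with move_uploaded_file() as usual.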
There are literally a hundred more issues to consider. Gather up a complete overview - an itemized list - of all features and functions of your site. Google each one to get more ideas on handling security. Your host should also publish their own security considerations and have a handy manual for operating with all of their services. If they don't, well, I wouldn't personally feel comfortable with them.
I own a website running on LAMP: Linux, Apache, MySQL and PHP. In the past 2-3 weeks the PHP and jQuery files on my website have become infected with malware from a site called gumblar.cn.
I can't understand how this malware gets into my PHP files or how to prevent it from happening again and again.
Any ideas?
UPDATE:
Looks like it is a cpanel exploit
Your site has been cracked, so the crackers can simply replace your files.
You should always upgrade your Linux OS, Apache, MySQL, PHP, and the web PHP programs whenever a security alert is announced.
Linux servers running open services that are not upgraded regularly are among the most vulnerable boxes on the internet.
No one here can provide a conclusive solution based on the information you provided, so all we can suggest is that you follow good security practices and standards and correct any weak points immediately.
Make sure your software is up-to-date. It's very possible to gain access to local files through exploits in PHP programs, so keep any third-party applications you're running on their latest versions (especially very widespread programs like Wordpress and phpBB), and do whatever you can to ensure that your server is running the correct versions of its services (PHP, Apache, etc.).
Use strong passwords. A strong password is a long, random list of characters. It should have nothing to do with your life, it should have no readily available acronyms or mnemonics, it should not resemble a dictionary word, and it should contain a healthy interspersing of different characters; numbers, letters of different cases, and symbols. It should also be reasonably long, ideally more than 26 characters. This should help keep people from bruteforcing your credentials for enough time for competent sysadmins to take action against the attackers.
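If you would rather have the machine pick the password, a short PHP sketch (random_bytes() needs PHP 7; the length is arbitrary but satisfies the advice above):

<?php
// Generate a long random password: 24 random bytes -> 32 base64 characters.
// On pre-7 PHP, openssl_random_pseudo_bytes() is an alternative source.
echo substr(base64_encode(random_bytes(24)), 0, 32), "\n";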
Work with the administrators at your hosting provider to understand what happened in this particular case and do things to correct it. They may not have noticed anything unusual; for instance, if you have an easy password, or if this attack was perpetrated by a trusted individual, or if you have an unpatched exploit in a custom PHP application, there would be nothing to indicate an improper use.
Shared hosts also have many people with access to the same local machine, so things like file permissions and the patching of locally-accessible exploits, both within your application and generally, are very important. Make sure your host has good policies on this, and make sure that none of your software blindly trusts local connections or users.
The nature of the attack (an import of malware from a site that appears to do this kind of thing en masse) suggests that you were running an exploitable application or that your username/password combination was not sufficiently strong, but the administrators at your provider are really the only ones able to supply accurate details on how this happened. Good luck. :)
Chances are, there is an application on your server with a known vulnerability that has been attacked, and something has modified files on your web site or installed a new file.
When searching for information on gumblar.cn, it looks like they use a trojan called JS-Redirector-H. Not sure if this is what is involved here.
Fixing this may involve restoring your web site from backup, if you have no way of knowing what has been modified. If you have source control or a recent version, you may be able to do a whole-site diff. But you will also need to fix the security vulnerability that allowed this to happen in the first place.
Chances are it's some insecure app, or an app you installed some time ago but have not updated recently. A few people who have complained about this mentioned that they use Gallery (ie PHP Gallery). Though I'm not sure if that's connected.
If you are not the server administrator, talk to the server admin. They may be able to help, and it would be wise to let them know about this.
Google Advisory:
http://safebrowsing.clients.google.com/safebrowsing/diagnostic?client=Firefox&hl=en-US&site=http://gumblar.cn (linking doesn't work)
First, contact your hosting company and report this. If this is server-wide, they need to know about it.
The most common cause of infections like this is vulnerable popular PHP software (such as PHPBB, Mamboserver and other popular systems). If you're running any 3rd party PHP code, make sure you have the latest version.
If you've determined that this only affects your site, restore from a backup. If you don't have any backups, try re-installing everything (you can probably migrate the database) you have (to the latest version) and go through your own PHP code (if any).
PHP programs are actually simple text files that are run on the server by the PHP interpreter. If your application is infected, then I think there are two possibilities:
1. They have used some security hole in YOUR application to inject code into your server, so now they have changed some of your PHP files or some of your database information.
If this is the case, you had better double-check every single place where you accept information from the user (text inputs, file uploads, cookie values, ...) and make sure everything is well filtered; it is very common security practice to filter anything that comes from the user. Also make sure that the data currently saved in your database (or file system) is clean. I suggest using the Zend_Filter component of the Zend Framework to filter user input; there are many full-featured filter libraries out there (see the sketch after this list).
2. They could have run some program on your server that is affecting your PHP source files; somehow they have managed to run a program/script on your server that is changing your application.
If this is the case, I suggest you check all your server processes and make sure you recognize every process that is running, although I think this is the less likely scenario.
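A basic input-filtering sketch using PHP's native filter extension (shown here instead of Zend_Filter; the field names and ranges are made up):

<?php
// Validate user input with PHP's filter extension; the field names and
// ranges are illustrative only.
$email = filter_input(INPUT_POST, 'email', FILTER_VALIDATE_EMAIL);
$age   = filter_input(INPUT_POST, 'age', FILTER_VALIDATE_INT,
    array('options' => array('min_range' => 0, 'max_range' => 150)));

if ($email === false || $email === null || $age === false || $age === null) {
    exit('Invalid input.');
}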
OK, this is NOT a programming question, and SO is not the place for it, because if we tolerated such questions here we would soon be a first-aid/support site for people with bad shared hosting accounts.
I only didn't vote to close because I feel bad turning away a few people who are probably feeling really bad about a problem they don't have the knowledge to fix.
First of all: Google for gumblar.cn; there is a growing number of potentially helpful posts accumulating as we speak.
If you're a real beginner and you feel you don't get any of the things in the answers here then just do the following:
Get a new host
Google for information about all your software until you know whether it is safe. If it's not, don't use it until the developers have fixed the problem. An example of insecure software is 'Gallery'.
Install all your software (the secure ones only) as a FRESH INSTALL!
Copy over static files (like images) to the new server. Do NOT copy over any dynamic files, like php scripts, as they could be infected.
Don't upload any of your own PHP scripts until you've checked them for security vulnerabilities. If you don't know how to do this, don't upload anything before you've learned about these things.
I have been affected by this virus/malware and currently cleaning up. I hope this will be helpful:
1) You most likely have a TROJAN on your PC. To verify this, simply run (Start > Run... or Windows key + R) and type "cmd" or "regedit". If either of those doesn't open its window as expected, you have the Js:Redirector trojan. You can also verify that the antivirus programs aVast and Malwarebytes cannot connect to their updates for some reason (sneaky trojan, that). Plus, you'll notice that the Security section of the Control Panel was disabled, so you wouldn't have seen a notification in the tray icons telling you that virus protection was off.
2) This is a very recent exploit, apparently of vulnerabilities in Flash or PDF plugins, so you are not safe even if you didn't use Internet Explorer!
As for me: because I hate programs slowing down my PC, I have Windows Updates on "manual", and I didn't have resident protection (scanning of all web connections, etc.), so I was probably infected by visiting another hacked site that was not blacklisted yet. I was also overconfident in non-IE browsers! I sometimes ignore the blacklist warning because I am curious about what the scripts do, and I forgot once again just how BAD Windows really is. Conclusion: leave Windows Updates on automatic and have minimal resident protection (aVast Web Shield + Network Shield).
3) Because this is a trojan that sends back your FTP password, it doesn't matter how good your password was!
4) Try to clean up your PC with Malwarebytes or aVast; it will find a file ending with ".ctv".
You MUST have a virus database dated 14 May or more recent. If you can't update (as explained above), then follow these instructions (you'll need to extrapolate, but basically there is a file, whose name may vary, that is referenced in the registry; use HijackThis to remove it, and once you reboot without this file executing, all is fine).
5) Of course update your passwords, BUT make sure the trojan is removed first!
6) For an exact list of all modified pages, try to get an FTP log; you'll find the IP of the script/hacker and all the touched files.
7) If you have a complete local copy of the "production" environment, then the safest is to delete ALL the site on the server, and re-upload all files.
8) During the cleanup process, DON'T visit your infected site or you will reinstall the trojan! If you have the latest aVast Home Edition with the "Web Shield" protection, it will give you a warning and block the page from being executed by your browser.
Like Francis mentioned, try to get your hosting company to make sure their software is up to date.
On your side, change your ftp password to something completely obscure as soon as possible. I've seen this happen to people before. What these 'hackers' do is a brute force on your ftp account, download a couple of files, modify them slightly, and then re-upload the infected copies. If you have access to the ftp log files you'll probably see a connection to your account from an IP other than yours. You may be able to submit this to your hosting company and ask them to black-list that IP from accessing their servers.
That website (gumblar.cn that you mentioned) is being tested for malware. You can monitor results here: http://www.siteadvisor.com/sites/gumblar.cn/postid?p=1659540
I had something like this happen to me at an old hosting provider. Somehow, someone was able to infect Apache so that a special header was injected into all my PHP responses, which caused the browser to try to download and run them. While the host got it fixed, the quick solution was to take down all my PHP files and change my index file to a plain HTML file. Whether or not this stops the problem for you depends on how the server is infected. The best and probably most responsible thing you can do is protect your visitors by taking down the site and, if possible (if the text files aren't infected), displaying a message stating that visitors who came by recently may have been infected.
Needless to say, I switched hosting providers soon after my site was infected. My hosting provider was pretty bad in a lot of other ways, but this was pretty much the final straw.