In the next few weeks I'll be taking my site from localhost (WAMP) and putting it on a new server. This will be the first site on my first server, so basically... I'm a noob!
This must be an important moment for any independent web developer / small business, so I'd love to hear about experiences, mistakes, and default security holes that one should fix straight away...
I'm using PHP, MySQL, cPanel and WHM, and I'm looking for tips like "Turn off error reporting in PHP".
First and foremost, if you are worried about security then you should use LAMP. As long as the Linux platform is using AppArmor or SELinux (Ubuntu and Fedora respectively), you are much better off than on any version of Windows. I know this from first-hand experience developing exploit code for both platforms.
Before you lock your system down, test your code for vulnerabilities using Wapiti. Acunetix is also good, but expensive. This type of testing, especially SQL injection testing, must be done with display_errors=On set in your php.ini.
There is a lot that can go wrong with PHP configuration that makes your system less secure. You should run PHPSecInfo and remove all red. In production, display_errors=Off is what you want, and PHPSecInfo tests for it.
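As a minimal sketch of what that looks like in production, a php.ini fragment (the log path is just an example):

    ; hide errors from visitors, but keep them in a log for yourself
    display_errors = Off
    display_startup_errors = Off
    log_errors = On
    error_log = /var/log/php_errors.log   ; example path, adjust for your server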
You should also use a web application firewall like ModSecurity.
It's actually quite a huge undertaking, but well worth the experience. Here are just one or two suggestions...
Site security also means being heavily involved in managing your sometimes scarce resources. Just as important is obeying any limits your host has, and anticipating all the ways your site's users could push you over those limits, leaving you responsible for a hefty bill: e.g. downloading or uploading large files over and over, spamming email lists, or repeatedly requesting pages that use too many database connections and queries. Get overusage limits and fees in writing from your host before you begin, and have response plans ready. Really, this part is like buying cellphone service.
A lot also depends on what features you'll have on your site. File uploads? Forum? Logins? Email? For example, if you're running a file-sharing site: along with upload/download rate limiting, I suggest you first check available disk space before permitting any file to be uploaded, or do regular audits so you're prepared to archive or delete old and unused files. It's a quick check just to make sure you're not caught by surprise a year down the road when you suddenly start getting disk-full errors or get shafted by your host with a large bill.
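A minimal sketch of that pre-upload check (the upload path and threshold are just examples):

    <?php
    // Refuse new uploads when free space on the upload volume runs low.
    $uploadDir    = '/var/www/uploads';        // example path
    $minFreeBytes = 500 * 1024 * 1024;         // keep at least 500 MB free

    $free = disk_free_space($uploadDir);
    if ($free === false || $free < $minFreeBytes) {
        http_response_code(507);               // 507 Insufficient Storage
        exit('Uploads are temporarily disabled.');
    }
    // ...otherwise proceed with move_uploaded_file() as usual...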
There are literally a hundred more issues to consider. Gather up a complete overview - an itemized list - of all features and functions of your site. Google each one to get more ideas on handling security. Your host should also publish their own security considerations and have a handy manual for operating with all of their services. If they don't, well, I wouldn't personally feel comfortable with them.
Make no mistake: NewRelic is taking the world by storm, with many successful deployments.
But what are the cons of using it in production?
The PHP monitoring agent works as a .so extension. If I understand correctly, it connects to a separate aggregation service, which filters the data and pushes it into the NewRelic cloud.
This means it should work transparently under the hood. However, is this actually true?
Any monitoring, profiling or API service adds some overhead to the entire stack.
The extension itself is 0.6 MB, which is loaded into each PHP process; this isn't much, so my concern is rather CPU and IO.
The image shows CPU utilization on production EC2 t1.micro instances with the NewRelic agent (top blue line) and without the agent (other lines).
What does NewRelic actually do that causes the additional overhead?
What are the other downsides of using it?
Your mileage may vary based on the settings, your particular site's code base, etc...
The additional overhead you're seeing is not so much the memory used as the tracing and profiling of your PHP code, the gathering of analytics on it, and the profiling of DB requests. Basically, there is some additional overhead hooked into every PHP function call. You'd see similar overhead if you left Xdebug or ZendDebugger running or profiling on a machine. Any module will use some resources; ones that hook in deep for profiling can be the costliest, but New Relic has config settings to dial back how intensively it profiles, so you might be able to lighten its hit more than, say, Xdebug's.
All that being said, with the New Relic shared PHP module loaded with the default setup and config from their site, my company's website's overall server response latency went up about 15-20% across the board when we turned it on for all our production machines. I'm only talking about the time it takes for php-fpm to generate an initial response. Our site is http://www.nara.me. The newrelic-daemon and newrelic-sysmon services were running as well, but I doubt they have any impact on response time.
Don't get me wrong, I love New Relic, but the performance hit in my specific situation doesn't make me want to keep the PHP module running on all our live load-balanced machines. We'll probably keep it running on one machine all the time. We do plan to keep the sysmon stuff going 100%, and to keep the module installed but disabled in case we need it for troubleshooting.
My advice is this:
Wrap any calls to New Relic functions in if (function_exists($function_name)) blocks so your code can run without error if the New Relic module isn't loaded (see the sketch after this list).
If you have multiple identical servers behind a load balancer sharing the same code, only enable the PHP module on one image to save performance. You can keep the sysmon stuff running everywhere if you use New Relic for that.
If you have just one server, only enable the shared PHP module when you need it (when you're actually profiling your code or MySQL), unless a 10-20% performance hit isn't a problem.
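A minimal sketch of that first point (newrelic_name_transaction() is one of the agent's API functions; the transaction name is just an example):

    <?php
    // Only touch the New Relic API when the extension is actually loaded,
    // so the same code runs cleanly on servers without the module.
    if (function_exists('newrelic_name_transaction')) {
        newrelic_name_transaction('/user/profile');   // example transaction name
    }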
One other thing to remember if your main source of info is the New Relic website: they get paid by the number of machines you're monitoring, so don't expect them to convince you not to run it on 100% of your machines, even where it's not needed. I think one of their FAQs or blog posts basically states that you should expect some performance impact, but that if you use it as intended and fix the issues it shows you, you should recoup the latency lost. I agree, but I think once you've fixed the issues, you should limit the exposure to the smallest number of servers needed.
The agent shouldn't be adding much overhead the way it is designed. Because of the level of detail required to adequately troubleshoot the problem, this seems like a good question to ask at https://support.newrelic.com
I have developed a social networking website based on a WAMP (Windows, Apache, MySQL, PHP) server. I put it up on a free host (which serves it on LAMP), and it works fine.
Now, I've researched a little and found out that PHP applications can be difficult to scale and take a lot of work to parallelize. I would like to test how many users the web host can support for my website, and how many my localhost can.
It's a social network like any other, involving:
Posting data on the main page (with images).
Chat between users (with polling every 3 seconds, like the one in Facebook).
A question-and-answer forum (just like this one, or Yahoo Answers, including upvotes, downvotes, points, etc.).
Two HTML5 server-sent event loops running indefinitely.
Many AJAX requests retrieving data from the MySQL database.
As of now, I haven't applied any caching, which I plan to do later. Also, the chat application has to be switched from polling to WebSockets (HTML5).
I estimate the user base will grow to a lot more than 100,000 users. That may need some serious scalability.
I need to know what kind of server I'll need for this. Should it be a dedicated server? Should it be two of them, or even more?
I tried ab.exe, located in the bin folder of Apache, but it only tests the one location you give it. A social network needs login information to access most of its data, which unfortunately limits ab.exe to checking the availability of the "Welcome" page, and nothing of the AJAX and HTML5 features I mentioned above.
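Side note for anyone else hitting this: ab can send a session cookie with its -C flag, which gets past the login limitation. A sketch, where the PHPSESSID value is hypothetical and would be copied from a logged-in browser session:

    ab -n 500 -c 20 -C "PHPSESSID=abc123" http://localhost/home.php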
So how exactly should I test the scalability of the website on hardware like my laptop (Windows, Intel i5, 4 GB RAM, 2.0 GHz)? And what about scalability on the shared servers available out there, or even dedicated ones?
Simply put: You are counting your chickens before they hatch. If you become overwhelmed by a bunch of new users, then that is what we tend to call a "good problem". If you're worried about scalability along the way, then you should look into:
Caching with Memcached or Redis (see the sketch after this list).
Load balancing.
Switching from Apache to Nginx.
Providing a proper CDN.
Since you're using PHP, you should install an opcache.
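For the caching item, a minimal read-through sketch using the Memcached extension (loadPostsFromDb() is a hypothetical helper that queries MySQL):

    <?php
    // Read-through cache: serve from Memcached when possible, fall back
    // to the database and repopulate the cache on a miss.
    $cache = new Memcached();
    $cache->addServer('127.0.0.1', 11211);   // assumes a local memcached daemon

    $posts = $cache->get('recent_posts');
    if ($posts === false) {                  // cache miss
        $posts = loadPostsFromDb();          // hypothetical DB helper
        $cache->set('recent_posts', $posts, 60);   // expire after 60 seconds
    }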
There are a lot of different ways to squeeze out results. Until you need them, I'd suggest sticking to best practices (normalization, etc.).
If you are worried about the capacity of your hosting company to cope with your application, then the first thing to do is contact them and discuss what capacity they have available and how scalable their environment is.
Unless you are hosting it yourself, you have virtually no control over the situation.
But if you believe the number of users is going to grow very quickly then it would be wise to enter a dialogue very early with your provider.
Let's say you're writing a PHP application that will be hosted in a load-balanced/multi-server setup. What are the things you need to know in order to ensure smooth operation? Right now the only thing I think will be an issue is PHP sessions (i.e., you must use a custom database handler for them). Anything else?
Let's turn this into an answer:
In my experience, the overwhelming majority of PHP applications are not, or not only, constrained by PHP horsepower on the web server, but at least as much by the backing store, i.e. the database and/or files.
So load balancing a PHP application without careful analysis has the potential to make things worse: you hit the weakest link in the chain with more and more load.
So the first, and IMHO most important, "thing to know when writing a web app hosted on a load-balanced server" is the load pattern and its potential for balancing. If your app performs badly, you load-balance it across more servers, and then find out you now have more servers waiting on the DB, you are in trouble.
Here is an out-of-the-blue checklist; please regard it as a brainstorm (or a brainfart) only:
First: Are you really CPU-bound?
Which pages are hit most (see your log)
For the top N of these (with a suitable N) check the processing pattern: Where do the CPU cycles go?
What would be the side effects of making sessions, uploads, file storage (add whatever you use) shared and would it be offset by the load balancing?
Comments welcome, I am very sure to have not even scratched the surface!
Edit
Just thought of something that bit me once in this context: resource locking. Brace yourself for a higher degree of concurrency if you go multi-server.
File uploads/downloads could also be an issue; you'd probably need them to be visible on all servers.
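On the session point from the question, a minimal sketch of moving sessions to a shared store, here Redis via the phpredis extension (the host is hypothetical; a custom database handler registered with session_set_save_handler() works just as well):

    <?php
    // Point PHP's session machinery at a shared Redis instance so every
    // server behind the load balancer sees the same session data.
    ini_set('session.save_handler', 'redis');
    ini_set('session.save_path', 'tcp://10.0.0.5:6379');   // hypothetical shared host
    session_start();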
I am developing a server application that is mostly used to register users and provide services on request.
It was built from scratch, and I am not fundamentally a PHP developer.
I must say that this is the first time I am deploying a production application to the web.
The end of phase 1 is drawing near, and I would like to know what you think is the best approach to deploying this server.
How do I debug on a production server? How do I log all my actions to log files?
How should I handle resources and traffic? How will I know if the server is reaching its limits?
Are there any approaches to evening out the load? I have one DB; is it possible to let another server use the same DB to take load off the first one?
(Of course these are general and long shot questions, but I'd like to have a bigger picture of what I am doing here)
Is there anything I should know about security measures beyond my own code? For example, can I trace a hacking attempt?
If you are asking all these basic questions, then you should not be deploying a production server. But needless to say, you are.
It's usually never a good idea to debug on a production server; that is what QA, development, or testing servers are for. If you are deploying a combination of Apache, PHP, and MySQL, then for PHP there is usually a php_error.log file for you to look at; its location is based on your httpd.conf.
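For logging your own actions, PHP's error_log() appends to that same configured log; a minimal sketch ($userId is a hypothetical variable from your login code):

    <?php
    // Append an application event to the log configured via error_log
    // in php.ini / httpd.conf (assumes log_errors = On).
    error_log('failed login for user ' . $userId . ' from ' . $_SERVER['REMOTE_ADDR']);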
Handling resources and traffic depends on your volume; you have to answer that question yourself. Google "MySQL configuration optimization" and you'll find many helpful tips to correctly configure it and optimize its speed, and when you hit your limit, you'll know.
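As an example of the kind of settings that searching will turn up, a my.cnf fragment (values are illustrative for a small box, not recommendations):

    [mysqld]
    innodb_buffer_pool_size = 512M    # usually the biggest single win for InnoDB
    max_connections         = 150     # cap concurrent connections
    slow_query_log          = 1       # find the queries that need indexes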
Security is another very vague topic; security needs differ, a bank versus a mom-and-pop website, for instance. I suggest researching network security, and always keep this in mind: don't overkill. Security is good enough when it accomplishes the task.
As mentioned, you shouldn't debug on a production server; all testing should have been completed in your development environment. However, you should of course thoroughly test everything once it is on the production server and fix any issues that appear. By making sure your development environment is as close as possible to the live one in terms of settings and so on, you can mitigate the risks, but you can never totally eradicate potential issues.
Depending on your server, you can try running a command such as "top" or "topaz" at the command line; this works on a Unix box with the correct installation and will tell you how much CPU is free and how much is being used. That gives a rough idea of whether it can handle the traffic you are throwing at it. Handling traffic and managing resources is a huge area in itself; there's a lot that can be done, such as load balancing if you have multiple servers, and you may find VMware helpful here also. There are also call-gapping techniques that can be employed depending on what your application is for and who is using it. And yes, you can have one database shared between more than one server.
There is professional monitoring software you can buy that shows how busy your servers are, which may help; just search for "monitoring software", for example.
Security is another huge area, and the solutions depend on what you are deploying and who is going to use it. You should be aware of all the methods of attack your application is likely to be subjected to and plan your code to cope with them, for example SQL injection, session hijacking, etc.
You need contingency plans in case hackers compromise your site and ideally disaster recovery plans as well, depending on how critical your application is.
Best advice is to plan everything thoroughly before you start and get your bosses to approve the plans to cover yourself.
If you have more precise questions, I can give more precise answers. I used to work for a global bank and have experience releasing critical code to production servers, with all the red tape that goes with that ;-)
I own a website running on LAMP: Linux, Apache, MySQL and PHP. In the past 2-3 weeks, the PHP and jQuery files on my website have become infected with malware from a site called gumblar.cn.
I can't understand how this malware gets into my PHP files or how to prevent it from happening again and again.
Any ideas?
UPDATE:
Looks like it is a cPanel exploit.
Your site has been cracked, so the crackers simply replace your files.
You should always upgrade your Linux OS, Apache, MySQL, PHP, and your PHP web applications whenever a security alert is announced.
Linux servers running open services without regular upgrades are the most vulnerable boxes on the internet.
No one here can provide a conclusive solution based on the information you provided, so all we can suggest is that you follow good security practices and standards and correct any weak points immediately.
Make sure your software is up to date. It's very possible to gain access to local files through exploits in PHP programs, so keep any third-party applications you're running on their latest versions (especially very widespread programs like WordPress and phpBB), and do whatever you can to ensure that your server is running the correct versions of its services (PHP, Apache, etc.).
Use strong passwords. A strong password is a long, random list of characters. It should have nothing to do with your life, it should have no readily available acronyms or mnemonics, it should not resemble a dictionary word, and it should contain a healthy interspersing of different characters; numbers, letters of different cases, and symbols. It should also be reasonably long, ideally more than 26 characters. This should help keep people from bruteforcing your credentials for enough time for competent sysadmins to take action against the attackers.
Work with the administrators at your hosting provider to understand what happened in this particular case and do things to correct it. They may not have noticed anything unusual; for instance, if you have an easy password, or if this attack was perpetrated by a trusted individual, or if you have an unpatched exploit in a custom PHP application, there would be nothing to indicate an improper use.
Shared hosts also have many people with access to the same local machine, so things like file permissions and the patching of locally-accessible exploits, both within your application and generally, are very important. Make sure your host has good policies on this, and make sure that none of your software blindly trusts local connections or users.
The nature of the attack (an import of malware from a site that appears to do this kind of thing en masse) suggests that you were running an exploitable application or that your username/password combination was not sufficiently strong, but the administrators at your provider are really the only ones able to supply accurate details on how this happened. Good luck. :)
Chances are, there is an application on your server with a known vulnerability that has been attacked, and something has modified files on your web site or installed a new file.
When searching for information on gumblar.cn, it looks like they use a trojan called JS-Redirector-H. Not sure if this is what is involved here.
Fixing this may involve restoring your web site from backup, if you have no way of knowing what has been modified. If you have source control or a recent version, you may be able to do a whole-site diff. But you will also need to fix the security vulnerability that allowed this to happen in the first place.
Chances are it's some insecure app, or an app you installed some time ago but haven't updated recently. A few people who have complained about this mentioned that they use Gallery (i.e. PHP Gallery), though I'm not sure if that's connected.
If you are not the server administrator, talk to the server admin. They may be able to help, and it would be wise to let them know about this.
Google Advisory:
http://safebrowsing.clients.google.com/safebrowsing/diagnostic?client=Firefox&hl=en-US&site=http://gumblar.cn (linking doesn't work)
First, contact your hosting company and report this. If this is server-wide, they need to know about it.
The most common cause of infections like this is vulnerable, popular PHP software (such as phpBB, Mamboserver and other widespread systems). If you're running any third-party PHP code, make sure you have the latest version.
If you've determined that this only affects your site, restore from a backup. If you don't have any backups, try re-installing everything you have at its latest version (you can probably migrate the database) and go through your own PHP code (if any).
PHP programs are actually simple text files that are run on the server by the PHP interpreter. If your application is infected, then I think there are two possibilities:
1. They have used some security hole in YOUR application to inject code into your server, so now they have changed some of your PHP files or some of your database information.
If this is the case, you had better double-check every single place where you fetch information from the user (text inputs, file uploads, cookie values, ...) and make sure everything is well filtered. It is very common security practice to filter anything that comes from the user. You should also make sure that the data currently saved in your database (or file system) is clean. I suggest using the Zend_Filter component of the Zend Framework to filter user input; there are many full-featured filter libraries out there.
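A minimal filtering sketch using only built-in PHP (not Zend_Filter specifically; the table name and credentials are hypothetical):

    <?php
    // Validate user input before it touches the database, and use a
    // prepared statement so the value is never parsed as SQL.
    $email = filter_var(isset($_POST['email']) ? $_POST['email'] : '', FILTER_VALIDATE_EMAIL);
    if ($email === false) {
        exit('Invalid email address.');
    }

    $pdo  = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');   // hypothetical
    $stmt = $pdo->prepare('SELECT id FROM users WHERE email = ?');
    $stmt->execute(array($email));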
2. They could have run some program on your server that is modifying your PHP source files. Somehow they have managed to run a program or script on your server that is changing your application.
If this is the case, I suggest you check all your server processes and make sure you know every process that is running, although I think this is less likely.
OK, this is NOT a programming question, and SO is not the place for it, because if we tolerated such questions here we would soon become a first-aid/support site for people with bad shared hosting accounts.
I only didn't vote to close because I feel bad turning away a few people who are probably feeling really awful about a problem they don't have the knowledge to fix.
First of all: Google gumblar.cn; there is a growing number of potentially helpful posts accumulating as we speak.
If you're a real beginner and you feel you don't understand any of the answers here, then just do the following:
Get a new host
Google for information about all your software until you know whether each piece is safe. If it's not, don't use it until the developers have fixed the problem. An example of insecure software is 'Gallery'.
Install all your software (the secure ones only) as a FRESH INSTALL!
Copy static files (like images) over to the new server. Do NOT copy over any dynamic files, like PHP scripts, as they could be infected.
Don't upload any of your own PHP scripts until you've checked them for security vulnerabilities. If you don't know how to do this, don't upload anything before you've learned about these things.
I have been affected by this virus/malware and am currently cleaning up. I hope this will be helpful:
1) You most likely have a TROJAN on your PC. To verify this, simply open the Run dialog (Start > Run... or Windows key + R) and type "cmd" or "regedit". If either of those doesn't open its window as expected, you have the Js:Redirector trojan. You can also check whether the anti-virus programs avast! and Malwarebytes fail to connect to updates for some reason (sneaky trojan, that). Plus, you'll notice that the security section of the Control Panel has been disabled, so you wouldn't have seen a notification in the tray icons telling you that virus protection was disabled.
2) This is a very recent exploit, apparently of vulnerabilities in Flash or PDF plugins, so you are not safe even if you didn't use Internet Explorer!
As for me, because I hate programs slowing down my PC, I had Windows Updates on "manual" and no resident protection (scanning of all web connections, etc.), and I was probably infected by visiting another hacked site that was not blacklisted yet. I was also over-confident in non-IE browsers! I sometimes ignore the blacklist warning, as I am curious about what the scripts do, and I forgot once again just how BAD Windows really is. Conclusion: leave Windows Updates on automatic, and keep minimal resident protection (avast! Web Shield + Network Shield).
3) Because this is a trojan that sends back your FTP password, it doesn't matter how good your password was!
4) Try to clean up your PC with Malwarebytes or avast!; it will find a file whose name ends with ".ctv".
You MUST have a virus database dated 14 May or more recent. If you can't update (as explained above), then follow these instructions (you'll need to extrapolate, but basically there is a file, whose name may vary, that is pointed to in the registry; use HijackThis to remove it, and once you reboot without this file executing, all is fine).
5) Of course update your passwords, BUT make sure the trojan is removed first!
6) For an exact list of all pages modified, try to get an FTP log; you'll find the IP of the script/hacker and all the touched files.
7) If you have a complete local copy of the "production" environment, then the safest route is to delete the ENTIRE site on the server and re-upload all the files.
8) During the clean-up process, DON'T visit your infected site, or you will re-install the trojan! If you have the latest avast! Home Edition with the "Web Shield" protection, it will give you a warning and block the page from being executed by your browser.
Like Francis mentioned, try to get your hosting company to make sure their software is up to date.
On your side, change your FTP password to something completely obscure as soon as possible. I've seen this happen to people before: what these 'hackers' do is brute-force your FTP account, download a couple of files, modify them slightly, and then re-upload the infected copies. If you have access to the FTP log files, you'll probably see a connection to your account from an IP other than yours. You may be able to submit this to your hosting company and ask them to blacklist that IP from accessing their servers.
That website (gumblar.cn that you mentioned) is being tested for malware. You can monitor results here: http://www.siteadvisor.com/sites/gumblar.cn/postid?p=1659540
I had something like this happen to me at an old hosting provider. Somehow, someone was able to infect Apache so that a special header was injected into all my PHP files, which caused the browser to try to download and run them. While they got it fixed, the quick solution was to take down all my PHP files and change my index page to a plain HTML file. Whether or not this stops the problem for you depends on how the server is infected. The best and probably most responsible thing you can do is protect your visitors by taking down the site and, if possible (if text files aren't infected), displaying a message stating that if they visited recently they may have been infected.
Needless to say, I switched hosting providers soon after my site was infected. My hosting provider was pretty bad in a lot of other ways, but this was pretty much the final straw.