PHPSecurityScanner & SpikePHPSecAudit - How do I? - php

How do I run PHP Security Scanner and SpikePHPSecAudit?
I've already extracted them at the root of my website and thought they could be run like phpSecInfo, where you just navigate to
www.mySite.com/phpsecinfo/index.php
Any assistance will be appreciated.
PS: I am using Windows XP and XAMPP.

Spike PHP SecAudit does static analysis of the files; it's also very old. Pixy and RATS are also static analysis tools for PHP, but I think RATS is the only one of the three that is still maintained. These tools will produce a lot of false positives, and it takes skill to tell the difference between a real problem and meaningless output.
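To give a feel for the manual review involved, here is a small illustrative snippet (not the output of any particular tool) showing the kind of tainted-input finding these analysers report:
<?php
// A static analyser will typically flag user input reaching an output sink (reflected XSS):
echo "Hello, " . $_GET['name'];
// The escaped version is safe, yet some tools will flag it anyway;
// telling the two cases apart is the manual review work mentioned above.
echo "Hello, " . htmlspecialchars($_GET['name'], ENT_QUOTES, 'UTF-8');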
In terms of scanners you are best off with Wapiti, which will produce very few false positives. Wapiti is also very easy to use: python wapiti.py http://localhost/vulnerable_app/. I recommend downloading "Hackme Blog" from The Whitebox. Many apps aren't immediately vulnerable; sometimes you have to use the app a bit before the vulnerability can be reached. Try scanning the blog after a fresh install, then log in as the admin, create a blog entry, and scan it again.
If all goes well I'll see an exploit of yours on BugTraq. Give a shout out to "The Rook" ;).

Related

PHP as independent application ( binary, compile, pack, no php on host )

If I would like to distribute a PHP application with an installer (the package system of the OS), how should I proceed? I don't want the PHP files to be there, just a working application, so that when I type 'app' into the console it launches the application, without needing to install PHP on the system (no PHP installation on the host required). I would also like the application to have patchable byte-code, so it's in parts, loaded when needed, and only the relevant part needs to be replaced on update.
What I would do now is the following:
- Compile PHP with the needed extensions for the specific platform.
- Make a binary application which launches '/full/php app' when 'app' is launched.
- Pack it in an installer in such a way that the binary is added to the PATH on installation; it launches the specific PHP installation that ships alongside the app, with the entry point as its argument, and the app would be running.
The problem is:
Maybe I don't want my PHP files to be exposed (in the application the source would be available anyway). Is there some ready-made tool to do this? Is there a better way than what I proposed?
Alternative: modifying OPcache so that a "packed" application delivers byte-code to a modified OPcache, which just reads the cache.
My suggestion would be a tiny tool I just finished, for almost exactly the same problem. (Oh yes, I tried all the others, but they're old and rusty; sometimes they're stuck with 4.x syntax, have no support, no proper documentation, etc.)
So here's RapidEXE:
http://deneskellner.com/sw/rapidexe
Strictly speaking, it's not a real compiler, just a glorified packer, but it does exactly what you need: the output exe will be standalone, carrying everything with it and transparently building an ad-hoc runtime environment. Don't worry, it all happens very fast.
It uses PHP 7.2 / Win64 by default but has 5.x too, for XP compatibility.
It's freeware, obviously. (MIT License.)
(Just telling this because I don't want anyone to think I'm advertising or something. I just took a few minutes to read the guidelines about own-product answers and I'm trying to stay within the Code of the Jedi here.)
However...
I would also like the application to have patch-able byte-code, so it's in parts, loaded when needed and only part needs to be replaced on update.
It's easier to recompile the exe. You can extract the payload pieces of course, but the source pack is one big zip; there seems to be no real advantage in handling it separately. Recompiling a project is just one command.
Maybe I don't want my PHP files to be exposed(in application, there will be available source anyway)
In this case, the exe contains your source in compressed form, but it eventually gets extracted into a temp folder. The files are deleted immediately after the run but, well, this is no protection whatsoever. Obfuscation seems to be the only viable option.
If something goes wrong, feel free to comment or drop me a line on developer-at-deneskellner-dot-com. (I mean, I just finished it, it's brand new, it may misbehave so consider it something like a beta for now.)
Happy compiling!
PHP doesn't do that natively, but here are a few ideas:
Self-extracting archive
Many archival programs allow you to create a self-extracting archive, and some even allow you to run a program after extraction. Configure it so that it extracts php.exe and all your code to a temp folder and then runs it from there, deleting everything after the script has completed.
Transpilers/compilers
There's the old HPHPc (the HipHop for PHP compiler), which translates PHP code to C++, and its Wikipedia page also contains links to other, similar projects. Perhaps you can take advantage of those.
Modified PHP
PHP itself is open source. You should be able to modify it without too much difficulty to take the source code from another location, such as a resource compiled directly into the php.exe.
Use the Zend Guard tool, which compiles and converts plain-text PHP scripts into a platform-independent binary format known as a 'Zend Intermediate Code' file. These encoded binary files can then be distributed instead of the plain-text PHP. Zend Guard loaders are available for the Windows and Linux platforms and enable PHP to run the scripts encoded by Zend Guard.
Refer to http://www.zend.com/en/products/zend-guard
I would like to add another answer for anyone who might be Googling for answers.
Peach Pie compiler/runtime
There is an alternative way to run (and build apps from) .php source files without using the standard php.exe runtime. The solution is based on C#/.NET and is actually able to compile PHP source files to .NET bytecode.
This allows you to distribute your program without exposing its source code.
You can learn more about the project at:
https://www.peachpie.io/
You've got 3 overlapping questions.
1. Can I create a stand-alone executable from a PHP application?
Answered in this question. TL;DR: yes, but it's tricky, and many of the tools you might use are semi-abandoned.
2. Can I package my executable for distribution on client machines?
Yes, though it depends on how you answer question 1. If you use the .NET compiler, your options are different from those for the C++ option.
3. Can I protect my source code once I've created the application?
Again, depends on how you answer question 1. Many compilers include an "obfuscator" option which makes it hard to make sense of any information you get from decompiling the app. However, a determined attacker can probably get through that (this is why software piracy is possible).

How can I have my CMS upgrade itself?

I've built a CMS (using the Codeigniter PHP framework) that we use for all our clients. I'm constantly tweaking it, and it gets hard to keep track of which clients have which version. We really want everyone to always have the latest version.
I've written it in a way so that updates and upgrades generally only involve uploading the new version via FTP and deleting the old one - I just don't touch the /uploads or /themes directories (everything specific to the site is either there or in the database). Everything is a module, and each module has its own version number (as does the core CMS), as well as an install and uninstall script for each version, but I have to manually FTP the files first, then run the module's install script from the control panel. I wrote and will continue to write everything personally, so I have complete control over the code.
What I'd like is to be able to upgrade the core CMS and individual modules from the control panel of the CMS itself. This is a "CMS for Dummies", so asking people to FTP or do anything remotely technical is out of the question. I'm envisioning something like a message popping up on login, or in the list of installed modules, like "New version available".
I'm confident that I can sort out most of the technical details once I get this going, but I'm not sure which direction to take. I can think of ways to attempt this with cURL (to authenticate and pull source files from somewhere on our server) and PHP's native filesystem functions like unlink(), file_put_contents(), etc. to perform the actual updates to files, or to stuff the "old" CMS in a backup directory and set up the new one, but even as I'm writing this post it sounds like a recipe for disaster.
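For concreteness, the version-check part of that idea might look something like the rough sketch below; the manifest URL and field names are invented for illustration:
<?php
// Hypothetical update check for the control panel: compares the installed
// version against a small JSON manifest published on our own server.
define('CMS_VERSION', '1.4.2');

function updateAvailable(): ?string
{
    $ch = curl_init('https://updates.example.com/cms/latest.json'); // made-up URL
    curl_setopt_array($ch, [CURLOPT_RETURNTRANSFER => true, CURLOPT_TIMEOUT => 5]);
    $body = curl_exec($ch);
    curl_close($ch);
    if ($body === false) {
        return null; // network problem: fail quietly and check again on the next login
    }
    $manifest = json_decode($body, true);
    if (!isset($manifest['version'])) {
        return null;
    }
    // Return the newer version string, or null if this install is up to date.
    return version_compare($manifest['version'], CMS_VERSION, '>') ? $manifest['version'] : null;
}

if ($newVersion = updateAvailable()) {
    echo "New version available: " . $newVersion;
}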
I don't use git/github or anything, but I have the feeling something like that could help? How should (or shouldn't) I approach this?
There are a bunch of ways to do this, but the least complicated is just to have Git installed on your client servers and set up a cron job that runs git pull origin master every now and then. If your application uses migrations, it should be easy as hell to do.
You can do this as it sounds like you are in full control of your clients. For something like PyroCMS or PancakeApp that doesn't work because anyone can have it on any server and we have to be a little smarter. We just download a ZIP which contains all changed files and a list of deleted files, which means the file system is updated nicely.
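As a rough sketch (not actual PyroCMS code; the deleted-files list format and paths are assumptions), applying that kind of package from PHP could look like this:
<?php
// Hypothetical: apply an update ZIP containing changed files plus a
// "deleted.txt" listing files the new release no longer ships.
function applyUpdate(string $zipPath, string $appRoot): bool
{
    $zip = new ZipArchive();
    if ($zip->open($zipPath) !== true) {
        return false;
    }
    if (!$zip->extractTo($appRoot)) {   // overwrite changed/added files in place
        $zip->close();
        return false;
    }
    $zip->close();
    $deletedList = $appRoot . '/deleted.txt';
    if (is_file($deletedList)) {
        foreach (file($deletedList, FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES) as $relative) {
            $path = $appRoot . '/' . ltrim($relative, '/');
            if (is_file($path)) {
                unlink($path);          // remove files listed as deleted in this release
            }
        }
        unlink($deletedList);
    }
    return true;
}
After that you would run the module's install/upgrade script from the control panel, as you already do today.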
We have a list of installations which we can ping with an HTTP request so the system knows to run the download, or the client can hit "Upgrade" when they log in.
You can use Git from your CMS: Glip. The cron job would hit a URL on your own system, without installing Git.
@Obsidian Wouldn't a DNS poisoning attack also compromise most methods mentioned in this thread?
Additionally, SSH could be compromised by a man-in-the-middle attack as well.
While total paranoia is a good thing when dealing with security, WordPress being a GPL codebase would make it easy to detect an unauthorized code change if such an attack did occur, so resolution would be easy.
SSH and Git do sound like a good solution, but what is the intended audience?
Have you taken a look at how WordPress does it?
That would seem to do what you want.
Check this page for a description of how it works.
http://tech.ipstenu.org/2011/how-the-wordpress-upgrade-works/

localhost + staging + production environments?

I have a website, say www.livesite.com, which is currently running. I have been developing a new version of the website on my local machine with http://localhost and then committing my changes with SVN to www.testsite.com, where I test the site on the livesite.com server but under another domain (it's the same environment as the live site, just a different domain).
Now I am ready to release the new version to livesite.com. Doing it the first time is easy: I could just copy & paste everything from testsite.com to livesite.com (not sure it's the best way to do it).
I want to keep testsite.com as a testing site where I would push updates, test them, and once satisfied move them to livesite.com, but I am not sure how to do that after the new site is launched. I don't think copy-pasting the whole directory is the right way of doing it, and it could break things for current users on livesite.com.
I also want to keep my svn history on testsite.com. What is the correct way of doing this with SVN ? Thank you so much!
Other answers mentioning Hudson or Weploy are good, and they cover more issues than what follows. If you feel those are overkill, though, here's the poor man's way of doing it with SVN and a little creative sysadminning.
Make your production document root a symlink, not an actual directory, meaning you have something like this:
/var/www/myproject-1-0-0
/var/www/myproject-1-1-0
/var/www/myproject-1-1-1
/var/www/html -> myproject-1-1-1
This means you can check out code onto production (say, myproject-1-1-2) without overwriting stuff being served. Then you can switch codebases near-instantly by doing something like:
$ rm html && ln -s myproject-1-1-2 html
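If you prefer to drive the switch from a script (and avoid the brief gap between the rm and the ln), one option is to create a temporary symlink and rename() it over the old one, which rename(2) does atomically on Linux; a sketch in PHP, using the example layout above:
<?php
// Sketch: point /var/www/html at a new release atomically.
$newRelease = 'myproject-1-1-2';        // release directory, relative to /var/www
$docRoot    = '/var/www/html';
$tmpLink    = $docRoot . '.new';

if (is_link($tmpLink)) {
    unlink($tmpLink);                   // clean up after a previously failed switch
}
symlink($newRelease, $tmpLink);         // same relative target as the ln -s above
rename($tmpLink, $docRoot);             // replaces the old symlink in one step
echo "html now points at $newRelease\n";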
I'd further recommend not doing an svn checkout/svn export of your trunk on the production box. Instead, create a branch ahead of time (name it something like myproject-X-Y-Z). That way, if you need to do some very stressful tweaking of production code, you can commit it back to the branch and merge it back to trunk once the fire is extinguished.
I do this a lot, and it works quite well. However, it has some major drawbacks:
Mainly, you have to handle database migrations, or other upgrade scripts, all by yourself. If you have scripts (plain old SQL, or something more involved), you need to think about how best to execute them. Downtime of hopefully just a minute might not be a bad idea. You could keep a "maintenance site" around (/var/www/maintenance) and point the symlink there for a few moments if you needed to.
This method is not nearly as cool as Weploy, for example, but for relatively small projects (running on a single server, with not-huge databases), it's often good enough, and dead simple.
My answer will complicate things a little bit, but here goes:
For this type of scenario I would use Hudson.
Hudson will allow you to have an automated deploy process (clean the current directory out, add the new code from SVN). You can then worry more about development and less about juggling files and deploying from one place to another.
The caveat is that you need to learn a little bit about how to set up Hudson and how to make it work for you.
How to get started with PHP for Hudson
I think that should get you on the right track, a bit of work like I said, but pays off later on.
If only the server-side code changes, it is possible that you can simply copy the code across and things will be okay. But even then you have to think of the possibility of people being in mid-interaction. If the client-side code changes, especially if you are heavily using Ajax, you will have to get the current users to reload their pages. If the database also changes, then you have to ensure that no database transactions happen while you are applying the database change scripts.
In all cases, and irrespective of whether you are using any continuous integration tool, I believe it is safest to go for downtime to apply these changes. One of the reasons why people have the "beta" sticker on their sites is so that they can log everyone off and shut them all out to apply changes without notice. As long as they don't do it very frequently they can get away with it too. Once you are out of beta, applying changes becomes a ceremony where you start announcing downtime weeks in advance, then get a window of 30 minutes to a few hours to apply all changes.
For underlying things like patching security flaws in the OS or system software, adding hardware etc, downtime can be avoided if there is load balancing, and the patches are applied one by one.

Generate an image / thumbnail of a webpage using X/Gui-less linux

There exist numerous solutions for generating a thumbnail or an image preview of a webpage. Some of these solutions are web-based, like websnapshots; others are Windows libraries such as PHP's imagegrabscreen (which only works on Windows), or KDE's wkhtml. Many more exist.
However, I'm looking for a GUI-less solution. Something I can create an API around and link it to php or python.
I'm comfortable with python, php, C, and shell. This is a personal project, so I'm not interested in commercial applications as I'm aware of their existence.
Any ideas?
You can run a web browser or web control within Xvfb and use something like ImageMagick's import to capture it.
I'll never get back the time I wasted on wkhtml and Xvfb, along with the joy of embedding a monolithic binary from Google onto my system. You can save yourself a lot of time and headache by abandoning wkhtml2whatever completely and installing phantom.js. Once I did that, I had five lines of shell code and beautiful images in no time.
I had a single problem: using ww instead of www in a URL caused the process to fail without meaningful error messages. Eventually I saw the DNS lookup problem, and my faith was restored.
But seriously, every other avenue of thumbnailing seemed to be out of date and/or buggy.
phantom.js = it changed my life.
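If you want to wrap it in a PHP API, as the question mentions, a rough sketch is to shell out to the phantomjs binary with the rasterize.js example script that ships with PhantomJS (the paths below are assumptions; adjust to your install):
<?php
// Rough sketch: render a page to an image via PhantomJS.
function pageThumbnail(string $url, string $outFile): bool
{
    $cmd = sprintf(
        'phantomjs rasterize.js %s %s 2>&1',
        escapeshellarg($url),
        escapeshellarg($outFile)
    );
    exec($cmd, $output, $exitCode);
    return $exitCode === 0 && is_file($outFile);
}

if (pageThumbnail('http://www.example.com/', '/tmp/example.png')) {
    echo "Snapshot written to /tmp/example.png\n"; // downscale with GD afterwards if you need a true thumbnail
}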

Portable version control?

Currently I do all of my work off of a flash drive. Keeps things portable, and I'm able to learn web development while I'm at work. Currently I run Portableapps with XAMPP, Notepad++, and Chrome installed on it.
My question is, does anyone know of a version control system that would work portably on a flash drive? I just learned about the importance of Version control, and I want to get started, I just need something that will work with my setup.
Edit: Just to clarify, the whole thing should be able to run off the flash drive alone on a completely foreign computer. So if I go to Aunt Edna's house for a family get-together, I can go on her computer, plug in my flash drive, and just go. The Aunt Ednas of the world get very offended if you install anything but solitaire on their fancy new computers, so it can't leave anything behind.
This question was asked before: https://stackoverflow.com/questions/1109838/recommend-portable-source-control-setup and Version control on a 2GB USB drive - the second one has an accepted answer (darcs looks good too).
Pick a distributed one.
Git or Mercurial, for example.
Expanding on Dan's answer:
Git is almost completely file-based. As long as you have the files, it will work the same on any computer (given that you have the command-line tools installed).
This is also good if you switch between a GUI editor and the command line, as pretty much everything stays saved (the files waiting to be committed, for example).
Mercurial is a good start. A repository does not need to be stored on any server; you just create it where your data files are. Also, there is a nice interface called TortoiseHg, which lets you use Mercurial from Windows Explorer with ease.
