I have kind of a strange situation right now. Basically, my company is currently putting links to the latest builds of our software behind a gate, but once the user signs in, they can distribute the link to the builds. Free 30-day trials, unaccounted for.
So I would like to block access to URL /downloads/file.extension, or even just /downloads/ entirely, unless the referring URL is allowed_domain.com.
I would love to use .htaccess, but the hosting provider (WP Engine) we use has their servers set up on Nginx. Not only that, but I have to send my Nginx code to WP Engine for them to implement for me, which is not practical.
Thus, I am looking for a PHP ONLY solution, or a WordPress plugin that I apparently didn't notice.
And yes, I know this is a really stupid way for my company to be storing important files, but I need to fix this now until they come up with a better way.
You could use the method I'm using myself.
Enjoy.
You're going to need to list the allowed IP addresses:
<?php
// allowed IPs - replace these examples with your own (the last two are documentation addresses)
$allow = array("201.168.0.1", "203.0.113.45", "198.51.100.7");
$forwarded = isset($_SERVER['HTTP_X_FORWARDED_FOR']) ? $_SERVER['HTTP_X_FORWARDED_FOR'] : '';
if (!in_array($_SERVER['REMOTE_ADDR'], $allow) && !in_array($forwarded, $allow)) {
    header("Location: http://yourdomain.com/index.php"); // redirect disallowed visitors
    exit();
}
?>
Unfortunately it is not possible to do exactly what you described. When someone requests a static file from the site, the web server first checks whether it exists and, if it does, serves it immediately; the request never goes through any PHP code. Therefore you cannot write PHP that will intercept and block a direct request for the file itself.
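The closest you can get with PHP alone is to stop linking to the builds directly and link to a small gate script instead, so that the request does go through PHP. A rough sketch only: the download.php name, the logged-in flag and the storage path are all placeholders, not something your host provides.

<?php
// download.php - hypothetical gate script; the real builds are assumed to live outside the public folder
session_start();
$referer = isset($_SERVER['HTTP_REFERER']) ? $_SERVER['HTTP_REFERER'] : '';
// only serve the file if the visitor is signed in or arrived from the allowed domain
if (empty($_SESSION['logged_in']) && strpos($referer, 'allowed_domain.com') === false) {
    header('HTTP/1.1 403 Forbidden');
    exit('Access denied.');
}
$file = '/path/outside/public/builds/file.extension'; // assumed storage location
header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="' . basename($file) . '"');
header('Content-Length: ' . filesize($file));
readfile($file);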
Related
I'm currently making two different, but pretty similar, web-based applications. I'm using PHP and standard MySQL (MariaDB in my case, because I use XAMPP, but I guess the two are similar enough). FYI, I'm building my apps on a local server running right on my PC.
So let's say my first project is Project1, stored in localhost/project1, while Project2 is in localhost/project2. Both have a login feature with their own accounts/users.
So when they try to log in I do this. By the way, it's MVC.
class Auth extends Controller {
    public function index()
    {
        // verification such as preventing raw attempts with no POST data, etc.
        // verify the username and password, redirect back on failure, blablabla, you know...
        // the following runs if the login succeeded
        $_SESSION['login'] = true;
        $_SESSION['id'] = $userId; // the authenticated user's id (placeholder)
    }
}
I use a similar system in both of my projects.
The Problem
So I was trying to figure out how sessions work, and I logged myself into localhost/project1. Then I opened localhost/project2. To my surprise, I didn't need to type my username and password into localhost/project2; I was already logged in. This gave me some concerns, so here are my questions:
Did that happen because I'm developing both sites on a local server on my very own computer?
Don't you think anybody could break into my site just by writing some simple procedural PHP code on their own server, like this:
$_SESSION['login'] = true;
// and the other session values I check
and then just opening that file, which sets their session 'login' value to true (along with the other validation session keys I use), and then walking into my website already logged in, like a man with a big AK-47 in his hand walking right through the front door of the White House without the Secret Service noticing and taking him into custody?
Why is this really happening, and how do you think I should fix it? I also have a timeout feature, so I'm afraid any big change would disturb much of what I've built, but any suggestion is really welcome.
Sorry if my English is bad, or my php knowledge is pathetic. I'm new.
Sessions are normally driven by a cookie that contains your session ID. Since cookies are shared within the same domain by default, if you have two projects both located on the same host/domain (e.g. localhost), and both use sessions, then they will both share the same cookie and thus the same session data.
This means someone could NOT hack into your site by setting up a random session in their own site. Their session data would only be applicable to their own site.
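If you want the two local projects to keep separate sessions even though they share localhost, one simple fix (not mentioned above, just a sketch) is to give each project its own session name, and therefore its own cookie, before calling session_start():

<?php
// in project1's bootstrap
session_name('PROJECT1SESSID'); // distinct cookie name, so project2 gets its own session
session_start();

// in project2's bootstrap do the same with a different name:
// session_name('PROJECT2SESSID');
// session_start();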
On a side note, it's not usually a good idea to suggest analogies that involve storming the white house with a gun. Just a friendly piece of advice.
User A has some PHP library files. User B needs access to the library. Is it possible permission-wise to make user B able to include the PHP file but not able to view the source code?
User A library entry file is lib.php.
User B uses lib.php in his start.php like this:
include 'path/to/lib.php';
However user B won't be able to view the content of lib.php or any other class files thereof.
Is this possible?
You're trying to find a way to do something that can't be done properly; at best it can be done in a hackish, decidedly dirty way.
You really should consider writing an API for your application that contains all your logic. Then you can handle everything else with user permissions and so on, perfectly clean and state of the art.
Nobody but the API devs can look into the code, but everyone can use it according to their permissions.
Every other method is just too hard to manage and will cause more problems than simply writing an API. It's worth the time.
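To illustrate the API idea with a rough, purely hypothetical sketch: user B never includes lib.php at all; the library stays on user A's side behind a small HTTP endpoint, and user B only ever sees the results.

<?php
// api.php - lives next to the private lib.php on user A's server (names are hypothetical)
require __DIR__ . '/lib.php';

// very rough permission check; a real API would use proper authentication
if (!isset($_GET['api_key']) || $_GET['api_key'] !== 'SECRET_KEY_FOR_USER_B') {
    header('HTTP/1.1 401 Unauthorized');
    exit;
}

// expose only the result of the library call, never the code itself
$result = some_library_function($_GET['input']); // assumed function provided by lib.php
header('Content-Type: application/json');
echo json_encode(array('result' => $result));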
Basically what you ask is not possible. The PHP interpreter needs to be able to read the file in order to include it, and if the PHP process can read it then your untrusted user can write some code that would read it in and dump it back out.
A few options you have are:
1) Use an API. This would allow you to keep your code secret, as you'd only expose the API. It might take a few days' work to implement though (or might not even be possible; impossible to say without knowing what you are doing), so it's probably not suitable in your situation.
2) Obfuscate your code. There are a number of PHP code obfuscators out there. It wouldn't stop prying eyes completely, but it might be enough for your purposes.
3) Create a stub include file. If what your library includes isn't all critical to the running of the code, you could create a cut-down stub library for your client to code against, then replace it with the real thing when they're done.
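For option 3, the stub would simply mirror the real library's function signatures without any of the secret logic. A tiny sketch, assuming a single hypothetical entry function:

<?php
// lib.stub.php - handed to the client instead of the real lib.php (hypothetical names)
function some_library_function($input)
{
    // same signature as the real library, but returns dummy data
    // so the client's code can be written and tested against it
    return 'stub-result';
}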
Hello all, please help the ignorant. I've searched high and low for a solution but it seems to have evaded me.
I recently set up a PHP file containing a read-all script in the public_html folder on my host. The db_config, db_connect and other more sensitive files are happily hidden away, so direct access to them is not possible.
I need to prevent, or at least slow down, the average Joe from being able to run my read-all script in their browser. Given the time taken to collect such a database, it has become somewhat valuable, and I would hate to let someone have it for free too easily.
The PHP needs to be accessible for a mobile application to call, so unfortunately it has to stay in the public directory (unless you know otherwise?).
Can you please point me in the right direction?
Header redirects seem to be the only option available, which I must admit confuse me somewhat on the scripting side.
As much as I'd love someone to just give me the script, where's the fun in not learning it yourself? :)
Thank you for taking the time to read and reply.
I'll ignore CHMOD in this answer:
This isn't the best solution, but an easy-to-maintain method of protecting the file would be to block public access to it using HTACCESS (if you can). Using a flag like one of the other answers mentions is good too and is also a legitimate way to do this, but HTACCESS would disallow the script from even running in the first place.
<files myfile.php>
order allow,deny
deny from all
</files>
Edit: Just saw that you mentioned JSON so ignore the above in this case (I am not familiar with JSON, but I don't think it would work).
This solution isn't perfect either, but it could help a little:
PHP check whether Incoming Request is JSON type
You can detect if the incoming request is from JSON and then ignore if it isn't.
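As a rough illustration of that idea (the header names here are assumptions, and headers can be forged, so treat this as a speed bump only), the script could turn away anything that doesn't look like a JSON or AJAX request:

<?php
$contentType = isset($_SERVER['CONTENT_TYPE']) ? $_SERVER['CONTENT_TYPE'] : '';
$requestedWith = isset($_SERVER['HTTP_X_REQUESTED_WITH']) ? $_SERVER['HTTP_X_REQUESTED_WITH'] : '';
// only continue for JSON posts or AJAX-style requests; everything else is refused
if (stripos($contentType, 'application/json') === false && $requestedWith !== 'XMLHttpRequest') {
    header('HTTP/1.1 403 Forbidden');
    exit;
}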
As I understand it, your app needs to use it, but not anyone else on the web, right? You could do a few things.
First, your app could request the page with a query string like &verified=1, and unless that $_GET variable is passed, the script wouldn't work. Like:
if (isset($_GET['verified'])) {
    // show code
} else {
    // not today, average Joe
}
You could also put it in a secret directory like "sjdvjhb_kdfjgvkedn"
On my website, I have a search.php page that makes $.get requests to pages like search_data.php and search_user_data.php etc.
The problem is all of these files are located within my public html folder.
All of the data processed is properly escaped and handled, but someone could still browse straight to www.mysite.com/search_user_data.php, and on a professional level it seems inadequate to even have this file within public reach.
I have tried moving the sensitive files out of my web root; however, since jQuery is making $.get requests and passing variables in the URL, this doesn't work.
Does anyone know any methods to firmly secure these vulnerable pages?
What you describe is normal.
You have PHP files that are reachable in your www directory so that Apache (or your favorite webserver) can read and process them.
If you move them out, you can't reach them anymore, so there is no real option of that sort.
After all, your PHP files for AJAX are just regular PHP files; your project likely contains other PHP files too, right? They are no more or less at risk than any other script on your server.
Make sure you program "clean". Think about evil requests while writing your PHP functions, not after writing them.
As you already did: correctly quote all incoming input that might hit a database or sensitive function.
You can add security checks on your incoming values and create an automated email if you detect someone trying evil stuff. So you'll likely receive a warning in such cases.
But on the downside: You'll regularly receive warnings because some companies automatically scan websites for possible bugs. So you will receive a warning on such scans as well.
On top of writing your code as "secure" as you can, you may want to add a referer check in your code. That means your PHP file will only react if your website was given as referer when accessing it. That's enough to block 80% of the kids out there.
But on the downside: a few internet users do not send a referer at all, and some proxies filter it out. (I personally would ignore them; half the web breaks for those users anyway.)
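A referer check of that sort might look like the following (HTTP_REFERER is sent by the client and can be missing or spoofed, so it is only a speed bump):

<?php
$referer = isset($_SERVER['HTTP_REFERER']) ? $_SERVER['HTTP_REFERER'] : '';
// only respond when the request appears to come from our own site
if (strpos($referer, 'www.mysite.com') === false) {
    header('HTTP/1.1 403 Forbidden');
    exit;
}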
One more layer of protection can be added by htaccess, you can do most within PHP but it might still be of interest for you: http://httpd.apache.org/docs/2.0/howto/htaccess.html
You can generate a uid each time your page is loaded and store it in $_SESSION['uid']. You hand this uid to JavaScript by doing:
var uid = "<?php print $_SESSION['uid']; ?>"; // quoted, in case the uid is not a plain number
Then you pass it with your GET request and compare it to your $_SESSION value:
if (!isset($_GET['uid']) || $_GET['uid'] !== $_SESSION['uid']) {
    header('HTTP/1.1 403 Forbidden'); exit(); // stop with an error message or send a forbidden header
}
If it's OK, do what you need.
It's not perfect since someone can request search.php and get the current uid, and then request the other pages, but it may be the best possible solution.
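For completeness, a minimal sketch of generating that uid when search.php is rendered (random_bytes requires PHP 7+; the variable names are just examples):

<?php
session_start();
// a fresh, unguessable token for this page load
$_SESSION['uid'] = bin2hex(random_bytes(16));
?>
<script>
var uid = "<?php echo $_SESSION['uid']; ?>";
// send it with every request, e.g. $.get('search_data.php', { uid: uid, q: query }, handleResults);
</script>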
UPDATE & SOLUTION: For whoever has this problem in the future, I figured out how to solve it. If you use the phpThumb class, you MUST import the settings from the config file, as it will not do this on its own. I discovered this by opening the testing file that Clint gave. To do so, insert this code after you instantiate the object:
if (include_once($PATH . 'phpThumb.config.php')) {
    // copy every setting from phpThumb.config.php onto the phpThumb object
    foreach ($PHPTHUMB_CONFIG as $key => $value) {
        $keyname = 'config_' . $key;
        $phpThumb->setParameter($keyname, $value);
    }
}
Thanks to the people who attempted to help, and thanks Clint for at least giving me somewhere to look.
Original Question:
Before I cannot read this message due to my newfound blindness, I need help with something. And before you go farther, I must warn you that I have to link to other sites to show you the problems. So get ready to have a few tabs open.
So I am using PHPthumb to generate images for my gallery website. It was going along really great actually until I uploaded the script so that my business partner could start showing them (clients) an alpha stage of the script (I know it says beta).
http://speakwire.net/excitebeta2/?m=2
The problem becomes very obvious on the two gallery pages, which I happened to link to. The images are not being created at all. If you go into the admin panel, they seem to work, but that is merely a cache generated from my desktop. I have meticulously stopped at every step and even tried to manipulate the class code. I looked for other scripts but they did not help me, because they did not have what I needed. Because the code is proprietary though, I cannot share it. I bet you are thinking "Oh my farce god", but here is something you can look at - because I am able to replicate the same problem with the code I got before.
http://speakwire.net/phpthumbtest/
The second website has the EXACT same structure and code as:
http://mrphp.com.au/code/image-cache-using-phpthumb-and-modrewrite
The few exceptions are that I allow the 100x100 parameters (those are supposed to be changed, and I know they are not causing the error, because their very existence is optional and removing them only lets people do naughty things), and a chmod(dirname($path), 0777); call that I added only after the error persisted, because for some weird reason mkdir won't give the folder 777 permissions.
The old image: http://speakwire.net/phpthumbtest/images/flowerupright.JPG
The new image: http://speakwire.net/phpthumbtest/thumbs/100x100/images/flowerupright.JPG
As seen in the new image, it is unable to write the file. This appears to be the fault of phpThumb, whether because of the parameters given or because the hosting does not permit it.
Which brings me to the point: the script works superbly on my desktop WAMP install but fails on the GoDaddy hosting. My business partner is going to open an account on the hosting we plan to move people to soon, but the problem still exists, and if it is happening here, it most certainly can happen there too, even though it won't be on GoDaddy's servers later.
I will insert the specific place where it is failing here, but for the rest you need to open the mrphp.com.au site. It is way too long to post here.
require('../phpthumb/phpthumb.class.php');
$phpThumb = new phpThumb();
$phpThumb->setSourceFilename($image);
$phpThumb->setParameter('w',$width);
$phpThumb->setParameter('h',$height);
$phpThumb->setParameter('f',substr($thumb,-3,3)); // set the output format
//$phpThumb->setParameter('far','C'); // scale outside
//$phpThumb->setParameter('bg','FFFFFF'); // scale outside
if (!$phpThumb->GenerateThumbnail()) { // RETURNS FALSE FOR SOME REASON
    error('cannot generate thumbnail'); // and this is called because of the failure
}
I would love you long time whoever helps me with this, because I have spent essentially all my free time for the last few days, including time meant to be sleeping, trying to figure this out.
EDIT: http://speakwire.net/phpthumbtest/index2.php
I added this per Clint's suggestion; it seems ImageMagick isn't working. Could that really be the problem, and how would I fix it?
Sounds like a permissions issue. Make sure Apache has write access to whatever folder you are writing to.
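A quick way to confirm that from PHP is to test the thumbnail cache directory directly; the path below is just an assumption based on the URLs in the question:

<?php
$cacheDir = __DIR__ . '/thumbs'; // assumed cache directory
if (!is_dir($cacheDir)) {
    mkdir($cacheDir, 0755, true); // third argument makes it recursive; the final mode is still limited by umask
}
if (!is_writable($cacheDir)) {
    echo "The web server user cannot write to $cacheDir - fix ownership/permissions in the hosting panel.";
}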