PHP page protection for cron task only

I am using Linux cPanel shared hosting.
I use http://aaa.com/script.php to scrape data from another website.
The PHP portion makes a cURL call to read the whole page content, outputs the full content as HTML, then uses jQuery scraping and an AJAX call to insert the final data into MySQL.
(I decided to go with client-side jQuery scraping because the page with the HTML to scrape is pretty complicated, and hard to handle with PHP Simple HTML DOM and regex.)
I want this page to stop outputting HTML when it is
- not opened by me as a tester
- not opened by the local cPanel cron task.
So I put exit(); in the top few lines.
If the request is detected as legitimate, the script continues with the rest of the HTML output; otherwise it just exits and shows an empty page.
Now for the security issue: what is the best way to make sure other visitors/bots to this page see an empty page?
If I put a password on the cron task, I don't think it can work, right?
Because script.php is scraping data, so if the target website's owner looks at his referrer log, he can see the full URL, including ?password=12345, can't he?
/usr/local/bin/php -f /home/mysite/public_html/dir/script.php?password=12345
If I put my script outside of public_html, like /usr/local/bin/php -f /home/mysite/script,
I don't think it will work for jQuery; that approach is purely for PHP, isn't it?
What else can I do?

You could configure Apache's virtual host to only allow access from your IP.
Anyone else would get a 404 Not Found or a 403 Forbidden, depending on how you configure it.
Here's a sample:
Order Deny,Allow
Deny from all
Allow from 127.0.0.1
Using 127.0.0.1 tells Apache to let requests from the server itself (i.e. cron) through, but no one else.
You can learn more by reading the Apache 2 docs.

Passwords on the query string are a bad idea. You could check for valid IP addresses at the start of your PHP file. This will allow any request from a set of IP addresses to access the parsed jQuery output; all other IPs will be denied access.
$allowedIps = array('127.0.0.1', '::1');
if (!in_array($_SERVER['REMOTE_ADDR'], $allowedIps)) {
    echo 'No jQuery for you';
    exit;
} else {
    echo 'jQuery goodness to follow...';
}

Related

Forbidden access but working with a refresh

I'm trying to access a streaming page but I get the error "Forbidden. You do not have permission to access this document."
However I can skip this message with F5/refresh and watch the video.
Is there any way to open this URL and do a refresh automatically? (using PHP)
I've tried something like this, but it does not seem to work:
header("Refresh:0; url=http://www.url.com");
Thank you in advance.
The forbidden-access message is most likely coming from your web server configuration (Apache?). The browser will stop there, and no document will be loaded from your server.
Since PHP is only interpreted after that, it will actually not get interpreted at all: you have no way to override this behavior in PHP alone; you need to fix the configuration on the server.
If you see a 403 once every two loads, chances are you have either a load-balancer type of setup (requests landing on one server or the other) or something that alternates between two configurations (for example, issues in your domain name setup, such as the ServerName directive).
In that case, adding header("Refresh:0; url=http://www.url.com"); to your page will only make things worse, since the successful load will refresh and then land back on the forbidden message (403).
Check your web server config and logs to find the issue.
It's more likely that the page is using an HTTP Referer restriction tied to its own domain.
If you are accessing the page from code, just add the Referer header to the request.
In PHP you can use cURL and add this line:
curl_setopt($ch, CURLOPT_REFERER, 'domain_url');
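As a slightly fuller sketch of that approach (the function name and example URLs are illustrative, and this only helps if the target really is checking the Referer header):

```php
<?php
// Sketch: fetch a page while sending a Referer header, assuming the target
// site rejects requests whose referer is not its own domain.
function fetchWithReferer(string $url, string $referer)
{
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true); // return body instead of echoing it
    curl_setopt($ch, CURLOPT_REFERER, $referer);    // pretend we came from the site itself
    curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true); // follow redirects
    $body = curl_exec($ch);                         // false on failure
    curl_close($ch);
    return $body;
}
```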

How to verify logins on direct navigation to resource pages on an AJAX site

I have an SPA that uses AJAX calls to assemble content from multiple PHP files. I can add the following to the main application's config file to redirect users who are not logged in back to the login page, as long as they went through the portal to look at things.
// Verify login to access this resource
if ($_SESSION["loggedIn"] != true) {
    echo 'resource denied <script>window.location.href = "https://' . $_SERVER['SERVER_NAME'] . '/login.php";</script>';
    exit();
}
The problem is that there are tons of views, models, controllers, and third-party widgets that can still be accessed directly if someone simply scans the site for common file architectures.
Is there a way to use something like an .htaccess or php.ini file to automatically prepend this login check to all of the PHP files in a directory, so that I don't have to paste it into each and every page?
Barring that, is there a way to set my chmod settings to allow only indirect access to those files, so that PHP scripts running on the server can use them but they cannot be visited directly? Thanks.
[EDIT]
Moving files outside of my public folder did not work because it broke the AJAX.
I also tried auto_prepend_file in an .htaccess file, but this resulted in a 500 error. I am using a VPS that apparently won't let me set AllowOverride All in my Apache pre_virtualhost_global.conf; otherwise, I think that would have been the right way to do this.
Setting the chmod of my resource folders to 0750 appears to allow the AJAX calls to execute without allowing direct access to the files. If anyone knows of any other security caveats to be aware of when doing this, let me know. Thanks.
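For what it's worth, on CGI/FastCGI hosts that ignore PHP directives in .htaccess, a per-directory .user.ini sometimes works even where AllowOverride is locked down. A sketch under that assumption (the path is illustrative), moving the login check into the prepended file:

```ini
; .user.ini placed in the protected directory (honoured by CGI/FastCGI PHP,
; not by mod_php). auth_guard.php would contain the $_SESSION["loggedIn"]
; check shown above.
auto_prepend_file = /home/mysite/private/auth_guard.php
```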

How do I limit my PHP script to only run when requested by localhost?

I'm having trouble getting my PHP page to run only when it is requested by the server itself.
This is what I have right now:
if ($_SERVER['SERVER_ADDR'] == $_SERVER['REMOTE_ADDR']) {
    // process page
} else {
    $this->redirect('http://' . $_SERVER['HTTP_HOST'] . '/404');
}
However, when I curl it, it doesn't give any errors or return anything at all. If I remove the check, it spits out the HTML as expected.
I tried echoing both of those values and got 192.168.1.186 and 192.168.1.225 respectively. I do realize they are different (this is being run by the server itself), but how can I fix it? This code was from this S.O. answer.
The title of your question implies an answer which doesn't quite match the body of your question.
My response is that putting your entire script in a giant if statement seems insecure and unmaintainable.
You would need a guard like this if it were possible for other computers to run your script, say by accessing it from the web.
But if the server is the only machine which should run the script, why not just put it in a place where only the server can access it? For instance, one directory level above the web-accessible directory, or in a child directory with 700 permissions. Or use .htaccess to limit access.
That seems both safer and more maintainable.
It is easier and better to use your server configuration to limit file access. You could, for instance, use an .htaccess file in your specific directory with these contents:
order deny,allow
deny from all
allow from 127.0.0.1
This denies all traffic except from 127.0.0.1 (localhost), and Apache will return a 403 error when someone tries to access these files from another computer.
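Note that Order/Deny/Allow is Apache 2.2 syntax. If you are on Apache 2.4 (an assumption about your setup), the equivalent uses mod_authz_core:

```apache
# Apache 2.4 replacement for the 2.2 Order/Deny/Allow block
Require ip 127.0.0.1 ::1
```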

.htaccess permissions to stop outside world executing php scripts

I am setting up a new website, and currently if I go to mydomain/php/someScript.php it will execute the PHP script. How can I let the files that include these scripts keep including them, but not let anyone else execute them from the browser? Currently I have this in my .htaccess file:
deny from all
but when I visit the site, an AJAX POST request is made to a script in this folder and gets back a 403 error.
Any ideas on how to achieve this are welcome.
====EDIT====
For clarity: some files in the php directory are requested by AJAX, and I've now been made aware that those files can't have the desired restrictions. However, I would still like to apply the restrictions to the other files in this directory.
Thanks
The best solution is to put them outside of the web root directory if at all possible; that way you can include them but the web server can't serve them, and no configuration is required at all.
EDIT: I noticed you want to allow access to the scripts via AJAX. There is no way of doing this, as there is no reliable way to tell the difference between an AJAX request and any other type of HTTP request.
You can still include those files from PHP, e.g. using include or require.
Calling a script via AJAX is no different from calling it by entering the URL in the browser, i.e. you cannot block direct access but allow AJAX access.
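For files that are only ever include()d (never fetched by AJAX), one common application-level pattern, sketched here as a suggestion rather than something from the answers above, is a marker constant defined by legitimate entry points:

```php
<?php
// Sketch of an entry-point guard (the constant name is an arbitrary choice).

// In index.php (the real entry point), before any require:
define('APP_ENTRY', true);

// At the top of each include-only file (views, models, controllers):
if (!defined('APP_ENTRY')) {
    http_response_code(403);
    exit('resource denied');
}
```

Files requested directly by AJAX can't use this guard, since those are ordinary HTTP requests; it only protects files that are reached exclusively through include or require.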

Can a client view server-side PHP source code?

I'm developing a PHP application that has to respond to requests from several clients, and I wonder: "Can any of the clients see the PHP code that I'm writing?"
No, unless
There is a server misconfiguration
There is a bad echo/include somewhere
No, unless you're echoing it to them somewhere you're actually using it.
Use includes from below or outside the www served directory. (Can't +1 yet... for Frankie.)
Don't use symlinks for your HTTP directories. I've intentionally used this before to both show source and execute depending on the user's request path, but that required httpd.conf changes (or misconfiguration) and can be explicitly disabled in httpd.conf.
If you allow downloads of files using fopen, don't pass it anything the user creates, or they could figure out how to get it to grab any file they can find.
Consider:
fopen('reports/' . $_GET['blah'], 'r');
where the user passes in '../index.php'.
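A minimal way to defuse that example (the function name is illustrative; basename() is one common approach among several):

```php
<?php
// Sketch: neutralise path traversal before opening a user-supplied file
// name. basename() strips every directory component, so '../index.php'
// collapses to 'index.php'. 'reports/' is the directory from the example.
function safeReportPath(string $userInput): string
{
    return 'reports/' . basename($userInput);
}
```

With this, the malicious input above resolves to reports/index.php inside the intended directory instead of escaping it.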
No, but you should take all measures to prevent it.
You should always put your sensitive code (heck, why not all of it?) in a directory below your server working dir (say /www); that way, if the server gets messed up, it won't be able to show your code to the world, because that code is included by the PHP that isn't working in the first place.
If you have your web server set to serve your PHP instead of parsing it, yes. But then the clients wouldn't work. So, barring any security holes, the answer is no.
No. Assuming you've installed a L/UAMP server properly and aren't printing out (echo, print_r, etc.) any of the guts of your code, the PHP will be processed, and only the logic or HTML it's meant to output will be visible on the page.
N.B. If there isn't an 'index' file in a directory or a proper .htaccess file, an Apache server will show a list of the files in that directory, which can be downloaded and reviewed.
One mistake that can cause this is pasting a PHP tag inside a PHP string, for example:
$string = "This is the answer: <s><?php echo $answer; ?></s>";
echo $string;
The person did a Ctrl+C and Ctrl+V of something that should be printed along with the string, but forgot to remove the PHP tags.
