how to protect php page used by jquery ajax call - php

I just finished coding my first jquery ajax call page. It calls a php page every 1 or 2 seconds and returns json data.
The page basically displays the posts of the message board the user is viewing. There are multiple message boards, and some users should not be able to view certain boards, but the same php page is used for every call. It picks out the messages using the $id that is sent by the ajax script.
My question is: how would I protect the php page from being manipulated or opened directly? A user can easily change the board id by opening the file directly and changing the URL, not to mention the other ways.
If there is no easy way, then I guess I'd have to duplicate the majority of the main page to check whether the user has the necessary permissions. That would mean more server load, since the page is updated every second.

Ajax calls are treated by the server in the same way as normal page requests. All the authentication and authorization mechanisms are run before the page is served. To make sure, just log off and try to fetch data from your page using AJAX; it should not work if your page requires you to be logged in to the site.

In the ajax script you can use $_SESSION too - you can check whether the current user has privileges for the specified ID, and if not, just deny access.
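A minimal sketch of that check inside the polled endpoint, assuming a hypothetical user_can_view_board() helper that looks the permission up in your own database:

<?php
// board_posts.php - the page polled by the jQuery script
session_start();

// Deny access entirely if the visitor is not logged in.
if (!isset($_SESSION['user_id'])) {
    http_response_code(403);
    exit(json_encode(array('error' => 'not logged in')));
}

$boardId = isset($_GET['id']) ? (int) $_GET['id'] : 0;

// user_can_view_board() is a hypothetical helper: it should check the
// user's permissions for this board against your own database.
if (!user_can_view_board($_SESSION['user_id'], $boardId)) {
    http_response_code(403);
    exit(json_encode(array('error' => 'no permission for this board')));
}

// ...fetch the posts for $boardId and echo them as JSON here...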

Save the permissions in a session and check if that certain flag is present?

If an AJAX call can open the page, so can the user; you cannot rely on any single technique to fully protect a page. For the rest, you can follow what @TheVillageIdiot has said in his answer.

Related

Hiding php file path and the Parameters from the POST request?

I have a post request that increases the like count on records in a database. The php file and the GET parameters are shown in the post request, so anyone who views the page source will be able to call it externally via the php file. Is there a way to hide that information, and if not, what is the most secure way to hit the database without exposing data like that?
$.post("liking.php?id="+rank_id+"&lik="+lik+"&dis="+dis,function(data){}
If you are doing the POST from jQuery like that, then the variables are going to be visible to the user in the source. This is not a problem, as your security should be server side.
In your file: liking.php
You need to add some kind of check to prevent users from repeating likes, if that is your goal.
If you want to limit a like to one per user then you need to log the like to a table somewhere with the userid (if they are logged in) so you can prevent double likes.
If you are allowing non-logged-in users to submit likes, then you will want to limit them somehow - perhaps by using PHP sessions to disallow another like for the same rank_id within the same session. This can be session based or time based.
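As an illustration only, here is a rough session-based guard for liking.php; the response format is an assumption, and the actual database update is left out (use a prepared statement there):

<?php
// liking.php - sketch of a duplicate-like guard using the session
session_start();

// The example call passes id on the query string, so it arrives in $_GET.
$rankId = isset($_GET['id']) ? (int) $_GET['id'] : 0;

if (!isset($_SESSION['liked'])) {
    $_SESSION['liked'] = array();
}

// Refuse a second like for the same record in this session.
if (isset($_SESSION['liked'][$rankId])) {
    exit(json_encode(array('status' => 'already liked')));
}

$_SESSION['liked'][$rankId] = time();

// ...update the like counter in the database with a prepared statement here...
echo json_encode(array('status' => 'ok'));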
Here are a few other questions that might lead you on the path:
How do I make sure my like button is pressed only once by user?
How to secure/encode Javascript POST requests

How to prevent viewing two pages at the same time

I have a media website; once a member has logged in successfully, it creates this session:
$_SESSION['login_id'] = $username;
I wonder how to prevent members from watching two channels at the same time.
I mean, for example, if a member is viewing video.php?id=4 and opens video.php?id=5 in a new tab, it should show him an error saying he has to close video.php?id=4 before viewing the new page.
The first thing that came to my mind is a random token key that is cleared on page exit, but I don't know whether that is a good idea or not. Does anyone know how to do it, or have a better idea? ~ thanks
At first thought
You could send requests to a php script that records whether the user still has a page open.
For that you can use a loop of timed ajax requests (using jQuery, for example).
Hint:
Instead of making plain requests, you can load tiny images (1px high/wide) and serve each image from a php script (you can disguise the URL using htaccess).
So when the image is requested, your php script does the trick (recording the currently watched video) and then serves the image (don't forget to set the proper content type),
and keep loading the image at a certain interval (you will need to add a random token to the URL to avoid caching ;) ).
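A rough sketch of such a heartbeat script, assuming the session simply stores which video is currently being watched (the script and parameter names are made up):

<?php
// heartbeat.php?video=4&t=<random token> - serves a 1x1 GIF and records activity
session_start();

$videoId = isset($_GET['video']) ? (int) $_GET['video'] : 0;

// Remember which video this session is watching and when it last pinged.
$_SESSION['watching']  = $videoId;
$_SESSION['last_ping'] = time();

// Serve a transparent 1x1 GIF with the proper content type and no caching.
header('Content-Type: image/gif');
header('Cache-Control: no-store, no-cache, must-revalidate');
echo base64_decode('R0lGODlhAQABAIAAAAAAAP///yH5BAEAAAAALAAAAAABAAEAAAIBRAA7');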
A second solution could be
Serve your videos through a php script acting as a proxy. That way you know when a video has been streamed completely, and if a user requests a second video while he is still streaming a previous one, you can deny the request, show him an appropriate message, or do as you like :)
I guess using the 2nd solution would be better for you and for the visitor, since he would be able to start buffering a 2nd video once the 1st one has finished buffering.
The 1st solution makes many requests, which can overwhelm the network and put load on both the client and the server.
Neither solution will track a user who is using more than one browser (which means he would have more than one session), unless the user is registered and logged in.
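A very rough sketch of the proxy idea (the 2nd solution), assuming the video files live outside the web root and the session tracks the video currently being streamed; the paths and timeout are assumptions:

<?php
// video.php?id=4 - sketch of serving a video through PHP so you control access
session_start();

$videoId = isset($_GET['id']) ? (int) $_GET['id'] : 0;

// Deny a second stream while another one is still marked active for this session.
// The timestamp lets an aborted stream expire instead of locking the user out forever.
if (isset($_SESSION['streaming'])
    && $_SESSION['streaming']['id'] !== $videoId
    && time() - $_SESSION['streaming']['started'] < 3600) {
    http_response_code(409);
    exit('Please close the other video before opening a new one.');
}

$_SESSION['streaming'] = array('id' => $videoId, 'started' => time());
session_write_close();                        // release the session lock before the long read

$path = '/var/videos/' . $videoId . '.mp4';   // assumed storage layout
header('Content-Type: video/mp4');
header('Content-Length: ' . filesize($path));
readfile($path);

In practice you would also want to clear the 'streaming' marker once the transfer completes (or use a much shorter timeout), otherwise a finished stream still blocks the next one until it expires.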
if ($_SESSION['login_id'] == $username && count($_GET['id']) > 1)
Now, after you check this condition, I don't know what you could do to prevent the user from opening two tabs...
Just my thoughts
Since the video has to stream, it pings the server. The session can have the last video clicked assigned to it. Then, once a new video is clicked, the session on the server stores the new video id, and when the first video pings the server and finds that the session now points at a different page, that video can return an error message.
Alternatively, you could assign an ID to each instance of the form, or use a hidden field with an ID, and then use AJAX to ping the server with that ID. If the user tries to request the same form while there is already an active ID, it should display an error message.
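A sketch of the first idea: video.php records the id it was opened with, and a small ping script that each open tab polls tells that tab whether it is still the active one (the file names are assumptions):

<?php
// In video.php: remember the most recently opened video for this session.
session_start();
$_SESSION['current_video'] = isset($_GET['id']) ? (int) $_GET['id'] : 0;

<?php
// ping.php?id=4 - polled by each open video page via AJAX
session_start();
$videoId = isset($_GET['id']) ? (int) $_GET['id'] : 0;

if (!isset($_SESSION['current_video']) || $_SESSION['current_video'] !== $videoId) {
    // Another video has been opened since this tab loaded; tell it to stop.
    exit(json_encode(array('status' => 'stop')));
}
echo json_encode(array('status' => 'ok'));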

Getting contents of referring page with php

I'm trying to enable screenshots of the page a logged in user is currently on. I've placed a button that needs to:
read in the content of the referring page
save it to a file
render that file as a PDF
redirect back to the referring page
The problem I've run into is that users are logged in and on pages that are very specific to them. I can't grab the page via CURL with generic credentials because the screenshot won't be applicable, and I don't have the user's credentials.
How can I read in the contents of the current/referring page with PHP without access to the user's credentials? I've tried file_get_contents, which was not working either.
It sounds like your mechanism is going to be faulty anyway: you're not saving the page as it looks to them, but rather saving the page as it looks to CURL at some point in the future.
If you want an accurate solution, then you need to save a copy of the rendered HTML somewhere server-side as you send it out (you can use PHP's output buffering to capture it) and mark the file you save with some sort of key that goes to the user. If the user clicks the button, it sends that key to the server which you use to look up the saved HTML file, and process it as desired.
Significantly less efficient, of course, but there you go. Alternatively, you can save just the parameters processed in the page, such that you can re-render it with PHP if required. Still no curl involved, but less saving going on. Obviously you don't need to keep this cache information long; just a few minutes, so storing it in RAM (e.g. memcache) would be sufficient.
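A sketch of the output-buffering approach described above, assuming the rendered page is cached briefly in a per-user temp file (a memcache entry would work the same way); the key scheme is an assumption:

<?php
// At the very top of the page being rendered:
session_start();
ob_start();

// ...the page renders its HTML as usual...

// At the very end of the page:
$html = ob_get_contents();                       // grab everything rendered so far
$key  = session_id() . '-' . md5($_SERVER['REQUEST_URI']);
file_put_contents(sys_get_temp_dir() . '/snapshot-' . $key . '.html', $html);
ob_end_flush();                                  // still send the page to the browser

// When the screenshot button is clicked later, look the file up by the same
// key and feed it to your HTML-to-PDF tool.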
I don't believe this can be accomplished ethically without obtaining the user's credentials.

proper way to craft an ajax call with php and auth

From a security standpoint, can someone give me a step-by-step (but very simple) path to securing an ajax call when logged in to PHP?
Example:
on the php page, there is a session id given to the logged in user.
the session id is placed dynamically into the javascript before pushing the page to the client.
the client clicks a "submit" button which sends the data (including the session id) back to the php processing page.
the php processing page confirms the session id, performs the task, and sends back data
I'm stuck on how (and whether) the session data should be secured before sending it through an ajax request. I'm not building a bank here, but I'm concerned about so many ajax calls going to "open-ended" php pages that can just accept requests from anywhere (given that sources can be spoofed).
PHP can get the session data without you having to send a session ID via javascript. Just use the $_SESSION variable. If you want to check whether a session value exists, you can just do:
if (isset($_SESSION['some_val']))
    // do work, son.
You'll need to use JavaScript to asynchronously pass user input back to the server, but not to keep track of a session.
Don't send your session data with javascript.
You don't need to (in most cases).
Just post the data with javascript and let PHP retrieve the session data from... the session.
It depends on how you set up your session data.
One simple example: you have a session variable called username.
When PHP gets the request from javascript, you can do $_SESSION['username'] to retrieve the session data.
This is a very simple example just to show how it can be done.
As noted above, you don't need to send any session identifiers out with your javascript; to the server, an AJAX request is the same as any other request, and it will know your session just fine. So basically, just don't worry about it; it's already taken care of.
It's another part of your question that worries me.
i'm concerned about so many ajax calls going to "open-ended" php pages that can just accept requests from anywhere
It worries me too; you shouldn't have any "open-ended" PHP pages hanging around at all. Every public .php script should have authentication and authorisation done. The easiest and most maintainable way to achieve this, IMHO, is to have a single controller script (e.g. index.php) that does authentication and authorisation then sends the request to an appropriate controller. Aside from this controller, all other scripts should be outside the document root so that they cannot be called directly.
This means that you only ever have to worry about authentication and authorisation in one place; if you need to change it, it only changes in one place. It means you don't need to worry about accidentally leaving some executable stuff in some library PHP file that's not meant to be called directly. It means you don't need to shag around with mod_rewrite rules trying to protect .php files that shouldn't be in the doc root at all.
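A minimal sketch of that single-entry-point layout; the directory layout, action names and session key are assumptions, not a recipe from any particular framework:

<?php
// index.php - the only PHP file kept inside the document root
session_start();

// 1. Authentication: every request, AJAX or not, passes through this check.
if (!isset($_SESSION['user_id'])) {
    http_response_code(403);
    exit(json_encode(array('error' => 'not authenticated')));
}

// 2. Authorisation + dispatch: map an action name to a script kept
//    outside the document root so it can never be called directly.
$action  = isset($_GET['action']) ? $_GET['action'] : 'home';
$allowed = array('home', 'board', 'post');       // whitelist of known actions

if (!in_array($action, $allowed, true)) {
    http_response_code(404);
    exit;
}

require __DIR__ . '/../app/controllers/' . $action . '.php';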

Jquery $.post and PHP - Prevent the ability to use script outside of main website

I have a PHP script set up using jQuery $.post, which returns a response or performs an action in the .php file targeted by $.post.
Eg. My page has a form where you type in your Name. Once you hit the submit form button, $.post is called and sends the entered Name field value into "mywebsite.xyz/folder/ajaxscript.php"
If a user was to visit "mywebsite.xyz/folder/ajaxscript.php" directly and somehow POST the data to the script, the script would return a response / do an action, based on the submitted POST data.
The problem is, I don't want others to be able to periodically "call" an action or request a response from my website without using the website directly. Theoretically, right now you could determine what Name values my website allows without even visiting it, or you could call an action without going through the website, by simply visiting "mywebsite.xyz/folder/ajaxscript.php"
So, what measures can I take to prevent this from happening? So far my idea is to ensure that it is a $_POST and not a $_GET - so they cannot manually enter it into the browser, but they could still post data to the script...
Another measure is to apply a session key that expires, and is only valid for X amount of visits until they revisit the website. ~ Or, just have a daily "code" that changes and they'd need to grab this code from the website each day to keep their direct access to the script working (eg. I pass the daily "code" into each post request. I then check that code matches in the ajax php script.)
However, even with these measures, they will STILL have access to the scripts so long as they know how to POST the data and also get the new code each day. Also, having a daily code requirement will cause issues when visiting the site at midnight (12:00am), as the code will change and the script will break for someone who is on the website trying to call it with the now-invalid code.
I have attempted using .htaccess; however, using:
order allow,deny
deny from all
prevents legitimate access, and I'd have to add an exception so the website's IP is allowed to access it, which I think is a hassle to keep updated. Although, if it's the only legitimate solution, I guess I'll have to.
If I need to be more clear please let me know.
The problem you describe is similar to Cross-Site Request Forgery (CSRF or XSRF). To protect against this you could put a cookie into the browser and have the cookie value sent in the post form too (via a hidden field, or just add it to the $.post data). On the server side check both those values; if they match, the request probably came from your site.
However, the problem you describe will be quite hard to protect against completely, since someone could easily write a script (or use cURL) to forge all kinds of requests and send them to your server. I don't know of a way to "only allow a browser and nothing else".
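A sketch of that double-submit idea; the token name is an assumption, and random_bytes() requires PHP 7 or later:

<?php
// When rendering the page: set a random token as a cookie and also print it
// where the JavaScript can read it and send it back in the $.post data.
$token = bin2hex(random_bytes(16));              // PHP 7+; use another CSPRNG on older versions
setcookie('xsrf_token', $token, 0, '/');
echo '<meta name="xsrf-token" content="' . $token . '">';

<?php
// In the script that receives the $.post: the cookie and the posted field
// must match, otherwise the request probably did not come from your own page.
$cookie = isset($_COOKIE['xsrf_token']) ? $_COOKIE['xsrf_token'] : '';
$posted = isset($_POST['xsrf_token'])  ? $_POST['xsrf_token']  : '';
if ($cookie === '' || !hash_equals($cookie, $posted)) {
    http_response_code(403);
    exit('Invalid request');
}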
Use the session variable as you say, plus...
As MyGGAN said, use a value set in a cookie (CVAL1) before rendering the submit forms. If this cookie is available (a JS check will verify that), then submit.
On the server side:
If this cookie value exists and the session variable exists, then the HTTP request came from your website.
Note: if the form is to be presented under another domain, DO NOT allow the cookie value (CVAL1) to be set.
Do not allow HTTP requests to the server-side scripts if the extra HTTP headers are not present (jQuery sends its AJAX requests with an X-Requested-With: XMLHttpRequest header).
Read more on Cross-Site Request Forgery, as MyGGAN suggests.
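A sketch of that header check; note it is only a weak hint, since any HTTP client can set the header itself:

// jQuery sets this header on its AJAX requests; plain browser navigation does not.
$isAjax = isset($_SERVER['HTTP_X_REQUESTED_WITH'])
       && strtolower($_SERVER['HTTP_X_REQUESTED_WITH']) === 'xmlhttprequest';

if (!$isAjax) {
    http_response_code(403);
    exit('Direct access is not allowed.');
}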
I am not really sure REMOTE_ADDR would work. Isn't that supposed to be the end user's IP address?
Firstly, you could make use of
$_SERVER['HTTP_REFERER'], though it is not always trustworthy.
The only sure bet that a valid post came from your page would be to use a captcha.
Try using the Sec-Fetch request headers:
// Reject requests that do not look like same-origin AJAX calls
if ($_SERVER['HTTP_SEC_FETCH_SITE'] != "same-origin")
    die();
if ($_SERVER['HTTP_SEC_FETCH_MODE'] != "cors")
    die();
if ($_SERVER['HTTP_SEC_FETCH_DEST'] != "empty")
    die();
