In a JavaScript file containing an Ajax request, what is the URL relative to? I've got a www directory containing the directories alpha and bravo. The JavaScript file is in alpha; the HTML file that includes it and the PHP file that processes the request are in bravo.
In the JavaScript file I have xmlhttp.open("GET", "CheckServer.php?name="+name, true); but I don't think CheckServer.php is the right URL. I've tried ../bravo/CheckServer.php, but that doesn't work either.
I'm not using jQuery, and I am using WAMP.
Also, are there any troubleshooting tools I can use to see whether the PHP page that processes the request is being reached in the first place?
EDIT: I opened the console, and it says the function I'm calling in the JavaScript file is not defined. This only happens when I move the .js file to a different directory. (I modified the <script> tag appropriately: <script type="text/javascript" src="../alpha/Check.js">.)
EDIT 2: I think there is a problem with WAMP, because when I copy the exact same files/folders to the desktop, everything works.
The URL is relative to the current location of the page the request is made from. It has nothing to do with where the JavaScript file is loaded from.
Open up the console and look at the Ajax request (in the console or network tab); you will see the path that is being requested.
It's a URL, and URLs have nothing to do with directories per se. They might be mapped to directories on the server, and often are, but the client can never know that for sure and doesn't care. The URL is relative to the current URL (what you see in your browser's address bar).
So the question is not "where is CheckServer.php located on the server?" but "how can I access it from the client?"
If it's like:
http://example.com/alpha/index.html
http://example.com/bravo/CheckServer.php
then it's simple: use a relative URL.
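You can check how a relative URL will resolve against a page's address with the standard URL constructor (the hostnames and paths below are placeholders mirroring the alpha/bravo layout):

```javascript
// A relative URL resolves against the page's URL, not the script's location.
const pageUrl = 'http://example.com/alpha/index.html';

// "CheckServer.php" resolves next to the page, inside /alpha/:
console.log(new URL('CheckServer.php', pageUrl).href);
// → http://example.com/alpha/CheckServer.php

// "../bravo/CheckServer.php" climbs out of /alpha/ into /bravo/:
console.log(new URL('../bravo/CheckServer.php', pageUrl).href);
// → http://example.com/bravo/CheckServer.php
```

Pasting expressions like these into the browser console is a quick way to see exactly where a given relative path will end up.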
But if it's like:
http://alpha.example.com/index.html
http://bravo.example.com/CheckServer.php
then it gets complicated. You will have to look into CORS (Cross-Origin Resource Sharing), because Ajax requests normally do not work across different origins.
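As a sketch, one way the server could opt in is with CORS response headers; this assumes Apache with mod_headers enabled, and the origin shown is a placeholder:

```apache
# .htaccess next to CheckServer.php (assumes mod_headers; origin is a placeholder)
<Files "CheckServer.php">
    Header set Access-Control-Allow-Origin "http://alpha.example.com"
</Files>
```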
Oh, and if CheckServer.php is not accessible at all... you can probably imagine the answer.
So I was browsing the web and found a video. Upon inspecting the video source, I found that it had a GET variable in the URL. It looked like this:
http://www.blablabla.com/stream/2017/09/2a5ef169.mp4?expires=1302948611&token=1290239327
This part got my interest:
2a5ef169.mp4?expires=1302948611&token=1290239327
It's an MP4 file, but it accepts GET variables. If those variables don't match a certain function, I am not able to view the video, so I think it's linked to a PHP file.
I know how GET and POST work in PHP, but how do I apply this to an MP4 file, or any other file?
Just because the URL ends in .mp4, that doesn't mean there is an .mp4 file somewhere.
Consider that .htaccess can change extensions: using mod_rewrite or similar, people can route a given "clean" URL to any PHP program.
So there may be a PHP interpreter behind the .mp4 requests, and Apache may have a modified httpd.conf or .htaccess file which routes /(.*).mp4 requests to a serve_video.php program (or whatever the name is).
This means that, in general, extensions don't mean anything.
By using HTTP header()s, the server might be responding dynamically to each request (example: https://gist.github.com/ranacseruet/9826293), potentially in order to log the video's view count or something similar, like checking the HTTP referer to avoid hotlinking.
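A hedged sketch of what such routing might look like in .htaccess (the script name serve_video.php is hypothetical):

```apache
# .htaccess sketch: hand every *.mp4 request to a PHP script
RewriteEngine On
# QSA keeps the ?expires=...&token=... query string intact for the script
RewriteRule ^(.+)\.mp4$ serve_video.php?file=$1 [L,QSA]
```

The PHP script can then validate the expires/token parameters before streaming the actual video bytes.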
Hope that helps!
I want to deny visitors access to pages but still use the pages. How can I:
~ Make a page unviewable, but allow it to process Ajax requests?
~ Make a PHP file unviewable, but still include it in scripts?
It seems I need .htaccess. I tried using it, but it stopped me from using the file as an include.
For the Ajax-only case, it seems I can use this in the Ajax-only page:
<?php
$AJAX = (isset($_SERVER['HTTP_X_REQUESTED_WITH']) &&
         $_SERVER['HTTP_X_REQUESTED_WITH'] == 'XMLHttpRequest');
if ($AJAX) {
    // echo ajax code.
}
?>
Is this reliable?
One way to accomplish your second goal, making a script available for server-side inclusion but not accessible to clients, is to add this to an .htaccess file in the folder containing the scripts you wish to protect:
deny from all
Try browsing to the script now; you should not be able to reach it. This works for the entire directory the .htaccess file is placed in.
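Note that "deny from all" is the Apache 2.2 syntax. If you are on Apache 2.4 or later, the equivalent .htaccess line is:

```apache
# .htaccess (Apache 2.4+ syntax)
Require all denied
```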
Another way of 'shielding' a PHP file from direct access by clients through the web server is to place it in a directory outside your wwwroot/public_html.
In your PHP config you'll have to add this directory to your include path, or simply include the file via the correct relative path, or by using an absolute path.
For example, if you have root_containing_folder/wwwroot/index.php and root_containing_folder/app/core.php, in index.php you could have
require_once('../app/core.php');
and core.php would be included, but a browser could never get to core.php on its own. (If it could, it would have to be through a URL like www.facing-site.com/../app/core.php, which your web server should never allow!)
You can't do those things: when a script makes an Ajax request, it's the user's browser that sends the request. If you want client-side scripts to see your content, browsers must be able to see it.
You can apply some security through obscurity, for example by putting some kind of auth token in the script. This won't give you much protection, as all a user has to do is read the JS to get the token, but it will stop casual visitors from poking around. Your 'if XHR' check is effectively doing this: a browser won't normally send that header when the address is typed into the address bar, but a user can easily reproduce it outside of your Ajax code.
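A sketch of the token idea (the token value and page name are made up, and remember the token is plainly visible to anyone who views the page source):

```javascript
// Embed a token in the page and append it to every Ajax request.
// This only deters casual visitors; anyone can read it from the source.
var token = 'd41d8cd98f00';  // hypothetical value printed into the page by PHP
var url = 'ajax-only.php?token=' + encodeURIComponent(token);
console.log(url);
// → ajax-only.php?token=d41d8cd98f00
// then: xmlhttp.open('GET', url, true); xmlhttp.send();
```

The PHP side would compare $_GET['token'] against the expected value before echoing anything.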
I was developing a web crawler when I noticed this.
URL 1: http://www.techwyse.com/services/
URL 2: http://www.techwyse.com/contact-us.php
URL 1 doesn't have an extension like .html or .aspx, but it displays a page. Is it possible to know the exact name of the page being displayed? (It is not shown in the browser.)
What do we call these kinds of URLs, like URL 2?
Thanks in advance.
http://www.techwyse.com/services/ refers to a folder on the web server, whereas http://www.techwyse.com/contact-us.php refers to an actual file on the web server.
When you request a folder, four things can basically happen:
The web server is configured to have 'default files' like index.html, default.asp, or index.php, and one of these is shown.
The web server found no default files, folder browsing is enabled, and you actually get to see all the files and subfolders.
There are no default files and folder browsing is disabled, so you will see an error message saying that folder browsing is disabled.
SEO-friendly URLs are used, and the web server internally maps that URL to a specific file.
The last case is done by URL rewriting:
http://msdn.microsoft.com/en-us/library/ms972974.aspx
URL 1 just refers to a directory on the server, and the web server serves a default document.
In this case it's http://www.techwyse.com/services/index.php (the same page as URL 1). You can check that this is not a coincidence by opening http://www.techwyse.com/services/index2.php, which returns a 404.
URL 1 doesn't have an extension like .html or .aspx, but it displays a page. Is it possible to know the exact name of the page being displayed? (It is not shown in the browser.)
The name of the page will be the text inside the <title> element (assuming an HTML document). There is no way to know the filename, or even if there is a file to begin with. URLs resolve to resources and how the HTTP server determines the resource is an implementation detail of the HTTP server and utterly irrelevant to the client.
(Anecdote time: a friend was unimpressed by his university spidering the contents of his HTTP server and ignoring his robots.txt directives, so he wrote a script to generate random HTML documents containing random links, and let the crawler spend a couple of days indexing utter nonsense at random URLs where the resources were all generated by random.py. There was no way to know from outside the server that random.py was called random.py, or even that it existed, although its existence was relatively easy to infer.)
The exact URL is http://www.techwyse.com/services/ (there might be other URLs that resolve to an identical resource, but that one is as written).
What do we call these kinds of URLs, like URL 2?
URLs. (If you really want to distinguish them, you could say "URLs with something that appears to be a file name in them".)
The first URL is either handled by something like mod_rewrite or a Java servlet (which can be mapped to any path), or it is simply a directory index. Almost any web server will allow you to place a page with a given (often configurable) name, such as index.html or index.php, in a directory and have that page load by default. So, for example, www.mysite.com/ actually loads www.mysite.com/index.html. This works for subdirectories as well.
It's likely that the first type of URL you mention was created by simply having a file named "index" inside the directory named "services". Most web servers will by default look for a file named "index" if the URL does not specify a file.
Considering that the second example is a file with a PHP extension, it's likely that the file referenced by the first URL is "index.php".
Those are called SEO-friendly URLs,
and with PHP that can be achieved with the Apache module mod_rewrite.
EDIT: I had not gone through the posted links.
Here, services is the directory, so the file name set with DirectoryIndex in the httpd.conf file (usually index.php, but one can change it) will be executed.
You can also change that file using .htaccess.
Let's do a quick test to understand this.
Create a file inside the services directory and name it, for example, test.php.
Create a .htaccess file and add the line below:
DirectoryIndex test.php
and then go to http://www.techwyse.com/services/
You will now see that test.php is executed instead of index.php.
I have an iframe that loads a remote page (not hosted on the same domain). I would like to edit the contents of the page, but of course, this is not possible, since I don't have permissions.
So, I was wondering: if I have FTP access to the site, would there be a workaround to the problem? With FTP, I could copy the files of the site over to my domain and edit them via an iframe. But I was wondering if there is an alternative method.
Actually, yes. If you have FTP access to the site, you could do it, in theory.
Basically, something like:
// I used jQuery to speed up writing the Ajax code; really it could be anything else
jQuery.get('?refresh', function(){ // this function is called when the request finishes
    // force the iframe to do a complete refresh (hence the random token)
    jQuery('#iframe').attr('src', 'http://targetsite.com/somefile.php?r=' + Math.random());
});
And:
// if the variable in question was set...
if (isset($_REQUEST['refresh'])) {
    // the following requires the "allow_url_fopen" config option to be on;
    // otherwise, you could use any other PHP FTP library
    file_put_contents('ftp://username:password@targetsite.com/somefile.php', 'Hello');
}
Why use iframes? If you need to load the content of a page hosted on another server, you can grab its content with cURL or one of the PHP file wrappers, e.g. the readfile function. Voilà!
If you use readfile(...), you can also edit the file content you've loaded before you display it. If you have permission, you could also use include() to read the file via HTTP, provided you are certain that a valid PHP file will be returned by your request.
I am trying to secure my PHP image upload script, and the last hurdle I have to jump is making it so that users cannot directly execute the images, but the server can still serve them in web pages. I tried changing ownership and permissions of the folders to no avail, so I am trying to store the images above public_html and display them in pages that are stored in public_html.
My File Structure:
- userimages
image.jpg
image2.jpg
- public_html
filetoserveimage.html
I tried linking to an image in the userimages folder like this:
<img src="../userimages/image.jpg">
But it does not work. Is there something I am missing here? If you have any better suggestions, please let me know. I am trying to keep public users from executing potentially dangerous files they may have uploaded, just as an extra security measure. Thanks!
You want something that's basically impossible.
The way a browser loads a page (in a very basic sense) is this:
Step 1: Download the page.
Step 2: Parse the page.
Step 3: Download anything referenced in the content of the page (images, stylesheets, JavaScript files, etc.)
Each "Download" event is atomic.
It seems like you want to only serve images to people who have just downloaded a page that references those images.
As PHP Jedi illustrated, you can pass the files through PHP. You could expand on his code and check the HTTP_REFERER of the request to ensure that people aren't grabbing "just" the image.
Now, serving every image through a PHP passthrough script is not efficient, but it could work.
The most common reason people want to do this is to avoid "hotlinking": when people create image tags on other sites that reference the image on your server. When they do that, you expend resources handling requests whose results get presented on someone else's page.
If that's what you're really trying to avoid, you can use mod_rewrite to check the referer.
A decent-looking discussion of hotlinking and anti-hotlinking measures can be found here.
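A minimal mod_rewrite sketch of such a referer check (example.com stands in for your own domain):

```apache
# .htaccess sketch: refuse image requests whose referer is another site
RewriteEngine On
# allow empty referers (direct visits, strict privacy settings)
RewriteCond %{HTTP_REFERER} !^$
# allow requests coming from pages on your own site
RewriteCond %{HTTP_REFERER} !^https?://(www\.)?example\.com/ [NC]
RewriteRule \.(jpe?g|png|gif)$ - [F,NC]
```

Requests that fail both conditions get a 403 Forbidden instead of the image.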
Use an image relay script!
To serve an image file that is outside the public_html folder, you have to do it via a PHP script. E.g. make an image-relay.php that reads the image that lives outside the public html:
<?php
header('Content-Type: image/jpeg');
$_file = basename('myimage.jpg'); // or basename($_GET['img']); never use raw input
echo file_get_contents('/myimages/' . $_file);
?>
Now, $_file could be a $_GET parameter, but it is absolutely important to validate the input parameter, so that a visitor cannot request arbitrary files outside the image directory.
Now you can use <img src="image-relay.php?img=flower.jpg"> to access a flower.jpg image that is located at /myimages/flower.jpg.
Well, a web browser will only be able to access files and folders inside public_html.
If the public_html directory is the root of the server for your users, Apache cannot serve anything that is not inside or below that directory.
If you want a file to be served by Apache directly, you'll have to put it in or below public_html.
I think your misunderstanding lies in the fact that if you include an image in an <img> tag, your browser sends exactly the same request to the web server to fetch it as it would if you tried to open the src URL of the image in your browser directly.
Therefore, either both things work, or neither.
There are hacks around this, involving a (PHP or other) script that makes sure an IP that requests the image has also requested the HTML page within the last few seconds (which will not work if the user is behind a proxy that rotates outgoing IPs), or that checks the referer (which does not work when the referer is stripped, e.g. for HTTPS pages linking to HTTP resources, or when the user has disabled the referer header).
If you want to make sure that only certain users can see the image (both via an <img> tag and directly), you can put the image outside public_html and have a (PHP or other) script verify the user's credentials before serving the image.
If you are using Apache or lighttpd, you can use the X-Sendfile header to send files that are not in the web root (provided you haven't changed the configuration of mod_xsendfile).
To learn more about X-Sendfile, see this site.
This solution gives you the best possible performance, because it is the web server, not PHP, that sends the file, and the PHP process can exit while the file is being served.
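A minimal sketch of the Apache side (assuming mod_xsendfile is installed; the path is a placeholder):

```apache
# httpd.conf or vhost config: enable X-Sendfile and whitelist the image directory
XSendFile On
XSendFilePath /home/user/userimages
```

The PHP script then sends header('X-Sendfile: /home/user/userimages/image.jpg'); along with the Content-Type header and exits; the server streams the file itself.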
Hope that helps.