Check every request coming to an Apache web server - PHP

I have been given a weird requirement: to check the time difference between each request and response.
I have a PHP file (request.php) serving an image as its response, so I have to check the time difference between the GET for request.php and the GET for the image, for each and every request.
For example:
the request.php request is made at, let's say, 12:40 AM
it responds with an image; when the image is inserted in a web page I get an image request at, say, 12:42 AM
so my requirement is to calculate the time difference between request.php and the image every time they hit my server.
I suggested reading the Apache access log, but the client wants me to note the request time for every request made, store it somewhere, and calculate the average between requests at the end of the day.
My idea is to read the Apache log, but is there a way to write a PHP script which will run for every incoming request and check its timing, probably with a custom access-log file?
Any help!!

You should use a handler to be able to hook into each and every request.
This can be done quite easily, but you should then also serve the files from your PHP.
Update
The following code snippets assume the following:
you have a typical "LAMP stack" installed on the hosting server
your Apache config "allows directory overrides" (with .htaccess files)
your Apache setup has mod_rewrite installed and enabled
Apache+PHP has read permission on all the files created in your docroot
your PHP version is at least 5.5
In your server's docroot folder, create these files:
.htaccess
handler.php
.htaccess
Open the .htaccess file in your favorite text editor, type in the following, and save:
RewriteEngine On
RewriteCond %{REQUEST_URI} !handler.php$
RewriteRule (.*) handler.php
handler.php
Type the following in your handler.php file, and save:
<?php
$over = $_SERVER['SERVER_PROTOCOL'];
// keep only the path - drop the query string
$path = explode('?', $_SERVER['REQUEST_URI'])[0];
$path = ($path == '/') ? '/home.html' : $path;
$extn = pathinfo($path, PATHINFO_EXTENSION);
$mime = # extension => Content-Type map
[
    'html' => 'text/html',
    'css'  => 'text/css',
    'png'  => 'image/png',
    'js'   => 'application/javascript',
];
$type = isset($mime[$extn]) ? $mime[$extn] : 'text/plain';
if (file_exists(".${path}"))
{
    if (is_dir(".${path}") || ($path == '/.htaccess'))
    {
        header("${over} 403 Forbidden");
        echo "path: `$path` is forbidden";
        exit;
    }
    header("${over} 200 OK");
    header("Content-Type: ${type}");
    header("Content-Length: " . filesize(".${path}"));
    readfile(".${path}");
    exit;
}
header("${over} 404 Not Found");
echo "path: `$path` is undefined";
exit;
?>
With each request, get the current microtime.
Then serve the file: get the MIME type of the file, write the appropriate "Content-Type" header, and simply: readfile('path/to/file.mp3');
*where 'path/to/file.mp3' would obviously come from:
$_SERVER['REQUEST_URI'], or some re-routing - however you want it.
Then get the microtime again, subtract the former microtime from the latter, and you have the time it took to serve the file.
So, with that done, you can now log each request in a database, or wherever, per request, specifying the field names accordingly.
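For illustration, a minimal sketch of that timing-and-logging step, assuming it runs inside handler.php right around the readfile() call (the log file name and format are my own assumptions):
<?php
// ... $path and $type resolved as in handler.php above ...
$start = microtime(true);              // seconds as a float, before serving

header("Content-Type: ${type}");
header("Content-Length: " . filesize(".${path}"));
readfile(".${path}");

$elapsed = microtime(true) - $start;   // how long the file took to serve

// append one line per request to a custom access log
// (an INSERT into a database table would work the same way)
file_put_contents(
    __DIR__ . '/timing.log',
    sprintf("%s\t%s\t%.6f\n", date('c'), $path, $elapsed),
    FILE_APPEND | LOCK_EX
);
Averaging the logged values at the end of the day is then a simple pass over the third column.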
I'm not sure how much detail is required; please comment accordingly.

2: From your question, I gather that the requirement is actually to get the time at which the image finally "shows up" in the visitor's browser?
If that is the case, then this may get technical pretty quickly, depending on how accurate the info should be, and also on whether you mind fetching the said images via XMLHttpRequest - via JavaScript. If so, you can do the whole server-side thing as a prerequisite and additionally set a custom response header, i.e.:
header('x-imageID: ' . mysql_insert_id());
Then readfile('blah.jpg');
On the client side, with the XMLHttpRequest, fire another request to the server in the onload event, carrying the image ID from e.g. xhr.getResponseHeader('x-imageID') and the current time in milliseconds: new Date().getTime().
Obviously you will need to set some info to distinguish the requests, so that the server only logs image requests and updates the records with the "update time" requests.
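To make that concrete, a rough sketch of the two server-side endpoints (the DSN, table, and column names are illustrative assumptions, not part of the original answer):
<?php
// serve_image.php - log the serve time and tag the response with the row id
$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');
$pdo->prepare('INSERT INTO image_log (path, served_at) VALUES (?, NOW())')
    ->execute(array($_GET['name']));
header('x-imageID: ' . $pdo->lastInsertId());
header('Content-Type: image/jpeg');
readfile('images/' . basename($_GET['name'])); // basename() blocks path traversal

<?php
// shown.php - called from the XHR onload handler, e.g.
//   shown.php?id=123&ms=1510000000000   (ms from new Date().getTime())
$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');
$pdo->prepare('UPDATE image_log SET shown_at_ms = ? WHERE id = ?')
    ->execute(array((int)$_GET['ms'], (int)$_GET['id']));
The gap between the two recorded timestamps is then the per-image figure the client asked for.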
Hope it helps ;)

Related

Reserve file for ajax request only

I have a massive set of scripts that my core application includes:
include('JS/gramp.php');
include('JS/est.php');
include('JS/curest.php');
include('JS/memomarker.php');
include('JS/local----------.php');
include('JS/poirel.php');
include('JS/maplayers.php');
include('JS/trafficinc.php');
include('JS/plannedtraffic.php');
include('JS/transportissues.php');
include('JS/cams_traff.php');
include('JS/places2.php');
Now these are all being moved to on-the-fly loading, to reduce the size of the application on load:
if(button_case_curtime==true){
$(".jsstm").load("<?php echo $core_dir; ?>JS/curresttime.php?la=<?php echo $caseset['iplat']; ?>&lo=<?php echo $caseset['iplong']; ?>&h=<?php echo $days_h; ?>");
rendermap = true;
}
The issue! The application requires these files to be secure; the data involved requires that no one can access them directly.
The ONLY file that will ever request these files is index.php.
Any input or ideas would be fantastic!
There is no way to provide a file to the browser without also providing it to a user.
You could configure your server to only supply the files given an extra HTTP header (which you could add with JS), but nothing would stop people from sending that header manually or just digging the source out of the browser's debugging tools.
Any user you give the files to will have access to the files. If you want to limit which users have access to them, then you have to use auth/authz (which you'll want to apply to the index.php file as well so that unauthorised users don't just get JS errors or silent failure states).
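For completeness, a sketch of that header check in .htaccess (jQuery's AJAX helpers send X-Requested-With: XMLHttpRequest by default) - but, as said above, any client can fake it:
RewriteEngine On
# refuse direct requests to the JS/ scripts unless the AJAX header is present
RewriteCond %{HTTP:X-Requested-With} !=XMLHttpRequest
RewriteRule ^JS/ - [F]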
No. What you are trying to do is not possible. Ajax requests are not special. They are just HTTP requests. End points created for Ajax should be secured with authentication/authorization just like any other HTTP request end point.
This is a trivial solution that will solve your problem half-way. Request them via a POST request, like so:
$.post('JS/maplayers.php', {'ajax':true}, function(){});
Notice the POST variable 'ajax'. In the file maplayers.php, add the following code at the beginning:
if (!isset($_POST['ajax'])) {
    die('Invalid request, only ajax requests are permitted');
}

Cancel an HTTP POST request server side

I am trying to write a script for uploading large files (>500MB). I would like to do some authentication before the upload is processed, e.g.:
$id = $_GET['key'];
$size = $_GET['size'];
$time = $_GET['time'];
$signature = $_GET['signature'];
$secret = 'asdfgh123456';
if (sha1($id.$size.$time.$secret) != $signature) {
    echo 'invalid signature';
    exit;
}
// process upload...
Unfortunately PHP only runs this code after the file has been uploaded to a temp directory, taking up valuable server resources. Is there a way to do this before the upload happens? I have tried similar things with Perl/CGI but the same thing happens.
Wow, already five answers telling you how it can't be done. mod_perl to the rescue: here you can reject a request before the whole request body is uploaded.
Apache is taking care of the upload before the PHP script is even invoked so you won't be able to get at it.
You can either split up the process into two pages (authentication, file upload page) or, if you need to do it all in one page, use an AJAX-esque solution to upload the file after authentication parameters are checked.
As far as I know, you cannot do that in PHP. The PHP script is launched in response to a request, but the request is not "complete" until the file is uploaded, since the file being uploaded is part of the request.
This is definitely not possible inside the PHP script you're uploading to.
The most simple possibility is indeed to provide authentication one step before the upload takes place.
If that is not an option, one slightly outlandish possibility comes to mind - using a RewriteMap and mapping it to an external program (it should be possible to make that program a PHP script).
Using RewriteMap it is possible to rewrite a URL based on the output of a command-line program. If you use this directive to call a (separate) PHP script - you won't be able to use the user's session, though! - you would have access to the GET parameters before the request is processed.
If the processing fails (= the credentials are invalid), you could redirect the request to a static resource, which would at least prevent PHP from starting up. (I assume the upload will be hogging some resources anyway, but probably fewer than if it were directed to PHP.)
No guarantees whether this'll work! I have no own experience with RewriteMap.
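To make the idea concrete, a hypothetical sketch (RewriteMap is only allowed in the server or virtual-host config, not in .htaccess; all names and paths here are invented):
# httpd.conf
RewriteEngine On
RewriteMap authcheck prg:/var/www/tools/authcheck.php
RewriteRule ^/upload$ ${authcheck:%{QUERY_STRING}} [PT,L]

#!/usr/bin/php
<?php
// authcheck.php - a RewriteMap "prg:" script (needs the shebang above and
// execute permission): Apache writes one lookup key (here, the query string)
// per line to stdin and reads one rewritten path back from stdout
$secret = 'asdfgh123456';
while (($line = fgets(STDIN)) !== false) {
    parse_str(trim($line), $q);
    $ok = isset($q['key'], $q['size'], $q['time'], $q['signature'])
        && sha1($q['key'] . $q['size'] . $q['time'] . $secret) === $q['signature'];
    // valid -> the real upload script; invalid -> a static page, so PHP never starts
    echo $ok ? "/upload.php\n" : "/denied.html\n";
    fflush(STDOUT); // prg: maps must flush each answer, or Apache blocks
}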
This is due to the fact that each HTTP request is a single entity that contains all of the form/POST data, including the file upload data.
As such, I don't believe it's possible to handle a file upload request in this fashion, irrespective of which scripting language you use.
I don't think you can do this. The best you can do is probably to run an AJAX function onSubmit to do your validation first, then, if it returns valid, execute the POST to upload the file. You could set a $_SESSION variable in your AJAX script if the authentication is valid, then check for that session variable in the upload script to allow the upload.
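A minimal sketch of that handshake (script names are placeholders; the signature check is the one from the question):
<?php
// auth.php - called via AJAX before the form is submitted
session_start();
$secret = 'asdfgh123456';
if (sha1($_GET['key'] . $_GET['size'] . $_GET['time'] . $secret) === $_GET['signature']) {
    $_SESSION['upload_ok'] = true;
    echo 'ok';
} else {
    echo 'invalid signature';
}

<?php
// upload.php - the request body has already arrived by now, but at least
// unauthenticated uploads are rejected before any processing
session_start();
if (empty($_SESSION['upload_ok'])) {
    exit('not authorised');
}
// process upload...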

Do not allow hot-linking of images unless logged in

I just ran into something today, and I am not sure how it is being done.
I know a few things:
The site is done in PHP
There is an image gallery; a URL would be something like
http://www.example.com/girls/Cyn/sets/getImage/1170753147/7.jpg
I can see that URL as long as I am logged into the site. It does not appear to be referrer-based, as I took the URL, opened a new browser window, and was able to load it while still logged in. In doing so, I had no referrer.
The second I log out, I am redirected to a please register/login page.
This is a heavy hit site.
How is this done? I can tell they are running Apache on CentOS. When I log in, I am given a cookie containing a hash of something, which I am sure they are using to look up an ID to make sure I am allowed to be logged in.
However, the above is a direct request for a resource that is just a JPG. There has to be some communication, then, with Apache and their database to see the state of that request. How would merely loading a URL send off a cookie value to Apache that could then pass it off to a database?
I am about to embark on a paid membership site and will need to protect images in the same way. This was not HTTP auth; this was form-based login, and I am at a loss as to how this was done. Any ideas?
All requests go through the web server. If a site sets a cookie, then all your requests to that site will include the cookie contents until that cookie expires or is removed. It doesn't matter what you're requesting; it only matters where you are requesting it from.
If you have Firebug, open the 'Net' tab while you're on the site and check all the requests you have made. You'll see a 'Cookie' line in the request headers. This will be on every resource requested: the images, the stylesheets, everything.
If Apache is the web server then it could use mod_rewrite to direct your request or it could pass it to PHP or Perl or something else that can check the cookie and output the image if valid or redirect if not.
Here is a php example (image output taken from php.net):
if (valid_auth($_COOKIE['auth'])) {
    // open the file in binary mode
    $name = './img/ok.png';
    $fp = fopen($name, 'rb');
    // send the right headers
    header("Content-Type: image/png");
    header("Content-Length: " . filesize($name));
    // dump the picture and stop the script
    fpassthru($fp);
    exit;
} else {
    header('Location: /login');
    exit;
}
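As for the mod_rewrite half of that, a sketch (the script name is a placeholder) could be as simple as:
RewriteEngine On
# hand image requests to the PHP cookie check instead of letting Apache
# serve the files directly; the script can inspect $_SERVER['REQUEST_URI']
RewriteRule \.(jpe?g|png)$ /image_check.php [L]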
It's probably a web application that uses a session cookie for authentication and redirects if the session has not been authenticated.
Pretty much any web framework has plugins for this sort of thing. There might even be apache modules to do it, but I haven't seen one.
You must create a "getter" for the images. The images must be stored in a folder outside of the publicly accessible directories.
/public_html
    /js
        jquery.js
    index.php
    getimage.php
/private_images/
    myimage.jpg
Note that the private_images directory is not accessible when you browse to: http://www.mysite.com/private_images
Now, to create the "getter" script.
<?php
/* This is getimage.php */
session_start();
if (!isset($_SESSION['is_logged_in'])) {
    header('Location: /login');
    exit;
}
/*
    Get the image_name from the URL.
    You will be using something like:
    http://www.mysite.com/getimage.php?image_name=flowers.jpg
    This is the way to get the image.
*/
$path = "/var/www/html/private_images";
// basename() stops "../" tricks from escaping the private folder
$name = $path . '/' . basename($_GET['image_name']);
$fp = fopen($name, 'rb');
// send the right headers
header("Content-Type: image/jpeg");
header("Content-Length: " . filesize($name));
// dump the picture and stop the script
fpassthru($fp);
exit;
If you missed the comment above, you can do this to retrieve the image:
http://www.mysite.com/getimage.php?image_name=flowers.jpg
However, the above is a direct request for a resource that is just a jpg. There has to be some communication then with Apache, and their database to see the state of that request. How would merely loading a url, send off a cookie value to apache that could then pass it off to a database?
Every single HTTP request is sent to a web server. The web server then decides how to handle the request, based on a set of rules. By default, Apache has a simple handler that just sends the requested file back to the user. There is, however, no reason why you couldn't configure Apache to handle all requests with a PHP script. On a high-traffic site you would probably solve this differently, since it's a bit expensive to fire up PHP for each and every image shown, but in theory you could just make a mod_rewrite rule that pipes all requests matching a particular pattern (such as ^girls/Cyn/sets/getImage/.*) to a PHP script. This script would then read the actual file from somewhere outside the web root and print it out to the user.
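A sketch of such a rule, using the pattern from the question (the script name is invented):
RewriteEngine On
# pipe gallery image requests into PHP; the script checks the cookie/session,
# then reads the real file from outside the web root and prints it out
RewriteRule ^girls/Cyn/sets/getImage/.* /serve_image.php [L]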

Redirect files to simple form before downloading

Hi, I am trying to redirect all links to any PDF file on my site to a page with a form on it that collects user info before they can proceed to download/view the PDF.
E.g.
I want to redirect *.pdf files on the web site to request.php?file=name_of_pdf_being_redirected
where request.php is the page with the form on it asking for a few details before proceeding.
All PDFs on the site are held inside the /pdf folder.
Any ideas?
EDIT: sorry, I'm using Apache on the server.
OK I'M GETTING THERE:
I have it working now using:
RewriteEngine on
RewriteRule ^pdf/(.+.pdf)$ request.php?file=/$1 [R]
But now, when it goes to the download page and I want to let the person actually download the file, my new rule spits the download link back to the form :-P haha. So is there any way to let it download the file once the form has been submitted and you're on download.php? Ideas?
You could start by telling us which web/app server you're using; that might help :-)
In Apache, you should be able to use a RewriteRule to morph the request into a different form. For example, turning /pub/docs/x.pdf into request.php?file=/pub/docs/x.pdf could be done with something like:
RewriteRule ^/pdf/(.*)\.pdf$ request.php?file=/$1.pdf
Keep in mind this is from memory (six years since I touched Apache, and still clean :-); the format may be slightly different.
Update:
Now you've got that sorted, here's a couple of options for your next problem.
1/ Rename the PDFs to have a different extension so that they're not caught by the rewrite rule. They should be configured to push out the same MIME type to the client so that they open in the client's choice of viewer.
2/ Do the download as part of the script as well, not as a direct access to the PDF. Since the submission of the form is an HTTP request, you should be able to answer it immediately with the PDF contents rather than re-directing them again to the download page (see the sketch after this list).
That second option would be my choice since it:
stops people figuring out they can get to the PDFs just by requesting xx.pdfx instead of xx.pdf.
makes it quicker for the person to get the PDF (they don't have to click on the link again).
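A minimal sketch of that second option (the form field name and the PDF directory are assumptions) - answer the form POST with the PDF itself:
<?php
// request.php - on POST, validate the submitted details, then stream the PDF
if ($_SERVER['REQUEST_METHOD'] === 'POST' /* && the details are valid */) {
    $file = '/srv/pdfs/' . basename($_POST['file']); // held outside the docroot
    header('Content-Type: application/pdf');
    header('Content-Length: ' . filesize($file));
    header('Content-Disposition: attachment; filename="' . basename($file) . '"');
    readfile($file);
    exit;
}
// otherwise, show the form...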
You can try this:
Move your files to a folder "outside" your web root so that no one can access them through a browser.
Use sessions to detect whether a person has completed the form or not.
Use a PHP-powered file download script. In its naivest form, it might look like this:
<?php
session_start();
if ( isset( $_SESSION[ 'OK_TO_DOWNLOAD' ] ) == false )
{
    header( "Location: must_fill_this_first.php" );
    exit( 0 );
}
header( "Content-Type: application/pdf" );
// basename() keeps the request inside the private directory
echo file_get_contents( 'some_directory_inaccessible_thru_www/' . basename( $_GET[ 'pdf_name' ] ) );
// file_get_contents() is binary-safe, though readfile() would avoid
// loading the whole PDF into memory first
This is a tried and tested technique I used on a website. The code example is a draft outline and needs refinement.
Note, my answer is with respect to a .NET website, but I'm sure the same constructs exist somewhere in PHP.
I would have an HTTPModule with a path of *.pdf that simply does a Response.Redirect to request.php?...etc (in my case request.aspx). Then, in the event handler for the button click on that page, when you know which PDF to display and that they're authorized, simply do a Response.ContentType = [MIME type of pdf], then Response.WriteFile(pdfFile), and finally Response.End().
There are other things you can add to make it better, such as file size, etc. But in the minimal case, this would work. If you want the code for it in C# I could come up with something in about 3 minutes, but in PHP I'm quite lost. I'd start out looking for HTTPModules and how to write them in PHP.
Googling for "PHP HTTPModule" leads to this: Equivalent of ASP.NET HttpModules in PHP so, I may be a little wrong, but hopefully that's a starting point.
Use an .htaccess file if you're using an Apache web server. You'll need to make certain that you have mod_rewrite enabled, but once you do, you can rewrite all PDF requests using these two simple lines:
RewriteEngine On
RewriteRule \.pdf$ /rewrite.php [NC,L]
If you are using IIS, you can accomplish something similar using ISAPI_Rewrite.
Your other alternative is to place your PDFs inside a directory that is not publicly accessible. Then any direct request for a PDF resource would return an access-denied error, and the files could only be accessed through the appropriate download script.
if ($user == $authenticated) { // i.e. your own auth check
    // set pdf headers
    header('Content-Type: application/pdf');
    echo file_get_contents('actual.pdf');
}
No mod_rewrites; it hides the actual source and is what I normally do - hope this helps.

How to prevent PHP's file_get_contents()

One of my PHP pages returns data like this:
<?php
//...
echo "json string";
?>
But someone else uses file_get_contents() to get my data and use it on another website.
Can anybody tell me what I can do to prevent such a thing from happening?
I considered getting the request's domain name to echo something else, but I don't know
the function to get the request's domain name - and if the request is sent by a server, that
won't help. My English is poor; please bear with me.
You can also use sessions. If, somewhere in your application before the user gets the JSON data, you start a session, then in the page where you output the JSON data you can check for the session variable. This way, only users that have passed through the session-generator page can view your output.
Suppose you have a page A.php that generates the session. Use this code before outputting anything on that page:
session_start();
$_SESSION['approvedForJson'] = true;
Then, in the page where you output the JSON data, call session_start() again before outputting anything; the beginning of your PHP code is a good place to call it.
Then, before outputting the JSON data, check whether the session variable for approved users exists:
if (isset($_SESSION['approvedForJson']) && $_SESSION['approvedForJson']) {
    echo "json data";
} else {
    // bad request
}
You can use $_SERVER['REMOTE_ADDR'] to get the address of the requesting client. You can also check $_SERVER['HTTP_REFERER'] and block external requests that way, but it's less reliable. There are probably a few other techniques involving $_SERVER that you can try.
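For example (a sketch - the address and domain are placeholders, and remember the Referer header is client-supplied and therefore spoofable):
<?php
// allow one known server-to-server consumer by IP, or same-site browsers by referer
$allowed_ip = '203.0.113.5'; // hypothetical address of your own front end
$referer    = isset($_SERVER['HTTP_REFERER']) ? $_SERVER['HTTP_REFERER'] : '';
if ($_SERVER['REMOTE_ADDR'] !== $allowed_ip
    && strpos($referer, 'http://www.example.com/') !== 0) {
    header('HTTP/1.1 403 Forbidden');
    exit;
}
echo "json string";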
You're fighting an uphill battle here. I am assuming the server-side process that responds in JSON is being consumed via JavaScript in your users' browsers... so there is no easy way to encrypt it. You might try some of the techniques used to prevent XSRF (see http://en.wikipedia.org/wiki/Cross-site_request_forgery). If you developed the client to pass along some session token that is unique per client, you could reduce some of the problem. But chances are whoever is stealing your data is going to figure out whatever mechanism you put in place... assuming this is some sort of AJAX-type thing. If it's a server-to-server thing then, as sli mentions, setting up some restrictions based on the remote IP would help, plus setting up some sort of API authentication tokens would help even more (see OAuth for some pointers).
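A sketch of that per-client token idea (names are mine; it raises the bar without fully solving the problem, as noted above):
<?php
// page.php - the page that embeds the JavaScript client: mint a session token
session_start();
if (empty($_SESSION['json_token'])) {
    $_SESSION['json_token'] = bin2hex(openssl_random_pseudo_bytes(16));
}
echo "<script>var token = '{$_SESSION['json_token']}';</script>";

<?php
// data.php - the JSON endpoint: require the token on every request
session_start();
if (!isset($_GET['token']) || $_GET['token'] !== $_SESSION['json_token']) {
    header('HTTP/1.1 403 Forbidden');
    exit;
}
echo "json string";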
You could also use .htaccess with Apache to block every external request to the page, so that it can only be called internally, or block every request that is not from your domain:
Google search thingie
EDIT
You could also use a PHP file which includes the file that should not be readable directly. So, for example, you have file.php:
<?php
$allowedFiles[] = 'somefile.php';
$allowedFiles[] = 'someotherFile.php';
$allowedFiles[] = 'jsonReturnFile.php';
if (in_array($_GET['file'], $allowedFiles)) {
    include("include/" . $_GET['file']);
}
?>
Then you can allow file_get_contents() on that file and write a RewriteRule in your .htaccess to disallow any request to the include/ folder:
RewriteRule ^include/ - [F,NC]
That will return a 403 Forbidden error for a request to that directory or any file in it.
Then you can make your JSON request to something like: file.php?file=jsonReturnFile.php&someOtherParamReadByJsonFile=1
And when someone tries to get the file contents of the JSON file directly, they will get the forbidden error, and getting the file contents of the include itself won't return anything useful.
