I have seen file-protection methods used on many web sites, such as YouTube, file-hosting sites, music sites, and Facebook. They use a special way to control the availability of a file.
Links look like this,
http://www.mysite.com/music/audio.mp3?Expires=1354180089&Key=APKAIKAIRXBA2H7FXITA
After the expiry, the file is no longer available, so a user who wants the file has to request it again with a new expiry code. This prevents other sites from using the file illegally and protects bandwidth.
With such a scheme, the file is not available forever the way it would be at a plain URL like http://www.mysite.com/music/audio.mp3.
I searched everywhere for tutorials but couldn't find any. Help me out?
In this case, audio.mp3 is not a real MP3 file; it is a script which checks the session expiry time and, if the session is still valid, sends the right headers and prints out the real MP3 file, which is located somewhere on the server where only the script can access it. Something like this:
<?php
session_start();

// assumes something earlier stored an expiry timestamp in the session
if (isset($_SESSION['expires']) && $_SESSION['expires'] > time()) {
    header('Content-Type: audio/mpeg');
    readfile('/path/outside/webroot/audio.mp3'); // the real file, not web-accessible
} else {
    header('Content-Type: text/html');
    echo 'This link has expired; please request a new one.';
}
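For links in the style of the question (Expires/Key query parameters), the link itself can also carry the expiry. A minimal sketch, using an HMAC scheme of my own invention rather than any particular site's:

<?php
// sketch: generate an expiring, signed download link (all names illustrative)
$secret  = 'server-side secret';
$path    = '/music/audio.mp3';
$expires = time() + 3600; // link stays valid for one hour
$key     = hash_hmac('sha256', $path . $expires, $secret);
echo "http://www.mysite.com{$path}?Expires={$expires}&Key={$key}";
// the serving script recomputes the HMAC from the path and the Expires value
// and refuses the request if the Key differs or Expires is in the past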
I have a mass of scripts that my core application includes:
include('JS/gramp.php');
include('JS/est.php');
include('JS/curest.php');
include('JS/memomarker.php');
include('JS/local----------.php');
include('JS/poirel.php');
include('JS/maplayers.php');
include('JS/trafficinc.php');
include('JS/plannedtraffic.php');
include('JS/transportissues.php');
include('JS/cams_traff.php');
include('JS/places2.php');
Now these are all being moved to on-the-fly loading, to reduce the size of the application on load:
if(button_case_curtime==true){
$(".jsstm").load("<?php echo $core_dir; ?>JS/curresttime.php?la=<?php echo $caseset['iplat']; ?>&lo=<?php echo $caseset['iplong']; ?>&h=<?php echo $days_h; ?>");
rendermap = true;
}
The issue: the application requires these files to be secure; the data involved requires that no one can access it.
The ONLY file that will ever request these files is index.php.
Any input or ideas would be fantastic!
There is no way to provide a file to the browser without also providing it to a user.
You could configure your server to only supply the files given an extra HTTP header (which you could add with JS), but nothing would stop people from sending that header manually or just digging the source out of the browser's debugging tools.
Any user you give the files to will have access to the files. If you want to limit which users have access to them, then you have to use auth/authz (which you'll want to apply to the index.php file as well so that unauthorised users don't just get JS errors or silent failure states).
No. What you are trying to do is not possible. Ajax requests are not special; they are just HTTP requests. Endpoints created for Ajax should be secured with authentication/authorization just like any other HTTP endpoint.
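A minimal sketch of that, assuming index.php sets a session flag such as $_SESSION['user_id'] after a real login (the names here are mine, not from the question):

<?php // at the top of each JS/*.php endpoint
session_start();

// reject anyone who has not authenticated through the normal login flow
if (empty($_SESSION['user_id'])) {
    header('HTTP/1.1 403 Forbidden');
    exit('Forbidden');
}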
This is a trivial solution that will only solve your problem half-way. Request the files via POST, like so:
$.post('JS/maplayers.php', {'ajax':true}, function(){});
Notice the POST variable 'ajax'. In the file maplayers.php, add to the beginning the following code:
if (!isset($_POST['ajax'])) {
    die('Invalid request, only ajax requests are permitted');
}
I'm developing sites, and some visitors' browsers show up with an old cache.
Is there a way to clear visitors' browser caches from the server side, or even with JavaScript, so they don't have to clear them themselves?
I cannot find a direct answer to this.
There must be a way; big companies like Facebook, eBay, etc. must do it.
We have been using .htaccess to determine the caching rules for clients. We explicitly give the cache a 24-hour lifetime, and we put no-cache rules in place the day before we do an update. It has helped, but it is tedious and not very reliable.
Just posting it to give you ideas if no one answers, but I would really love to get the answer too. :)
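To illustrate the rule-swapping described above, a sketch using mod_headers (the lifetimes are only examples):

<IfModule mod_headers.c>
    # normal operation: let browsers cache responses for 24 hours
    Header set Cache-Control "max-age=86400, public"
    # the day before an update, swap in this line instead:
    # Header set Cache-Control "no-cache, no-store, must-revalidate"
</IfModule>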
First Method:
You can actually save the output of the page before you end the script, then load the cache at the start of the script.
example code:
<?php
$cachefile = 'cache/'.basename($_SERVER['PHP_SELF']).'.cache'; // e.g. cache/index.php.cache
$cachetime = 3600; // time to cache, in seconds
if(file_exists($cachefile) && time() - $cachetime <= filemtime($cachefile)){
    $c = @file_get_contents($cachefile);
    echo $c;
    exit;
}elseif(file_exists($cachefile)){
    unlink($cachefile); // cache is stale, remove it
}
ob_start();
// all the coding goes here
$c = ob_get_contents();
file_put_contents($cachefile, $c); // the content goes in as the second argument
ob_end_flush(); // send the buffered output to the browser
?>
If you have a lot of pages needing this caching you can do this:
in cachestart.php:
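cachestart.php would presumably hold the top half of the script above (the cache lookup plus ob_start()), along these lines:

<?php
$cachefile = 'cache/'.basename($_SERVER['PHP_SELF']).'.cache';
$cachetime = 3600;
if(file_exists($cachefile) && time() - $cachetime <= filemtime($cachefile)){
    echo @file_get_contents($cachefile);
    exit;
}
ob_start();
?>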
in cacheend.php:
<?php
$c = ob_get_contents();
file_put_contents($cachefile, $c);
ob_end_flush(); // send the buffered output to the browser
?>
Then just simply add
include('cachestart.php');
at the start of your scripts, and add
include('cacheend.php');
at the end of your scripts. Remember to have a folder named cache and allow PHP to access it.
Also remember that if you're doing a full-page cache, your page should not have session-specific output (e.g. a logged-in members' bar), because that will be cached as well. Look at a framework for more specific caching (of a variable or part of the page).
Second Method:
Use Squid or update the HTTP headers correctly to do browser caching.
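For the header route, something along these lines (the 24-hour lifetime is just an example):

<?php
// tell the browser it may cache this response for 24 hours
header('Cache-Control: max-age=86400, public');
header('Expires: ' . gmdate('D, d M Y H:i:s', time() + 86400) . ' GMT');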
Third Method:
PEAR has a caching package (actually two):
http://pear.php.net/package/Cache
Fourth Method:
Use http://memcached.org/. There's an explanation of how to do it on that site.
I usually use a combination of techniques:
HTML resulting from PHP code is not cached using the standard configuration, because it sends out the appropriate headers automatically.
Images and other binary assets get renamed if they change.
For JavaScript and CSS I add an automatically created unique code (e.g. an MD5 hash of the contents, or the file size) to the filename (e.g. /public/styles.f782bed8.css) and remove it again with mod_rewrite. This way every change in the file results in a new file name. This can be done at runtime in PHP while outputting the HTML header, so it is fully automated. In this case, however, the MD5 might have a performance impact.
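A minimal sketch of that idea (the helper name and paths are mine):

<?php
// emit an asset URL with a content hash embedded in the name
function asset_url($path) {
    $hash = substr(md5_file($_SERVER['DOCUMENT_ROOT'] . $path), 0, 8);
    return preg_replace('/\.(css|js)$/', '.' . $hash . '.$1', $path);
}
echo '<link rel="stylesheet" href="' . asset_url('/public/styles.css') . '">';

And the mod_rewrite counterpart that maps the hashed name back onto the real file:

RewriteEngine On
# .htaccess: strip the hash so the request maps onto the real file
RewriteRule ^(.+)\.[0-9a-f]{8}\.(css|js)$ $1.$2 [L]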
I am trying to write a script for uploading large files (>500MB). I would like to do some authentication before the upload is processed, e.g.:
$id = $_GET['key'];
$size = $_GET['size'];
$time = $_GET['time'];
$signature = $_GET['signature'];
$secret = 'asdfgh123456';
if (sha1($id.$size.$time.$secret) != $signature) {
    echo 'invalid signature';
    exit;
}
// process upload...
Unfortunately, PHP only runs this code after the file has already been uploaded to a temporary directory, taking up valuable server resources. Is there a way to do the check before the upload happens? I have tried similar things with Perl/CGI, but the same thing happens.
Wow, already five answers telling you it can't be done. mod_perl to the rescue: there you can reject a request before the whole request body is uploaded.
Apache is taking care of the upload before the PHP script is even invoked so you won't be able to get at it.
You can either split up the process into two pages (authentication, file upload page) or, if you need to do it all in one page, use an AJAX-esque solution to upload the file after authentication parameters are checked.
As far as I know, you cannot do that in PHP. The PHP script is launched in response to a request, but the request is not complete until the file is uploaded, since the file being uploaded is part of the request.
This is definitely not possible inside the PHP script you're uploading to.
The most simple possibility is indeed to provide authentication one step before the upload takes place.
If that is not an option, one slightly outlandish possibility comes to mind - using a RewriteMap and mapping it to an external program (it should be possible to make that program a PHP script).
Using RewriteMap it is possible to rewrite a URL based on the output of a command-line program. If you use this directive to call a (separate) PHP script (you won't be able to use the user's session, though!), you would have access to the GET parameters before the request body is processed.
If the check fails (the credentials are invalid), you could redirect the request to a static resource, which would at least prevent PHP from starting up. (I assume the upload will still hog some resources either way, but probably fewer than if it were handed to PHP.)
No guarantees whether this'll work! I have no own experience with RewriteMap.
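A rough sketch of what that could look like (untested, per the caveat above; paths are placeholders). Note that RewriteMap must live in the server or virtual-host config, not in .htaccess:

RewriteEngine On
# the checker program loops forever, reading one key per line on stdin and
# printing one result line per key
RewriteMap uploadauth "prg:/usr/bin/php /var/www/bin/checkauth.php"
# hand the query string to the checker; it prints the real upload handler on
# success, or the path of a static error page on failure
RewriteRule ^/upload\.php$ ${uploadauth:%{QUERY_STRING}} [L]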
This is due to the fact that each HTTP request is a single entity that contains all of the form/POST data, including the file-upload data.
As such, I don't believe it's possible to handle a file upload request in this fashion irrespective of which scripting language you use.
I don't think you can do this. The best you can do is probably to run an AJAX call on submit to do your validation first, then, if it returns valid, execute the POST to upload the file. You could set a $_SESSION variable in your AJAX script if the authentication is valid, then check for that session variable in the upload script to allow the upload.
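A minimal sketch of that two-step idea, reusing the signature check from the question (the file names are mine):

<?php // check.php, called via AJAX before the form is submitted
session_start();
$secret = 'asdfgh123456';
if (sha1($_GET['key'].$_GET['size'].$_GET['time'].$secret) === $_GET['signature']) {
    $_SESSION['upload_ok'] = true;
    echo 'ok';
}

<?php // upload.php: bail out early if the AJAX step never ran
session_start();
if (empty($_SESSION['upload_ok'])) {
    exit('invalid signature');
}
// process upload...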
I am writing an anti-leeching download script, and my plan is to create a temporary file named after the session ID; after the session expires, the file will be automatically deleted. Is that possible? Can you give me some tips on how to do that in PHP?
Thanks so much for any reply
PHP has a function for that named tmpfile. It creates a temporary file and returns a resource. The resource can be used like any other resource.
E.g. the example from the manual:
<?php
$temp = tmpfile();
fwrite($temp, "writing to tempfile");
fseek($temp, 0);
echo fread($temp, 1024);
fclose($temp); // this removes the file
?>
The file is automatically removed when closed (using fclose()) or when the script ends. You can use any file function on the resource; you can find these here. Hope this helps!
Another solution would be to create the file in the regular way and use a cronjob to regularly check whether a session has expired. The expiration date and other session data could be stored in a database. Use the script to query that data and determine whether a session is expired; if so, remove the file physically from the disk. Make sure to run the script once an hour or so (depending on your timeout).
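A minimal sketch of such a cleanup job (the table and column names are made up):

<?php // cleanup.php, run hourly from cron
$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');

// find downloads whose session has expired and remove them from disk
foreach ($pdo->query("SELECT file_path FROM downloads WHERE expires_at < NOW()") as $row) {
    @unlink($row['file_path']);
}
$pdo->exec("DELETE FROM downloads WHERE expires_at < NOW()");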
So we have one or more files available for download. Creating a temporary file for each download request is not a good idea; creating a symlink() for each file instead is a much better one. This will save loads of disk space and keep the server load down.
Naming the symlink after the user's session is a decent idea. A better idea is to generate a random symlink name and associate it with the session, so the script can handle multiple downloads per session. You can use session_set_save_handler() (link) and register a custom handler that removes the symlinks when a session has expired.
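A minimal sketch of the symlink idea (the paths and names are mine; random_bytes() needs PHP 7+, and the filesystem must support symlinks):

<?php
session_start();

// map a random token to the real file for this session
$token = bin2hex(random_bytes(16));
$real  = '/var/files/album.zip';              // stored outside the web root
$link  = '/var/www/html/dl/' . $token . '.zip';

symlink($real, $link);
$_SESSION['downloads'][$token] = $link;       // remembered so expiry code can unlink() it

echo '<a href="/dl/' . $token . '.zip">Download</a>';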
Could you explain your problem a bit more? I don't see a reason not to use $_SESSION. By the way, the data in $_SESSION is stored server-side in a file (see http://php.net/session.save-path), at least by default. ;-)
Ok, so we have the following requirements so far
Let the user download in his/her session only
No copying and pasting the link to somebody else
Users have to download from the site, i.e. no hotlinking
Control speed
Let's see. This is not working code, but it should work along these lines:
<?php // download.php
session_start(); // start or resume a session

// always sanitize user input
$fileId  = filter_input(INPUT_GET, 'fileId', FILTER_SANITIZE_NUMBER_INT);
$token   = filter_input(INPUT_GET, 'token', FILTER_UNSAFE_RAW);
$referer = filter_input(INPUT_SERVER, 'HTTP_REFERER', FILTER_SANITIZE_URL);
$script  = filter_input(INPUT_SERVER, 'SCRIPT_NAME', FILTER_SANITIZE_URL);

// mush session_id and fileId into an access token
$secret = 'i can haz salt?';
$expectedToken = md5($secret . session_id() . $fileId);

// check if the request came from download.php and has the valid access token
if (($expectedToken === $token) && ($referer === $script)) {
    $file = realpath('path/to/files/' . $fileId . '.zip');
    if (is_readable($file)) {
        session_destroy(); // optional
        header(/* stuff */);
        readfile($file); // fpassthru() needs a handle; readfile() takes the path
        exit;
    }
}
// if no file was sent, send the page with the download link
?>
<html ...
<?php printf('<a href="download.php?fileId=%s&token=%s">Download</a>',
    $fileId, $expectedToken); ?>
...
</html>
And that's it. No database required. This should cover requirements 1-3. You cannot really control speed with PHP, but if you don't destroy the session after sending a file, you could write a counter to the session and limit the number of files the user is sent during a session.
I wholeheartedly agree that this could be solved much more elegantly than with this monkeyform hack, but as proof-of-concept, it should be sufficient.
I'd suggest you not copy the file in the first place. I'd do the following: when the user requests the file, generate a random unique string and give him the link in this form: dl.php?k=hd8DcjCjdCkk123. Then put this string in a database, storing his IP address, maybe the session, and the time you generated the link. When the user requests that file, make sure all the stuff (hash, IP and so on) matches and the link has not expired (e.g. no more than N hours have passed since generation), and if everything is OK, use PHP to pipe the file. Set a cron job to look through the DB and remove the expired entries. What do you think?
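A sketch of the generation step (assuming a PDO connection in $pdo; the table and column names are invented):

<?php
// create a one-off token and record who asked for it and when
$hash = md5(uniqid(mt_rand(), true));
$stmt = $pdo->prepare(
    "INSERT INTO dl_tokens (hash, ip, created_at) VALUES (?, ?, NOW())"
);
$stmt->execute(array($hash, $_SERVER['REMOTE_ADDR']));

echo '<a href="dl.php?k=' . $hash . '">Download</a>';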
tmpfile
Creates a temporary file with a unique name in read-write (w+) mode and returns a file handle. The file is automatically removed when closed (using fclose()), or when the script ends.
Maybe it's too late to answer, but I'll share one feature worth googling: if you use cPanel, there is a short and quick way of blocking external requests to your hosted files, called Hotlink Protection.
You can enable Hotlink Protection in cPanel and be sure nobody can request your files from another host or use them as a download reference.
To achieve this, I would make one file and protect it using chmod, making it unavailable to the public. Alternatively, save the contents in a database table row and fetch them whenever required.
Then make it downloadable as a file: get the contents from the protected file (or fetch them from the database table) and simply output them. Using PHP headers, give the output the desired name and extension, specify its type, and finally force the browser to download it as a single file.
This way, you only need to save the data in one place, either in a protected file or in the database, and you can have the client browser download it as many times as your conditions are met (e.g. as long as the user is logged in, and so on), without having to worry about disk space, temp files, cron jobs, or auto-deletion of files.
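A minimal sketch of the forced download (the path and file name are placeholders):

<?php
$file = '/home/site/protected/report.pdf'; // outside the web root, or chmod'd private

header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="report.pdf"');
header('Content-Length: ' . filesize($file));
readfile($file); // stream the protected file to the browser
exit;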
One of my PHP pages returns data like this:
<?php
//...
echo "json string";
?>
But someone else uses file_get_contents() to fetch my data and use it on another website.
Can anybody tell me what I can do to prevent such a thing from happening?
I wondered whether I could get the requester's domain name and echo something else, but I don't know the function to get it, and if the request is sent by a server, that won't help anyway. My English is poor; please bear with me.
You can also use sessions. If, somewhere in your application before the user gets the JSON data, you start a session, then in the page where you output the JSON you can check for the session variable. This way only users who have passed through the session-generating page can view your output.
Suppose you have a page A.php that generates the session. Use this code before outputting anything on that page:
session_start();
$_SESSION['approvedForJson'] = true;
Then in the page where you output the JSON data, call session_start() again before outputting anything; the beginning of your PHP code is a good place for it.
Then, before outputting the JSON data, check whether the session variable for approved users exists:
if (isset($_SESSION['approvedForJson']) && $_SESSION['approvedForJson']) {
    echo "json data";
} else {
    // bad request
}
You can use $_SERVER['REMOTE_ADDR'] to get the address of the client. You can also check $_SERVER['HTTP_REFERER'] and block external requests that way, but it's less reliable. There are probably a few other techniques involving $_SERVER that you can try.
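For illustration, a crude referer check (easily spoofed, so treat it as a speed bump only; the domain is a placeholder):

<?php
$referer = isset($_SERVER['HTTP_REFERER']) ? $_SERVER['HTTP_REFERER'] : '';
if (strpos($referer, 'yourdomain.com') === false) {
    header('HTTP/1.1 403 Forbidden');
    exit;
}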
You're fighting an uphill battle here. I am assuming your server-side process that responds in JSON is being consumed via JavaScript in your users' browsers, so there is no easy way to encrypt it. You might try some of the techniques used to prevent cross-site request forgery (see http://en.wikipedia.org/wiki/Cross-site_request_forgery). If you develop the client to pass along some session token that is unique per client, you can reduce the problem somewhat. But chances are whoever is stealing your data is going to figure out whatever mechanism you put in place, assuming this is some sort of Ajax thing. If it's a server-to-server thing then, as sli mentions, setting up some restrictions based on the remote IP would help, and setting up some sort of API authentication tokens would help even more (see OAuth for some pointers).
You could also use .htaccess with Apache to block every external request to the page, so it can only be called internally, or to block every request that is not from your domain:
Google search thingie
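Something along these lines (a sketch; the domain is a placeholder, and the Referer header is easily forged):

RewriteEngine On
# refuse requests for the JSON endpoint whose Referer is not this site
RewriteCond %{HTTP_REFERER} !^https?://(www\.)?example\.com/ [NC]
RewriteRule ^jsonReturnFile\.php$ - [F]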
EDIT
You could also use a PHP file which includes the file that cannot be read directly. So for example you have file.php:
<?php
$allowedFiles[] = 'somefile.php';
$allowedFiles[] = 'someotherFile.php';
$allowedFiles[] = 'jsonReturnFile.php';
if (in_array($_GET['file'], $allowedFiles)) {
    include('include/' . $_GET['file']);
}
?>
Then you can allow file_get_contents() on that file, and write a RewriteRule in your .htaccess to disallow any request to the include/ folder:
RewriteRule ^include/ - [F,NC]
That will return a 403 forbidden error for a request to that directory or any file in the directory.
Then you can make your JSON request to something like: file.php?file=jsonReturnFile.php&someOtherParamReadByJsonFile=1
And when someone tries to get the file contents of the JSON file directly, they will get the forbidden error, and getting the file contents of file.php won't return anything useful.