I have a page that I want to execute via cron. It just does some pretty simple archiving stuff, nothing super high-security, no DB access etc.
Is it a secure practice to simply require a GET var to be present in order to execute the page? So myarchiver.php would be something like:
<?php
$mysecret_word = "abc123";
if ($_GET['secret'] == $mysecret_word){
// execute my stuff here
}
Then you'd just call myarchiver.php?secret=abc123 in the crontab and the process would run, while any wrong answer or attempt to execute the page with no secret would simply present a blank page (with no extra server load).
I realize this is not "secure" against man-in-the-middle attacks or if the site were compromised -- but I believe in general it's plenty secure to keep this script from being fired by random script kiddies and other idiots who may somehow know about its existence. The thing I'm guarding against is random malicious users who may know about this script bombarding it with requests in order to DoS/tie up resources.
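If the static-secret approach is kept, a constant-time comparison at least avoids leaking the secret through timing differences. A minimal sketch of that hardening, assuming PHP >= 5.6 for hash_equals(); the parameter name and secret are placeholders:

```php
<?php
// Hypothetical hardened version of the check above.
// hash_equals() compares strings in constant time, so an attacker
// can't narrow down the secret byte by byte from response timing.
$my_secret_word = 'abc123'; // placeholder secret

$given = isset($_GET['secret']) ? (string) $_GET['secret'] : '';
if (!hash_equals($my_secret_word, $given)) {
    header('HTTP/1.0 404 Not Found'); // look like a missing page to probers
    exit;
}
// execute the archiving work here
```

Responding with a 404 rather than a blank 200 also gives probers no hint that the script exists.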
EDIT TO ADD: the server is not accessible via SSH and the cron is being executed on a remote machine-- so it must be done via an http request.
Thanks for input.
If this script is never going to be run from the browser, you should place the file outside of your web root directory where browsers cannot reach it and just have your cron run the script at the alternate location. It would definitely be the most secure way to do it.
If you're on a shared hosting environment, you may need browser access for manual running. I would just use SSH to manually run the file from its location since it only takes me a couple seconds to login to SSH and get to the directory. Personally, I just prefer not to have excess pages laying around on my website.
First off, why not just check the IP address of the server making the request?
If it has to be done via an HTTP request and simply checking the IP address isn't an option, you can have your cron run a script similar to "runcron.php". That script would in turn make a CURL or WGET request to the actual cron script you want to run.
That would allow you to pass a dynamic hash instead of a static key. That would prevent someone from just repeating the HTTP request if they happen to sniff the traffic. For the hash you could use anything dynamic like the date combined with a salt.
Example:
if (md5('secretword' . date('H')) === $_GET['hash']) { // do cron }
That would at least rotate your key once an hour.
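A sketch of both sides of that rotating-hash scheme; the salt and URL are placeholders, and note the scheme assumes both machines share the salt and have synchronized clocks (a request straddling an hour boundary will fail):

```php
<?php
// Shared between runcron.php (the caller) and myarchiver.php (the receiver).
$salt = 'some-long-random-salt'; // placeholder; keep it secret

// The key rotates once an hour, as suggested above.
function cron_hash($salt) {
    return md5($salt . date('H'));
}

// runcron.php side -- URL is a placeholder:
// file_get_contents('https://example.com/myarchiver.php?hash=' . cron_hash($salt));

// myarchiver.php side:
$given = isset($_GET['hash']) ? (string) $_GET['hash'] : '';
if (hash_equals(cron_hash($salt), $given)) {
    // do cron work
}
```

A sniffed hash is still replayable for up to an hour, so this raises the bar rather than eliminating replay entirely.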
Also, crontab won't let you pass GET variables. You'll have to do this -
/usr/bin/php /home/blah.php hello
Then in the script -
$argv = $_SERVER['argv'];
echo $argv[1];
Someone correct me if I'm mistaken.
This is a technique that facebook uses on their logout.php file, so that if someone sends a link to logout.php it won't log them out. I would recommend doing this.
$mysecret_word = "abc123";
if ($_GET['asd2edxc32cwqcxad'] === $mysecret_word){
// execute my stuff here
} else {
error_log('oopis');
header('HTTP/1.0 404 Not Found');
die();
}
Related
I have a mass of scripts that my core application includes:
include('JS/gramp.php');
include('JS/est.php');
include('JS/curest.php');
include('JS/memomarker.php');
include('JS/local----------.php');
include('JS/poirel.php');
include('JS/maplayers.php');
include('JS/trafficinc.php');
include('JS/plannedtraffic.php');
include('JS/transportissues.php');
include('JS/cams_traff.php');
include('JS/places2.php');
Now these are all being moved to on-the-fly loading, to reduce the size of the application on load:
if(button_case_curtime==true){
$(".jsstm").load("<?php echo $core_dir; ?>JS/curresttime.php?la=<?php echo $caseset['iplat']; ?>&lo=<?php echo $caseset['iplong']; ?>&h=<?php echo $days_h; ?>");
rendermap = true;
}
The issue: the application requires these files to be secure; the data involved must not be accessible to anyone else.
The ONLY file that will ever request these files will be index.php.
Any input or ideas would be fantastic!
There is no way to provide a file to the browser without also providing it to a user.
You could configure your server to only supply the files given an extra HTTP header (which you could add with JS), but nothing would stop people from sending that header manually or just digging the source out of the browser's debugging tools.
Any user you give the files to will have access to the files. If you want to limit which users have access to them, then you have to use auth/authz (which you'll want to apply to the index.php file as well so that unauthorised users don't just get JS errors or silent failure states).
No. What you are trying to do is not possible. Ajax requests are not special. They are just HTTP requests. End points created for Ajax should be secured with authentication/authorization just like any other HTTP request end point.
This is a trivial solution that will solve your problem half-way. Request them via a POST request, like so:
$.post('JS/maplayers.php', {'ajax':true}, function(){});
Notice the POST variable 'ajax'. In the file maplayers.php, add to the beginning the following code:
if((!isset($_POST['ajax']))) {
die('Invalid request, only ajax requests are permitted');
}
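A common variant of the same half-measure is to check the X-Requested-With header that jQuery adds to its Ajax requests. Like the POST flag above, this is advisory only -- any HTTP client can forge the header -- so it filters casual visits, not attackers:

```php
<?php
// Advisory check only: jQuery sets this header on $.post()/$.get(),
// but any client can send it manually.
$isAjax = isset($_SERVER['HTTP_X_REQUESTED_WITH'])
    && strtolower($_SERVER['HTTP_X_REQUESTED_WITH']) === 'xmlhttprequest';

if (!$isAjax) {
    header('HTTP/1.0 403 Forbidden');
    die('Invalid request, only ajax requests are permitted');
}
```

For data that genuinely must be protected, real authentication/authorization (as the other answers say) is the only correct fix.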
how can I invoke a php script on a remote server from my server code ?
I'm currently using:
header('Location: http://www.url.com/script.php?arg1=blabla');
in my code, but it doesn't work.
thanks
If you mean by invoking just "calling" it, so you only need it to run, then you can use curl.
If you mean by invoking that you want it to act the same as include, then you can't through HTTP (the server of course does not return code, but runs it). You might be able to obtain the file through other means (FTP?), and then include it, but that seems like a bit of a hack.
If you mean by invoking that you want to redirect the user to the page, then this should work:
header('Location: http://www.site.nl/');
exit;
(your script continues to run after a header call, so you might need to call that exit). In what way doesn't your code work? (I'm guessing because you want one of the other options.)
If you only want to invoke the script you can simply use $result = file_get_contents('http://www.example.com/');.
Your version using header() will as said above redirect the user.
Use cURL, it gives you much wider manipulation options.
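A minimal cURL sketch for that kind of invocation, using the URL from the question as a placeholder:

```php
<?php
// Invoke a remote script over HTTP and capture its output.
$ch = curl_init('http://www.url.com/script.php?arg1=blabla');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true); // return the body instead of printing it
curl_setopt($ch, CURLOPT_TIMEOUT, 30);          // don't hang forever on a slow remote
$result = curl_exec($ch);
if ($result === false) {
    error_log('Request failed: ' . curl_error($ch));
}
curl_close($ch);
```

Unlike header('Location: ...'), this runs on your server, so it works even when no browser is involved (e.g. from cron).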
I am trying to write a script for uploading large files (>500MB). I would like to do some authentication before the upload is processed, eg:
$id = $_GET['key'];
$size = $_GET['size'];
$time = $_GET['time'];
$signature = $_GET['signature'];
$secret = 'asdfgh123456';
if(sha1($id.$size.$time.$secret) != $signature){
echo 'invalid signature';
exit;
}
process upload...
unfortunately php only runs this code after the file has been uploaded to a temp directory, taking up valuable server resources. Is there a way to do this before the upload happens? I have tried similar things with perl/cgi but the same thing happens.
Wow, already 5 answers telling how it can't be done. mod_perl to the rescue; with it you can reject a request before the whole request body is uploaded.
Apache is taking care of the upload before the PHP script is even invoked so you won't be able to get at it.
You can either split up the process into two pages (authentication, file upload page) or, if you need to do it all in one page, use an AJAX-esque solution to upload the file after authentication parameters are checked.
As far as I know, you cannot do that in PHP. PHP script is launched in response to a request, but a request is not "sent" until the file is uploaded, since the file being uploaded is a part of the request.
This is definitely not possible inside the PHP script you're uploading to.
The most simple possibility is indeed to provide authentication one step before the upload takes place.
If that is not an option, one slightly outlandish possibility comes to mind - using a RewriteMap and mapping it to an external program (it should be possible to make that program a PHP script).
Using RewriteMap it is possible to rewrite a URL based on the output of a command-line program. If you use this directive to call a (separate) PHP script - you won't be able to use the user's session, though! - you would have access to the GET parameters before the request is processed.
If the processing fails (= the credentials are invalid), you could redirect the request to a static resource, which would at least prevent PHP from starting up. (I assume the upload will be hogging some resources anyway, but probably fewer than if it were redirected to PHP.)
No guarantees whether this'll work! I have no own experience with RewriteMap.
This is due to the fact that each HTTP request is a single entity that contains all of the form/POST data, including the file upload data.
As such, I don't believe it's possible to handle a file upload request in this fashion irrespective of which scripting language you use.
I don't think you can do this. The best you can do is probably to run an AJAX function onSubmit to do your validation first, then if it returns valid then execute the POST to upload the file. You could set a $_SESSION in your AJAX script if the authentication is valid, then check for that session var in the upload script to allow the upload.
Recently I created a script that accesses another script to run it. The only way the second script will run is if it is accessed from the same server. To accomplish this I've added a simple check: if ($_SERVER['REMOTE_ADDR'] == "XXX.XXX.XXX.XXX") {
(Obviously the XXX.XXX.XXX.XXX is replaced with my server's IP.)
However, I'd like the script to be more portable, so I want it to somehow detect the IP of the same server or something.
Suggestions? Or is this even possible?
A better approach would be to store the second script outside of webroot.
To answer your question $_SERVER['SERVER_ADDR'] will return the IP Address of the server where the current script is executing, but yeah there's better ways to do this, such as making the script unable to be accessed from the web in the first place.
http://php.net/manual/en/reserved.variables.server.php
You could always do something like...
if($_SERVER['REMOTE_ADDR'] == $_SERVER['SERVER_ADDR'] ){
// Do my stuff
} else {
header("Location: http://elsewhere.com");
}
This solution is tied to Apache, but you could copy the idea of stopping hot-linkers linking to your images, just change the parameters to fit your named scripts, or put those local-access only scripts in a single folder - might be a bit more manageable.
http://altlab.com/htaccess_tutorial.html
I have a very similar setup to the person here:
PHP Background Processes
i.e a very long script that takes up to 10 minutes. However, I need the person who calls the script redirected back to the homepage while the script works. In other words, I need the user experience to be something like this:
click the update button
script begins to execute, can take up to 10 minutes
user is redirected back to the home page immediately
Is this possible using only PHP? I gather I will need
ignore_user_abort(true);
set_time_limit(0);
But how do I redirect the user? I can't use javascript because output only gets written to the page at long increments, and I want the redirection to be immediate. Can I use headers? Or will that mess with things?
Alternatively, I could use the cron job approach, but I have zero experience in making a cron job or having it run php code (is that even possible?)
Thanks,
Mala
Update:
Using headers to redirect does not work - the page will not load until the script is done. However, eventually the webserver times out and says "Zero-Sized Reply: The requested URL could not be retrieved" (although the script continues running). I guess my only option is to go with the cron job idea. Ick!
The most obvious solution to me would be splitting the redirect and the background calculation in two separate files and let the redirect script execute the 10-minute script:
job.php:
<?php
// do the nasty calculation here
redirect.php:
<?php
// start the script and redirect output of the script to nirvana, so that it
// runs in the background
exec ('php /path/to/your/script/job.php >> /dev/null 2>&1 &');
// now the redirect
header('Location: /index.php');
Assumptions for this to work: you should be on a Linux host with either safe_mode disabled or safe_mode_exec_dir set appropriately. When running under Windows, the exec string needs to be adapted, while the notes about safe_mode remain true.
Notice: When you need to pass arguments to the script, use escapeshellarg() before passing it on, see also the PHP manual on exec
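For instance, passing a user-supplied value through to job.php might look like the following sketch; the paths and the parameter name are placeholders:

```php
<?php
// Escape each untrusted argument before interpolating it into the shell command,
// otherwise a crafted value could inject extra shell commands.
$filename = isset($_POST['filename']) ? $_POST['filename'] : 'default.csv'; // placeholder input
$cmd = 'php /path/to/your/script/job.php ' . escapeshellarg($filename)
     . ' > /dev/null 2>&1 &';
exec($cmd);

header('Location: /index.php');
exit;
```

Inside job.php the value then arrives as $argv[1], as described in an earlier answer.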
I've tried several methods and none seems to work, I've even tried to use register_shutdown_function() but that also failed. I guess you're stuck with making a cron job.
I just remembered something (but I haven't tested it), you can try to do something like this:
set_time_limit(0);
ignore_user_abort(true);
ob_start(); // not sure if this is needed
// meta refresh or javascript redirect
ob_flush(); // not sure if this is needed
flush();
// code to process here
exit();
Not sure if it'll work but you can try it out.
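One way to complete that sketch is to tell the client the response is finished before the long work starts, by sending an explicit Content-Length and closing the connection. This is an assumption-laden approach: it depends on the server not buffering or compressing output (under mod_php; under PHP-FPM, fastcgi_finish_request() is the cleaner tool). The homepage path is a placeholder:

```php
<?php
ignore_user_abort(true); // keep running after the client disconnects
set_time_limit(0);       // no execution time limit

ob_start();
header('Location: /index.php');              // redirect the browser immediately
header('Connection: close');                 // ask the client to stop waiting
header('Content-Length: ' . ob_get_length()); // declare the (empty) body complete
ob_end_flush();
flush();                                     // push the response out now

// long-running work continues here for up to 10 minutes
```

If output_buffering or zlib.output_compression is enabled in php.ini, or a proxy buffers responses, the browser may still wait, which matches the update in the question.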
I have a similar situation with processing logins.
To keep it short...
I get a PDT and an IPN, and each sends me a logging email.
An email is sent to the client on IPN VERIFIED to give the serial number and password to the client.
For both PDT and IPN I use goto to send me a logging email instead of a bunch of sequential ifs.
On reading many answers I studied each to figure out what would suit my issue.
I finally used...
<?php
ignore_user_abort(TRUE); // at very top
As I worked through the progressive checks (no nested ifs), if one failed I used, for example...
$mcalmsg .= "Check [serialnbr]\r\n";
if (empty($_POST['serialnbr'])) {
    header('Location: '.$returnurl.'?error=1');
    $mcalmsg .= "Missing [serialnbr]\r\n";
    goto mcal_email; // Last process at end of script
} else {
    $serialnbr = strtoupper(htmlspecialchars(trim($_POST['serialnbr'])));
    $mcalmsg .= "[serialnbr]=$serialnbr\r\n";
}
This (so far) is working just as needed.
Of course there is more in the script but follows the same concept.
Where this says Location, there are also 3 information pages that can each be displayed using the same concept.
mcal_email: //last process before ending, always gets here after all else from goto or clearing all checks.
// compose email and send
?> // end of script
Why not try the header approach and see what happens? A call to PHP's header() function may do the trick. I would experiment to see what solves your problem.