Read file with no read access in PHP

In PHP, I need to read a file that has no world-read access (the file permissions are -rw-r-----).
Changing the file's permissions isn't possible. The file sits on a local server.
Various methods I've tried in PHP don't work (file_get_contents, fopen, and curl), and maybe that's to be expected if that last read bit isn't set. Is it failing because the web server is being denied access?
If that's the case, why can Firefox read the file directly (using file://), as can curl from a shell? I'm about to write an external Python script to read the file... what am I missing here?

It depends on what user owns the file, and what user PHP/Apache is running as. You could check by running whoami from PHP. Firefox and command-line curl run as your login user, which evidently has read access via the owner or group bits; Apache runs as a different user that doesn't. If you can't change any part of the permissions/owner on the file, nor the Apache user, then, well, you're stuffed, sorry.
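For reference, a quick check from PHP itself (a minimal sketch; the path is a placeholder, and it assumes shell_exec() isn't disabled):
<?php
// Show which user PHP runs as, plus the file's owner and permission bits.
$file = '/path/to/the/file';                        // placeholder path
echo 'PHP runs as: ' . trim(shell_exec('whoami')) . "\n";
echo 'File owner UID: ' . fileowner($file) . "\n";
echo 'Mode: ' . substr(sprintf('%o', fileperms($file)), -4) . "\n";
echo 'Readable by PHP: ' . (is_readable($file) ? 'yes' : 'no') . "\n";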

Related

How can I find the real location of the tmp dir for a specific program on Ubuntu?

Ubuntu gives some services a private /tmp directory (systemd's PrivateTmp), so when Apache asks for /tmp it actually gets /tmp/systemd-private-654e145185f84f6ba097649873c88a9c-apache2.service-uUyzNh/tmp. (That code is different each time.) This is generally a good thing, but it's annoying me now.
I want to create a bunch of PDF files (using TCPDF in PHP) in /tmp, then use shell_exec() in PHP to run the pdfunite script to create a single output PDF, then load that PDF into memory to serve it to the browser. My problem is that the pdfunite script doesn’t work, which I presume is because it’s not seeing the same path as the files are actually in.
Does PHP’s shell_exec() run code as the Apache user (www-data)? If so, I would assume that it would see the same /tmp dir, but in that case the pdfunite script should work. Is there a sensible workaround here, other than using a different directory and doing the cleanup myself?
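For what it's worth, a minimal sketch of the described pipeline, assuming pdfunite is installed and using sys_get_temp_dir() so PHP and the shell it spawns agree on the path (the file names are hypothetical):
<?php
// Merge two PDFs that were written into the tmp dir PHP actually sees.
$tmp = sys_get_temp_dir();                          // resolves inside the private /tmp
$in1 = $tmp . '/part1.pdf';                         // hypothetical TCPDF outputs
$in2 = $tmp . '/part2.pdf';
$out = $tmp . '/merged.pdf';
$cmd = sprintf('pdfunite %s %s %s 2>&1',
    escapeshellarg($in1), escapeshellarg($in2), escapeshellarg($out));
$output = shell_exec($cmd);                         // runs as the same user as PHP
if (!is_file($out)) {
    error_log("pdfunite failed: $output");
}
Capturing stderr with 2>&1 usually reveals whether the failure is a path problem or a missing binary.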

PHP can't download files from another server locally when accessed with browser

I'm trying to download a file locally from a remote server, but every time I access the PHP file through the browser, file_get_contents() fails because it doesn't have permission to write to /var/www/html (Apache2). I tried using cURL, but that didn't work either; I checked that allow_url_fopen is on (it is) and added the PHP file to sudoers. I can't seem to find any solution online.
I think this is a permission issue. Try the approach below and adapt it to whatever suits your environment.
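As a sketch of the usual fix, write into a directory the web-server user actually owns rather than /var/www/html itself (the directory and URL are hypothetical; on Debian/Ubuntu the Apache user is typically www-data):
<?php
// Download a remote file into a directory writable by the web-server user.
$url  = 'https://example.com/file.zip';             // hypothetical source
$dir  = '/var/www/html/downloads';                  // e.g. chown www-data: + chmod 755
$dest = $dir . '/file.zip';
if (!is_writable($dir)) {
    die("PHP cannot write to $dir - fix ownership/permissions first");
}
$data = file_get_contents($url);
if ($data === false || file_put_contents($dest, $data) === false) {
    die('Download or write failed');
}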

Protect file in web root but give access from php

I have a situation where I want to protect a file from public access, but enable read and write from PHP. The file contains sensitive information like passwords.
The problem is that:
I cannot put the file outside the web root (a server security restriction prevents PHP from accessing files outside it)
I would like to avoid a MySQL database.
I would also like to avoid .htaccess files.
So if I make a folder, say private, in the web root, and do
chmod 700 private
Then, if the file to protect is private/data, I do
chmod 700 private/data
Will this be a safe setup? I can then read and write the file from PHP, but it is not accessible to the public?
PHP runs as the same user as the webserver, so if PHP can read it, so can your webserver (and vice versa).
If you don't want to use .htaccess, there is another trick: save the file as a .php file. Even if someone accesses the file from the web, they can't see the source; they'll just get a white page, or maybe an error, depending on what exactly is in the file.
If you're running suPHP or FastCGI PHP, you can use a setup similar to what you've described to limit access to files. Otherwise, PHP will run as the same user as the web server, and any file PHP can access is also accessible via URL.
If you want to keep the restrictions you stipulated (which are rather strange), and assuming you don't have access to Apache config directives, consider adding the PHP user to some group and giving only that group rights to the file, i.e. Apache cannot read it (unless it's running as root/wheel).
Or make it a valid .php file (so PHP will be the only invoker when the file is requested) that returns nothing or redirects when invoked over the web. Or just encrypt it.
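A minimal sketch of that .php-wrapper trick (the file name and keys are hypothetical): requesting the file over the web executes it and prints nothing, while your own scripts read the values with include.
<?php
// private/config.php - a direct browser request executes this and shows nothing;
// the return value is only visible to code that include()s the file.
return [
    'db_user' => 'app',                             // hypothetical credentials
    'db_pass' => 'secret',
];
Elsewhere in your code: $config = include __DIR__ . '/private/config.php';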

Apache server file permissions and url accessibility

Is it possible to arrange file permissions/group ownership/etc in such a way that a file can be read by the function readFile() for a forced download, but it cannot be downloaded by navigating to the literal url of the file?
Maybe you could add the user that is running Apache/PHP to the group that owns the file, and set permissions to read and write for the owner and the owner group, with no permissions at all for others (-rw-rw---- or 0660).
Never tested it, but it should work.
The Apache user will need read permission. To prevent the file from being navigated to, the best (and easiest) solution is to store it outside of the web folder.
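A minimal sketch of that combination, assuming the file can live outside the web root (the path and name are hypothetical):
<?php
// download.php - streams a file that has no public URL of its own.
$path = '/var/private/report.pdf';                  // outside the web root
if (!is_readable($path)) {
    http_response_code(404);
    exit;
}
header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="report.pdf"');
header('Content-Length: ' . filesize($path));
readfile($path);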

Set php file permissions so only my server's curl can run

I have a PHP script that I don't want anyone to be able to run through their browser.
It sends emails and is called via curl by my server's cron, yet it needs to be in the public www directory. What permissions or owner do I need to set on the file to allow only my server's curl to execute (or do I mean read?) the file?
I'm on CentOS.
Thanks!
You could either limit access to the file by placing a .htaccess file with appropriate access limitations in the directory, or implement a basic password check at the beginning of your PHP file, like this:
<?php
$password = $_GET['password'];
$hash = '40bd001563085fc35165329ea1ff5c5ecbdbbeef'; // precalculated sha1 hash of your password
if (sha1($password) != $hash) {
    die('Forget it!');
}
For added security this could be further refined, but you get the idea ...
If you can, I would recommend doing it a different way: Running the script through the CLI (calling php -f in the cron job) and having the PHP script check how it is run. You can find out whether the script is called from the Command Line Interface using php_sapi_name(), and terminate when it's being called from the web. That would be the most secure solution as far as I can see.
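A minimal sketch of that guard, placed at the top of the script:
<?php
// Refuse to run unless started from the command line (e.g. php -f script.php in cron).
if (php_sapi_name() !== 'cli') {
    http_response_code(403);
    die('Command-line only.');
}
// ... send the emails here ...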
If you really need to call it through curl, use Josh's solution or define a passkey that needs to be passed to the script as a GET parameter:
curl domain.com/script.php?password=123456
Not terribly secure, as the passkey will be visible in the crontab, but it should provide decent protection against outside access, especially if you combine it with checking $_SERVER["REMOTE_ADDR"] and making sure it is 127.0.0.1.
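Sketched out, using hash_equals() to avoid timing leaks (the key is a placeholder):
<?php
// Accept the request only from localhost and with the correct passkey.
$key = '123456';                                    // placeholder passkey
if ($_SERVER['REMOTE_ADDR'] !== '127.0.0.1'
    || !hash_equals($key, $_GET['password'] ?? '')) {
    die('Forget it!');
}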
Why not just use php-cli to run it from the command line instead of through curl?
If you really have to host it, you can restrict access by IP. Note that <Directory> blocks are only valid in the main server config; in a .htaccess file you use the directives directly:
Order allow,deny
Allow from 192.168.1.0/24
Allow from 127
This is impossible, as phrased. From what I can tell, I think there are a number of options that you could use to address your problem:
You can chown root or another user and then chmod 700, and then call the script from a cronjob using PHP's command line functionality from the owner's crontab file.
If you need to access the file over curl, then you're hitting the web server, and the web server needs to be able to execute/read the script, which would allow anyone to execute the script.
Another option would be to use rule based access control, as described here: http://library.linode.com/web-servers/apache/access-control/rule-based-access to make sure that only connections originating from your server will be able to access the file in question, but this is itself not entirely ideal.
There are other solutions of course, but I hope this is helpful.
