We're building an app in PHP that uses some third-party services which require a secret API key.
We have a PHP file that contains all of those key definitions, and we include it (using require_once) wherever it's needed.
Is this approach safe? Should we store the keys in a different place?
Thank you.
Something similar was asked today for a shell script. The answer is valid here as well: Make sure you store the file outside the web root, or (if that's not possible) protect it using a .htaccess file.
I also like to unset() any variables containing sensitive data after use, so not even a full variable dump (e.g. in a debug message) later in that script could reveal it.
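A minimal sketch of that pattern (the path, variable and class names below are made up for illustration):

<?php
// /var/www/secrets/keys.php lives outside the document root and contains, e.g.:
//   $paymentApiKey = 'sk_live_...';
require_once '/var/www/secrets/keys.php';

$client = new SomePaymentClient($paymentApiKey); // hypothetical third-party client

// Drop the secret from scope so a later var_dump()/debug dump can't reveal it
unset($paymentApiKey);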
It should be relatively safe as long as the file is not accessible from the web. A lot of sites place sensitive files outside of the webroot on the server and simply include them into the app when needed.
I always set the permissions of certificates and other files containing sensitive data such that only my development team and the Apache service can access them. This is important if you are using a server shared by a large organization, like a university, where lots of people might have permissions on the file by default. Often I've seen read permission given to everyone just so the web server can access the file (since the web server is neither the owner nor in the file's group, the only option left is to give read to "other").
Instead, I ensure there is a group containing only my development team and set the file's read/write permissions for that group. I then use an ACL to add read permission for the Apache service. You have to use an ACL because the owner and group are normally a developer and the development team's group, leaving no way to grant access to Apache other than an ACL.
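On a typical Linux box that setup might look roughly like this (group name, account names and path are illustrative; the web server account is often apache on Red Hat-style systems and www-data on Debian/Ubuntu):

groupadd devteam
chown developer:devteam /var/www/secrets/keys.php
chmod 640 /var/www/secrets/keys.php
setfacl -m u:apache:r /var/www/secrets/keys.php

The chmod 640 keeps "other" out entirely, and the ACL grants the web server read access without widening the owner or group permissions.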
Security by obfuscation:
Create your own namespace with functions to encode and decode the keys.
Add an auto_prepend_file entry in php.ini pointing to /usr/share/nginx/outsidehtml/keystorage/83738489384828838227.php.
Run nginx or Apache in a chroot and/or under SELinux.
Deny listing of the keystorage folder by granting the web server only traverse (x) permission: setfacl -m user:nginx:x /usr/share/nginx/outsidehtml/keystorage/
In php.ini, add disable_classes = ReflectionFunction and disable_functions = opcache_get_status,phpinfo,show_source,ini_get.
For better hardening you can store the key in php.ini as a value, e.g. 123key = secret64.
TEST (should print an empty list if the listing permission is really gone):
print_r(glob('/usr/share/nginx/outsidehtml/keystorage/*.*'));
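If you do store a key in php.ini as suggested above, a custom entry can be read back with get_cfg_var() (a sketch, assuming the entry name used in the answer):

// php.ini contains the custom entry: 123key = secret64
$secret = get_cfg_var('123key');
var_dump($secret); // "secret64" if the entry exists, false otherwise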
Related
I'm confused about something and need some explanations.
In my practice I normally see that 90% of PHP developers store all sensitive information, like database connections, FTP and SMTP setup etc., inside some variable, array, object or constant.
I wonder now: is it better to use some ini file outside the root and store it there? Or is it better to hide the .ini file somewhere and deny access to it via .htaccess?
Generally, I want to save that sensitive data in the most secure way.
There is no perfectly safe choice, but some are better than others.
Don't save sensitive information in your project's source code -- you don't want your passwords and API keys on github.
Saving sensitive information in a database is fine, but then you still need somewhere to store the database credentials, and you're right back where you started.
You can save sensitive information in environment variables. These would usually be set up in your web server's configuration file(s); a short sketch of this and the ini-file option follows after the lists below.
Saving sensitive information in an ini file is fine, provided the following:
The file has the minimal permissions required.
The file is completely outside the web server's document root and thus can't ever be served directly. Don't put the file in your main directory and then use .htaccess to deny access to it.
The file is not committed to source control. If you're using git, edit your .gitignore so that the file is ignored.
These should also go without saying:
The user account running the web server process should never have write permission to the files it's serving.
Other non-privileged users on the machine running the web server process should not have read access to the files it's serving.
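A rough sketch of the two non-source-code options above (the directive placement, variable name and file path are illustrative; with PHP-FPM the environment variable would be set in the pool config rather than the vhost):

// Environment variable, set in the Apache vhost with: SetEnv DB_PASSWORD "s3cret"
$dbPassword = getenv('DB_PASSWORD');

// Ini file outside the document root, e.g. /etc/myapp/secrets.ini containing:
//   [database]
//   password = "s3cret"
$config = parse_ini_file('/etc/myapp/secrets.ini', true);
$dbPassword = $config['database']['password'];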
For me, I would suggest storing it in a dot file such as .env (like Laravel does), or in environment variables, or in an INI file (as you said above), as long as it is hidden from the world, so that if someone hacks your server they won't be able to see or access it easily.
I followed this tutorial and it describes how to connect to a database using an Android app.
I need to create a folder structure similar to the one below:
My question is: where on a real server should the proposed files be placed?
I have a path of Home Directory/My Domain/....(folders)..., so do I place that structure inside or outside the My Domain folder? And if outside, how am I going to access the files if I use the following?
require_once 'OUTSIDEFOLDER/include/Config.php';
Shouldn't I be blocked by permissions?
My question is: where on a real server should the proposed files be placed?
You must place your file.php in the www folder.
For example, if you have a WAMP server installed, you must place all your files (e.g. file.php, etc) in a path like c:\{wampPathInstalled}\www\mywebsite, where {wampPathInstalled} is the path where WAMP is installed on the C drive (or if it is another drive, use that drive letter instead).
You can access the scripts by running http://localhost/mywebsite/myfile.php in a browser.
Tell me what your real server is (IIS or something else) and I can indicate where the best place is in your case.
If you use Plesk, your answer should be in this link.
Well, ideally for production websites I would create a directory for every site hosted and place it at a location that requires sudo privileges for modifying, reading or deleting files. The directory could therefore live inside /var/www, or you could create one inside /var/data. I would always place all the configuration information for a site in a separate config file, and include that file only in the scripts that need the information. Furthermore, implement your application using an MVC approach where everything is served through one common router, and give directories only the permissions they actually need, avoiding unnecessary read, write and execute permissions everywhere in your app. Hope this helps.
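To make the earlier question concrete: a config file outside the document root is reached with a filesystem path, not a URL, so the web server's URL restrictions don't apply; only the filesystem permissions of the PHP process matter. A sketch with made-up paths:

// Document root: /var/www/example.com/public_html (hypothetical layout)
// Config stored one level above it:
require_once '/var/www/example.com/include/Config.php';

// or, relative to the current script:
require_once __DIR__ . '/../include/Config.php';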
I have not been able to find solid information on preferred (best-practice) and/or secure methods to allow PHP to access config or other types of files on a Linux server that are not contained in the public web directory or owned by the Apache user, so I'm hoping to find some answers here.
I am a fairly competent PHP programmer but am increasingly tasked with writing web applications (most of which are not publicly accessible via the web however) that require updating, changing or adding to config files or files generated by some service or application on the server.
For instance, I need to create a web interface that will view, add or remove entries from a /etc/mail/spamassassin/white-list.cf file owned by root.
Another scenario is that I need PHP to parse MIME messages in /var/vmail that are owned by user vmail.
These are just a couple examples, there will be other files in locations owned by other processes/users. How can I write PHP applications that securely access and manipulate these files without opening security risks?
If I needed to implement something like this, I would probably look at using something like sudo to fine-tune permissions. I'm not a Linux CLI expert, so I'm sure there are issues that I haven't taken into account when typing this out.
I would probably determine what tasks need to be done, and would write a separate script for each task that needs to be completed. Using sudo, I'd assign the necessary level of permissions for that script only.
Obviously, as the number of tasks increase, so would the complexity and the amount of work involved. I'm not sure how this would affect you at the moment.
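One way that could be wired up (the sudoers line, script path and option are assumptions, not a recipe): a narrowly scoped sudoers entry lets the web server account run exactly one script, and PHP only ever invokes that script.

// /etc/sudoers.d/whitelist (hypothetical), allowing only one specific command:
//   www-data ALL=(root) NOPASSWD: /usr/local/bin/update-whitelist.sh
$email  = escapeshellarg($validatedEmail);            // never pass raw user input to the shell
$output = shell_exec('sudo /usr/local/bin/update-whitelist.sh --add ' . $email);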
I have found in my server, that some files are stored inside the webroot, for example
example.com/files/longnamewithcode_1234293182391823212313231230812.pdf
They are stored for a web app and have sensitive info in them.
If you access example.com/files you get an empty index.html, so you can't directly scan the directory. Anyway, I'm concerned about this: I feel that it is not safe and I would like to know what kind of attacks could be made to access the files. I understand that some brute-force attack would be possible, but with the long coded names I guess it's less of a problem.
Finally, I would say that the correct way is to store the files outside the web folder and return them with PHP, but I'm not sure I'll be able to get access to the code to change this.
If the files have to stay accessible from the webroot through the webserver, you can't really do much more than use a sufficient amount of entropy in the file names, and even that doesn't account for users simply sharing the links once they get hold of them.
If you want to implement the permission checking inside PHP, take a look at the various X-Sendfile implementations on popular webservers: mod_xsendfile (Apache), X-Accel-Redirect (nginx) or X-LIGHTTPD-send-file (lighttpd). This lets the webserver serve the file about as efficiently as serving it straight from the webroot, after you have validated the accessing user.
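With mod_xsendfile enabled, the PHP side can be as small as this once the permission check has passed (file path and names are placeholders; on nginx you would send X-Accel-Redirect against an internal location instead):

// the user's right to this file has already been verified above
header('Content-Type: application/pdf');
header('Content-Disposition: attachment; filename="report.pdf"');
header('X-Sendfile: /var/files/longnamewithcode_1234.pdf'); // mod_xsendfile takes over delivery
exit;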
Have you considered an .htaccess file to restrict who is allowed to access those sensitive files? You tagged it, but I'm not sure why you are not using it. =)
If you wish to block everything in the folder you can use an .htaccess file to block all connections.
Try
deny from all
in the .htaccess file in that directory. This will deny all access to anything in that directory including the index file.
The question is
Are the files supposed to be accessed by users freely?
If yes, don't worry about those files too much (as long as they're not writeable).
If no, i.e. users have to be logged in to download them, move them out of the publicly accessible folder and deeper into the application. Write a PHP script that will manage the permissions for the file, e.g. /download?file_id=1.
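A minimal sketch of such a script (the session check, the lookup helper and the storage location are assumptions):

// download.php?file_id=1
session_start();
if (empty($_SESSION['user_id'])) {
    http_response_code(403);
    exit('Not allowed');
}

$fileId = (int) ($_GET['file_id'] ?? 0);
$path   = lookup_file_path($fileId);   // hypothetical helper that maps the id to a path via your database

if ($path === null || !is_file($path)) {
    http_response_code(404);
    exit;
}

header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="' . basename($path) . '"');
header('Content-Length: ' . filesize($path));
readfile($path);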
I have a web-based application which serves content to authenticated users by interacting with a SOAP server. The SOAP server has files which the users need to be able to download.
What is the best way to serve these files to users? When a user requests a file, my server will make a SOAP call to the SOAP server to pull the file, and then it will serve the file to the user by referencing a link to it.
The question is that these temporary files need to be cleaned up at some point and my first thought was this being a linux based system, store them in /tmp/ and let the system take care of cleanup.
Is it possible to store these files in /tmp and have apache serve them to the user?
If apache cannot access /tmp since it is outside of the web root, potentially I could create a symbolic link to /tmp/filename within the web root? (This would require cleanup of the symbolic links though at some point.)
Suggestions/comments appreciated on best way to manage these temporary files?
I am aware that I could write a script and have it executed as a cron job at regular intervals, but I was wondering if there was a way, similar to the one presented above, to do this without having to handle deleting the files myself.
There's a good chance that Apache can read the tmp directory, but that approach smells bad. My approach would be to have PHP read the file and send it to the user. Basically, you send out the appropriate HTTP headers to indicate what type of content you're sending and what name to use for the file, and then you just spit out the file with echo (for example).
It looks like there's a good discussion of this in another question:
HTTP Headers for File Downloads
An additional benefit of this approach is that it leaves you in full control because there's PHP between a user and the file. This means you can add additional security measures (e.g., time-of-day controls), pull the file from various places to distribute bandwidth usage, and so on.
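A stripped-down version of that idea, assuming the SOAP step has just written the file to a temporary path (names and content type are placeholders):

// $tmpFile was written by the SOAP-fetch step, e.g. '/tmp/report_12345.pdf'
header('Content-Type: application/pdf');
header('Content-Disposition: attachment; filename="report.pdf"');
header('Content-Length: ' . filesize($tmpFile));
readfile($tmpFile);   // or: echo file_get_contents($tmpFile); readfile() avoids holding the whole file in memory
exit;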
[additional material]
Sorry for not directly addressing your question. If you're using PHP to serve the files, they need not reside in the Apache web root, just where Apache/PHP has file-system read access to them. Thus, you can indeed simply store them in /tmp and let the OS clean them up for you. You might want to adjust the frequency of those clean-ups, however, to keep volume at the level you want.
If you want to ensure that access is reliably denied after a period of time or a certain number of downloads, you can store tracking information in your database (e.g., a flag on the user to indicate that they've downloaded the file), and then check it with your download script and possibly deny the download. This effectively separates security of access from frequency of cleanup, two things you may want to adjust independently.
Hope that's more helpful....