Set default permissions for files in folders on a VPS with Ubuntu - php

I know this has probably been covered in many other places, but I have tried many of the things written in other posts and still have no luck.
I am running an Ubuntu VPS with Apache, FTP and PHP.
My goal: every time I add an image to a folder, the image should be publicly accessible. I have tried different chmod commands but still no luck.
I want each image to have the access rights rwxrwxrwx, but by default, when I upload them to the folder through FileZilla, they end up with rwx------.
I hope I have given enough information. Please comment below if not, and I will provide more as quickly as possible. (I am kind of new to this, sorry about that.)

You can change the default permissions in your FTP server's configuration; which FTP server are you using? Alternatively, you can write a script that changes the permissions and run it.
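If you go the script route, a minimal PHP sketch might look like the following (the upload directory path is an assumption; adjust it to your setup and run the script after each upload, e.g. from a cron job):

<?php
// Make every regular file in the upload folder world-readable.
// 0644 is usually enough for Apache to serve the images; 0777 is rarely needed.
$uploadDir = '/var/www/html/images'; // assumption: your image upload folder

foreach (glob($uploadDir . '/*') as $path) {
    if (is_file($path)) {
        chmod($path, 0644);
    }
}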

Related

Security breach: PHP file edited on CodeIgniter site, Ubuntu server

I run a file-sharing site built on CodeIgniter and PHP 7.
We recently found that one of the files in our www/application/controllers dir had been very slightly edited so that, for 1 in every 3 Windows users, a download request pointed them to a malware file instead (a file which had been uploaded to our server legitimately).
The SSH access is locked down with a private key which only I have.
The dir/file permissions were possibly not great at the time, maybe 755 or even 777.
I'm trying to figure out how a file that isn't in a public directory was edited when I'm fairly confident they couldn't have obtained SSH access.
Are there any known vulnerabilities with CodeIgniter that would allow this?

Ubuntu - The right permissions to use file_put_contents

In my PHP application I am using file_put_contents() to create a file for display, consisting of a blob retrieved from a table in my database. However, currently, in order to get the function to work on my Amazon EC2 Ubuntu instance, I had to give the folder it's writing to 777 permissions. I know this is extremely bad and I want to change it, but I don't know what to change it to.
I'm a novice at Linux and I'm currently navigating around my instance with the help of Google. This is part of a university assignment, so I can't just hire a Linux expert (just in case one of you says I shouldn't be using such an instance if I don't know Linux!)
Assuming that you are using this from a browser with normal HTTP requests, you need to find out which user is running the web server. It is probably something like apache, www-user or similar (see the sketch below for one way to check).
Then you can do one of two things:
Change the ownership of the directory where you want to save the file to the web-server user; then it can have permissions 755.
Change the group of the directory to the group of the web server; then the directory needs 775.
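If you are not sure which user PHP and the web server run as, a quick hedged check from PHP itself (assuming the posix extension is available, as it usually is on Ubuntu) is:

<?php
// Print the user account this PHP script executes as.
// Requires the posix extension, which ships with PHP on most Ubuntu installs.
$processUser = posix_getpwuid(posix_geteuid());
echo $processUser['name']; // e.g. "www-data" on a stock Ubuntu/Apache setup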

PHP FTP fwrite() to Windows server not working?

I am attempting to fwrite() via an FTP wrapper from a UNIX-based server to a Windows-based server with PHP. I am connecting successfully, and am able to (for example) create directories. However, I can't seem to write files to the folders! When I look at the directory permissions using an FTP client, I notice that they are all 0000 and cannot be changed. Apparently, Windows-based servers do not use the same FTP permission system.
So... what's up? What do you recommend? I am able to upload files to the directory using an FTP client, but when I attempt to write files with PHP, nothing happens.
I am at a complete loss as to why this may be occurring. I have confirmed my script works by writing to UNIX-based servers, so that is not a problem. Is it possible that the fact the destination server does not have PHP installed matters? I would not think so, but I'm open to any ideas at this point!
Thank you!
EDIT - What's really getting me is that I AM able to create directories, so it doesn't make any sense that writing shouldn't be working. On Windows servers, is there some setting that would prevent just the writing of FILES, but not folders?
EDIT 2 - More research has told me that, though you cannot CHMOD on Windows servers, PHP's CHMOD still somehow does something with the permissions. However, this does not appear to be working for me. Is there a way to change the permissions directly with PHP code, or is this something that has to be done directly on the server, outside of my reach?
I recommend file_put_contents(), and make sure PHP is allowed to write to that directory.
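For writing through PHP's ftp:// stream wrapper specifically, a minimal sketch might look like this (the host, credentials and path are placeholders; the overwrite context option matters if the target file may already exist):

<?php
// Sketch: write a file to a remote FTP server via the ftp:// stream wrapper.
// Host, credentials and path below are placeholders, not real values.
$context = stream_context_create(['ftp' => ['overwrite' => true]]);

$bytes = file_put_contents(
    'ftp://username:password@ftp.example.com/upload/test.txt',
    "Hello from PHP\n",
    0,
    $context
);

if ($bytes === false) {
    echo "Write failed - check credentials, path and server-side permissions\n";
}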

How to upload a site to a server where folders are writable for php

This problem has occurred for me multiple times now, and it's time for me to do it the right way!
How can I upload a website to the server so that PHP has write access to the folders?
Usually I use an FTP program, but I can't upload as root, so there are restriction problems all over the place...
How do you do stuff like this?
Thanks!
EDIT
I'm sorry, I accidentally added rails to the tags instead of php.
I probably need to clarify my problem, since the answers didn't really help me out here:
I already have a server running apache, DirectAdmin and some other stuff like rails.
The problem is that when I upload a website like Joomla or WordPress via FTP, the permissions always need to be set to 777/775 or these sites can't write to the folders.
So what I need to know is:
How can I upload these sites (via FTP/SSH) as a user that matches the one PHP runs as, so that PHP can create files in all the folders it needs to write to?
Hope I'm being more clear now, thanks for the help so far!
Use a server with SSH access and full write access to wherever your Rails app is hosted (usually SSH access is as the user that Rails runs as).
For me this usually means a VPS-type server; I like Rackspace Cloud, which works out to around $11 - $15 per month for a low-traffic, low-spec server. I've also heard good things about Linode.
The solution
Upload your site with FTP
SSH to the server and go to the public_html folder
chown -R [user_name]:[group_name] [folder_name]
For me the right user was apache.
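After changing the ownership, a quick hedged way to confirm from PHP that a folder is now writable (the path is a placeholder for one of the folders your site writes to) is:

<?php
// Check that PHP can now write where the site needs to.
// The path is a placeholder; point it at one of the folders the site writes to.
$dir = '/home/youruser/public_html/wp-content/uploads';

var_dump(is_writable($dir)); // should print bool(true) after the chown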

php creating files that cannot be deleted

When I download a file with cURL through PHP, I cannot seem to delete it afterwards through FTP. I can delete it through the PHP script, but that's not exactly ideal. If the file isn't downloaded via cURL, but still via PHP, I can delete it; it's just the ones downloaded via cURL that I cannot delete. When I try to run chown() on the file through PHP, it gives me a permissions error. I've tested the same PHP script on multiple other servers and it works fine there; it's just this particular one it doesn't work on. Maybe it has something to do with the PHP configuration and permissions, but I'm not 100% sure about that.
Sounds like it is saved with the file owner being the user account of the web server. A non-privileged account can't chown to a different user, either, so that explains why chown fails... Try having PHP execute chmod 777 on the file before you delete it.
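A minimal sketch of that suggestion, assuming your script knows the path of the downloaded file (the path below is a placeholder):

<?php
// Loosen the permissions on the file created by the cURL download so that
// another user (e.g. the FTP account) can delete it later.
// The path is a placeholder for wherever your script saves downloads.
$file = '/var/www/html/downloads/example.zip';

if (is_file($file)) {
    // PHP (running as the web-server user) owns the file, so chmod succeeds
    // even though chown does not.
    chmod($file, 0777);
}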
When you create a file it is usually owned by the Apache user (or whatever app server you use). The FTP user, however, is usually not the same user. You can fix this by adding the FTP user to the Apache group (or the other way around). Sometimes they already share a group (as on many Plesk environments), so making files readable and writable for that shared group may solve the issue.
