Joomla 3.2 installation file/directory ownership issues - php

I am trying to install Joomla 3.2 on the hosting provided by my university. All I have is one MySQL database (with its username and password) and an FTP username and password to transfer data; there is no cPanel and no SSH access.
If I extract Joomla_3.2.0-Stable-Full_Package.zip and FTP the contents to the website, the owner of these files/folders is the "ftp_username" user. I can only continue to install Joomla if I set up the FTP layer.
This works, except that from time to time I get messages like "JFTP::rename: Bad response. Rename failed". I get this message in particular when using Kunena.
Moreover, I have read online that the FTP layer shouldn't normally be used. Also, the tmp/cache/logs directories cause permission problems.
If I remove the FTP layer (by editing configuration.php), then I cannot modify anything, since Joomla cannot write to the files (owned by ftp_username). Of course, I could change all permissions to 777, but that would be suicide...
I found a post explaining a bit of the situation I'm in. In particular, the advice about using "chmod 4770" seems feasible, but I don't know how secure it is (I haven't tried it).
Does anyone have an idea how I can make this work?
Thanks in advance

I found a way to bypass this problem. Not worth it if you have an alternative provider!
1. I installed Joomla with the FTP layer.
2. I installed the eXtplorer plugin.
3. Used the plugin to upload Joomla_3.2.0-Stable-Full_Package.zip to the server. Now the file is owned by the Apache user.
4. Deleted all the Joomla installation files on the server except the zip file.
5. Uploaded a PHP script that unzips the file using ZipArchive (http://php.net/manual/en/ziparchive.open.php); a rough sketch of such a script is included after this answer. The installation folders/files are now owned by the Apache user.
6. Reinstalled Joomla without the FTP layer.
I guess I could have used some php uploader script instead of steps 1-4, but I already had Joomla installed with the ftp layer.
That's it. Up and working. If you have an alternative provider, don't bother.
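For step 5, the extraction script is only a few lines. This is a minimal sketch, assuming the zip sits in the same directory as the script; upload it, open it once in the browser, then delete it.

<?php
// One-off extraction script for step 5 above. The zip file name is the
// one from the question; adjust the path to your own docroot if needed.
$zip = new ZipArchive();
if ($zip->open(__DIR__ . '/Joomla_3.2.0-Stable-Full_Package.zip') === true) {
    $zip->extractTo(__DIR__);   // extracted files end up owned by the Apache/PHP user
    $zip->close();
    echo 'Extraction finished';
} else {
    echo 'Could not open the archive';
}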

It seems to me that the hosting option provided by the university may be too restrictive to work for pretty much any CMS. You need to be able to chmod and potentially chown all your files in bulk so as to avoid insanity.
There may be an FTP program out there that will chmod your files as you upload them. If your group is www-data (or whatever group Apache needs), then you can chmod your folders to 775 and your files to 664, and you should be good to go.
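PHP's own FTP extension can also do the chmod after the upload. Below is a rough sketch that walks an already-uploaded tree and sets 775 on directories and 664 on files. The host, credentials and start path are placeholders, and the "is it a directory?" test (trying to chdir into it) is a common FTP workaround rather than anything official.

<?php
// Recursively chmod an already-uploaded tree over FTP:
// 0775 for directories, 0664 for files. Placeholders throughout.
$conn = ftp_connect('ftp.example.edu');
ftp_login($conn, 'ftp_username', 'ftp_password');
ftp_pasv($conn, true);

function ftp_chmod_tree($conn, $dir)
{
    foreach ((array) ftp_nlist($conn, $dir) as $item) {
        $name = basename($item);
        if ($name === '.' || $name === '..') {
            continue;
        }
        $path = rtrim($dir, '/') . '/' . $name;
        if (@ftp_chdir($conn, $path)) {        // chdir succeeds => directory
            ftp_chmod($conn, 0775, $path);
            ftp_chmod_tree($conn, $path);
        } else {
            ftp_chmod($conn, 0664, $path);     // plain file
        }
    }
}

ftp_chmod_tree($conn, '/public_html');
ftp_close($conn);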

Related

Issues with File System and Settings file in Drupal clone from Github using MAMP

I am brand new to Drupal and have looked around for issues related to mine. I have cloned a repo of a Drupal build so that I can collaborate with my team and also install it locally using MAMP.
I was able to clone it successfully into my htdocs folder, but when I went through the Drupal installation process I got the following errors (quoted below, since I don't have enough reputation to post an image yet).
File system: The directory sites/default/files is not writable. An automated attempt to create this directory failed, possibly due to a permissions problem. To proceed with the installation, either create the directory and modify its permissions manually or ensure that the installer has the permissions to create it automatically. For more information, see INSTALL.txt or the online handbook.
and
Settings file: The settings file is not writable. The Drupal installer requires write permissions to ./sites/default/settings.php during the installation process. If you are unsure how to grant file permissions, consult the online handbook.
Naturally I don't know what this means, and I was wondering what kind of problems I could expect if I just ignore it, or better yet how to resolve it and not run into unexpected errors later.
I would really appreciate any help on what this means or how it affects the install.
Finally, Drupal didn't ask whether an SQL database was installed, and I'm not sure whether one would have come with the clone (I suspect not). Any advice on whether I should just get the SQL file from the admin, or on how to tell whether it is already installed, would be helpful.
Thank you.
A clean Drupal installation process includes the creation of the required tables (you provide the DB username, password, host and DB name). Drupal also needs write permission on the sites/default/files directory (it stores user-uploaded content and cache-related files there).
You can read more about securing file permissions here: https://drupal.org/node/244924
For a local install, what I do is give 777 permissions to the sites/default directory. Later I change the permissions of the settings.php file to 555 to get rid of Drupal's warning messages.
For *nix machines, you can do chmod -R 777 sites/default
NOTE: this is just a quick fix and is okay for a local setup. Please set file permissions more carefully in a production/server setup.
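If you want to confirm from PHP which of the two checks from the error messages is failing on your MAMP install, a quick throwaway script along these lines (dropped into the Drupal root) should tell you:

<?php
// Throwaway check for the two things the Drupal installer complains about.
$dir  = __DIR__ . '/sites/default/files';
$file = __DIR__ . '/sites/default/settings.php';

echo $dir,  is_dir($dir)       ? " exists\n"      : " is missing\n";
echo $dir,  is_writable($dir)  ? " is writable\n" : " is NOT writable\n";
echo $file, file_exists($file) ? " exists\n"      : " is missing\n";
echo $file, is_writable($file) ? " is writable\n" : " is NOT writable\n";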

Best way to enable writing on a Linux server with PHP

I have a CMS which I made with PHP, and I want to add some self-updating features to it.
A self-updating system can be a complex and tedious thing, I know that, but that's a whole different story. The bottom line is that I end up creating new PHP files, editing existing ones with new code, and probably deleting some PHP files when an update happens.
I know how to read, write and delete files in PHP, but this almost always gives me problems on a Linux server when I don't first chmod the folder to 0777 with an FTP client.
I also know PHP has chmod and chown functions, but most of the time these functions don't work for me either, mainly because all the PHP files are uploaded through FTP, which gives the write rights on files and folders to the FTP group (or something like that).
I also have a WordPress blog installed on a Linux server, and for some reason that WordPress site has no trouble updating itself. I assume it does the same things I described: creating, updating and deleting certain files. But none of the WordPress sites I have ever worked with complained that they couldn't update themselves because of folder/file permissions.
So I'm guessing WordPress uses a smart way to change folder permissions when needed. I just couldn't figure out how.
So basically my question is: what is the best way to give PHP permission to chmod files and folders on a Linux system? How does a system like WordPress handle this?
You can set the folder to 0777, which would give PHP the ability to create files inside it. You can also chown the folder to www-data or whatever user your copy of PHP runs as...
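As far as I know, WordPress (via its WP_Filesystem layer) decides between writing files directly and falling back to FTP credentials by testing whether a file created by PHP ends up with the same owner as the script itself; a stripped-down approximation of that test looks something like this (the real implementation differs in detail):

<?php
// Approximation of the "can PHP write files as the right owner?" test.
// If a file we create is owned by the same user as this script,
// direct filesystem writes are safe; otherwise fall back to FTP.
$probe  = __DIR__ . '/.write-probe-' . uniqid();
$direct = false;

if (@file_put_contents($probe, 'x') !== false) {
    $direct = (fileowner($probe) === fileowner(__FILE__));
    unlink($probe);
}

echo $direct ? "direct writes are safe\n" : "use FTP credentials instead\n";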

dso (mod_php) and FTP/File permissions

I'm a bit baffled here. But it might just be my lack of experience.
I have set up PHP as DSO (mod_php) and my server runs smoothly and stably. The issue, though, is that in order to run PHP with includes and everything, I had to set the owner of all user account files (/home/*/public_html/*) to nobody:nobody.
This introduces two questions for me:
- Is this really necessary? I'd rather have them user:user
- What about FTP? If I upload files using FTP, they're owned by user:user, so they can't be included from another PHP file (it throws errors). Files that are owned by nobody:nobody can't be modified through FTP.
FYI: I also have SuEXEC enabled. Should I disable this?
FYI2: I know I could set all permissions to 777, but that's just wrong.
Thanks a lot!
Ordinary "nobody" should only read executing files, and write/own only files that can be changed by php. Most files owner should be your ftp user.
Bad practice to keep php rights to change executable files.
Also if "nobody" has rights to run as root it provides php (and therefore users) all his rights.

Recommended File Permissions for Web-based SQLite File

I am working on a web app that needs read/write access to an SQLite database file. As I understand it, the parent folder needs to be set to 777 in order to be able to open the database with PHP (source).
What are the recommended file permissions for the .db file itself? Bear in mind that I need to be able to overwrite the file with PHP as well.
Furthermore, are there any security risks with the database if its parent is 777? The folder is created by PHP. I'm just trying to make sure I can get work done without creating security risks.
Thanks!
You should have Apache (the web server) own the folder.
Or maybe create a group like www-dev consisting of MySQL and Apache, then set the folder's group to www-dev and the permissions to 770.
Just don't make the folder public with 777, especially on a shared server.
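To illustrate what PHP actually needs here: SQLite creates journal/WAL files next to the .db file, so the directory itself has to be writable by the web server user, not just the file. A minimal sketch with placeholder paths (a data/ directory kept outside the public docroot is safer):

<?php
// Placeholder paths; keep the database outside the public docroot if possible.
$dir = __DIR__ . '/data';
$db  = $dir . '/app.db';

// SQLite needs to create -journal / -wal files beside the database,
// so the directory must be writable, not only the .db file itself.
if (!is_writable($dir) || (file_exists($db) && !is_writable($db))) {
    die('Database directory or file is not writable by the PHP user');
}

$pdo = new PDO('sqlite:' . $db);
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);
$pdo->exec('CREATE TABLE IF NOT EXISTS notes (id INTEGER PRIMARY KEY, body TEXT)');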

PHP method to transfer whole directory using FTP?

I've been trying to come up with a way to build an automatic update system for a CMS that I'm building, which will potentially be installed on numerous servers (likely with different configurations). What I've come up with is to keep the current version uploaded to my server in a predetermined directory. The distributed systems then check that directory (on the remote server) every so often to see if a new version has been uploaded. If the version number of the uploaded version is greater than the one a particular system has, it prompts the admin to update. The files are then copied via FTP into a tmp directory, copied from tmp to replace the older versions of each file, the tmp directory is deleted, and the system's version number is increased.
The problem with this is that I haven't found a way to transfer whole directories via PHP's FTP functions. I know I can zip it and transfer it that way, but I haven't found a reliable way to unzip files in various server environments.
I know I can write my own method that just goes through each directory and transfers each file one at a time, which is fine. I just wanted to know if someone else has already written something to this effect, or if someone knows of an alternate solution to this problem, before I dive into it.
Thanks for your help!
Instead of sending a directory, why not zip the folder to save bandwidth and time, then unzip on the destination servers? (Deleted, since this was not an option for the OP.)
You could use PHAR archives for your app deployment (if you want to be bleeding-edge modern)
Or you could write an update script for Phing that gets the files from your server. You could even do checkouts from SVN then, instead of placing the build in a directory.
Try using exec() to run lftp from the command line:
/usr/bin/lftp -e 'open ftp://ftp.example.com/path/to/remote/directory; mirror --verbose; quit'
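Wrapping that in PHP could look roughly like this; the host, credentials and paths are placeholders, and lftp has to be installed on the machine running the script.

<?php
// Mirror a remote FTP directory into a local tmp directory via lftp.
// Host, credentials and paths are placeholders.
$cmd = '/usr/bin/lftp -u user,pass'
     . ' -e ' . escapeshellarg('mirror --verbose /path/to/remote/directory /local/tmp; quit')
     . ' ' . escapeshellarg('ftp.example.com');

exec($cmd . ' 2>&1', $output, $status);
if ($status !== 0) {
    echo "lftp failed:\n" . implode("\n", $output);
}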
