The update cannot be installed because we will be unable to copy some files. This is usually due to inconsistent file permissions.
This happens when trying to update plugins from the admin interface. I've set the user to www-data for both directories and files … but nothing works.
What do I have to do to simply be able to use WordPress as it is intended?
This issue is usually caused by incorrect file permissions or ownership on your server (rather than being a problem with the plugin itself). Basically, WordPress isn't able to properly access its own plugins folder and, as a result, can't put the updated plugin files in there.
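Before changing anything, it can help to confirm the mismatch. A quick diagnostic sketch (the /var/www/your_site path is a placeholder assumption; use your real WordPress root):
# Who owns the plugins folder, and what are its permissions?
ls -ld /var/www/your_site/wp-content/plugins
# Which user do the web server worker processes run as?
ps -eo user,comm | egrep 'apache2|httpd|nginx' | sort -u
If the owner shown by ls doesn't match the web server user, the fixes below should resolve it.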
Permissions
Permissions can be set to 644 for files and 755 for folders. To do so, you can run two quick find commands.
To set all folders in your website path to 755 permissions, run the following command:
find /path/to/your_website -type d -exec chmod 755 {} \;
To set all files in your website root directory to 644 permissions, you can run:
find /path/to/your_website -type f -exec chmod 644 {} \;
Please make sure to replace /path/to/your_website with your real path.
Ownership
Ownership means which user and group control the files. Usually, that's www-data. So what you'll need to do is
sudo chown -R www-data:www-data /path/to/your_website
Please make sure to replace /path/to/your_website with your real path.
Once that's done, you're good to go.
I've got the following problem with Laravel (5.1) sessions on Ubuntu 14.04. On every request a new session file is generated in storage/framework/sessions. As you have already guessed, the session driver is 'file' and 'lifetime' is set to 120. This seems to be some sort of permission error. I've set the permissions of the storage folder to 755 (also 777), but every newly generated session file has the permissions 664 (rw-rw-r--). Via Google I've only found a session issue related to dd(...), but that is not the case here, especially since it works fine in a Windows environment.
What I originally wanted to do is use the redirect()->intended(), which uses the information stored in the session.
Do I have to run php artisan serve in a special way?
Now that I use Vagrant and Homestead everything is fine. Running the PHP built-in webserver seems to be only a kludge anyway.
Test the functionality by giving all permissions to the storage folder:
find /path/to/storage/folder -type d -exec chmod 777 {} \;
If it works, then set the proper permissions, which in most cases should be:
find /path/to/storage/folder -type d -exec chmod 775 {} \;
That way you only change the permissions of the directories, not the files.
If the problem is that newly created files are being given a different owner, you can set the setgid bit on the directories so they inherit the group.
For example, when I'm developing on my local server I have my own user, while www-data is the one Apache uses, so if Apache creates a new file the ownership starts to get messed up.
find /path/to/root/of/project -type d -exec chmod g+s {} \;
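The setgid bit only helps if the directories are group-owned by the group your web server runs as, so you would typically pair it with a chgrp. A minimal sketch, assuming www-data is that group:
# Hand group ownership of the project to the web server group first
sudo chgrp -R www-data /path/to/root/of/project
# After the find/chmod g+s above, directories show an 's' in the group slot
ls -ld /path/to/root/of/project/storage   # e.g. drwxrwsr-x user www-data ...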
I am learning PHP at the moment on Linux. I have an Apache2 server running locally. Whenever I tried to save a PHP file into the root directory of the Apache2 server (/var/www/html/), I was told that permission was denied.
So, I searched around and found that by default, the administrator does not have root access unless it is explicitly requested (like sudo su). I have also seen some posts which ask me to use gksu nautilus. However, my Ubuntu 14.04 LTS doesn't come with it. (I know I can use apt-get to install gksu, but at the moment downloading it from the internet is not an option.)
Is there any way that I can change the permissions of my Apache2 server root directory so that I can use any text editor to save/edit files in that directory directly? Only ways that do not require downloading anything from the internet are feasible for me at the moment.
On Linux, open a terminal with a root login, then go to the web root folder and run chmod as in the following examples:
To change all the directories to 777 (rwxrwxrwx):
find /opt/lampp/htdocs -type d -exec chmod 777 {} \;
To change all the files to 644 (rw-r--r--):
find /opt/lampp/htdocs -type f -exec chmod 644 {} \;
If this does not work, then try the following:
Create a new group
groupadd webadmin
Add your users to the group
usermod -a -G webadmin user1
usermod -a -G webadmin user2
Change ownership of the sites directory
chown root:webadmin /var/www/html/
Change permissions of the sites directory
chmod 2775 /var/www/html/ -R
Now anybody can read the files (including the Apache user), but only root and members of webadmin can modify their contents.
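To sanity-check the setup, you can verify the group membership and that a group member can actually write there (user1 is the example user from above; a fresh login or newgrp is needed before the new group shows up in an existing session):
# Confirm user1 now belongs to webadmin
id user1
# Confirm the directory is group-owned by webadmin with the setgid bit (drwxrwsr-x)
ls -ld /var/www/html/
# Test that a webadmin member can create a file there
sudo -u user1 touch /var/www/html/write-test && ls -l /var/www/html/write-test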
Hope this will help you in solving your problem.
You can set the DocumentRoot in your /etc/apache2/httpd.conf file to a place where Apache has write access. For example, you could set it to /tmp/www if you made a directory there. (If you still don't have access, you can always give everyone read access by running chmod a+r /tmp/www, but you should probably be fine.)
Obviously leaving your Apache Document Root as /tmp/www is a bad idea, so you can change it to something like /home/chris once you've got it working.
One important note: after you make a change like this, you must restart the Apache server. This can be done by running apachectl restart; ironically, you might have to have administrator rights in order to execute this (or even edit the config file in the first place), so make sure you prefix your edit & restart with sudo just in case.
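Putting those steps together, a rough sketch might look like this (the /tmp/www location comes from the example above; the config file path is the one mentioned there and may differ on your distribution):
# Create a document root your own user can write to
mkdir -p /tmp/www
chmod a+rx /tmp/www                           # let Apache read and traverse it
echo '<?php phpinfo();' > /tmp/www/index.php  # quick test page
# Edit the config and point DocumentRoot at the new directory
sudo nano /etc/apache2/httpd.conf             # change: DocumentRoot "/tmp/www"
# Restart Apache so the change takes effect
sudo apachectl restart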
I am new to Laravel. I was trying to open http://localhost/test/public/ and I got
Error in exception handler.
I googled around and changed the permissions of the storage directory using chmod -R 777 app/storage, but to no avail.
I changed debug=>true in app.php and visited the page and got Error in exception handler:
The stream or file "/var/www/html/test/app/storage/logs/laravel.log"
could not be opened: failed to open stream: Permission denied in
/var/www/html/test/bootstrap/compiled.php:8423
Then I changed the permissions of the storage directory using the command chmod -R 644 app/storage, the 'Error in exception handler' error was gone, and a page loaded. But there I am getting this:
file_put_contents(/var/www/html/laravel/app/storage/meta/services.json):
failed to open stream: Permission denied
Suggestion from vsmoraes worked for me:
Laravel >= 5.4
php artisan cache:clear
chmod -R 775 storage/
composer dump-autoload
Laravel < 5.4
php artisan cache:clear
chmod -R 775 app/storage
composer dump-autoload
For those facing this problem with Laravel 5, this is a permission issue caused by different users trying to write to the same log file within the storage/logs folder with different permissions.
What happens is that your Laravel config is probably set up to log errors daily, so your web server (Apache/nginx) may create this file under a default user; depending on your environment that can be something like _www on OS X or www-data on *nix systems. The issue comes when you then run some artisan commands and hit errors: artisan writes to this file as a different user, because PHP on the terminal is executed by your login user. You can check which user that is by running this command:
php -i | grep USER
If your login user created that log file, your web server will not be able to write errors to it, and vice versa, because Laravel writes log files with 655 permissions by default, which only allow the owner to write to them.
To fix this temporarily, you have to manually give the file 664 permissions so that both your login user and the web server user can write to that log file.
To avoid this issue permanently, you may want to set up proper permissions so that new files created within the storage/logs directory inherit them from the directory; this answer https://unix.stackexchange.com/a/115632 can help you tackle that, and a rough sketch follows below.
You should not give 777 permissions. It's a security risk.
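As a rough sketch of that permanent fix (assuming www-data is your web server group and you run this from the Laravel project root):
# Make the web server group the group owner of the logs directory
sudo chgrp -R www-data storage/logs
# setgid: files created here later inherit the www-data group automatically
sudo chmod g+s storage/logs
# Group-writable directory, group-writable existing logs
sudo chmod 775 storage/logs
sudo chmod 664 storage/logs/*.log
# Alternative if your filesystem supports ACLs: a default ACL also covers new files
# sudo setfacl -d -m g:www-data:rw storage/logs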
For Ubuntu users, in Laravel 5, I suggest changing the owner of the storage directory recursively:
Try the following:
sudo chown -R www-data:www-data storage
On Ubuntu-based systems, www-data is the Apache user.
For everyone using Laravel 5, Homestead and Mac try this:
mkdir storage/framework/views
Sometimes SELinux causes this problem;
you can disable SELinux with this command:
sudo setenforce 0
NEVER GIVE IT PERMISSION 777!
Go to the directory of the Laravel project in your terminal and write:
sudo chown -R your-user:www-data /path/to/your/laravel/project/
sudo find /same/path/ -type f -exec chmod 664 {} \;
sudo find /same/path/ -type d -exec chmod 775 {} \;
sudo chgrp -R www-data storage bootstrap/cache
sudo chmod -R ug+rwx storage bootstrap/cache
This way you're making your user the owner and giving privileges:
1 Execute, 2 Write, 4 Read
1+2+4 = 7 means (rwx)
2+4 = 6 means (rw-)
Finally, for the storage access, ug+rwx means you're giving both the user and the group a 7 (rwx).
Problem solved
php artisan cache:clear
sudo chmod -R 777 vendor storage
This enables write permission for the app, framework and logs directories. Hope this helps.
For vagrant users, the solution is:
(in vagrant) php artisan cache:clear
(outside of vagrant) chmod -R 777 app/storage
(in vagrant) composer dump-autoload
Making sure you chmod in your local environment and not inside vagrant is important here!
Try again with chmod -R 755 /var/www/html/test/app/storage. Use sudo if chmod reports 'Operation not permitted'. Check the owner and permissions if you are still getting the error.
As of Laravel 5.4, which is the latest as I write this, if you have any problem like this you need to change the permissions.
DO NOT LISTEN TO ANYONE WHO TELLS YOU TO SET 777 FOR ANY DIRECTORY.
It's a security risk.
Change the permission of storage folder like this
sudo chmod -R 775 storage
Change bootstrap folder permission like this
sudo chmod -R 775 bootstrap/cache
Now please make sure that you're executing both commands from your application directory. You won't face permission problems in the future. 775 doesn't compromise the security of your machine.
To set the correct ownership, if you are using Apache:
sudo chown -R apache:apache apppath/app/storage
FOR ANYONE RUNNING AN OS WITH SELINUX: The correct way of allowing httpd to write to the laravel storage folder is:
sudo semanage fcontext -a -t httpd_sys_rw_content_t '/path/to/www/storage(/.*)?'
Then to apply the changes immediately:
sudo restorecon -F -r '/path/to/www/storage'
SELinux can be a pain to deal with, but if it's present then I'd STRONGLY ADVISE you learn it rather than bypassing it entirely.
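To confirm SELinux really is what's blocking you (before or after applying the rule above), a few checks like these can help; the path is the same placeholder as above, and ausearch needs the audit tools installed:
# Is SELinux enforcing at all?
getenforce
# Show the SELinux context on the storage folder;
# after restorecon it should include httpd_sys_rw_content_t
ls -Zd /path/to/www/storage
# Recent SELinux denials involving httpd
sudo ausearch -m avc -ts recent | grep httpd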
If you have Laravel 5 and are looking for a permanent solution, applicable to both php artisan command-line usage and the Apache server, use this:
sudo chmod -R 777 vendor storage
echo "umask 000" | sudo tee -a /etc/resolv.conf
sudo service apache2 restart
See detailed explanation here.
I had the same issue and the below steps helped me fix the issue.
Find out the Apache user: create a test.php file in the public folder with the code
<?php echo exec('whoami'); ?>
And run the file from a web browser. It will give you the Apache user. In my case it is ec2-user, as I was using AWS with a cron job installed in /etc/cron.d/. It could be a different user for others.
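If you'd rather not drop a test PHP file into the public folder, a process listing gives roughly the same answer (the name patterns here are just common examples):
# List the users that web server / PHP worker processes run as
ps axo user:20,comm | egrep 'apache2|httpd|nginx|php-fpm' | sort | uniq -c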
Run the below command on the command line.
sudo chown -R ec2-user:<usergroup> /app-path/public
You need to identify and use the right "user" and "usergroup" here.
I had the same problem but in the views directory:
file_put_contents(/var/www/app/storage/framework/views/237ecf97ac8c3cea6973b0b09f1ad97256b9079c.php): failed to open stream: Permission denied
And I solved it by clearing the views cache directory with the following artisan command:
php artisan view:clear
For XAMPP users:
cd /Applications/XAMPP/htdocs
chmod -R 775 test/app/storage
From Setting Up Laravel 4.x on Mac OSX 10.8+ with XAMPP
Any time I change app.php I get a permission denied error writing bootstrap/cache/services.json, so I did this to fix it:
chmod -R 777 bootstrap/cache/
rm storage/logs/laravel.log
This solved it for me.
Setting permissions to 777 is definitely a terrible idea!
... but
If you are getting a permission error connected with the "storage" folder, this is what worked for me:
1) Set "storage" and its subfolders permission to 777 with
sudo chmod -R 777 storage/
2) In the browser, go to the Laravel home page laravel/public/ (Laravel will create the necessary initial storage files)
3) Restore the safe 775 permissions on storage and its subfolders
sudo chmod -R 775 storage/
If using laradock, try chown -R laradock:www-data ./storage in your workspace container
In my case the solution was to change the permissions of the app/storage/framework/views and app/storage/logs directories.
After a lot of trial and error with directory permissions I ended up with an epiphany... there was no space left on the disk partition. Just wanted to share, to make sure nobody else keeps looking for the solution in the wrong direction.
In Linux you can use df -h to check your disk size and free space.
This issue is actually caused by different users wanting to read/write a file but being denied because of different ownership. Maybe you installed Laravel as 'root' and then log in to your site as the 'laravel' user, with 'laravel' as the default owner; that is the real issue here. So when the 'laravel' user wants to read/write files on disk, it is denied, because those files are owned by 'root'.
To solve this problem you can do the following:
sudo chown -hR your-user-name /root /nameforlder
or in my case
sudo chown -hR igmcoid /root /sublaravel
Footnote:
root is the original owner who did the installation.
your-user-name is the user who actually reads/writes files on the site.
namefolder is the name of the folder whose ownership you want to change.
If you use Linux or Mac, you can also run this in an SSH terminal. Use the terminal to run these commands:
php artisan cache:clear
sudo chmod -R 777 storage
composer dump-autoload
If you are using Windows, you can run them using Git Bash:
php artisan cache:clear
chmod -R 777 storage
composer dump-autoload
You can download Git from https://git-scm.com/downloads.
If anyone else runs into a similar issue with an fopen file permissions error, but is wise enough not to blindly chmod 777, here is my suggestion.
Check the command you are using against the permissions Apache needs:
fopen('filepath/filename.pdf', 'r');
The 'r' means open for reading only, and if you aren't editing the file, this is what you should have it set to. This means apache/www-data needs at least read permission on that file; if the file was created through Laravel, it will have read permission already.
If for any reason you have to write to the file:
fopen('filepath/filename.pdf', 'r+');
Then make sure apache also has permissions to write to the file.
http://php.net/manual/en/function.fopen.php
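If you do need 'r+', a minimal way to give the web server write access to just that one file, rather than loosening a whole directory (www-data and the path are assumptions based on the example above):
# Let the web server group write to this single file
sudo chgrp www-data filepath/filename.pdf
sudo chmod g+w filepath/filename.pdf
ls -l filepath/filename.pdf   # group permissions should now show rw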
Just start your server using artisan:
php artisan serve
Then access your project from the URL it prints.
I had the same issue when running Vagrant on a Mac. I solved the problem by changing the Apache server's user in the httpd.conf file:
# check user for php
[vagrant] ubuntu ~ $ php -i | grep USER
USER => ubuntu
$_SERVER['USER'] => ubuntu
[vagrant] ubuntu ~ $
Run Apache under the PHP user instead of the daemon user to resolve the file access issue with PHP:
# change default apache user from daemon to php user
sudo sed -i 's/User daemon/User ubuntu/g' /opt/lampp/etc/httpd.conf
sudo sed -i 's/Group daemon/Group ubuntu/g' /opt/lampp/etc/httpd.conf
Now PHP-created cache files can be read and edited by Apache without any permission errors.
I got the same errors in my project...
But found out that I had forgotten to put the enctype in my form.
<form method="#" action="#" enctype="multipart/form-data">
Hope it helps somewhere, somehow...
While working on Windows 10 with Laragon and Laravel 4, it seemed to me there was no way to change the permissions manually, since executing chmod commands in Laragon's built-in terminal had no effect.
However, it was possible in this terminal to go to the storage folder and manually add the desired folders like this:
cd app/storage
mkdir cache
mkdir meta
mkdir views
mkdir sessions
The cd-command in the terminal brings you to the folder (you might need to adjust this path to suit your file structure).
The mkdir-command will create the directory with the given name.
I did not have the opportunity to test this approach in Laravel 5, but I expect that a similar approach should work.
Of course there might be a better way, but at least this was a reasonable workaround for my situation (fixing the error: file_put_contents(/var/www/html/laravel/app/storage/meta/services.json): failed to open stream).
First, delete the storage folder, then create it again.
Inside the storage folder create a new folder named framework.
Inside the framework folder create three folders named cache, sessions and views.
I have solved my problem by doing this.
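A minimal shell sketch of those steps, assuming you run it from the Laravel project root (mkdir -p is harmless if some of the folders already exist):
# Recreate the storage skeleton Laravel expects
mkdir -p storage/framework/cache storage/framework/sessions storage/framework/views
# Laravel also expects storage/logs and storage/app; create them too if missing
mkdir -p storage/logs storage/app
# Make it writable by the web server group (www-data is an assumption)
sudo chgrp -R www-data storage
sudo chmod -R 775 storage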
I have WordPress running on my localhost on Mac OS X Lion.
Every time I try to install or delete plugins it asks me for a hostname, FTP username and FTP password.
I configured my localhost as 127.0.0.1, but I have never configured an FTP username and password for my localhost. How can I find out which user and password it uses by default?
I have tried almost every user and password I have for MySQL, my OS X admin, etc., with no results.
Any ideas?
Update: as mentioned by Chaudhry Waqas in the comments, it might be sufficient to just add define('FS_METHOD', 'direct'); to wp-config.php, so please try this first, as it's safer not to change access rights if not necessary.
WARNING: only do this on your local computer, it's a huge security risk on a public installation!
I haven't tried it, but as mentioned in #misterfancypants' comment, changing those settings only for wp-content/plugins/ should be sufficient
(updated to incorporate this info)
This one worked for me
$ cd /Users/<username>/Sites
# (wordpress = name of the directory, change as needed)
$ sudo chown -R :_www wordpress
$ sudo chmod -R g+w wordpress
and then add the following in wp-config.php
define('FS_METHOD', 'direct');
found on http://soderlind.no/running-wordpress-locally-on-mac-os-x-lion/#crayon-533a956214a8e343167867
Cheers, Can
I fixed it by:
cd /var/www
sudo chown -R www-data:www-data wordpress
Updated on 03-05-2019
cd /var/www
sudo chown -R www-data:www-data [YOUR_WORDPRESS_PROJECT_DIR]
Actually, the problem is that WordPress creates a temp file to check the file permissions
and compares that temp file's owner with the owner of one of its core files (see fileowner());
both should match. In most cases they do not match on localhost, because we extracted the WP files as one user while PHP runs under its own user and group.
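You can see the mismatch that check is complaining about with something like this, run from the WordPress root (the process-name pattern is just an example; on macOS use stat -f '%Su' instead of stat -c '%U'):
# Owner of a WordPress core file (what the check compares against)
stat -c '%U' wp-admin/index.php
# User the PHP / web server process runs as, i.e. the owner the temp file will get
ps -eo user,comm | egrep 'apache2|httpd|nginx|php-fpm' | sort -u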
So there are 2 ways to solve this problem.
Way 1:
cd wordpress
sudo find . -type d -exec chmod 0755 {} \;
sudo find . -type f -exec chmod 0644 {} \;
and add the following
define( 'FS_METHOD', 'direct' );
in wp-config.php
This skips the file-owner check and just uses the direct filesystem method.
Way 2:
set
sudo chown -R www-data:www-data wordpress
This puts the WordPress files under the www-data user, so the temp file (created by WordPress) is also owned by that user. Both file owners are then the same, so the problem is solved.
For more info, refer to: https://developer.wordpress.org/reference/functions/get_filesystem_method/
Add this to your wp-config.php file:
define('FS_METHOD','direct');
In my experience, WordPress can be a bit fussy about permissions and ownership when it comes to self-update without FTP, so using FTP to localhost is a perfectly valid tactic, I'd say. But as others have said, just ensuring that everything from your WordPress root directory on downwards is writable by the PHP process, and owned by the same user, may well be enough to avoid the need for FTP.
If you do want to use FTP, are you sure you've enabled the FTP server? If so, you should just use a user who has permission to get to the directory via FTP (you can test with the command-line ftp tool.) As my sites are set up in my personal Sites directory, I just use my normal username and password (e.g. for /Users/matt/Sites/whatever I log in as matt.)
Other things to check: What happens if you try ftp localhost on the command line? Can you log in there?
Edit the wp-config.php file and add the piece of code mentioned below:
define('FS_METHOD','direct');