php + nginx + vagrant - PHP fails to write

I have a Vagrant box set up running my dev code, which is an nginx/PHP setup.
(Quick info on Vagrant - it's a VirtualBox wrapper: http://www.vagrantup.com/).
In the Vagrant/VirtualBox setup, it uses the Linux Guest Additions to mount a shared folder from my host computer (Mac OS X).
linux guest path: /var/www/local
OSX host path: ~/src/
On multiple occasions, I find that PHP can't write anything through any function (file_put_contents, fwrite, etc.) to any path on the mounted shared folder. However, it is able to write outside of /var/www/local (for example /var/www/not-mounted/..).
I find this very difficult to work with, as I am using a cache system and it keeps failing to write any of the cache JavaScript/CSS files to /var/www/local/public/root/cache/, which needs to be inside the root folder of my website (/var/www/local/public/root/index.php).
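To illustrate the symptom, a minimal probe along these lines (paths from above, the test.txt file name is made up) fails on the mount but succeeds outside it:
<?php
// hypothetical write probe - paths from the question, test.txt is made up
var_dump(is_writable('/var/www/local/public/root/cache/'));
var_dump(file_put_contents('/var/www/local/public/root/cache/test.txt', 'hello'));
var_dump(file_put_contents('/var/www/not-mounted/test.txt', 'hello'));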
I have done a lot of research on this topic.
It seems the folder mount has the right permissions:
When I run the mount command in the Linux guest, I get this:
/var/www/local on /var/www/local/ type vboxsf (uid=1000,gid=1000,rw)
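For reference, this mount comes from the Vagrantfile's shared-folder declaration; roughly something like the following line (host path from above; the owner/group options shown are an assumption about where this could be changed):
# hypothetical synced-folder declaration in the Vagrantfile
config.vm.synced_folder "~/src", "/var/www/local", owner: "vagrant", group: "vagrant"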
To clarify:
This happens all the time; it is a known problem that I keep running into and trying to work around.
From cat /etc/passwd:
vagrant:x:1000:1000:vagrant,,,:/home/vagrant:/bin/bash
Can anyone help me on this?

I have figured out the problem.
I had forgotten to give PHP the correct user privileges and permissions to write to the folder. Basically, my PHP user/group was www-data/www-data; however, Vagrant has its own user/group (vagrant/vagrant), which owns the mounted /local/ folder.
Since I did not want to mess with my Vagrant mounting behaviour, I simply changed my PHP config to run PHP as the vagrant/vagrant user/group.
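On a PHP-FPM setup that change is roughly these two lines in the pool config (the file path varies by distro and PHP version; /etc/php5/fpm/pool.d/www.conf is an assumption):
; hypothetical FPM pool config change - run the workers as the vagrant user/group
user = vagrant
group = vagrant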
This fixed the issue for me.
Thanks for the help!

Related

Where does Homestead/Vagrant put folders?

I'm new to app development, new to Laravel and new to Homestead. I've just successfully served up my first 'hello world' home page via Vagrant/Homestead.
I have a few questions:
Assuming my config is the following:
folders:
    - map: ~/Documents/projects/tests
      to: /home/vagrant/tests

sites:
    - map: test1.local
      to: /home/vagrant/tests/laravel1/HTML/public
Where is the /home/vagrant/tests folder physically located? Or where can I find this kind of info (apart from here)?
I started the server with vagrant up. OK. I see no logs in the Terminal. I was used to seeing logs during requests. So will the server run forever and ever? Or how do I eventually shut it down?
How to see logs?
Where is the /home/vagrant/tests folder physically located? Or where can I find this kind of info (apart from here)?
The /home/vagrant folder is stored in the virtual hard disk of the virtual machine. You cannot access it from your host OS.
On my computer, it's located in ~/.vagrant.d/boxes/laravel-VAGRANTSLASH-homestead/5.1.0/virtualbox/ubuntu-16.04-amd64-disk001.vmdk. It may differ on your machine.
It doesn't matter where that directory is, though, because the real files are stored in the ~/Documents/projects/tests directory, which is mapped into the virtual hard disk. Any changes you need to make to these files should be made in that directory.
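A quick way to see the mapping in action (a sketch, assuming the paths above):
# on the host, from the directory containing the Vagrantfile
vagrant ssh
# inside the VM - this listing mirrors ~/Documents/projects/tests on the host
ls /home/vagrant/tests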
I started the server with vagrant up. OK. I see no logs in the Terminal. I was used to seeing logs during requests. So will the server run forever and ever? Or how do I eventually shut it down?
The machine will run until it is stopped with vagrant halt or until you shut down your host machine.
How to see logs?
Laravel stores your logs in storage/logs within the Laravel directory. Based on your configuration, you should have log files in the ~/Documents/projects/tests/laravel1/HTML/storage/logs directory.
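For example, they can be followed from the host side of the mapping (a sketch, assuming the default laravel.log file name):
tail -f ~/Documents/projects/tests/laravel1/HTML/storage/logs/laravel.log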
It's on your VM, so it's inside the VM's directory. Depending on what VM provider you're using, it'll be in there.
The opposite of vagrant up is vagrant halt.
As mapped, it is in ~/Documents/projects/tests on the host OS and ~/tests/laravel1/HTML in the guest OS.
vagrant halt (from same directory where you issued vagrant up)
Logs on Ubuntu are in /var/log/apache2/access.log and /var/log/apache2/error.log by default. Application (PHP) errors can be seen in (in your case) /home/vagrant/tests/laravel1/HTML/storage/logs/laravel.log.
Not asked, but: you access your guest server virtual machine with the vagrant ssh command.

Docker data-only container permissions

I'm developing a PHP app and I'm stuck with Docker volumes. I've tried to create a separate data-only container for my PHP app but couldn't make it work because of permission issues... I've googled and read all I could, and the closest thing to a working solution is described here.
But it's a bit old and I couldn't make it work either.
I've created a repo with a simple test code:
https://github.com/oleynikd/docker-volumes-permissions/
This simple project uses docker-compose to run two containers:
one for PHP-FPM
one for nginx
The PHP code is mapped into the php container and lives in the /src directory.
The main problem is that PHP has no rights to write to the code's directory, because it runs as the www-data user while the code's directory belongs to the user with id 1000 and the group staff.
After running docker-compose up and visiting /index.php you'll see the warning and the ls -lah output that shows the permission issue. Here's the screenshot:
I've tried to fix this by adding RUN mkdir -p /src && chown -R www-data:www-data /src to the php Dockerfile, but that didn't help.
So the questions are:
Why are the owner and the group of /src 1000:staff?
How do I fix this?
I'm sure the solution is simple but I can't find it.
Please help!
P.S. Feel free to contribute to repo if you know how to fix this issue.
The owner of the files is 1000:staff because 1000:1000 is the uid:gid of the owner of the files on the host machine.
You could avoid it by using volumes without specifying the path of the files on the host machine, and adding the files with a COPY instruction in the Dockerfile instead. But maybe you need easy access to these files on the host?
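A minimal sketch of that approach for the php container (the base image tag and source path are assumptions based on the repo layout described above):
# hypothetical Dockerfile: bake the code into the image and set its owner explicitly
FROM php:fpm
COPY ./src /src
RUN chown -R www-data:www-data /src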
For development environments (and development environment only), I use a hacky solution that I described in this answer to manage it.

Apache2 fail in Linux Mint 17 KDE

This is only my fifth day with Linux.
I am used to XAMPP-based projects on Windows, where everything is out of the box and easy.
But unreadable Cyrillic letters in Git Bash (when typing ls -la), file names no longer than 256 characters, and absolutely non-secure access to non-public project folders forced me to switch to Linux Mint 17 KDE (a Debian/Ubuntu fork).
After long searches on the Internet I did everything mentioned below.
I am absolutely frustrated by the fact that when I type the URL of my first PHP script with <?php phpinfo(); inside (http://site1/foo.php) into the browser address bar, everything is OK.
However, the second (e.g. bar.php), the third (start.php), etc. scripts from the same folder give an ERROR (no such file on the server).
I made different sets of scripts in ~/server/site2 and ~/server/site3 and ran the necessary commands.
I made different parent folders (server2, html, sites).
I cleared the Firefox cache.
The result is the same: the first script in a newly created folder is OK, while all the following ones FAIL.
Any ideas?
Here is the code https://yadi.sk/d/Vi4VVho3bN3Ps
Sorry, I don't have comment ability yet, but just so you know, you can use XAMPP on Linux; however, it is generally not recommended outside of development.
SOLVED.
According to http://httpd.apache.org/docs/2.2/en/vhosts/ I needed name-based virtual hosts (more than one web site per IP address), described here: http://httpd.apache.org/docs/2.2/vhosts/name-based.html.
For this I needed to uncomment or create the NameVirtualHost *:80 directive in httpd.conf.
However, here https://wiki.apache.org/httpd/DistrosDefaultLayout#Debian.2C_Ubuntu_.28Apache_httpd_2.x.29 it is stated that Debian-based Linux distros use two config files (/etc/apache2/apache2.conf and /etc/apache2/ports.conf) instead of the classic httpd.conf.
Neither of my two config files had the NameVirtualHost *:80 directive.
Therefore, I created it in /etc/apache2/apache2.conf.
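For illustration, the relevant configuration looks roughly like this (the document roots are assumptions based on the folders mentioned above; on Debian-style layouts the vhost blocks usually live in files under /etc/apache2/sites-available/):
NameVirtualHost *:80

<VirtualHost *:80>
    ServerName site1
    DocumentRoot /home/user/server/site1
</VirtualHost>

<VirtualHost *:80>
    ServerName site2
    DocumentRoot /home/user/server/site2
</VirtualHost>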
This didn't work even after checking the access rights on the folders and files in my user folder (644 instead of 755 is also OK) and running sudo a2dissite site1.conf, sudo a2ensite site1.conf, service apache2 reload.
I rebooted the PC - and voila! - I can see my virtual hosts in the browser.

Laravel Homestead Serve code not creating and connecting to folder(s)

I have tried running the following command, and even though it runs and sets the nginx blocks OK, it's not linking to the folder in question on the host machine.
serve projects.dev /home/vagrant/Code/projects.dev
When I then list the folders within the Code folder on the guest machine, I only get the folders that were created via the automated YAML config file on the initial setup.
It seems not to be creating the folder and/or linking to it at all between the guest and host machines.
Running it on an iMac with OS X Mavericks.
Vagrant 1.6.2
VirtualBox 4.3.12
That is strange. What I have is a folder mapped in the "folders" section from my host machine that contains all the sub-folders for all my projects in progress. My nginx sites then just point to the sub-folders inside this directory. If you have something like that, it "should" show you the folders from your host inside that Code directory in your VM. Maybe post your YAML file so we can check it out?
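For reference, the kind of layout I am describing would look roughly like this in Homestead.yaml (the host path and project names here are assumptions, not taken from the question):
folders:
    - map: ~/projects
      to: /home/vagrant/Code

sites:
    - map: projects.dev
      to: /home/vagrant/Code/projects.dev/public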
My problem was that the new Homestead file uses a lowercase c to spell the directory "code".
I was upgrading my Homestead, so the old code folder was using a capital C, spelled "Code".
I just changed my Homestead.yaml file from calling it "code" to "Code".
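In other words, the folder mapping in Homestead.yaml has to use the same capitalisation as the path passed to serve; a minimal sketch:
folders:
    - map: ~/Code
      to: /home/vagrant/Code    # capital C, matching /home/vagrant/Code/... used by serve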

Laravel 4 - cloning the local project on the VPS

I use Laravel 4 to develop my projects.
I develop on my Mac, commit to Git, then clone the repo on the server (a Linode 1 GB VPS).
Since the "vendor" folder is git-ignored by default, I usually do "composer install" after cloning the project on the server.
After that, for any other packages I install locally, I do "composer update" on the server.
Yesterday, I reported this problem - PHP Composer update "cannot allocate memory" error (using Laravel 4)
So far, I have not found a solution. I even tried to do a "fresh" clone and "composer install", but it's still giving me the memory error. This is extremely frustrating.
My question is then: is it OK to just upload my entire project to the server? Since the "vendor" folder is the only thing that is git-ignored, if I just copy everything there, would it work? (I haven't tried it since my server is live at the moment and I don't want to damage anything.)
What is the actual role of the "compiled.php" file? Is it platform dependent? Can I copy that file too?
I've seen this memory issue quite a few times now and have read other people reporting similar issues. I hope I can just upload the entire project folder and cross my fingers that it will work.
Thanks for your help!
I do not have a VPS, or even shell access to my custom/shared hosting from my provider, but I can run git and composer commands without that.
Use sshfs: http://osxfuse.github.io/
sshfs actually makes an SFTP connection to your server and mounts the server to a local directory.
This way, you can run git and composer commands locally. You do not depend on your VPS/hosting server. sshfs sends the files to the remote server in the background.
To mount the VPS to a local dir, run this:
sshfs user@serverip:. /path/to/existing/local/dir    # mount the remote home dir
cd !$                                                # change into the mounted dir
# or
sshfs user@serverip:foldername /path/to/existing/local/dir    # mount a specific dir
cd !$                                                # change into the mounted dir
Now you can do whatever you want.
A good thing for you to know: it is possible to set up the Laravel config in such a way that the same app (the very same copy of the code) acts differently on different servers (environments).
I am writing that because if you sync your remote server with a local copy of the code, sooner or later you will stumble upon issues like having to change the db credentials or app setup after every sync - which of course doesn't make sense :)
Check out Laravel 4 Docs Environment configuration to read more about that, or follow this tutorial by Andrew Elkins - How to set Laravel 4 Environments
The environment is based on url matches.
You’ll find that configuration in /bootstrap/start.php
$env = $app->detectEnvironment(array(
    'local' => array('your-machine-name'),
));
Now say you are developing locally and use the prefix/postfix local. E.g: my-new-site.local or local.my-new-site
$env = $app->detectEnvironment(array(
    'local' => array('local.*', '*.local'),
));
That sets the environment; now to use it you'll need to create a local folder in /app/config/
mkdir app/config/local
And say you want to have a different database configuration for local. Just copy the database config file into the local directory and modify it.
cp app/config/database.php app/config/local/database.php
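In that copied file you then change only the values that differ locally; a minimal sketch (all credentials below are placeholders, not taken from the question):
<?php
// app/config/local/database.php - hypothetical local override with placeholder credentials
return array(
    'default' => 'mysql',
    'connections' => array(
        'mysql' => array(
            'driver'   => 'mysql',
            'host'     => 'localhost',
            'database' => 'my_local_db',
            'username' => 'root',
            'password' => 'secret',
            'charset'  => 'utf8',
            'prefix'   => '',
        ),
    ),
);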
To sum up and answer your question:
1) I guess it's OK to copy the whole project dir to the remote server (although, if you're copying vendor, it might take a lot of time - it usually contains a large number of files)
2) if you do so, remember to have the updated composer.json on remote server (to reflect all the necessary requirements)
3) If you are using different database servers locally and remotely, you obviously have to run migrations and seeders on the remote server (this also concerns package migrations/seeds)
4) after you migrate all your files, do
composer dump-autoload
php artisan optimize --force --env=YourProductionEnvironmentName
which should rebuild the bootstrap/autoloaders
5) if you are using the Laravel environments setup mentioned above, remember to have your remote server detected as production (if your local is testing/staging).
