Where does Homestead/Vagrant put folders? - php

I'm new to app development, new to Laravel and new to Homestead. I've just successfully served up my first 'hello world' home page via Vagrant/Homestead.
I have a few questions:
Assuming my config is the following:
folders:
    - map: ~/Documents/projects/tests
      to: /home/vagrant/tests

sites:
    - map: test1.local
      to: /home/vagrant/tests/laravel1/HTML/public
Where is the /home/vagrant/tests folder physically located? Or, where can I find this kind of info (apart from here)?
I started the server with vagrant up. OK. I see no logs in the Terminal; I was used to having logs during requests. So will the server run forever and ever? Or how do I eventually shut it down?
How do I see logs?

Where is the /home/vagrant/tests folder physically located? Or, where can I find this kind of info (apart from here)?
The /home/vagrant folder is stored in the virtual hard disk of the virtual machine. You cannot access it from your host OS.
On my computer, it's located in ~/.vagrant.d/boxes/laravel-VAGRANTSLASH-homestead/5.1.0/virtualbox/ubuntu-16.04-amd64-disk001.vmdk. It may differ on your machine.
It doesn't matter where that directory is, though, because the real files are stored in the ~/Documents/projects/tests directory, which is mapped into the virtual hard disk. Any changes you need to make to these files should be made in that directory.
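If you want to see the mapping in action, here is a quick sanity check, assuming the box is running and the paths from the Homestead.yaml above (check.txt is just a throwaway file):
echo hello > ~/Documents/projects/tests/check.txt     # on the host
vagrant ssh -c "cat /home/vagrant/tests/check.txt"    # runs inside the guest; prints: hello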
I started the server with vagrant up. OK. I see no logs in the Terminal; I was used to having logs during requests. So will the server run forever and ever? Or how do I eventually shut it down?
The machine will run until it's stopped with vagrant halt, or until you shut down your host machine.
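For reference, the usual lifecycle commands, all run from the directory containing the Vagrantfile:
vagrant up        # boot the VM
vagrant suspend   # pause it (fast to resume)
vagrant halt      # shut it down
vagrant destroy   # delete the VM entirely; mapped host folders are untouched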
How do I see logs?
Laravel stores your logs in storage/logs within the Laravel directory. Based on your configuration, you should have log files in the ~/Documents/projects/tests/laravel1/HTML/storage/logs directory.
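For example, assuming the default single-file log channel, you can follow the log live from the host:
tail -f ~/Documents/projects/tests/laravel1/HTML/storage/logs/laravel.log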

It's on your VM, so it's inside the VM's directory. Depending on which VM provider you're using, it'll be in there.
The opposite of vagrant up is vagrant halt.

As mapped: in ~/Documents/projects/tests on the host OS and ~/tests/laravel1/HTML in the guest OS.
vagrant halt (from the same directory where you issued vagrant up).
Logs on Ubuntu are in /var/log/apache2/access.log and /var/log/apache2/error.log by default for Apache (note that Homestead runs nginx by default, whose logs live in /var/log/nginx/). Application errors (PHP errors from within your app) are, in your case, in /home/vagrant/tests/laravel1/HTML/storage/logs/laravel.log.
You didn't ask, but:
you access your guest virtual machine with the vagrant ssh command.
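For example, to poke around from inside the guest (web server log paths depend on whether your box runs nginx or Apache):
vagrant ssh
ls /home/vagrant/tests                   # the same files as ~/Documents/projects/tests on the host
sudo tail -f /var/log/nginx/error.log    # or /var/log/apache2/error.log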

Related

Can't view re-uploaded PHP web files on my Linux server

I used to have a Windows server where I uploaded some old PHP web files. I could then access them, edit them, and view them online via my host name.
After much debating and reasoning, we had to change the OS of the server from Windows to Linux. After the change was completed, a backup of the server was uploaded to the new Linux installation, where all my old files were kept.
I could view these files online as I used to do when the server ran Windows.
The only thing I did encounter was the following:
a) I downloaded my files from the server using PuTTY,
b) I deleted the old copy on my Linux server,
c) I then re-uploaded the same file that used to be on the server, without making any change whatsoever to it, to the exact place where it was,
d) When I try to access it via its web address like I did earlier, it throws an error message saying..."The page isn't working".
I don't know much about Linux and therefore I am stuck. I don't know what the problem is. I can't understand why I can view all the files via their web addresses when they were placed there from the backup, but when I download one, delete it from the server, and then re-upload the exact same downloaded file to the exact place where it used to work, I get an error message.
Extra info: I connect to this Linux server from a Windows machine using PuTTY.
I found the problem. Since I migrated from a Windows server to a Linux CentOS server, I didn't know that you have to configure the privileges of each folder in order for it to be accessible from the web. By default, my uploaded files were owned by "user", and the server was configured to only serve files owned by "root". I solved this by typing the following command in the terminal.
NOTE: you have to be in the folder containing the file whose ownership you are changing.
sudo chown root:root filename.php
sudo -> Execute in admin mode
chown -> Change ownership of file to...
root:root -> ... root instead of user
filename.php -> the name of my file
Executing this corrected the error. Hope it helps someone else, since I couldn't find anything related.
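If many files are affected, a recursive variant saves doing this file by file (a sketch; /path/to/webroot is a placeholder for your actual web root):
sudo chown -R root:root /path/to/webroot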

Symfony 3.3.9 deployment to PHP 5.6 won't render default application

I am struggling with a default Symfony application from PhpStorm 2017.2.1 which works locally (Windows) but seemingly will not deploy to a remote server (Linux) and run.
Windows setup:
Windows 10, PhpStorm 2017.2.1, PHP 5.6, Symfony 3.3.9, default Symfony app (namely, I installed Symfony and am using their default app -- no personally written code). This works when run with "php bin/console server:run" from a command prompt and "localhost:8000" in a browser, generating "Welcome to Symfony 3.3.9" / "Your application is now ready. You can start working...."
I believe nothing is wrong here.
Remote Linux Setup:
Shared hosting, bash 4.2.46, I don't have root access, and everything is supposed to go in the public_html directory. A simple one-line phpinfo.php file containing only
"<?php phpinfo(); ?>"
placed in that directory will render the usual phpinfo() dump when accessing
www.example.com/phpinfo.php
so I believe the PHP server is operational. I'm not sure whether there is some configuration error with Apache causing problems on the server for something more complicated like a default Symfony application, or whether the issue is my PHPstorm configuration.
PHPstorm deploy configuration:
SFTP to www.example.com (test connection works, files will upload).
root path: /home/example/public_html
username and password work, auth type=password.
Web server root URL=sftp://ftp.example.com,
and it does allow files to be browsed on the server.
PHPstorm mappings:
local path=c:\blah (works, since files upload)
deployment path on server=/
web path on server=/web
I have only one mapping.
PHPstorm excluded paths:
empty
Upload and Prepare:
Cleared the Linux public_html directory of everything, including dot files. Uploaded the local app to the remote Linux server. Logged on via SSH and chmod'ed everything in public_html and below to 777 (yes, horrible security practice, but this is a test, there is nothing else on the domain at the moment, and this rids me of any security-protection issues for the test).
Test results, in order:
Test 1: Browsing to www.example.com results in:
Forbidden
You don't have permission to access / on this server.
Additionally, a 403 Forbidden error was encountered while trying to use an ErrorDocument to handle the request.
Test 2: Browsing to www.example.com/web results in:
Internal Server Error
The server encountered an internal error or misconfiguration and was unable to complete your request.
Please contact the server administrator at webmaster@example.com to inform them of the time this error occurred, and the actions you performed just before this error.
More information about this error may be available in the server error log.
Additionally, a 500 Internal Server Error error was encountered while trying to use an ErrorDocument to handle the request.
Logs: Performing
ls -alsgR | grep log
produces nothing that appears to be a server logfile.
What process do I follow to get the default application to work?
This appears to have been a web server issue.
My web hosting provider would not let me see the httpd.conf file for this shared hosting situation, but did run test cases for me, and for whatever the configuration is, determined that:
1. All files within public_html/ need to have permissions 644 (777 is not good enough).
2. All directories within public_html/ need to have permissions 755 (777 is not good enough).
(Shared hosts commonly run PHP under suEXEC/suPHP, which refuses to serve world- or group-writable files, so the 777 permissions themselves likely triggered the errors.)
The two bash commands to do this, run after cd'ing into the public_html/ folder, are:
find . -type f -exec chmod 644 {} \;
find . -type d -exec chmod 755 {} \;
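To double-check that nothing was missed, the following should print nothing (same assumptions as above; -perm with no prefix matches the exact mode):
find . -type f ! -perm 644
find . -type d ! -perm 755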
Very frustrating problem solved w.r.t. the server.
Although I haven't fully checked it out, PHPstorm does appear to have, under file/settings/Build/Deployment/Options, an ability to override default protections for files and directories, so I believe it likely that any problems occurring with PHPstorm can be fixed by configuration of those overrides.

Apache2 fails in Linux Mint 17 KDE

I have been using Linux for only five days.
I am used to XAMPP-based projects on Windows, where everything is out-of-the-box and easy.
But unreadable Cyrillic letters in Git Bash (when typing ls -la), file names limited to 256 characters, and absolutely non-secure access to non-public project folders forced me to switch to Linux Mint 17 KDE (an Ubuntu fork, Debian-based).
After long searches on the Internet I did everything mentioned below.
I am absolutely frustrated by the fact that when I type into the browser address bar the URL of my first PHP script with <?php phpinfo(); inside (http://site1/foo.php), everything is OK.
However, the second (e.g. bar.php), the third (start.php), etc. scripts from the same folder give an ERROR (no such file on server).
I made different sets of scripts in ~/server/site2 and ~/server/site3 and ran the necessary commands.
I made different parent folders (server2, html, sites).
I cleared the Firefox cache.
The result is the same: the first script in a newly created folder is OK; all the following ones FAIL.
Any ideas?
Here is the code https://yadi.sk/d/Vi4VVho3bN3Ps
Sorry, I don't have comment ability yet, but just so you know: you can use XAMPP on Linux, although it is generally not recommended outside of development.
SOLVED.
According to http://httpd.apache.org/docs/2.2/en/vhosts/ I needed name-based virtual hosts (more than one web site per IP address), described here: http://httpd.apache.org/docs/2.2/vhosts/name-based.html.
For this I needed to uncomment or create the NameVirtualHost *:80 directive in httpd.conf.
However, https://wiki.apache.org/httpd/DistrosDefaultLayout#Debian.2C_Ubuntu_.28Apache_httpd_2.x.29 states that Debian-based Linux distros use two config files (/etc/apache2/apache2.conf and /etc/apache2/ports.conf) instead of the classic httpd.conf.
Neither of my two config files had the NameVirtualHost *:80 directive.
Therefore, I created it in /etc/apache2/apache2.conf.
This didn't work even after checking the access rights to the folders and files in my user folder (644 instead of 755 is also OK) and running sudo a2dissite site1.conf, sudo a2ensite site1.conf, service apache2 reload.
I rebooted the PC - and voilà! - I saw my virtual hosts in the browser.
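For reference, a minimal name-based setup in Apache 2.2 syntax looks roughly like this (the ServerName values and DocumentRoot paths are placeholders for your sites):
NameVirtualHost *:80

<VirtualHost *:80>
    ServerName site1
    DocumentRoot /home/user/server/site1
</VirtualHost>

<VirtualHost *:80>
    ServerName site2
    DocumentRoot /home/user/server/site2
</VirtualHost>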

Laravel 4 - cloning the local project on the VPS

I use Laravel 4 to develop my projects.
I develop on my Mac, commit to Git, then clone it on the server (linode 1G VPS).
Since "vendor" folder is by default "GIT-ignored", I usually do "composer install" after cloning the project on the server.
After that, any other packages I install locally, I do "composer update" on the server.
Yesterday, I reported this problem - PHP Composer update "cannot allocate memory" error (using Laravel 4)
So far, I have not found a solution. I even tried a "fresh" clone and "composer install"; it gives me the memory error. This is extremely frustrating.
My question then is: is it OK to just upload my entire project to the server? Since the "vendor" folder is the only thing that is git-ignored, if I just copy everything there, would it work? (I haven't tried it, since my server is live at the moment and I don't want to damage anything.)
What is the actual role of the "compiled.php" file? Is it platform-dependent? Can I copy that file too?
I've seen this memory issue quite a few times now and have read other people reporting similar issues. I hope I can just upload the entire project folder and cross my fingers that it will work.
Thanks for your help!
I do not have a VPS, or even shell access to my custom/shared hosting from my provider, but I can run git and composer commands without that.
Use sshfs http://osxfuse.github.io/
sshfs makes an SFTP connection to your server and mounts the server to a local directory.
This way, you can run git and composer commands locally. You do not depend on your VPS/hosting server; sshfs sends the files to the remote server in the background.
To mount the VPS to a local dir, run this:
sshfs user@serverip:. /path/to/existing/local/dir            # to mount the root dir
cd !$                                                        # to get into the mounted dir
or:
sshfs user@serverip:foldername /path/to/existing/local/dir   # to mount a specific dir
cd !$                                                        # to get into the mounted dir
Now you can do whatever you want.
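When you are finished, unmount the directory again (on OS X a plain umount works for FUSE mounts):
umount /path/to/existing/local/dir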
A good thing for you to know: it is possible to set up the Laravel config in such a way that the same app (the very same copy of the code) can act differently on different servers (environments).
I am writing that because if you sync your remote server with a local copy of the code, sooner or later you will stumble upon issues like having to change the DB credentials or app setup after every sync - which of course doesn't make sense :)
Check out the Laravel 4 docs on Environment configuration to read more about that, or follow this tutorial by Andrew Elkins - How to set Laravel 4 Environments.
The environment is based on url matches.
You’ll find that configuration in /bootstrap/start.php
$env = $app->detectEnvironment(array(
    'local' => array('your-machine-name'),
));
Now say you are developing locally and use the prefix/suffix "local", e.g. my-new-site.local or local.my-new-site:
$env = $app->detectEnvironment(array(
    'local' => array('local.*', '*.local'),
));
That sets the environment; now to use it you'll need to create a local folder in /app/config/:
mkdir app/config/local
Say you want a different database configuration for local: just copy the database config file into the local directory and modify it.
cp app/config/database.php app/config/local/database.php
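For illustration, the local copy can be trimmed down to just the values you change, since Laravel 4 merges environment config over the base file (the credentials below are placeholders):
<?php
// app/config/local/database.php - overrides app/config/database.php
// when the detected environment is "local"; all values are placeholders
return array(
    'connections' => array(
        'mysql' => array(
            'host'     => 'localhost',
            'database' => 'local_db',
            'username' => 'local_user',
            'password' => 'secret',
        ),
    ),
);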
To sum up and answer your question:
1) I guess it's OK to copy the whole project dir to the remote server (although, if you're copying vendor, it might take a while - it usually contains a large number of files).
2) If you do so, remember to have the updated composer.json on the remote server (to reflect all the necessary requirements).
3) If you are using different database servers locally and remotely, you obviously have to run migrations and seeders on the remote server (this concerns package migrations/seeds as well).
4) After you migrate all your files, run
composer dump-autoload
php artisan optimize --force --env=YourProductionEnvironmentName
which should rebuild the bootstrap/autoloaders
5) if you are using the Laravel Environments setup mentioned above, remember to have your remote server seen as production (if your local is testing/staging).

php+nginx+vagrant - php fails to write

I have a Vagrant box set up running my dev code, which is an nginx/PHP setup.
(Quick info on Vagrant - it's a VirtualBox wrapper: http://www.vagrantup.com/.)
In the Vagrant/VirtualBox setup, the Linux guest additions are used to mount a shared folder from my host computer (Mac OS X).
linux guest path: /var/www/local
OSX host path: ~/src/
On multiple occasions, I find that PHP can't write anything through any function (file_put_contents, fwrite, etc.) to any path on the mounted shared folder; however, it is able to write outside of /var/www/local (for example /var/www/not-mounted/..).
I find this very difficult to work with, as I am using a cache system and it keeps failing to write any of the cached JavaScript/CSS files to /var/www/local/public/root/cache/, which I need to be in the root folder of my website (/var/www/local/public/root/index.php).
I have done a lot of research on this topic:
it seems the folder mount has the right permissions.
When I type the mount command in the Linux guest, I get this:
/var/www/local on /var/www/local/ type vboxsf (uid=1000,gid=1000,rw)
To clarify:
this happens all the time; it is a known problem that I try to work around.
From cat /etc/passwd:
vagrant:x:1000:1000:vagrant,,,:/home/vagrant:/bin/bash
Can anyone help me on this?
I have figured out the problem.
I had forgotten to give PHP the correct user privileges and permissions to write to the folder. Basically, my PHP user/group was www-data/www-data; however, Vagrant has its own user/group (vagrant/vagrant), which is what mounts the /local/ folder.
Since I did not want to mess with my Vagrant mounting behaviour, I simply changed my PHP config to run PHP as the user/group vagrant/vagrant.
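For reference, with PHP-FPM that change is two lines in the pool config, followed by a restart (the paths below assume PHP 5 on Ubuntu and vary by distro/version):
; e.g. /etc/php5/fpm/pool.d/www.conf
user = vagrant
group = vagrant
Then restart the service:
sudo service php5-fpm restart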
This fixed the issue for me.
Thanks for the help!
