I use Laravel 4 to develop my projects.
I develop on my Mac, commit to Git, then clone the project on the server (a Linode 1 GB VPS).
Since "vendor" folder is by default "GIT-ignored", I usually do "composer install" after cloning the project on the server.
After that, any other packages I install locally, I do "composer update" on the server.
Yesterday, I reported this problem: PHP Composer update "cannot allocate memory" error (using Laravel 4).
So far I have not found a solution. I even tried a fresh clone followed by "composer install", and it still gives me the memory error. This is extremely frustrating.
My question, then: is it OK to just upload my entire project to the server? Since the "vendor" folder is the only thing that is Git-ignored, if I just copy everything over, would it work? (I haven't tried it, since my server is live at the moment and I don't want to break anything.)
What is the actual role of the "compiled.php" file? Is it platform-dependent? Can I copy that file too?
I've seen this memory issue quite a few times now and have read other people reporting similar issues. I hope I can just upload the entire project folder and cross my fingers that it works.
Thanks for your help!
I do not have a VPS, or even shell access to my custom/shared hosting from my provider, but I can run git and composer commands without that.
Use sshfs (http://osxfuse.github.io/).
sshfs opens an SFTP connection to your server and mounts a remote directory onto a local one.
This way you can run git and composer commands locally; you do not depend on your VPS/hosting server. sshfs sends the files to the remote server in the background.
To mount the VPS onto a local dir, run this:
sshfs user@serverip:. /path/to/existing/local/dir    # to mount the remote login (home) dir
cd !$                                                # to get into the mounted dir
# or
sshfs user@serverip:foldername /path/to/existing/local/dir    # to mount a specific dir
cd !$                                                # to get into the mounted dir
Now you can do whatever you want.
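A typical round trip might then look like this (user, host and directory names are placeholders, assuming an osxfuse/sshfs install):
mkdir -p ~/mnt/myproject
sshfs user@serverip:myproject ~/mnt/myproject    # mount the remote project dir
cd ~/mnt/myproject
composer install                                 # PHP runs locally, the resulting files land on the server over SFTP
git pull origin master
cd ~ && umount ~/mnt/myproject                   # unmount when finished (on OS X)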
A good thing to know: it is possible to set up the Laravel config so that the same app (the very same copy of the code) behaves differently on different servers (environments).
I am writing that because, if you sync your remote server with a local copy of the code, sooner or later you will stumble upon issues like having to change the DB credentials or app setup after every sync - which of course doesn't make sense :)
Check out Laravel 4 Docs Environment configuration to read more about that, or follow this tutorial by Andrew Elkins - How to set Laravel 4 Environments
The environment detection is based on URL matches.
You’ll find that configuration in /bootstrap/start.php
$env = $app->detectEnvironment(array(
'local' => array('your-machine-name'),
));
Now say you are developing locally and use the prefix/suffix "local", e.g. my-new-site.local or local.my-new-site:
$env = $app->detectEnvironment(array(
'local' => array('local.*','*.local'),
));
That sets the environment. Now, to use it, you'll need to create a local folder in /app/config/:
mkdir app/config/local
Say you want a different database configuration for local. Just copy the database config file into the local directory and modify it:
cp app/config/database.php app/config/local/database.php
To sum up and answer your question:
1) I guess it's OK to copy the whole project dir to the remote server (although, if you're copying vendor, it might take a lot of time - it usually contains a large number of files)
2) if you do so, remember to have the updated composer.json on the remote server (to reflect all the necessary requirements)
3) if you are using different database servers locally and remotely, you obviously have to run migrations and seeders on the remote server (this also concerns package migrations/seeds)
4) after you transfer all your files, run
composer dump-autoload
php artisan optimize --force --env=YourProductionEnvironmentName
which should rebuild the bootstrap/autoloaders
5) if you are using the Laravel environments setup mentioned above, remember to have your remote server detected as production (if your local one is testing/staging) - see the quick check below.
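A quick way to verify which environment a given machine resolves to is the artisan env command (available in Laravel 4); run it on both machines after deploying:
php artisan env    # prints the detected environment, e.g. "Current application environment: production"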
I am fairly new to Docker, so please bear with me if this is a basic question. I have a Laravel project on a server, and the project is dockerized. What I want to do is move a file from my project to another location on the same server that is not dockerized.
So, my project is set up in the /var/www/my-project directory and I want to copy a file from my-project/storage/app/public/file.csv to /var/{destination_folder}. How can I do that in Laravel? I think my issue is not related to Laravel; it is related to Docker, which is not allowing files to be moved out of it. Please don't add Laravel or PHP file-copy code snippets, I have tried plenty.
What I've tried:
1- I have tried copying the file using:
Storage::disk('local')->put('/var/{destination_folder}', 'my-project/storage/app/public/file.csv' )
but it does not copy the file.
2- I have also tried moving the file using a bash script that I execute from my Laravel controller using shell_exec or Process, but it is also not working.
cp "/var/www/my-project/storage/app/public/file.csv" "/var/destination_folder"
What's happening with this approach is that it works when I run the command from the terminal, but it does not work when I call it from my controller, and it gives me:
cp: cannot create regular file '/var/destination_folder/file.csv': No such file or directory
After googling the above error, it seemed that this is a permission issue, so I changed the permissions of the destination folder to 775. I also checked which user the Laravel app was running as, and it gave me root when I ran whoami from the app.
Let me know how this could be achieved, thank you!
The entire point of Docker is that the container is isolated from the base host. You cannot simply copy the file out, as the container does not have access to any part of the host's disk that is not mounted into it.
The easiest option is to create a destination directory on the host and set up a bind mount, as per https://docs.docker.com/storage/bind-mounts/
You would then use the following argument for your docker run:
--mount type=bind,source=/var/destination_folder,target=/some_directory_inside_your_docker
Copy the file to some_directory_inside_your_docker and it will appear in /var/destination_folder on the parent host.
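For example, a rough sketch (the image name and target path are placeholders, assuming you control how the container is started):
mkdir -p /var/destination_folder
docker run -d \
  --mount type=bind,source=/var/destination_folder,target=/host_export \
  your-laravel-image
# inside the container, /host_export now is the host's /var/destination_folder,
# so the cp from your controller becomes:
#   cp /var/www/my-project/storage/app/public/file.csv /host_export/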
Another option is to create a user account on the parent host, LOCK IT DOWN HARD for security reasons, and then put a private key inside your container that allows it to SSH files over to the parent host (note: this won't work with every network configuration). I don't think it's a good idea when you can use bind mounts, but it would work.
I am using localhost as my development server, on which I added Composer and installed the Stripe dependencies.
Since I have a shared hosting account, I want to transfer the Composer files to my server, since I don't have root access to run the Composer install there.
Which files do I move, and where do I put them? I have moved composer.json, composer.lock, and the vendor folder into my site's root directory with all of my site's files and folders.
However, when I go to run a Stripe command I get errors unless I include each file in the directory manually. Obviously, Composer is not set up correctly on the new server, since I have to include each file.
How can I get it so that Composer works properly on the new server?
First of all, you don't need root access to use Composer. Just download (or transfer from your local machine) composer.phar and use it on the server.
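For example (assuming your host gives you PHP on the command line; the paths are illustrative):
cd /path/to/your/project
curl -sS https://getcomposer.org/installer | php    # downloads composer.phar into the current directory
php composer.phar install                           # installs the dependencies listed in composer.json / composer.lock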
About copying: I advise copying the whole project directory as is (with all src, vendor, bin dirs and other files). It's in your interest to make the local and remote environments the same.
How do I upload my project to a server, migrate my database, and edit my project using GitHub or any other way?
I tried this way and it seems very clumsy:
I uploaded myproject.zip and extracted it.
Then I created the database and imported a backup of my localhost database.
Please suggest an easier way to do it.
Thanks.
Maybe git-ftp is something for you.
You can use git-ftp for script-based projects like PHP.
Most of the low-cost web hosting companies do not provide SSH or git support, but only FTP.
git-ftp provides an easy way to deploy Git-tracked projects. Instead of transferring the whole project, it only transfers the files that changed since the last deployment.
Even if you are playing with different branches, git-ftp knows which files are different. No ordinary FTP client can do that.
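A minimal first deployment could look like this (host, path and credentials are placeholders; check the git-ftp documentation for the options your host needs):
git config git-ftp.url "ftp://ftp.example.com/public_html"
git config git-ftp.user "ftp-username"
git config git-ftp.password "ftp-password"
git ftp init    # first run: uploads all tracked files
git ftp push    # later runs: uploads only the files changed since the last push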
First of all, create a repository on GitHub. Then initialize Git on your local computer: if you are on Linux (Ubuntu, to be precise), navigate to your project folder and run
git init
It will show some output about an initialized empty repository or something like that. Navigate to the .git folder and edit the config file with your favorite editor; the content looks like the following:
[core]
repositoryformatversion = 0
filemode = true
bare = false
logallrefupdates = true
[remote "origin"]
url = https://your-username#github.com/your-user-name/your-repository.git
fetch = +refs/heads/*:refs/remotes/origin/*
[branch "master"]
remote = origin
merge = refs/heads/master
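(If you prefer the command line over editing .git/config by hand, the same [remote "origin"] section can be created with
git remote add origin https://your-username@github.com/your-user-name/your-repository.git
run from the project root; the [branch "master"] section is filled in automatically if you later push with git push -u origin master.)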
After modifying this file, navigate back to your project root folder and run
git add .
git commit -m 'first commit'
git push origin master
Now log in to your server via PuTTY or whatever SSH client you use, navigate to your project root folder, and initialize Git with
git init
Navigate to the .git folder and edit the config file, pasting the same content you used on your local machine, then navigate back to your project root folder (on the server) and run
git pull origin master
You are in sync now. From now on, any time you make changes on your localhost, run the following commands from your project root folder:
git add .
git commit -m 'any custom message about these changes'
git push origin master
and from your server:
git pull origin master
Never make any changes directly on your server, in ANY CASE. You should add the files you want to ignore during upload and download to the .gitignore file under your project root on your localhost. E.g. to exclude a folder from being uploaded, add the following to your .gitignore file:
/foldername/*
I have a Vagrant box set up running my dev code, which is an nginx/PHP setup.
(Quick info on Vagrant - it's a VirtualBox wrapper: http://www.vagrantup.com/.)
In the Vagrant/VirtualBox setup, it uses the Linux guest additions to mount a shared folder from my host computer (Mac OS X).
linux guest path: /var/www/local
OSX host path: ~/src/
On multiple occasions I find that PHP can't seem to write anything, through any function (file_put_contents, fwrite, etc.), to any path on the mounted shared folder. However, it is able to write outside of /var/www/local (for example /var/www/not-mounted/..).
I find this very difficult to work with, as I am using a cache system and it keeps failing to write any of the cached JavaScript/CSS files to /var/www/local/public/root/cache/, which I need to be inside the root folder of my website (/var/www/local/public/root/index.php).
I have done a lot of research on this topic:
It seems the folder mount has the right permissions:
When I type the mount command in the Linux guest, I get this:
/var/www/local on /var/www/local/ type vboxsf (uid=1000,gid=1000,rw)
To clarify:
This happens all the time; it is a known problem I keep encountering and try to work around.
From cat /etc/passwd:
vagrant:x:1000:1000:vagrant,,,:/home/vagrant:/bin/bash
Can anyone help me on this?
I have figured out the problem.
I had forgotten to give PHP the correct user privileges and permissions to write to the folder. Basically, my PHP user/group was www-data/www-data; however, Vagrant has its own user/group (vagrant/vagrant), which is what the mounted /local/ folder belongs to.
Since I did not want to mess with my Vagrant mounting behaviour, I simply changed my PHP config to run PHP as the vagrant/vagrant user/group.
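For an nginx + php-fpm setup that usually comes down to editing the pool config and restarting the service; the path below is an assumption (on Debian/Ubuntu it is commonly /etc/php5/fpm/pool.d/www.conf):
# change the pool's user/group from www-data to vagrant
sudo sed -i 's/^user = www-data/user = vagrant/; s/^group = www-data/group = vagrant/' /etc/php5/fpm/pool.d/www.conf
sudo service php5-fpm restart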
This fixed the issue for me.
Thanks for the help!
I have a developer tool that modifies the local file system using PHP. These projects are often checked out from SVN. The issue I would like to avoid is the moving/deleting/renaming of version-controlled directories - and the breakage that occurs to a checked-out SVN working copy when doing those alterations locally.
Is there a simple way to apply the SVN commands (move, add, delete, rename, etc.) when I also perform the matching file system operations?
Or is it easier to just delete .svn dirs if found in move/rename targets?
To clarify this:
A checked-out SVN working copy has a structure of:
somedir/
--foo/
--cheese/
User 1 alters this file system structure (using the dev tool), renaming the 'foo' dir (which is under version control) to 'modified'. When they go to commit their changes, SVN will error because the rename was not done via SVN commands.
This could be running on a variety of development servers (usually desktop machines running Apache on Windows or Mac).
I don't want to ignore or disconnect from the SVN.
In your PHP file, you could run the svn commands via exec():
<?php
// $Current and $New hold the existing and new paths
$Output = array();   // exec() fills this with the command's output lines
$ExitCode = 0;
exec('svn mv '.escapeshellarg($Current).' '.escapeshellarg($New), $Output, $ExitCode);
$Output is the output of the command and $ExitCode will be 0 if the command was successful.
Keep in mind, however, that for security reasons the exec() function might be disabled on some systems. If you have control of your php.ini file, you can choose to enable or disable it.
Programmatically, you may be better off just removing the .svn folders. It is probably just one or two lines of code.
Another thing to try: why don't you export the module? That way you don't get .svn folders.
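For example (the repository URL and target directory are placeholders):
svn export http://svn.example.com/repo/trunk/module /path/to/clean-copy    # copies the tree without any .svn metadata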
Also note that you can script all svn commands and run them.
If I understood correctly, you have one version-controlled directory (slave) within another (master). Why not just add the slave directory to the master's svn:ignore list?
svn propset svn:ignore slave .
svn ci . -m 'Added ignore dir'
From now on, running svn up/add/delete in the master directory won't mess up the slave SVN, while the latter remains fully functional.