How do I upload my project to a server, migrate my database, and edit my project using GitHub or any other way?
I tried this way and it seems very clumsy:
I uploaded myproject.zip and extracted it.
Then I created a database and imported a backup of my localhost database.
Please suggest an easier way to do it.
Thanks.
Maybe git-ftp is something for you.
You can use git-ftp for script-based projects like PHP.
Most of the low-cost web hosting companies do not provide SSH or git support, but only FTP.
Git-ftp provides an easy way to deploy git tracked projects. Instead of transferring the whole project, it only transfers the files that changed since the last time.
Even if you are playing with different branches, git-ftp knows which files are different. No ordinary FTP client can do that.
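For reference, a typical git-ftp workflow looks roughly like this (host, user, and path are placeholders; run it from your project root):
git ftp init -u ftp-user -p - ftp://ftp.example.com/public_html   # first deployment, uploads everything
git ftp push -u ftp-user -p - ftp://ftp.example.com/public_html   # later deployments, uploads only the changed files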
First of all, create a repository on GitHub. Then initialize Git on your local computer: if you are on Linux (Ubuntu, to be precise), navigate to your project folder and run
git init
It will print a message saying it initialized an empty repository. Navigate to the .git folder and edit the config file with your favorite editor; the content should look like the following:
[core]
    repositoryformatversion = 0
    filemode = true
    bare = false
    logallrefupdates = true
[remote "origin"]
    url = https://your-username@github.com/your-user-name/your-repository.git
    fetch = +refs/heads/*:refs/remotes/origin/*
[branch "master"]
    remote = origin
    merge = refs/heads/master
After modifying this file, navigate back to your project root folder and run
git add .
git commit -m 'first commit'
git push origin master
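As a side note, instead of editing .git/config by hand you can get the same result with plain Git commands (the URL is the same placeholder as above):
git remote add origin https://your-username@github.com/your-user-name/your-repository.git
git push -u origin master   # -u records origin/master as the upstream for master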
Now log in to your server via PuTTY or any other SSH client you use, navigate to your project root folder, and initialize Git with
git init
Navigate to the .git folder and edit the config file, pasting the same content you used on your local machine. Then navigate back to your project root folder (on the server) and run
git pull origin master
You are in sync now. From now on, any time you make changes on your localhost you should run the following commands from your project root folder:
git add .
git commit -m 'any custom message about these changes'
git push origin master
and from your server
git pull origin master
Never make any changes directly on the server, in ANY case. You should add the files you want to ignore during upload and download to the .gitignore file under your project root on your localhost. For example, to exclude a folder, add the following to your .gitignore file:
/foldername/*
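A small .gitignore might then look like this (the folder and file names are only examples):
# user-uploaded files (example folder name)
/uploads/*
# local configuration (example file name)
config.local.php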
I am fairly new to Docker, so please bear with me if this is a basic question. I have a Laravel project on a server, and the project is dockerized. What I want to do is move a file from my project to another location on the same server that is not dockerized.
My project is set up in the /var/www/my-project directory, and I want to copy a file from my-project/storage/app/public/file.csv to /var/{destination_folder}. How can I do that in Laravel? I think my issue is not related to Laravel; it is related to Docker, which is not allowing files to be moved out of the container. Please don't add Laravel or PHP file-copy code snippets, I have tried plenty.
What I've tried:
1- I tried copying the file using:
Storage::disk('local')->put('/var/{destination_folder}', 'my-project/storage/app/public/file.csv' )
but, it does not copy the file.
2- I have also tried moving the file using a bash script, which I execute from my Laravel controller using shell_exec or Process, but it is also not working:
cp "/var/www/my-project/storage/app/public/file.csv" "/var/destination_folder"
What's happening with this solution is that the command works when I run it from the terminal, but it does not work when I call it from my controller, and it gives me
cp: cannot create regular file '/var/destination_folder/file.csv': No such file or directory
After googling the above error it seemed that this is a permission issue, so I changed the permission of the destination folder to 775. I also checked which user the Laravel app runs as, and whoami from the app returned root.
Let me know how this could be achieved, thank you!
The entire point of Docker is that it is isolated from the base host. You cannot simply copy the file out, as the container does not have access to any host path that is not mounted.
The easiest option is to create a destination directory and create a bind mount as per https://docs.docker.com/storage/bind-mounts/
You would then use the following argument for your docker run:
--mount type=bind,source=/var/destination_folder,target=/some_directory_inside_your_docker
Copy the file to /some_directory_inside_your_docker and it will appear in the parent host.
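A complete run command might look roughly like this (the image name and container path are placeholders; the mount syntax is the one shown above):
docker run -d \
  --mount type=bind,source=/var/destination_folder,target=/shared \
  my-laravel-image
Anything the application writes to /shared inside the container then shows up in /var/destination_folder on the host.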
Another option is to create a user account on the parent host, LOCK IT DOWN HARD for security reasons, and then put a private key inside your container that allows it to SSH to the parent host (note: this won't work with every network configuration). I don't think it's a good idea when you can do bind mounts, but it would work.
I am using localhost as my development server, on which I added Composer and installed the Stripe dependencies.
Since I have a shared hosting account, I want to transfer the Composer files to my server, since I don't have root access to run the Composer install there.
Which files do I move, and where do I put them? I have moved composer.json, composer.lock, and the vendor folder into my site's root directory with all of my site's files and folders.
However, when I run a Stripe command I get errors unless I include each file in the directory by hand. Obviously, Composer is not set up correctly on the new server since I have to include each file.
How can I get it so that Composer works properly on the new server?
First of all, you don't need root access to use Composer. Just download (or transfer from your local machine) composer.phar and use it on the server.
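A minimal sketch of that, assuming you can run commands on the server (the project path is an example):
cd /path/to/your/project                          # example project path on the server
curl -sS https://getcomposer.org/installer | php  # downloads composer.phar into the current directory
php composer.phar install --no-dev                # installs the dependencies listed in composer.lock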
About copying: I advise copying the whole project directory as is (with all src, vendor, bin dirs and other files). It's in your interest to keep the local and remote environments the same.
I am having a really strange problem. I'll try to keep it simple.
I am developing a web app using Laravel that involves allowing users to upload and download files. I have created a function that allows a user to upload a file. This works grand: I can log in to my site from another computer, see that the file was uploaded, and download it. So it seems to be working fine.
However, the file that I have uploaded is not appearing in the folder it is supposed to be in on GitHub. This is really strange, because the file definitely is there: I can download it from another computer.
Code for uploading the file.
$destinationPath = public_path().'/files/';
$file->move($destinationPath, $file->getClientOriginalName());
I cannot see why the file I have uploaded isn't appearing on GitHub, seeing as it has definitely been uploaded.
Any ideas?
The files that you are looking for are located on your server. Adding or removing files from the server does not add or remove them from your git repository. In order to add them to your git repository, you would need to run the git commands to do so. The commands depend on what exactly you want to add, but as a general example:
To add all files
git add -A
Followed with a commit
git commit -m "example commit"
They will not appear on GitHub until you push
git push origin branch-name
For Git you have to manually commit the files.
Go to your command line and run
git status
If it shows public/ files then you would need to manually add them.
git add public
If they're not showing there, they're either being ignored or they've been committed and possibly not pushed.
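To check the second case (committed but not pushed), you can list the local commits that the remote branch does not have yet, assuming the branch is master:
git log --oneline origin/master..master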
First, if a .git folder has already been created inside your project folder, delete it. Then run this series of commands:
git config --global user.name "your name"
git config --global user.email "your email"
git init
git add .
git commit -m "Initial commit"
git status
git remote add origin # paste here ssh link from github repository
git push origin master
Have a look at your .gitignore file. It might be the case that you instruct Git there to ignore the file.
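A quick way to verify that is git check-ignore, which prints the rule that matches a given path (the path below is just an example based on the question):
git check-ignore -v public/files/example-upload.jpg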
I have a Laravel app on Cloudway DigitalOcean, and my app is in /public_html. I want to update my app using Git, so I have created the folder private_html/git, where I pull my edited project from Bitbucket. Now I want to check it out to my public_html/. How do I do that? Thank you.
You can either deploy the changes to public_html via Git or manually copy the files across.
Manually copy option
Depending on full path, something like:
cp -a private_html/git/. public_html/
Note: -a is a recursive option that preserves file attributes
This won't remove any files that have been deleted in private_html/git, so you would have to remove those manually, or clear everything out before copying the files across (see the rsync note below).
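If you want deleted files to be cleaned up as well, rsync with --delete is a common alternative to cp (paths as above); like the Git option below, it removes anything in public_html that is not in the source, so back up first:
rsync -a --delete private_html/git/ public_html/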
Git pull option
First, make sure you have pushed all the changes made in private_html/git to your remote (bitbucket repo).
Set up the current copy in public_html/ as a git repo.
In public_html/
git init
Then add the bit bucket remote
git remote add origin git@bitbucket.org:user-name/repo-name
Note: Get the proper bitbucket remote from your bitbucket account
Then pull the changes from the remote
git fetch --all
git reset --hard origin/master
Warning: You'll lose any differences that are currently in public_html, so be careful. It's always a good idea to back everything up before these kinds of changes, so I'd suggest archiving the code in public_html before overwriting it.
I use Laravel 4 to develop my projects.
I develop on my Mac, commit to Git, then clone it on the server (linode 1G VPS).
Since "vendor" folder is by default "GIT-ignored", I usually do "composer install" after cloning the project on the server.
After that, any other packages I install locally, I do "composer update" on the server.
Yesterday, I reported this problem - PHP Composer update "cannot allocate memory" error (using Laravel 4)
So far, I have not found a solution. I even tried doing a "fresh" clone and "composer install", and it's still giving me the memory error. This is extremely frustrating.
My question is then: is it OK to just upload my entire project to the server? Since the "vendor" folder is the only thing that is git-ignored, if I just copy everything there, would it work? (I haven't tried it since my server is live at the moment and I don't want to damage anything.)
What is the actual role of "compiled.php" file? Is it platform dependent? Can I copy that file too?
I've seen this memory issue quite a few times now and have read other people reporting similar problems. I hope I can just upload the entire project folder and cross my fingers that it will work.
Thanks for your help!
I do not have a VPS, or even shell access to my custom/shared hosting from my provider, but I can run git and composer commands without that.
Use sshfs http://osxfuse.github.io/
sshfs opens an SFTP connection to your server and mounts the server to a local directory.
This way, you can run git and composer commands locally; you do not depend on your VPS/hosting server. sshfs sends the files to the remote server in the background.
To mount the VPS to a local dir, run this:
sshfs user@serverip:. /path/to/existing/local/dir            # mount your account's root (home) dir
cd !$                                                        # change into the mounted dir
# or
sshfs user@serverip:foldername /path/to/existing/local/dir   # mount a specific dir
cd !$                                                        # change into the mounted dir
Now you can do whatever you want.
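When you are done, unmount it again (umount works for osxfuse mounts on macOS; on Linux you would use fusermount -u instead):
umount /path/to/existing/local/dir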
A good thing for you to know: it is possible to set up the Laravel config in such a way that the same app (the very same copy of the code) can act differently on different servers (environments).
I am writing that because, if you sync your remote server with a local copy of the code, sooner or later you will stumble upon issues like having to change the DB credentials or app setup after every sync, which of course doesn't make sense :)
Check out the Laravel 4 docs on environment configuration to read more about that, or follow this tutorial by Andrew Elkins - How to set Laravel 4 Environments.
The environment is based on url matches.
You’ll find that configuration in /bootstrap/start.php
$env = $app->detectEnvironment(array(
    'local' => array('your-machine-name'),
));
Now say you are developing locally and use the prefix/postfix local. E.g: my-new-site.local or local.my-new-site
$env = $app->detectEnvironment(array(
    'local' => array('local.*','*.local'),
));
That sets the environment, now to use it you’ll need to create a local folder in /app/config/
mkdir app/config/local
And say you want to have a different database configuration for local: just copy the database config file into the local directory and modify it.
cp app/config/database.php app/config/local/database.php
To sum up and answer your question:
1) I guess it's OK to copy the whole project dir to the remote server (although, if you're copying vendor, it might take a lot of time, since it usually contains a large number of files)
2) if you do so, remember to have the updated composer.json on remote server (to reflect all the necessary requirements)
3) If you are using different database servers locally and remotely, you obviously have to run migrations and seeders on the remote server (this also concerns package migrations/seeds); see the sketch after this list
4) after you migrate all your files, do
composer dump-autoload
php artisan optimize --force --env=YourProductionEnvironmentName
which should rebuild the bootstrap/autoloaders
5) if you are using the Laravel Environments setup mentioned above, remember to have your remote server seen as production (if your local is testing/staging).
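For point 3, running the migrations and seeders on the remote server is just the usual artisan calls (the environment name is the same placeholder as above):
php artisan migrate --env=YourProductionEnvironmentName
php artisan db:seed --env=YourProductionEnvironmentName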