Okay, I'll explain my issue briefly:
I have a website, let's say a board where users can chat and upload avatars, images, files, etc.
I want to use git with this website.
So here's what I did:
created a repo on Bitbucket
cloned this repo locally
added a .gitignore file (see below)
copied my website into my local repo
made a first commit
pushed everything to Bitbucket
Bitbucket now has my website, but without my ignored files and folders
My .gitignore looks like this (e.g.):
uploades/user/avatars
cache/
logs/
Okay, next step:
I log in via SSH to my webserver
clone the Bitbucket repo
Now my website is online without any user files (uploades, images, files, etc.)
So if a user now uploads something into the git-ignored folders, I want those files to remain.
When I make local PHP changes, push them to Bitbucket again, and then do a "git pull" on my webserver, all user files are deleted.
What am I missing? What is the best procedure?
SSH to the server and manually create the uploades folder. You don't want the uploades folder to be in git. I assume it doesn't contain any code, only user content, so just leave it on the server.
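If you also want git to recreate the empty folder structure on a fresh clone, a common trick (just a sketch; the .gitkeep name is only a convention, not a git feature) is to ignore the folder contents rather than the folder itself, and keep one placeholder file tracked:

# .gitignore: ignore the user content, but keep the placeholder files
uploades/user/avatars/*
!uploades/user/avatars/.gitkeep
cache/*
!cache/.gitkeep
logs/*
!logs/.gitkeep

git then never touches anything inside those folders except the .gitkeep files, so user uploads survive a git pull.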
This is how I push my local repo to a remote live server: I create a bare repo on the server and push the local repo to that remote repo (on the local machine, add it as the origin remote). Once the remote repo (which is outside my public directory) is up to date, I go to my public directory (public_html) and git clone from that bare repo. Works like a charm.
First time setup: send local repo to remote server
On remote server create bare repo:
git init --bare myproject.git
On my local machine:
git remote add origin ssh://user@ip:sshport/git_path_of_bare_repo/myproject.git
Send local changes to remote bare repo
git push origin master
On the remote server, login into your public directory (public_html, or /var/www/, or whatever your document root is):
git clone /path/to/your/bare/repo/above/myproject.git
Inside your public directory, create whatever folders you need (uploads, logs, tmp, whatever). Further git pulls will not delete these folders.
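For the original question that would look something like this (folder names taken from its .gitignore; the chown user is just an example, adjust it to whatever your web server runs as):

cd /var/www/public_html          # or whatever your document root is
mkdir -p uploades/user/avatars cache logs
# make them writable for PHP uploads; the web server user varies by setup
chown -R www-data:www-data uploades cache logs

Because those paths are ignored, later git pulls won't touch them.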
That's it!
Sending further updates from local to remote
From your local machine just type:
git push origin master
Now your local changes are on your remote repo. On your server go to your public directory (step#4 above), and pull updates:
git pull
Works for me perfectly.
My PHP app is stored in Cloud9 and I deployed it to Heroku from the Cloud9 Git terminal.
Now there is a .json file in the app files that stores a value, which comes from a value the user enters in a TextField in the app's user interface.
The question is: whenever the user changes the value in the TextField and saves the changes (in the source code, saving corresponds to writing the changes to the .json file), will someone have to commit (using $ git commit) and push (using $ git push heroku master) these changes from Cloud9 to Heroku? Or will that happen automatically?
Your Heroku repo is simply a remote repo as far as cloud9 is concerned. Type:
git remote -v
in cloud9 to see your remotes.
Files from your cloud9 repo only get deployed to Heroku when you do a git push.
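So if that .json file changes on Cloud9 and you want Heroku to see the new value, something does have to commit and push it; a minimal sketch from the Cloud9 terminal (settings.json is a hypothetical file name, use your actual file):

git add settings.json
git commit -m "Update value from TextField"
git push heroku master

Nothing in this setup does it automatically; the Heroku remote only receives what you push.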
I have a big old project that has a .env.php file with configs for many clients.
Each client (in my case, a client is a repository) has an env file with its own specific configuration.
I have a centralized repository that pushes to all the other 'client projects'. I have a remote in git called "main" and a lot of other remotes named after the clients.
So I cannot put the env file inside the centralized repository; I need one env file for each client (each remote in git).
On Heroku, each client is an app. And of course, the remote git address (of the Heroku app) is the address for each client.
I am now trying to migrate to Heroku, and I am faced with this question about the many .env.php files that I have to put in each client.
What would be the best way to send these files to Heroku?
A typical approach is to write a script that pulls from the repos you need: the master one and a client-specific one.
The script copies all the files into a directory and initializes a throwaway git repository on your local file system, adding all the combined files.
Finally the script pushes the throwaway repo to heroku.
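A rough sketch of what such a script could look like (the repo URLs, client names, and Heroku app naming are all placeholders to substitute with your own):

#!/usr/bin/env bash
# Sketch: combine the main repo with one client's repo and push the result to that client's Heroku app.
set -euo pipefail

CLIENT="$1"                          # e.g. ./deploy.sh acme
BUILD_DIR="$(mktemp -d)"
DEPLOY_DIR="$BUILD_DIR/deploy"

# 1. Pull the repos you need: the shared main repo and the client-specific repo (which holds .env.php etc.)
git clone --depth 1 git@bitbucket.org:me/main-project.git "$BUILD_DIR/main"
git clone --depth 1 "git@bitbucket.org:me/client-$CLIENT.git" "$BUILD_DIR/client"

# 2. Copy all the files into one directory; client files are copied last so they can add/override .env.php
mkdir -p "$DEPLOY_DIR"
cp -R "$BUILD_DIR/main/." "$DEPLOY_DIR/"
cp -R "$BUILD_DIR/client/." "$DEPLOY_DIR/"
rm -rf "$DEPLOY_DIR/.git"            # drop the copied git metadata

# 3. Initialize a throwaway repo with the combined files and push it to that client's Heroku app
cd "$DEPLOY_DIR"
git init
git add .
git commit -m "Deploy $CLIENT"
git push --force "https://git.heroku.com/$CLIENT.git" master

rm -rf "$BUILD_DIR"

Each client's app then ends up with the shared code plus its own .env.php, without the env files ever living in the centralized repository.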
This is my first website (website itself is all done). I'm trying to upload my website files to my Openshift PHP 5.4 domain so when I click my OpenShift domain url, my website appears (pretty simple, right?). My Openshift account is set up. I've connected to it with FileZilla, and set up a private (or public) key. I've installed Ruby & Git. I followed everything here, and am stuck on this step:
Web Console
If you create an application from the web console, you’ll need to tell Git to clone the repository. Find the Git URL from the application page, and then run:
C:\> git clone <git_url> <directory to create>
I don't know what the "git_url" is supposed to be. Nor do I know what the "directory to create" is supposed to be. I don't know what OpenShift directory to put my website files in (when I connect with FileZilla) so that when I open my OpenShift domain url, I see my website (see below).
Again, my goal is to see my website when I open my OpenShift PHP 5.4 url. Where do I go from here?
Your problem is that you don't know what git is about; I recommend that you go read about it: https://git-scm.com/book/en/v1/Getting-Started
If you have your site ready and you've created an OpenShift app, do the following:
1) Grab the git url (you can find it by browsing to https://openshift.redhat.com/app/console/applications, clicking your app, and then copying the long address on the right, under "Source Code"); it should look something like this: ssh://afa231av@app-domain.rhcloud.com/~/git/app.git/
2) Open up a terminal (or use some git ui tool) and clone (download) your app with
git clone ssh://afa231av@app-domain.rhcloud.com/~/git/app.git
3) You should now have a folder named app; use the file explorer to go inside it and paste in all your website files
4) Go back to the terminal and do
cd app (or whatever name your app has)
git add .
git commit -m "Add my website"
git push origin master
Wait for it to finish, and if there are no errors, you're all done.
git_url: Go to OpenShift Web Console -> Application -> find and click your application; on the right there is a Source Code panel with something like ssh://***.rhcloud.com/~/git/php.git/, and this will be your git_url
directory_to_create: just the name of the directory that will be created on your local file system to contain your git repo
When you clone this repo, copy your files into it, then commit and push to the branch named 'master', and it will be automatically deployed on the server.
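Putting those two together, the clone command from the guide would look something like this (the *** part is whatever your own Source Code URL shows; mysite is just an example directory name):

git clone ssh://***.rhcloud.com/~/git/php.git mysite
cd mysite      # this is the "directory to create"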
Is there any way I can upload some new files into my Bitbucket repo without installing git on my server?
I am trying to use PHP (on a shared hosting Linux server) to upload files (git add them) into a shared repo.
A friend suggested that I use this, but it also needs git installed on the server:
git remote add origin git#bitbucket.org:your_account/your_repo.git
First of all, you need to add a new remote that points at Bitbucket (you already have this command):
git remote add origin git#bitbucket.org:your_account/your_repo.git
Then you need to push all your branches (you named the Bitbucket remote origin, so):
git push --all origin
But if you only want to push some of them, you need to push them individually:
git push origin master
I have a PHP app on OpenShift with MySQL and manage it through git. I can easily add photos to my local clone and push them up to git and it works fine. But when a client uploads a photo through my site to the OpenShift server, that uploaded photo (via PHP $_FILES) doesn't get pushed into the git repo, and when I pull the git repo to my local machine I can't find that uploaded photo. Any workaround?
I'm assuming that those photos are being stored in your $OPENSHIFT_DATA_DIR, which by default isn't tracked in your git repo. I would suggest setting up an rsync cron job that will sync any new files from your $OPENSHIFT_DATA_DIR to your local machine.
Something like:
rsync -raz --progress openshift_ssh_information:~/app-root/data/ /var/www
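And a sketch of the cron side (the SSH alias and local path are placeholders; this would run on your local machine or any box that can reach the gear over SSH):

# crontab -e on the machine that should receive the backups
# every hour, pull anything new from the gear's data dir
0 * * * * rsync -raz openshift_ssh_information:~/app-root/data/ /home/me/backups/app-data/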