How can I automatically (maybe using a cron job?) push code from my FTP server (only a WordPress plugin, not the entire project) to GitHub, using PHP?
I know that you'd usually have a local environment, push that to GitHub, and then pull from GitHub to the FTP server, but in this scenario I want to change the files directly on the FTP server and then, every hour or half day, push to GitHub automatically. Is this even possible? It needs to commit, push and merge automatically.
We want to use this so we have the possibility to go back and look at earlier versions of the files. We've looked into the GitHub API, but we can't seem to find what we need. I hope the question makes sense.
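For reference, here is a rough sketch of the kind of script a cron job could run on the server — assuming git is installed there, the plugin directory is already a clone, and a remote named origin points at GitHub (e.g. via an SSH key or an access token); the path, remote and branch names below are placeholders:

```php
<?php
// Rough sketch, not a drop-in solution: stage, commit and push whatever has
// changed in the plugin directory. Assumes git is installed, the directory is
// already a clone, and "origin"/"master" exist; the path is hypothetical.

$pluginDir = '/var/www/html/wp-content/plugins/my-plugin';

chdir($pluginDir);
shell_exec('git add -A');

// Only commit and push if something actually changed.
$status = trim(shell_exec('git status --porcelain'));
if ($status !== '') {
    $message = 'Automatic snapshot ' . date('Y-m-d H:i');
    shell_exec('git commit -m ' . escapeshellarg($message));
    shell_exec('git push origin master');
}
```

A crontab entry along the lines of `0 * * * * php /path/to/auto-push.php` (path hypothetical) would run it hourly; note that if the GitHub side has diverged, the merge still has to be resolved by hand.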
I will try to describe what I'm looking for as best as I can.
I have a team of developers working on different sites all written with PHP.
For the example, they are now working on a site mysite.com
I want them to have something like a Git solution where master will be the staging area, but I will also be able to access the different branches and see them.
Let's give an example.
Dan is working on mysite.com
He has a local development environment which he forked from staging.mysite.com (the master).
Dan is creating a branch called dan-branch-1 and another one called dan-branch-2.
Both branches need to be available at the addresses staging.mysite.com/dan/dan-branch-1 and staging.mysite.com/dan/dan-branch-2
By "available" I mean a browsable website, so each branch will have its own copy of staging.mysite.com where my client will be able to see the changes we made for him.
After the changes are approved, Dan will merge the branches back to the master.
Finally at some point we will upload the master to the production server.
Is there such a solution that can help me with that?
Thanks !
After a lot of searching I found the exact solution to my issue.
It's called GitLab and it offers everything I looked for.
The interface seems a bit old fashioned and the learning curve seems steep, but it is actually a complete solution, so I'm going to learn it and try it.
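Whichever platform ends up hosting the repositories, the "browsable branch" part can be wired up separately. Here is a rough sketch, assuming git is installed on the staging server and a bare mirror of the site repository lives there (all paths below are made up):

```php
<?php
// Rough sketch: export every branch of a repository into its own browsable
// directory, e.g. staging.mysite.com/dan/dan-branch-1. Assumes git and tar
// are available and a bare mirror of the repo exists; paths are hypothetical.

$repo    = '/srv/repos/mysite.git';
$webRoot = '/var/www/staging.mysite.com';

$list = shell_exec("git --git-dir=$repo for-each-ref --format='%(refname:short)' refs/heads");

foreach (array_filter(array_map('trim', explode("\n", (string) $list))) as $branch) {
    $target = $webRoot . '/' . $branch;      // e.g. .../dan/dan-branch-1
    if (!is_dir($target)) {
        mkdir($target, 0755, true);
    }
    // Export the branch's files so the directory can be browsed as a site.
    shell_exec("git --git-dir=$repo archive " . escapeshellarg($branch)
             . ' | tar -x -C ' . escapeshellarg($target));
}
```

Run from a post-receive hook or a cron job, something like this would keep staging.mysite.com/dan/dan-branch-1 and friends in step with their branches; GitLab (or any other host) then only has to receive the pushes.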
NetBeans has a remote server option; you can try this.
Start a NetBeans PHP project with remote server access, or for an existing project right-click it, choose Properties, and under Run Configuration enter the remote server details: FTP, folder path, and so on.
If two developers are working on the same file they have to synchronize manually; otherwise the code is merged for you, so there is no need to spend time merging, and the application stays up to date at all times.
Thanks.
Good day to you all,
I am currently developing a project on Laravel. So far I have always developed online, editing my files directly on the web server through FTP (using PSPad or similar simple editing tools).
What I want to do now (and what I believe most people actually do) is set up a (W)LAMP stack on my local machine and program locally. However, it is a little unclear to me how to keep my local code (including databases) in sync with the live website. How do you folks do that? I know there are probably lots of ways and tools to do that, but what would be your advice for a best practice? Any advice would be very welcome :)
What many companies do is build offline, then push their edits up to a server using git.
I'm no expert on the software, so I'll describe what you do in a basic form:
My advice would be to create an online repo (repository) to store your project while you edit/update.
There are several Git project management systems, such as GitHub or Bitbucket. I personally use Bitbucket.
What Git does is: when you have built or added what you need offline on your local (W)LAMP stack, you git push it up to your repo or server. The changed files are then merged with what already exists in the repo or on the server. If you'd like the most recent version of your project, you simply git pull it down.
Read the full documentation here to see the wide range of options available when using git
We have a settings array within our platform available as $res::Config.
At runtime, a variable is switched from 'dev' to 'live' after checking the HTTP host, which of course depends on the IP address.
Within our framework bootstrapping, depending on the value of $res::Config->$env (the environment set previously as either dev or live), the settings for the database connection are chosen. You store these settings in the Config array as db_live or db_dev.
However you do it, use an environment variable to figure out whether you want live or dev, and set up an array of settings accordingly.
We also have sandbox and staging configurations for intermediate development stages.
As for version control, use git or subversion.
Edit: We can also set an environment variable to either live or dev within our vhost file, and have the application read from this accordingly. I'd suggest this approach :)
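For illustration, a minimal sketch of that pattern — the db_dev/db_live keys follow the description above, while the APP_ENV variable name, host names and credentials are placeholders:

```php
<?php
// Minimal sketch: choose 'dev' or 'live' database settings from an
// environment indicator. Host names, APP_ENV and credentials are placeholders.

$config = [
    'db_dev'  => ['host' => 'localhost',   'user' => 'dev',  'pass' => 'dev',    'name' => 'mysite_dev'],
    'db_live' => ['host' => 'db.internal', 'user' => 'live', 'pass' => 'secret', 'name' => 'mysite'],
];

// Prefer an environment variable set in the vhost (e.g. "SetEnv APP_ENV dev"),
// falling back to a check on the HTTP host.
$env = getenv('APP_ENV');
if ($env === false) {
    $env = (($_SERVER['HTTP_HOST'] ?? '') === 'mysite.com') ? 'live' : 'dev';
}

$db  = $config[$env === 'live' ? 'db_live' : 'db_dev'];
$pdo = new PDO(
    "mysql:host={$db['host']};dbname={$db['name']};charset=utf8mb4",
    $db['user'],
    $db['pass']
);
```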
There are a number of ways of doing this. But this is a deceptively HUGE question you've asked.
Here is some good practice advice - go and research these items, then have a look at my approach.
Typically you use a process called version control, which allows you to create "versions", or snapshots, of your system.
The commonly used SVN software is good, but the (not really any more) new kid on the block is Git, and I personally recommend that.
You can use this system to push the codebase live in a controlled fashion. While pushing files up is essentially similar to FTP, it allows you to deploy a specific version of your site live.
In environments where there are multiple developers, this is ideal - you can compare/test and work around each other, and version control tends to stop errors between devs.
So - advice part 1: Look up and understand version control, then use it to release CODE to the live environment.
Part 2: I use database dumps and farm them back to my machine to work with.
If the live database needs updating, I can work locally and simply export, then re-import on the live system.
For example: on a recent Moodle project I worked on, to refresh the whole database took seconds... I could push a patch and database update in a few minutes.
However: you should think about maintenance and scheduling... if the site is live and has ongoing data changes then you need to be careful with this. Consider adding a maintenance page.
Advice 2: go research SQL dump/export and importing.
I personally use phpmyadmin to dump and re-import, as it's very convenient.
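If you'd rather script it than click through phpMyAdmin, a rough sketch using mysqldump and mysql via exec() looks like this (credentials, database names and paths are placeholders):

```php
<?php
// Rough sketch: dump the local database to a file; the same file can then be
// imported on the live server. Assumes mysqldump/mysql are on the PATH;
// credentials, database names and file paths are placeholders.

$dumpFile = '/tmp/mysite-' . date('Ymd-His') . '.sql';

// Export locally.
exec('mysqldump -u devuser -pdevpass mysite_dev > ' . escapeshellarg($dumpFile));

// Import on the target system (run there, or copy the file across first):
// exec('mysql -u liveuser -plivepass mysite < ' . escapeshellarg($dumpFile));
```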
Advice 3: Working locally then pushing live is MUCH BETTER PRACTICE. You're starting down a much safer and better road than you're on!
Hope that helps... but bear in mind - this is a big subject, so you'll need to research a fair bit.
I am developing a PHP project using GitHub.
My editor of choice is Coda 2, which can save simultaneously to the local copy and to the FTP server.
Now I also need to commit the changes to Git, and therefore to GitHub as well, so every time I save (both locally and remotely) I commit to GitHub.
The problem arises now:
What if I need to revert changes? Those would only take effect on GitHub and would probably lead to a mess. What I am currently doing to "revert" is manually rewriting the pieces of code that need to be rolled back.
You are using Git wrong. There is no reason to commit on every save. It's also fine to keep your local Git repo for development and only sync with GitHub from time to time.
So all you need to do is to change your workflow. Remove the commit on save.
Don't upload manually using FTP; instead, clone the repository on the server and pull there.
Of course this is only possible if you have shell access to the server. If you want to do serious development you should have a server with shell access.
You misunderstand the purpose of Git. It is not a backup tool but a collaboration tool and version control system. It was never meant to back up your code. That is of course possible, but the nature of Git, and of any other VCS, is to provide a readable and trackable flow of the project's development.
You basically never commit code without explaining the changes you've made to the file(s). VCSs encourage you to describe the commit and its purpose so that the project's team can get into it and see why a certain change was made.
However, to make use of Git's capabilities, I would suggest making a backup branch and committing files there, so that you know that this specific branch is nothing more than a chaotic code flow. That at least will make things clear.
Then, when a certain feature is ready and tested, you can squash-merge backup branch into dev branch.
This way you will get an organized repo structure, and you'll be able to revert to, or pull from the dev branch, exactly the state of the code that you need.
EDIT:
I also suggest having a look at the "successful Git branching" blog post. In its time it helped me understand how a Git project evolves and develops.
I am looking at setting up a blog for all my plugins. All the plugins are hosted on GitHub.
Because each plugin already has all the information I would want to show in my blog, I want to try generating my blog pages (and the demo pages for the plugins) automatically. I know how to set up post-receive hooks to get a ping when the repo gets pushed to, and I could download the entire zip of that repo, but I am wondering if there is a more elegant way, and was looking at pulling just the changes and merging them. Is this possible with PHP?
Maybe Git.php can do what you need.
Have you Googled for a PHP+Git library? I did and found this: PHP Git, and this git-php.
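Either way, the "pull just the changes" part can be as small as a webhook endpoint that runs git pull inside an existing clone. A sketch, assuming the repository is already cloned on the server and git is installed (the path is hypothetical, and a real endpoint should also verify GitHub's webhook signature):

```php
<?php
// Sketch of a webhook endpoint: when GitHub pings it after a push, pull just
// the new commits into an existing clone instead of downloading the whole zip.
// The clone location is hypothetical; signature verification is omitted.

$repoDir = '/var/www/blog/plugins/my-plugin';

chdir($repoDir);
$output = shell_exec('git pull origin master 2>&1');

header('Content-Type: text/plain');
echo $output;
```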
Okay, I have been grasping at all sorts of solutions to my problems with questions like Recommended DVCS mechanism for hosting many independent patches and Using Mercurial patch queue repository on BitBucket for many users and patches, but hopefully this will be the last question I need to ask about how to establish source control for my project described at https://sourceforge.net/p/iotabuildit/wiki/Home/. Then I can accept some answer on my other questions and move on.
The requirements I am struggling to fulfill are:
Contributing source modifications (patches) must be very easy for the user. That means they should not have to download a version control client.
Any version of the code needs to be easily hosted online because it doesn't work in Chrome when run from a local set of files due to W3C security requirements (which IE apparently ignores, but Chrome honors).
Users should be able to register by themselves and automatically have permission to contribute patches that anyone else can play and review (like Little Big Planet, but more integrated - allowing changes to anything instead of just adding content).
The paths I have tried so far have failed for the following reasons:
Mercurial Patch Queue - after exploring this for a short while, I discovered it's far too complicated for an average player/user to get involved in. We're talking potential non-developers here.
Github & BitBucket - These repositories still appear to require a local DVCS client, and require additional steps to contribute patches. In the end they are still to difficult for an average user, especially someone who simply wants to play some user's particular version of the game. There appears to be no way to host the files online.
So now my proposed solution is as follows, and I want to see if I am missing something that could be better handled in some other way. I will create some PHP scripts on my SourceForge project that will:
Allow users to register their own email/account with a password so that only they can update their own "patches"
Allow them to clone the standard repository or any other user's repository.
Create the clone online in a directory accessible to the PHP script, but not to HTTP.
Allow them to submit a set of files to their working directory and commit the differences to their repository as a patch.
Allow patches to be pulled from other users' repositories in the same directory. (Pulling from and committing to your own repository doesn't require authentication, which was a stumbling block when I was considering the more formal online repository hosting that already exists.)
Have a separate PHP script that copies your workspace into a hosted directory where the game can be played online. Any players could visit this workspace to play your version of the game (which may include patches pulled from any number of other repositories).
It seems odd that I can't use existing repositories to do this, but I can't think of a way around the authentication problem. So I need to create my own clones that I know the PHP script will be able to access and commit to without pushing. Other DVCS clients will probably not be able to pull from these online clones, unfortunately, but there are probably ways to export patches if need be. And I don't know what I'm going to do when a merge conflict comes along. But this is the closest I've come to a workable solution so far.
So what I end up with is an online DVCS client, which avoids users having to download a DVCS client and avoids having to find a host for their version of the game. Am I overlooking a simpler solution? (Am I violating SourceForge's terms of service? I could host it on Dreamhost instead if I can get Mercurial installed there.)
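To make the proposal a little more concrete, here is a rough sketch of what the "commit the submitted files as a patch" step could look like, assuming the hg command-line client is installed and each user's clone lives outside the web root (the user name, paths and commit message are placeholders):

```php
<?php
// Rough sketch of the "commit the differences as a patch" step. Assumes the
// hg command-line client is installed and each user has a server-side clone
// under $cloneRoot, outside the web root. Names and paths are placeholders.

$cloneRoot = '/home/project/clones';   // not reachable over HTTP
$user      = 'some-user';              // the authenticated user
$workDir   = $cloneRoot . '/' . $user;

// The submitted files are assumed to have been copied into $workDir already
// (upload handling omitted); record the change in the user's repository.
chdir($workDir);
shell_exec('hg addremove');            // track added and removed files
shell_exec('hg commit -u ' . escapeshellarg($user)
         . ' -m ' . escapeshellarg('Patch submitted via web form'));
```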
A few comments/answers:
With a DVCS (the 'D' is for Distributed), there is no 'server' or 'client'.
If you want to access the code locally (on your workstation), you will need a DVCS (git or Mercurial), and you would clone an upstream repo (that is, a remote repo stored in GitHub or BitBucket).
From what I understand, each user would fork the main repo, creating one repo per user, still stored on the upstream server (GitHub or BitBucket), since that is the idea behind a fork (a clone on the remote side).
That would address 1., since each user is the owner of his/her own fork and has write access only there.
2. is a given (you can fork a repo)
The rest of the points won't be addressed by GitHub or BitBucket, but by a dedicated server, where you have a DVCS installed, and where you have added the relevant hook in order to automate what you want.
That dedicated server can monitor what is pushed on GitHub or BitBucket, for a given repo of a given user, do the required clone or update, and sync the repo to the right directories (accessible to the PHP script in your case, for instance).
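As a sketch of that dedicated-server piece — using git for the example, with the repository URL and paths made up — the handler behind such a hook could clone or update the user's fork and publish its working tree to a playable directory:

```php
<?php
// Sketch: on a push notification (e.g. a GitHub/BitBucket webhook), clone or
// update the user's fork and copy its working tree into a directory served
// over HTTP. The repository URL and all paths are illustrative assumptions.

$repoUrl  = 'https://github.com/some-user/iotabuildit.git';
$cloneDir = '/srv/builds/some-user';
$webDir   = '/var/www/play/some-user';

if (!is_dir($cloneDir)) {
    shell_exec('git clone ' . escapeshellarg($repoUrl) . ' ' . escapeshellarg($cloneDir));
} else {
    shell_exec('git -C ' . escapeshellarg($cloneDir) . ' pull');
}

// Publish the current working tree (minus .git) so the game is playable online.
shell_exec('rsync -a --delete --exclude=.git '
         . escapeshellarg($cloneDir . '/') . ' ' . escapeshellarg($webDir . '/'));
```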