I have a website with an FTP server at BigRock.com. My issue is that whenever there is a deployment, I have to manually search for and find all the files that need to be changed. The project is getting larger and larger, and making these kinds of changes takes a lot of my valuable time.
So, is there any software/tool available for syncing with the FTP server and updating files based on changes made locally? I am not sure about FileZilla Client, since I couldn't find many options in it. I am pretty sure there is some solution for this. My project is built with Zend Framework, Doctrine ORM, and many other libraries.
If you need one-way synchronization from local files to the server, you can check this free tool: http://sourceforge.net/p/phpfilesync/wiki/Home/
It could hardly be easier to install or use.
Try Allway Sync.
It uses innovative synchronization algorithms to synchronize your data between desktop PCs, laptops, USB drives, remote FTP/SFTP servers, various online data storage services, and more. Encrypted archives are supported. Allway Sync combines bulletproof reliability with an extremely easy-to-use interface.
URL: http://allwaysync.com/
I have tried it personally, and it works fine for FTP and local file sync. It is also free.
While working with Assembla, I found its FTP tool.
Here is the reference link:
http://blog.assembla.com/assemblablog/tabid/12618/bid/78103/Deploying-a-Web-site-is-easy-with-Assembla-s-FTP-tool-Video.aspx
It is easy to work with:
Add the FTP tool using the Tools section of your Admin tab.
Deploy code to any server with FTP or SFTP installed.
Set the deploy frequency to manual (you push a button) or schedule automatic deployments for every commit, hourly, daily, or weekly.
Only changed files are deployed for fast and accurate deployments.
Easily roll back and deploy prior revisions.
Add multiple servers for staging and production releases or simply to deploy different repository directories to different locations.
After many hours of reading documentation and messing around with Amazon Web Services, I am unable to figure out how to host a PHP page.
Currently I am using the S3 service for a basic website, but I know that this service does not support dynamic pages. I was able to use Elastic Beanstalk to get the sample PHP application running, but I really have no idea how to use it. I read up on some other services, but they don't seem to do what I want, or they are just way too confusing.
So what I want to be able to do is host a website with Amazon that has dynamic PHP pages. Is this possible, and which services would you use?
For a PHP app, you really have two choices in AWS.
Elastic Beanstalk is a service that takes your code and manages the runtime environment for you - once you've set it up, it's very easy to deploy, and you don't have to worry about managing servers - AWS does pretty much everything for you. You have less control over the environment, but if your app will run in EB then this is a pretty easy path.
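If you go the Elastic Beanstalk route, the EB command line tools reduce deployment to a couple of commands. A minimal sketch, assuming the awsebcli package is installed and your code lives in a git repository (the environment name is a placeholder):
eb init                 # one-time setup: pick region, app name, and the PHP platform
eb create my-php-env    # create an environment and deploy the current code
eb deploy               # redeploy after you change something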
EC2 is closer to conventional hosting. You need to decide how your servers are configured & deployed (what packages get installed, what version of linux, instance size, etc), your system architecture (do you have separate instances for cache or database, whether or not you need a load balancer, etc) and how you manage availability and scalability (multiple zones, multiple data centers, auto scaling rules, etc).
Now, those are all things that you can use - you don't have to. If you're just trying to learn about PHP in AWS, you can start with a single EC2 instance, deploy your code, and get it running in a few minutes without worrying about any of the stuff in the previous paragraph. Just create an instance from the Amazon Linux AMI, install Apache and PHP, open the appropriate ports in the firewall (AKA the EC2 security group), deploy your code, and you should be up and running.
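As a minimal sketch, that manual route on a fresh Amazon Linux instance might look like this (package and service names assume the yum-based Amazon Linux AMI; adjust them for other distributions, and the app path is a placeholder):
# install Apache and PHP
sudo yum install -y httpd php
# start Apache now and on every reboot
sudo service httpd start
sudo chkconfig httpd on
# deploy your code to the default document root
sudo cp -r /path/to/your/app/* /var/www/html/
# then open port 80 in the instance's EC2 security group via the AWS console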
Your PHP must run on EC2 machines.
Amazon provides great tools to make life easy (Beanstalk, ECS for Docker, ...), but in the end, you own EC2 machines.
There is no such thing as a place where you can drop your PHP code without worrying about anything else ;-(
If you are having problems hosting PHP sites on AWS, you can go with a service provider like Cloudways, which provides managed AWS servers with one-click installs of PHP frameworks and CMSes.
As the title says, my (small) business is provided with a free Rackspace Cloud client account. We receive a decent amount of traffic, but I haven't been able to put together a business case for moving to our own server yet. However, we are developing some complex apps, and I'm frustrated at not even being able to SSH into the remote server. Ultimately, I'd like to set up some sort of version control (at this point, I'll take anything, git or otherwise).
I have control over databases, can FTP, set up cron jobs, and perform a few other basic functions. I can't think of any way to set up git or something similar without SSH access. The thought crossed my mind that some sort of PHP-based version control might exist that I could set up, but I haven't had any luck finding it yet.
Do you guys have any ideas, thoughts, or advice?
There is no complete PHP git client out there, but you can just use FTP for deployment. Have a look at git-ftp: https://github.com/resmo/git-ftp
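A minimal git-ftp session looks roughly like this (the URL, user, and path are placeholders; see the project's README for the full option list):
# tell git-ftp where to deploy
git config git-ftp.url "ftp://ftp.example.com/public_html"
git config git-ftp.user "ftp-user"
git config git-ftp.password "secret"
# first run: upload all tracked files and record the deployed commit
git ftp init
# subsequent runs: upload only the files changed since the last push
git ftp push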
There's no need for your git repos (or whatever you want to use) to be on your Rackspace server. You can use one of the various hosting services such as GitHub or BitBucket, or you could just stick a cheap computer in the corner of your office and use that.
We are working on a project which is getting bigger and bigger.
Until now, our workflow looked like this:
I have a web server, and I code with Vim directly over SSH.
Another programmer sends me new files, and I add and integrate them (he has a copy of the project and a local server).
The designer sends me designs, and I integrate those too.
This is pretty much a waste of time, because every time they make a change, I have to integrate it.
Now we have a host which doesn't support SSH (so I cannot use Vim on it any more). How should we work on a project like this? How should I set up Vim to work on a remote project (I don't want to download and upload every file I want to change)?
You didn't mention your OS; knowing it would certainly be helpful if you want a precise answer.
The first thing would be to find a better host. You can rent very decent dedicated servers for as little as 40 or 50 € per month, or less. If your project is big and serious, 50, 100, or even 200 € per month is perfectly acceptable, and you can install/enable whatever you need. Depending on the size of your project, a VPS could be enough. Whatever the price, a web host without SSH access is worse than shit.
But you may not have any power in that area.
Since your server doesn't support SSH, a proper VCS is not an option. The only practical solutions I see are rather "old-school" but they work:
Solution A:
Download the whole site on your local machine with an FTP client.
Edit locally.
Test your changes with a local web server.
Upload the changed files when your tests are OK.
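If you prefer the command line to a GUI client, lftp can script both halves of Solution A; a sketch with placeholder host, credentials, and paths:
# download the whole site into ./local-copy
lftp -u user,password -e "mirror /public_html ./local-copy; quit" ftp.example.com
# ...edit and test locally...
# upload only the files that changed (-R reverses the direction: local -> remote)
lftp -u user,password -e "mirror -R --only-newer ./local-copy /public_html; quit" ftp.example.com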
Solution B:
Connect to your server with an FTP client.
Use its "Edit Locally" feature to open the files in Vim.
Write (save) your changes; the file is automatically re-uploaded to the server.
Solution C:
Use Vim's bundled netrw plugin: :e ftp://host/path/to/file. See :h netrw.
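netrw also works straight from the shell, so Solution C can be as simple as (host and path are placeholders):
vim ftp://user@ftp.example.com/path/to/file.php
Saving with :w transparently re-uploads the file over FTP, which is exactly the cycle noted below, just hidden from view.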
Note that the process will always be download -> edit -> save -> upload, whether you notice it or not. Depending on the solution you choose, the process can be horribly repetitive and inefficient, or almost completely invisible.
But, seriously, get another server and use a VCS and a local server.
I recommend using version control software like Git, with server-side hooks that automatically deploy changes to your server.
You can use a version control system like Git and do a git pull on the web server every time you have a stable version.
Your collaborator can push the new content, and you don't need to manage the files yourself.
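A minimal sketch of that hook-based setup, assuming you do have SSH access and a bare repository on the server (the path and branch name are placeholders): put the following in hooks/post-receive inside the bare repo and make it executable, and every git push to the server will update the web root automatically.
#!/bin/sh
# hooks/post-receive: check the newly pushed code out into the web root
GIT_WORK_TREE=/var/www/site git checkout -f master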
I've been working on a cloud-based (AWS EC2) PHP web application, and I'm struggling with one issue when it comes to working with multiple servers (all under an AWS Elastic Load Balancer). On one server, when I upload the latest files, they're instantly in production across the entire application. But this isn't true when using multiple servers: you have to upload files to each of them, every time you commit a change. This could work all right if you don't update anything very often, or if you just have one or two servers. But what if you update the system multiple times in one week, across ten servers?
What I'm looking for is a way to 'commit' changes from our dev or testing server and have them 'pushed' out to all of our production servers immediately. Ideally the update would be applied to only one server at a time (even though it just takes a second or two per server) so the ELB will not send traffic to it while files are changing, and no production traffic flowing through the ELB is disrupted.
What is the best way of doing this? One of my thoughts was to use SVN on the dev server, but that doesn't really 'push' to the servers. I'm looking for a process that takes just a few seconds to commit an update and subsequently begin applying it to the servers. Also, for those of you familiar with AWS, what's the best way to update an AMI with the latest code so the auto-scaler always launches new instances with the latest version of the software?
There have to be good ways of doing this... I can't really picture sites like Facebook, Google, Apple, Amazon, Twitter, etc. going through and updating hundreds or thousands of servers manually, one by one, when they make a change.
Thanks in advance for your help. I'm hoping we can find some solution to this problem; what must be at least 100 Google searches by both me and my business partner over the last day have proven mostly unsuccessful.
Alex
We use scalr.net to manage our web servers and load balancer instances. It has worked pretty well so far. We have a server farm for each of our environments (two production farms, staging, sandbox). We have preconfigured roles for web servers, so it's super easy to launch new instances and scale when needed. The web servers pull code from GitHub when they boot up.
We haven't completed all the deployment changes we want to do, but basically here's how we deploy new versions into our production environment:
We use phing to update the source code and run the deployment on each web server. We created a task that executes a git pull and runs database changes (the dbdeploy phing task). http://www.phing.info/trac/
We wrote a shell script that executes phing, and we added it to Scalr as a script. Scalr has a nice interface for managing scripts.
#!/bin/sh
cd /var/www
phing -f /var/www/build.xml -Denvironment=production deploy
Scalr has an option to execute scripts on all the instances in a specific farm, so for each release we just push to the master branch on GitHub and execute the Scalr script.
We want to create a GitHub hook that deploys automatically when we push to the master branch. Scalr has an API that can execute scripts, so it's possible.
Have a good look at KwateeSDCM. It enables you to deploy files and software to any number of servers and, if needed, to customize server-specific parameters along the way. There's a post about deploying a web application to multiple Tomcat instances, but it's language-agnostic and will work for PHP just as well, as long as you have SSH enabled on your AWS servers.
I have two computers that I use for development, one at home and one at the office. I use Aptana Studio 3 on both machines and would like to be able to easily work on a single project from both computers.
What are some easy ways to transport the project between computers? Right now I am just using a USB drive to transfer the files.
Also, I'm using a local Apache server on one computer and a local IIS server on the other.
I think you should use something like SVN, Git, Mercurial, and so on. To manage your project, I suggest one of the Tortoise clients (TortoiseSVN, TortoiseGit, or TortoiseHg).
You might try connecting remotely (via Remote Desktop Connection, for example) from home to the office computer; that way you only ever work on the office machine and there is no need for file transfers.
Alternatively, you can set up a source control server (using SVN, for example) and commit your projects to it. This way, you will be able to work on them from multiple locations.
If you have the opportunity to use a (virtual) server and remote desktop into it from both computers, you won't go back to any other solution. With a server-side language like PHP it's ideal, as you have your repository on your test web server (LAMP/WAMP) directly. That ensures ONE version for all your tools, plus easier and faster backups, synchronization, etc.
If I had no access to a virtual remote server, I would use GitHub as an alternative, for the sake of code-base security and decent sync times. But I'm no expert on GitHub.
Staging Environment
The best option in your case would be a separate staging machine accessible from both development machines. This can be another machine you control, one of the development boxes made available to the other, or an external host - you can get a dedicated virtual machine from cloud providers for as little as $10/month, or, if your project is simple enough, use a plain web host even more cost-effectively.
Source Code Manager
A source code manager is a good start - SVN is common and free. Git is another and can even be set up to do remote deployments. These tools will give you two benefits:
Shared code between all environments, always up to date
Protection against data loss and error - if something major breaks, revert to a working copy
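For two machines, the day-to-day workflow against a hosted remote (GitHub, Bitbucket, or your own server; 'origin' and 'master' below are just the usual defaults) is a handful of commands:
# at the office, when you finish a work session
git add -A
git commit -m "describe what changed"
git push origin master
# at home, before you start working
git pull origin master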
File Synchronization
Finally, a tool like Dropbox can synchronize your files across both systems as you make changes. Again, this one is free and can be installed on most operating systems.