We've been battling this problem for some time now, and can't seem to find a solution that satisfies all the requirements while making life easier for developers.
Right now we have the following setup:
Linux development server (as everything we produce runs on Linux, and it uses some Linux-specific libraries)
Windows desktops (as the office network is on Windows)
Every developer has a home folder on the dev server with a virtual host set up to run their code. This folder is shared using Samba.
Zend Studio IDE that is set up to use that location (as a network drive) to work on projects
Remote debugging to be able to run applications on the dev server and be able to step through the code
So the main problem we are having is that everything is slow...
Zend is slow to index the project, as it has quite a few files (including externals like the full framework) that need to be transferred over SMB.
Remote debugging is slow, as Zend Studio needs to fetch the file, then send it back to the server to run it (running "Local if available, else server"; otherwise breakpoints don't work)
TortoiseSVN is slow to get file status for a commit (the command line remedies the problem, but it's much less user-friendly, especially for more complicated things like conflict resolution while merging)
Moving to any solution that involves multiple server configurations raises the problem that configurations may drift apart, which introduces an additional layer of uncertainty and possibly bugs in production.
Development and debugging under Windows is not possible because of Linux dependencies in the code (like POSIX functions).
So how do organizations solve these problems? What kind of setup are you using? What kinds of problems are you facing, and how do you resolve them?
One solution that works in some situations is to:
Have the code on your local disk, on the physical computer running Windows
This code is the one you're modifying with your IDE
So the IDE works as fast as possible: no SMB access for each file.
Also have the code on the Linux server
So Apache runs fast: the code is present on the server
Use some kind of synchronisation mechanism to push every modification made to a file on the Windows machine to the Linux server, via the SMB share.
Using Eclipse, the FileSync plugin does a good job, over the SMB share.
WinSCP can also be used to keep a remote and local folder synchronized, over an SSH connection
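If you'd rather script that push yourself, a rough sketch using rsync over SSH works too (run from Cygwin or any Unix-like shell on the Windows box; the user, host and paths below are hypothetical):
# One-shot push of the local working copy to the dev server over SSH,
# mirroring deletions but leaving Subversion metadata alone
rsync -avz --delete --exclude='.svn' /cygdrive/c/projects/myapp/ devuser@dev-server:/home/devuser/myapp/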
Advantages:
All local operations are fast
All server operations are fast
Drawbacks:
You must always use the tool that ensures synchronisation (For instance, with FileSync, everything must be done in Eclipse -- and nothing in any other software)
Note: for SVN, no need to use Tortoise: there are plugins that integrate into Eclipse (Subversive, for example)
Not sure about debugging
Modifications done directly on the Linux machine might not (depending on the solution) get synchronized to the windows desktop.
Still, the best (fastest and most powerful) solution is generally to use only one computer -- that would run Linux, in your case, and not Windows.
Your tools will most likely work under Linux
If needed, you can install Windows in a virtual machine, for software that doesn't run on Linux
It'll encourage everyone on your team to know Linux better, which is always useful when your production environment is not Windows ;-)
I am currently working alongside a project team for the development of a website and we are using SmartFtp for file sharing.
Does anyone know how to compile/edit PHP files through SmartFtp? I.e. using Apache for compiling and Atom for editing.
Please note: I have already tried copying the files into the htdocs folder within XAMPP but had no luck. The PHP files did not successfully copy into the htdocs folder.
Thanks again
Fair warning...
This is a terrible way to host a project. Each developer should have their own isolated project installation. Even though you only have a few developers, it's only a matter of time before you get a collision and somebody loses work.
Using FTP is also a terrible idea. It is completely insecure.
That said, you've got a couple options:
If your dev server is a *nix flavor, you can probably use SSHFS to mount the remote directory on your local machine. This will allow you to edit the remote files live, as if they were any other regular local file. This is secure and relatively easy to set up (see the sketch after these options), but you may find it a bit too slow for anything but small projects.
Use FTPS/SFTP/SCP to push files to the remote server when you save them locally. I'm not terribly familiar with Atom, but many IDEs (like NetBeans and PHPStorm) can be configured to automatically FTPS/SFTP/SCP push changed files to remote servers. Just save locally as you normally would, and in the background, the IDE will perform an FTPS/SFTP/SCP push. Do not use FTP. If your server has FTP configured, it probably also has SFTP and/or FTPS configured.
Create your own development environment. Host your own site on your own machine so that you don't collide. You can run Apache/Nginx/PHP/MySQL directly on your machine, in a virtual box, or even a docker container. This is the best and most flexible option, but also requires the most effort to get running.
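For the SSHFS option mentioned first, a minimal sketch of what the mount looks like from a Linux client (the user, host and paths are hypothetical; the sshfs package must be installed locally):
# Mount the remote project directory over SSH so it behaves like a local folder
sshfs devuser@dev-server:/var/www/project ~/project-remote -o reconnect
# ...edit the files under ~/project-remote with Atom or any local editor...
# Unmount when finished
fusermount -u ~/project-remote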
This should be a comment, but it's a bit long.
I am really confused by your question.
how to compile/edit php files
PHP uses a run-time compiler. Are you talking about Roadsend or HipHop or something else?
development of a website and we are using SmartFtp for file sharing
Presumably you don't give a damn about your code integrity, managing conflicting code changes, version control or the security of your development environment. It's 2017. FTP was way past its sell-by date before the turn of the millennium.
compile/edit php files through SmartFtp
It's an FTP client. Just one of many things you don't use to compile or edit files (others include an avocado, a tennis shoe, scissors, a sunset...).
Presumably you are using this client to connect to a server - which you've told us nothing about. You probably want to do the collaborative bit of your code management (if that is what you are asking) on the server.
(from comments)
Development server with multiple people pushing edits ad hoc while they code, through an FTP server.
That's not a "development server" that's a recipe for code armageddon.
What I have so far:
Two VirtualBox LAMP machines (separate locations) where I connect my two Windows development machines via SFTP, to write code using PHPStorm.
One VPS machine where I deploy my code written for a project in Laravel.
What I am trying to achieve:
Fast and easy code deployment, as in: write the code in Windows via PHPStorm, test it on the LAMP machine, deploy to VPS if necessary.
The problem is that I need to use some php artisan commands on the LAMP machines to get some code generated. This means that I always have to synchronize PHPStorm with the LAMP file tree in order to see the changes. Then I need to also sync the other dev (LAMP) machine and the other PHPStorm running on Windows machine number 2. I know that this can be done via Git.
So every time I use the command line to generate code, I need to sync 4 machines (excluding the deployment server).
Later, if I add another pair of Windows/LAMP dev machines, the complexity grows.
Back in the days of Dreamweaver, I could write code directly on the deployment server. Not the greatest idea, but it was much simpler and faster, and that's what I need now.
Any ideas on how can I simplify this?
Switching to WAMP so I can have files in sync with PHPStorm (because everything is local) is not OK because... Windows and PHP library issues :)
Also, switching to Dreamweaver is not OK either.
What other options do I have?
Thanks!
Later edit: on the side, I am also wondering whether a NAS could be helpful for this type of problem.
Even later edit: is Linux Desktop + PHPStorm the only straightforward solution?
No matter the protocols
You can put your code outside of the guest machine and configure the guest machine to mount the code (mount a folder from the host inside the guest).
Apache will run slower because it will use the mounted remote folder, but PHPStorm will run at its maximum speed.
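A rough sketch of that setup with a VirtualBox shared folder (the VM name, share name and paths are hypothetical, and the Guest Additions must be installed in the guest):
# On the Windows host: expose C:\projects\myapp to the guest under the share name "code"
VBoxManage sharedfolder add "lamp-vm" --name code --hostpath "C:\projects\myapp"
# Inside the Linux guest: mount the share where Apache expects the code
sudo mount -t vboxsf -o uid=www-data,gid=www-data code /var/www/myapp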
Taking protocols into consideration
Try using SSHFS on Windows.
I use SSHFS for remote development from my Linux host machine to headless vagrant boxes (and/or to remote development servers / staging servers).
It's much faster than Samba (the Windows SMB protocol) and, oddly, faster than NFS even though SSH uses encryption.
Colleagues using Windows+SMB often leave their computers for 30 minutes while PHPStorm is indexing, and git branch changes on the dev machine go unnoticed for minutes at a time.
Indexing over SSHFS usually takes less than 5 minutes on a Symfony2 project. Branch changes are detected in less than 15 seconds.
Using Linux (shameless plug)
Linux is nice, and it's free, and it works out of the box (Ubuntu) -- including pesky USB modems which would normally require an install on Windows.
You already know how to handle a Linux CLI, so the learning curve is already halfway crossed.
Auto-updates don't rule your life, they're not the king of you!
All the applications you need are part of the software repositories; you don't need to look for anything, download 40+ executables and attempt to install them just to be welcomed by an error: "invalid architecture", "windows version not supported", ".NET framework version too old", "DirectX version too new (wtf?)", "your cousin is a software pirate".
Dependency management is a concept Linux never fully solved -- but at least they bloody tried, and in 90% of consumer use-cases it fits the bill. Windows is still eating glue at the back of the class.
How I solved the problem:
I have an extra Mac, on which I installed everything for my PHP ecosystem, including the IDE, so everything is local. That's the dev machine. Then I manually copy the code to the VPS when needed. Another solution was to install Ubuntu Desktop (or similar) on dual boot with Windows and use it as a local dev environment.
Much faster development / deployment :)
I recently launched a service, meaning I can no longer work directly on the site, or at least not without risk.
I haven't been able to find any "standard" or "best" way to make a development server. The two things I have seen are
a) Using Git or SVN to host the data (this doesn't quite solve my problem; I need to be able to develop somewhere, preferably not my home computer)
b) Capistrano (for Rails, is there something for PHP?)
The current solution I'm looking at is putting a complete copy of the server on "development.domain.com", which would then allow me to work on everything, and I can simply copy the files over to the main section.
Is this a workable solution? What's the optimal solution? (Separate server, special tools, etc.)
EDIT
This system will be developed by a number of developers. The server settings have been tweaked considerably to allow for the full functionality and security of the system. Having the development on my own computer is not a workable solution, nor is an intranet type of system, as none of our programmers are in the same location.
I'm looking for an on-a-server solution.
Three suggestions:
1) You are on the right track with making sure that your source code is in some form of source control (git and svn are both excellent choices). This should be priority #1.
2) Have EVERYTHING that has deviated from a standard configuration be in some form of source control. This means your Apache configs, your php.ini, database configs, etc. Then when you set up your staging and dev servers you can be (relatively) assured that everything is the same across all of your servers; otherwise you are just guessing.
3) Look into some sort of build scripts, whether Ant, Phing, or anything else that you can use to reliably build your environment from scratch on any machine.
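Such a build script doesn't have to be fancy; even a plain shell script that rebuilds a box from your version-controlled configs goes a long way. A hedged sketch (every URL, path and database name here is hypothetical):
#!/bin/sh
# Rebuild a dev/staging environment from scratch
set -e
svn checkout https://svn.example.com/myapp/trunk /var/www/myapp
cp /var/www/myapp/config/apache-vhost.conf /etc/apache2/sites-available/myapp.conf
a2ensite myapp.conf
mysql -u root -p myapp < /var/www/myapp/db/schema.sql
apache2ctl graceful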
There are tons of other things you can do, but if you implement these three you'll be well on your way to being able to easily set up a dev/staging server. This will also give you the added benefit of ensuring that all of your developers have a similar environment when doing their development.
http://www.wampserver.com/ for Windows
or
www.mamp.info for Mac
or
load up a VM
Personally, I do my programming on a Mac, running VMware with SUSE or Red Hat for the server test environment. I've used MAMP in the past and it works well, but sometimes I like to work in a real operating system.
That, or set up a physical test server. PHP / (choice of DB) nowadays runs on anything (Mac, Windows, Linux).
Depending on how you want to do it, you could install VMware right on the production server and dev in there; that is, if you run the server yourself. If you're colocated or on shared hosting, you probably can't do that.
-Mario
Your development, staging, and production environments should be exactly the same, otherwise you risk the chance of something bombing as you move between environments. Obviously your development environment will have development settings (e.g. PHP's display_errors on, possibly a remote debugger, etc.) but otherwise they should be as identical as possible.
As everyone else has mentioned, if you aren't using version control you are asking for trouble. Not only is it a good practice for development, it also eases deployment between your different environments. This is especially true when there are multiple developers working on a project.
I started to use a git repository as the starting point. Main development is done on my local Mac. At a certain point I push the changes and pull them on a development server where further testing is done. If everything is fine, I pull them on the production server. This is more or less the way Capistrano works, I think. I wrote a central script for these tasks, so I can update the development or production servers with a single command.
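A minimal sketch of what such a central script could look like, assuming a git-based setup like the one described (the hosts, paths and branch are hypothetical):
#!/bin/sh
# Usage: ./update.sh dev   or   ./update.sh production
set -e
case "$1" in
  dev)        HOST="deploy@dev.example.com"  DIR="/var/www/myapp" ;;
  production) HOST="deploy@www.example.com"  DIR="/var/www/myapp" ;;
  *) echo "usage: $0 dev|production" >&2; exit 1 ;;
esac
git push origin master
ssh "$HOST" "cd $DIR && git pull origin master"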
When developing PHP applications, it's best to have a server you develop/test on, and then a live server where you put everything once it's ready.
OK, but how?
If you are hosting through a hosting company, how can you set up your own development server to test on that mimics all the LAMP settings of your live server? Because if they differ, testing on one that isn't identical to the live one defeats the purpose, right?
Is it better to use another server through the same hosting company and ask them to make both the development and live ones have the exact same settings?
Also, what is the best workflow to use to check files out from the "live server", work on them on the "development server", then check them back into the live server?
Thanks!!
Two points from my daily work:
XAMPP is your one-stop shop for setting up an Apache/MySQL/PHP stack on Windows. I develop with it and deploy to Linux machines, no problem.
If you want to set up a Linux environment on a home server or virtual machine, I asked a question a while ago that may interest you: Pre-installed Linux for Web Developers?
Is it better to use another server through the same hosting company and ask them to make both the development and live ones have the exact same settings?
If you can afford a second server, it may well be the best way to go. On the other hand, you can upgrade and fiddle around with a local machine at will, and all that at a fraction of the long-term cost of a second rented server. If in doubt, I would go for a local machine.
Don't forget PHP is a very portable language. If you don't use any specific command-line tools or totally exotic extensions, making a PHP application work across Linux distributions, and even on Windows, is a question of some settings and details, but not really a big deal any more.
Also, what is the best workflow to use to check files out from the "live server", work on them on the "development server", then check them back into the live server?
There are many, many opinions and practices in this field. For me personally, the following workflow has turned out to be ideal wherever I've used it - I am still in the process of implementing this myself in all projects and for all clients.
Edit files locally in IDE
Upload to development server via built-in FTP function of IDE
Test on development server
Once a feature is tested and works on the development server (i.e. it's "finished"), check the whole package into a Subversion (or other) repository
On the live server, have a build script check out the latest revision from the repository into a directory named after the revision number and, when finished, switch a symbolic link from the previous revision to the new one.
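A hedged sketch of that last step (revision checkout plus symlink switch), assuming Subversion and a hypothetical repository URL and paths:
#!/bin/sh
# Check out the latest revision into its own directory, then flip the symlink
set -e
REPO="https://svn.example.com/myapp/trunk"
REV=$(svn info "$REPO" | awk '/^Revision:/ {print $2}')
svn export -r "$REV" "$REPO" "/var/www/releases/r$REV"
ln -sfn "/var/www/releases/r$REV" /var/www/current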
That way, every change you make to the live environment is logged in the version control system, and reverting to the previous revision is a question of seconds. To me, this was a huge relief compared to working with pure FTP everywhere.
Another possibly interesting question: Setting up a deployment / build / CI cycle for PHP projects
You can check all of the production server's settings via phpinfo() and copy them to your development environment; there's no need for them to be with the same provider.
I usually commit the code to source control, and checkout in the production environment, hiding all the repository information via .htaccess, for example, see here.
Another (less recommended) option is to just keep your master source on the development machine and, once it's ready, FTP it up; there are various free tools that will only upload changed files.
As for the server side, you have multiple possibilities. You could use vhosts if you have Apache, with two different DocumentRoots: one for the live version and one for development.
Or you can have the development environment on your local machine, and the live (+ staging) one on your dedicated server or whatever webspace you have.
In our current project we have a three-tier system:
development, staging and live. Staging and live are really almost the same, so that I can eliminate any problems when rolling out from dev to staging. It gives me another security layer before rolling out to live and eventually noticing that something went wrong.
Considering the workflow for rolling out, you should create an application config where you can define several application environments (development and production) that automatically choose their environment based on URLs, defined environment variables or something else. In Zend Framework, for example, this configuration-driven behavior is built into your applications. In your config.ini file, you have a template which looks like this:
[production]
[staging : production]
[testing : production]
[development : production]
In there you can define different options for, let's say, your database connection.
So when you check your changes on the dev machine into Subversion and do a rollout onto your live system, you do not have to change the configuration. It should just work.
As far as workflow goes, that's typically what happens for small sites. Depending on the size of the project, though, it might be a good idea to use version control like Git or Subversion.
You don't need to go anywhere near as far as asking a hosting company to set up two identical hosting environments for you. The majority of the time they have up-to-date versions of PHP, MySQL and Apache. I develop on a Linux machine that has a LAMP stack set up already, so my workflow is pretty seamless, and I use SVN with post-commit hooks to upload to the live server.
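For reference, such a post-commit hook is just an executable script in the repository's hooks/ directory; a hedged sketch (the host and target path are hypothetical):
#!/bin/sh
# hooks/post-commit: Subversion passes in the repository path and the new revision
REPOS="$1"
REV="$2"
# Bring the live server's checkout up to the revision that was just committed
ssh deploy@live.example.com "svn update -r $REV /var/www/myapp" >> /var/log/svn-deploy.log 2>&1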
If you are worried about incompatibilities between your 'dev' server and the hosting server, the easiest thing to do is make a phpinfo file,
<?php phpinfo(); ?>
and check that your hosting server does not prohibit any special functions you use on your dev server (it's pretty rare that a hosting company blocks vital things, and if they do you can easily send support an email and 99% of the time they will assist you in enabling whatever particulars you require). As far as setting up your dev environment goes, I would go down the track of grabbing VirtualBox and installing Ubuntu, find a tutorial for making Ubuntu a web server (seriously, only a few apt-get commands) and you will be smoking with gas!
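To give an idea of how little is involved, here is a rough sketch of those apt-get commands (package names shift a bit between Ubuntu releases, so treat this as an approximation):
sudo apt-get update
# Apache, PHP with the Apache module, the MySQL extension, and the MySQL server itself
sudo apt-get install apache2 php libapache2-mod-php php-mysql mysql-server
sudo a2enmod rewrite
sudo service apache2 restart
# Drop a phpinfo file into /var/www/html to verify the stack is up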
We have various PHP projects developed on Windows (XAMPP) that need to be deployed to a mix of Linux/Windows servers.
We've used Capistrano in the past to deploy from Windows to the Linux servers, but recent changes in architecture and the Windows servers left the old config not working. The recipe works fine for the Linux deployment, but setting up the Windows servers has required more time than we have right now. Ideas for the Capistrano recipe are valid answers. Obviously the Windows/Linux servers don't share users, so this complicates it a tad (given Capistrano's assumption of the same username/password everywhere).
Currently we're using svn update for the Windows servers, which I dislike, since it leaves all the .svn files hanging around on the production servers (and we still have to run svn update manually on Windows), plus manual updating of files using WinSCP and syncing the directories with their Linux counterparts.
My question is, what tools/setup do you suggest to automate this deployment scenario:
"Various PHP Windows/Linux developers deploying to 2+ mixed Windows/Linux machines"
(PS: we have no problems using Linux tools or anything working through Cygwin, we simply need to make deployment a simple one-step operation)
Edit: Currently we can't work in an all-Linux environment; we have to deploy to both Linux and Windows servers. We can start the deploy from anywhere, but we'd prefer to be able to do it from either environment.
I use 4 different approaches depending on the client environment:
Capistrano and similar tools (effective, but complex)
rsync from + to Windows, Linux, Mac (simple, doesn't enforce discipline)
svn from + to Windows, Linux, Mac (simple, doesn't enforce discipline)
On-server scripts (run through the browser, complex)
There are some requirements that drive what you need:
How much discipline you want to enforce
If you need database (or configuration) migrations (up and/or down)
If you want a static "we're down" page
Who can do the update
Configuration differences between servers
I strongly suggest enforcing enough discipline to save you from yourself: deploy to a development server, allow for upward migrations and simple database restore, and limit who can update the live server to a small number of responsible admins (where the dev server is open to more developers). Also consider pushing via a cron job (to the development server), so there's a daily snapshot of your incremental changes.
Most of the time, I find that either svn or rsync setups are enough, with a few server-side scripts, especially when the admin set is limited to a few developers.
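To illustrate the rsync flavour, a hedged sketch of a one-step push plus the daily snapshot mentioned above (hosts and paths are hypothetical):
# Push the working tree to a Linux target over SSH, skipping VCS metadata
rsync -avz --delete --exclude='.svn' --exclude='.git' ./ deploy@live.example.com:/var/www/myapp/
# Crontab entry on a developer box: nightly snapshot to the development server at 02:00
# 0 2 * * * rsync -az --delete --exclude='.svn' /home/dev/myapp/ deploy@dev.example.com:/var/www/myapp/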
This will probably sound silly but... I used to have this kind of problem all the time until I decided in the end that if I'm always deploying on Linux, I really ought to at least try developing on Linux also. I did. It was pain-free. I never went back.
Now, I am not suggesting this is for everyone. But if you install VirtualBox you can run a Linux install as a local server on your Windows box. Share a folder in the virtual machine and you can use all your known and trusted Windows software and techniques, and have the peace of mind of knowing that everything is working well on its target platform.
Plus you'll be able to go back to Capistrano (a fine choice) for deployment.
Best of all, if you thought you knew Linux / Unix, wait until you use it every day on your desktop! Who knows, you may even like it :)
Capistrano is the nicest deployment tool I've seen. Do the architecture changes make it impossible to fix the configs so it works again?
Why can't you use Capistrano anymore?
Why do you dislike svn update?
What things in your app require a special deployment?
You can set the svn:ignore property on configuration files, so that svn update doesn't erase them, and then use svn export /target/path/ to get a clean copy without the .svn directories.
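In shell terms, that combination looks roughly like this (the file and path names are hypothetical):
# Keep the local configuration file out of version control so updates leave it alone
svn propset svn:ignore 'config.local.php' .
svn commit -m "Ignore local configuration overrides" .
# Produce a clean tree with no .svn directories from an existing working copy
svn export /path/to/working-copy /target/path/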