I'm hoping someone has encountered this problem before and will be able to help.
So, the problem goes like this: there are two developers, and each of them needs two versions of a website (development + production). Due to very different operating systems, the difficulty of maintaining three configurations, and administrative restrictions on the machines, it's impossible to run the projects locally. But running them remotely brings a bunch of problems:
How to run Symfony2 CLI commands in NetBeans (since those commands need access to the DB)?
How to differentiate between production and development? I have allowed external access to the DB for two IP addresses, but there's only one parameters.ini file. At one time it's used to connect to localhost (when the site is run over HTTP), and at another to connect via the remote host (when a CLI command is run in NetBeans).
You may ask for more info, so here it goes:
Dev #1: Kubuntu 12.04 x64 3.2.0-25-generic NB#7.1.2
Dev #2: Windows 7 x64 NB#7.1.2
Server: Ubuntu Server 12.04 x64
SQL: PostgreSQL
webserver: Apache2
Our workflow now looks (or at least should look) something like this:
Dev #1 does something locally, but on each run the changes are moved to the server so he can check how it went on beta1.sitename.com.
Each change is committed to a user branch in git when suitable.
After testing, changes are merged to master, tested, pulled to sitename.com, and then tested again.
After this the cycle repeats :)
PS. A valid answer would also be a hint at how a proper workflow in this kind of situation should look. I've already tried post-* git hooks, and that didn't really work well either...
First of all I would highly recommend trying to solve the development machine issue. There is Vagrant, which can be used together with Chef or Puppet (or a specialized virtual base machine) to move the development environment into a virtual machine executed on the developer's PC. This would also solve many issues regarding the remote server.
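A minimal Vagrantfile sketch of that idea (the box name and provisioner choice are assumptions; use whatever matches your server):

    # Vagrantfile -- sketch only
    Vagrant.configure("2") do |config|
      config.vm.box = "hashicorp/precise64"            # assumed box for Ubuntu 12.04
      config.vm.network "forwarded_port", guest: 80, host: 8080
      config.vm.synced_folder ".", "/var/www/project"  # edit on the host, serve from the guest
      config.vm.provision "puppet"                     # or "chef_solo"
    end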
If this cannot be done, here are some thoughts:
NetBeans commands won't work remotely. Have your developers ssh into the machine and execute their commands there.
I don't get your development/production environment problem. There should be at least one virtual host with different config/cache/logs for each developer so the configs can be set correctly. The parameters.ini should not be in your git repository (you can handle this by creating a parameters.ini.dist and ignoring the parameters.ini file), so you can have different parameters per machine.
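A sketch of that arrangement (values are placeholders; pdo_pgsql assumed since you're on PostgreSQL):

    # .gitignore
    app/config/parameters.ini

    ; app/config/parameters.ini.dist -- committed template, copied to parameters.ini on each machine
    [parameters]
        database_driver   = pdo_pgsql
        database_host     = localhost
        database_name     = sitename
        database_user     = dev1
        database_password = secret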
Another interesting thing (presented at Symfony Live 2012 Paris) is that you can do SetEnv SYMFONY__PARAMETER__NAME inside your Apache vhost and then use %parameter.name% inside your config files (mind the double underscores, which become dots). This could be useful in your case.
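For example, each vhost could point Symfony2 at a different database host (a sketch; names are placeholders, and note this only affects requests served through Apache, not CLI runs):

    <VirtualHost *:80>
        ServerName beta1.sitename.com
        DocumentRoot /var/www/dev1/web
        # exposed to Symfony2 as the %database.host% parameter
        SetEnv SYMFONY__DATABASE__HOST localhost
    </VirtualHost>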
There really is no workflow I know of which could handle multiple developers on one machine with different configs and the like. It's just a mess, and you either solve your problems with complex scripts which run every time something happens, or by finding a better solution like virtual machines or different vhosts with different directories on your server.
I've successfully deployed the Laravel application to Heroku.
It works online.
But when I try to run "heroku local" I get:
vendor/bin/heroku-php-apache2: No such file or directory
Which makes sense, since looking into "vendor/bin", the only thing listed is:
psysh -> ../psy/psysh/bin/psysh
So, where's my heroku-php-apache2, or how do I fix this?
You should have these lines in your composer.json:
    "require-dev": {
        "heroku/heroku-buildpack-php": "*"
    }
Be sure to run composer update after you add them.
After extensive research, trial and error and talking to the Heroku Support team, I found out that, although Slow Loris's answer was a part of the process, the following answer was given to me by Heroku's Support:
To cut a long story short, heroku local is not officially supported for PHP applications. The reason is that unlike all the other languages we support on the platform, PHP has no web servers written in userland. Instead, we use PHP-FPM together with Apache or Nginx, and the boot scripts (vendor/bin/heroku-(php|hhvm)-(apache2|nginx)) dynamically inject the correct configuration for port binding and the FastCGI comms sockets.
This works with vanilla PHP and Apache builds, provided that:
1) the current user has all the correct permissions (in your case, /var/log/apache2/ isn't writable);
2) the correct proxy modules are loaded in the main httpd.conf;
3) the main httpd.conf doesn't bind to a port at all, or at least not to one under 1024 (which are reserved for superusers).
The main config also needs to be handled by each user on their own, because sometimes the modules to be loaded are in libexec/, sometimes in lib/apache2/modules/, and so forth. Just too many variations; otherwise, we could ship a full Apache config to users and the experience would be much better.
But the problems don't end there. FPM does not work at all on Windows, and on most Linux systems, httpd is not a command that works; instead, apache2ctl handles starting and stopping, and thus running a server in the foreground is not possible. In the end, there are simply too many possible permutations in system configs that make it impossible to ensure every user has a great experience.
It's simply the current reality in PHP land. Ruby, Python, Node, Java all have web servers that are written in each respective language, and you don't need external servers. Which also makes it possible to stream file uploads, handle web socket upgrades, and so forth. Maybe with PHP 7 we'll see something like that emerge soon (in PHP 5 it's simply not feasible at all, because a fatal error kills the engine, so your web server would be gone too).
I know this question is a little dated, but I recently deployed a Heroku app for the first time and was unable to get heroku local to work for me. I'm on the current version of Laravel, which is 5.8, and I am on Windows 10 using VS Code. I searched all over trying to rectify this issue and could not get it to work no matter what.
I did come up with a solution to be able to work on this locally with only a few lines in the terminal.
In VS Code, I used the Git Bash terminal. Once in my Heroku project folder, run composer require laravel/homestead --dev. Once that is complete, install Homestead with vendor/bin/homestead make, and then simply run vagrant up and your app will be accessible through localhost:8000.
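Condensed, the whole thing is three commands (assuming Vagrant and VirtualBox are already installed):

    # from the project root, in Git Bash
    composer require laravel/homestead --dev   # add Homestead as a dev dependency
    vendor/bin/homestead make                  # generate Homestead.yaml and a Vagrantfile
    vagrant up                                 # boot the VM; the app is served on localhost:8000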
Docs - https://laravel.com/docs/5.8/homestead
Hope this helps someone!
What I have so far:
Two VirtualBox LAMP machines (in separate locations), to which I connect my two Windows development machines via SFTP to write code using PHPStorm.
One VPS machine where I deploy my code written for a project in Laravel.
What I am trying to achieve:
Fast and easy code deployment, as in: write the code in Windows via PHPStorm, test it on the LAMP machine, deploy to VPS if necessary.
The problem is that I need to use some php artisan commands on the LAMP machines to get some code generated. This means that I always have to synchronize PHPStorm with the LAMP file tree in order to see the changes. Then I need to also sync the other dev (LAMP) machine and the other PHPStorm running on Windows machine number 2. I know that this can be done via Git.
So every time I use the command line to generate code, I need to sync 4 machines (excluding the deployment server).
Later, if I add another pair of Windows/LAMP dev machines, the complexity grows.
Back in the days of Dreamweaver, I could write code directly on the deployment server. Not the greatest idea, but it was much simpler and faster, and that's what I need now.
Any ideas on how can I simplify this?
Switching to WAMP so I can have files in sync with PHPStorm (because everything is local) is not OK because... Windows and PHP library issues :)
Also, switching to Dreamweaver is not OK either.
What other options do I have?
Thanks!
LE: on the side, I am also wondering whether a NAS could be helpful for this type of problem.
LLE: is Linux Desktop + PHPStorm the only straightforward solution?
No matter the protocols
You can put your code outside of the guest machine and configure the guest machine to mount the code (mount a folder from the host inside the guest).
Apache will run slower because it will use the mounted remote folder, but PHPStorm will run at its maximum speed.
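With VirtualBox, for instance, the guest can mount a shared folder from the host (share name and path are placeholders; requires the Guest Additions):

    # inside the guest, after adding "project" as a VirtualBox shared folder
    sudo mkdir -p /var/www/project
    sudo mount -t vboxsf project /var/www/project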
Taking protocols into consideration
Try using SSHFS on Windows.
I use SSHFS for remote development from my Linux host machine to headless vagrant boxes (and/or to remote development servers / staging servers).
It's much faster than Samba (the Windows SMB protocol) and, oddly, faster than NFS even though SSH uses encryption.
Colleagues using Windows+SMB often leave their computers for 30 min while PHPStorm is indexing, and git branch changes on the dev machine go unnoticed for minutes at a time.
Indexing over SSHFS usually takes less than 5 min on a Symfony2 project. Branch changes are detected in less than 15 s.
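A typical mount, on the Linux side, looks like this (host and paths are placeholders):

    # mount the remote project locally over SSH
    sshfs dev@devserver:/var/www/project ~/project -o reconnect
    # work on ~/project in PHPStorm; unmount when done
    fusermount -u ~/project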
Using Linux (shameless plug)
Linux is nice, and it's free, and it works out of the box (Ubuntu) -- including pesky USB modems which would normally require an install on Windows.
You already know how to handle a Linux CLI, so the learning curve is already halfway crossed.
Auto-updates don't rule your life, they're not the king of you!
All the applications you need are part of the software repositories; you don't need to look for anything, download 40+ executables and attempt to install them just to be welcomed by an error: "invalid architecture", "windows version not supported", ".NET framework version too old", "DirectX version too new (wtf?)", "your cousin is a software pirate".
Dependency management is a concept Linux never fully solved -- but at least they bloody tried, and in 90% of consumer use-cases it fits the bill. Windows is still eating glue at the back of the class.
How I solved the problem:
I have an extra Mac, on which I installed everything for my PHP ecosystem, including the IDE, so everything is local. That's the dev machine. Then I manually copy the code to the VPS when needed. Another solution was to install Ubuntu Desktop (or similar) on dual boot with Windows and use it as a local dev environment.
Much faster development / deployment :)
I started trying to create a website which uses PHP on an old computer (previously used by another programmer).
I wanted to test my PHP code without uploading it each time, so I downloaded Apache and installed it. I was starting to set Apache up when I discovered this computer already had Apache on it.
Now I had multiple versions of Apache, so I went into add/remove programs and got rid of Apache (which only showed up once in the list).
Unfortunately Windows decided it would uninstall the old version and keep mine, which was not functioning properly. Also, for whatever reason, it seemed to have kept a good few files from the old version, but not enough that I could actually use it in any way. I believe it just left some configuration files.
I thought I would copy my files from the new version into the old version and not replace anything so hopefully I would be able to run under the older configurations, but that didn't work.
At this point I just wanted to cut my losses, so I put all the versions of Apache in an archive so there was no way the computer could be using them. I also removed Apache from the Windows startup and rebooted the computer after configuring one single version of the newer copy of Apache to supposedly run PHP.
The problem is that upon startup I could immediately load localhost, and Apache was already running. Also, when I opened Apache manually from the files I had left unzipped, it only gives me the option to start Apache (no option to stop or restart, implying it is not running), and when I click it, it says "The requested operation has failed!", which is less than helpful.
So anyway, I just want to be able to run PHP locally and now I don't feel like I can even successfully uninstall and start from scratch anymore. Does anyone know what I have to do to get this to work? Sorry for the long description, I wove such a tangled knot.
One way to solve this is to use an AMP (Apache + MySQL + PHP) stack like XAMPP (http://www.apachefriends.org/en/xampp.html), which comes all set up for you to use. The conflict can also be caused by an IDE (NetBeans, for example) that ships its own copy of Apache.
Try adjusting the Apache config files, if they still exist.
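Since Apache is running again right after a reboot, the old copy is most likely still registered as a Windows service. A sketch of how to find and remove it (the service name "Apache2.4" is a guess -- use whatever the query reports):

    :: in an elevated command prompt: list services and look for Apache
    sc query state= all | findstr /i apache
    :: stop and deregister the old service
    httpd.exe -k stop -n "Apache2.4"
    httpd.exe -k uninstall -n "Apache2.4"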
Assumption: running Windows for development and Linux for the production environment.
My recommendation is to not mix the Windows and Linux worlds: while they can be made to behave after lots of work, it is usually more pain than good.
Instead, as a humble Windows and Linux user, download and install VirtualBox [https://www.virtualbox.org/wiki/Downloads], a free open source virtualisation tool.
Then download a Linux distribution of your choice and install it into a new virtual machine.
Configure the Linux tools inside Linux and leave your Windows machine relatively untouched.
A useful Linux service to install would be Samba (Windows file sharing): you can use this to edit your code in Windows using any IDE of your choice, while saving directly to Linux and testing through Linux. When happy, upload from the Linux system (again, like any other file upload) and all will be well.
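A minimal share definition might look like this (path and user are placeholders):

    # appended to /etc/samba/smb.conf -- expose the code directory to Windows
    [code]
        path = /var/www/project
        read only = no
        valid users = dev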
If you are deploying to a Linux-based environment for your production service, then this will help you avoid common mistakes such as case-sensitivity trouble and many others.
Building and running this system is free, and it will help teach you more about the Linux environment you are deploying to as well.
Equally, when you don't have the virtual machine booted, there are no services lying around exposing your computer to possible local network threats and consuming resources -- as opposed to installing Apache on Windows, where it will be using some resources all the time.
We've been battling this problem for some time now, and can't seem to find a perfect solution that would satisfy all the requirements of making life easier for developers.
Right now we have the following setup:
Linux development server (as everything we produce runs on Linux, and it uses some Linux-specific libraries)
Windows desktops (as the office network is on Windows)
Every developer has a home folder on the dev server with a virtual host set up to run their code. This folder is shared using Samba.
Zend Studio IDE that is set up to use that location (as a network drive) to work on projects
Remote debugging to be able to run applications on the dev server and be able to step through the code
So the main problem we are having is that everything is slow...
Zend is slow to index the project, as it has quite a lot of files (including externals like the full framework) that need to be transferred over SMB.
Remote debugging is slow, as Zend Studio needs to fetch the file, then send it back to the server to run it (running "Local if available, else server"; otherwise breakpoints don't work)
Tortoise SVN is slow to get file status for the commit (command line remedies the problem, but it's much less user friendly, especially with more complicated things like conflict resolution while merging)
Branching out to any of the solutions that would involve multiple server configurations brings up the problem that there is a chance of having different configurations everywhere, which will introduce an additional layer of uncertainty and possibly bugs in production.
Development and debugging under Windows is not possible because of Linux dependencies in the code (like POSIX functions).
So how do organizations solve these problems? What kinds of setups are you using? What kinds of problems are you facing, and how do you resolve them?
One solution that works in some situations is to:
Have the code on your local disk, on the physical computer running Windows
This code is the one you're modifying with your IDE
So the IDE works as fast as possible: no SMB access for each file.
Also have the code on the Linux server
So Apache runs fast: the code is present on the server
Use some kind of synchronisation mechanism to push every modification made to a file on the Windows machine to the Linux server, via the SMB share.
Using Eclipse, the FileSync plugin does a good job, over the SMB share.
WinSCP can also be used to keep a remote and local folder synchronized over an SSH connection (see the sketch just below)
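For WinSCP, the synchronisation can be scripted (host, credentials and paths are placeholders; a sketch, not a full setup):

    :: winscp.com runs script commands; keepuptodate watches the local folder
    :: and uploads every change to the remote one over SFTP
    winscp.com /command ^
        "open sftp://dev@devserver/" ^
        "keepuptodate c:\projects\app /var/www/app" ^
        "exit"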
Advantages:
All local operations are fast
All server operations are fast
Drawbacks:
You must always use the tool that ensures synchronisation (for instance, with FileSync, everything must be done in Eclipse -- and nothing in any other software)
Note: for SVN, no need to use Tortoise: there are plugins that integrate into Eclipse (Subversive, for example)
Not sure about debugging
Modifications done directly on the Linux machine might not (depending on the solution) get synchronized to the windows desktop.
Still, the best (fastest and most powerful) solution is generally to use only one computer -- that would run Linux, in your case, and not Windows.
Your tools will most likely work under Linux
If needed, you can install Windows in a virtual machine, for some software that doesn't run on Linux
It'll encourage everyone in your team to know Linux better, which is always useful when your production environment is not Windows ;-)
We have various php projects developed on windows (xampp) that need to be deployed to a mix of linux/windows servers.
We've used Capistrano in the past to deploy from Windows to the Linux servers, but recent changes in architecture and the Windows servers left the old config not working. The recipe works fine for the Linux deployment, but setting up the Windows servers has required more time than we have right now. Ideas for the Capistrano recipe are valid answers. Obviously the Windows/Linux servers don't share users, so this complicates it a tad (given Capistrano's assumption of the same username/password everywhere).
Currently we're using svn update for the Windows servers, which I dislike, since it leaves all the .svn files hanging around on the production servers (and we still have to manually svn update them on Windows), plus manual updating of files using WinSCP and syncing the directories with their Linux counterparts.
My question is, what tools/setup do you suggest to automate this deployment scenario:
"Various php windows/linux developers deploying to 2+ mixed windows/linux machines"
(PS: we have no problems using Linux tools or anything working through Cygwin, we simply need to make deployment a simple one-step operation)
Edit: currently we can't work in an all-Linux environment; we have to deploy to both Linux and Windows servers. We can start the deploy from anywhere, but we'd prefer to be able to do it from either environment.
I use 4 different approaches depending on the client environment:
Capistrano and similar tools (effective, but complex)
rsync from + to Windows, Linux, Mac (simple, doesn't enforce discipline)
svn from + to Windows, Linux, Mac (simple, doesn't enforce discipline)
On-server scripts (run through the browser, complex)
There are some requirements that drive what you need:
How much discipline you want to enforce
If you need database (or configuration) migrations (up and/or down)
If you want a static "we're down" page
Who can do the update
Configuration differences between servers
I strongly suggest enforcing enough discipline to save you from yourself: deploy to a development server, allow for upward migrations and simple database restore, and limit who can update the live server to a small number of responsible admins (where the dev server is open to more developers). Also consider pushing via a cron job (to the development server), so there's a daily snapshot of your incremental changes.
Most of the time, I find that either svn or rsync setups are enough, with a few server-side scripts, especially when the admin set is limited to a few developers. For the rsync route, the core of the deployment can be as small as the sketch below.
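A minimal rsync push (host, user and paths are placeholders):

    # push the working tree to production, skipping VCS metadata
    rsync -avz --delete --exclude='.svn' ./ deploy@prodserver:/var/www/app/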
This will probably sound silly but... I used to have this kind of problem all the time, until I decided in the end that if I'm always deploying on Linux, I really ought to at least try developing on Linux also. I did. It was pain-free. I never went back.
Now, I am not suggesting this is for everyone. But if you install VirtualBox you could run a Linux install as a local server on your Windows box. Share a folder in the virtual machine and you can use all your known and trusted Windows software and techniques, and have the peace of mind of knowing that everything is working well on its target platform.
Plus you'll be able to go back to Capistrano (a fine choice) for deployment.
Best of all, if you thought you knew Linux / Unix wait until you use it everyday on your desktop! Who knows you may even like it :)
Capistrano is the nicest deployment tool I've seen. Do the architecture changes make it impossible to fix the configs so it works again?
Why can't you use Capistrano anymore?
Why do you dislike svn update?
What things in your app require a special deployment?
You can set the svn:ignore property on configuration files, so that svn update doesn't erase them, and then use svn export /target/path/ to get rid of the .svn files from your Subversion checkout.
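Both steps, sketched (paths and the config file name are placeholders):

    # keep per-server config out of version control
    svn propset svn:ignore "parameters.ini" config/
    svn commit -m "ignore per-server config" config/
    # produce a clean tree with no .svn folders
    svn export . /target/path/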