I'm developing on my Windows laptop but need to test my work on my shared Linux hosting. I have thrown together the following in place of the normal $application_path = "application"; line.
$env['laptop']['ip']   = '127.0.0.1';
$env['laptop']['host'] = 'MATT-WINDOWS7';
$env['laptop']['path'] = 'private';

$env['mattpotts']['ip']   = '12.34.56.78';
$env['mattpotts']['host'] = 'my.webhost.com';
$env['mattpotts']['path'] = '../private/blah';

$ip   = $_SERVER['SERVER_ADDR'];
$host = php_uname('n');

// Pick the application folder for whichever environment matches.
foreach ($env as $e) {
    if ($e['ip'] == $ip && $e['host'] == $host) {
        $application_folder = $e['path'];
    }
}
unset($env);

if (!isset($application_folder)) {
    die('application folder not set');
}
...which works fine for setting the application path, but now I'm running into trouble with needing a database config for each environment.
I can make it work with a few simple ifs but I was wondering if there's a best practice solution to this one.
Cheers
Use revision control such as Subversion. Have one configuration file deployed to your test environment and a modified version checked out in your development environment. Simply don't commit those configuration changes, so they never make it to your testing/production environment.
This is definitely the best practice solution :)
P.S. If you don't feel like setting up a Subversion server, there are always hosted solutions like Beanstalk, and if you're on Windows, TortoiseSVN is a slick client.
If you don't feel like setting up Subversion, you can always detect which site you're on by looking at SERVER_NAME. In a CI site in the past I used the following within my config.php to distinguish dev from production servers:
if ($_SERVER['SERVER_NAME'] == 'www.mysite.com') {
$config['log_path'] = '/var/log/site/';
} else {
$config['log_path'] = '/var/log/dev_site/';
}
You can use this anywhere you need different variables based on the environment. That being said, hard-coding stuff like this into your code isn't always the best idea.
Here's a nice clean solution to running multiple production environments on a single codebase:
http://philsturgeon.co.uk/news/2009/01/How-to-Support-multiple-production-environments-in-CodeIgniter
As Phil's solution is no longer accessible, here's another module that provides exactly the same functionality.
Here's the link to the Git repo: https://github.com/jedkirby/ci-multi-environments and this is a brief explanation of the module: http://jedkirby.com/blog/2012/11/codeigniter-multiple-development-environments
Phil Sturgeon's page has moved here:
http://philsturgeon.co.uk/blog/2009/01/How-to-Support-multiple-production-environments-in-CodeIgniter
Laravel 4.1 removed the ability to use the domain to detect which environment the app is running in. The docs now suggest using host names instead. To me, however, that seems cumbersome if you are working in a team: should everyone change the bootstrap/start.php file and add their own host name to be able to run the app in a dev environment? Also, what if you want two different environments on the same machine?
How to best detect the environment if you are working in a team in Laravel 4.1+?
Here are my settings from the bootstrap/start.php file:
$env = $app->detectEnvironment(function () use ($app) {
    return getenv('ENV') ?: ($app->runningInConsole() ? 'local' : 'production');
});
Instead of the default array, this method in my case receives a closure containing a ternary. That gives me more flexibility in choosing the desired environment; you could use a switch statement as well. Laravel reads the return value and configures itself accordingly.
With the native getenv function, I simply check for a given environment variable. If my app is on the server, it picks up the server configuration; if it's local, it picks the local (or development) one.
And don't forget to create custom directories for your environments in app/config.
There is also a testing environment, which is chosen automatically whenever you run unit tests.
Laravel makes working with environments really fun.
UPDATE:
With environments we are mostly concerned about DB credentials.
For production I use Fortrabbit, so when configuring a new app on the server, Fortrabbit generates those values for me; I just need to declare them, e.g. DB or just database or db... or DB_HOST or HOST...
Locally, those values are the ones you use for your localhost/MySQL setup.
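For illustration, a minimal sketch of a local override in Laravel 4's cascading config (the file path follows the app/config/local convention mentioned above; the credentials are placeholders):

// app/config/local/database.php -- placeholder credentials
return array(
    'connections' => array(
        'mysql' => array(
            'driver'   => 'mysql',
            'host'     => 'localhost',
            'database' => 'myapp_dev',
            'username' => 'root',
            'password' => '',
        ),
    ),
);

Laravel merges this over app/config/database.php, so you only need to list the keys that differ.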
Update:
In Laravel 5.0, environment detection is no longer needed in the same way: in the .env file you can simply set a variable for which environment the app should run in.
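For example, a minimal sketch (APP_ENV is the variable name Laravel 5 itself uses):

# .env -- read by Laravel 5 on boot
APP_ENV=local

You can then check it at runtime with app()->environment() or env('APP_ENV').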
Old answer for Laravel <= 4.2
What I ended up doing is very close to what carousel suggested. Anyway, I thought I'd share it. Here is the relevant part of our bootstrap/start.php file:
$env = $app->detectEnvironment(function () use ($app) {
    // Artisan and other console commands always run as development.
    if ($app->runningInConsole()) {
        return "development";
    }

    $validEnvironments = array("development", "staging", "production");
    if (in_array(getenv('APP_ENV'), $validEnvironments)) {
        return getenv('APP_ENV');
    }

    throw new Exception("Environment variable not set or is not valid. See developer manual for further information.");
});
This way all team members have to declare an environment variable somewhere. I haven't really decided whether throwing an exception when the environment variable is not set, or just defaulting to production, is the better choice. However, with the above, it's easy to change.
For me, I just use 'dev' => '*.local' and it works. I haven't 100% tested it in a team situation, but I think it'd work (big assumption alert:) assuming you're on OS X and get a default hostname like Alexs-iMac.local.
As for faking an environment, I'm not sure it's really supported. It may be doable, but in general the whole point of environments is that dev has entirely different needs from production, and the two are mutually exclusive. Having the ability to switch on one physical environment seems counter to that goal.
Laravel 4.1 and 4.2 detect the environment through the machine names specified in the "bootstrap/start.php" file.
For example, in my case the config becomes:
$env = $app->detectEnvironment(array(
    'local' => array('Victor.local', 'Victor-PC'),
));
This means that Laravel will use the 'local' environment settings for both machines: 'Victor.local' (a Mac) and 'Victor-PC' (Windows).
This way you can register several machines to work as the local environment. Other environments can be registered too.
In order to know the current machine name, you can use the following PHP code:
<?php echo gethostname(); ?>
Hope it helps!
You can use something like this:
$env = $app->detectEnvironment(function () {
    if ($_SERVER['HTTP_HOST'] == 'youdomain_local') {
        return 'local';
    } elseif ($_SERVER['HTTP_HOST'] == 'youdomain_team') {
        return 'team';
    } else {
        return 'production';
    }
});
What I did is make a directory app/config/local and use this code:
$env = $app->detectEnvironment(function () {
    return $_SERVER['HTTP_HOST'] == "localhost" ? "local" : "production";
});
This covers localhost and the live site.
I didn't like production being the default, so I made anything other than the live server fall back to the local config.
In bootstrap/start.php:
$env = $app->detectEnvironment(function () {
    if (gethostname() !== 'live_server_hostname') {
        return 'local';
    } else {
        return 'production';
    }
});
In bootstrap/start.php define this:
$env = $app->detectEnvironment(function () use ($app) {
    $environmentHosts = [
        'localhost',
        'local',
    ];

    if ($app->runningInConsole() || in_array($app['request']->getHost(), $environmentHosts)) {
        return 'local';
    } else {
        return 'production';
    }
});
I believe it is better to use only Laravel 4's own facilities.
We discovered bugs in our site after deploying from our localhost development environment to our staging server.
We narrowed the bugs down to the following PHP code snippet:
$_SERVER['HTTP_HOST']
We use that to construct paths to various files used by our website.
For example:
// EXAMPLE #1 -- THIS WORKS ON OUR LOCAL MACHINE
$theSelectedImage = "http://" . $_SERVER['HTTP_HOST'] . "/ourWebSite/egyptVase.png";
// EXAMPLE #2 -- BUT IT MUST BE THIS TO WORK ON THE STAGING SERVER
$theSelectedImage = "http://" . $_SERVER['HTTP_HOST'] . "/egyptVase.png";
Here is the crux of the problem:
On our localhost machine, $_SERVER['HTTP_HOST'] resolves to 'localhost' -- notice we then need to append our website folder name, as in EXAMPLE #1 above. I.e., on our localhost machine, $_SERVER['HTTP_HOST'] is unaware of which website folder is involved -- we have to append it.
But on our staging server, $_SERVER['HTTP_HOST'] resolves to 'www.ourWebSite.com'.
In that case we don't need to append anything -- the staging web server already serves from our website folder.
Have we thought of a couple of kludgy, inelegant, 'snide-remarks-in-code-reviews' workarounds? Yes.
But I'm thinking there's a better way -- any ideas?
One option is to have a configuration file that loads the proper settings depending on what host it's on (normally based on APPLICATION_ENV). Zend Framework does this by loading the section of an .ini file corresponding to the APPLICATION_ENV set in the Apache config. You could use a PHP file with an array, or anything else, really. All you need to do is set APPLICATION_ENV to one thing on staging/production and something different on dev servers.
A simple example:
switch (APPLICATION_ENV) {
    case 'dev':
        define('APPLICATION_URL', $_SERVER['HTTP_HOST'] . '/ourWebSite');
        break;
    case 'production':
    case 'staging':
        define('APPLICATION_URL', $_SERVER['HTTP_HOST']);
        break;
}
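For completeness, a sketch of how APPLICATION_ENV is typically defined (this follows the common Zend Framework skeleton convention, not anything from the answer above; the vhost would contain a SetEnv APPLICATION_ENV development line):

// Define APPLICATION_ENV from the web server environment,
// defaulting to 'production' when nothing is set.
defined('APPLICATION_ENV')
    || define('APPLICATION_ENV', getenv('APPLICATION_ENV') ?: 'production');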
The config file is the correct answer. I use a default-and-override setup.
In config.php
$config = array('url' => 'http://production.url'); // 'conf1' => 'var1', etc.
$dev_config = array();
@include_once('dev-config.php'); // '@' suppresses the warning when the file is absent (production)
$config = array_merge($config, $dev_config);
In dev-config.php you add any overrides to the $dev_config array:
$dev_config['url'] = 'http://localhost/ourWebSite';
Then in production, I just delete the dev-config file. Super easy, works flawlessly.
If you want to keep it the way you have it, this can also be solved with an httpd.conf tweak on your dev box. Set your DocumentRoot to whatever it is now plus /ourWebSite, so that http://localhost/ points to the same folder within your code as http://production.url/.
How about:
function get_homepath() {
    $host = $_SERVER['HTTP_HOST'];
    if ($host == 'localhost') {
        return $host . '/ourWebSite/';
    }
    return $host . '/';
}
Usually people set up different environment settings depending on whether it's a development or production environment. The reason for this, as you found out, is to move code easily from development to production.
Instead of inserting a $_SERVER variable into the URL directly, set up a condition that checks $_SERVER['HTTP_HOST'], figures out which environment you are in, and injects a different variable holding the real base path for your scripts.
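A minimal sketch of that idea, reusing the host names from the question ($baseUrl is just an illustrative name):

// Pick a base URL per environment instead of building it from HTTP_HOST directly
if ($_SERVER['HTTP_HOST'] == 'localhost') {
    $baseUrl = 'http://localhost/ourWebSite';
} else {
    $baseUrl = 'http://' . $_SERVER['HTTP_HOST'];
}

$theSelectedImage = $baseUrl . '/egyptVase.png'; // works in both environments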
Is it possible to check whether a website (PHP) is running locally or on a hosted server?
I want to enable some logs if the website is running locally, and I don't want these to appear on the live site.
I can set a variable $local = 1; but then I'd have to change it before uploading. Is there any way to automate this?
Local server: WampServer 2.0 / Apache
Web server: Apache
Check $_SERVER['REMOTE_ADDR'] == '127.0.0.1'. This will only be true if running locally. Be aware that this means local to the server as well, so if you have any scripts running on the server that make requests to your PHP pages, they will satisfy this condition too.
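A minimal sketch of that check (toggling error display is just one example of what you might switch on):

// Treat requests from the loopback address as local
$local = ($_SERVER['REMOTE_ADDR'] === '127.0.0.1');
if ($local) {
    ini_set('display_errors', '1'); // verbose output locally only
}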
I believe the best approach is to 'fake' a testing mode, which can be done by creating a file in your local environment.
When I used this approach I created an empty text file called testing.txt and then used the following code:
if (file_exists('testing.txt')) {
// then we are local or on a test environment
} else {
// we are in production!
}
This approach is 100% compatible with any operating system, and you can use several test files if you want a more granular approach (e.g. development.txt, testing.txt, staging.txt, or production.txt) to customise your deployment process.
You should automate deployment
This is not directly the answer to your question, but in my opinion the better way: in an automated deployment process, setting a variable like $local = true would, like other configuration values (for example your DB connection), not be a manual, error-prone task.
Checking for 'localness' is in my opinion the wrong way: you don't want to show your logs to every visitor who appears local (a proxy may be one), but only when the app is deployed in a testing environment.
A popular tool for automated deployment is Capistrano; there should be PHP-centric tools too.
Just in case this is useful to anybody, I made this function, as the above answers didn't really do what I was looking for:
function is_local() {
    // Loopback host or a private (RFC 1918) address counts as local
    return $_SERVER['HTTP_HOST'] == 'localhost'
        || substr($_SERVER['HTTP_HOST'], 0, 3) == '10.'
        || substr($_SERVER['HTTP_HOST'], 0, 7) == '192.168';
}
Another common check is a whitelist of loopback addresses:
$whitelist = array(
    '127.0.0.1', // IPv4 loopback
    '::1',       // IPv6 loopback
);

if (!in_array($_SERVER['REMOTE_ADDR'], $whitelist)) {
    // not local
}
I built this function, which checks whether the current server name has name server (NS) records; local servers normally don't:
<?php
function isLocal()
{
    return !checkdnsrr($_SERVER['SERVER_NAME'], 'NS');
}
?>
Your remote server is unlikely to have a C: drive! So I run with this:
// Local detection: on Windows the document root starts with a drive letter
$root  = $_SERVER["DOCUMENT_ROOT"];
$parts = explode("/", $root);
$base  = $parts[0];

$local = false;
if ($base == "C:") {
    $local = true; // change later for if local
}
I came across a similar question once on SO but didn't mark it and can't find it now.
Currently my init.php file looks like this:
$db_username = 'user';
$db_password = 'pass';
//$db_username = 'user213124'; // webhost
//$db_password = 'pass214142'; // webhost
and I alternately comment/uncomment those lines depending on whether it's running on my test machine's XAMPP installation or on my webhost.
Sometimes I forget, and upload the wrong one, which is always good for a laugh.
I'm sure there's a way to write a short function that lets init.php detect its location and use the appropriate username/password combo.
The first solution that popped into my head was a simple if (file_exists()) {} check for something that would ONLY exist on my test machine. I'm guessing there's a better way that doesn't require disk access though, amirite?
Thank you.
PS - please edit my subject to make it more accurate. Tried my best. Kinda hard to describe...
There are several ways of accomplishing this, some may be better than others, I think it all depends on your situation.
One way is to check for something in $_SERVER. For example, you might use the server name:
if ($_SERVER['SERVER_NAME'] == 'my.dev.server') {
    $user = 'dev_user';
    $pass = 'dev_pass';
} else {
    $user = 'other_user';
    $pass = 'other_pass';
}
You might also consider having a local configuration file, unique to whatever environment you're on. This could be an .ini file or a PHP file that just defines the above variables and is included, but isn't under version control and doesn't get changed very often.
Sometimes people also define environment variables. There are probably other ways as well.
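A minimal sketch of the local-file idea (local-config.php is a hypothetical name; keep it out of version control):

// Production defaults; an untracked local file may override them.
$user = 'prod_user';
$pass = 'prod_pass';

if (file_exists(__DIR__ . '/local-config.php')) {
    include __DIR__ . '/local-config.php'; // redefines $user/$pass locally
}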
I use the ServerAdmin directive of Apache to identify my local development machine. Just look up how to set the ServerAdmin of your XAMPP server, then check:
$_SERVER['SERVER_ADMIN']
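For instance, a sketch (dev@localhost is a hypothetical value you would configure in the dev vhost):

// Treat a known ServerAdmin value as the development box
$isDev = isset($_SERVER['SERVER_ADMIN'])
    && $_SERVER['SERVER_ADMIN'] === 'dev@localhost';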
I work at a small PHP shop, and I recently proposed that we move away from using our NAS as a shared code base and start using Subversion for source control.
I've figured out how to make sure our dev server gets updated with every commit to our development branch, and I know how to merge into trunk and have that update our staging server, because we have direct access to both of these. My biggest question is how to write a script that will update the production server, which we often have only FTP access to. I don't want to upload the entire site every time. Is there any way to write a script that is smart enough to upload only what has changed when we execute it? (We don't want it automatically uploading to the production environment; we want to run it manually.)
Does my question even make sense?
Basically, your issue is that you can't use Subversion on the production server. What you need to do is keep, on a separate (ideally identically configured) server, a copy of your production checkout, and copy that to the production server through whatever method. You could think of this as your staging server, actually, since it will also be useful for final tests on releases before rolling them out.
As far as the copy goes, if your provider supports rsync, you're set. If you have only FTP, you'll have to find some method of doing the equivalent of rsync via FTP. This is not the first time anybody's had that need; a web search will help you out there. But if you can't find anything, drop me a note and I'll look around myself a little further.
EDIT: Hope the author doesn't mind me adding this, but I think it belongs here. To do something approximately similar to rsync with FTP, look at weex (http://weex.sourceforge.net/). It's a wrapper around command-line FTP that uses a local mirror to keep track of what's on the remote server, so that it can send only changed files. Works great for me.
It doesn't sound like SVN plays well with FTP, but if you have HTTP access, that may prove sufficient to push changes using svnsync. That's how we push changes to our production servers -- we use svnsync to keep a read-only mirror of the repository available.
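A sketch of the svnsync setup (the URLs are placeholders; the mirror repository also needs a pre-revprop-change hook that permits svnsync to record its sync metadata):

# one-time setup of the read-only mirror, then sync after each change
svnsync initialize https://mirror.example.com/repo https://source.example.com/repo
svnsync synchronize https://mirror.example.com/repo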
I use the following solution: just install the SVN client on your webserver, and hook this up to a privately accessible URL:
<?php
// make sure you have a robot account that can't commit ;)
$username = Settings::Load()->Get('svn', 'username');
$password = Settings::Load()->Get('svn', 'password');
$repos    = Settings::Load()->Get('svn', 'repository');

echo '<h1>updating from svn</h1><pre>';

// For security, define an array of folders that you do want synced
// from svn. The rest will be skipped.
$svnfolders = array('includes/', 'plugins/', 'images/', 'templates/', 'index.php' => 'index.php');

$svnfiles = $svnfolders; // default: sync everything in the list
if (!empty($_GET['justthisone']) && array_search($_GET['justthisone'], $svnfolders) !== false) {
    // you can also update just one of the above by passing it via $_GET
    $svnfiles = array($_GET['justthisone']);
}

foreach ($svnfiles as $targetlocation) {
    echo system("svn export --username={$username} --password={$password} {$repos}{$targetlocation} " . dirname(__FILE__) . "/../{$targetlocation} --force");
}

die("</pre><h1>Done!</h1>");
I'm going to make an assumption here and say you are using a post-commit hook to do your merging/updating of your staging server. This may work, but I would strongly recommend you look into a Continuous Integration solution. The following are some that I am aware of:
Xinc - http://code.google.com/p/xinc/ (PHP Specific)
CruiseControl - http://cruisecontrol.sourceforge.net/ (Wildly popular.)
PHP integration made possible with http://phpundercontrol.org/about.html
Hudson - https://hudson.dev.java.net/ (Appears to be Java based, but allows for plugins/extensions)
LFTP is capable of synchronizing directories over FTP.
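For example, a sketch with placeholder credentials and paths (mirror -R reverses the direction, uploading local changes; --only-newer skips files that haven't changed):

lftp -u user,password ftp.example.com -e "mirror -R --only-newer ./public_html /; quit"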
Just an idea:
You could keep a checkout of your project on a host you have access to and where Subversion is installed; this working copy reflects the production server's version.
You could then write a PHP script that updates this working copy via svn and finds all files that have changed since the update started. These files you can upload.
Such a script could look like this:
$path = realpath('/path/to/production/mirror');
chdir($path);

$start = time();
shell_exec('svn up'); // refresh the existing checkout ('svn co' would need a URL)

$list = array();
$i = new RecursiveIteratorIterator(new RecursiveDirectoryIterator($path), RecursiveIteratorIterator::SELF_FIRST);
foreach ($i as $node) {
    // collect files touched by the update
    if ($node->isFile() && $node->getCTime() > $start) {
        $list[] = $node->getPathname();
    }
    // directories should also be handled
}

$conn = ftp_connect( ... );
// and so on
Just an idea that came to my mind.
I think this will help you:
https://github.com/midhundevasia/deploy
It works well on Windows.