I work at a small PHP shop, and I recently proposed that we move away from using our NAS as a shared code base and start using Subversion for source control.
I've figured out how to make sure our dev server gets updated with every commit to our development branch, and I know how to merge into trunk and have that update our staging server, because we have direct access to both of those machines. My biggest question is how to write a script that will update the production server, which we often only have FTP access to. I don't want to upload the entire site every time. Is there any way to write a script that is smart enough to upload only what has changed when we execute it? (We don't want it automatically updating the production environment; we want to run it manually.)
Does my question even make sense?
Basically, your issue is that you can't use Subversion on the production server. What you need to do is keep a copy of your production checkout on a separate (ideally identically configured) server, and copy that to the production server by whatever method is available. You could think of this as your staging server, actually, since it will also be useful for final tests on releases before rolling them out.
As far as the copy goes, if your provider supports rsync, you're set. If you have only FTP, you'll have to find some method of doing the equivalent of rsync over FTP. This is not the first time anybody has had that need; a web search will help you out there. But if you can't find anything, drop me a note and I'll look around a little further myself.
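If rsync is available, the push itself is a single command; wrapping it in a small script keeps it a deliberate, manual step. A rough sketch, where the host and paths are made-up placeholders:
<?php
// Push the staging checkout to production over rsync/SSH.
// Both paths and the host below are placeholders for your setup.
$local  = '/var/staging/checkout/';             // trailing slash = sync contents
$remote = 'deploy@prod.example.com:/var/www/';
// -a keeps permissions/timestamps, -z compresses, --delete removes
// files on the server that were deleted in the checkout
passthru('rsync -avz --delete ' . escapeshellarg($local) . ' ' . escapeshellarg($remote), $exit);
echo $exit === 0 ? "deployed\n" : "rsync failed ($exit)\n";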
EDIT: I hope the author doesn't mind me adding this, but I think it belongs here. To do something approximately similar to rsync over FTP, look at weex (http://weex.sourceforge.net/). It's a wrapper around command-line FTP that uses a local mirror to keep track of what's on the remote server, so that it can send only changed files. Works great for me.
SVN doesn't play well with FTP, but if you have HTTP access, that may prove sufficient to push changes using svnsync. That's how we push changes to our production servers: we use svnsync to keep a read-only mirror of the repository available.
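In case it helps, the svnsync part is just two commands: a one-time init of the mirror and a sync after each commit. A minimal sketch, with placeholder URLs:
<?php
// Placeholder repository URLs; the mirror must allow revprop changes
// (a permissive pre-revprop-change hook) for svnsync to work.
$mirror = 'http://prod.example.com/svn-mirror';
$source = 'http://dev.example.com/svn';
system('svnsync init ' . escapeshellarg($mirror) . ' ' . escapeshellarg($source)); // run once
system('svnsync sync ' . escapeshellarg($mirror)); // run after each commit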
I use the following solution. Just install the SVN client on your web server, and attach this to a privately accessible URL:
<?php
// make sure you have a robot account that can't commit ;)
$username = Settings::Load()->Get('svn', 'username');
$password = Settings::Load()->Get('svn', 'password');
$repos = Settings::Load()->Get('svn', 'repository');
echo '<h1>updating from svn</h1><pre>';
// for security, define an array of folders that you do want to be synced from svn. The rest will be skipped.
$svnfolders = array( 'includes/' ,'plugins/' ,'images/' ,'templates/', 'index.php' => 'index.php');
$svnfiles = $svnfolders; // default: sync everything in the whitelist
if(!empty($_GET['justthisone']) && array_search($_GET['justthisone'], $svnfolders) !== false){ // you can also update just one of the above by passing it in $_GET
    $svnfiles = array($_GET['justthisone']);
}
foreach($svnfiles as $targetlocation)
{
    echo system("svn export --username={$username} --password={$password} {$repos}{$targetlocation} ".dirname(__FILE__)."/../{$targetlocation} --force");
}
die("</pre><h1>Done!</h1>");
I'm going to make an assumption here and say you are using a post-commit hook to do your merging/updating of your staging server. This may work, but I would strongly recommend you look into a Continuous Integration solution. The following are some that I am aware of:
Xinc - http://code.google.com/p/xinc/ (PHP Specific)
CruiseControl - http://cruisecontrol.sourceforge.net/ (Wildly popular.)
PHP integration made possible with http://phpundercontrol.org/about.html
Hudson - https://hudson.dev.java.net/ (Appears to be Java based, but allows for plugins/extensions)
LFTP is capable of synchronizing directories over FTP.
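For example, you can drive it from PHP the same way as the other scripts here. A sketch with placeholder credentials and paths, assuming lftp is installed on the box you deploy from:
<?php
// mirror -R pushes the local tree to the server; --only-newer skips files
// that haven't changed. Credentials, host and paths are placeholders.
passthru('lftp -u deploy,secret ftp.example.com '
       . '-e "mirror -R --only-newer /var/staging/checkout /public_html; quit"');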
Just an idea:
You could keep a checkout of your project on a host you have access to and where Subversion is installed. This working copy reflects the production server's version.
You could then write a PHP script that updates this working copy from SVN and finds all files that have changed since the last update. Those are the files you upload.
Such a script could look like this:
$path = realpath( '/path/to/production/mirror' );
chdir( $path );
$start = time();
shell_exec( 'svn up' ); // update the working copy in place
$list = array();
$i = new RecursiveIteratorIterator( new RecursiveDirectoryIterator( $path ), RecursiveIteratorIterator::SELF_FIRST );
foreach( $i as $node )
{
    // anything touched by the update is newer than $start
    if ( $node->isFile() && $node->getCTime() > $start )
    {
        $list[] = $node->getPathname();
    }
    // directories should also be handled
}
$conn = ftp_connect( ... );
// and so on
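To round the sketch off, the upload loop might look roughly like this (credentials are placeholders, and newly created remote directories would still need ftp_mkdir()):
ftp_login( $conn, 'user', 'pass' );
foreach ( $list as $file )
{
    // strip the local prefix so the remote path mirrors the working copy
    $remote = substr( $file, strlen( $path ) + 1 );
    ftp_put( $conn, $remote, $file, FTP_BINARY );
}
ftp_close( $conn );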
Just an idea that came to mind.
I think this will help you:
https://github.com/midhundevasia/deploy
It works well on Windows.
Related
I've got a malwared site hosted on Linux hosting. All PHP files now start with these lines:
<?php
$md5 = "ad05c6aaf5c532ec96ad32a608566374";
$wp_salt = array( ... );
$wp_add_filter = create_function( ... );
$wp_add_filter( ... );
?>
How can I clean it up with bash/sed or something?
You should restore your backup.
FILES="*.php"
for f in $FILES
do
cat $f | grep -v 'wp_salt|wp_add_filter|wp_add_filter' > $f.clean
mv $f.clean $f
done
Just a warning: wp_add_filter() recursively evaluates encoded PHP code, which in turn calls another script that is encoded and evaluated. This larger script not only injects malicious code throughout your site, but also appears to collect credentials and execute other hacks. You should not only clean your site, but also make sure the flaw is fixed and any credentials that might have been exposed are changed. In the end it appears to be a WordPress security issue, but I've not confirmed this. I've added some comments on this over at http://www.php-beginners.com/solve-wordpress-malware-script-attack-fix.html, which includes a clean-up script and more information on how to decode the malicious script.
You can do it with PHP (fopen, str_replace and fwrite) . There shouldn't be any encoding problems.
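For instance, something along these lines, using preg_replace since the md5 value differs per file. The pattern is an assumption based on the sample above, so test on a copy first:
<?php
// Walk every .php file below the current directory and strip the injected block.
$it = new RecursiveIteratorIterator(new RecursiveDirectoryIterator('.'));
foreach ($it as $f) {
    if (!$f->isFile() || substr($f->getFilename(), -4) !== '.php') continue;
    $code  = file_get_contents($f->getPathname());
    // assumed shape of the malware: an opening php tag, the $md5 line,
    // then everything up to the first closing tag
    $clean = preg_replace('/<\?php\s+\$md5 = "[0-9a-f]{32}";.*?\?>\s*/s', '', $code, 1);
    if ($clean !== null && $clean !== $code) {
        file_put_contents($f->getPathname(), $clean);
    }
}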
I just got hit with this on a very full hosting account: every web file stuffed with the injected PHP?!
After much digging and reading posts everywhere, I came across this guy's cleaner code (see http://www.php-beginners.com/solve-wordpress-malware-script-attack-fix.html) and tried it on a couple of the least important sites first.
So far so good. I'm pretty much ready to dig in and use it account-wide to try and wipe this right off.
The virus/malware seems to be called "!SShell v. 1.0 shadow edition!" and infected my hosting account today. Along with the cleaner at http://www.php-beginners.com/solve-wordpress-malware-script-attack-fix.html, you actually need to find the folder containing the shell file that gives the hackers full access to your server files, and also find "wp-thumb-creator.php", the file that does all the PHP injection. I've posted more about this at my blog: http://www.marinbezhanov.com/web-development/6/malware-alert-september-2011-sshell-v.1.0/
I have WAMP running (on a server called Emerald) and MAMP running on my Mac. People register on MAMP; Emerald is basically file hosting.
Emerald connects to MAMP's MySQL database to log users in. However, I want to create directories on Emerald for new registrations using PHP.
How can I do this? I have tried using this code:
$thisdir = "192.168.1.71";
$name = "Ryan-Hart";
if(mkdir($thisdir ."/documents/$name" , 0777))
{
echo "Directory has been created successfully...";
}
But I had no luck. It basically needs to connect to the other server and create a directory named after the user.
I hope this is clear.
You can't create directories through HTTP. You need a filesystem connection to the remote location (a local hard disk or a network share, for example).
The easiest way that doesn't require setting up FTP, SSH or a network share would be to put a PHP script on Emerald:
<?php
// Skipping sanitation because it's only going to be called
// from a friendly script. If "dir" is user input, you need to sanitize
$dirname = $_GET["dir"];
$secret_token = "10210343943202393403";
if ($_GET["token"] != $secret_token) die ("Access denied");
// Alternatively, you could restrict access to one IP
error_reporting(0); // comment this line out to see mkdir's error messages
$success = mkdir("/home/www/htdocs/docs/".$dirname);
if ($success) echo "OK"; else echo "FAIL";
and call it from the other server:
$success = file_get_contents("http://192.168.1.71/create_script.php?token=10210343943202393403&dir=HelloWorld");
echo $success; // "OK" or "FAIL"
Create a script on the other server that creates the dir, and call it remotely.
Make sure you have a security check (at least a simple password).
There is no generic method to access remote server filesystems. You have to use a file transfer protocol and server software to do so. One option would be SSH, which however requires some setup.
$thisdir = "ssh2.sftp://user:pass#192.168.1.71/directory/";
On Windows you might get FTP working more easily, so using an ftp:// URL as the directory might work.
As a last alternative you could enable WebDAV on your WAMP web server (the PUT method alone works for file transfers, but not for creating directories). But then you probably can't use the raw PHP file functions; you'd probably need a wrapper class or curl to utilize it.
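If you go the FTP route, PHP's ftp:// stream wrapper can create directories directly with mkdir(), assuming URL wrappers are enabled; the credentials here are placeholders:
<?php
// the FTP account needs write permission in the target directory
$ok = mkdir('ftp://user:pass@192.168.1.71/documents/Ryan-Hart');
echo $ok ? 'created' : 'failed';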
I know this is old, but I think this might be useful. In my experience,
if(mkdir($thisdir ."/documents/name" , 0777))
doesn't work; I need to create each level separately:
mkdir($thisdir, 0777);
mkdir($thisdir ."/documents" , 0777);
mkdir($thisdir ."/documents/name" , 0777);
hope it helps :)
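For what it's worth, you can also let mkdir() build the whole chain in one call by passing true as its third ($recursive) parameter:
<?php
// creates every missing level of the path at once
mkdir($thisdir . "/documents/name", 0777, true);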
What I would like to script: a PHP script to find a certain string in loads of files.
Is it possible to read the contents of thousands of text files on another FTP server without actually downloading them (ftp_get)?
If not, would downloading them once (skip if the file already exists, re-download if the file size differs), then searching for the string, be the easiest option?
If URL fopen wrappers are enabled, then file_get_contents can do the trick, and you do not need to save the file on your server.
<?php
$find = 'mytext'; //text to find
$files = array('http://example.com/file1.txt', 'http://example.com/file2.txt'); //source files
foreach($files as $file)
{
$data = file_get_contents($file);
if(strpos($data, $find) !== FALSE)
echo "found in $file".PHP_EOL;
}
?>
[EDIT]: If the files are accessible only by FTP:
In that case, you have to use them like this:
$files = array('ftp://user:pass@domain.com/path/to/file', 'ftp://user:pass@domain.com/path/to/file2');
If you are going to store the files after you download them, then you may be better served by just downloading or updating all of the files, then searching through them for the string.
The best approach depends on how you will use it.
If you are going to delete the files after searching them, then you may also want to keep track of which ones you have searched, along with their file date information, so that the next time you search, you won't waste time on files that haven't changed since the last check.
When you are dealing with so many files, try to cache any information that will help your program to be more efficient next time it runs.
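As a sketch of that idea, you could remember each file's size and modification time between runs and skip anything unchanged. File and key names here are invented, and the stat calls should work through the ftp:// wrapper as well:
<?php
$cachefile = 'scan-cache.dat';
$cache = is_file($cachefile) ? unserialize(file_get_contents($cachefile)) : array();
foreach ($files as $file) { // $files: ftp:// URLs, as in the examples above
    $stat = array(filesize($file), filemtime($file));
    if (isset($cache[$file]) && $cache[$file] === $stat) {
        continue; // unchanged since the last run, skip it
    }
    $data = file_get_contents($file);
    // ... search $data for the string here ...
    $cache[$file] = $stat;
}
file_put_contents($cachefile, serialize($cache));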
PHP's built-in file reading functions, such as fopen()/fread()/fclose() and file_get_contents(), do support FTP URLs, like this:
<?php
$data = file_get_contents('ftp://user:password@ftp.example.com/dir/file');
// The file's contents are stored in the $data variable
If you need to get a list of the files in the directory, you might want to check out opendir(), readdir() and closedir(), which I'm pretty sure support FTP URLs.
An example:
<?php
$dir = opendir('ftp://user:password@ftp.example.com/dir/');
if(!$dir)
die;
while(($file = readdir($dir)) !== false)
echo htmlspecialchars($file).'<br />';
closedir($dir);
If you can connect via SSH to that server, and if you can install new PECL (and PEAR) modules, then you might consider using PHP SSH2. Here's a good tutorial on how to install and use it. This is a better alternative to FTP. But if it is not possible, your only solution is file_get_contents('ftp://domain/path/to/remote/file');.
** UPDATE **
Here is a PHP-only implementation of an SSH client: SSH in PHP.
With FTP you'll always have to download to check.
I do not know what kind of bandwidth you have or how big the files are, but this might be an interesting use case for running it from the cloud, like Amazon EC2 or Google App Engine (if you can download the files within the time limit).
In the EC2 case you would spin up the server for an hour to check for updates in the files, and shut it down again afterwards. This would cost a couple of bucks per month and save you from potentially upgrading your line or hosting contract.
If this is a regular task, then it might be worth using a simple queue system so you can run multiple processes at once (which will hugely increase speed). This would involve three steps:
Get a list of all files on the remote server
Put the list into a queue (you can use memcached for a basic message queuing system; see the sketch at the end of this answer)
Use a separate script to get the next item from the queue.
The processing script would contain simple functionality (in a do/while loop):
// connect once if you use the ftp_* API; the ftp:// wrapper needs no connect call
do {
    $item = next_item_from_queue(); // placeholder: fetch the next path from the queue
    if ($item === false) break;     // queue drained
    $contents = file_get_contents($item); // e.g. an ftp:// URL
    preg_match($pattern, $contents);      // $pattern: the string you are looking for
} while (true);
You could then, in theory, fork off multiple processes through the command line without needing to worry about race conditions.
This method is probably best suited to cron/batch processing, but it might work in this situation too.
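If you want to try the memcached route, a pair of counters gives you an atomic pop, so several workers can pull from the same list without clashing. A rough sketch with invented key names, assuming a single producer fills the queue first:
<?php
$m = new Memcache();
$m->connect('localhost', 11211);
// producer: number the items, then publish the bounds
foreach ($paths as $i => $p) {
    $m->set("q:$i", $p);
}
$m->set('q:head', 0);
$m->set('q:tail', count($paths));
// worker: increment() is atomic, so each worker claims a distinct slot
while (($slot = $m->increment('q:head')) !== false) {
    if ($slot > $m->get('q:tail')) break; // queue drained
    $item = $m->get('q:' . ($slot - 1));  // increment returned the new head
    // ... download $item and search it, as in the loop above ...
}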
I'm developing on my Windows laptop but need to test my work on my shared Linux hosting. I have thrown together the following in place of the normal $application_path = "application"; line.
$env['laptop']['ip'] = '127.0.0.1';
$env['laptop']['host'] = 'MATT-WINDOWS7';
$env['laptop']['path'] = 'private';
$env['mattpotts']['ip'] = '12.34.56.78';
$env['mattpotts']['host'] = 'my.webhost.com';
$env['mattpotts']['path'] = '../private/blah';
$ip = $_SERVER['SERVER_ADDR'];
$host = php_uname('n');
foreach($env as $e)
if($e['ip'] == $ip && $e['host'] == $host)
$application_folder = $e['path'];
unset($env);
if(!isset($application_folder))
die('application folder not set');
...which works fine for setting the application path, but now I'm running into trouble because I need a database config for each environment.
I can make it work with a few simple ifs, but I was wondering if there's a best-practice solution to this one.
Cheers
Use revision control such as Subversion. Have one configuration file deployed to your test environment and a modified version checked out in your development environment. Simply don't commit those configuration changes, so they never make it to your testing/production environment.
This is definitely the best-practice solution :)
P.S. If you don't feel like setting up a Subversion server, there are always hosted solutions like Beanstalk, and if you're on Windows, TortoiseSVN is a slick client.
If you don't feel like setting up Subversion, you can always detect which site you're on by looking at SERVER_NAME. In a CI site in the past, I used the following within my config.php to tell dev and production servers apart:
if ($_SERVER['SERVER_NAME'] == 'www.mysite.com') {
$config['log_path'] = '/var/log/site/';
} else {
$config['log_path'] = '/var/log/dev_site/';
}
You can use this anywhere you need different variables based on the environment. That being said, hard-coding stuff like this into your code isn't always the best idea.
Here's a nice clean solution to running multiple production environments on a single codebase:
http://philsturgeon.co.uk/news/2009/01/How-to-Support-multiple-production-environments-in-CodeIgniter
As Phil's solution is no longer accessible, here's another one that achieves the same result.
Here's the link to the Git repo: https://github.com/jedkirby/ci-multi-environments, and this is a brief explanation of the module: http://jedkirby.com/blog/2012/11/codeigniter-multiple-development-environments
Phil Sturgeon's page has moved here:
http://philsturgeon.co.uk/blog/2009/01/How-to-Support-multiple-production-environments-in-CodeIgniter
With PHP (XAMPP) installed on a Windows XP computer, I'm trying to read a dir which exists on a local network server. I'm using is_dir() to check whether it is a dir that I can read.
In Windows Explorer I type \\server\dir and that dir is shown.
When I map a network drive, I can access it as z:\dir as well.
In PHP I have this script:
<?php if( is_dir($dir) ){ echo 'success'; } ?>
For $dir I tried:
/server/dir
//server/dir
\server\dir
\\server\dir
\\\\server\\dir
and
z:\dir
z:\\dir
z:/dir
z://dir
But I never get 'success'.
Any ideas?
Thanks!
I solved it by changing some settings in the server's registry, as explained in the last answer of this discussion:
http://bugs.php.net/bug.php?id=25805
Thanks to VolkerK and Gumbo anyway!
I love stackoverflow and their great people who help you so incredibly fast!!
EDIT (taken from php.net):
The service has limited access to network resources, such as shares
and pipes, because it has no credentials and must connect using a null
session. The following registry key contains the NullSessionPipes and
NullSessionShares values, which are used to specify the pipes and
shares to which null sessions may connect:
HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Services\LanmanServer\Parameters
Alternatively, you could add the REG_DWORD value
RestrictNullSessAccess to the key and set it to 0 to allow all null
sessions to access all pipes and shares created on that machine.
Add RestrictNullSessAccess=0 to your registry.
You probably let XAMPP install Apache as a service and run the PHP scripts through this Apache. The Apache service (running as LocalSystem) is not allowed to access the network the way your user account is.
A service that runs in the context of the LocalSystem account inherits the security context of the SCM. The user SID is created from the SECURITY_LOCAL_SYSTEM_RID value. The account is not associated with any logged-on user account.
This has several implications:
...
* The service presents the computer's credentials to remote servers.
...
You can test this by starting Apache as a console application (apache_start.bat in the XAMPP directory should do that) and running the script again. You can use both forward and backward slashes in the UNC path; I'd suggest //server/share since PHP doesn't care about / in string literals.
<?php
$uncpath = '//server/dir';
$dh = opendir($uncpath);
echo "<pre>\n";
var_dump($dh, error_get_last());
echo "\n</pre>";
Try the file: URI scheme:
file://server/dir
file:///Z:/dir
The beginning is always file://. The next path segment is the server. If it's on your local machine, leave it blank (see the second example). See also File URIs in Windows.
Yes, I know this is an old post, but I still found it, and in case anyone else does:
on Windows, with newer servers, verify that SMB is installed and enabled on the target machine.