I have a PHP script that logs into my servers via the FTP functions and backs up the entire thing easily and quickly, but I can't seem to find a way to let the script determine the folder where the main index file is located.
For example, I have 6 servers with a slew of FTP accounts, all set up differently. Some log into an FTP root containing httpdocs, httpsdocs, public_html, error_docs, sub_domains and folders like that, and some log in directly to the httpdocs folder where the index file is. I only want to back up the main working web files, not all the other stuff that may be in there.
I've set up a way to define the working directory, but that means maintaining a different script for each server or backup I want to do.
Is it possible to have the PHP script find or work out the main web directory?
One option would be to set up a database that holds either the directory to use, or nothing if the FTP account logs in directly to that directory, but I'm going for automation here.
If it's not possible I'll go with the database option though.
You cannot figure out through FTP alone what document root is configured in Apache - unless you fetch httpd.conf and parse it, which I'm fairly sure you don't want to do. Presumably you are looping over multiple servers to do this backup with the same script?
If so, just define everything in an array, and loop it with a foreach and all the relevant data will be available in each iteration.
So I would do something like this:
// This will hold all our configuration
$serverData = array();
// First server
$serverData['server1']['ftp_address'] = 'ftp://11.22.33.44/';
$serverData['server1']['ftp_username'] = 'admin';
$serverData['server1']['ftp_password'] = 'password';
$serverData['server1']['root_dir'] = 'myuser/public_html';
// Second server
$serverData['server2']['ftp_address'] = 'ftp://11.22.22.11/';
$serverData['server2']['ftp_username'] = 'root';
$serverData['server2']['ftp_password'] = 'hackmeplease';
$serverData['server2']['root_dir'] = 'myuser/public_html';
// ...and so on
// Of course, you could also query a database to populate the $serverData array
foreach ($serverData as $server) {
// Process each server - all the data is available in $server['ftp_address'], $server['root_dir'] etc etc
}
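To make that loop concrete, here is a minimal sketch of what each iteration might look like. The ftp_* calls are standard PHP; the actual backup logic is whatever your existing script already does:
<?php
foreach ($serverData as $name => $server) {
    // Connect and log in with this server's details
    $conn = ftp_connect(parse_url($server['ftp_address'], PHP_URL_HOST));
    if (!$conn || !ftp_login($conn, $server['ftp_username'], $server['ftp_password'])) {
        echo "Could not connect to $name" . PHP_EOL;
        continue;
    }
    // Move straight into the configured web root, then back up from there
    ftp_chdir($conn, $server['root_dir']);
    // ... your existing recursive download/backup code goes here ...
    ftp_close($conn);
}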
No, you can't do it reliably without knowledge of how Apache is set up for each of those domains. You'd be better off with the database/config file route: a one-time setup cost plus a teensy bit of maintenance as sites are added/modified/removed.
You'll probably spend days getting a detector script going, and it'll fail the next time some unknown configuration comes up. Attempting to create an AI is hard... you have to get it to the Artificial Stupidity level first (e.g. the MS Paperclip).
Ok, so I've created a project where a client can drag and drop files onto our server, and it all works great! Now I've been asked to have the files that are being uploaded/transferred by our clients go over a specific port range (let's say between 10000 and 11000 for argument's sake). I do not know how to accomplish this. My current upload function looks something like this:
File's name: test/upload.php
$dir = "path/to/directory/";
$tempFile = $_FILES['file']['tmp_name'];
$targetFile = $dir . $_FILES['file']['name'];
move_uploaded_file($tempFile, $targetFile);
Where $_FILES['file'] is the file being uploaded.
I have removed a lot of code to give a simplistic idea of what my code is currently doing; the full version works fine at the moment.
If any configuration changes to PHP are to be made, they need to target this directory specifically, as the rest of our website needs to stay on the current port. I am not exactly sure where to begin with specifying the ports to be used for file transfers. The file transfers are purely client to server and will never be vice versa. We do have an FTP server set up; however, if possible, we'd like to stay off of it. I am not sure if what I am asking is even possible.
I am using the Dropzone.js plugin (from here: http://www.dropzonejs.com/), however all the PHP code is mine.
I am not sure if something like the code below (from here) is the way to go; I've never used the fsockopen function before.
$fp = fsockopen($host, $port, $errno, $errstr, $timeout);
$responding = 1;
if (!$fp) {
    $responding = 0; // connection failed
} else {
    fclose($fp);     // only close the handle if the connection succeeded
}
All answers are welcome. Thank you.
Since you want the client to upload the file on a different port, you will need to stand up a web server on that port. You could tell your current server to listen on that port, but that would do nothing to reduce load on your main website, so a separate machine is necessary. The new machine will have to be set up to listen only on the file-upload port, and will need to run your server (Apache, etc.) and PHP, and have access to the network storage location for your files.
If you have one, you may need to configure your firewall so traffic comes in to the correct machine depending on which port it is sent to.
The actual PHP code you use will not really be any different from what you have working now. Your JS code will need to be updated so it posts to the server using port 10000 or whatever you choose.
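As an illustration, the handler on the upload host could even verify which port the request came in on before accepting the file. This is only a sketch - the port number and the file field name are assumptions carried over from the question:
<?php
// Hypothetical handler on the dedicated upload host (assumed to listen on port 10000)
if ((int) $_SERVER['SERVER_PORT'] !== 10000) {
    http_response_code(403); // refuse uploads arriving on any other port
    exit;
}
$dir = "path/to/directory/";
$targetFile = $dir . basename($_FILES['file']['name']); // basename() guards against path tricks
move_uploaded_file($_FILES['file']['tmp_name'], $targetFile);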
In the past I have received a lot of help from the SO community, so once I figured this out, I thought here's my opportunity to give back a little. Hopefully it helps someone.
The issue I was faced with: my core site is built on WordPress, with a second database for an e-commerce section of the site, and I wanted to back up the entire site (all files, both databases, etc.) to Dropbox on a daily basis.
After a lengthy search, I couldn't find anything that did exactly what I was looking for.
Disclaimer: You don't need to be running WordPress or an e-commerce site for this to work. It will work on any MySQL database(s) and requires PHP.
I came across the WordPress Backup to Dropbox plugin, which got me about 90% there. The plugin allows me to back up all the files on the site, plus it does a WordPress database backup at a frequency you schedule.
The problem is that the plugin only backs up the WordPress database, not my e-commerce database.
I also found a MySQL backup to Dropbox tutorial (credit where it's due), which some of the code below is based on. It is a great tutorial, but I wanted to back up and delete the backup at different times - the tutorial backed up and deleted all at the same time.
The solution I came up with is not specific to WordPress or an e-commerce site. Anyone who has a MySQL database and can run PHP should be able to benefit from this. Perhaps with a few tweaks to my answer, but still they should be able to accomplish the end result.
To store a backup of the e-commerce database, I created a folder in my site's root directory (/temp - call it whatever you want). Then I had to actually create the database backup. Open up a text editor and create a file called backup_dropbox.php.
backup_dropbox.php
<?php
// location of your /temp directory relative to this file. In my case this file is in the same directory.
$tempDir = "";
// username for e-commerce MySQL DB
$user = "ecom_user";
// password for e-commerce MySQL DB
$password = "ecomDBpa$$word";
// e-commerce DB name to backup
$dbName = "ecom_db_name";
// e-commerce DB hostname
$dbHost = "localhost";
// e-commerce backup file prefix
$dbPrefix = "db_ecom";
// create backup sql file
$sqlFile = $tempDir.$dbPrefix.".sql";
$createBackup = "mysqldump -h ".$dbHost." -u ".$user." --password='".$password."' ".$dbName." > ".$sqlFile;
exec($createBackup);
//to backup multiple databases, copy all of the above code for each DB, rename the variables to something unique, and set their values to whatever is appropriate for the different databases.
?>
Now this script should create a backup of the database "ecom_db_name" whenever it is run. To get it to run on a scheduled interval (I want it to run a couple of minutes before my WordPress backup starts at 7am), you can either use WP-Cron (if your site gets enough traffic to reliably trigger it at the right time) or schedule a cron job.
I am no expert on cron jobs and these types of commands, so there may be a better way. I have used this on two different sites and run them two different ways. Play around with what works best for you.
The first way is on a directory that is not password protected, the second is for a password protected directory. (Replace username and Password with your username and password, and obviously set example.com/temp/backup_dropbox.php to wherever the file resides on your server).
Cron Job to run backup_dropbox.php 5 minutes before WP backup
55 6 * * * php /home/webhostusername/public_html/temp/backup_dropbox.php
OR
55 6 * * * wget -q -O /dev/null http://username:Password@example.com/temp/backup_dropbox.php
Now the cron job is set up to run backup_dropbox.php and create my database backup every day at 6:55am. The WordPress to Dropbox backup that starts at 7am usually takes about 5-6 minutes, but could take a little longer.
I want to delete my .sql backup files after they have successfully been backed up to Dropbox, so they aren't sitting out there forever for someone to somehow open/download the database file.
Fire up the text editor again, and create another file called clr_bkup.php.
clr_bkup.php
<?php
$tmpDir = "";
//delete the database backup file
unlink($tmpDir.'db_ecom.sql');
// if you had multiple DB backup files to remove just copy the line above for each backup, and replace 'db_ecom.sql' with your DB backup file name
?>
Since the WordPress backup takes a few minutes to finish up, I want to run a cron job to execute clr_bkup.php at 10 past 7, which should give it enough time. Again, the first cron job below is for an unprotected directory, and the second for a password protected directory.
Cron Job to run clr_bkup.php 10 minutes after WP backup starts
10 7 * * * php /home/webhostusername/public_html/temp/clr_bkup.php
OR
10 7 * * * wget -q -O /dev/null http://username:Password@example.com/temp/clr_bkup.php
Sequence of events
To help wrap your head around what's going on, here's the timeline:
6:55am: Cron Job is scheduled to run backup_dropbox.php, which creates a backup file of my database.
7:00am: WordPress Backup to Dropbox runs, and backs up all files that have changed since the last backup, which includes my 5 minute old, newly created database backup.
7:10am: By now the Dropbox backup has finished up, so the Cron Job is scheduled to run clr_bkup.php, which removes the backup file from the server.
Variables, Notes, and Misc. Info
Timing
The first thing that hung me up was getting the timing right. For simplicity, I used the times in the example above as if everything was happening in the same time zone. In reality, my web host's server is in the US West Coast, while my WordPress timezone is set to the US East Coast (a 3 hour difference). My actual cron jobs are set to run 3 hours earlier (server time) than what is displayed above. This will be different for everyone. The best bet is to know the time difference up front.
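One way to take the guesswork out of this is to pin the timezone inside the script itself, so date() behaves the same no matter where the server lives. This is just a suggestion on top of the setup above; use whatever timezone identifier applies to you:
<?php
// Force all date()/time() formatting in this script to US East Coast time,
// regardless of the server's own timezone setting.
date_default_timezone_set('America/New_York');
echo date('H:i'); // prints East Coast time even on a West Coast server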
Run Backup with a Time Check
In the directory that is not password protected, I wanted to keep the backup_dropbox.php script from running at any other time of day than 6:55am (by someone visiting it in a browser at 10am, for example). I included a time check at the beginning of the backup_dropbox.php file, which checks whether it is 6:55am and refuses to execute the rest of the code otherwise. I modified backup_dropbox.php to:
<?php
$now = time();
$hm = date('H:i', $now); // 24-hour format, so 6:55pm doesn't match as well
if ($hm != '06:55') {
echo "error message";
} else {
// DB BACKUP code from above goes here
}
?>
I suppose you could also add this to the clr_bkup.php file to only let it delete the backup files at 7:10am, but I didn't really see the need since the only time clr_bkup.php will do anything is between 6:55-7:10am anyhow. Up to you though if you decide to go that route.
Not on WordPress?
There are a number of free and paid services that will backup your website either to Dropbox or another similar service like Google Drive, Amazon S3, Box, etc., or some will store the files on their servers for a fee.
Backup Machine, Codeguard, Dropmysite, Backup Box, or Mover to name a few.
Want Redundant Offsite Backups?
There are plenty of services that will allow you to automatically create remote redundant backups on any of the cloud storage sites listed above.
For example if you backup your site to Dropbox, you can use a service called If This Then That (IFTTT) to automatically add files uploaded to a particular Dropbox folder to Google Drive. That way should Dropbox ever have an issue with their servers, you'll also have a Google Drive backup. Backup Box listed above could also do something like this.
Hope this helps
There may be a better way of doing all of this. I was in a pinch and needed to figure something out that works reliably, which this does. If there are any improvements that can be made, please share in the comments.
I think this post explains a solution which can help you:
http://ericsilva.org/2012/07/05/backup-mysql-database-to-dropbox/
For security reasons, there is a certain file on my web server I want to be able to monitor access to. Every time it is accessed, I want to have an entry added to a MySQL log table. This way, I can actively respond to security breaches from within the web application.
The Apache HTTP Server provides logging capabilities.
The server access log records all requests processed by the server. The location and content of the access log are controlled by the CustomLog directive. The LogFormat directive can be used to simplify the selection of the contents of the logs. This section describes how to configure the server to record information in the access log.
It can be used to write the log to a file. If you need the entries in a MySQL table, run a cron job to import the log file into the database; a sketch follows the link below.
Further information on logs is here:
http://httpd.apache.org/docs/1.3/logs.html#accesslog
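Here is a minimal sketch of such an import job. It assumes the default combined log format, filters for the one file you care about, and inserts into a pre-existing access_log table - the paths, credentials, and table layout are all placeholders to adapt:
<?php
// Hypothetical importer: scan the Apache access log for hits on one file
// and record them in MySQL via PDO.
$logFile = '/var/log/apache2/access.log';  // placeholder path
$watched = '/protected/importantfile.jpg'; // the file you monitor
$pdo = new PDO('mysql:host=localhost;dbname=mydb', 'dbuser', 'dbpass');
$stmt = $pdo->prepare('INSERT INTO access_log (ip, line) VALUES (?, ?)');
foreach (file($logFile) as $line) {
    if (strpos($line, $watched) === false) {
        continue; // not a request for the monitored file
    }
    $ip = strtok($line, ' '); // first field of the combined format is the client IP
    $stmt->execute(array($ip, rtrim($line)));
}
In practice you would also remember how far into the log you got (or rotate the log between runs) so repeated imports don't insert the same lines twice.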
It's been removed from PHP 7, but for anyone else who finds this post, there are a number of options within the FAM (now PECL) extension. This function http://php.net/manual/en/function.fam-monitor-file.php seems to describe what is needed here.
Additionally, you can access a lot of detail about the file's status with http://php.net/manual/en/function.stat.php. Put this within a cron or sleep-driven script and you can then see when it's changed.
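A minimal sleep-driven watcher along those lines - the path and polling interval are placeholders:
<?php
// Poll the file's access time once a minute and report any change.
$path = '/path/to/watched/file.jpg'; // placeholder path
$last = 0;
while (true) {
    clearstatcache(); // stat() results are cached, so force a fresh read
    $info = stat($path);
    if ($info !== false && $info['atime'] !== $last) {
        echo date('c') . " - file accessed" . PHP_EOL;
        $last = $info['atime'];
    }
    sleep(60);
}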
The file may be accessed from three points:
Direct filesystem access
A call to a URL like www.example.com/importantfile.jpg (Apache serves the file)
A call to some PHP script on your server, www.example.com/readfile.php?name=important.jpg, which reads the file
If you are concerned only about case 2 then check the solution of Rishi Dua.
But if you want more than that then you should write a script with fileatime() call and then add it to cron to run every minute for example.
The pseudocode for it, fleshed out here with a text file remembering the last access time (log_access_to_db() is still a placeholder for your own DB insert):
<?php
$stateFile = 'last_access.txt'; // where the previous access time is remembered
$previous_access_time = (int) @file_get_contents($stateFile);
$current_access_time = fileatime('path/to/very_important_file.jpg');
if ($current_access_time !== false && $previous_access_time != $current_access_time) {
    log_access_to_db(); // placeholder: your own logging goes here
    file_put_contents($stateFile, $current_access_time); // remember the new access time
}
This solution, however, has some problems.
First, you only get the access time, not the user ID or IP of whoever accessed the file.
Second, as the manual says, some Unix systems do not update the access time, in which case the solution would fail.
If you are seriously concerned about security, then I think you have to look at a dedicated filesystem audit utility.
$tmpUploadFolder = "C:\\www\\intranet\\uploads";
//$finalUploadFolder = "file:////server//photos//overwrite";
$finalUploadFolder = "file://server/photos/overwrite";
//$finalUploadFolder = "\\\\server\\photos\\overwrite";
//$finalUploadFolder = "\\server\photos\overwrite";
//$finalUploadFolder = "P:\\overwrite";
//$finalUploadFolder = "P:/overwrite";
$from = $tmpUploadFolder . "\\" . $_REQUEST['ext'];
$to = $finalUploadFolder. "\\" . $_REQUEST['ext'];
copy($from, $to);
I am trying to do a PHP upload using a jQuery tool. The jQuery tool nicely places the file into the PHP upload dir before the page submit, so I want to (upon post of the form) quickly move the file from its tmp folder location (it'll already be there, you see) to its final destination on an image store server. I use the $_REQUEST['ext'] variable to hold the filename jQuery held.
Rest assured these paths are good; they work lovely in DOS. As you can see, I have tried every UNC syntax I know.
I cannot for the life of me get PHP to work. I have written a VBS "copy file" script and tried to trigger it under whost.exe via system() in PHP; I've downloaded the old-school runas.exe and tried to get it to copy via system(); I have used UNC paths, network shares, and mapped network drives; I have made the Apache service "log on as" administrator and even made a custom ad-hoc user just for this and given it full permissions.
It works fine if I change P:\ to C:\
I know it's effective permissions re: Apache - but we do not run Active Directory and I can't get it to work.
It simply will not let me copy this file onto the network share, and this is a major, major, MAJOR problem child for me.
Is there a solution? If you are going to help me with things like "it's file permissions" then I am going to need a breakdown of exact and careful instructions, because I am pulling my hair out; I know it's file permission rights, but I just can't get it to work.
I am tired now... please help?
OK, I figured it out, so for the benefit of those coming after me, here is the solution that works:
1. Make sure the Windows "Apache2.2" service is running as an administrator user. I made a user called apacheusr, gave it a password, and put it into the local Administrators group. You do this by right-clicking the "Apache2.2" service in Administrative Tools -> Services, choosing Properties, going to the Log On tab -> This account, and picking apacheusr.
2. Because I don't run Active Directory, I made this apacheusr user on BOTH machines (the PHP server and the image server) as a local administrator, gave them BOTH the same username and password, and ticked "password never expires".
3. I then logged in/out at least once on Windows with both of these accounts. (Don't ask me why, but it seemed to help; it stopped runas.exe - which I had given up on - moaning in DOS.)
4. Finally, share the destination folder on the image server and make damn sure this apacheusr can get into that folder. The simplest check is to log in as apacheusr on your PHP server and try to open the image server folder; then, on the image server, tick everything correctly in the share/permissions settings.
Then the final bit is (where $_REQUEST['ext'] is a file name, e.g. "pic.jpg"):
$tmpUploadFolder = "C:\\www\\intranet\\uploads";
$finalUploadFolder = "\\\\server\\photos\\overwrite";
$from = $tmpUploadFolder . "\\" . $_REQUEST['ext'];
$to = $finalUploadFolder. "\\" . $_REQUEST['ext'];
copy($from, $to);
The above code works!
In what environment do you run PHP? Apache? IIS? These usually run as a service with system credentials and cannot access network shares...
Change the web server account to a user that can write to the share and it should work (with one of those URLs, at least).
What I would like to script: a PHP script to find a certain string in loads of files.
Is it possible to read the contents of thousands of text files on another FTP server without actually downloading those files (ftp_get)?
If not, would downloading them once - skipping files that already exist locally and re-downloading when the file size differs - and then searching for the string be the easiest option?
If URL fopen wrappers are enabled, then file_get_contents can do the trick and you do not need to save the file on your server.
<?php
$find = 'mytext'; // text to find
$files = array('http://example.com/file1.txt', 'http://example.com/file2.txt'); // source files
foreach ($files as $file) {
    $data = file_get_contents($file);
    if (strpos($data, $find) !== false) {
        echo "found in $file" . PHP_EOL;
    }
}
?>
[EDIT]: If the files are accessible only by FTP:
In that case, you can use ftp:// URLs in the same way:
$files = array('ftp://user:pass@domain.com/path/to/file', 'ftp://user:pass@domain.com/path/to/file2');
If you are going to store the files after you download them, then you may be better served to just download or update all of the files, then search through them for the string.
The best approach depends on how you will use it.
If you are going to be deleting the files after you have searched them, then you may also want to keep track of which ones you searched, along with their file date information, so that later, when you go to search again, you won't waste time searching files that haven't changed since the last time you checked them.
When you are dealing with so many files, try to cache any information that will help your program be more efficient the next time it runs; for example, the sketch below skips files whose size hasn't changed.
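A rough sketch of that caching idea, assuming URL wrappers are enabled; the local cache file and the use of filesize() through the ftp:// wrapper are assumptions to adapt:
<?php
// Skip files whose size hasn't changed since the last run.
// Sizes of previously seen files are cached in a local JSON file.
$cacheFile = 'seen_sizes.json'; // hypothetical cache location
$seen = file_exists($cacheFile) ? json_decode(file_get_contents($cacheFile), true) : array();
$find = 'mytext';
$files = array('ftp://user:pass@domain.com/path/to/file'); // same list as above
foreach ($files as $file) {
    $size = filesize($file); // stat through the ftp:// wrapper
    if (isset($seen[$file]) && $seen[$file] === $size) {
        continue; // unchanged since the last run, skip the download
    }
    $data = file_get_contents($file);
    if (strpos($data, $find) !== false) {
        echo "found in $file" . PHP_EOL;
    }
    $seen[$file] = $size;
}
file_put_contents($cacheFile, json_encode($seen));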
PHP's built-in file reading functions, such as fopen()/fread()/fclose() and file_get_contents() do support FTP URLs, like this:
<?php
$data = file_get_contents('ftp://user:password@ftp.example.com/dir/file');
// The file's contents are stored in the $data variable
If you need to get a list of the files in the directory, you might want to check out opendir(), readdir() and closedir(), which I'm pretty sure support FTP URLs.
An example:
<?php
$dir = opendir('ftp://user:password@ftp.example.com/dir/');
if(!$dir)
die;
while(($file = readdir($dir)) !== false)
echo htmlspecialchars($file).'<br />';
closedir($dir);
If you can connect via SSH to that server, and if you can install new PECL (and PEAR) modules, then you might consider using PHP SSH2. Here's a good tutorial on how to install and use it. This is a better alternative to FTP. But if that is not possible, your only solution is file_get_contents('ftp://domain/path/to/remote/file');.
** UPDATE **
Here is a PHP-only implementation of an SSH client: SSH in PHP.
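To illustrate why SSH is the better alternative here: you can run the search on the remote machine and only pull back the matching file names. A sketch using the PECL ssh2 functions - host, credentials, search string and path are placeholders:
<?php
// Connect, authenticate, and let the remote grep do the heavy lifting.
$conn = ssh2_connect('ftp.example.com', 22);
ssh2_auth_password($conn, 'user', 'password');
// -r: recurse into directories, -l: print only names of files that match
$stream = ssh2_exec($conn, "grep -rl 'mytext' /path/to/dir");
stream_set_blocking($stream, true); // wait for the command to finish
echo stream_get_contents($stream);  // one matching filename per line
fclose($stream);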
With FTP you'll always have to download the files to check them.
I do not know what kind of bandwidth you have or how big the files are, but this might be an interesting use case for running the job from the cloud, like Amazon EC2 or Google App Engine (if you can download the files within the time limit).
In the EC2 case you then spin up the server for an hour to check for updates in the files and shut it down again afterwards. This will cost a couple of bucks per month and save you from potentially having to upgrade your line or hosting contract.
If this is a regular task, then it might be worth using a simple queue system so you can run multiple processes at once (this will hugely increase speed). This would involve a few steps:
Get a list of all files on the remote server
Put the list into a queue (you can use memcached for a basic message queuing system)
Use a separate script to get the next item from the queue.
The processing script contains simple functionality in a do/while loop. Here it is fleshed out with the Memcached counter pattern; it assumes the listing step stored the files as queue_item_1, queue_item_2, ... and initialised queue_head to 0:
<?php
// Worker: atomically claim the next item number, then fetch and search that file.
$mc = new Memcached();
$mc->addServer('127.0.0.1', 11211);
do {
    $i = $mc->increment('queue_head'); // atomic across workers, so no item is processed twice
    $file = $mc->get("queue_item_$i"); // an ftp:// URL stored by the listing step
    if ($file === false) break;        // queue exhausted
    $contents = file_get_contents($file);
    if (preg_match('/mytext/', $contents)) echo "found in $file" . PHP_EOL;
} while (true);
You could then, in theory, fork off multiple worker processes from the command line without needing to worry about race conditions.
This method is probably best suited to cron/batch processing, but it might work in this situation too.