Backing up uploaded documents on Linux - PHP

So I have a PHP application running on a Linux machine which uses a MySQL database. I have managed to back up my MySQL database every day by adding an entry to the crontab. In my application, clients are able to upload documents, which are saved in a directory inside the application folder, i.e. /myapp/uploaded_documents/, and I am looking at backing up this directory.
My question is: how do I back up a directory to a certain remote location at a certain time every day? Is it possible to also password protect this directory in my app folder?
Thank you

As mentioned in the previous answer, to back up periodically to a remote machine you can use rsync + ssh + crontab. First set up ssh so you can access the remote machine without a password, following (for the Ubuntu distro) https://help.ubuntu.com/community/SSH/OpenSSH/Keys. Then add an rsync job to crontab at the time and days you want (check man crontab to see how to do this), telling rsync to back up over ssh to the remote machine. Something like 0 2 * * * rsync -a -e ssh dir_to_bkp name@host:dir_where_bkp backs up "dir_to_bkp" to the "host" machine each day at 02:00, logging in as user "name" and writing to "dir_where_bkp" as the destination. The -e ssh option tells rsync to use ssh as the transport.
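For example, a minimal version of that setup might look like the following; the remote host "backup.example.com", the user "backupuser", and the destination path are placeholders, while the source path is the uploads directory from the question:

# One-time setup: create a key pair and copy the public key to the remote host
ssh-keygen -t ed25519 -N ""
ssh-copy-id backupuser@backup.example.com

# Crontab entry (edit with "crontab -e"): every day at 02:00, mirror the
# uploads directory to the remote host over ssh
0 2 * * * rsync -a -e ssh /myapp/uploaded_documents/ backupuser@backup.example.com:/backups/uploaded_documents/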

The best way is to use rsync, as you would (most likely) be uploading changes only.
http://linux.die.net/man/1/rsync
Additionally, you can create incremental backups:
http://www.mikerubel.org/computers/rsync_snapshots/
So my suggested solution would be rsync + crontab.
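As a rough sketch of the incremental variant, using rsync's --link-dest option so files unchanged since the previous day's snapshot are hard-linked instead of copied again (the host, user, and paths are placeholders):

#!/bin/bash
# Daily snapshot backup: each run writes to a dated directory; files that have
# not changed since yesterday are hard-linked, so only changes consume space.
SRC=/myapp/uploaded_documents/
DEST=backupuser@backup.example.com:/backups/snapshots
TODAY=$(date +%F)
YESTERDAY=$(date -d yesterday +%F)

rsync -a -e ssh --link-dest="../$YESTERDAY" "$SRC" "$DEST/$TODAY/"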

Related

Eclipse auto-copy via SSH from one server to another upon save, PHP

I'm writing a PHP app, and our team has our servers set up so that I have to copy code onto a second server every time I edit a file. Naturally this is against the DRY concept, and simply a monotonous chore.
When I make a change to the file, and go to refresh the webpage to see the changes, both of these servers have to be updated, or else there is a chance that I will be seeing the old version on the server that hasn't been updated yet. So I'm copying/pasting/saving hundreds of times a day while I develop this app!
I'm using Eclipse Luna 4.4.1. In the Remote Systems Explorer, the system type is "SSH Only." This is the same for server 2.
I would like there to be some kind of functionality where each time I save sample.php on server 1, the same file sample.php will be automatically updated and saved with the same code on server 2. I don't want to have to do anything else other than just save the file on server 1; after doing that I'd like for the file on server 2 to be automatically updated.
Both servers are running nginx.
I've been reading a lot about this here on SE, but many of the question/answers are 5+ years old, and I wonder if there is an optimal way to do this today. I don't have much experience with Java, but I'm happy to do this any way that will work. Any plugin or other way would be great.
Take a look at running rsync over ssh, and put it on a cron to run the update on a periodic interval as you desire.
Here is an example of an rsync process I run to copy database backups (type / contents of the file are mostly irrelevant here):
rsync -ar -e "ssh -l <username>" <local_path_of_files_to_watch> hostname:<target_path_where_files_land>
This is what my command looks like to rsync databases from a remote system that monitors radio traffic and records decoded messages to a daily log file:
rsync -av --remove-source-files --progress --stats -e "ssh -l stratdb" 192.168.0.23:/var/log/stratux.sqlite.*.gz /Users/stratdb/sqlite-dbs/.
If you want this process to run without having to define a password, you'll want to set up .ssh with shared keys to allow passwordless login from server 1 -> server 2 for this process (ideally done by a user that has limited capabilities on both servers).
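A minimal sketch of that key setup, run on server 1 as the user the cron job will use (the user and host names here are placeholders):

# Generate a key pair (accept the default location; empty passphrase for unattended use)
ssh-keygen -t ed25519 -N ""

# Append the public key to ~/.ssh/authorized_keys on server 2
ssh-copy-id deploy@server2.example.com

# Confirm that no password prompt appears
ssh deploy@server2.example.com 'echo ok'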
There are some continuous deployment tools you could use that would do builds, copies, and the like (e.g. Jenkins) if you need a really robust and highly configurable solution... but if you just need to brute-force copy files from one server to another with a single easy command (ad hoc) or automatically, I'd investigate rsync.

Web Server Interrupt Driven File Transfer

I have a webpage that currently takes an upload from a user and stores it in a directory (/upload). [Linux based server]
Instead of storing the file on the server in that directory, I am looking for a way to transfer it onto a local machine. [Running Ubuntu 12.04]
Assuming I already have public/private keys setup how might I go about doing this?
Current Ideas:
ftp transfer
rsync
Ideas:
1) Stop running anything on the server, and forward every byte to your local box. Just run ssh -N -R :8080:localhost:3000 remote.host.com. This will allow anyone to hit http://remote.host.com:8080 and get your port 3000. (If you do port 80, you'll need to SSH in as root.) Performance will be kinda bad, and it won't be that reliable. But might be fine for real-time transfer where you're both online at once.
2) Use inotifywait to watch the upload dir on the server, and trigger rsync from the server to your local box; see the sketch after this list. (Requires exposing the SSH port of your box to the world.) If you sometimes delete files, use unison bidirectional file sync instead. (Although unison doesn't work on long filenames or with lots of files.)
3) Leave the system as-is, and just run rsync from cron on your local box. (Ok, not realtime.)
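A rough sketch of option 2, assuming the inotify-tools package is installed on the server and the local machine is reachable over ssh as "me@my-local-box" (both of those, and the paths, are placeholders):

#!/bin/bash
# Watch the upload directory and push each finished upload to the local machine.
WATCH_DIR=/upload
DEST=me@my-local-box:/home/me/incoming/

inotifywait -m -e close_write -e moved_to --format '%w%f' "$WATCH_DIR" |
while read -r file; do
    rsync -a -e ssh "$file" "$DEST"
done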
Of course, most people just use Dropbox or similar.

MySQL database back-up (Linux server to client machine) automatically

How can we store a MySQL database back-up from a Linux server to a client Windows machine automatically at a fixed interval?
You cannot "guarantee" that the backup will be saved in a fixed amount of time. If you set up a share to a folder on the Windows machine, you can have a cron job back up the database and save it to the share at a specific start time. But how long it takes to create and copy/move the backup to the share depends on the size of the backup.
I wrote these scripts to back up a MySQL database.
http://codingsnippets.com/category/mysql/
They'll have to be called from a cron job to do it regularly.
http://adminschoice.com/crontab-quick-reference
If you want to create a Windows share on the Windows box, you can mount it from Linux with a command like this:
mount -t smbfs -o username=bjones //192.168.1.100/"Storage (H)" /mnt/windows-storage/
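Putting the pieces together, a hedged sketch of a nightly job; the database name, credentials, mount point, and schedule below are placeholders:

#!/bin/bash
# backup-db.sh: dump the database, compress it, and copy it to the mounted Windows share.
DB=mydatabase
MOUNT=/mnt/windows-storage
STAMP=$(date +%F)

mysqldump --user=backup --password=changeme "$DB" | gzip > "/tmp/$DB-$STAMP.sql.gz"
cp "/tmp/$DB-$STAMP.sql.gz" "$MOUNT/"

# Crontab entry to run it at 01:30 every night:
# 30 1 * * * /usr/local/bin/backup-db.sh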
Good luck

Automatically backing up a file to another server?

I have a backup created of my MySQL database (vBulletin forum v3.8) every day. It's about 360 MB in size. It's stored as one text file in a secure folder.
I'm thinking of getting another server, through a different host, and somehow automatically transferring the backup to my second server every day.
Any ideas on how I could automate this process? I'm thinking PHP and a cron job.
Cron, definitely. PHP if you like it, but using bash with mysqldump combined with gzip works wonders.
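For instance, a minimal dump-compress-ship pipeline along those lines (the database name, user, remote host, and destination path are placeholders), which cron could run once a day:

#!/bin/bash
# Dump the forum database, compress it, and stream it straight to the second server.
mysqldump --user=backup --password=changeme vbulletin | gzip \
    | ssh backup@server2.example.com "cat > /backups/vbulletin-$(date +%F).sql.gz"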
Schedule rsync to transfer the files (over ssh) with cron (if you're on Linux).
Cron + rsync may be your best bet. If the file is text as you say and the changes are "diff"able then rsync can be used to transfer only the updates to that file. For example, the crontab could look something like this:
20 4 * * * rsync -a --delete source/ username@remotemachine.com:/path/to/destination/
This will sync the remote machine once a day deleting any files in the remote copy that no longer exist on the source machine.
As a note, I just read the question again and noticed this is a MySQL backup, so the output of the dump could eventually contain binary data, and in that case you probably want to just use a replication server or copy the whole file each day. rsync could be used for the copy as well...
A slightly different approach, and technically not a backup solution, but you might want to consider running MySQL in replication mode and replicating changes live. That way, if the worst happens, you will be up to date with the data and not a day behind.

What's the best way to transport a file to a different server?

I'm running a file host that's grown beyond the capacity of a single server, and I need to implement multiple-server storage of files. I'd like to do this as cheaply as possible, so those fancy mass storage methods are out of the question. I simply want to move a file that's been uploaded by the user to the "gateway" server, which hosts all the HTTP and MySQL, to one of the media servers. It can either be done at the end of the user's request, or via cron every couple of minutes.
At this point the only method I'm truly familiar with is using the ftp_put PHP function and simply FTPing the file to a different server, but I've had problems with this method in the past, especially with larger files, and a lot of the files that will be getting transferred will be over 100 MB.
Can anyone suggest a good solution for this? Preferably I'm looking for a purely software solution... hopefully nothing more than a PHP/bash script.
One method that requires no programmatic setup, and is quite secure, is to mount a folder from the other server and use PHP to save to that directory as you would normally.
You don't have to use sshfs; there are a bunch of other ways to achieve the same thing. I would use sshfs in this situation as it works over ssh, and is therefore secure and relatively easy to set up.
To achieve this (using sshfs):
Install sshfs
Mount an sshfs filesystem at a folder accessible to PHP
Use a PHP script to store the files within the sshfs mount, and therefore on the other server.
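A rough sketch of that approach; the package manager, mount point, remote host, and paths below are placeholder examples:

# Install sshfs and mount a directory from the media server
sudo apt-get install sshfs
sudo mkdir -p /mnt/media-server
sshfs mediauser@media.example.com:/var/www/files /mnt/media-server -o allow_other

# PHP can then write uploads into /mnt/media-server with move_uploaded_file(),
# exactly as it would for a local directory.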
Another method is to set up an rsync command between the two using crontab.
You can write a bash script that runs from cron and uses a command-line tool like scp, sftp, or rsync.
Example:
[bash]# scp filename alice@galaxy.example.com:/home/alice
This copies "filename" into alice's home directory on the remote galaxy.example.com server.
Use cron, a bash script, and rsync.
cron: "Cron is a time-based job scheduler in Unix-like computer operating systems; its name comes from chronos, the Greek word for time."
bash: "Bash is a free software Unix shell written for the GNU Project."
rsync: "rsync is a software application for Unix systems which synchronizes files and directories from one location to another while minimizing data transfer using delta encoding when appropriate."
In summary, place the rsync command in a bash script and get cron to run it at regular intervals.
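Tied together, that could look something like this; the script path, source and destination, and schedule are placeholders:

#!/bin/bash
# /usr/local/bin/sync-backups.sh: push the backup directory to the remote server over ssh.
rsync -a -e ssh /var/backups/ backup@remote.example.com:/srv/backups/

# Crontab entry running the script every night at 03:00:
# 0 3 * * * /usr/local/bin/sync-backups.sh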
