Rclone - Back up your VPS to Google Drive

Previously I used another VPS to store backups, using the Duplicity tool or Rsync. However, there is now a more efficient and more economical (free) way: backing up to the cloud with Rclone.

Rclone is a data synchronization tool similar to Rsync, but focused on connecting to cloud storage services.

The advantages of cloud storage services are high speed (servers are located all over the world), data safety (no worries about hardware or network issues), and the fact that most of them are free. I especially like free stuff!

Rclone supports many popular cloud services such as:

  • Google Drive

  • Amazon S3

  • Swift Openstack / Rackspace cloud files / Memset Memstore

  • Dropbox

  • Google Cloud Storage

  • Amazon Drive

  • Microsoft OneDrive

  • Hubic

  • Backblaze B2

  • Yandex Disk

  • SFTP

  • The local filesystem


Now, instead of backing up to another VPS for storage, I have switched to Google Drive: 15GB of free storage, and buying more is pretty cheap, about 45,000 VND/month for 100GB. Those with a free Google Apps account are even better off.

This article has two main parts: first, installing Rclone on the VPS; second, using Rclone to upload backups to Google Drive. Other cloud services work the same way.

Creating a backup of all VPS data is covered in detail in the article Guide to automatically back up the entire VPS; this article focuses on automatically uploading the files to Google Drive. See the instructions for using Rclone with Google Drive and other cloud services at Rclone Docs.

The backup scenario is as follows:

  • Back up all MySQL databases, one .gz file per database

  • Back up all code in /home/domain.com/public_html/

  • Back up the entire Nginx configuration in the /etc/nginx/conf.d/ directory

  • Gather all the data into one folder

  • Upload the backup file to Google Drive at 2am

  • Automatically delete the backup files on the VPS after the upload completes, and delete backups on the cloud older than 2 weeks


Now let's get started.


I. Rclone Installation Guide


1. Install Rclone


Rclone is a command-line program, so we will download it and move the binary to the /usr/sbin/ directory of the VPS for later use.
Note: If you have an older version installed, just run the same commands below to update. Refer to the version Changelog.

- Install v1.42 for 64-bit Linux operating systems
cd /root/
wget https://downloads.rclone.org/v1.42/rclone-v1.42-linux-amd64.zip
unzip rclone-v*.zip
\cp rclone-v*-linux-amd64/rclone /usr/sbin/
rm -rf rclone-*

- Install v1.42 for 32-bit Linux operating systems
cd /root/
wget https://downloads.rclone.org/v1.42/rclone-v1.42-linux-386.zip
unzip rclone-v*.zip
\cp rclone-v*-linux-386/rclone /usr/sbin/
rm -rf rclone-*

Direct download links for each Rclone version are listed here.

Now you can run the rclone command to see usage information.
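You can also quickly confirm the installation and check the installed version:
rclone version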

2. Some common commands


Rclone invocations usually take the form:
rclone command <parameters> <parameters...>

where command is the subcommand and parameters are its arguments.

Some common commands when using Rclone:

  • rclone config - Configure connections to cloud services.

  • rclone copy - Copy files from the server to the cloud, skipping files that already exist.

  • rclone sync - Synchronize the server and the cloud, modifying only the cloud side.

  • rclone move - Move files from the server to the cloud.

  • rclone delete - Delete the files in a path.

  • rclone purge - Delete a path and all of its contents.

  • rclone mkdir - Create a directory.

  • rclone rmdir - Remove an empty directory at the path.

  • rclone rmdirs - Remove all empty directories under the path.

  • rclone check - Check whether the server and cloud data match.

  • rclone ls - List all files with size and path.

  • rclone lsd - List all directories.

  • rclone lsl - List all files with modification time, size and path.

  • rclone size - Return the total size of the path.


Details of each command can be found here.
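For example, a typical session copying a local folder to the cloud and then verifying it looks like this (the folder names are only illustrative; remote is the connection configured in the next section):
rclone copy /root/backup remote:backup
rclone check /root/backup remote:backup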

II. Backup VPS to Cloud with Rclone


1. Create a connection with Google Drive

We will first configure the Rclone connection to Google Drive; this only needs to be done once. The connection will be named remote.

Connect SSH to VPS and run the command:
# rclone config

You will see the message No remotes found - make a new one; enter n and press Enter to create a new connection.

At the name prompt, enter remote to name the connection (you can choose any name).

A list of cloud services appears, select No. 11, Google Drive, and press Enter.

Leave the next two prompts, Client ID and Client Secret, blank and press Enter.

For the scope that rclone should use when requesting access from Drive, select 1 (drive, full access).

Next, leave the ID of the root folder and the Service Account Credentials JSON file path blank as well.

When asked Use auto config?, type n and press Enter. Rclone will immediately give you a link; click it directly or copy and paste it into your browser.

[Screenshot: the authorization link generated when creating the Google Drive connection]

The interface looks like this:

[Screenshot: granting Rclone access to Google Drive]

Press the Allow button to agree, then you will receive the verification code as shown below:

[Screenshot: the verification code]

Go back to the SSH window and paste this code at the Enter verification code> prompt, then press Enter.

Answer n (no) to Configure this as a team drive?

Rclone will ask you to confirm the information; press y to agree and then q to exit the configuration interface.

The whole process looks like the following (the values after each > prompt are what you type):
# rclone config

No remotes found - make a new one
n) New remote
s) Set configuration password
q) Quit config
n/s/q> n
name> remote
Type of storage to configure.
Choose a number from below, or type in your own value
1 / Alias for a existing remote
\ "alias"
2 / Amazon Drive
\ "amazon cloud drive"
3 / Amazon S3 Compliant Storage Providers (AWS, Ceph, Dreamhost, IBM COS, Minio)
\ "s3"
4 / Backblaze B2
\ "b2"
5 / Box
\ "box"
6 / Cache a remote
\ "cache"
7 / Dropbox
\ "dropbox"
8 / Encrypt/Decrypt a remote
\ "crypt"
9 / FTP Connection
\ "ftp"
10 / Google Cloud Storage (this is not Google Drive)
\ "google cloud storage"
11 / Google Drive
\ "drive"
12 / Hubic
\ "hubic"
13 / Local Disk
\ "local"
14 / Mega
\ "mega"
15 / Microsoft Azure Blob Storage
\ "azureblob"
16 / Microsoft OneDrive
\ "onedrive"
17 / OpenDrive
\ "opendrive"
18 / Openstack Swift (Rackspace Cloud Files, Memset Memstore, OVH)
\ "swift"
19 / Pcloud
\ "pcloud"
20 / QingCloud Object Storage
\ "qingstor"
21 / SSH/SFTP Connection
\ "sftp"
22 / Webdav
\ "webdav"
23 / Yandex Disk
\ "yandex"
24 / http Connection
\ "http"
Storage> 11
Google Application Client Id - leave blank normally.
client_id>
Google Application Client Secret - leave blank normally.
client_secret>
Scope that rclone should use when requesting access from drive.
Choose a number from below, or type in your own value
1 / Full access all files, excluding Application Data Folder.
\ "drive"
2 / Read-only access to file metadata and file contents.
\ "drive.readonly"
/ Access to files created by rclone only.
3 | These are visible in the drive website.
| File authorization is revoked when the user deauthorizes the app.
\ "drive.file"
/ Allows read and write access to the Application Data folder.
4 | This is not visible in the drive website.
\ "drive.appfolder"
/ Allows read-only access to file metadata but
5 | does not allow any access to read or download file content.
\ "drive.metadata.readonly"
scope> 1
ID of the root folder - leave blank normally. Fill in to access "Computers" folders. (see docs).
root_folder_id>
Service Account Credentials JSON file path - leave blank normally.
Needed only if you want use SA instead of interactive login.
service_account_file>
Remote config
Use auto config?
* Say Y if not sure
* Say N if you are working on a remote or headless machine or Y didn't work
y) Yes
n) No
y/n> n
If your browser doesn't open automatically go to the following link: https://accounts.google.com/o/oauth2/auth?access_type=offlinexxxx
Log in and authorize rclone for access
Enter verification code> 4/AABw8gMKPxxxxxxxxxx
Configure this as a team drive?
y) Yes
n) No
y/n> n
--------------------
[remote]
type = drive
client_id =
client_secret =
scope = drive
root_folder_id =
service_account_file =
token = {"access_token":"xxx","token_type":"Bearer","refresh_token":"1/xxx","expiry":"2018-05-16T10:55:03.488381196+07:00"}
--------------------
y) Yes this is OK
e) Edit this remote
d) Delete this remote
y/e/d> y
Current remotes:

Name Type
==== ====
remote drive

e) Edit existing remote
n) New remote
d) Delete remote
r) Rename remote
c) Copy remote
s) Set configuration password
q) Quit config
e/n/d/r/c/s/q> q

That's it. Now you can test by listing the directories on the remote connection:
# rclone lsd remote:
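You can also run a small end-to-end test (the file and folder names here are hypothetical):
echo "hello" > /root/test.txt
rclone copy /root/test.txt remote:rclone-test
rclone ls remote:rclone-test
rclone purge remote:rclone-test #Remove the test folder afterwards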

2. Create a connection with Amazon Drive / OneDrive / Yandex

Currently, besides Google Drive, other clouds are becoming more popular and offer more incentives for users. Creating a connection to Amazon Drive / OneDrive / Yandex is similar to Google Drive, with only the access_token step differing.
Note: For OneDrive, you need to select the correct account type, Personal or Business.

Authentication is performed through a browser on a separate machine (for example, your personal computer) with the command rclone authorize "name_of_cloud".

A OneDrive-specific example, up to the authentication step:
Microsoft App Client Id - leave blank normally.
client_id>
Microsoft App Client Secret - leave blank normally.
client_secret>
Remote config
Choose OneDrive account type?
* Say b for a OneDrive business account
* Say p for a personal OneDrive account
b) Business
p) Personal
b/p> p
Use auto config?
* Say Y if not sure
* Say N if you are working on a remote or headless machine
y) Yes
n) No
y/n> n
For this to work, you will need rclone available on a machine that has a web browser available.
Execute the following on your machine:
rclone authorize "onedrive"

Run the rclone authorize "onedrive" command on your own computer to obtain the access_token:

  1. Download Rclone for your personal computer (Windows / OSX) at Rclone Downloads. For Windows AMD64 - 64 Bit this is rclone-v1.42-windows-amd64. Inside is a rclone.exe file to run from the command prompt, for example at the path D:\Rclone\rclone.exe

  2. Open the Windows CMD application (Run - CMD) and run the command above with that file path. For example: D:\Rclone\rclone.exe authorize "onedrive"

  3. A browser window pops up asking you to log in for authentication. On success, a Success message tells you to go back to rclone. Back in the CMD window, copy the access_token and paste it into the VPS for authentication.


C:\Users\HocVPS>D:\Rclone\rclone.exe authorize "onedrive"

Choose OneDrive account type?
* Say b for a OneDrive business account
* Say p for a personal OneDrive account
b) Business
p) Personal
b/p> p
If your browser doesn't open automatically go to the following link: http://127.0.0.1:53682/auth
Log in and authorize rclone for access
Waiting for code...
Got code
Paste the following into your remote machine --->
{"access_token":"EwD4Aq1DBAAUcSSzoTJJxxx","expiry":"2018-05-16T11:43:25.3184173+07:00""}
<---End paste

Note that the access_token is one continuous string, so after copying it from CMD, paste it into an editor such as EmEditor or Notepad++ first to remove any stray line-break characters.

3. Script to backup all VPS and upload to Cloud

In the guide to automatically backing up the entire VPS I shared a VPS backup script; in this article I tweak it a bit so that it automatically uploads to the cloud after creating the compressed files.
- The script works with Rclone version 1.35 or later.
- The script assumes the Rclone connection is named remote; if you used a different name, change it in the script.
- If you are not running HocVPS Script 2.0+, use Script 2 and adjust the MySQL login, source directory and Nginx conf directory to match your system.

This script is written for the folder structure of a server managed by HocVPS Script.

  • Create a backup.sh file in /root/


nano /root/backup.sh

- Copy the entire script below and paste it in (applies to HocVPS Script 2.0 or higher).
#!/bin/bash
# HocVPS Script Plugin - Backup Server and Upload to Cloud

SERVER_NAME=HOCVPS_BACKUP

TIMESTAMP=$(date +"%F")
BACKUP_DIR="/root/backup/$TIMESTAMP"
MYSQL=/usr/bin/mysql
MYSQLDUMP=/usr/bin/mysqldump
SECONDS=0

mkdir -p "$BACKUP_DIR/mysql"

echo "Starting Backup Database";
databases=`$MYSQL -e "SHOW DATABASES;" | grep -Ev "(Database|information_schema|performance_schema|mysql)"`

for db in $databases; do
    $MYSQLDUMP --force --opt $db | gzip > "$BACKUP_DIR/mysql/$db.gz"
done
echo "Finished";
echo '';

echo "Starting Backup Website";
# Loop through /home directory
for D in /home/*; do
    if [ -d "${D}" ]; then # If a directory
        domain=${D##*/} # Domain name
        echo "- "$domain;
        zip -r $BACKUP_DIR/$domain.zip /home/$domain/public_html/ -q -x /home/$domain/public_html/wp-content/cache/**\* # Exclude cache
    fi
done
echo "Finished";
echo '';

echo "Starting Backup Nginx Configuration";
cp -r /etc/nginx/conf.d/ $BACKUP_DIR/nginx/
echo "Finished";
echo '';

size=$(du -sh $BACKUP_DIR | awk '{ print $1}')

echo "Starting Uploading Backup";
/usr/sbin/rclone move $BACKUP_DIR "remote:$SERVER_NAME/$TIMESTAMP" >> /var/log/rclone.log 2>&1
# Clean up
rm -rf $BACKUP_DIR
/usr/sbin/rclone -q --min-age 2w delete "remote:$SERVER_NAME" #Remove all backups older than 2 weeks
/usr/sbin/rclone -q --min-age 2w rmdirs "remote:$SERVER_NAME" #Remove all empty folders older than 2 weeks
/usr/sbin/rclone cleanup "remote:" #Cleanup Trash
echo "Finished";
echo '';

duration=$SECONDS
echo "Total $size, $(($duration / 60)) minutes and $(($duration % 60)) seconds elapsed."

- Script 2: for HocVPS Script version 1.8 or lower, or other administration systems
#!/bin/bash
# HocVPS Script Plugin - Backup Server and Upload to Cloud
# Version: 1.1

. /etc/hocvps/scripts.conf

SERVER_NAME=HOCVPS_BACKUP

TIMESTAMP=$(date +"%F")
BACKUP_DIR="/root/backup/$TIMESTAMP"
MYSQL_USER="root"
MYSQL=/usr/bin/mysql
MYSQL_PASSWORD=$mariadbpass
MYSQLDUMP=/usr/bin/mysqldump
SECONDS=0

mkdir -p "$BACKUP_DIR/mysql"

echo "Starting Backup Database";
databases=`$MYSQL --user=$MYSQL_USER -p$MYSQL_PASSWORD -e "SHOW DATABASES;" | grep -Ev "(Database|information_schema|performance_schema|mysql)"`

for db in $databases; do
    $MYSQLDUMP --force --opt --user=$MYSQL_USER -p$MYSQL_PASSWORD $db | gzip > "$BACKUP_DIR/mysql/$db.gz"
done
echo "Finished";
echo '';

echo "Starting Backup Website";
# Loop through /home directory
for D in /home/*; do
    if [ -d "${D}" ]; then # If a directory
        domain=${D##*/} # Domain name
        echo "- "$domain;
        zip -r $BACKUP_DIR/$domain.zip /home/$domain/public_html/ -q -x /home/$domain/public_html/wp-content/cache/**\* # Exclude cache
    fi
done
echo "Finished";
echo '';

echo "Starting Backup Nginx Configuration";
cp -r /etc/nginx/conf.d/ $BACKUP_DIR/nginx/
echo "Finished";
echo '';

size=$(du -sh $BACKUP_DIR | awk '{ print $1}')

echo "Starting Uploading Backup";
/usr/sbin/rclone move $BACKUP_DIR "remote:$SERVER_NAME/$TIMESTAMP" >> /var/log/rclone.log 2>&1
# Clean up
rm -rf $BACKUP_DIR
/usr/sbin/rclone -q --min-age 2w delete "remote:$SERVER_NAME" #Remove all backups older than 2 weeks
/usr/sbin/rclone -q --min-age 2w rmdirs "remote:$SERVER_NAME" #Remove all empty folders older than 2 weeks
/usr/sbin/rclone cleanup "remote:" #Cleanup Trash
echo "Finished";
echo '';

duration=$SECONDS
echo "Total $size, $(($duration / 60)) minutes and $(($duration % 60)) seconds elapsed."

Note:

  • The above script backs up all databases, compressing each into a .gz file stored in the mysql directory

  • Each directory containing the website is compressed into a .zip file

  • The entire Nginx configuration of the websites is stored in the nginx directory

  • SERVER_NAME defaults to HOCVPS_BACKUP; change this parameter to change the backup folder on the cloud.

  • To adjust how long backups are kept, modify the --min-age value in the delete and rmdirs commands, as in the sketch after this list.

  • Currently, files and folders older than 2 weeks are deleted automatically.
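For example, to keep one month of backups instead of two weeks, the two clean-up lines could be changed like this (a sketch; rclone durations accept suffixes such as d, w, M and y):
/usr/sbin/rclone -q --min-age 1M delete "remote:$SERVER_NAME" #Remove all backups older than 1 month
/usr/sbin/rclone -q --min-age 1M rmdirs "remote:$SERVER_NAME" #Remove the emptied date folders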


- Press Ctrl + O, Enter to save and Ctrl + X to exit.

- Grant execute permission to the script
chmod +x /root/backup.sh

- That's it; now you can test by running the command:
/root/backup.sh

Try checking the cloud for a new folder with backup data, or test with the command:
rclone lsl remote:HOCVPS_BACKUP

If everything went well, the result is a directory named after the current date, containing the website .zip files, the Nginx configuration (.conf files) and the databases (.gz files).

4. Create cronjob automatically backup daily

Now I will schedule the script to run automatically at 2:00 AM.
EDITOR=nano crontab -e

Paste the following line into the crontab editor:
0 2 * * * /root/backup.sh > /dev/null 2>&1

Press Ctrl + O, Enter to save and Ctrl + X to exit

That's it: at 2 AM every morning the script will run automatically, back up all VPS data and upload it to the cloud. Backup data on the VPS is deleted once the upload completes.
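If you prefer to keep a log of each run instead of discarding the output, use a line like this instead (the log path is just an example):
0 2 * * * /root/backup.sh >> /var/log/backup.log 2>&1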

See also cronjob manual.

III. Download the backup file from Cloud to VPS


The easiest way to restore your data is to download the backup from the cloud to your computer, and then upload it back to the VPS. However, if you want to download the backup directly to the VPS, you can use Rclone with the copy command.

Example:
rclone copy "remote:/HOCVPS_BACKUP/2017-11-01" /root/

The above command copies the 2017-11-01 folder inside the HOCVPS_BACKUP directory on the cloud to the /root/ directory on the VPS. Upload and download speeds to and from the cloud are very fast.
After copying the backup data to the VPS, extract the zip files, copy the web directories and the nginx configuration to the correct locations, and import the databases, for example along the lines below.
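A rough restore sketch (the domain and database names are illustrative; adjust them to your backup):
cd /root/2017-11-01 # the downloaded backup folder (path depends on where you copied it)
unzip domain.com.zip -d / # zip stored the paths without the leading slash, so this restores /home/domain.com/public_html/
\cp -r nginx/* /etc/nginx/conf.d/ # restore the Nginx configuration
gunzip < mysql/database_name.gz | mysql -u root -p database_name # import one database (create it first if missing)
service nginx reload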

IV. Summary


Backing up your VPS/server is extremely important; plenty of people have lost all their data to a botched rebuild and missing backups. Hopefully this detailed tutorial gives you a new, more effective and more economical way to keep your data safe.

Now it's your turn. If you need more support or have feedback, leave a comment below.
