Simple incremental backups on Linux with TAR and GPG

This is the script I use to make backups on Linux.

I love the UNIX way: with it, backups can be made much more flexible.

To back up my home directory, I use regular incremental tar archives and encrypt them with my GPG key.

For other files, such as backups of the videos I record for YouTube, I use rsync. rsync is the more rational choice when you just need to keep a large number of files in sync and incremental history is not critical.
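For reference, a typical rsync call for that kind of data might look like this (a sketch; the source path and remote host are illustrative, reusing the ones from the backup script below):

rsync -avh --delete /tmp/home/youtube-videos/ dm@192.168.0.152:/home/dm/backup/youtube-videos/

Here -a preserves permissions and timestamps, and --delete removes files on the server that were deleted locally, so drop it if you want the remote copy to only grow.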

#!/bin/bash

NOW=$(date +%Y%m%d%H%M)          # timestamp, e.g. 202205122134
MNOW=$(date +%Y%m)               # year and month, e.g. 202205
BACKUP_HOME="/tmp/home/"         # directory to back up
EMAIL="devpew"                   # GPG key recipient
ARCHIVES_DIR="/tmp/backup/"      # where archives are stored

# Create a per-month directory for the archives if it does not exist yet
if [[ ! -d "${ARCHIVES_DIR}${MNOW}" ]]; then
  mkdir -p "${ARCHIVES_DIR}${MNOW}"
fi

# Incremental backup: the .snar snapshot file records what has already been
# archived; if it does not exist yet, tar makes a full (level-0) backup
tar --exclude-from=/home/dm/mybin/.backup.excludes -v -z --create \
    --file "${ARCHIVES_DIR}${MNOW}/${NOW}.tar.gz" \
    --listed-incremental="${ARCHIVES_DIR}${MNOW}/${MNOW}.snar" \
    "${BACKUP_HOME}" > "${ARCHIVES_DIR}${MNOW}/${NOW}.log"

# Encrypt the archive with the GPG key, then delete the plaintext tarball
if [[ -f "${ARCHIVES_DIR}${MNOW}/${NOW}.tar.gz" ]]; then
  gpg -r "${EMAIL}" --encrypt-files "${ARCHIVES_DIR}${MNOW}/${NOW}.tar.gz" \
    && rm -f "${ARCHIVES_DIR}${MNOW}/${NOW}.tar.gz"
fi

# Copy the encrypted backup and the snapshot file to the backup server
scp "${ARCHIVES_DIR}${MNOW}/${NOW}.tar.gz.gpg" \
    "${ARCHIVES_DIR}${MNOW}/${MNOW}.snar" \
    dm@192.168.0.152:/home/dm/backup/${MNOW}

If you need a more flexible second-level increment, for example by week, you can use conditions like the following:

DOW=$(date +%a)              # Day of the week, e.g. Mon
DOM=$(date +%d)              # Day of the month, e.g. 27
DM=$(date +%d%b)             # Day and month, e.g. 27Sep

if [ "$DOM" = "01" ]; then
  echo 'this is monthly backup'
fi

if [ "$DOW" = "Sun" ]; then
  echo 'this is weekly backup'
else
  echo 'this is daily backup'
fi
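A minimal sketch of how that could plug into the script above (the week${WEEK}.snar naming is my own assumption, not part of the original script): each Sunday, copy the monthly snapshot and let the rest of the week increment against the copy, so a restore needs at most the monthly full backup, one weekly archive, and the dailies after it.

# Assumes NOW, MNOW, BACKUP_HOME, ARCHIVES_DIR, DOW, DOM from the scripts above
WEEK=$(date +%V)                              # ISO week number, e.g. 19
SNAR="${ARCHIVES_DIR}${MNOW}/${MNOW}.snar"    # monthly (level-0) snapshot
if [ "$DOM" != "01" ]; then
  WSNAR="${ARCHIVES_DIR}${MNOW}/week${WEEK}.snar"
  # On Sunday (or on the first run of the week) restart from the monthly snapshot
  if [ "$DOW" = "Sun" ] || [ ! -f "$WSNAR" ]; then
    cp "$SNAR" "$WSNAR"
  fi
  SNAR="$WSNAR"
fi
tar --exclude-from=/home/dm/mybin/.backup.excludes -v -z --create \
    --file "${ARCHIVES_DIR}${MNOW}/${NOW}.tar.gz" \
    --listed-incremental="$SNAR" "${BACKUP_HOME}"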

HOW IT WORKS

Now, briefly about what this script does.

The script goes to the specified backup directory and creates a folder in it named after the year and month, in the format “202205” if it is May 2022.

All backups for May will then be placed in this folder.

If there is no snapshot (.snar) file in the folder yet (for example, we launched the script for the first time or a new month has begun), a full backup is created.

On later runs, the script creates increments on top of that full backup.
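The increment is driven entirely by the .snar snapshot file that tar maintains through --listed-incremental: if the file is missing, tar performs a full backup and creates it; otherwise it archives only what changed since the snapshot was last updated. So if you ever need to force a fresh full backup mid-month, deleting the snapshot file should be enough (path as in the script above):

rm /tmp/backup/202205/202205.snar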

In addition, a log file is written for each run.

After the backup is made, it is encrypted with our GPG key and the plaintext TAR file is deleted.

After that, the backup is copied to our server.

exclude

If you need to set exclusions, that is, files or directories that should not be backed up, you can list them in an excludes file. You need to be careful here: a single stray space can break everything.

➜  mybin cat /home/dm/mybin/.backup.excludes
/tmp/home/Nextcloud
/tmp/home/.cache
/tmp/home/youtube-videos

Also, nothing will work if you put a slash at the end. For example, the line /tmp/home/Nextcloud will work, but /tmp/home/Nextcloud/ will not. So be careful.
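To check that an excludes file does what you expect without writing a real archive, you can do a throwaway run to /dev/null and inspect the verbose file list (a quick sketch using the same paths as above):

tar --exclude-from=/home/dm/mybin/.backup.excludes -v --create --file /dev/null /tmp/home/ | grep Nextcloud

If the exclusion works, grep prints nothing.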

If you need to unpack

We make incremental backups and encrypt them. If we need to get the data back, we first have to decrypt the file.

You can decrypt it with the command:

gpg --output 202205122134.tar.gz --decrypt 202205122134.tar.gz.gpg
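If you have a whole month of increments to decrypt, gpg can also process several files in one call, the same way the script encrypts them:

gpg --decrypt-files 202205*.tar.gz.gpg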

After that, start unpacking the tar archives, beginning with the very first one, the archive from the first day:

tar --extract --verbose --listed-incremental=/dev/null --file=202205010101.tar.gz

After that, unpack the rest of the increments. If you need to restore the state as of, for example, the 11th, you sequentially unpack the tar archives from the 2nd through the 11th into the same folder.

If this seems like a long process and you restore data frequently, you can add a second-level increment, such as a weekly one.

Or, if doing it by hand seems too long, you can throw together a small unpacking script. In the simplest case, something like this:

tar --extract --incremental --file file0.tar
tar --extract --incremental --file file1.tar
tar --extract --incremental --file file2.tar

Or, for example, like this:

for i in *.tbz2; do tar -xjGf "$i"; done;

If you want to extract only specific directories from the archive:

tar -xjGf levelX.tar --wildcards 'example/foo*' 'example/bar*'

autorun

On Ubuntu or Debian, you can run this script through cron. Arch does not ship cron by default, and autorun is done differently there. In this example, we will run our script every day at 03:30.

You need to create a file:

sudo nvim /usr/lib/systemd/system/backup.service
[Unit]
Description=backup

[Service]
Type=simple
ExecStart=/home/dm/mybin/backup

And a second file:

sudo nvim /usr/lib/systemd/system/backup.timer
[Unit]
Description=my backup

[Timer]
OnCalendar=*-*-* 03:30:00

[Install]
WantedBy=multi-user.target

After that, we can start and enable our timer:

sudo systemctl start backup.timer
sudo systemctl enable backup.timer

After that, check that it was added with the command:

sudo systemctl list-timers --all

or with the command:

sudo systemctl status backup.timer

REMOVE OLD BACKUPS

Besides constantly creating backups, we also need to delete old ones so that they do not fill up all the disk space. If you have a script for this, share it, and I will post your solution here.
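Until then, here is a minimal sketch of what such a cleanup could look like, assuming the /tmp/backup layout above with one directory per month; the 90-day retention is an arbitrary number, not something from this post:

# Remove monthly backup directories that have not changed in ~90 days
find /tmp/backup/ -mindepth 1 -maxdepth 1 -type d -mtime +90 -exec rm -rf {} +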


One Comment

  1. Hi, I am a bit lost. I am trying to figure my own backup solution and I have two questions regarding your script. How do you do the incremental part without a timestamp file? And why rsync is not a good option if you care about synchronisation of files? Thanks for the post!
