It works just like this; I’ll try to explain with an example.
Let’s configure a multiple backup named duplicifs and create a file named /etc/backup-data/duplicifs.include.
The backup will contain only the files listed inside /etc/backup-data/duplicifs.include, except for the global excludes.
If we also create a file named /etc/backup-data/duplicifs.exclude, the list of excluded files is read only from this file.
Why this implementation? Because it’s very flexible and allows the creation of backups with a limited data set.
You can have the single backup saving everything, and a new mailbackup which backs up only the mail every hour.
To achieve this you need to create two include files, for example /etc/backup-data/backup.include and /etc/backup-data/mailbackup.include, each containing only the paths that backup should save (see the sketch below).
Just added the same example to the administrator manual.
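As a hypothetical illustration (the vmail path is an assumption based on NethServer’s default layout, and the one-path-per-line format should be checked against the manual), the hourly mail backup could use something like:

    # /etc/backup-data/mailbackup.include  -- hypothetical example, one path per line
    /var/lib/nethserver/vmail

    # /etc/backup-data/mailbackup.exclude  -- optional, also hypothetical
    /var/lib/nethserver/vmail/old-archive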
After some tests, we also implemented the restic prune option, which runs once a week.
On our backup, a daily prune takes around 2 hours; if it’s executed once a week, it takes around 6 hours.
If the prune is executed less often (like once a month), the process will be really slow and could clash with the next backup job (/cc @pike @m.traeumner).
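If you want something similar by hand, a weekly prune can be scheduled with a plain cron entry; the repository path and password file below are assumptions to adapt:

    # /etc/cron.d/restic-prune (hypothetical): prune every Sunday at 03:00
    0 3 * * 0  root  restic -r /mnt/backup/restic-repo --password-file /root/.restic-password prune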
Is it possible to save an NFS/SAMBA share to an RDX tape drive which has 5 different tapes, all appearing as /dev/sdc1? I would have to create 5 jobs, which is not possible, because there can only be one tape per day. Would I have to do a full backup every time?
Sorry, tape is not supported, but I know @filippo_carletti would like to add a custom script for it.
Just try to add your tar-based script inside the post-backup event.
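As a rough sketch (assuming the event is called post-backup-data and that the cartridge mounts at /mnt/rdx; device, mount point and source path are all assumptions to adjust), a custom action could look like this:

    #!/bin/bash
    # Hypothetical /etc/e-smith/events/post-backup-data/S95copy-to-rdx
    # Copies a tar archive of the data directory to the currently inserted RDX cartridge.
    DEVICE=/dev/sdc1        # assumption: the RDX cartridge shows up here
    MOUNTPOINT=/mnt/rdx     # assumption: adjust to your mount point

    mount "$DEVICE" "$MOUNTPOINT" || exit 1
    tar -czf "$MOUNTPOINT/backup-$(date +%F).tar.gz" /var/lib/nethserver
    umount "$MOUNTPOINT"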
RDX is not a real tape but a USB drive with removable disks; it is only recognized as a tape by some backup software. I have now solved it with a cron job that performs an rsync, and with the SOGo tool I make a backup to the medium.
I just configured a simple (not multiple) rsync backup to an NFS share, launched backup-data -b mybackup a few times, and noticed these lines at the beginning of the /root/.rsync_tmbackup/xxx.log file:
2018/09/17 21:40:47 [14466] building file list
2018/09/17 21:40:47 [14466] rsync: chown "/mnt/backup-TimeMachine/cloud/2018-09-17-214047/root" failed: Operation not permitted (1)
2018/09/17 21:40:47 [14466] rsync: chown "/mnt/backup-TimeMachine/cloud/2018-09-17-214047/root/.byobu" failed: Operation not permitted (1)
And so on for all folders already existing on the NFS share.
Since I don’t administer the NFS server, which is backup storage provided by my hosting company, it looks like rsync will not work for me: the only other options are CIFS (which doesn’t support Linux extensions and therefore hard links) and FTP (no comment).
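For what it’s worth, those chown errors come from rsync trying to preserve ownership on a share where root is squashed; if you drive rsync yourself you can drop the owner/group flags so no chown is attempted (a sketch only, not what backup-data does out of the box; the source path is a placeholder):

    # -a is shorthand for -rlptgoD; leaving out -o and -g avoids the chown calls
    rsync -rlptD --delete /var/lib/nethserver/ /mnt/backup-TimeMachine/cloud/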
Any thought to adding support for Duplicati? It’s got some nice features:
Built-in support for a number of cloud storage providers (Microsoft OneDrive, Amazon Cloud Drive & S3, Google Drive, box.com, Mega, hubiC, etc.), as well as standard protocols like WebDAV, FTP, SSH, etc.
Built-in AES-256 encryption of backup data
Has a decent web GUI of its own to set up backups (including getting authentication tokens from the cloud storage, if you’re using that), or it can run from the CLI as well.
There’s a CentOS-compatible RPM available, and yum does a fine job of tracking dependencies and such. After reviewing this write-up, getting it running on a Neth box only took a few minutes (the only difference was that I substituted config set fw_duplicati ... and signal-event firewall-adjust for the fw commands there).
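Something along these lines should do it; the prop names and port 8200 (Duplicati’s default web UI port) are assumptions to verify against the NethServer and Duplicati docs:

    # Hypothetical NethServer firewall entry for the Duplicati web UI
    config set fw_duplicati service status enabled TCPPorts 8200 access green
    signal-event firewall-adjust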
If all of Duplicati is published under an Open Source license, we could try to integrate it.
Before integrating, the software should have the following features (from command line):
inclusion and exclusion of a list of files and directories
execute a backup
restore a file or an entire backup to a selected target directory
Supported, though I think it’d be by way of multiple --include and --exclude arguments, rather than by passing a list.
All of these are supported.
It looks like the “find” command could do something like this, though it isn’t immediately clear from the docs how you’d go about getting a list of all the contents of the most recent backup.
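To make that concrete, here is roughly what those operations look like with the Duplicati CLI; the storage URL, passphrase and filter patterns are placeholders, and the option names are worth double-checking against the Duplicati documentation:

    # back up selected directories, excluding a pattern (all values are placeholders)
    duplicati-cli backup file:///mnt/backup /etc /var/lib/nethserver --exclude="*.tmp" --passphrase=secret

    # list the contents of the latest backup version
    duplicati-cli find file:///mnt/backup "*" --passphrase=secret

    # restore everything into a chosen target directory
    duplicati-cli restore file:///mnt/backup "*" --restore-path=/tmp/restore --passphrase=secret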
Edit: As to doing it on the command line, here’s the command line for the full system backup I’m running: