I just configured a simple (not multiple) rsync backup to an NFS share, ran backup-data -b mybackup a few times, and noticed these lines at the beginning of the /root/.rsync_tmbackup/xxx.log file:
2018/09/17 21:40:47 [14466] building file list
2018/09/17 21:40:47 [14466] rsync: chown "/mnt/backup-TimeMachine/cloud/2018-09-17-214047/root" failed: Operation not permitted (1)
2018/09/17 21:40:47 [14466] rsync: chown "/mnt/backup-TimeMachine/cloud/2018-09-17-214047/root/.byobu" failed: Operation not permitted (1)
And so on for all folders already existing on the NFS share.
Since I don’t administer the NFS server, which is backup storage provided by my hosting company, it looks like rsync will not work for me: the only other options are CIFS (which doesn’t support Linux extensions and therefore hard links) and FTP (no comment).
Any thoughts about adding support for Duplicati? It’s got some nice features:
Built-in support for a number of cloud storage providers (Microsoft OneDrive, Amazon Cloud Drive & S3, Google Drive, box.com, Mega, hubiC, etc.), as well as standard protocols like WebDAV, FTP, SSH, etc.
Built-in AES-256 encryption of backup data
Has a decent web GUI of its own to set up backups (including getting authentication tokens from the cloud storage, if you’re using that), or it can run from the CLI as well.
There’s a CentOS-compatible RPM available, and yum does a fine job of tracking dependencies and such. After reviewing this write-up, getting it running on a Neth box only took a few minutes (the only difference was that I substituted config set fw_duplicati ... and signal-event firewall-adjust for the fw commands there).
If all of Duplicati is published under an open-source license, we could try to integrate it.
Before integrating, the software should have the following features (from command line):
inclusion and exclusion of a list of files and directories
execute a backup
restore a file or an entire backup to a selected target directory
Supported, though I think it’d be by way of multiple --include and --exclude arguments, rather than by passing a list.
All of these are supported.
It looks like the “find” command could do something like this, though it isn’t immediately clear from the docs how you’d go about getting a list of all the contents of the most recent backup.
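To make the three requirements concrete, here is roughly how they would map onto the Duplicati command line. This is a sketch only: the storage URL, source paths, passphrase, and filter patterns are placeholder assumptions of mine, and the exact flags should be checked against the current Duplicati CLI documentation before relying on them:

```shell
# Command sketch (requires duplicati-cli; all values below are placeholders).
URL="file:///mnt/backup-TimeMachine/duplicati"
export PASSPHRASE="changeme"

# 1. Inclusion/exclusion: repeated --include/--exclude flags rather than a list file.
duplicati-cli backup "$URL" /etc /var/lib/nethserver \
    --exclude="*.tmp" --exclude="/var/lib/nethserver/cache/"

# 2. Listing what is in the most recent version ("*" matches everything).
duplicati-cli find "$URL" "*"

# 3. Restoring one file, or everything, into a chosen target directory.
duplicati-cli restore "$URL" "/etc/hosts" --restore-path=/tmp/restore
duplicati-cli restore "$URL" "*" --restore-path=/tmp/restore-all
```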
Edit: As to doing it on the command line, here’s the command line for the full system backup I’m running:
Hi there pagaille, may I ask you for the command to configure an external USB disk for a backup? At the end of the year I will have the disks to try this new feature, but I have spent the whole weekend trying to configure the drive for backup and it isn’t possible.
Hey giacomo, Cockpit was the answer to my problem. Amazing job, NethServer 8 will be the top server to administer, thanks. Now I just need to check an error when editing the backup configuration.
@pagaille, thanks for the answer, but Cockpit did it for me.
I need to rotate the hard drives every week (one drive is connected while the other is in a bank locker), and I also need rsync to keep only 2 days of backups, not 30. I have to do it manually, but it is not clear how to do it.
Uncaught TypeError: b.props.NotifyTo.join is not a function
at VueComponent.initWizard (webpack:///./src/router/index.js + 66 modules?:2813)
at VueComponent.openEditBackupData (webpack:///./src/router/index.js + 66 modules?:3288)
at click (webpack:///./src/router/index.js + 66 modules?:3488)
at invoker (webpack:///./node_modules/vue/dist/vue.esm.js?:2024)
at HTMLAnchorElement.fn._withTask.fn._withTask (webpack:///./node_modules/vue/dist/vue.esm.js?:1823)
Out of curiosity, why would you want a retention policy? The rsync script is made to mimic the Time Machine (Mac) behavior, which fills the hard drive until free space is exhausted. From then on it purges the oldest backups to make room for the next one.
That way your backup disk space is used as efficiently as possible.
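The purge behaviour described above can be sketched in shell. This is a simplified illustration only, assuming snapshot directories named YYYY-MM-DD-HHMMSS as in the log earlier in the thread; the function name and the free-space threshold are mine, not the actual script’s:

```shell
# Sketch of the Time-Machine-style purge: delete the oldest dated snapshot
# directories until the destination filesystem has enough free space.
purge_oldest() {
    local backup_root=$1 needed_kb=$2
    # Snapshot dirs are named YYYY-MM-DD-HHMMSS, so lexical sort == age sort.
    for snap in $(ls -1d "$backup_root"/????-??-??-?????? 2>/dev/null | sort); do
        avail_kb=$(df -Pk "$backup_root" | awk 'NR==2 {print $4}')
        [ "$avail_kb" -ge "$needed_kb" ] && break
        echo "purging oldest snapshot: $snap"
        rm -rf "$snap"
    done
}

# Demo in a temp dir: with a tiny space requirement, nothing is purged.
DEMO=$(mktemp -d)
mkdir -p "$DEMO/2018-09-16-120000" "$DEMO/2018-09-17-214047"
purge_oldest "$DEMO" 1
ls "$DEMO"
```

Note there is deliberately no age-based cutoff: the disk itself is the retention policy, which is why the script keeps as much history as the drive can hold.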
I think you may have missed a point here; moreover, you’re taking risks by modifying the script, since it will probably be overwritten by future updates.