well… having a large amount of live data on live systems means having big backups, and so you need an environment that can manage them.
AFAIK duplicity doesn’t support file deduplication while, for example, BackupPC does…
it’s up to you to find your way to a good backup setup… NS is Linux, you have to find the right tool for you, even if it’s something that isn’t native to the NS ecosystem.
taken from http://backuppc.sourceforge.net/info.html
A clever pooling scheme minimizes disk storage and disk I/O. Identical files across multiple backups of the same or different PCs are stored only once resulting in substantial savings in disk storage and disk I/O.
One example of disk use: 95 laptops with each full backup averaging 3.6GB each, and each incremental averaging about 0.3GB. Storing three weekly full backups and six incremental backups per laptop is around 1200GB of raw data, but because of pooling and compression only 150GB is needed.
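the raw-data figure checks out, by the way: 95 × (3 × 3.6GB + 6 × 0.3GB) ≈ 1197GB ≈ 1200GB. and the pooling idea itself is simple content-addressed storage: name each file in the pool by a hash of its contents, so identical files from any backup of any machine collapse to a single copy on disk. here’s a minimal sketch of that idea (this is NOT BackupPC’s actual implementation — it uses hardlinks and its own pool layout — just the principle, with hypothetical names like `pool_file`):

```python
import hashlib
import os
import shutil

def pool_file(src, pool_dir):
    """Store src in pool_dir under its content hash.

    Identical files (from any backup run, any machine) hash to the
    same name, so they occupy disk space only once -- the core of
    a deduplicating pool.
    """
    h = hashlib.sha256()
    with open(src, "rb") as f:
        # hash in chunks so large files don't blow up memory
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    dest = os.path.join(pool_dir, h.hexdigest())
    if not os.path.exists(dest):
        # first time we see this content: pay the storage cost once
        shutil.copy(src, dest)
    # every later identical file is effectively free
    return dest
```

back up the same file twice (or from two different machines) and the pool still holds exactly one copy — that’s where the 1200GB → 150GB saving comes from.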