
Yes, they are correct; that's why I was asking. Here is the output of check:

Listing all chunks
All chunks referenced by snapshot website at revision 9 exist
All chunks referenced by snapshot website at revision 10 exist
All chunks referenced by snapshot website at revision 11 exist
All chunks referenced by snapshot website at revision 12 exist
All chunks referenced by snapshot website at revision 13 exist
All chunks referenced by snapshot website at revision 14 exist
All chunks referenced by snapshot website at revision 15 exist
All chunks referenced by snapshot website at revision 16 exist
All chunks referenced by snapshot website at revision 17 exist
All chunks referenced by snapshot website at revision 18 exist
All chunks referenced by snapshot website at revision 19 exist
Snap | rev | | files | bytes | chunks | bytes | uniq | bytes | new | bytes |
Total chunk size is 575,005K in 192 chunks

I am still running into issues when using these commands:

duplicacy copy -from digitalocean -to google-drive
duplicacy copy -from digitalocean -to sftp-drive

After the analyzing phase and before any chunks are copied, memory usage starts at 25 MB; it then climbs steadily to 350 MB, at which point the process is killed. The problem doesn't happen during a backup with this command, even for an initial backup:

duplicacy backup -storage digitalocean

I looked a bit in the source, but wasn't able to see anything obvious.
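Since duplicacy is written in Go, one way to narrow this down would be to log the runtime's own heap statistics while the copy runs, to see whether the 350 MB is live Go heap or memory held elsewhere. Below is a minimal sketch of such instrumentation; startHeapLogger is a hypothetical helper of my own, not anything in the duplicacy source, and the main function here is just a stand-in workload so the sketch runs on its own.

package main

import (
	"log"
	"runtime"
	"time"
)

// startHeapLogger spawns a goroutine that prints Go heap usage once per
// second. In a source build of duplicacy it could be called at the top of
// main() before the copy starts. (Hypothetical helper, not in duplicacy.)
func startHeapLogger() {
	go func() {
		var m runtime.MemStats
		for {
			runtime.ReadMemStats(&m)
			log.Printf("heap=%dMB sys=%dMB objects=%d",
				m.HeapAlloc>>20, m.Sys>>20, m.HeapObjects)
			time.Sleep(time.Second)
		}
	}()
}

func main() {
	startHeapLogger()
	// Stand-in for the real copy work: allocate in a loop so the logger
	// has growing heap usage to report.
	var bufs [][]byte
	for i := 0; i < 100; i++ {
		bufs = append(bufs, make([]byte, 1<<20))
		time.Sleep(50 * time.Millisecond)
	}
	_ = bufs
}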
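If that confirms the growth is on the Go heap, the standard net/http/pprof package would point at the allocating call sites. Again a sketch under the assumption that one can rebuild the binary from source: importing the package for its side effects and starting a local HTTP listener exposes heap profiles that go tool pprof can read.

package main

import (
	"log"
	"net/http"
	_ "net/http/pprof" // registers /debug/pprof/* handlers on the default mux
)

func main() {
	// In a source build of duplicacy this listener would be started from
	// main(); here it is the whole program so the sketch runs on its own.
	// While the process runs, inspect the heap with:
	//   go tool pprof http://localhost:6060/debug/pprof/heap
	log.Fatal(http.ListenAndServe("localhost:6060", nil))
}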
