CloudBerry Backup Review – Windows, MacOS, Linux

CloudBerry Backup is a versatile backup tool for Microsoft Windows, MacOS, and Linux. In this CloudBerry Backup review I’m looking at the Desktop Pro version 5.6.0.182 on Windows 10.

You can also read our introduction to backups, which includes recommendations for backup products and online backup tools.

Executive Summary

CloudBerry Backup is a flexible and powerful backup tool with a large range of storage options. The tool may be somewhat overwhelming for a computer novice, but competent computer users should have no problems. The software does a good job of walking beginners through setup using defaults and wizards.

Note that CloudBerry doesn’t provide storage, but lets you back your files up to any of a large number of types of storage or storage providers: local disks or NAS, removable disks, AWS Simple Storage Service (S3), Amazon Glacier, Microsoft Azure, Google Cloud, Google Drive, FTP, and many others (see the screenshot below). This covers just about every destination most people would want to back up to.

Pros: Flexible, powerful, huge range of backup destinations, relatively easy to use given the power, very well documented, provides good support, relatively reasonably priced, and the free tier provides 200GB of backups (with some limits, such as no encryption or compression)

Cons: The archive mode is space inefficient. Advanced backup mode creates a large number of files, which can increase backup costs on some storage providers – though probably not by much.

I’ve chosen CloudBerry Backup as my primary backup tool for online backups, internal mirroring, and offsite backups.

 

Background and Backup Best Practices

For more background on the what, why, and best practices for backups please see our backups overview article.

Features and Overview

CloudBerry has numerous features:

  • Backs up to a dizzying range of backup locations, including local disk, NAS, Amazon S3, Amazon Glacier, Google, Azure, etc, etc
  • Provides incremental, somewhat de-duplicated backups. This means only changed data is backed up, saving backup time and storage space.
  • Provides several backup modes – advanced (block based and deduplicated), simple (copies files but doesn’t do versions), archive (merges all backup files into a single file), and custom (I haven’t been able to figure this one out)
  • Provides compression and encryption of backups.
  • Allows you to keep multiple versions of backups.
  • Lets you configure whether deleted files are removed from storage, including how long after deletion it’s done. It warns you when this is going to happen.
  • Provides scheduled or manual backups.
  • Available on Windows, MacOS, and Linux.
  • The website says it’s limited to 1TB of backups, but support tells me it’s actually “5 TB for cloud storage data, 5 TB for Glacier data, unlimited storage for local backups”. I have the desktop edition with 3TB of backups to external disks and 60GB in the cloud, so I think support is more accurate than the website.
  • There is extensive online documentation and blog posts, explaining almost anything you’d want to know.
  • The software can be used free of charge to run restores.
  • A command line version of the program is available for advanced users.
  • Support is reasonably fast and helpful.

I’ve found only a few cons:

  • Deduplication isn’t the best I’ve seen. It appears to be file based, rather than backup set based, and even in advanced mode it often stores more data than it really needs to when a new version of a document is created.
  • Archive mode isn’t very space efficient. If you add an identical file to the backup it stores the file again rather than referencing the existing copy, and running a full backup stores all the files again, whereas advanced mode references existing files. Because of this I wouldn’t use archive mode until it’s improved. That’s a shame, as merging files into a single archive would reduce request costs when backups are stored in S3 – though at $0.005 per 1,000 requests those costs are unlikely to break the bank (full S3 prices can be found here). Archive mode also uploads to S3 more slowly, as it uses a single-threaded upload; advanced mode uploads many files in parallel, which results in much faster backups.
  • Ease of use is fairly good, but having to provision backup storage separately from the backup program would confuse many people. It’s really a tool for competent to advanced computer users.
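To put the request-cost concern in perspective, here’s a quick back-of-the-envelope calculation using the $0.005 per 1,000 PUT requests figure above. The file counts are made-up examples, and AWS pricing changes over time, so treat this as a sketch:

```python
# Rough estimate of the S3 request-cost impact of advanced mode creating
# many files, using the PUT price quoted in this review ($0.005 per
# 1,000 requests). File counts below are hypothetical examples.
PUT_PRICE_PER_1000 = 0.005  # USD; check current AWS pricing


def put_request_cost(num_files: int) -> float:
    """Cost in USD of uploading num_files objects, one PUT request each."""
    return num_files / 1000 * PUT_PRICE_PER_1000


# Even a backup that creates 100,000 separate objects costs ~$0.50 in PUTs.
print(f"100k files: ${put_request_cost(100_000):.2f}")   # $0.50
print(f"1M files:   ${put_request_cost(1_000_000):.2f}")  # $5.00
```

So even for very large numbers of files the request charges are small next to the storage charges.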

Ease Of Use

CloudBerry Backup is reasonably easy to use in itself. Having to provision backup storage separately from the backup tool would confuse many people. Setting up Amazon S3 and IAM users would baffle anyone who’s not an experienced or expert computer user.

Putting aside setting up the backup storage destinations, the program could be used by anyone who considers themselves “ok with computers”. I know that if my 65-year-old parents tried to use CloudBerry for backups they might manage it eventually, but it would take quite a bit of effort and reference to technical support.

The excellent documentation significantly increases ease of use.

CloudBerry Backup Walkthrough

The main screen of CloudBerry (click to see it larger) shows you your backup results, upcoming backups, settings, and storage accounts (aka backup destinations). This screenshot was from a virtual machine I set up to do a test restore. From this screen you can delete and edit your storage accounts, run backup validations, and do similar administrative tasks.

Cloudberry Backup Main Window

 

The backup plans page shows you details about each backup you have configured. It lets you run your backups manually, which is helpful for offsite backups which you can’t schedule. It also allows you to change backup settings, restore files, run a full backup, and do other similar things.

CloudBerry Backup Plans

 

The Backup Storage tab allows you to view your backups, and start a restore job. It’s an intuitive process.

The CloudBerry Backup Storage tab allows you to restore from your backups

 

The options screen provides you with many tabs that let you configure the way your backups works. A key thing I changed was the S3 upload size, so that files are uploaded in 250MB chunks rather than the default of 10MB. This makes backups faster and reduces costs slightly.
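As a rough illustration of why larger chunks help, here’s a sketch of how the chunk size changes the number of S3 upload requests for a large backup file. The 10GB file size is a made-up example:

```python
# Back-of-the-envelope: how the S3 upload chunk size changes the number
# of requests needed for a single large file. Each chunk is one upload
# request, so fewer, larger chunks mean fewer requests.
import math


def parts_needed(file_mb: float, chunk_mb: float) -> int:
    """Number of chunks (and hence upload requests) for a file."""
    return math.ceil(file_mb / chunk_mb)


file_mb = 10_000  # a hypothetical 10GB backup file
print(parts_needed(file_mb, 10))   # default 10MB chunks -> 1000 requests
print(parts_needed(file_mb, 250))  # 250MB chunks -> 40 requests
```

Fewer requests means slightly lower request charges and less per-request overhead, which is where the speed improvement comes from.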

CloudBerry Backup has a wide range of advanced options

 

Setting up a Backup

The following shows how I set up one of my backups using the wizard interface.

I want to show the range of backup destinations CloudBerry supports. No other tool I’ve seen offers this range. I’ve only heard of a small fraction of these!

CloudBerry Backup lets you backup to a wide variety of locations and services

 

First, I set up my Amazon AWS account, created an S3 bucket, and created my user. CloudBerry outlines this process here and here – it’s reasonably simple but has quite a few steps. AWS itself is a complex system, and you can spend significant time setting up users and access policies.

Cloudberry Backup S3 Configuration

 

Next I chose the destination for my backup.

CloudBerry Backup Wizard destination selection

 

Next I named my backup and told CB to store my backup plan on S3.

CloudBerry Backup Wizard plan name

 

I chose to use advanced backup mode, as I wanted the features it provides. I don’t recommend archive mode at the moment as it’s not particularly efficient. Simple mode could be useful if you just want a plain copy of your files without versioning.

CloudBerry Backup Wizard mode selection

 

I didn’t change anything on this screen.

CloudBerry Backup Wizard advanced options

 

Next you choose which folders and files you want backed up.

CloudBerry Backup Wizard backup selection

 

The advanced filter screen gives you a lot of control over what’s backed up. I tell it to back up everything, but you can include or exclude files by type, back up files only when they’re a certain age, exclude large files, and so on.

CloudBerry Backup Wizard advanced filter options

 

I chose to have this backup set compressed and encrypted. I don’t use compression for some of my backup sets: some of my data is already compressed, and additional compression slows things down without providing any benefit. I used AES128 encryption even though 256-bit is available – nothing I have is that secret, and I assume AES128 is faster.

An important option here is the S3 Standard-IA storage class, which lowers your storage costs by around 50% with no significant disadvantages. It’s intended for “infrequently accessed files” – and all backups will be infrequently accessed. You shouldn’t use Reduced Redundancy storage: it’s more expensive than Standard-IA and has a higher chance of data loss.
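As a rough sketch of the saving, here’s the arithmetic using per-GB Standard and Standard-IA prices from around the time of this review. Both prices are assumptions – check current AWS pricing before relying on them:

```python
# Monthly storage cost: S3 Standard vs Standard-IA, per-GB-month prices
# assumed from around the time of this review (US regions). IA also has
# a small per-GB retrieval fee and a 30-day minimum, not modelled here.
STANDARD = 0.023      # USD per GB-month (assumed)
STANDARD_IA = 0.0125  # USD per GB-month (assumed)


def monthly_cost(gb: float, price_per_gb: float) -> float:
    return gb * price_per_gb


gb = 60  # my cloud backup volume mentioned earlier in the review
saving = 1 - STANDARD_IA / STANDARD
print(f"Standard:    ${monthly_cost(gb, STANDARD):.2f}/month")
print(f"Standard-IA: ${monthly_cost(gb, STANDARD_IA):.2f}/month")
print(f"Saving: {saving:.0%}")  # roughly the ~50% mentioned above
```

The retrieval fee only matters when you restore, which for backups should be rare.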

CloudBerry Backup Wizard compression and encryption options

 

The next window gives you significant flexibility in scheduling your backups. I chose daily backups, but you can do real time or just about anything else you can think of.

CloudBerry Backup Wizard schedule window

 

This screen allows you to schedule full backups. In advanced mode this appears to check the backup and upload anything not already backed up. Full backups are useful to ensure backup integrity and remove old unused parts of files based on version history settings.

CloudBerry Backup Wizard schedule full backup

 

CloudBerry allows you to get email notifications when backups complete. I didn’t find this reliable, but I didn’t spend much time looking into it.

CloudBerry Backup Wizard email notifications

 

This screen summarises your settings, so you can check what you’ve done.

CloudBerry Backup Wizard summary screen

 

This is the final confirmation that the wizard is complete.

CloudBerry Backup Wizard confirmation screen

From here you can run your backup, or you can go back to the main screen and run it manually.

CloudBerry Backup Performance

I have a Windows 10 PC with two SSDs (one for the operating system, one for caches and latency-sensitive data), a 4TB RAID array for data (technically, it’s a ReFS-formatted Storage Spaces mirror), and another single drive for things like family videos. I have an internet upload speed of 20Mbps over fiber, though being in New Zealand we have a lot of latency to Oregon in the USA where I store my S3 files. I could use S3 in Sydney but it’s more expensive.

I have backup sets that back my data up from the drive array to the internal drive, to an external drive, and to Amazon S3.

I found backup performance to be very good. Backup of files between drives varied between 10MB/sec and 110MB/sec. Directories of medium sized files tended to be around 45MB/sec, and backup of large files like virtual machine images averaged around 80MB/sec and peaked around 120MB/sec.

Backup to S3 in the USA using advanced backup mode varied between 10Mbps and 20Mbps, which is excellent given my internet connection. Backup in archive mode was slower, using a single thread, around 2Mbps, which is about what I’d expect.
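For a sense of what those speeds mean in practice, here’s a quick sketch of upload time at a sustained rate, using the 60GB cloud backup volume mentioned earlier:

```python
# Quick estimate of upload time at a sustained link speed.
# Note Mbps is megabits per second; divide by 8 for megabytes.
def upload_hours(gb: float, mbps: float) -> float:
    megabytes = gb * 1000            # decimal GB -> MB
    seconds = megabytes / (mbps / 8)  # MB / (MB per second)
    return seconds / 3600


# ~60GB of cloud backups at a sustained 20Mbps:
print(f"{upload_hours(60, 20):.1f} hours")  # ~6.7 hours
```

The initial full backup is the slow part; after that, incremental backups only upload changed data.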

Overall I’m happy with backup performance.

CloudBerry Backup Compression and Deduplication

CloudBerry says it does “block level backups”, which is generally related to deduplication. Based on my testing it does try to do this, but it’s only ok at it. Unlike other backup products that do block level backups across the whole backup set, CloudBerry seems to do them at the file level. That means if you add a second copy of a file to CloudBerry it will save the file again, whereas other backup programs will recognize the blocks are already stored and won’t store them again. This is less efficient than it could be.

I found that small changes to large files resulted in more data being stored than was necessary. I changed a few bytes in a 100MB file twice, which resulted in incremental backups of 30MB and 10MB respectively. This is larger than I’d have expected.
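To illustrate the difference block-level deduplication should make, here’s a generic fixed-size-block sketch. This is not CloudBerry’s actual algorithm (which isn’t documented in this detail), just the general technique: split data into blocks, hash each block, and only store blocks whose hashes haven’t been seen before.

```python
# Generic fixed-size-block deduplication sketch: changing a few bytes in
# a large file should only require storing the changed block(s).
import hashlib

BLOCK_SIZE = 1024 * 1024  # 1MB blocks (an assumed block size)


def unique_blocks(data: bytes, store: set) -> int:
    """Add data's block hashes to the store; return how many were new."""
    new = 0
    for i in range(0, len(data), BLOCK_SIZE):
        digest = hashlib.sha256(data[i:i + BLOCK_SIZE]).digest()
        if digest not in store:
            store.add(digest)
            new += 1
    return new


# A 10MB file of ten distinct blocks, then the same file with one byte changed.
original = b"".join(bytes([i]) * BLOCK_SIZE for i in range(10))
modified = bytearray(original)
modified[0] = 255  # change a single byte in the first block

store = set()
print(unique_blocks(original, store))         # 10 new blocks stored
print(unique_blocks(bytes(modified), store))  # only 1 changed block stored
```

Under this scheme my two small edits to a 100MB file should have cost one or two blocks each, rather than the 30MB and 10MB CloudBerry actually uploaded – which is why I describe its deduplication as only ok.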

Test Restore

The whole point of a backup is to be able to restore your files. As such test restores are an essential part of backups, and should be done every few months.

I did a test restore from S3 on a clean virtual machine. The files stored on S3 were compressed and encrypted. I used a different IAM user that only had read-only access to S3, and I restored only a small number of files. The restore process was intuitive, fast, and successful.

In the future I’ll try test restores of files I know have many versions, including restores to older versions. At this stage I’m confident that CloudBerry will handle this just fine.

CloudBerry Backup Review Conclusion

CloudBerry Backup is an effective, versatile tool. The interface is easy to understand and use, though a few things, like how to edit backup destinations, weren’t immediately obvious. Backup performance is excellent. It easily backs up to internal disks, external disks, NAS devices, and a huge variety of online storage providers. Backups can be run in parallel, which is ideal for backing up multiple backup sets to external disk.

Overall I’m very happy with CloudBerry backup. It’s a very flexible, configurable tool that performs well. I haven’t tried the Glacier backup functionality yet, but I assume based on my experience that it will work just fine.

After evaluating CloudBerry and a few other products, I’m going to make CloudBerry my main backup solution. It will replace the backup scripts, two backup programs, and CrashPlan backups. I’ll run it in parallel with the old backup system for 3-6 months to ensure there are no bugs or gotchas, but I don’t expect any.

I don’t think it’s prudent to trust all of my backups to a single program, media type, or location. As such I’ll keep an archive in Amazon Glacier, stored using different software and a different IAM user so permissions don’t overlap. I’ll also keep a simple copy of my files on an offsite disk as additional insurance.

I recommend CloudBerry backup to anyone with reasonable computer skills looking for a comprehensive, reliable backup solution.

 

Disclaimer: as a result of this review I was given a free license to CloudBerry backup standard edition. The review was written before I knew a free copy was offered to bloggers. This review is my honest unbiased opinion, and if I hadn’t been given a free license I would’ve happily paid for it.

 

Reference: S3 IAM Policy

For reference, here’s the IAM backup policy I’m using for CloudBerry backups to S3. A user associated with this policy can list all buckets, and has full read and write access to the specified bucket. The list access lets CloudBerry show the available buckets so you can choose which one to back up to.

In future I plan to experiment with a couple of changes:

  • I might remove delete permissions, though this would prevent CloudBerry deleting old versions of files.
  • I might experiment with S3 bucket versioning, which would let CloudBerry delete files but would keep a version history so they could be restored. In this case I would remove the S3 DeleteObjectVersion permission from the policy.

 

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": "s3:ListAllMyBuckets",
      "Resource": "arn:aws:s3:::*"
    },
    {
      "Effect": "Allow",
      "Action": "s3:*",
      "Resource": [
        "arn:aws:s3:::bucketname",
        "arn:aws:s3:::bucketname/*"
      ]
    }
  ]
}
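For the second option above, here’s a sketch of how the bucket-specific statement might look with explicit actions instead of s3:* – bucketname is a placeholder as in the policy above, and you’d enable versioning on the bucket separately:

```json
{
  "Effect": "Allow",
  "Action": [
    "s3:ListBucket",
    "s3:GetObject",
    "s3:GetObjectVersion",
    "s3:PutObject",
    "s3:DeleteObject"
  ],
  "Resource": [
    "arn:aws:s3:::bucketname",
    "arn:aws:s3:::bucketname/*"
  ]
}
```

With bucket versioning enabled, s3:DeleteObject only adds a delete marker, so without s3:DeleteObjectVersion the user can’t permanently remove old versions – exactly the behaviour I’m after.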

3 thoughts on “CloudBerry Backup Review – Windows, MacOS, Linux”

  1. Gordon Merryweather

    Interesting review – there aren’t too many out there of this product that properly dig into things.

    I have one Windows2016 Server running HyperV and a couple of Linux/Win VMs (Home Automation and File Server). I’m undecided between this and Duplicacy right now as they’re much the same cost – there’s a nominal renewal fee for the Duplicacy licence but Cloudberry is 30% more expensive for the server product.

    Have to admit the “poor” deduplication is not a trivial issue given how much cloud storage costs. Obviously you felt it wasn’t enough of an issue? How do you feel about the product now, one year later?

    1. Tim

      A year on my feelings about CloudBerry backup haven’t changed. Their core algorithms seem to be fairly simple with relatively poor deduplication. I use their incremental backups to local external disk, but I also use their “sync” function to save a copy on the disk just in case incremental backups can’t restore – though I have no good reason to think there would be any problem. I’m due to do a restore test soon, so I may update the article with the results of that.

      I backup some personal data to S3 using CloudBerry, mostly images and documents – for this I trust AWS far more, so I use S3 encryption and versioning rather than CloudBerry, and I don’t use incremental backups to S3. My data volume is fairly low, even with RAW files and some video, so the S3 costs of less than $1 a month are low enough they’re not worth spending time on. Note that with S3 I only keep files there six months until I go through my archiving process, which makes sure images exist on two local external disks and stores medium res photos / videos plus originals of all documents in AWS Glacier.

      I can’t comment on Duplicity (assume spelling error in your post) as I’ve never looked at it, but on the surface it looks interesting. I really like the idea of Duplicati which is free / open source, but my testing showed problems with restores when using non-standard block sizes a year ago. That bug is still open, and recently when they changed the status from alpha to beta software the main author said something like “we have to get around to tidying that up at some point”, which suggests the data is there and able to be restored and that it’s a tool / UI issue.

      All in all, I have no reason to think there’s any problem with CloudBerry, it’s just a gut feel.

      1. Gordon Merryweather

        Yes, I’d discounted Duplicati also as although I think it has great potential, the development is slow and still feels like a beta product which I can’t trust wholly with my data.

        It was Duplicacy 🙂 but I think it’s actually based on the same backend as Duplicity? https://duplicacy.com/home.html

        Many thanks for your thoughts.