# My new backup and archive routine



## JohnG (Jan 13, 2018)

I've spent a lot of time worrying about not having a remote backup and now I do. Here's what I'm doing:

Locally, I have a copy of everything from all the computers on hard drives set up for RAID 1 (the mirroring level, so every write is duplicated immediately). So that's really a double backup.

But that won't help if there's a big fire, so you need a remote solution too. Remotely, I take the mirrored drives and swap them out periodically, keeping one in the studio and one in a safe deposit box at the bank.

For archiving old Pro Tools and DAW projects, I keep a local copy (upstairs in the house) on a hard drive, and a second copy on M-Disc, also in the safe deposit box.

M-Disc purports to write data to an inorganic, rock-like layer that is stable indefinitely (they say 1,000 years, but whatever). The largest discs can hold 100 GB.

I also create daily backups of all project and Pro Tools files onto a separate drive in those computers, using Carbon Copy Cloner on the Macs and AOMEI for the PCs.


----------



## anp27 (Jan 13, 2018)

I use Carbon Copy Cloner and Backblaze.


----------



## JohnG (Jan 14, 2018)

anp27 said:


> Backblaze.



I'm still considering adding Backblaze to back up only my sequences and VE Pro setups. Too much data to use a service like that for sample libraries.


----------



## BGvanRens (Jan 14, 2018)

Interesting to see what and how other people take care of this.

Personally I have a NAS configured for RAID 1, and I'm planning to add a second NAS at another location and have them sync up during the night. Although it's only a 2-slot unit, I could always add more NAS units or rebuild the mirror using bigger drives (I'm using 2TB drives now). I never considered backing up libraries; might not be a bad thing to do...


----------



## anp27 (Jan 15, 2018)

JohnG said:


> Too much data to use a service like that for sample libraries.



Yeah, I have a relatively small sample library and projects (data in general) so Backblaze + Carbon Copy Cloner are perfect for me. Backblaze offers unlimited storage by the way.


----------



## JohnG (Jan 15, 2018)

anp27 said:


> Backblaze offers unlimited storage by the way.



I haven't heard anything bad about Backblaze. For all the online and offsite services, I'm not concerned about the quality, but the amount of data -- it's the rate at which things can be uploaded and stored. I have about 8 TB* of stuff to back up and someone told me that would take months.

If a computer fails, I want to be able to restore in minutes / hours, not a day or more, so that's why I think you have to supplement with a local backup even if you go the Backblaze / other route.

*Initially I wrote "8 GB" which was dumb of me.


----------



## JohnG (Jan 15, 2018)

BGvanRens said:


> I never considered backing up libraries, might not be a bad thing to do..



Sounds like a good setup. The only reason to back up libraries locally is if you're on a sharp deadline and you have to restore quickly to keep moving. You think "I can just re-download them" and you can, but sometimes that can take quite some time depending on one's ISP and the server speed. Some of my large libraries took 12 hours or more to download.


----------



## Ashermusic (Jan 15, 2018)

I back up my boot drive and audio drive to a physical HD with Time Machine. Once a month I back up my boot drive and audio drive with Carbon Copy Cloner to a different drive, that I then unplug. I have Backblaze for off site backup for ALL my drives, including the samples.

Not taking chances.


----------



## Kardon (Jan 15, 2018)

There's a very good "Best Practices" backup workflow write-up from the American Society of Media Photographers. They open their overview with "There are two kinds of people in the world - those who have had a hard drive failure, and those who will." So true. I lost 3 hard drives last year: one fully backed up, one partially backed up, and one WAS a backup in my NAS (non-RAID). It's very important to be diligent with your backup routines and keep them up to date (which was my partial-backup nightmare, though I think that drive can be repaired by the vendor with a firmware fix).

The ASMP promotes the 3-2-1 backup strategy: 3 copies, 2 different media types, and 1 offsite. I've slacked off of burning DVD backups for the 2nd media type, and use another hard drive.

The ASMP site is http://www.dpbestflow.org/node/262 and http://www.dpbestflow.org/links/39

World Backup Day is March 31st. But don't wait until then...


----------



## anp27 (Jan 15, 2018)

JohnG said:


> I have about 8 GB of stuff to back up and someone told me that would take months.



Hmm... Backblaze just finished backing up 600GB from my 1TB drive in about 12 days. I do have extremely fast Internet, though, and I set Backblaze to the Unlimited throttle setting (fastest).



JohnG said:


> If a computer fails, I want to be able to restore in minutes / hours, not a day or more, so that's why I think you have to supplement with a local backup even if you go the Backblaze / other route.



Exactly, which is why everything that is backed up on Backblaze is also available on external hard drives in my home. A copy and a copy of the copy. Oh, and I also have a Time Machine backup running.


----------



## JohnG (Jan 15, 2018)

@anp27 looks like you're covered!



anp27 said:


> 600GB from my 1TB drive in about 12 days



I miswrote -- should have been 8 TB, not 8 GB!

So if the speed is linear and you have 8TB to back up, it would take 3 months initially. Especially given that it's spread over a number of computers, that sounds like agony. I am happy with the safe deposit box!


----------



## JohnG (Jan 15, 2018)

Kardon said:


> I've slacked off of burning DVD backups for the 2nd media type, and use another hard drive.



That's why I bought the M-Disc thing -- the largest discs hold 100 GB each, which makes it (somewhat) less painful for archiving. I still don't see it as "backup," but more as long-term storage.


----------



## BGvanRens (Jan 16, 2018)

JohnG said:


> You think "I can just re-download them" and you can, but sometimes that can take quite some time depending on one's ISP and the server speed. Some of my large libraries took 12 hours or more to download.



That is exactly the reason why it might not be a bad idea in my case. Even though I could download it all at home on my 200Mbit line, I don't like that idea. The connection at my studio (currently under construction) is far less impressive -- 14Mbit at best -- so having the libraries locally is a huge time saver.


----------



## anp27 (Jan 16, 2018)

JohnG said:


> So if the speed is linear and you have 8TB to back up, it would take 3 months initially.



Yup, that sounds about right...


----------



## Henu (Jan 16, 2018)

I've been backing up basically every single music-related thing I've done since 2004, ranging from my own projects to various album recordings for work. Basically, as my colleagues and bandmates say, "if he's done it, he still has it," haha! I'm completely OCD about archiving everything, and I make sure everything's backed up as tidily (and as small as possible, so I use a lot of zipping) with clear folder structures and names.

Everything's backed up on (at least) two different physical media at different locations, and some of it to the cloud as well, so I'm pretty much covered. I tend to treat my work HDD as only "temporary" storage and make sure nothing stays there unless it's either backed up or unimportant. The emptier it is, the more secure I feel.

I also have a quick-and-dirty backup for projects in general: a simple Windows bat file I run twice a week from the Task Scheduler, or usually manually (double-click on the desktop) almost every time I finish for the day. Running the file searches my pre-defined locations for all Cubase project files and backs them up to the cloud with the same folder structure I have on my PC. It's especially handy with MIDI-only projects, as there's nothing more to back up than the project file itself. In case anyone wants to try it out, let me know and I'll copy-paste the command lines here!
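Henu's bat-file idea generalizes easily. Here's a rough Python sketch of the same workflow; the function name, the search/backup paths, and the `.cpr` Cubase extension are my assumptions, not Henu's actual script:

```python
import shutil
from pathlib import Path

def backup_project_files(search_dirs, backup_root, pattern="*.cpr"):
    """Copy every matching project file under each search dir into
    backup_root, recreating the relative folder structure, so a
    cloud-synced backup_root mirrors the layout on the source drives."""
    backup_root = Path(backup_root)
    copied = 0
    for root in map(Path, search_dirs):
        for src in root.rglob(pattern):
            # e.g. D:/Projects/Song1/Song1.cpr -> backup_root/Projects/Song1/Song1.cpr
            dest = backup_root / root.name / src.relative_to(root)
            dest.parent.mkdir(parents=True, exist_ok=True)
            # Skip files whose backup copy is already up to date.
            if not dest.exists() or src.stat().st_mtime > dest.stat().st_mtime:
                shutil.copy2(src, dest)  # copy2 preserves timestamps
                copied += 1
    return copied
```

Pointing `backup_root` at a folder an existing cloud client (Dropbox, OneDrive, etc.) already syncs gets you the "backs them up to the cloud" part for free, and running it from Task Scheduler twice a week mirrors Henu's setup.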


----------



## macmac (Jan 16, 2018)

Internal boot drive and 4 externals here for daily use. Every day or so I do cloned backups (via SuperDuper), doesn’t take long.


----------



## synergy543 (Jan 16, 2018)

Amazon has Seagate Backup Plus 8T drives for $169. If you're using a Mac, be sure to get the Mac version ($10 more) or formatting is a bit of a challenge.
(turn ad block off to see)


----------



## synthpunk (Jan 16, 2018)

The one problem I had is that when I tried Backblaze, it didn't finish backing up my system within the 30-day trial period. I kind of gave up then. Has anything changed?

Otherwise I'm pretty close to John's procedure (redundant RAID 1 of samples, projects, files, etc.) and Jay's (CCC & TM system backups once a week to two separate drives).

Like many others, one more layer of backup would make me feel much better. But it seems the dream of realistic cloud backup died when so many cloud services changed or went under.


----------



## jeffc (Jan 16, 2018)

JohnG said:


> I'm still considering adding back blaze to back up only my sequences and VE Pro setups. Too much data to use a service like that for sample libraries.



Hey John. I just dealt with this. I ended up using IDrive for offsite backup. The great thing about it is... they send you a hard drive for the initial backup, which saves you the weeks it would otherwise take. After that, the periodic backups happen on whatever schedule you set, as often as every 15 minutes. Once they have your initial backup, it's really smooth, totally runs in the background, and is pretty flawless. It's kind of like Dropbox, where it just does its thing and you never have to think about it. I obviously haven't had to restore from it yet, but if it were ever needed, they also offer a hard-drive restore option if you need it fast.


----------



## synthpunk (Jan 16, 2018)

Nice to see SuperDuper being developed again. I switched to CCC when things were up in the air, but I always enjoyed SD for its simplicity and free limited version.



macmac said:


> Internal boot drive and 4 externals here for daily use. Every day or so I do cloned backups (via SuperDuper), doesn’t take long.


----------



## SuperD (Jan 17, 2018)

For many years now, I've been using Déjà Vu for my backups. Quick, easy, affordable, and stable. http://propagandaprod.com

Normally I do backups of all my drives once a week. I keep my DAW projects on one drive and instrument libraries on another, plus there's the main OS drive in the MBP. I get DV to clone each of the drives, plus a few specific folders from the Mac. This system seems to work well for me. When I get the chance, I'll get more drives so I can keep off-location backups.


----------



## Rohann (Jan 17, 2018)

JohnG said:


> I'm still considering adding back blaze to back up only my sequences and VE Pro setups. Too much data to use a service like that for sample libraries.


Eh, not really? I'm sure I don't have as much as you, but I have about 4TB backed up to it. You can change settings so it only backs up certain files if there have been changes made to them, etc. It's technically a no-limit backup provider, and at $5 a month it was a no-brainer for me.


----------



## JohnG (Jan 17, 2018)

@Rohann I have at least 8TB, so it seems like it would take three months to start and then three months to restore.

Plus, updates seem to arrive regularly with new articulations and so on, so I picture a constant need to be online with that function.

So I am not going to go that route. The idea is to be able to get back in business quickly after a catastrophe like a fire, and three months is...a...long...time.

Last, I really don't think anything in the cloud is that secure. Which is a consideration if one has footage from clients.


----------



## Rohann (Jan 17, 2018)

JohnG said:


> @Rohann I have at least 8TB, so it seems like it would take three months to start and then three months to restore.
> 
> Plus, updates seem to arrive regularly with new articulations and so on, so I picture a constant need to be online with that function.
> 
> ...


Ooh, good point. Haha yeah that's a horridly long time. For my modest home setup and personal projects, it works as a last-ditch backup, mainly for the purpose of projects and sample libraries I can't re-download. I'm sure I'd be more particular with more clients and sensitive work, however, especially film. If that gets leaked somehow, one is probably hooped.

That said, since Backblaze's personal product is purely a backup service (its cloud storage offering is a separate, distinct product), I imagine it would be more secure. For $5 a month, even with the long download and upload times, it does seem worth it as yet another level of security.


----------



## TimRideout (Jan 18, 2018)

There's an excellent solution that I'm surprised more people aren't using - it used to be called BTsync (BitTorrentSync) - and is now known as Resilio Sync: https://www.resilio.com/individuals/
I use it to sync files across my computers and also to a remote NAS. Incredibly easy and efficient.


----------



## Piano Pete (May 1, 2018)

Question for everyone using cloud storage: why not just VPN into an offsite server to sync with an onsite NAS?

I guess the first hurdle would be having a place to park one, but if you did, wouldn't this technically be more secure (no 3rd parties involved) and cheaper in the long run?


----------



## Øivind (May 2, 2018)

I use a Synology NAS and then back those drives up to Synology's C2 cloud backup. The initial backup takes a while if you have large quantities of data, but subsequent backups only add what is new/changed.

If I had an offsite location, I would just get another NAS and have the onsite one back up to the offsite one instead of to C2.


----------



## Dewdman42 (May 2, 2018)

There are a lot of different strategies for backing up. The 3-2-1 strategy is a good overall set of objectives. Let me just make a few points I've come to over the years:

- Backup is NOT the same as redundancy. Mirrored drives are for redundancy, NOT backup: with mirrored drives, if one drive fails you don't stop working. You should have a spare ready to replace the failed drive the minute it fails. Mirrored drives are for zero downtime. That said, be aware that sometimes when a drive goes down it can corrupt the other drive, so you cannot count on mirrored drives as 100% protection.
- So you need another complete backup of whatever it is you want backed up. That's your go-to local copy if the mirrored pair goes down completely, or, if you don't have a mirrored drive, it's where you restore from when the main drive fails. You have some downtime while you restore. Your backup plan should take into consideration how long a restore will take, and you should make a test run.
- Then, for data you absolutely can't lose, you need it stored offsite somehow, just in case your house burns down.

Consider some additional points:

- You have two different reasons for needing a backup: in case your drive fails, and in case you make a human error and accidentally change or delete something and want to get back to what you had this morning, or yesterday, or last week. A software glitch could also cause data loss or corruption.
- Time Machine is an excellent product for recovering a version of a file that got corrupted or lost somehow. It is not a good solution for a complete backup should a drive fail and need to be replaced and restored.
- I have also made a personal decision that I don't need to back up stuff that is easily replaceable from a vendor. For example, I don't back up NI Komplete because I can easily re-download it or get it from the USB drive they sent me. I try to keep two copies of everything locally just in case. The only thing that goes to the cloud is stuff that would be hard to replace, or is irreplaceable because it was customized by me or is no longer available any other way.
- I use CrashPlan for offsite storage. It does take a long time for the initial backup, but it's fast after that. I only back up the non-replaceable stuff there, which is currently about 2 TB, but it's important to note that I still must have a complete backup of that stuff locally, which is the real backup. CrashPlan could go out of business or whatever, and I'd still have my primary backup at home. The cloud backup is only for the worst-case scenario where you lose all your local data, including both source and backup.
- I also decided I can live with some downtime if I need to re-set-up my Mac, reinstall software, etc. I am not a working professional. If I were, I would have a more elaborate backup strategy with minimal downtime to get a Mac back to its working state.

So generally I use Time Machine every hour to back up every file change on my Mac, so that if I screw up I can get back to what I had a few hours earlier. I use a mirrored NAS device to hold my critical data: projects, documents, photos, irreplaceable libraries. The NAS gets backed up to the cloud as well as to another local NAS which is not mirrored. If I wanted less downtime, I would also make periodic image backups of my Mac using Carbon Copy Cloner so that I wouldn't have to reinstall everything to get a working Mac back. I have a lot of sample libs on my Mac, and those would be excluded from an image backup; they should be backed up separately because they are so big and rarely change. But be advised that even just copying 8 TB from one drive to another will take a long time. So if time matters, use Carbon Copy Cloner to image the whole Mac, including sample libs, every night to another drive. If it fails you can boot from the backup and keep going.

Backup strategies can get complicated depending on what your needs are. I try to ride the fence: irreplaceable stuff is backed up 3 times, including once offsite, and stuff that changes often goes to some kind of versioning system like Time Machine. But I have decided I could live with being down for a week if I had to reinstall all my apps and sample libs from scratch. That simplifies my backup procedures and reduces how much space I need for the backup.


----------



## gsilbers (May 2, 2018)

For those who use Backblaze cloud backup...

Do you do a one-time upload and that's it, or is there a way to keep uploading monthly file changes/additions?


----------



## gsilbers (May 2, 2018)

For local backups I use ChronoSync.
For those not familiar with it... it's awesome and easy.


----------



## Dewdman42 (May 2, 2018)

CrashPlan definitely works like that, and you can actually restore older versions there too; in other words, CrashPlan is similar to Time Machine in that you can restore from a point in time.


----------



## JohnG (May 2, 2018)

gsilbers said:


> For local backups I use chronosync.



Mac or PC? Or both?

I would be thrilled to discover a program as straightforward as Carbon Copy Cloner or even Time Machine (both Mac backup programs) that works on a PC. I am using AOMEI, but I would rather just have a program that copies files over in their original format instead of disk images and all that.

The problem with Crashplan or Backblaze or any of those is that recovering 4-8TB of data is just going to take months. That is totally impractical even if this is just a hobby. That's why hard drive backups at the bank, rotated regularly, work better in my view.

The online backups are fine for project files -- midi files and such -- but not for one's entire rig.


----------



## Dewdman42 (May 2, 2018)

You should not think of any cloud backup service as your primary backup. It's a secondary or even tertiary backup: a worst-case-scenario last resort in case your house burns down.

Remember, they could go out of business, they could have their own technical problems, your internet could be down, etc. Many people use cloud backup as their only backup, and for many casual users that is probably fine, but for people like us with huge amounts of data, not really. Even 1 TB would take a very long time to restore. And you might need to change cloud providers for any reason under the sun, forcing you to back it all up again.

Strategically, I think people in our shoes need to think long and hard about what needs to be backed up. Different things need different kinds of backup. Sample libs typically don't change, and usually you can reinstall them from the vendor if you have to. Do they really need to be backed up at all? Do you need your main DAW machine back up and running within minutes or hours after the drive crashes? Then you'd better have an exact clone of it on drives in your studio.

I personally would not back sample libs up to the cloud at all EXCEPT for stuff you have customized or created, or sample libs that perhaps are no longer available from the vendor for whatever the reason.

Now, projects, documents, photos, and such are another matter. Those change frequently and are totally irreplaceable if you lose them. For those you definitely want a local backup as well as something in the cloud. And you want versioning, so you can restore from a specific point in time. For example, say you delete something by accident and don't realize it until a week later. If your nightly backup clones the drive, then two days later your backup volume won't have the deleted file. If you have versioned backups, you can go back two days, or a week, or whatever, and find that deleted file. Believe me, you want this, but you don't need it on everything. You need it for stuff that changes a lot, like documents, but also configuration files, etc.
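The versioning point can be illustrated with a toy snapshot backup (this is not any particular product; the function and folder names are made up): each run copies the source into a new timestamped folder, so a file deleted from the source still exists in older snapshots.

```python
import shutil
from datetime import datetime
from pathlib import Path

def snapshot(source, backup_root, stamp=None):
    """Copy the whole source tree into a new timestamped folder.
    Older snapshots are left untouched, so deleting a file from the
    source never deletes it from earlier snapshots."""
    stamp = stamp or datetime.now().strftime("%Y-%m-%d_%H%M%S")
    dest = Path(backup_root) / stamp
    shutil.copytree(source, dest)  # creates backup_root/<stamp>/...
    return dest
```

Real versioning tools (Time Machine, CrashPlan) deduplicate unchanged files instead of copying everything each run; this sketch trades disk space for simplicity to show why a cloning backup alone can't recover last week's deleted file.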

Time Machine does that well, but if you're on PC you need a different solution and unfortunately I am not familiar with PC options to recommend anything.

CrashPlan also does versioning, and it will let you back up to another device on your local network, so in a way CrashPlan is like a cross-platform Time Machine. It's worth looking into, but I still maintain that cloud backup should be your secondary backup.

*My Strategy*

My backup strategy is that anything super important and irreplaceable is saved to my mirrored NAS, mounted as a network volume on my LAN. The NAS is synchronized with another non-mirrored NAS for a local backup. It's also backed up to the CrashPlan cloud. I have a few sample libraries saved on the NAS too, but in their original installer form; these are libs I think might be hard to obtain later. Stuff like NI Komplete is NOT saved there. My cloud backup is about 2 TB, but I don't expect to ever have to restore from it unless my house burns down.

Then the Macs in the house all use Time Machine to back up to a Time Capsule server. This gives me versioning for whatever is on the Macs and included in the Time Machine backup set. I use Time Machine because it's built into the Mac; on PC I'd have to find something else.

I am considering running Nextcloud on my NAS, which will make it possible to have folders on my Macs that automatically sync with the NAS. That way I can, for example, work on a project in Logic Pro using local drives, and in the background Nextcloud will copy that data over to the NAS, which in turn makes sure to back it up to the cloud.

Any sample libs I have purchased, I make sure I can either get the data from the vendor again if I have to, or they're backed up at least once here onto USB hard drives, much like the ones they typically ship on.

If I wanted zero downtime, I could use Carbon Copy Cloner to exactly clone my Mac's HDs, including the sample-lib drives, over to other drives connected via USB or something, so that if a drive goes bad I can be back up and running in minutes.


----------



## tav.one (Aug 2, 2018)

JohnG said:


> and then three months to restore.



If you didn't know, Backblaze ships the hard drive with your backup data internationally and you can return the drive and get a refund, or keep the drive.
https://www.backblaze.com/restore.html


----------



## JohnG (Aug 2, 2018)

tav.one said:


> If you didn't know, Backblaze ships the hard drive with your backup data internationally and you can return the drive and get a refund, or keep the drive.
> https://www.backblaze.com/restore.html



I hear you and thanks for the suggestion / pointer. The goal of my backup setup is to be able within minutes, or at most an hour or two, to be composing again even if I lose a drive with a TB, or worse, a computer with 2-4TB of data.

The hard drive shipping normally would be great, but would be too slow if you're in the middle of a project and "Drive X" goes out with all your drums or strings or something. I realise that one can always re-download plugins and libraries, but that can take overnight for a single library, let alone "Drive D: Strings."

Your suggestion is still accurate of course, and could be an extra layer for smaller files. For example, Backblaze seems ideal for backing up project files, such as my DAW folder(s) for current project(s) and VE Pro setups on the PC slave computers. Those change quite often and, although I do back those up regularly (like once a day), it couldn't hurt to have an offsite in case who-knows-what happens.

In short, Backblaze does seem great for quickly restoring a relatively small number of files, but not 10TB of sample libraries and Pro Tools sound files.


----------



## tav.one (Aug 2, 2018)

I get what you're saying. You definitely need Carbon Copy Cloner or something similar for that scenario.


----------



## Dewdman42 (Aug 2, 2018)

If I were a working pro with a 10TB system I depended on for my livelihood, I would honestly clone the entire machine, all 10TB, and have the second machine synced every night so that it looks like the first machine. The second machine could be operated as a VEP slave or something so it's not sitting there doing nothing most of the time, but the critical point is that every night it would be updated to look exactly like the first machine in terms of what is on the disks. In a pinch, you can use machine #2 if machine #1 ever goes down for any reason.

CCC is a good software choice for the nightly sync; there are others too.
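That nightly sync is essentially a one-way mirror. Here's a minimal Python sketch, assuming plain folders (CCC, rsync, and similar tools do this far more robustly, handling attributes, deltas, and safety checks):

```python
import shutil
from pathlib import Path

def mirror(source, target):
    """Make target look exactly like source: copy new or changed files,
    then remove files on the target that the source no longer has."""
    source, target = Path(source), Path(target)
    target.mkdir(parents=True, exist_ok=True)
    src_files = {p.relative_to(source) for p in source.rglob("*") if p.is_file()}
    for rel in src_files:
        src, dst = source / rel, target / rel
        if not dst.exists() or src.stat().st_mtime > dst.stat().st_mtime:
            dst.parent.mkdir(parents=True, exist_ok=True)
            shutil.copy2(src, dst)
    # A true mirror also deletes what the source deleted -- which is
    # exactly why accidental deletions propagate overnight, and why
    # versioned backups are still needed alongside a mirror.
    for p in list(target.rglob("*")):
        if p.is_file() and p.relative_to(target) not in src_files:
            p.unlink()
```

Run nightly (cron, launchd, Task Scheduler), machine #2's drives stay a one-day-old copy of machine #1's.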


----------



## ReversedLogic (Aug 2, 2018)

I use software called CloudBerry, which is destination-agnostic. You can have it mirror locally, but it also supports several cloud destinations (Amazon S3, Azure, etc.). I use one called Wasabi because their storage is extremely cheap: $0.0049 per GB/month.

The client has encryption (ransomware) protection, so it will alert you if encrypted files are detected and won't back them up unless you OK them. It works quite well, although it's a bit more work than something like CrashPlan. It's also more flexible; it really depends how much control you want. It allows you to have multiple plans, with different frequencies and retention policies for each.


----------



## JohnG (Aug 2, 2018)

Dewdman42 said:


> If I were a working pro with a 10TB system I depended on for my livelihood, I would honestly clone the entire machine, all 10TB and I would have the second machine synced every night so that it looks like the the first machine



That's what I do, essentially, though to an external drive. Every computer gets backed up constantly. And a third copy at the safe deposit box at the bank, which is not quite as up to date but, for the most part, not too far out either except for project files.

I still have some exposure of course, since I don't have a fully redundant system. It is indeed possible for a computer to completely blow up, but since I switched to high-quality components and started building my own computers, that risk has abated to a great extent.


----------



## Øivind (Aug 2, 2018)

Some nice tips for backing up here.

As for me, I now use SyncBackFree to monitor the directories I keep projects in; it automatically syncs those directories to my Synology NAS once a day. The Synology NAS then backs up to Synology's C2 cloud backup once a week. Works great, and I don't have to think about doing anything.


----------



## benmrx (Aug 4, 2018)

+1 for ChronoSync. 

We use ChronoSync for automated nightly backups of all studios, calendars, our FileMaker database, etc. to a local rack-mount RAID, plus manual backups to a 2nd local rack-mount RAID.

We also do backups every night to LTO-6 using Retrospect. Then make clones of those tapes that go to an off-site storage unit. Each tape holds around 2.5TB. 

We also do backups to Amazon Glacier. 

Last month I had to pull a ton of old Pro Tools jobs from well over 10 years ago. Keeping everything organized is key!!


----------



## jcrosby (Aug 5, 2018)

gsilbers said:


> For local backups I use chronosync.
> For those not familiar w it... it’s awesome and easy.


Chronosync was recently recommended to me. From what I can tell it's _the_ missing link in my backup... I have backups of backups of backups of backups... It's the syncing of two non-identical machines that kills me, (laptop and desktop; both of which I write on somewhat evenly). So far Chronosync appears to be the best game in town for this kind of thing...


----------



## Kevin Fortin (Dec 1, 2018)

Thank you for this thread. It helped me (a hobbyist) sort out what would work best for me. In my case, I'm trying out CrashPlan, with the idea that I will back up Documents and Desktop and other essentials to the cloud, while separately backing up my sample drive to a local external drive to avoid the tedium of reinstalling those.

Edit: I ended up subscribing to Acronis True Image, which made more sense for my purposes.


----------



## Rob Elliott (Dec 1, 2018)

Yeah, for me it's a two-headed monster: redundancy and backup.

For backup, I have standard spinning drives in all three slaves and the main machine (holding all the project and sample files each computer carries). Worst case, an SSD data drive goes down: I could stumble through that day's work with the onboard spinning drives, then take care of the drive replacement once the cue is out. (I only turn these drives on when I back up, once a week, and more often for hot WIP projects.)

Each night I back up to my NAS.

Once a week I back up (via SyncToy) to external USB drives that go into my fire safe.

Where I feel I'm exposed is 'off-site' storage (in case of a local fire) AND cloned SSD OS drives, in case one of those goes down or gets corrupted (which has in fact happened to me, albeit only once).


----------



## tack (Dec 1, 2018)

I wonder if any two people have the same backup solution -- except of course for those who have none. 

I imagine this is because everyone's circumstances are different: their budget, the amount of data, data location, their sensitivity to data privacy, their personal RTO, RPO and MTO, their technical proficiency, the nature of the disasters they want to protect against, etc.

In my case, my requirements were:

- Budget: up to $20 CAD/month on services, $1k CAD for capital outlay.
- Amount of data: 4TB total, ~400GB utterly irreplaceable (i.e. personal data).
- Privacy: paranoia level. All data must be encrypted and readable only by me. Closed-source backup software distrusted.
- RTO: 0.25 hours for simple failures (disk failure, accidental file deletion); 48 hours for catastrophic disasters (e.g. fire or theft).
- RPO: 1 day.
- MTO: 5 days.
- Tolerance to technical difficulty: high.
- Worst-case tolerated disaster: major natural disaster in my area (one that I was lucky enough to survive).

Here's my own solution:

- All software (including sample libraries) is copied to my home NAS (4x6TB RAID5) before installation. (This was standard practice for me even before I had a proper backup strategy.)
- I have a server running a bunch of Linux VMs whose filesystems I also rsync nightly to a backup volume on my NAS.

Local Nextcloud:
- Nextcloud is installed on my NAS, with the Nextcloud client running on my desktops, syncing project directories and the other document folders where I store personal data.
- This protects against fat-fingering, and Nextcloud also provides versioning, so I can grab older saves of projects in case I need to recover something.
- My Nextcloud instance is accessible from the Internet as well, so I can get at important data when I'm away from home.
- Edit 2019-09-18: I've since switched to Syncthing for versioned file-syncing to my NAS, but I still use Nextcloud for storing documents and sharing files.

Local backup:
- I bought an additional three 6TB disks for local backups.
- Each weekend I back up important directories (including sample libraries) on my NAS to one of the disks (via rsync; the storage is encrypted).

Off-site backup:
- As I rotate through the disks for my local backups, I try to keep one of them at a friend's house, though I'm a bit more lax about this than I'd like to be.

Cloud backup:
- The encryption requirement ruled out Backblaze (their unlimited Personal Backup product, not B2).
- I settled on Wasabi (which I see ReversedLogic mentioned earlier in the thread) and have been quite happy with it: excellent performance and very low cost, but not so low that I think their business model is unsustainable (a problem I see with quite a lot of cloud backup offerings, especially unlimited ones).
- The region where my data is stored is 1000 km from my home; quite sufficient for DR purposes.
- I decided to back up only irreplaceable personal data to the cloud.
- I use the free version of Duplicacy; it runs each night on my NAS, backing up essential personal data (about 400GB).
- Duplicacy provides client-side encryption and ensures Wasabi can't access my data.
- It also has a really nice property: all backups are equal snapshots (no need to choose between incremental, differential, or full backups). Consequently, I can maintain version tiering with my cloud backups, with access to versions going back up to a year.

Total cost is $5 USD/month

The hard part really is the initial setup and getting the automation in place. The only manual thing I need to do is the weekend backups.

I should be able to weather most sorts of disaster, although I hope in the fullness of time I'll discover all this has been for naught.


----------



## Dewdman42 (Dec 2, 2018)

A lot of ways to approach backup and redundancy. It can easily get overwhelming if you cover all bases. This is what I’m currently doing.


- A Netgear ReadyNAS functioning exclusively as a Time Capsule server. Two or three Macs are backed up via Time Machine to this server. However, many large directories full of easily downloadable content are excluded from the Time Machine backup, such as Logic Pro content, most sample libs, etc. IMPORTANT POINT: the main reason for this backup is to restore a file here or there that gets accidentally deleted or corrupted for any reason, including user error. Frankly, I have rarely needed to.
- A second Netgear ReadyNAS running in RAID-5 mode. This is a large volume with enough space to hold all mission-critical data files and most sample libs that I own. Because it is RAID, one drive can go bad and I will still be able to keep using it until I can get a replacement drive. I work directly from this network volume whenever I can, but DAW projects are manually copied over at points in time deemed by me.
- The RAIDed ReadyNAS is backed up to CrashPlan, which also includes versioning, similar to Time Machine. Anything that is irreplaceable is included in that backup: for example DAW projects, photo collections, some sample libs. In general, things which I know can be redownloaded or reinstalled from elsewhere are not included in this offsite backup.

So the takeaway is that my critical and irreplaceable files are RAIDed and backed up offsite. Replaceable things that I can obtain and reinstall as needed are kept somewhere offline once, and are definitely not included in nightly or regular backups. Granted, if something happens to the drive in one of my Macs, I will have some downtime while I get a new drive and reinstall OS X and all my apps and libs, which is probably a 20-40 hour process. But I can live with that to avoid backup mania leading to complication and expense. If I were a working pro it would be a different matter: I would keep one completely mirrored volume of every mission-critical machine, so that if it goes down I swap drives and keep working unaffected.


----------



## Anders Wall (Dec 3, 2018)

Dewdman42 said:


> The raided readynas is backed up to crashplan.


May I ask how?
I can't find anything on how to back up a NAS to CrashPlan.
Thanks.
/Anders


----------



## Dewdman42 (Dec 3, 2018)

Anders Wall said:


> May I ask how?
> Don't find anything on how to backup a NAS to Crashplan.
> Thanks.
> /Anders



https://hub.docker.com/r/jlesage/crashplan-pro/

CrashPlan has had a Linux client for years. The old way to do it was some unsupported hackery whereby the GUI on Mac/PC connects to the Linux box running the GUI-less CrashPlan in order to configure and control it. Apparently that mode of operation is being discontinued, though I'm still using it for now. The method above is what I plan to switch to soon, using Docker and a web-based GUI.

The guts of CrashPlan run as a command-line program without any GUI on all platforms, including Linux. They generally use a Java client that connects to it for configuration and so on; with the Docker image above you use a web-based GUI instead. Supposedly. I haven't set it up that way yet, but will be doing so soon.

This requires your NAS to be running Linux. ReadyNAS works. Probably Synology works. Not sure about others.
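For what it's worth, running that image typically looks something like this; the host paths and port mapping below are assumptions drawn from how the jlesage images are generally configured, so check the Docker Hub page for the authoritative options:

```shell
# Run the CrashPlan PRO container; the web-based GUI then becomes
# reachable at http://<nas-ip>:5800. Host-side paths are hypothetical.
docker run -d \
    --name=crashplan-pro \
    -p 5800:5800 \
    -v /docker/appdata/crashplan-pro:/config \
    -v /volume1:/storage:ro \
    jlesage/crashplan-pro
```

`/config` holds the CrashPlan settings, and `/storage` is the data you want backed up, mounted read-only so the container cannot alter it.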


----------



## JohnG (Dec 3, 2018)

Rob Elliott said:


> Once a week I back up (sync toy) to external usb drives that go into my fire safe.
> 
> Where I feel I am exposed is for 'off-site' storage (in case of local fire)



I put an extra backup in my bank safe deposit box periodically to cover the off-site matter, something you could consider. 

I first back up to disks 1 and 3 of a four-bay drive enclosure. The enclosure is set to generate a mirror on disk 2 of everything on disk 1 (Macs), and a mirror on disk 4 of everything on disk 3 (PCs / Windows). I swap disks 2 and 4 out to the bank once in a while, and when I bring back the pair from the bank, I put them in bays 2 and 4, where they're automatically overwritten by the latest backup.


----------



## dzilizzi (Dec 3, 2018)

I am slowly working on backing up all my computers and programs after my music computer HDD crashed recently. I thought I had an image; I did, but it was too old to be more than a starting point. Fortunately, that drive just had programs and effects on it. All the samples were on other computers and most of the data was duplicated somewhere.

I'm looking at all these recommendations for cloud storage. It always makes me nervous to put personal stuff in the cloud, but with the recent fires here in California, it seems like a good idea. I noticed that Amazon's Glacier is extremely inexpensive, and combined with a safe deposit box, might be what I need, at least for things like pictures and sample libraries that can't be redownloaded. It can be expensive (and slow) to retrieve from it, so I don't think of it as a primary backup solution.

Has anyone lost all their information that was backed up to a cloud storage? If you accidentally missed the payment, do they delete it all the next day? With my luck, that is when my computers would all crash.


----------



## Gerhard Westphalen (Dec 3, 2018)

Anyone ever have issues with a "live" backup that continuously mirrors to a NAS or cloud? I know of a few people who work directly in their Dropbox folders but I wouldn't want to be constantly uploading all day and eating up hard drive bandwidth. I'd be worried about issues if I were, say, recording a number of tracks and then the hard drive is simultaneously trying to get it all uploaded. I just do a nightly backup because of this. I think Time Machine works continuously?


----------



## Desire Inspires (Dec 3, 2018)

2TB with Google Drive for $9.99 a month.


----------



## JohnG (Dec 3, 2018)

dzilizzi said:


> I'm looking at all these recommendations for cloud storage.



I do think cloud storage -- in a limited way -- could be helpful for composers.

The "limited" proviso arises from the relatively glacial speed of recovery. If you work in TV your recovery time has to be measured in hours, not days or weeks or longer. Cloud storage could certainly back up your current sequences and audio material you've recorded for a show or, maybe, a movie/game, though once you've recorded the orchestra / large numbers of parts the data can get pretty huge for a movie too.

I doubt cloud storage's recovery speed is fast enough when you are talking about recovering your sample libraries. If you have multiple terabytes of sample data, recovery could take many days or weeks.



Gerhard Westphalen said:


> I think Time Machine works continuously?



Yes it does, or at least something like once an hour (so not, technically, "continuously.")


----------



## Gerhard Westphalen (Dec 3, 2018)

JohnG said:


> Yes it does, or at least something like once an hour (so not, technically, "continuously.")


If you were recording something like 24 tracks at 96k where it's generating a lot of data (so that the Time Machine ends up constantly running to try to catch up), couldn't that lead to some issues in terms of limiting the drive's bandwidth while you're recording? Not sure if it would be an issue with SSDs.

Edit: And if it's going over the network, you also have all of the network activity which could lead to DPC spikes and all that jazz.


----------



## Synetos (Dec 3, 2018)

I use SYNC.COM because all the data is encrypted and won't be scanned like Google Drive and others. Upload is slow (12 Mbps) but download is 300+. Not a great solution for fast recovery, but certainly easy offsite storage. The 2TB plan at $96 a year was cheaper than another USB drive, and it can't die or get lost. I archive long-term storage items, but use the sync feature for things I might want immediately available. Just gotta watch the data usage with your service provider, if you don't have an unlimited data plan.


----------



## Synetos (Dec 3, 2018)

BTW, I do not record to the sync drive. I record to an SSD, and then have an automated backup process robocopy the data to a second SSD. From there, I archive to USB and also to the cloud. That is my process at this point. I haven't lost anything, yet...


----------



## Stevie (Dec 3, 2018)

Dewdman42 said:


> https://hub.docker.com/r/jlesage/crashplan-pro/
> 
> this requires your NAS to be running linux. Readynas works. Probably Synology works. Not sure about others.



It does, running the docker version for quite a while on my Syno.


----------



## JohnG (Dec 3, 2018)

Gerhard Westphalen said:


> If you were recording something like 24 tracks at 96k where it's generating a lot of data (so that the Time Machine ends up constantly running to try to catch up)



I would never try to get Time Machine to back up that much. I use it for project files (like sequencer files) -- not heavy audio drives.


----------



## dzilizzi (Dec 3, 2018)

JohnG said:


> I do think cloud storage -- in a limited way -- could be helpful for composers.
> 
> The "limited" proviso arises from the relatively glacial speed of recovery. If you work in TV your recovery time has to be measured in hours, not days or weeks or longer. Cloud storage could certainly back up your current sequences and audio material you've recorded for a show or, maybe, a movie/game, though once you've recorded the orchestra / large numbers of parts the data can get pretty huge for a movie too.
> 
> ...


Very slow download, yes. But it would not be the primary or secondary backup. Kind of a just-in-case, in the event my house and my bank were both destroyed in an earthquake. I'm looking at more than 2TB of data, which is why I'm thinking about it. I am a packrat; I have way too much stuff....


----------



## Anders Wall (Dec 3, 2018)

Dewdman42 said:


> https://hub.docker.com/r/jlesage/crashplan-pro/


Perfect, thank you!!!
/Anders


----------



## Stevie (Dec 4, 2018)

Here's a step by step guide: 
https://blog.rylander.io/2018/07/20...mall-business-using-docker-on-a-synology-nas/


----------



## Anders Wall (Dec 4, 2018)

Stevie said:


> Here's a step by step guide:
> https://blog.rylander.io/2018/07/20...mall-business-using-docker-on-a-synology-nas/


Thank you, helped me solve the "CrashPlan for Small Business is exceeding inotify's max watch limit."
/Anders


----------



## JohnG (Dec 4, 2018)

dzilizzi said:


> Kind of a just in case my house and my bank were destroyed in an earthquake.



I guess it's conceivable your bank and house both could be destroyed, but I'm banking (har-har) on those odds approaching the negligible. In the case where both house _and_ bank are flattened, composing may be off the table for a non-trivial amount of time.

I'm more worried about fire at my house.

*Mucho Terabytes*

I use this enclosure: TerraMaster D4-310 USB3.0 Type C External Hard Drive 4-Bay RAID Enclosure Supports 2 Sets of RAID Storage with 2 USB3.0 HUB’s (Diskless)

It can hold lots of terabytes (from their description): Can install four either 2.5 or 3.5 inch SATA or SSD drives. Compatible with 3.5 inch SATA drives up to 10TB, and a maximum total capacity of 2 sets of 20TB.


----------



## MartinH. (Dec 4, 2018)

JohnG said:


> I haven't heard anything bad about Backblaze. For all the online and offsite services, I'm not concerned about the quality, but the amount of data -- it's the rate at which things can be uploaded and stored. I have about 8 TB* of stuff to back up and someone told me that would take months.
> 
> If a computer fails, I want to be able to restore in minutes / hours, not a day or more, so that's why I think you have to supplement with a local backup even if you go the Backblaze / other route.
> 
> *Initially I wrote "8 GB" which was dumb of me.



Bit of a necro reply, but I thought it might be interesting for anyone considering Backblaze: I live in Germany and actually tried to use Backblaze for a couple of weeks. They use a transfer protocol that limits transfer bandwidth based on latency, and they don't have datacenters in Europe. Because of this my upload speed was ridiculously slow; it hovered around 2% utilization of my 10 Mbit upload bandwidth. Their support was very friendly and quick to respond, but told me there was nothing they could do at that moment (that was March of this year; no idea if they've opened up new data centers since). If you are in the US it might not be an issue; I'm mainly mentioning this for non-US-based composers.


----------



## whiskers (Dec 4, 2018)

The IT guy in me just wants to say "RAID is not a backup"


----------



## tack (Dec 4, 2018)

MartinH. said:


> They use a transfer protocol that limits transfer bandwidth based on latency and they don't have datacenters in Europe. Because of this my upload speed was so ridiculously slow, it hovered around 2% utilization of my 10 mbit upload bandwidth.


This might just be a basic TCP issue, and not a limitation with the application protocol (except inasmuch as it's not explicitly optimized for high latencies through parallelism -- perhaps that's exactly what you meant).

Since we're talking about Backblaze I know you weren't sending from Linux. But for those of you who are uploading to cloud storage providers from a Linux system, if you have a relatively recent kernel, definitely enable the bbr congestion control algorithm.

It's really truly magical what it can do to upstream performance. At work, I was getting some underwhelming results over a private backhaul between the US and Singapore, getting maybe 100Mbit/s upstream with cubic (the default congestion control algorithm) in the face of about 250-300ms latency, and switching to bbr abruptly improved the upstream to 2-3Gbit/s with no untoward queuing on any of the intermediate network elements.
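For anyone wanting to try this on their own Linux box, it's a two-key sysctl change (kernel 4.9 or newer; bbr is designed to be paired with the fq queueing discipline):

```shell
# Switch TCP congestion control to BBR; requires root.
sysctl -w net.core.default_qdisc=fq
sysctl -w net.ipv4.tcp_congestion_control=bbr

# To persist across reboots, put the same two keys in a file under
# /etc/sysctl.d/, e.g. 99-bbr.conf:
#   net.core.default_qdisc=fq
#   net.ipv4.tcp_congestion_control=bbr
```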


----------



## MartinH. (Dec 4, 2018)

tack said:


> This might just be a basic TCP issue, and not a limitation with the application protocol (except inasmuch as it's not explicitly optimized for high latencies through parallelism -- perhaps that's exactly what you meant).


I already forgot the details, but probably yes.


----------



## Soundhound (Dec 4, 2018)

I used to use - 
Crashplan for working files, archives
Two separate external HD sets for sample libraries
Drobo for non working files, old odds and ends storage

I currently use
- Backblaze - everything, including 12TB of sample libraries etc. I went to Backblaze when Crashplan shifted to a business-oriented backup posture earlier this year. Liking Backblaze much more so far.
- One set of external HDs for sample libraries.
- The Drobo recently died (either the Drobo itself or the drives; I'm finding out). There's some old stuff on there that may be gone forever; I won't know whether it's needed until I need it.  Currently looking for a replacement for the Drobo.


----------



## Dewdman42 (Dec 4, 2018)

I have found CrashPlan's business model fine. But Backblaze is also well reviewed and economical. Do they have a Linux client for it yet?


----------



## Dewdman42 (Dec 4, 2018)

I want to add a little philosophy while I sip my coffee this morning. People should think of cloud backup as something they will probably never have to restore from. This is the last-ditch backup location in case your house burns down, etc. Beyond that, you should not think of it as permanent archival: Backblaze or CrashPlan could go out of business at any time. This is just a place to copy your absolutely irreplaceable files, so that if you lost everything, your entire studio fried by a solar-flare EMP, you could get that valuable data back from the cloud, and you'd be thankful beyond belief to get it back even if it takes a week to download.

That being said, there have been a few times when I recovered a single file or folder from CrashPlan, and it only took a few minutes; it wasn't terrible, and CrashPlan's versioning absolutely did come in handy to get a week-old version of something back to my NAS. So it does have occasional uses like that, which is good.

But really, if you lose an entire drive or a very large dataset, you should not expect the cloud to be the primary backup location for that drive. You should keep it backed up locally; a dead drive is a much more likely disaster than your house burning down or an EMP event. You want to be able to restore within minutes or a few hours, and you will need local backups of anything you expect to restore quickly so you can get back to work. It costs money to create so many backups, so everyone has to determine their priorities: what needs to restore in minutes, what would be OK to reinstall over a few days, and what could wait a week or two to download from the cloud.

And when using cloud services, you should never think of them as your archival place. Do not depend on any service for that; you must archive it yourself. Be ready at any time to move to a different service provider and upload it all again, because the first one went out of business or whatever. Archives need to be in more than one place, and probably in a bank vault as John does. And by the way, DVDs are not permanent; don't expect them to be.


----------



## Synetos (Dec 4, 2018)

Well... in support of the cloud as "a" place to store a backup, not "the only" place: it takes about 8hrs to pull down 1TB at 300Gbps. It takes about 8 days to get it into the cloud at 12Gbps. So I agree that it isn't a great single solution, but it is workable for restoring a project, which probably isn't 1TB. It just depends on what you have for upload and download service.


----------



## tack (Dec 4, 2018)

Synetos said:


> it takes about 8hrs to pull down 1TB at 300Gbps. It takes about 8 days to get it in the cloud at 12Gbps.


Mbps


----------



## dzilizzi (Dec 4, 2018)

tack said:


> Mbps


So, 6 weeks?


----------



## Dewdman42 (Dec 4, 2018)

It took me more than a month to upload my 2TB of data to the cloud. I have 100 Mbit speed, which is high for most people, actually. Very few have the luxury of Gbps bandwidth; if you do, you can probably afford a better in-house backup system too.


----------



## JohnG (Dec 4, 2018)

whiskers said:


> The IT guy in me just wants to say "RAID is not a backup"



"Look, contradiction isn't an argument"

"Well...can be."

RAID can be a backup-of-a-backup if it's the right kind of RAID. I back up to disk 1 and have a mirror RAID to disk 2, so the same data is on both. So, if I understand it correctly, it would take both disks to fail to have no backup at all.


----------



## tack (Dec 4, 2018)

dzilizzi said:


> So, 6 weeks?


No, it takes ~8 hours to pull down 1TB at 300Mbps, not 300Gbps. 300Gbps could do it in 27 seconds, although I don't think your storage will be fast enough.
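The back-of-envelope arithmetic, for anyone who wants to plug in their own numbers (decimal units, and it ignores protocol overhead, so real transfers run somewhat slower):

```shell
# transfer_hours TERABYTES MBPS: hours needed to move a given number of
# terabytes over a link of the given speed in megabits per second.
# 1 TB = 8e12 bits, so seconds = TB * 8e6 / Mbps.
transfer_hours() {
    awk -v tb="$1" -v mbps="$2" 'BEGIN { printf "%.1f\n", tb * 8e6 / mbps / 3600 }'
}

transfer_hours 1 300    # 1 TB at 300 Mbit/s: about 7.4 hours
transfer_hours 1 12     # 1 TB at 12 Mbit/s: about 185 hours (~8 days)
```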


----------



## Dewdman42 (Dec 4, 2018)

Strictly speaking, whiskers is correct: you should definitely NOT think of RAID as a backup, because it is not. RAID is redundancy, which means that if a single drive goes down, you keep operating without any downtime; if another drive also goes down, you're not operating anymore until you can restore from backup.

An actual backup means you are protected from a drive going down, and also protected against accidental deletions, file corruption, or other strange things that can damage your data, whether by user error or some system failure. RAID does not protect you from that; it just faithfully copies the corruption to both drives.

So RAID is good, yes, for zero downtime. You can sleep a little easier knowing that if you get a drive failure, you can replace it quickly with no downtime. But that is not a complete backup and should not be thought of as one. You still need to back up the entire RAID to a completely different volume at regular intervals, with or without versioning, depending on how much you want to spend and how much complexity you'll tolerate. Then, finally, you have cloud backup beyond that in case your house burns down.

The old adage is, keep your important data in 3 places.


----------



## tack (Dec 4, 2018)

JohnG said:


> RAID can be a backup-of-a-backup if it's the right kind of RAID. I back up to disk 1 and have a mirror RAID to disk 2, so the same data is on both. So, if I understand it correctly, it would take both disks to fail to have no backup at all.


From an IT perspective, backups solve a particular problem that's different from physical failures (disk or hardware failure, network partition, etc.). For example, you might have a database that replicates data realtime between different continents, but we (IT nerds) don't consider that a backup because it wouldn't solve e.g. fat fingering a deletion, or provide a means of recovering historical data that's changed over time.


----------



## whiskers (Dec 4, 2018)

JohnG said:


> "Look, contradiction isn't an argument"
> 
> "Well...can be."
> 
> RAID can be a backup-of-a-backup if it's the right kind of RAID. I back up to disk 1 and have a mirror RAID to disk 2, so the same data is on both. So, if I understand it correctly, it would take both disks to fail to have no backup at all.



Yes, in that scenario you are correct: with RAID 1 you're protecting against the loss of a single disk. However, since data is written to the second disk almost instantaneously, you're not protecting against corrupted data being written, malware, etc.

From the IT perspective, the more holistic solution would be three copies of the data, on two different types of media, with at least one copy offsite (the 3-2-1 rule).

But you're right, for the home user RAID 1 is better than nothing. Cheers.


----------



## JohnG (Dec 4, 2018)

Fair enough points @tack and @Dewdman42 and @whiskers

But just to review the bidding, the RAID I'm using is mirroring my backup, not my original files, so the RAID is to protect against drive failure. The "originals" are on the DAW and the PC slave computers and, although they're on an automatic backup schedule, it's once a week, hopefully long enough to discover malware etc.*

Then I take the mirrors periodically and swap them out, putting the fresh ones in the bank. So I guess I have:

1. The original files

2. The backup of those files on an HDD

3. A copy of that backup on another HDD, mirrored; and

4. A copy at the bank.

The copy at the bank is of course out of date, but still a lot better than what would be left after a fire.

I had all my computers sitting by the front door for a week recently, watching ash fall around the house, so it's not a wholly theoretical risk. For me, in the end, nothing happened. Some people were a lot less lucky.

*I actually have daily backups too onto a hard drive of Digital Performer (DAW) files and my Pro Tools recordings. That's the extra-extra backup in case I delete something.


----------



## whiskers (Dec 4, 2018)

JohnG said:


> Fair enough points @tack and @Dewdman42 and @whiskers
> 
> But just to review the bidding, the RAID I'm using is mirroring my backup, not my original files, so the RAID is to protect against drive failure. The "originals" are on the DAW and the PC slave computers and, although they're on an automatic backup schedule, it's once a week, hopefully long enough to discover malware etc.*
> 
> ...



That sounds much more thorough; I'd be comfortable with that. I was just pointing out a general pet peeve of mine, not digging on your solution.  Just don't forget to verify your backups occasionally and you're good!


----------



## Soundhound (Dec 4, 2018)

@Dewdman - I uploaded about 8TB of sample libraries in less than a week (I think) with Backblaze. It was very quick; I'm not sure of the exact number of days. That was at our place in LA, which has 150 Mbps upload. In Georgia, where we are currently, it's about 10 Mbps upload, so it takes way longer.

When Crashplan changed to business orientation I looked around for another option and I think it was on ViC that I heard about Backblaze. I'd always found Crashplan a little difficult to manage and understand, but I'm not very tech savvy. Backblaze I find a lot more user friendly. 

And the point about backing up locally as well as in the cloud - when my Drobo crashed recently I didn't have my Time Machine going, as we'd moved recently, and I did have to recover about 300gigs of working data from Backblaze. I think it may have taken overnight to prepare and download the stuff, not bad and saved my butt. 

Next up is updating my local backup of the sample libraries. This backup stuff takes a lot of time. I'm turning into the disgruntled IT guy.


----------



## dzilizzi (Dec 4, 2018)

I'm calling this my backup project. I'm hoping it will be set up by the end of December. It involves a lot of culling and organizing of files. I tend to download everything to my desktop before copying it to my studio computer or my laptop, so in some ways that has been a backup. I wanted to make an image of my desktop, only to realize there is about 800 GB of files in the download folder.

I probably need to combine it with the move everything on CD and DVD to hard drives project. It is way too much work.


----------



## MartinH. (Dec 4, 2018)

whiskers said:


> Just dont forget to verify your backups occasionally and you're good!



How does one best do that? I use Beyond Compare; should I occasionally do a full CRC checksum compare between the main drive and the backup drive, instead of the quick compare that happens during a regular backup? I'd have to schedule something like that in smaller chunks, because it would take too long for the whole drive at once.


----------



## whiskers (Dec 4, 2018)

MartinH. said:


> How does one best do that? I use beyond compare, should I occasionally do a full CRC checksum compare between main drive and backup drive instead of the quick compare that gets made during regular backup? I'd have to schedule something like that down to smaller chunks because it would take too long for the whole drive at once.



From an IT perspective, I would be tempted to say: attempt a restore occasionally. But for most people that's not feasible.

Checksums/hashes would definitely be a good idea in theory, but they're really only useful if you haven't changed the source you're comparing against. As soon as you change an item in the source of whatever you backed up, the hash is going to look different. But if you hash right after backing up, that's a good integrity check and provides some peace of mind.

Correct me if I'm wrong: BC will tell you if any files are missing and any deltas between two drives or images, correct? But I don't think it would account for corrupted files. It's been a while since I've been directly involved on the sysadmin side of things, so others may have better ideas. Cheers!
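As a rough illustration of that kind of integrity check, here is a hash-and-compare sketch for a Linux box with `sha256sum`; the function name is made up, and this is not how Beyond Compare works internally:

```shell
# verify_backup SRC DST: hash every file in both trees and compare.
# Prints "ok" when the contents match exactly, "MISMATCH" otherwise
# (including silently corrupted files, which a size/mtime compare misses).
verify_backup() {
    a=$(cd "$1" && find . -type f -exec sha256sum {} + | sort -k 2)
    b=$(cd "$2" && find . -type f -exec sha256sum {} + | sort -k 2)
    if [ "$a" = "$b" ]; then echo "ok"; else echo "MISMATCH"; fi
}
```

Hashing a multi-terabyte tree takes hours, so in practice you would run this per top-level folder on a rotating schedule rather than over the whole drive at once.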


----------



## MartinH. (Dec 4, 2018)

whiskers said:


> Correct me if I'm wrong, BC will tell you if any files are missing/any deltas between two drives or images correct? But i don't think it would account for corrupted files? Been awhile since I've been directly involved in the sysadmin side of things, so others may have better ideas. Cheers!



BC works at the file level, no images; that's why I like it so much. I have a huge dataset (roughly 3 TB) and make changes to multiple gigabytes of files daily, and I don't need any automated versioning because I'm used to saving versions of my files manually. All I need the backup to do is make sure all files are regularly mirrored to the backup drive (ideally daily). I disabled deletion of "orphaned" files, so that it doesn't automatically remove a file from the backup when I accidentally delete it on the main drive. A full restore would be entirely impractical, and it would be impossible to manually check that the files are all still fine. I was thinking that as long as I do an occasional CRC check with the file comparison (which, admittedly, I have so far never done), it should protect against "data rot" on the backup drive. Is that not the case?

I do a fairly regular "drive rotation" though, to prevent total drive failure in the first place. I buy a new drive every ~2 years, use it as primary backup drive for a few months, then switch the roles of active and backup drive with the main drive. That way my main drive is always less than 2 years old and always was "battletested" for 2-3 months before it got used for "critical" work. I've based those numbers on the statistical clustering of hard drive deaths along the time axis, as demonstrated both by the big google study on hard drive longevity (released many years ago) and my own personal experience with drives.


----------



## whiskers (Dec 4, 2018)

MartinH. said:


> I was thinking as long as I do an occasional CRC check with the file comparison (which so far I have never done, admittedly), it should protect against "data rot" on the backup drive. Is that not the case?



Sounds right. DR (disaster recovery) is not really my area of expertise, but my understanding is that periodically running the drives and checking the contents helps against bit rot/data decay/whatever you want to call it. While running, the drives typically perform SMART checks and can mark bad sectors as unusable (think chkdsk or fsck), so it's definitely better to occasionally check the contents, for sure.


----------



## LinusW (Dec 5, 2018)

I have unlimited space at Jottacloud. Everything gets backed up there. As cheap as Backblaze, with fewer restrictions.


----------



## tack (Dec 5, 2018)

For services offering cheap unlimited storage, it's _very likely_ the case that your data is accessible to the storage provider. This is how they can offer unlimited storage: by relying on the fact that a good chunk of the data can be deduplicated. (If everyone here uploads their copy of Monster Library X, the provider only needs to store one actual copy of it.) This is invariably coupled with the requirement to use their proprietary software rather than a third-party backup program (because third-party software would properly encrypt the data, defeating deduplication).
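The deduplication argument above can be illustrated with a toy content-addressed store: blobs are keyed by their content hash, so identical uploads from different users occupy a single slot. This is a simplified sketch of the general idea, not how any particular provider implements it (real systems typically dedupe at the block level):

```python
# Toy content-addressed store illustrating provider-side deduplication:
# identical file contents are stored once, regardless of who uploads them.
import hashlib

class DedupStore:
    def __init__(self):
        self.blobs = {}   # sha256 digest -> content, stored once
        self.index = {}   # (user, filename) -> digest

    def upload(self, user: str, name: str, data: bytes) -> None:
        digest = hashlib.sha256(data).hexdigest()
        self.blobs.setdefault(digest, data)   # store only if unseen
        self.index[(user, name)] = digest

    def stored_bytes(self) -> int:
        return sum(len(b) for b in self.blobs.values())

store = DedupStore()
library = b"x" * 1000          # stand-in for "Monster Library X"
store.upload("alice", "MonsterLibraryX.dat", library)
store.upload("bob", "monster_x.dat", library)   # deduplicated away
```

Two uploads, 1000 bytes actually stored. Note that if each user encrypted the library client-side with their own key, the two blobs would hash differently and the dedup savings would vanish, which is exactly why these services push their own software.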

I'd be quite interested to learn about exceptions to this: are there any services offering unlimited storage for around, say, $10 USD a month that present, for example, an S3 API interface for use with third-party backup software, and have been offering unlimited storage for more than 3 years? (The more-than-3-years criterion is important, because many unlimited storage offerings have come and gone over the years; they tend to vanish when businesses realize the product isn't sustainable.) These are the really interesting options IMO, especially ones that perform well.

If data privacy isn't a requirement for you (or perhaps you have a higher-priority conflicting requirement, such as being able to access your data via a web browser), then something like Backblaze Personal Backup (not B2) is hard to beat.


----------



## Stevie (Dec 5, 2018)

Hmm, CrashPlan can encrypt data in several different ways:
https://support.code42.com/CrashPlan/4/Configuring/Security_Encryption_and_password_options

But, it's true, they don't provide S3 access.


----------



## Synetos (Dec 6, 2018)

tack said:


> For services offering cheap unlimited storage, it's _very likely_ the case that your data is accessible to the storage provider. This is how they can offer unlimited storage: by relying on the fact that a good chunk of the data can be deduplicated. (If everyone here uploads their copy of Monster Library X, the provider only needs to store one actual copy of it.) This is invariably coupled with the requirement to use their proprietary software rather than a third-party backup program (because third-party software would properly encrypt the data, defeating deduplication).
> 
> I'd be quite interested to learn about exceptions to this: are there any services offering unlimited storage for around, say, $10 USD a month that present, for example, an S3 API interface for use with third-party backup software, and have been offering unlimited storage for more than 3 years? (The more-than-3-years criterion is important, because many unlimited storage offerings have come and gone over the years; they tend to vanish when businesses realize the product isn't sustainable.) These are the really interesting options IMO, especially ones that perform well.
> 
> If data privacy isn't a requirement for you (or perhaps you have a higher-priority conflicting requirement, such as being able to access your data via a web browser), then something like Backblaze Personal Backup (not B2) is hard to beat.



An option is to use your own encryption software: encrypt the data you want to store in the cloud _before_ you push it up to your cloud service provider. Then no one will be looking at it. It adds steps, but for things I definitely want private, I encrypt them myself using Boxcryptor.

I chose Sync.com as my cloud provider because they allow me to encrypt and hold the keys, so they are not scanning my data. But for things like taxes or other financial documents I want secure, I just encrypt them myself anyway.


----------



## tack (Dec 6, 2018)

Synetos said:


> I chose Sync.com for my cloud provider because they allow me to encrypt and hold the keys, so they are not scanning my data.


Well, you're using their software, so you _think_ they're not holding the keys and can't scan your data. And you have to trust that they're implementing their crypto properly, because it's proprietary and closed source and can't be audited by unbiased third parties without an expensive (and illegal in some jurisdictions) effort of reverse engineering.

If you're going through the trouble of creating encrypted local copies of all your data (and continuously re-encrypting when there are changes), then, indeed, you can just point your cloud backup software at that.

Alternatively, if data privacy isn't important to one's backup strategy (which is a personal choice), that opens up plenty of options to choose from.


----------



## whiskers (Dec 6, 2018)

tack said:


> Well, you're using their software, so you _think_ they're not holding the keys and can't scan your data. And you have to trust that they're implementing their crypto properly, because it's proprietary and closed source and can't be audited by unbiased third parties without an expensive (and illegal in some jurisdictions) effort of reverse engineering.
> 
> If you're going through the trouble of creating encrypted local copies of all your data (and continuously re-encrypting when there are changes), then, indeed, you can just point your cloud backup software at that.
> 
> Alternatively, if data privacy isn't important to one's backup strategy (which is a personal choice), that opens up plenty of options to choose from.


You, I like you.


----------



## Anders Wall (Dec 7, 2018)

Throwing this in the mix.
https://www.pcloud.com/cloud-storage-pricing-plans.html
For €350 you get a 2 TB cloud drive, and you pay only once.
They also provide some solid encryption tools.
(I'm no IT guru/tech, so don't take my word for it, but it sure looks solid.)

Cheers,
Anders


----------



## Synetos (Dec 7, 2018)

Anders Wall said:


> Throwing this in the mix.
> https://www.pcloud.com/cloud-storage-pricing-plans.html
> For €350 you get a 2 TB cloud drive, and you pay only once.
> They also provide some solid encryption tools.
> ...



A one-time payment for 2 TB that costs about the same as an external drive sounds interesting to me. Thanks for throwing it out there. How secure your data is will always be a question with cloud services, as Tack pointed out earlier. I still like the idea of secondary encryption for really sensitive data.


----------



## GtrString (Dec 7, 2018)

This backup storage thing could end up being as expensive as hardware outboard gear.. wait.. it IS hardware outboard gear!


----------

