backup

Deadbolt impacts some ASUSTOR NASes — check your backup plan!

[Image: ASUSTOR ARM NAS with four hard drives and cover removed]

A few months ago, I wrote up a post covering my backup plan. In it, I talk about the 3-2-1 backup strategy:

  • 3 copies of all your important data
  • 2 different media
  • 1 offsite

In that post, I mentioned I back everything up with two local copies (two separate NAS units), and a third offsite copy on Amazon Glacier Deep Archive.
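
For the offsite leg, the sync can be as simple as an AWS CLI call that writes straight into the Deep Archive storage class. A minimal sketch, assuming a hypothetical bucket name and NAS mount point:

    # Sync a NAS share to S3, storing new objects in Glacier Deep Archive.
    # The bucket name and local path are placeholders, not my actual setup.
    aws s3 sync /mnt/nas/important-data s3://my-offsite-backup-bucket/important-data \
        --storage-class DEEP_ARCHIVE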

My Backup Plan

I've had a number of people ask about my backup strategy: how I ensure the 6 TB of video project files and a few TB of other files stay intact over time.

[Image: 3-2-1 backup plan]

Over the past year, as I got more serious about my growing YouTube channel, I decided to document and automate as much of my backup process as possible, following a 3-2-1 backup plan:

  • 3 copies of all my data
  • 2 copies on different storage media
  • 1 offsite copy

The culmination of that work is this GitHub repository: my-backup-plan.

The first thing I needed to do was take a data inventory—all the files important enough for me to worry about fell into six main categories:

[Image: 6 backup categories]

AWS S3 Glacier Deep Archive - Difficulty deleting files with accents

A few days ago, my personal AWS account's billing alert fired and delivered an email saying I'd already exceeded my personal comfort threshold, in only the second week of the month!

[Image: AWS billing alert email]

I had just rearranged my entire backup plan to change the structure of my archives, both locally and in my S3 Glacier Deep Archive mirror on AWS, so I suspected something hadn't been moved or deleted correctly in my backup S3 bucket.

And I was right.
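
If you're chasing a similar problem, here's a rough sketch of how to find and remove leftover objects with the AWS CLI; the bucket, prefix, and key below are made-up examples:

    # List keys under the old prefix to spot anything left behind.
    aws s3api list-objects-v2 \
        --bucket my-backup-bucket \
        --prefix "old-archives/" \
        --query "Contents[].Key" \
        --output text

    # Delete by exact key; the lower-level s3api call sidesteps the
    # encoding guesswork that can trip up 'aws s3 rm' on accented names.
    aws s3api delete-object \
        --bucket my-backup-bucket \
        --key "old-archives/Résumé-scans.zip"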

But I wanted to write this up for two reasons:

Retrieving individual files from S3 Glacier Deep Archive using AWS CLI

I still haven't blogged about my overall backup strategy (though I've mentioned it a few times on my YouTube channel). In short: I keep two local copies of any important data, and most of the non-video data is also stored in my Dropbox folder, so I get two local copies and one cloud backup for 'free'.

Then I also back up everything (including video content) from my NAS to an Amazon S3 Glacier Deep Archive-backed bucket at least once a week (sometimes more frequently, when I am working on a big project and manually kick off a mid-week backup).
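
Getting a single file back out of Deep Archive is a three-step dance: request a restore, wait, then copy. A minimal sketch, with a placeholder bucket and key:

    # Request a restore from Deep Archive (Bulk tier, up to ~48 hours).
    aws s3api restore-object \
        --bucket my-backup-bucket \
        --key "video-projects/example-project.mp4" \
        --restore-request '{"Days":7,"GlacierJobParameters":{"Tier":"Bulk"}}'

    # Poll until the Restore header reports ongoing-request="false".
    aws s3api head-object \
        --bucket my-backup-bucket \
        --key "video-projects/example-project.mp4"

    # Once restored, download the temporary copy like any other object.
    aws s3 cp s3://my-backup-bucket/video-projects/example-project.mp4 .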

Push your Git repositories to a central server, in a bare repository

GitHub is a great central repository silo for open source projects, and for private Git repositories for companies and organizations with enough cash to afford the features GitHub offers.

However, for many projects and developers, GitHub can be overkill. For my needs, I have many smaller private projects that I'd like to have hosted centrally and backed up, but that don't warrant Bitbucket or GitHub accounts. Therefore, I've taken to creating bare repositories on one of my Linode servers, and pushing all my local branches and tags to those repos. That server is backed up nightly, so if I lose my local environment as well as my Time Machine backup (a very unlikely occurrence, but still possible), I will have backed-up, fully intact Git repos for all my projects.

I recommend you do something like the following (presuming you already have a Git repo on your local computer that you've been working with):
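
A minimal sketch of that workflow; the server hostname and repository path are placeholders for whatever your own server uses:

    # On the server: create a bare repository to receive pushes.
    ssh user@my-linode.example.com 'git init --bare ~/repos/myproject.git'

    # On your local machine, inside the existing project:
    git remote add origin ssh://user@my-linode.example.com/~/repos/myproject.git

    # Push every local branch and tag to the new central repo.
    git push origin --all
    git push origin --tags

From then on, a plain git push after each work session keeps the server's copy current, and the nightly server backup takes care of the rest.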

Photography Weekend Part 3 - Backup Strategies and Disaster Preparedness

See the previous posts in this series for background.

An Ounce of Prevention...

When you work on a project where every piece of work (in this case, every photograph) needs to be cataloged, backed up, and sent to production as it's created, you have to plan things out pretty well in advance, but also be ready to fix problems and adapt to difficulties as they arise.

During my weekend of photography at Steubenville St. Louis, I was quite prepared for most difficulties that could crop up in photography:

Backup Strategy for Mac OS X Using Disk Utility, Carbon Copy Cloner, etc.

A blast from the past! The following article is from one of my first websites, ca. 1999, and was updated a couple of times throughout its history. I am re-posting it here because my old website will be deprecated quite soon.

A few notes before we begin: since the writing of this article, Time Machine came into being (along with Mac OS X 10.5) and has brought about a revolution in the way I maintain backups. My scheme now is to have a local daily Time Machine backup to my external hard drive (I recommend a simple 1-2 TB external USB hard drive), then do a once-a-month DVD backup (stored offsite) of my most important files. For most home/small business users, this should be adequate.

Another revolution in data backup is the idea of backing up 'to the cloud': with the prevalence of broadband Internet access and the plethora of options for online storage, many companies now offer online backup solutions that were only dreamt of back in the late nineties. Some solutions I recommend: MobileMe (what I use, but not for everyone), Mozy, BackJack, and JungleDisk. (No, those aren't referral links. Would I try pulling that on you?)

Backup Strategies for OS X

A question often asked on the Apple Discussion boards and by my fellow Mac users is: "How/when should I back up my Mac, and what is the best/quickest and most reliable way to do it?" This is a complicated question, as there are many different ways one can go about backing up OS X.

There are three basic ways that I would like to cover in this article:

  1. Using Disk Utility to quickly and easily make a complete, bootable backup to an external drive;
  2. Using Carbon Copy Cloner to either (a) do the same thing as Disk Utility, or (b) to clone a certain folder or group of folders (another program that does a great job is SuperDuper!);
  3. Dragging and dropping files and folders for a quick backup of important files (a scripted equivalent is sketched below).
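
Even the third, drag-and-drop approach can be scripted. A one-line sketch using ditto, which preserves Mac-specific metadata like resource forks (the paths are examples only):

    # Copy a folder to a mounted backup volume, preserving OS X metadata.
    ditto -V ~/Documents /Volumes/Backup/Documents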