… a successful off-site backup.
Now, to flesh out that commentary – most of our services are colocated at various Undisclosed Locations around the US (usually one door down from Dick Cheney). This means it’s sort of difficult to trot over to the machine, pop in a backup tape, make a copy, and toss the tape in the storage vault. Sure, colo facilities provide this capability, but it tends to be painful to work with, not to mention costing extra money.
With broadband now as widely available as sand, it’s possible to revisit a concept first floated 10+ years ago: “Why not back things up over the net?” When I was first approached with the idea (oh, 1995-ish), network connectivity was just edging out of dialup, and if you were SUPER lucky, you could have a T1 line, but it would most likely run you $800 a month. That sort of connectivity at home? Not likely.
Nowadays folks have DSL and Cable modems that have huge amounts of bandwidth. We’re no different here at Chez Geek, happily slurping at the nozzle from Comcast.
Last night I finished noodling a 4-5 line backup script that synchronizes various Important Directories [tm, reg us pat off] on our server (things like, oh, say, /home) with a spare 160gig USB drive I picked up at Microcenter for $80.
What does this have to do with Linux, I hear you cry? All of this was done with free software, and done fast and efficiently. Using the rsync utility, it takes one command to synchronize a set of dirs on one machine with another. It even compresses and optimizes the transfers, only copying those files that need to be updated.
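For the curious, the heart of such a script is a single rsync invocation. A minimal sketch (the paths here are stand-ins for demonstration, not my actual setup; in real use the source would be something like user@server:/home/ and the destination the USB drive's mount point):

```shell
#!/bin/sh
# Demo of an rsync one-liner mirroring one directory into another.
# In practice the source would be remote, e.g. user@server:/home/.
SRC=/tmp/backup_demo/src
DST=/tmp/backup_demo/dst
mkdir -p "$SRC" "$DST"
echo "important data" > "$SRC/file.txt"

# -a  archive mode: recurse and preserve permissions, times, links
# -z  compress data in transit (the win on a broadband link)
# --delete  drop files from the mirror that vanished from the source
rsync -az --delete "$SRC/" "$DST/"

cat "$DST/file.txt"
```

Run it again after changing a file and rsync only moves the changed bits; that incremental behavior is what makes nightly net backups of a multi-gig /home practical.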
So we have free software, cheap bandwidth, cheap disk space, and cheap hardware. It’s Geek Nirvana.
For the detail-oriented folks in the crowd, here’s the end result:
sent 3756414 bytes received 7763330071 bytes 530665.56 bytes/sec
total size is 13301911159 speedup is 1.71
That’s some nice throughput.
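If you want to sanity-check rsync's "speedup" figure, it's just the total data size divided by the bytes that actually crossed the wire (sent + received). Plugging in the numbers from the output above:

```shell
#!/bin/sh
# Verify rsync's reported speedup from the stats above.
sent=3756414
received=7763330071
total=13301911159

# speedup = total size / bytes actually transferred
awk -v t="$total" -v s="$sent" -v r="$received" \
    'BEGIN { printf "%.2f\n", t / (s + r) }'
# prints 1.71
```

In other words, rsync moved about 7.8 gig to mirror 13.3 gig of data; on later runs, when most files are unchanged, that ratio gets far better.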