Backing Up with FTP
I finally set aside some time to upgrade to Windows 7. Most everyone else in the office has already upgraded. I was holding out so I could finish up a few client projects and get the Ann Arbor GiveCamp website up and running. Now it’s my turn!
A Clean Install
A few people told me NOT to do an upgrade install of Windows 7. Instead, do a clean install by formatting your hard drive. Obviously, you’ll want to make sure everything is backed up before doing this. At home, I have a FreeNAS server that does daily backups via rsync (thanks to DeltaCopy). However, I wasn’t backing up everything, so I had a little extra work to do. In fact, I found quite a bit of important stuff that wasn’t part of my daily backup.
I was originally using Windows Explorer to copy and paste directories. However, when you’ve got a lot of copying to do, it’s not very efficient: small hiccups along the way usually halt the entire operation and force you to start over. I decided I’d just use FileZilla and FTP everything over to my NAS server (since it can act as an FTP server as well). I can place a whole ton of files in a queue, control the queue processing, and easily retry failed transfers. This kind of control reduced my backup time immensely.
File Transfer Mode
The one problem with using FTP to back up files is that FTP has two different transfer modes: binary and ASCII. When moving files between a Windows and a Unix-based system (as I was, since FreeNAS is based on FreeBSD), ASCII/text files are converted so the CR/LF line endings from Windows become just an LF on Unix. Most FTP clients (including FileZilla) have an “auto-detect” mode and pick the appropriate transfer mode based on the file type. They always transfer things like .txt and .xml files as ASCII and files like .png, .bmp, or .mp3 as binary, and everything just works.
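To see why the mode matters, here’s a small sketch (my own illustration, not anything from FileZilla) of the Windows-to-Unix line-ending translation an ASCII-mode transfer performs, and what it does to binary data that only coincidentally contains a CR/LF byte pair:

```python
def ascii_mode_translate(data: bytes) -> bytes:
    """Simulate an ASCII-mode upload from Windows to Unix: CRLF -> LF."""
    return data.replace(b"\r\n", b"\n")

text = b"first line\r\nsecond line\r\n"
# The PNG file signature happens to contain the bytes 0x0D 0x0A (CR LF).
png_header = bytes([0x89, 0x50, 0x4E, 0x47, 0x0D, 0x0A, 0x1A, 0x0A])

print(ascii_mode_translate(text))                      # b'first line\nsecond line\n'
print(ascii_mode_translate(png_header) == png_header)  # False: a byte was silently dropped
```

For a text file, that translation is exactly what you want; for a binary file, it quietly destroys data.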
But there are exceptions! After moving some of my files back to my Win7 machine (using FileZilla in the opposite direction), I noticed that my Mercurial repositories were showing up as having changes (I committed everything before re-paving my machine). And one repository was reporting a corrupt index. Ugh… As I started poking around inside the Mercurial data storage files, I started to think that perhaps some of these were transferred as ASCII when they actually should have been binary!
Once I realized what had happened, I felt pretty stupid. Since I was using FTP solely as a backup tool (and not really to “transfer” a file for use on the other machine), I should have forced FileZilla to use binary transfer so everything would have been an exact copy, byte for byte. I did a quick test of a Mercurial repository and transferred it in both “auto” mode as well as “binary” mode. Of course, binary mode went back and forth without a hitch.
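The byte-for-byte claim is easy to check after the fact. This sketch (the byte string is a stand-in for a real Mercurial store file, and the translation mimics what an ASCII-mode transfer would do) compares checksums of the original against a binary-mode copy and an ASCII-mode copy:

```python
import hashlib

def sha256_hex(data: bytes) -> str:
    return hashlib.sha256(data).hexdigest()

# Stand-in for the bytes of a repository store file; note the CRLF pair.
original = b"\x00revlog\r\ndata\x00"

binary_copy = original                          # binary mode: exact copy
ascii_copy = original.replace(b"\r\n", b"\n")   # ASCII mode: CRLF rewritten

print(sha256_hex(binary_copy) == sha256_hex(original))  # True
print(sha256_hex(ascii_copy) == sha256_hex(original))   # False
```

Hashing a few transferred files against the originals is a quick sanity check that a backup really is an exact copy.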
Fixing My Repositories
This isn’t FileZilla’s fault. If I had figured out exactly what caused the file corruption (and in which file), I probably could have just configured FileZilla to transfer those specific file types as binary. Luckily, the repositories that showed corruption were small, private repositories that only I used. And the Mercurial wiki had a page that showed me how to fix the problem. I ended up using the “Recovery using Convert extension” option. It fixed my repositories and I didn’t lose any changesets. Nice!