Tue, 13 Sep 2011
Moving to New York City
My harebrained scheme actually went through: by the end of October I'll be living in New York City, starting a job at the Columbia University library at the beginning of November, doing various computery infrastructure things in support of many of the projects the library does.
I came up with this scheme, moving to NYC, after visiting it twice in July. Very simply, I fell in love with the city, and got bitten by its bug pretty hard. I knew, as I flew out of the city after my first visit, that I'd be living there within a year. Amazingly, it's only taken a few months.
It's not a decision I've taken lightly — I love Ypsilanti and southeast Michigan, the people here and the interesting and exciting things going on. And I love the immediate group of people I work with, and love working for the University of Michigan; it's just that the layers between those two things have become somewhat frustrating for me over the last two years. It was time to move on, but while for the past several months I've been half-assedly looking for a new job, it wasn't until I got back from NYC that I took that search seriously, that I actually focused.
I've become convinced over the last eight years or so that inertia is the primary guiding force in my life. And, for the most part, I'm fine with that. But I also realize that every so often some gumption breaks through, and when it does, it breaks through hard. And when that gumption comes, I have to grab it and go with it, even if that gumption comes in the form of a harebrained scheme. Sometimes, you just gotta see where those go.
I'm going to miss many things, and more importantly, many people, very dearly. But it's time to explore a new city, and I'm looking forward to that. Also: bagels.
Posted at: 17:04 | category: /life | Link
Sun, 11 Sep 2011
Hacking AFS Dumps for Fun and Profit
Well, for fun at least.
The traditional way of doing AFS volume dumps tends to follow a classical "full, incremental, incremental" pattern, with occasional new full dumps so that the number of dumps one has to restore for any given point in time stays manageable (at work, we do something that is roughly "Monthly-Weekly-Daily"). This also makes expiration easy: if you only want to keep two months' worth of dumps, it is simple to determine which dump files you no longer need.
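As a rough illustration of that bookkeeping (a toy sketch with made-up dates, nothing AFS-specific about it), here's the kind of logic that decides which dumps you need to restore a given day, and which files are safe to expire once you only keep the last couple of fulls:

    from datetime import date

    # Hypothetical dump records: (date, level), where level 0 is a full dump
    # and level 1 is an incremental relative to the previous dump.
    dumps = [
        (date(2011, 7, 1), 0), (date(2011, 7, 8), 1), (date(2011, 7, 15), 1),
        (date(2011, 8, 1), 0), (date(2011, 8, 8), 1), (date(2011, 8, 15), 1),
        (date(2011, 9, 1), 0), (date(2011, 9, 8), 1),
    ]

    def restore_chain(dumps, target):
        """Dumps needed to restore the volume as of `target`: the most recent
        full at or before target, plus every incremental between it and target."""
        base = max(d for d, level in dumps if level == 0 and d <= target)
        incrementals = [d for d, level in dumps if level == 1 and base < d <= target]
        return [base] + incrementals

    def expirable(dumps, keep_fulls=2):
        """Anything older than the oldest full we still keep can be thrown away."""
        fulls = sorted(d for d, level in dumps if level == 0)
        cutoff = fulls[-keep_fulls]
        return [d for d, level in dumps if d < cutoff]

    print(restore_chain(dumps, date(2011, 9, 10)))  # Sept 1 full + Sept 8 incremental
    print(expirable(dumps))                         # all three July dumps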
At home, however, doing full dumps is painful for large volumes: my DSL connection has a rather paltry upload speed, and since I keep copies of the volume dumps both at home and at my colo location, no matter where I do the dump at least one of the transfers will be slow. What I would like to do, then, is a process where I do a painful full dump once, and then every day simply do a dump of what has changed since the previous day. Done naively, though, that gets painful quickly: after about three days the chain of dumps you have to restore is longer than anyone wants to deal with. In addition, you can never throw away any dump, since every one of them is (potentially) necessary to do a restore.
My desire, then, is to have something that pulls apart dump files and keeps enough data around for each backup point that I can synthesize what appears to be a full dump file for that point. AFS volume dumps handily provide what's needed: for each vnode they will tell you either "here's a vnode that's changed" or "this vnode is present but hasn't changed since your reference time." If you combine that with some logic that keeps track of what the vnodes looked like in the last backup, you all of a sudden have enough information to do the synthesis.
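To sketch what I mean by that tracking (this is illustrative Python only, not pyafsdump's actual code or data structures), the idea is roughly:

    # Illustrative sketch of the synthesis idea; the names and structures
    # here are made up and don't come from pyafsdump itself.

    UNCHANGED = object()  # marker: vnode present but not changed since reference

    # backup time -> {vnode id -> vnode data, or UNCHANGED}
    points = {}

    def record_dump(when, entries):
        """entries: (vnode_id, data) pairs pulled out of a volume dump, with
        data set to None when the dump only says 'present but unchanged'."""
        points[when] = {
            vid: (UNCHANGED if data is None else data)
            for vid, data in entries
        }

    def synthesize(target):
        """Build what looks like a full dump for `target`: every vnode that
        dump says exists, each with the most recent real copy of its data."""
        full = {}
        for vid, data in points[target].items():
            if data is not UNCHANGED:
                full[vid] = data
                continue
            # Unchanged at this point: walk back through earlier backup points
            # until we find the last time this vnode's data was actually stored.
            for when in sorted(points, reverse=True):
                if when >= target:
                    continue
                earlier = points[when].get(vid)
                if earlier is not None and earlier is not UNCHANGED:
                    full[vid] = earlier
                    break
        return full

    record_dump(1, [(100, b"root dir v1"), (101, b"file v1")])
    record_dump(2, [(100, None), (101, b"file v2")])
    record_dump(3, [(100, None), (101, None)])
    print(synthesize(3))  # {100: b'root dir v1', 101: b'file v2'}

The real thing has to deal with actual vnode metadata and file contents rather than toy byte strings, but that's the shape of the logic.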
Thus the impetus to create pyafsdump, a Python module that understands and can do various things with AFS volume dumps. As a proof of concept I put together a pair of hackish scripts, one of which pulls apart volume dumps and generates some metadata, and another which reads that metadata and synthesizes a full dump. A very rough test seems to indicate that it works: I was able to pull apart a full dump and three subsequent incremental dumps, and from those generate a full dump that contained what the volume looked like at the time the third incremental was made, and which was restorable with vos restore.
A public git repository can be found at http://kula.tproa.net/code/pyafsdump.git
Posted at: 22:57 | category: /computers/afs | Link
Wed, 07 Sep 2011
Beet Salad
Inspired by the same dish at Syrian Bakery:
- 6 beets
- 1 small onion
- 5 cloves garlic, whole
- 1 lemon
- 8 oz feta cheese
- 1 cup parsley, chopped
- Pepper
- Olive oil
Put a steamer basket in a big pot with some water, start that up. Trim the leaves from the beets, cut off any little dangly bits, lightly score each one with an X on the bottom, put in the steamer. Cut the onion into quarter rings, peel the garlic cloves but leave them whole, drop both of those in with the beets. Steam.
When the beets are done (no resistance when poked with a sharp knife), pull them out of the steamer and put them aside to cool. Pluck the garlic cloves out, mash them in a bowl, and juice the lemon over them. Peel the beets (trim off the ends and just rub each one in a towel you don't care about; the skin will slide right off) and slice: I usually cut each beet in half and then make half-moon slices; larger beets get quartered. Let these cool completely.
Put the beets in a bowl, add the onions and parsley. Chunk up the feta, add that. Pour over the lemon juice/garlic mix, drizzle on some olive oil, add pepper to taste. Combine well. Let sit in the fridge for a while.
Posted at: 23:31 | category: /food/2011 | Link