Archive for the 'Beginnings' Category



Unlikely realities

Something that occasionally can seem like lazy historiography to me is when scholars call something “unlikely” to explain why they think it probably didn’t happen. It’s a way to argue with something that may show up in a primary source without necessarily having to reason your way through the disagreement; oh, well, such-and-such gives X account of this event but that’s “unlikely”, so we’ll assume it didn’t happen.

Here’s the thing. At the risk of getting all Dr. Manhattan on both of my regular readers, “unlikely” things happen all the time. I am an extraordinarily unlikely occurrence, given who my parents are, their personalities, their respective stations in life when they met, etc. It’s highly unlikely that my randomly going to a party one night while nearing emotional rock bottom should result, six years later, in me getting married to a person I met there (principally as somebody another friend of mine had a crush on). It’s highly unlikely that a college choir director deciding she was going to plan a European tour should start a chain of events that would result in a conversion to Orthodox Christianity nine years later. And yet, these things defiantly happen nonetheless with callous disregard for whether or not a historian will later believe that they did.

A couple of other fairly unlikely things have happened to me in the last two or three weeks: for example, Megan and I, along with our godchildren Matt and Erin and our dear friend Anna, attended Lyric Opera of Chicago’s recent production of Tales of Hoffmann. Not necessarily unlikely in and of itself (but since the last time we went to the Lyric was 9 years ago for Bryn Terfel’s Sweeney Todd, certainly not a regular occurrence), but consider the following: Hoffmann was supposed to be produced at my first undergraduate institution, Western Washington University, my freshman year. WWU had put on a production of La Boheme a couple of years earlier that had received national attention, was developing something of a reputation for being a good undergrad program for people who wanted to do opera, and Hoffmann was going to be the big follow-up that would prove that Boheme wasn’t a fluke. Well — as the story was told to me in dribs and drabs from a few different people — political, economic, and practical concerns meant that this didn’t happen. Hoffmann was nonetheless on my radar for the first time, and in short order the 1989 recording with Placido Domingo was the very first opera recording I ever owned. That disc featured people I’d never heard of before like Edita Gruberova and James Morris, and I played it over and over again.

Somebody who was in my freshman class was a soprano and cellist named Erin Wall. She was in 8am Music Theory with me the very first day of classes, we were in the same voice studio, and she was one of a group of Canadian students who were in Western’s music department for voice. She had a nice, full voice at a time when there were a lot of soubrettes hanging around; the last time I heard her during my time there was when she was one of the Flower Girls in The Marriage of Figaro in 1996, but after I dropped out I believe she got to do the title role in Susannah. Over the years I found out she was having quite the meteoric rise; she was a finalist for Canada in the Cardiff Singer of the World, she was part of Lyric Opera of Chicago’s young artist program, and then she started to get really busy.

A few months after leaving Western in ’97 I went to work for a Major Software Company and, shall we say, did a reasonable impersonation of a tester for a few years while trying to get to the next step as a singer. One of the things I tested had to do with web browsing, and one day I happened upon a student website for a soprano at Rice University named Anna Christy. There wasn’t anything particularly distinctive about the website, but I always remembered that I had hit it, particularly when I started seeing her name in Opera News a few years later as somebody who would be singing at Wolf Trap and so on.

Fall of 2003, after just starting at IU, I was flown back to Seattle to sing as the tenor soloist for a concert of Bach cantatas with the Seattle Symphony, John Harbison conducting. It was the biggest professional thing I ever got to do, and except for the check, it was a real waste for me and for the Seattle Symphony people. I was cast in a role at IU that I was removed from over this; the Seattle contract had been signed months before, and I was to be gone the second-to-last week before opening. I didn’t even know I was up for anything in this particular show, and I explained my situation as soon as I found out I was cast. “Take it up with the stage director when staging rehearsals start,” I was told. Well, as soon as the stage manager said at the first staging rehearsal, “We’re not excusing anybody for any reason from any rehearsals,” I knew I had a problem, and sure enough, I was kicked out. (This was, of course, considered to be my fault from the standpoint of the opera administration, but never mind that now.) Not only that, but as soon as I got off the plane in Seattle, I came down with probably the worst sore throat I’ve ever had in my life, and my ability to phonate, still reasonable at the first rehearsal, was in tatters by the concerts. It was the first (well, only) time I’d ever been on a gig like this, I had no idea whom to talk to or what to do, and while I managed to sort of scrape by in the concerts — well, funny thing, the Seattle Symphony folks never called me again. (My voice teacher in Seattle, who had sent Seattle Symphony my way in the first place, said that from what he had heard it wasn’t exactly a “He’ll never sing in this town again” kind of thing, but that I was remembered as somebody who had problems, and he’d have to specifically arrange an audition for me down the road when the time came. Needless to say, the time never came, and thank God.)

Anyway, the bass in the solo quartet was one Christian Van Horn, who had just won the Met auditions. I doubt he would have any memory of who I am, and if he did remember me I doubt he’d remember me well, given the circumstances, but he was a tough guy to forget — physically and vocally imposing, to say the least.

My second year at IU, a mezzo-soprano named Jamie Barton started her Master’s. She distinguished herself quickly in operas like La Cenerentola, but she was also a frequent guest at Chez Barrett, back in the day when I used to host large gatherings of IU voice people over nachos on a weekly basis. (Hey, that’s how I made friends when I first moved here — I fed people.) She won the Met auditions a few years ago, and since then, she’s been one popular mezzo.

So Chicago’s Hoffmann featured James Morris (from that first recording) as the four villains, Erin Wall as Antonia, Anna Christy as Olympia, Christian Van Horn as Crespel, and Jamie Barton as Antonia’s Mother. (As well as Matthew Polenzani as Hoffmann, whom I had last heard ten years ago in Seattle as Almaviva in Barber of Seville.) And with me in the audience — what an unlikely confluence of people and circumstances! If I took a time machine back to that first day of freshman year in September of 1994 and told the 19-year-old Erin what would be happening in seventeen years, she’d laugh in my face, I’m sure. (The set looking like it was reproduced from a Chris Van Allsburg book was also pretty unlikely. Fascinating looking at times, but unlikely.)

The second unlikely thing to occur was a week ago today. I’ve written here and there about my lifelong fascination with Batman; well, as I had known for some time, Michael Uslan, the Executive Producer of the Batman films starting with the 1989 Tim Burton effort — and really the guy without whom a modern Batman on screen doesn’t happen — was an IU alumnus. He’s spoken on campus a few times since I’ve been here, but I’d never been able to go, so when I heard that there would be a screening of The Dark Knight in the new IU Cinema facility with Michael Uslan introducing the film, I made it a point to clear my calendar for the day and to order a copy of his memoir, The Boy Who Loved Batman, in time for the screening. As it happened, he gave a lecture in the afternoon in addition to the screening, and I was able to go to both. There is a brief account of the day here (hmmm — “RRB”, familiar initials, aren’t they?) so I’ll just say that the guy is one hell of an inspirational speaker, to say nothing of one hell of a self-promoter; he’s basically a comic book geek who has figured out how to make being so respectable, lucrative, and attractive. He was incredibly generous with his time at both the lecture and the screening; he kept answering questions until he was hooked off the stage, and during the book signing he talked to everybody.

So, chain of events — I find a book called Collecting Comic Books by Marcia Leiter at the Redmond Library in 1985, and my life is forever changed. Four years later on 23 June 1989, Batman introduces me to a way of thinking about movies that cares who’s in them, who directs them, who writes them, who designs the sets, who writes the music, and so on. I had been a Star Wars kid and then some, but I couldn’t have told you who George Lucas was. After the summer of 1989, though, damn skippy I cared who Tim Burton was and what else he had done and was going to do, who Danny Elfman was and what kind of music he did (followed by an obsession with Oingo Boingo for a while), who Sam Hamm was and why it seemed he never wrote another movie anybody cared about, who Jon Peters was and why a former hairstylist was suddenly one of the most powerful producers in Hollywood, etc. At the very least, without Danny Elfman’s score, my interest in classical music probably doesn’t happen. (And then there’s something about a girl in high school that gets me starting to take voice lessons, but that’s somewhat beside the point at present.) Anyway, I then go to Indiana University in 2003 for music, which just happens to be Uslan’s beloved alma mater, leading to last week’s events. Again — how incredibly unlikely!

No historian will ever care about any of these things, I’m certain. If one were ever to try to reconstruct these chains of events and concurrences of people and places and things, surely it would strain credibility. This doesn’t mean we have to interpret all of these things teleologically, necessarily, but it does mean that just dismissing them does not really reflect how life works and how things play out.

Sam Zuckerflynn

My blog has been a touch more unloved in the past few months than I’ve really intended it to be. It’s like a paper diary; you get into a rhythm, then something disrupts that rhythm, and you know it’s going to take longer than usual to say what you want to say about it, so you put it off. Then more things happen while you’re putting it off, which means it’s going to take even longer, so you really have to put it off for a bit longer. Then, eventually, a new rhythm emerges as you fill the time you once spent journaling with other things, and next thing you know you look up and it’s been three months since you last wrote anything and even longer than that since you did any more than write “Wow, what a day yesterday was, but I don’t have time to write about it right now.”

And, let’s be honest, on a normal day my blog posts are long to begin with. Catching up on several months’ worth of long blog posts is, shall we say, daunting.

In a nutshell, the Orthodox Music Symposium started taking up a lot of my free time about the time I actually realized I could write grants for the thing. By the end of August, school had started up again, and this year I’m a course assistant, so that was also taking up my time. By the middle of September, Megan had left for Germany, which meant that schoolwork, grading, the Symposium, church, trying to keep to a workout schedule, and taking care of the house were taking up every waking moment I had.

The Symposium was a blast. The morning before, I got an e-mail from the Order of St. Ignatius saying, sorry this is so late, but a check is in the mail, which meant that we were fully funded before the event happened (if only by a matter of hours) — something I couldn’t say for John Boyer’s first visit a year ago. Anyway, people came, we had fun, everybody gave interesting talks, and I’m starting to take concrete steps towards the next one. The audio will eventually be available on Ancient Faith Radio, and I’ll provide a link here, of course.

In November the History department decided that I had completed a Master’s degree, which was great for all kinds of reasons, not the least being that I’m getting the ratio down — 11 years for a four year degree, and now four years for a two year degree. I may have my Ph.D. (to say nothing of a job) by the time I’m 40.

In December, I spent a week in Alaska with my mother and stepfather after Finals, and then spent a day in Indianapolis (thank you, East Coast weather disaster) and 10 days in Germany with Die Frau. I got back on 6 January, and then it was off to the races again. This semester, I’m sitting in on first year Syriac again to try to reclaim as much of it as I can (while also working through Thackston’s Introduction to Koranic and Classical Arabic so that I can have another Semitic language with which to coordinate Syriac, as well as potentially another liturgical language I can fake), taking a seminar on early Christian mysticism, another seminar on Herodotus and Thucydides, and grading for the Greek history course that covers the Persian War up to Alexander the Great. Plus there are these things called “Lent,” “Holy Week,” and “Easter.” In addition to all of that, St. Nicholas Orthodox Church in Urbana-Champaign is having me come out this weekend to do a daylong workshop, next weekend I’m meeting my father and stepmother in Memphis, and then the first weekend in February St. Raphael Orthodox Church in Iowa City is having me sing with them in a fundraiser concert. The choir director there, Lori Branch, is my predecessor’s predecessor here at All Saints, and is also godmother to Matthew Arndt of the music theory faculty at University of Iowa and somebody who’s been my friend since the seventh grade, so I could hardly say no.

Okay, we’ll consider that “caught up.”

Yesterday, I was in a Best Buy (now, why I would use the indefinite article when there’s only one Best Buy in Bloomington is inexplicable to me, but “the Best Buy” seems not quite right, and “Best Buy” by itself appears too abstract, like I was in the Platonic ideal of Best Buys) and I played with an iPad for a few minutes. I have to say, I felt a bit like Ed Dillinger in Tron. I even typed “Request: Access to Master Control Program. User Code: 00-Dillinger. Password: Master.” Alas, it didn’t reply, “Hello, Mr. Dillinger. Thanks for coming back early.”

No, I didn’t buy it. Yet.

Two movies came out in 1982 and 1983 that captivated me: Tron and WarGames. The immediate result of the captivation was an Atari 800XL as a Christmas present in 1983, a fascination with computer graphics, and an appreciation of Jeff Bridges, Bruce Boxleitner, David Warner, Peter Jurasik, and John Wood that continues to this day.

In 1998, having dropped out of school and searching for a job that would allow me to move back to the Seattle area, thanks to some coaching from an old friend to say nothing of sufficient desperation to say “Yes, of course I can do that” to everything they asked me in the interview, I was fortunate enough to get a contract position as a tester for a major software company. For a number of reasons, I won’t name the company, even though it’s reasonably obvious and it’s not exactly a closely-guarded secret. Suffice it to say that, particularly once the job became full-time a year later, what I had would have been somebody else’s dream job. As a 22 year old kid with no college degree and certainly no higher math or computer science courses, even given the state of the tech industry in 1998, I shouldn’t have been able to get that job, but I did, and I was there until July of 2003, when I quit to go back to school. The company isn’t really “computer guy’s dream job” material anymore, as I understand it; it’s a good upper-middle-class place to work, much as Boeing was for years, but the fabulous cash and prizes that made people equate working there with winning the lottery during much of the ’80s and ’90s were basically gone by the time I showed up — at the very least, guys like me were the first generation who wouldn’t see any of that kind of benefit. In a way, that was lucky — the generation of employees immediately preceding mine were those who had to deal with loans taken out against options that, despite assurances that “the stock always goes up, way up!”, were underwater by 2000.

Part of why I don’t want to name the company is that, I guarantee you, they wouldn’t take me back in a million years, but I also wouldn’t want to go back in a million years. For me, that job was a means to an end, a way to support myself while I prepared for my next step as a singer without waiting tables. The work was okay, and it was cool being able to say that I did what I did, but I wasn’t very good at it, a lot of the internal processes weren’t intuitive to me even if I generally understood the basic logic of testing, and I wasn’t motivated to be constantly doing better at it the way everybody else around me was. I was an outsider in a lot of ways; I wanted a day job that allowed me to pursue a dream, not a lifestyle, and particularly at the time, you were expected to make it your lifestyle. Even if I wasn’t singing, though, I wouldn’t have had the motivation to do that, because the writing was on the wall with respect to the tech bubble and what that meant for the company’s stock within a year of being given the new hire stock option grant. Why kill yourself for alleged millions you know you’ll never see? Well, by the time I left, in the unit I was working in, the motivation was just to keep your job — they knew they had hired people in a tight tech labor market whom they wouldn’t have hired otherwise, and in 2003 they could afford to be more selective about new employees, as well as more threatening to existing ones. Part of the motivation to go back to school in the fall of 2003 was that I’d strongly suspected since summer of 2002 that I would need to find a way to quit before I got fired. The young artist programs I auditioned for in fall of 2002 didn’t pan out, and the clock was ticking; finishing my degree as far away from the Pacific Northwest tech industry as I could manage was a really attractive option.

(By the way, to this day, when I get the question “What did you test?” and I tell the person what “my” feature of the Major Product Line I worked on was, I’m immediately asked, “Oh, then maybe you can tell me how to turn it off? I hate that silly thing.” All I can say is, don’t blame me. I tried to tell The Powers That Be that users would hate it back in 1999.)

Anyway, I’ve been out of the software world for longer than I was in it, which is strange to me in a lot of ways. In the intervening seven and a half years, Apple overtook Microsoft in market capitalization. Google became the hot, millionaire-making company. Chrome became the browser to watch. iPhones and iPads came out. Facebook happened. Microsoft still puts out the dominant operating system and productivity suite, but that’s kind of along the same lines as Ford making the cars that cops and old people drive — it’s not really what shapes how people on the street think about cars. I remember somewhere around 2000 a friend of mine who was a Microsoft employee telling me, “In ten years, Microsoft is going to be thought of as more of a communications company than a software company.” Yeah, um, no, not so much. That’s what’s happened for Apple, but Microsoft has had to expend too much energy supporting its own weight to be able to innovate in the ways that my friend was anticipating — at the very least, to be able to translate those innovations into products that are compelling in the marketplace. In many ways, if the Justice Department had actually succeeded in breaking up Microsoft, it might truly have been the best thing for them, because they wouldn’t be weighed down as much as they have been for the last decade. They wouldn’t have become IBM, in other words.

I’ve been hearing rumors on one movie site or another since probably 1996 about a sequel to Tron. In 2002, there was a rumor that seemed substantial enough to prompt me to write a letter to Steven Lisberger, the director of the original, trying to pitch myself as a consultant on how to capture the look and feel of the offices of a modern software company. I got a polite letter back from Disney Studios a couple of months later just saying that no work was at present proceeding on a Tron sequel. In retrospect, I really should have kept on top of that.

I saw Tron: Legacy in IMAX and in 3-D on opening day (the first movie I’ve bothered with IMAX for since Watchmen, and the first of the new batch of 3-D movies I’ve seen), and I’ve seen it once more since. If nothing else, it’s a jaw-dropping visual accomplishment — it is easily one of the most beautiful movies I’ve ever seen, and I’ll also add my voice to the thousands out there who have praised the Daft Punk score (which clearly had a lot of help from Hans Zimmer, but never mind that now). I have to give Disney credit for the guts they’ve shown making an expensive sequel to a cult property 28 years later. I truly hope that they do the work of building the franchise. I think it’s an idea whose time has finally come, and I loved Tron: Legacy. I get the impression it’s become fashionable among my compatriot geeks to hate the movie, but it seems to me they’ve missed what the movie was doing. The original Tron was really based around a very simple idea, captured in the great Barnard Hughes’ line: “You can remove men… from the system, but we helped create it. And our spirit remains in every program we designed for this computer.” What an interesting idea — that a computer program, even something as simple as a compound interest calculator, retains the impression of its programmer. The religious nature of the idea is obvious — that of creating in one’s own image — and is underscored by Flynn taking on the form of a program, coming down from the user’s world to the computer world, saving the system by “dying,” coming back to life, and ascending back to his own realm.

Still, the general public’s understanding of computers in 1982 was pretty simplistic, and the special effects required to sell the idea left insufficient room, let alone vocabulary, to really mine the depths of the philosophical question. The most glaring question was — were these things alive? Well, maybe.

Tron: Legacy has been criticized for not reflecting the more sophisticated integration of computers into our daily lives into its storyline. Instead, it seems to go out of its way to avoid doing so — the computer world is on a private server that’s separate from the Internet and that hasn’t been touched since 1989. Wouldn’t it be more interesting, some reviewers have suggested, to see the battle between Clu and Flynn played out against the backdrop of the tech boom of the late ’80s and ’90s?

That would be an interesting movie, yes. However, it seems to me that what Tron: Legacy is going for is an exploration of the very question that the original sidesteps — are the programs in the computer world alive? If so, to what extent is that life similar to, and/or different from, human life? By presenting the computer world in Tron: Legacy as something separate from the technological advances in “the real world,” it can be explored more freely — what would happen if the world we saw in the original was just left to develop on its own for two and a half decades? That chip around Sam’s neck at the end will presumably suggest a way that the Grid can be integrated into the worldwide computer network of 2011, and if Kevin Flynn’s consciousness is still in there somewhere, then perhaps we might see him appearing to people on the Web — something like Count Zero. The religious ideas here are also very plain — Clu, like Lucifer (same first three letters!), cannot create new programs, he can only repurpose (“rectify”) or destroy existing ones. He intends to lead an army of repurposed programs into the “real world” — to wage war on heaven, in other words.

The main problem that Tron: Legacy has, as I see it, is that Jeff Bridges’ iconic performance since the original isn’t Preston Tucker (a criminally underrated performance in a criminally underrated movie) but rather The Dude, and The Dude is already not unlike an older Kevin Flynn. So, now that you’ve got Jeff Bridges playing an older Kevin Flynn, parts of it will inevitably come across as Dude-like. Oh well; it’s not Jeff Bridges’ fault that they didn’t make a Tron sequel earlier.

This last week I also finally had occasion to watch The Social Network. Whatever its historical merits may or may not be, it’s a fantastic movie. David Fincher, Aaron Sorkin, and the entire cast (including Justin Timberlake! Who knew?) knock it solidly out of the park, and in a lot of ways it’s a very perceptive generational portrait, not entirely dissimilar to David Fincher’s earlier perceptive generational portrait, Fight Club. In fact, one of the things that’s interesting about The Social Network is how it’s a period piece about a generation that would have been very recently influenced by Fight Club. Certainly one can see the Mark Zuckerberg-Eduardo Saverin-Sean Parker triangle as sharing outlines with Fight Club‘s Tyler Durden-Narrator-Marla Singer relationship, although it’s not possible to see a 1:1 relationship. Both Eduardo and Mark have qualities that the other desperately wants, and while Sean Parker certainly functions as a seducer in a lot of ways, the film suggests that he’s the real wannabe of the threesome.

The Social Network is interesting on a personal level for me for two reasons — first, it picks up in fall 2003, almost exactly where I left off in the technology industry. Second, it’s a timeframe and a narrative into which I can easily place myself (I first heard of Facebook probably around fall of 2004). That’s at once interesting and unsettling — interesting because I know where I was at virtually every moment of the film, unsettling because that makes it very easy to compare trajectories and accomplishments.

In a lot of ways, I’d argue that The Social Network and Tron: Legacy are curiously appropriate companion pieces. “Now we’re gonna live online,” Aaron Sorkin has Sean Parker saying in The Social Network, and that’s the very dilemma Kevin Flynn is dealing with by the time his son finds him in Tron: Legacy. Both films are about a software creation that ultimately gets beyond the creator’s control; both depict said software creation effectively freezing the creator at a certain age (the express goal of Facebook, according to the film, is to put the social experience of college, that is to say the social experience of a certain period of youth, online); both offer interesting commentary on the current state of software as a business and the people who run that business. Tron: Legacy gets this part just about exactly right; the ENCOM board meeting is pretty accurate with respect to my experience of those kinds of conversations. Alan Bradley asks what makes the new version of the ENCOM OS 12 different; he is told, “We put a ’12’ on the box,” only to then have Ed Dillinger, Jr. (it’s a very interesting thought to me that Cillian Murphy might be this generation’s David Warner) quickly assert that it is “the most secure operating system” in existence, as though fixing what should have been in place to begin with is actually the same thing as having a feature set that’s compelling for a new release. And, of course, the snafu with the release of OS 12 is intended to evoke Windows 98 bluescreening on Bill Gates at COMDEX.

However, from the point of view of The Social Network, the kind of company that occupies skyscrapers and has “big doors” and even has a “golden master” onsite somewhere is a dinosaur. It’s a business model that has nothing to do with how Zuckerberg has become the youngest billionaire in the world. In fact, the more anybody tries to pin a particular business model to Facebook, the more Zuckerberg claims they don’t get it. The film has him attending a talk on Harvard’s campus by Bill Gates, but he doesn’t come across as exactly inspired by Gates’ reminiscences about what computers were like when he wrote BASIC. He seems to be there more out of polite indifference than anything — Bill Gates is yesterday’s news to him. He could have sold an earlier invention to Microsoft and he didn’t — the film explicitly has Zuckerberg awkwardly shrug rather than explain this, but we’re left with the impression that he just didn’t want the old folks in charge of his ideas. Sean Parker is his idol, the guy who lost billions of dollars but still brought down the recording industry as we know it. Maybe Gates can be seen as Flynn in Tron: Legacy — an aging creator who cannot leave his own creation. Flynn’s discovery of the isomorphs may well have had the potential to change the world, but as Zuckerberg might see it, no one would care without somebody like him to make them “cool”.

Barnard Hughes has another great line in the original Tron: “The computers and the programs will start thinking, and the people will stop.” The Social Network seems to argue that as long as a sense of connection with other people is sufficiently simulated, then we users won’t necessarily see this as a bad thing, and it’ll be fun. Tron: Legacy suggests that this idea is actually what will ultimately isolate us as individuals from everybody else, rather than connect us. We’ll be very comfortable prisoners, as Flynn is, but we’ll be prisoners nonetheless.

Embracing paleostructuralism

It is late afternoon on Wednesday, and I have somehow managed to accomplish everything I needed to accomplish by this time. On Friday, this seemed like a goal that was unattainable, so I am reasonably pleased.

Somebody mentioned to me this last Saturday, “I occasionally read your rants against post-structuralism.” It had not been explicitly discussed in class that Foucault and company actually constitute an “-ism”, so I’m sure I was a deer in the headlights for a second while I figured out what my friend meant. Flesh of My Flesh has been explicitly exposed to more theory than I have, so I’ve been hearing about the supposed difference between signifier and signified for some time, but again, that this movement had a name was new information for me. A couple of things clicked once I understood the label; this is the same friend who a few years ago overheard me saying that it made no sense to me to read modern ideas of sexual equality and identity into texts for which those ideas would be anachronistic, and consequently chided me for “not believing in gender theory,” adding, “Applying theory is not ‘reading something into’ anything. That’s just you having an ideological problem.”

For all I know, maybe he’s right. He’s in the English department, and maybe there’s a way these things actually make sense from the standpoint of literature. Maybe, too, this is the difference between a “scholar” and an “intellectual” — I do not give a fat, furry, flying rat’s hindquarters about theory. I have not entered an academic discipline because I am interested in the “isms” which seem to plague the humanities right now. (I am told that “thing theory” was rather well-represented at last week’s Byzantine Studies Association of North America conference, which makes me want to tear out my own teeth with a rusty screwdriver.) I have entered an academic discipline because, funny and naïve and idealistic as it may sound, I am actually interested in, and even like, my subject of study.

What does that make me? A paleostructuralist? If so, then so be it. (“Paleostructuralist” sounds cooler and more dignified than “anti-post-structuralist” anyway.)

I still have more to write on Foucault in this space, but it’s going to have to wait a bit yet while I finish some other things. In the meantime, my most recent (and last) response paper for my “Introduction to the Professional Study of History” course starts to sketch out some of the thoughts that will show up there. Certain elements will be no surprise to those who visit here somewhat regularly; there are a couple of moments where it will be evident that I just got through watching all of Christopher Nolan’s movies in chronological order (which merits its own post); and the couple of somewhat coy suggestions that certain things should be discussed elsewhere will be developed in my final paper for this course.

The Safe Retreat into Omniscient Third-Person:

The Problem of Historicizing Oneself

Or

A Response to Kate Brown’s “A Place in Biography for Oneself”

(As Well as a Number of Other Bits and Pieces from the Fall 2009 H601 Course)

“Historians,” writes Kate Brown in her essay “A Place in Biography for Oneself,” “expose other people’s biographies, not their own.”[1] How can this be, however, when according to Marx, “[m]en make their own history” [2]? How, ultimately, may historians be their own agents of history while being true to their own profession? How might historians assume the first person voice in their own work, that is to say, our own work, or still more to the point, my own work – honestly?

To expand Marx’s quote, men make their own history, “but they do not make it just as they please; they do not make it under circumstances chosen by themselves, but under circumstances directly found, given and transmitted from the past.” Brown certainly did not choose her circumstances. She is from a small Midwestern town whose economic history could have stepped out of the pages of The Marx-Engels Reader; in her home town of Elgin, Illinois, as she tells it, the beginning of her life intersected with a narrative of Western expansion, labor strife, industry flight, economic redevelopment, and gentrification.[3] Her own retelling of the story gives significant credibility to Marx’s claim that “[t]he tradition of all the dead generations weighs like a nightmare on the brain of the living”[4]:

From Elgin… I came to understand how closely one’s biography is linked to one’s place… I recognized the impulse to bulldoze and start over, to push on toward a brighter, cleaned-up destiny, which meant abandoning some places and people and losers of an unannounced contest.[5]

The past – that is to say, one’s history – and its relationship to location are a weight that one must learn to carry or learn to jettison. Perhaps this can be understood as an inversion of the opening line of Pat Conroy’s novel The Prince of Tides – rather than the wound being geography, the anchorage, the port of call, it is geography, and the confluence of circumstances that one encounters in that geography, that is the wound.

All well and good — but how real is this confluence of circumstances? How objectively may its existence be assumed? Per Benedict Anderson and his analysis of how seemingly disconnected events make up the front page of a newspaper, perhaps not much:

Why are these events so juxtaposed? What connects them to each other? Not sheer caprice. Yet obviously most of them happen independently, without the actors being aware of each other or of what the others are up to. The arbitrariness of their inclusion and juxtaposition… shows that the linkage between them is imagined.[6]

What, then, is the difference between one’s life and the front page of a newspaper? Do they both represent a constructed – that is to say, not objectively real – and affected way of arranging events? For the historian, how does that construction and that affectation influence how they read history, view history, and write history? How does understanding how one’s life interacts with one’s work impact either, for better or for worse?

As a scholar, I have been carefully trained to avoid using the first person in my work. “Don’t ever say things like ‘We can see the following…’ in your research,” I remember being told in one undergraduate course. “This is not a journey ‘we’re’ going on together. It’s a research paper.” My training in languages also tends to inform how I view texts – “Read what it says, not what you think it means,” my first Greek instructor repeatedly told our class. My research goal, therefore, is typically to state a clear, impersonal thesis and then get the hell out of the way of my own argument, simply letting the facts and the observations speak for themselves as much as possible. If I present it as something that “I” think, then I will have fundamentally devalued and undermined my argument – why should anybody care what I think?

Naturally, there is far more to it than a hope to rest comfortably on objectivity. Why should anybody care what I think, indeed. I’m a nobody, a college dropout from nowhere, a first-generation college graduate at the age of 29, having taken eleven years to finish a four-year degree (a B. Mus. at that, not a liberal arts degree), who then, even with good grades and test scores, still had to do three years of coursework as an unmatriculated student before there was any way to be competitive for graduate schools, all the while hearing from a chorus of professors, “I’m more than happy to write you a letter of recommendation, but I’m not sure you’re going to be able to get there from here.” Why should anybody care what I think? Good heavens, I will need to make sure I publish under a pseudonym just to be taken at all seriously. Better yet, I should somehow indicate on my C. V. that I simply sprang forth fully grown from the head of Zeus with my PhD already in hand.

But there is still more to it than that, surely. I’ve been at Indiana University in one capacity or another since 2003, somewhat ironically making it the longest I’ve ever lived anywhere. My family bounced around a lot for reasons best recounted elsewhere, and even now, they live, quite dispersed, in places I have never lived, in houses I never called home, in zip codes I never visited until they moved there. Brown can rely on her connection with the place of Elgin, Illinois as an anchor for where she is now, but I am literally from nowhere, in the sense that I have had to construct my notion of “home” from different raw materials than place and family, and I find it very difficult to relate to concepts of home that do center around place and family. If my family had moved around for reasons having to do with the military or career development, then I might be able to legitimately claim – as a friend of mine, the son of a prominent Russian History scholar, does – to be a “citizen of the world,” to be from everywhere. Alas, I can claim nothing quite so romantic or interesting. Robert Frost once said that home is where, if you have to go there, they have to take you in, but the places where that is even marginally true are places that have never actually been a part of my life. If Brown is correct that one’s biography is closely linked to place, then I truly am the Nowhere Man – so again, why should anybody care what I think?

But, of course, there is still more to it than that.

“In my quest to explore the human condition,” writes Brown, “I have hidden behind my subjects, using them as a scrim to project my own sentiments and feelings.”[7] There is an undeniable connection between who somebody is and what interests them; for her own part, Brown describes this connection by saying, “I believe that I was able to see stories that had not yet taken shape for other historians because of the sensitivities I acquired in my past.”[8] My advisor, Professor Edward Watts, is potentially an example: he is an academic raised in a family of academics, with two academic parents and an academic sister. What was the subject of his dissertation? Rhetorical education in Late Antique Alexandria and Athens. As I told him after I read the book, it is difficult not to see his work as having an aspect of meta-commentary on the academic life. He chuckled and said, “You wouldn’t necessarily be wrong.”

Beyond that example, I saw with my own eyes how the personal connection between historian and subject might manifest with my colleagues during orientation and initial class meetings:

“Hi, I’m Roberto Arroyo, and I’m interested in Latin American history.”

“My name is Isaac Rosenbaum, and I do Holocaust history.”

“I’m Lakshmi Patel, and I’m studying the history of relations between India and Pakistan.”

The Late Antique Byzantinist whose last name is not “Ioannides” or “Sotiriou” is left at something of a disadvantage in such company. Yes, there is, in fact, a personal reason that connects me to my subject of inquiry, a personal reason that should not be too hard to surmise for the careful observer (but one that is best discussed in another setting), but a personal reason that is nonetheless internal, abstract, and conceptual rather than immediately and concretely constructed by place or family – that is to say, by the circumstances which I did not choose. I have personal stakes that led me to my areas of interest, but because they are of my own choosing I must be circumspect in how I speak in terms of “I”, “we”, and “our” if I am to be seen as having sufficient distance from my subject to be credible as a scholar. Edward Said and Dipesh Chakrabarty appear adamant that cultures and societies must define themselves, that to not allow such self-definition is cultural imperialism,[9] and yet this mandate of courtesy with respect to communal identity does not appear to extend to those who have embraced certain communities voluntarily.

Of course, I also have the problem that I am not interested in my subject from a critical point of view; I find it anachronistic to explicitly read whatever my own political beliefs and values may be – and, for today’s purposes, we may broadly describe them as uncomfortably conservative as Russell Kirk defined the word, which according to contemporary definitions probably makes me liberal – into my historical subject, but per Elizabeth Blackmar as quoted by Ted Steinberg, we historians are not supposed to evade the question of politics.[10] According to Steinberg, the role of the historian in the present day is evidently to explore “the history of oppression,”[11] and this attitude is one I see largely borne out in my cohort. Nonetheless, the reality is that such a history is not the history of the Late Antique Eastern Roman Empire I have any desire to write. I have better things to do than study something with the express purpose of tearing it down. I fundamentally believe it is possible to be more productive and constructive – but do I only believe that because of my other beliefs in the first place? Is my choice of the word “constructive” itself telling, possibly signifying that I would rather buy into the social constructions that historians are supposed to deconstruct? The 3rd person voice of objectivity keeps me from having to mess with such potentially treacherous questions.

If men make their own history, but not under circumstances they choose for themselves, and history is supposed to be the history of oppression, then must a historian writing their own history engage in self-hatred by definition? Brown’s essay does not appear to be a piece of self-hatred, but it is clear that she is uncomfortable with the implications of her own essay – “My palms sweat as I write this… The intimacy of the first person takes down borders between author and subject, borders that are considered by many to be healthy in a profession situated between the social sciences and the humanities.”[12] Chakrabarty suggests one possible way out, explicitly referencing autobiography and history as two separate and distinct genres[13] – so not only is autobiography, the history of oneself, not history, but history isn’t a discipline anyway; it’s a genre. But here is the rub – if history is a genre somewhere “between” the social sciences and the humanities, and a historian writing their own history must find a methodologically honest way to not engage themselves at the level of self-hatred, which then in fact moves the work into a different genre altogether, then the historian can never actually engage in a real work of self-historicization that is not self-mutilatory.

At any rate, can we claim objectivity anyway by avoiding biographical detail or the first person? In a post-structuralist world where we must assume a fundamental disconnect between signifier and signified, does it really matter to begin with? Or is a research paper written in the omniscient third person much like Bruno Latour’s depiction of the laboratory[14] or Bonnie Smith’s history seminar and archive[15] – a socially constructed, that is to say false, space of knowledge-based privilege that can assert authority it does not actually have simply because a particular group of people have become convinced that it does?

I do not have answers to my own questions, posed at the outset of this musing. I am not certain where to go with them. My inclination is to say the various circumstances of my own life may appear as arbitrary as Anderson insists the front page of the newspaper actually is, but by virtue of the very fact that I in fact experience those circumstances in chronological order, I nonetheless perceive them as my own narrative. My inclination is to say that I cannot be forced to historicize my own life as a history of oppression any more than I can legally be required to self-incriminate in a court of law. My inclination is to say that nonetheless, I am better off keeping my arguments in the third person and keeping my “self” out of the voice of my own work, that regardless of what I think, we all know what a coffee table will feel like if we rap it with our knuckles, and that in saying that I am not privileging people who have hands or who do not have nerve damage. My inclination is to say that there must be a world outside of our own minds, and that there must be a way we can discuss it, even if our own minds tell us how we’re going to organize our perceptions of that world. Are these words and ideas too strong, too dangerous, too naïve, too uninformed? I do not know, but I do not know where else to start.

And perhaps that is why it is good I work in a period many people find irrelevant. It keeps me from becoming a danger to myself or to others.

Works Cited

Anderson, Benedict. Imagined Communities. 2nd ed. New York: Verso, 2006.

Blackmar, Elizabeth. “Contemplating the Force of Nature.” Radical Historians Newsletter no. 70 (1994).

Brown, Kate. “A Place in Biography for Oneself.” American Historical Review 114 (2009): 596-605.

Chakrabarty, Dipesh. “Postcoloniality and the Artifice of History: Who Speaks for ‘Indian’ Pasts?” Representations, no. 37 (1992): 1-26.

Latour, Bruno. “Give Me a Laboratory and I Will Raise the World.” In Science Observed: Perspectives on the Social Study of Science, edited by Karin Knorr-Cetina and Michael Mulkay, 141-70. London: Sage, 1983.

Marx, Karl. “The Eighteenth Brumaire of Louis Bonaparte.” In The Marx-Engels Reader, edited by Robert C. Tucker, 594-617. New York: W. W. Norton and Company, Inc., 1978.

Said, Edward. Orientalism. New York: Vintage Books, 1994. Reprint, 2003.

Smith, Bonnie. “Gender and the Practices of Scientific History: The Seminar and Archival Research.” American Historical Review 103, no. 4 (1998): 1150-76.

Steinberg, Ted. “Down to Earth: Nature, Agency, and Power in History.” American Historical Review 107, no. 3 (2002): 798-820.


[1] Kate Brown, “A Place in Biography for Oneself,” American Historical Review 114 (2009), 603.

[2] Karl Marx, “The Eighteenth Brumaire of Louis Bonaparte,” in The Marx-Engels Reader, ed. Robert C. Tucker (New York: W. W. Norton and Company, Inc., 1978), 595.

[3] Brown, “A Place in Biography for Oneself,” 600-3.

[4] Marx, “The Eighteenth Brumaire of Louis Bonaparte,” 595.

[5] Brown, “A Place in Biography for Oneself,” 604.

[6] Benedict Anderson, Imagined Communities, 2nd ed. (New York: Verso, 2006), 33.

[7] Brown, “A Place in Biography for Oneself,” 603.

[8] Ibid., 605.

[9] Edward Said, Orientalism (New York: Vintage Books, 1994; reprint, 2003). Dipesh Chakrabarty, “Postcoloniality and the Artifice of History: Who Speaks for ‘Indian’ Pasts?,” Representations, no. 37 (1992).

[10] Elizabeth Blackmar, “Contemplating the Force of Nature,” Radical Historians Newsletter, no. 70 (1994): 4. Quoted in Ted Steinberg, “Down to Earth: Nature, Agency, and Power in History,” American Historical Review 107, no. 3 (2002), 804.

[11] Steinberg, “Down to Earth: Nature, Agency, and Power in History,” 802.

[12] Brown, “A Place in Biography for Oneself,” 603.

[13] Chakrabarty, “Postcoloniality and the Artifice of History: Who Speaks for ‘Indian’ Pasts?,” 8.

[14] Bruno Latour, “Give Me a Laboratory and I Will Raise the World,” in Science Observed: Perspectives on the Social Study of Science, ed. Karin Knorr-Cetina and Michael Mulkay (London: Sage, 1983). Accessed online at http://www.stanford.edu/dept/HPS/Latour_GiveMeALab.html on 9 November 2009.

[15] Bonnie Smith, “Gender and the Practices of Scientific History: The Seminar and Archival Research,” American Historical Review 103, no. 4 (1998).

Conceptualizing the “liberal bias” of academia

Have I mentioned I’m glad I’m not a modern historian? Seriously. So much of the scholarship of modern history I’m reading in my “Introduction to the Professional Study of History” course is angry, ultra-liberal work that arrogates to itself a point of view of objective correctness, using theory as a blunt instrument against people, institutions, and events with which/whom its authors might disagree politically. Anything that might discuss an event or institution without criticism is dismissed as nationalistic, conservative, anti-intellectual nonsense. There’s a strain I perceive among some of my cohort of choosing to be a historian because of a particular anger about a particular issue — colonialism, nationalism, treatment of one group or another, and so on.

But hold on. Is that really what’s happening? What is the “liberal bias” of academia, really? Does it actually exist? Would those whom conservatives accuse of having a liberal bias actually see it that way themselves (and, alternately, would those conservatives recognize a corresponding conservative bias)? What’s really going on?

What I’m starting to wonder is this — is what some perceive as a “liberal bias” not much more than the very human reaction to the horrible things of the 20th century, but that reaction occurring in a post-Reformation, post-Enlightenment, postmodern world? Is it as simple as a group of well-meaning, intelligent people saying, very understandably, “These are awful, evil things! How do we explain them, understand them, and prevent them?” Except with the caveat that the structures that might exist to help explain them, understand them, and prevent them, are no longer seen as reliable?

Perhaps we’re in an age where what we’ve got is “choose your own adventure humanity” — there’s no reason for society to assent to a particular religion, but you go ahead if you want. There’s no reason for society to recognize as legitimate any particular power of the state, but you go ahead if you want. There’s no reason for society to acknowledge and privilege any of the constructs society used to acknowledge and privilege, but go ahead if you want. Don’t agree with X? Great, don’t do it, but don’t tell somebody else they can’t, because there’s no legitimate framework to do so. As a consequence of these points, there’s no reason for any particular group of people to have any particular advantage or privilege, perceived or real, over anybody else; not only that, but there is no legitimate definition of a difference of function that asserts a lack of difference of privilege, because there is no institution privileged to make that distinction, and any institution that would assert the privilege to make that distinction must automatically be seen through the lens of power relationships.

The end result is very well-meaning, very humane people trying to solve humanitarian problems out of context, which winds up being perceived as “liberal bias”, but it isn’t, really. It’s just that they’ve backed themselves into a theoretical corner. From a Christian standpoint, what we might say is that these people can perceive — and quite unmistakably so — the effect of the Fall, but they don’t have any means of actually discussing it meaningfully. The anger I sense in the scholarship I’m reading and in some of my colleagues is maybe not poorly motivated, but the only way they have to talk about it is in terms of historical constructs like colonialism, nationalism, racism, gender inequality, and so on — Foucauldian language regarding power and domination emerges as a seemingly sensible way to discuss historical problems.

Christians also wind up being backed into the same corner, and have to at least discuss problems that are a result of the Fall as though the Fall never happened. Even Christianity has to function according to the rules of a postmodern, post-Christian world, in other words.

Is this an impasse? Perhaps to some extent. Bad things continue to happen; people continue to have a very human response to said bad things. It’s not a liberal vs. conservative problem; the problem with conservatives is that the potential is there to go to the opposite extreme — “Oh well, it’s a fallen world, you can’t make an omelette without breaking eggs, nothing you can do about it except be thankful if you’re on the winning side and hope the Second Coming happens before things get much worse,” being how I might broadly sketch out such an extreme.

Understanding the problem as a liberal bias is not ultimately going to be helpful, I don’t think. I think we can assume more often than not that people take certain positions in good faith and with good intentions (although by their fruits shall ye know them, of course). Kicking against the goads of a perceived liberal bias isn’t going to change anything; what might change some things — and more importantly, what might change some minds and hearts — is providing well-reasoned, persuasive arguments for alternative theoretical understandings, and doing so within the context of a genuine Christian witness. At times that may very well mean having to be a witness in the sense of “martyrdom,” but it’s hard to deny that that can be a necessary part of the deal.

Which reminds me — I still have more to say about Foucault. I haven’t forgotten about that, and I’ll talk about that reasonably soon.

Reflecting on shame and identity

Rod Dreher has what is, at least for me, a very thought-provoking essay over at Crunchy Con. Actually, “thought-provoking” is a euphemism. I’ll be blunt — it hits what is, for me, a permanently raw and open nerve. The next 3200+ words will reflect this. Turn back now if you don’t want me to take you there.

Mr. Dreher begins with a discussion about shame, obesity, and race, and how he has personally seen them function and interrelate as someone who considers the American South home. He takes it someplace else, however, and the key part for our purposes comes at the conclusion:

A fellow Southern exile once said to me that it’s so easy to love where we’re from when we don’t live there, because we can edit out the stuff that’s hard to live with. That’s very true. And yet, I confess it’s hard for me to feel quite at home anywhere else. When I go back to visit, there’s something about the place and its people I dearly love, and treasure as part of myself. […] [However,] I chose to separate myself from it (and anybody who thinks Dallas is the South is sadly mistaken; it’s the southernmost Western city)… [F]or me, [what motivates my writing is] a sense of cultural rootlessness, and a craving for a sense of belonging to a place. Too much has happened to me over the years to form the kind of man that I am to make me feel at home in my actual homeland. And yet, when I’m away from Louisiana, I think about it a lot, and long for it. True story: I used to walk around Brooklyn romanticizing Louisiana, then go back to Louisiana and after a few days, start pining for my old borough home in Yankee Babylon.

[…] For me, displacement and a resulting craving for authenticity. But the fact that I chose displacement and exile adds a shake of shame about disloyalty into the cocktail too. […] Me, I don’t have anybody to show anything to (this was the greatest gift living and working in New York City gave me). When I sit down to write, almost always I think not about showing myself, but about finding myself.

I’m somewhat circumspect about elements of my personal life in this forum. This is not out of any sense of needing to protect anything, exactly; leastways, it’s not about protecting myself. Anybody who happens upon this blog knows my real name, my wife’s name, and more or less where to find me; this may or may not be wise, but there we are. There are certain things I have not discussed here, like politics, because I don’t want to detract from my main objective — namely, some record of my path as an Orthodox Christian on the way to something vaguely resembling an academic career. It’s also, by far, primarily for my own use, rather than being intended as any kind of a public news service. So, since I find myself heavily burdened talking about politics — feeling in the main that I ultimately can’t pick a side because I’m not at all sure anybody is on my side — I just don’t go there for my own sanity.

Other things I haven’t discussed simply out of respect for the fact that the blog is public, and I have to be mindful of what that can mean. I was stuck in a horrible, horrible, horrible employment situation until April 2008 that I could not (and can’t) discuss here, because if certain parties were to run across my blog, it would only make things quite a bit worse. Even once I was out of that situation, I had to be careful about how I discussed the unexpected ways my grad school opportunities were developing, because out of respect for my new employers, about whom I cared very much, I needed to time how I told them what was happening in a particular fashion.

All that out of the way, Dreher’s essay hits home for me in a number of ways, not the least being shame over the struggle with weight I’ve had as long as I can remember, but even more in how he discusses his sense of displacement. Unlike him, I have no particular pride in any particular place as home — but I’ll talk about that in a bit.

There’s not much to say about my weight that’s, um, earth-shaking — as I’ve said before, my ancestors were swinging battleaxes in northern Europe; I grew up swinging a backpack full of books, there was never anything about sports that was terribly attractive to me growing up, and I have spent much of my adult life behind a desk of some sort. My parents both had weight struggles they didn’t want me to have, which unfortunately meant that my weight as a child was monitored with the unapologetic and militantly nasty use of shame as a motivating tactic. (This still gets hashed over from time to time even today, and the parent who primarily engaged in this practice continues to defend their techniques, saying that they did these things because they wanted me to be healthy, and the only alternative they saw was to simply not care. That what they did didn’t actually work is only evidence to them that they didn’t do it enough for it to truly be the behavioral deterrent it was intended to be.) A growth spurt in junior high made me tall and reasonably thin (not skinny, I guarantee you — my frame does not lend itself to skinniness to begin with) for the first time in my life, and I mostly stayed that way strictly by virtue of having a teenager’s metabolism. I put on a lot of weight my sophomore year of college as a result of various stresses (which I will discuss), lost it the next summer from even more stress, gained it all back (and how) once the school year started up again, and then got back down to my freshman year weight (more or less) about ten years ago. It stayed off roughly until my wedding, at which point it crept back on. When I moved to Indiana, a fencing class and a soccer class my first semester here took care of a good chunk of it, but then required courses edged further such intentions off of my schedule, and it came back on.
For the last fourteen months, I have diligently made use of a treadmill, which between August and June took about ten pounds off very slowly; walking around Athens for two months got rid of another fifteen, and while much of it came back once I returned to the States, the addition of hand weights and other exercises to my routine has gotten me down to within five pounds of where I bottomed out in Athens. I am down two belt holes from where I was in August of 2008 one way or the other, and while the weight loss is slow, there is some very clear weight redistribution happening, as well as a development of muscle tone that didn’t used to be there. It’s a problem that anybody who has known me for any length of time knows I know about; the irony is that I am not sedentary by any means — I walk everywhere I can, in addition to the intentional exercise I get — but I also still eat the teenager’s portions of an adult diet, so I have to be very intentional about being active. This is more difficult when I’m not happy about large chunks of my life, and that’s been the case for most of the last six years. In the last year that has changed in some big ways, and my hope is that the physical aspect will also change concurrently. So that’s that.

The displacement issue is more complex. I was born in Anchorage, Alaska, which is where both of my parents had been born and raised and where the vast majority of their respective families were; when I was four, due to some disagreements over business matters within the family, we moved to Wenatchee, Washington, where my dad tried to reinvent himself as a small businessman. Wenatchee wasn’t an active enough town for him, however, so we moved to the Seattle area when I was seven; he bought another small business that he was going to try to grow, and we built a big house with the intent of it being the family homestead. Four years later, a combination of factors, including economic collapse in Alaska and further business disagreements within the family, led to us basically losing everything. Over the next five years, we bounced from rental house to rental house, my mom went back to work, and my dad poured more and more of his soul that he wasn’t going to get back into a business that really couldn’t exist anymore (namely, office supplies) given the initial appearance of big-box competitors in the late ’80s/early ’90s.

In 1993, Dad moved back to Alaska, having been offered a job by an old friend. As he likes to say, he was lured back to Anchorage with one word: “Saturday.” It was my senior year of high school, so the plan was for Mom and I to stay in Washington until I graduated, after which she’d move up there with him and I would start my freshman year of college at Western Washington University in Bellingham, Washington.

My senior year was a real struggle; not having a father at home, that year of all years, was a nightmare, and it wasn’t easy for any one of the three of us. It wasn’t easy for the two of them getting along with each other, and it wasn’t easy for me getting along with either of them. It was made worse by the fact that I started dating for the first time that year, and I also developed a close relationship with a couple of male teachers as sort of surrogate father figures, all developments my parents had trouble regarding with anything but suspicion and resentment.

The day before I graduated high school (Inglemoor High School, class of ’94), my dad flew into town. The day after I graduated high school, he and my mom flew back to Alaska, leaving me behind to supervise the load-in of their moving truck. I spent the summer going back and forth between Anchorage and Seattle, decidedly not feeling at home in a place of which I had no particular memory, and not being allowed to feel at home in the place that had been home for the previous ten years.

Freshman year at Western was a disaster. I had been to the campus all of once before; we lacked the resources, in terms of time or money, to really launch any kind of a school visitation effort, and the main reasons we picked it were because it wasn’t University of Washington, but it was in-state, close enough to home, and yet far enough away. With my parents’ move, however, none of these really meant anything anymore. I was at a school with no good reason to be there. I had no family left in the place that I had considered home for two-thirds of my life, and had no place to go back to that I could really call home. The place where my family now was, despite being my birthplace, was unfamiliar, and being now at the beginning of adulthood I had no compelling reason otherwise to be where they were. In short, I felt like they had left me. Unfortunately, when it became clear this arrangement wasn’t making any of us happy, the rhetoric that I heard more often than not was that of me having left them.

I was being told I no longer belonged where I had grown up, and the place where I was told I now belonged, by virtue of my parents having moved there just as I was starting college, was not anyplace that felt like home. The end result was that I felt I had no home at all, and I never really felt comfortable at Western. My two and a half years there were a miserable attempt at trying to eke out an undergraduate existence with no familial or financial support; lacking any particular guidance, I made very poor decisions during that time regarding money, my heart, and school (among other things). I spent the summer after my sophomore year in Anchorage trying to figure out how to put my relationships with my family back together, and found that at that particular point, there simply wasn’t anything to reassemble. My parents couldn’t deal with each other at that stage of the game, let alone me, and that summer was the lowest I had ever been up to that point. I lost weight simply by virtue of the fact that I wasn’t eating or sleeping for about a month; eventually a psychiatrist put me on both Soma and Serzone, and things evened out enough for me to be able to survive the rest of the summer. (I took myself off of both as soon as I returned to Bellingham.) After one quarter back, however, it was clear that college was something I just wasn’t in any position to pursue properly, and I dropped out, having thoroughly ticked off virtually everybody in the Department of Music with my inability to cope and what had become a tendency to lash out.

I now had no particular reason to stay in Bellingham, I wasn’t going to move to Alaska, and that left me with no real place to go except back to Seattle. I started job hunting, selling classified ads in Bellingham for the time being to at least have some way to live, and after a year finally had the opportunity to take a contract position (which was fulltime within a year) with a Major Software Company.

Life settled into enough of an equilibrium over the next year, and my parents appeared to have put enough of their own lives back together, that there seemed to be some kind of peaceful relationship that could exist with me in Seattle and them in Anchorage. They still nudged and wheedled me to consider moving to Alaska, but the fact remained that beyond the two of them, there just wasn’t any reason for me to be there. I sometimes thought that maybe once they’d retired, they would move back to Seattle; I dreamed that, having made my initial couple million working in the software industry (with subsequent millions to be made as the Great American Hope of lyric tenors, of course), I could buy back the house they built that was supposed to be the Barrett family home and give it to them.

Except that, about the same time that Megan and I started dating, in the early spring of 1999 (she and I having met freshman year at Western, so I guess it wasn’t a total disaster), my parents announced that they were getting a divorce. This was not the first time they had made this announcement, but this time it was final. The one time my wife ever saw us all together, apart from our wedding day (when they studiously avoided each other as much as possible — the family photo has them at opposite ends of the line), was a few days before the divorce was finalized, and they were yelling at each other over a snowblower.

By 2001, we were married, and shortly thereafter my dad left Alaska to spend a couple of years in West Virginia. By 2003, we moved to Indiana so that I could finish my undergraduate degree; part of the idea was that we’d be five hours away from Dad, which would have been the first time in a decade that I had lived within driving distance of a parent. Shortly before I left for Bloomington, however, he headed off to Phoenix, Arizona to be near the older of his two daughters (from his first marriage) and her children. Living near family was simply not to be.

Six years later, we’re still here. We’ve lived in our little rental house for a tick over four years; at 32 (less than a month away from 33), it is the longest I have ever had the same address. Ever.

My dad is still in Phoenix; my mom is in Wasilla, Alaska. Megan’s family is in the Seattle area. There is no one place we can ever live and be reasonably close to everybody.

Jaroslav Pelikan once quoted Robert Frost in saying that “home” is where, when you have to go there, they have to take you, but “home” has come to mean, at least for me, where my wife and I are able to share a life. It has no meaning relative to roots or family, at least not for me; for it to take on that meaning would mean choosing between my parents individually, or choosing between her family and mine. Everybody has an argument for why we should do one thing or the other, but there’s nothing we can do without having to make a painful choice. In some ways, it seems best to live near nobody, thus treating everybody equally.

Like Mr. Dreher, I crave roots and something authentic, but unlike him, I feel at home precisely nowhere. I have never walked around Bloomington pining for Anchorage (or Seattle, for that matter), nor vice versa. My parents both live in houses in which I never lived, in zip codes I never visited until after I was an adult. The place where I grew up has exactly nothing to offer me now. I have lived for years with a lot of shame and pain because I belong nowhere, but not because of a sense of disloyalty — unlike Mr. Dreher, I didn’t choose this displacement. Ironically, it exists precisely because I didn’t choose it: my parents moved away just at the very moment I needed them to stay put. (I will emphasize that I say this descriptively, not to assign blame; they did what they had to do. That has not made it any easier over the years from an experiential standpoint, however.) No, my shame is that I have no roots, no sense of home, to pass on to my own children once I have them. I have nothing to give them but the culture of a stray, a transplant. A stray who married up and who gets to eat pretty well for a stray, but a stray nonetheless.

Like Mr. Dreher, much of the work I have chosen is implicitly a means of trying to “find myself”; unlike Mr. Dreher, I’ve been trying to “show them myself” for years — to show “them” that I’ve risen above the rootlessness, the struggles with finding a path, the forced independence, the displacement, the lack of any visible connection to anything except a woman who loves me with all her heart, the confusion about how to simply be that burdens me from lack of guidance. I’m still trying to “show them,” which I guess means I haven’t actually accomplished rising above any of those things yet.

I don’t know if this pain ever goes away. I managed to get saddled with it at 17, and I keep waiting to feel normal again, keep thinking that understanding of the last several years is right around the corner. I at least feel less stuck than I have in years, like I’m working towards something now, something productive, so maybe it really is right around the corner. I don’t know.

Week 5 of grad school and all is well

The last couple of times I had a hiatus in blogging, it was because things weren’t altogether well for me.

This time, to be honest, I’ve got nothing to complain about. Things are going really well.

I’m going to repeat that, just for emphasis and the sheer joy of being able to say that truthfully and unreservedly, perhaps for the first time since moving out here over six years ago:

Things are going really well.

The last several weeks have been something of a whirlwind; after getting back from Greece I had two papers to finish, a godson’s wedding to hold crowns for, my wife to murder, and Guilder to frame for it — er, wait. That is to say, two days after the wedding, Orientation Week started, during which I had to take a Latin and a Greek diagnostic exam; then the semester started for real, and it was off to the races.

I'm photographing Matthew and Erin being photographed. There's something kind of uncomfortably "meta" about this, don't you think?

Matthew and Erin’s wedding was wonderful; we were in South Bend for the three days leading up to it to help out with various things, and it was a joy to be part of it at every step. Fr. George Konstantopoulos at St. Andrew’s Greek Orthodox Church in South Bend served with Fr. Peter, and this was a lucky match for everybody — Fr. George has decades of experience and knows all of the little things that often get left out in the simplified versions of services that are often done these days. For example, I was a lot busier as the koumvaros at this wedding than I was for another one at All Saints last year — at that wedding, I just stood there. Here, I did the crown exchange and the ring exchange — and let me tell you, I was sweating it during the ring exchange. Oh, I thought. These rings are very small, and my fingers are very big. And all three sets of hands are shaking. If I drop them it will be very bad. Now I remember why I don’t do brain surgery. Fr. George also had the gravity and authority (to say nothing of the beard) that comes from many years of doing this, and it complemented well Fr. Peter’s still-youthful energy (he’s 35, I guess it’s not inappropriate to say that, right?).

The next morning, the newly-crowned Mr. and Mrs. Wells met us at St. Andrew’s for Divine Liturgy, and Fr. George gave them a big ol’ head pat during the announcements — “Matthew and Erin from Bloomington were married here yesterday,” he said, “and this morning they were here for Divine Liturgy. To me, that is an example of what living life as an Orthodox Christian is all about.” His meaning could hardly be plainer had he hoisted a neon sign saying, Please take being here as seriously as they do.

"I need a calculator to adequately express in mathematical terms how much shorter than me you are, Megan..."

Before driving home, we headed to Chicago to see our friend Tessa Studebaker, an old singing colleague of mine from Seattle whom we hadn’t seen since before we moved to Indiana. When I met her ten and a half years ago, she worked at Barnes and Noble for the discount and was still in high school; now she’s in her upper twenties, is a college graduate, took a job in France for a while, moved back, and is possibly getting serious about somebody. It’s incredible to think that the last ten years have gone by so quickly that all of that could have happened, but there we are. It’s even more incredible that the majority of that ten years has been spent here in Bloomington — it means I’ve spent more time here than I spent in Seattle after dropping out of college the first time. It means that the address I’ve had the longest in my entire life (four years) has been here. It means that by the time I’m done with my PhD, I’ll have spent probably over ten years at a place I thought maybe I’d spend three years at the very most.

But enough with the existential pondering for the moment. I guess seeing old friends has a way of bringing that out of me.

Orientation was more or less a non-event; I’ve been here for six years, I know where the library is, my e-mail account hasn’t changed in all of that time, so there wasn’t really any particular novelty for which I required context. That said, a couple of things stick out for me — one, Ed Watts, the Director of Graduate Studies for the History department here (who also happens to be my PhD advisor), strongly impressed on everybody the need to find a schedule for working, a rhythm of grad school life, that gets the job done and can be adhered to, and then to stick to it. Coming from a situation where I was trying to fit being a half-time (or more like three-quarter time) student in around having a fulltime 8-5 job, that advice really resonated with me; I’ve done my best to take it to heart, and I think it’s served me well thus far.

Secondly, I observed this kind of thing while students were introducing themselves:

“Hi, I’m Jacob Goldstein, and I’m doing Jewish history with an emphasis on Holocaust education.”

“My name is Sankar Ramasubramanian, and I’m interested in modern Indian history.”

“I’m Ramon Santiago, and I do early modern Latin American history.”

Do you see where I’m going with this? It seems that who one is can’t help but inform one’s research interests, and the correlation there appears to be entirely natural and predictable. That said, the same correlation appears to be viewed with some amount of suspicion when it comes to Christians doing Christian history. I haven’t directly experienced that among my cohort yet, but I’ve seen it in other contexts, and something I’ve picked up on a bit is a certain point of view, perhaps almost subconsciously held, that can be expressed as, I’m interested in history because I want to prove that everybody has always been as petty, nasty, and not to be trusted as they are now. It’s a fundamental skepticism of humanity bordering on loathing (but ironically, I think its proponents would probably self-identify as humanists), and it seems to cross disciplinary and ideological lines. I’m not exactly sure what to make of it.

My Greek and Latin exams evidently went well enough; for each language, I had three passages, a dictionary, and an hour. In each case I got through more or less the first passage and the first third to the first half of the second. I don’t remember what the passages were, but they didn’t generate any particular concern. I was worried, when I next saw Watts, that he’d get a concerned look on his face and say, “We need to talk,” but that didn’t happen. He just said I did very well with the Greek, and while the Latin wasn’t as good, it was still pretty good. I figured the Latin would be the weaker of the two anyway.

Then it was time to actually start classes.

So, I’m taking three classes for real, sitting in on two, and then doing some individual reading with Watts for one credit. I’m taking third year Modern Greek, a mandatory “Welcome to the History Department” course called “Introduction to the Professional Study of History,” and then a course in Classical Studies where we’re reading Ancient Greek judicial oratory — namely Antiphon, Lysias, and Demosthenes. Modern Greek I have to take for my funding (and I should be doing as much with it as I can, anyway), and then Watts wanted me to take some upper-level Classical Studies courses so I could have a chance to sharpen my Greek a bit. The one credit of individual reading we’re doing finds us reading St. Jerome’s Life of St. Hilarion, so I’m also getting some Latin in this semester. Since I’m ahead of the game a bit in terms of my coursework, Watts thought it was important to give my languages some extra time, and he’s right — it’s been a good thing.

(Watts and I have had a couple of simpatico moments with our iPhones — today, for example, we were reading Jerome and needed to look up a word. I pulled out my sketchy little pocket dictionary, and he said, “I’ll one-up you there.” With a gleam in the eye only recognizable by the fellow geek, he pulled out his iPhone and asked, “Do you know about the Latin Dictionary app?” I didn’t, but within two minutes I had it along with its companion Greek Lexicon by the same developer.)

I’m also sitting in on an undergraduate survey Watts is teaching on the Late Antique Roman Empire, as well as a seminar in Art History called Problems in Early Christian Art. The former is really useful background, and I’m doing it instead of taking Watts’ actual graduate seminar on the same material (since I’m actually at a point where it’s vital I take seminars from people other than him). The latter is a result of recognizing a) that my interests, the way I want to talk about them, are interdisciplinary, and b) given certain realities, I will be best served doing some of the interdisciplinary work on my own time. The course is basically dealing with Christian art up to Iconoclasm; the reading is actually highly useful stuff for me, and I’m learning a lot — certain things I can already talk about are being discussed in a very different context from the one to which I’m accustomed.

Anyway, it’s a lot, but it’s not a back-breaker of a schedule by any means. Yes, it’s a good amount of work, but I’m finding it easier to manage now than I found it to manage less work while having to juggle a fulltime job. It means I’ve had less time for blogging, yes, but it’s been for a good reason. I think I’m at a point where I understand the rhythm well enough that I can post a bit again.

So, in brief, that’s where I’ve been and what I’ve been doing. Coming up, there’s another wedding this weekend, that of a certain Daniel Maximus Greeson and Chelsea Coil, plus I’m also supposed to run a book review for these folks by 10 November. Plus there are any number of other things for me to talk about regarding what I’ve been reading and what I’m thinking about — it’s more “Where the heck do I begin?” than “What do I have to say?” Let me tell you, these are all problems I am thrilled to have.

I will close this post in the manner which I think I may start closing for the time being — that is, with a rundown of what I’ve recently finished reading and what I’m currently reading.

Recently finished:

Currently reading:

Gifted education

Rod Dreher has a post about plans in Louisiana to cut the budget of their residential public high school for gifted and talented kids. Read the whole thing, but these bits stick out to me:

It’s distressing to me how gifted education is typically seen in this country. We tend to spare no expense to provide for the needs of students who are handicapped or challenged in particular ways by the normal classroom experience. But we don’t spend nearly the energy or the money on gifted education — this, even though many gifted kids face their own set of challenges that cannot be easily overcome in a standard classroom. When I was in college at LSU, I remember getting into an argument with a friend over this; he believed that gifted kids had natural advantages by virtue of their cognitive skills, and didn’t need or deserve any special consideration.

I don’t believe that’s true at all. Of course nobody feels sorry for gifted kids, and nobody’s asking them to. The point is that to the extent that it’s feasible, all kids should be in an educational environment in which they can flourish to the extent of their own talents. If a kid cannot do as well as he otherwise could because of a particular learning disability, then insofar as it is possible to accommodate that child’s needs, we should seek to do so. Similarly, though, there are reasons why many gifted kids struggle in standard classrooms, and their needs should not be dismissed simply because of their intelligence. In my case, my grades were good in my old public school, but I struggled with depression because I was such an outsider, and was constantly picked on by the in crowd. The great thing about LSMSA — and I think lots of kids from small-town schools like mine felt this way — was not so much the superlative academics as the great blessing of not having to bear the emotional burden of being bullied and socially marginalized because you got good grades and liked books. […]

[The Louisiana School] was a place where, for the first time, we could feel accepted and affirmed, not marginalized and bullied as nerds and outcasts because we liked books and ideas. We could hardly believe our luck to be living and studying in a place where we didn’t have to keep our heads down and our mouths shut to avoid crossing the dominant peer culture in our hometown schools. When I graduated, I took with me a powerful sense of confidence, of being at home in the world, one that I had not known before. That gift was, literally, priceless.

I’ve talked extensively about my (mis)adventures in higher education (starting here); I’ve not really talked about more, shall we say, elementary matters. I’ll start out by saying that while it’s great that Dreher and people like him have had this experience with gifted education, it is worlds away from what mine was like. Perhaps, like so many things, the best thing to say is that gifted education is one of those things to which you either have to commit fully and do it right, or not do it at all, because to do it in a, uh, half-fast manner will be worse than doing nothing.

I started to learn to read when I was probably 3. My parents claim that nobody taught me how to read; they would read to me, I would memorize the books they were reading to me, and (so they say) I started instinctively linking sounds to text. I don’t know; I don’t remember. I do know that when I was four or five, I was reading, and memorizing passages from, Carl Sagan’s Cosmos.

When I started kindergarten, within two weeks somebody realized that things weren’t quite right. I was given a diagnostic reading test after school one day; a couple of days later, I was told I was being moved up to first grade. What I found out later was that I had scored at the twelfth grade level, and that they had wanted to move me up to fourth or fifth grade. My parents decided that was probably going too far, and agreed to the one-year bump.

First and second grade, to say the least, were a struggle. I had, really, two friends, and they were both two grades ahead of me. (Aaron Spencer and Jamie Metrokas, where are you guys, anyway?) I tended to get along with adults better than other kids. It was really tough for me to stay engaged in class, because I would just read and work ahead very quickly, which of course isn’t what my teachers or fellow students wanted me to do. I would bring other books to occupy myself when I was done with what was assigned in class, which also isn’t what anybody wanted me to do. I soaked up whatever anybody put in front of me, and I had a big imagination that would start transforming the information into other things, too. Numbers I wasn’t (and am not) so hot with, but what that meant (at least up until high school) is that I was done with assignments five minutes before everybody else rather than a half hour.

Also, being less than athletic, I was at once the Smart Kid and the Fat Kid.

Like I said, it was a struggle. I just wanted to read my books, write my stories, and get along with people, and I didn’t understand why it seemed so hard. My first grade teacher told my parents that, realistically, it wouldn’t be until college before I’d really “come into my own,” whatever that really meant.

Just before I started third grade, we moved from Wenatchee to Woodinville, which back in 1984 was a reasonably affluent almost-rural suburb of Seattle (as opposed to the nouveau riche extension of the Microsoft campus that it is now). The Northshore School District, as it worked out, had a (now much re-worked and re-titled) program for third through sixth grade called, prosaically enough, Talented and Gifted (TAG). My parents enrolled me, and hoped that it would mean better things for me.

Eh, not so much. Not really.

The trouble was multi-part. First of all, the program was a “magnet” (in other words, it was based at a particular school and you went there, rather than it being at your home school), and at least when I got started, it was a floating magnet, having been at two or three different schools in the four or five years it had been in existence. So, we were among the “normal” kids, but we were sequestered from them somewhat because we were told we were “different”. That made for a weird, weird, weird dynamic, let me tell you.

For grades four through six, we were at our own school, but it was the oldest and most rundown building in the district (built in the 1920s, had asbestos, fun stuff like that), and we were put there with the special education kids. This posed its own problems — we felt like freaks and afterthoughts, to a certain extent, and there was a certain amount of normal kid stuff which was expressly forbidden specifically because administrators were worried that the special education children might try to imitate us. We were “different”, we were “special”, but the way we were treated, these terms did not appear to mean anything positive. It seemed to mean we were a problem best shoved aside and kept out of the view of everybody else.

As well, we were still kids, and kids will stratify themselves. It’s what they do. We were all theoretically “the smart kids,” so the smart kids separated themselves into “the popular smart kids,” “the not-popular smart kids,” “the smart smart kids,” “the dumb smart kids,” and so on. Because we were smarter, part of what that meant is that we knew how to hurt each other more efficiently. Ever read Ender’s Game? It was sort of like that. Two of my fellow students absolutely brutalized me emotionally on a daily basis from third grade through fifth grade — and I mean they sought me out every free moment they had, and they were as intentionally merciless as they could manage. Their hobby was making me miserable, and they were really good at it. My teachers told my parents on a regular basis that there was nothing they could do about it until it became physical.

Which, at some point, it did, when one morning I got spray cleaner blasted right in my eyes. Then somebody did something about it.

Another practical issue was transportation. We were bused to and from the school; that meant taking the bus on our normal route to what would have been our normal school, and then another bus picked us up there to take us to the host school. Getting home meant a special set of buses. I lived roughly five minutes from the school, but since they were trying to get everybody home from the host school on two or three buses, it took an hour to get home.

Since this program ceased after sixth grade, that meant that we were dumped back out among the “normal” kids in junior high. We had an Honors English and History program at that point, but that was it. In other words, we’d been kept separate from everybody for the last four years, and now were expected to “mainstream” ourselves. Since large groups of kids knew each other from the mainstream elementary schools, knew they didn’t know us, and knew why they didn’t know us, the 5-10 of us former TAG kids were instantly easy targets. Junior high was a long, agonizing three years — it was going to be anyway, of course, but this way it felt even longer.

A lot of us TAG kids wound up doing theatre in high school and thriving in that setting, interestingly enough. Make of that what you will.

All of this is to say, I’ve got really mixed feelings about so-called “gifted education”. I can’t lay all of this at the feet of TAG, necessarily; another issue was home life, which is its own long story. The relevant point I’ll share for the moment is that my parents, while not being stupid people by any means, are very practical people, and it was hard for them to relate to where I was. There were many times where I would tell them something I was excited about, and they would look at each other and say, “Are you sure this is our kid?” As a result, it was difficult for them to know what to cultivate in me or how to cultivate it, or to tell me how to deal with what I was going through.

Maybe the Louisiana School is a place that is able to do it right; maybe a residential high school, rather than a magnet elementary school, is a better way to go. If so, more power to it and to its students — but I’m not by any means going to cheerlead “gifted education” as an absolute.

Various points of interest (or not)

George List was an emeritus faculty member for the Department of Folklore and Ethnomusicology here at IU. He retired in 1976, the year I was born, but still had an office and an assistant here. I first spoke to him over the phone in April, when I started working here at the Archives of Traditional Music; I met him for the first of three times in August. He was ninety-seven years old, he was frail, he was blind, he had been a widower for “seven long years” as he put it, and he had even outlived his son. Despite all of that, he was sharp as a tack, very active mentally, perfectly articulate. He was also clearly very lonely. He made a big impression on me the three times I got to meet him, perhaps more than I realize even now.

He passed away on Sunday, 28 September. It’s affected me this week a lot more than I thought it would — I even had a dream about him a couple of nights ago, although I don’t really remember much about it, beyond wanting to say goodbye in person and trying to go back in time a couple of days by crossing the international date line. Aionia mneme.

Sometimes there are these fortuitous moments of synchronicity that tell you you were supposed to do something. Megan has been toying with the idea of getting an iPod for a little while, and I finally decided it was time to take the plunge and call it a late birthday present (my regular readers, both of them, may recall that I was in New York and she had just arrived in Germany on her birthday; I was so discombobulated that I didn’t remember to say “Happy birthday”, and she was so discombobulated that she didn’t notice). So we bought one of the new 120GB iPod Classics yesterday (it was $250; for reference, my 80GB Classic was $350 a year and a half ago), and after paying for it, the guy behind us in line, noticing we had bought a different pair of earbuds, asked us, “Hey, do you want the earbuds that come with that?”

We shrugged and said, “No, not really.”

“Can I buy them from you for $10?” He was about to spend $30 or some such on a replacement pair — seemed fair to us, and appeared to be one of those moments where the timing just worked out the way it was supposed to.

Of course, once we got home, we realized that we had also given him the USB cable. Oh well — keeps us humble.

Me, Bryn, and some guy at Jeff Fletcher's wedding, 12 September 2005

A year ago today my friend Bryn Patrick Martin passed away from a bad reaction to painkillers. It was a stupid way to go and entirely preventable; I’m still mad at him. He was a constant, and I mean a constant, in my life between 1993 and 2003; this picture of him and me smoking cigars (can’t remember who the guy on the far left was) was taken the last time I saw him, 12 September 2003. His devilish smile (and overall countenance), obscene sense of humor, and most of all his fierce loyalty to the people about whom he cared remain very much missed. Aionia mneme.

I note among the stats for the last few days that for the first time ever I’ve received a hit off of a Google search for “Richard Barrett.” Exactly how this works I’m not sure, since last time I checked, I think it was the 15th page of results before this blog popped up, so the searcher in question must have been very persistent. Not that I care about my Google rank, exactly, but I do hope that people don’t see all of the matches on the first several pages for the white supremacist Richard Barrett and think that’s me. (If he had any idea who I was, he’d probably hope people didn’t get us confused, too.) If people want to think I’m the English composer or the motivational speaker, that’s probably okay. (Maybe not the R&B pioneer or the minor-league Depression-era Seattle baseball player, since they’re both dead.)

YouTube is my Pensieve

It’s amazing the pieces of one’s childhood one can reconstruct using YouTube. Surely, I’m the 9,081,726,354th (and if somebody wants to tell me what’s interesting about that number, you’ll get a classic Marvel No-Prize from me) person to post this out of nostalgia:

But then, surely, there are the things of which you have only vague memories: things you haven’t seen in years, have never heard anybody else mention, that never came out on DVD, and that make you question whether they even existed or you just imagined them.

We got cable when I was probably about five (c. 1981-1982), and I remember this short film that I saw on HBO numerous times. The main image that I remember is this mass of magnetic tape chasing somebody around an office building, and eventually consuming him. Over the years I’ve inquired here and there on the internet to see if anybody knew anything about it; nothing. One or two people thought they remembered seeing part of the film, but had no further information. Well, I finally found it last night. I give you 1975’s “Recorded Live”:

Oh, the mustache. Man.

Two interesting names in the end credits: George Winston and Ben Burtt, Jr. The former is the pianist; the latter would go on to become George Lucas’ main sound effects guy (and creator of the lightsaber noise).

Something else I saw on cable a lot as a little kid was an animated movie called “The Mouse and His Child.” I remember parts of it really messing with my head, particularly a sequence where there’s a can of dog food with a highly recursive label, and the titular characters are trying to figure out where it ends. It has evidently never been released on DVD, and I’ve never met anybody else who remembers it, but here’s the whole thing (and it’s well worth the watch if you’ve got 77 minutes to spare):

“Will that be cash, or –”

“TREACLE BRITTLE!” (smash to head)

Finally (for now), in the late 1980s, the church my mother and I attended for awhile used to show this movie every so often to the junior high kids. The protagonist is a schlub of a guy who leads a very humdrum life (no doubt today he would be played by either Philip Seymour Hoffman or Paul Giamatti) in a very ugly, grey and industrial city. One day a gospel group simply appears in front of him, singing music that makes him happy in a way he’s clearly never experienced before. They disappear, leaving a box behind. He lifts the lid of the box, and their music comes out. He then walks around with the box up to his ear, listening to it wherever he goes, but he doesn’t want anybody else to hear it. One night, the band appears to him again when he’s in bed, and they tell him, “You’re supposed to share it!” To his terror, they start a song, waking up his family. They rush in, wondering what’s going on, and then find themselves infected by the music as well. When he realizes that it’s a good thing for him to let others hear the music of the box, he starts going everywhere with the lid open, and there’s a closing montage of him doing so. The last shot is of him standing on the roof of a tall building, holding the box open for the whole city to hear.

Same deal — I never knew what it was called, never heard of it again after we stopped going to that church, never met anybody who knew anything about it. The images from it have nonetheless stuck with me over the years. I finally found it, and turns out it is called, prosaically enough, “Music Box.” Here is the beginning:

There’s that ‘stache again. Were there special steroids one could feed facial hair follicles in those days?

The whole thing may be found here. It’s just a tick under half an hour.

I will say that whenever the topic of evangelism comes up, the images that come to mind are from this movie. Not the, um, special white tuxedos with fluttering wings, but rather that keeping what we have to ourselves out of fear is missing the point, and that we can shout it from the rooftops in such a way that will be more than just us making pests of ourselves. Let’s not keep our lights under bushels, in other words.

Anyway — watch and enjoy.

On Forgiveness Sunday, the alleged plurality of methods by which one may relieve a feline of its flesh, and other musings

First off, this very important announcement:

My Peculiar Aristocratic Title is:
Imperial Majesty Richard the Mad of Nether Wombleshire
Get your Peculiar Aristocratic Title

Right.

So, Cheesefare Week, as noted earlier, started off with some bad news. I had been obliquely informed about a month ago that good news would come via e-mail, and bad news would come via postal mail; therefore, when I saw the envelope in my mailbox on Monday, I knew exactly what it contained before I even opened it. Bottom line: I will not be a matriculated graduate student this fall. Ricardus est insufficiens petitor (“Richard is an insufficient applicant”).

Exactly what is next for me is unclear. I was instructed to thank God for keeping me from going down this path since He obviously has something better in mind for me, so I’ll start there. There are some well-placed people who have told me they absolutely believe I can do this and want to talk about what happened and what they think I can do from here; I’m more than happy to listen, but in the meantime, I am beginning to consider what my other options are, up to and including the possibility that, being 31, perhaps my window of opportunity just isn’t open anymore.

However, without conceding that point just yet, I can say that I will finish this school year with two years of Latin, three semesters of Greek, and a year of Syriac. By the end of next year, I will have the fourth semester of Greek, a second year of Syriac, and a year of Coptic. I will also begin the St. Stephen’s course through the Antiochian House of Studies in the fall, with a concentration in Byzantine Musicology. This will, eventually, lead to a Masters in Applied Theology accredited by the Balamand Seminary (although I am under no illusions that anybody will take that seriously if I try to pass that off as my only Masters degree). I plan to forge ahead with a project I proposed in my personal statement, since I still think it very much worth doing, and I believe I’ve got the toolbox at this point to at least give it a shot and see if something productive (and perhaps publishable) comes of it. I will perhaps discuss it here from time to time as I make progress on it.

After that… I guess we’ll see. Perhaps there is something to be said for trying to do a Masters at a seminary such as St. Vlad’s; it wouldn’t exactly be ideal financially (depending on what our other circumstances are at that point), but there might be far worse things than working with Fr. John Behr and Dr. Paul Meyendorff for a couple of years. At least given what the current data are, Bloomington does not exactly seem like the place I am meant to thrive, so perhaps once some other things are clear, it will simply be time to move on.

My path has been less than linear up to this point; why should it be any different now?

Meanwhile…  Forgiveness Vespers has been served, and Great Lent is upon us. Forgiveness Vespers is a fascinating service; I’ve never seen the Mutual Forgiveness portion have the same tone two years in a row. Some years it seems quite somber; some years it can seem very chipper and cheery. I’m sure somehow that sets the mood for a given year’s Great Lent one way or the other, but I haven’t yet figured out how. Regardless, it’s a beautiful and moving service — I just wish more people came. It seems to me that if everybody isn’t there, part of the point has been missed.

On a different matter altogether — things like this just make me sad:

The Kingsway Cathedral in the Sherman Hill Neighborhood may be demolished to make room for a gas station and convenience store.

Right, because that’s what the world needs.

And then things like this just make me shake my head: “Satanism in Orthodox Catholicism!” It being Forgiveness Sunday, I suppose my most appropriate reaction would simply be to say, “Forgive them, Lord, for they know not what they do.”

Forgive me, a sinner.

