Lifelong Learning (Blog Style)

February 7th, 2008

I’ve been thinking a bit about the use of blogs in courses. With a few exceptions, I don’t know that I see a lot of mileage in compelling students to keep an individual blog themselves. I really enjoyed some of what my students in my History of Reading class did along those lines, but it was especially suited to that course.

An information-and-news aggregating blog, on the other hand, has some obvious usefulness as long as the students look at it and the professor is nimble enough to make use of what actually appears on it. It does no good, though, if you never bring the material that comes through that medium into the classroom, if it’s just something you ask the students to read without ever doing anything with what they’ve read.

What inspired me to think about this idea in a new way this morning was reading about the announcement of some new research findings that the free distribution of bed nets in parts of Africa seems to be having an impact on malaria. In my courses on the history of development in Africa and on the environmental history of Africa, this was a topic we discussed quite a few times, in terms of the debate over whether free distribution leads to people reselling or devaluing the nets. I was thinking that I ought to go and dig out the student email addresses for both of the courses and email a link to the research.

It hit me suddenly that it could be an amazing “value-added” part of a course if you created an aggregator blog for some of your courses and then committed to maintaining it indefinitely, giving every student who has taken that course authoring privileges. Not a wiki, not a permanent collection of knowledge, but a running update of news, research, and information on the key topics and discussions of the course. I can think of at least six or seven classes that I have taught more than once that would really benefit from this kind of service.

This would be a way to connect alumni and current students, a different approach to “lifelong learning”. Taking a course would bring you into a small but continuously growing virtual community of people who had also taken that course with the same professor.

I grant you that it would be a lot of work for me to maintain these course-related aggregation blogs. I’d need support from the alumni and IT staff to carry it off, at a minimum. The idea really seems attractive to me, though.

One-A-Day: Oona Strathern, A Brief History of the Future

February 7th, 2008

Historians divide themselves by areas and by periods of specialization, but also by the methodological focus of their scholarly work: social history, political history, economic history and so on. This isn’t just an abstract division: it defines the real-world allocation of positions within departments. In many departments today, social historians of some kind or another are the largest plurality, often with cultural history of some kind a close second. Thirty years ago other specializations were more dominant. I tend to think that twenty years ahead, the balance will have shifted again. Partly because I think that knowledge, even historical scholarship, is progressive. Our methodologies do improve over time. Social historians ran into some intractable problems, to which the “cultural turn” in turn responded. In African history, social historians renewed an interest in the colonial state and the concept of “indirect rule”, which created an opening for a new kind of political history. I also think the balance will shift because we relentlessly demand originality from junior scholars, and one way to be original is to dust off an old paradigm that was pushed aside largely for reasons of fashion.

Part of believing that knowledge is progressive, however, is also a belief that some older practices of historical writing fell by the wayside because they were intrinsically weak in some respect, because they couldn’t hold up to sustained challenges from newer methodologies, couldn’t defend against critical examination. There’s a kind of 19th Century narrative history, for example, that is a lot of fun to read today for its literary qualities and lack of inhibitions about things like evidence and truth, but I don’t think scholarly historians are likely to return to writing fabulistic biographies and stirring if largely invented tales of derring-do.

There is a style of intellectual history that has fallen out of fashion, and I’m hoping it largely stays that way. Intellectual history and cultural history are often very closely related styles of scholarly writing, and in many ways, I think what is emerging out of their intertwining is a new hybrid form of historical study that has the strengths of both and the weaknesses of neither, that can study how a particular idea or concept moved in and out of formal thought and expression into wider popular consciousness or practice. Sometimes, though, there’s real value in a narrower focus, in tracing the successive development of a highly particular idea in formal published writing or texts. Say, in an intellectual history of the concept of sovereignty within British political philosophy in the 19th Century.

The variation on that style that I dislike, however, is when a contemporary devotee of some idea or institution writes a triumphalist intellectual history about how the march of time has beat a path to the writer’s very own doorstep. Partly this kind of intellectual history is a scavenger hunt through the past, an attempt to annex notable and famous figures as the glorious forebears of the contemporary practitioner. Partly it smooths out any bumps or disagreements about the practice itself so that the story is entirely about ascendance: anything unseemly in the past history of the idea is something which was reformed or overcome. This kind of history only pretends to be about tracing how a concept or idea changed over time. There is a kind of bad or potted history of science that some contemporary scientists will recite that very much follows this outline, in which today’s practices are the most perfectly perfect of all (until tomorrow), and the past only a record of errors overcome, a history which the present has corrected and absorbed into its own ascendant body.

Which brings me to Oona Strathern’s book A Brief History of the Future. Since I teach a course on the history of the future, I had high hopes for this book. I’m always looking for something that can help the students grasp the overall picture. Web sites like Paleofuture and Retrofuture are pretty much on the right wavelength, but they don’t have the narrative to knit together Enlightenment conceptions of progress, Christian millennialism, high modernist futurism, postwar technosocial optimism, policy-driving futurism, postmodern skepticism, and contemporary talk about the Singularity.

Strathern’s book is definitely not the droid I’m looking for. For one, it’s mistitled. It is not a history of the future as an idea. It is a history of futurism, the intellectual practice of forecasting or predicting the future. That would still be useful to me, as well as interesting, if it had any critical distance at all from futurism. But it doesn’t: the book largely is an attempt to validate Strathern’s own practice as a futurist, both by burnishing her own credentials and by describing futurism in relentlessly whiggish terms, as a practice which has grown more and more professional, credible, useful, precise and focused over time.

So yes, there’s the usual annexation going on here, in which notable individuals and authors in the past are understood not just to be concerned with the future, but to be nascent or founding futurists, the fathers and mothers of a contemporary profession. There is the requisite disavowal of bad futurism, whose errors were not a complex product of how conceptions of “the future” as a concept interacted with a particular moment in time, but were the consequence of an imperfect or unprofessionalized practice of futurism.

When I look at the history of the future as an idea, even just sticking with formal texts written by past intellectuals, policy-makers and so on, I see something vastly more discontinuous and multivalent than Strathern. Apocalyptic and utopian claims colliding and intermingling, both coming out of deep reservoirs of modern Western experience and thought. Ideas about the future becoming the animating spirit of governmental or institutional action, and then falling out of favor. The future as a powerful belief system at one moment and as the target of scorn and cynicism in another.

There’s a history that links Disney with Corbusier very directly, for example, but you’ve got to pay attention to the entirely different registers and institutional worlds that their visions operated within, as well as the alchemical difference between Brasilia and Epcot. You can bring Condorcet into a history that culminates with John Naisbitt, but not if you make the former the noble ancestor of the professional latter: the relationship is far more diffuse and complicated than that.

I found myself especially irritated as Strathern approaches the present, and the contemporaries whom she wishes to compliment and associate herself with. The book seems largely unaware of something that is palpably obvious to me, which is that the expert-driven, policy-oriented futurism of the 1960s, closely tied to the sort of technological optimism that saw us all as driving flying cars, using jet-packs and living on the Moon by 2001, was pretty much thrown by the wayside in the 1980s and 1990s. Partly because its projections and predictions were wildly, amusingly wrong, but also because of some basic shifts in the popular zeitgeist, in the entirety of how progress fit into the self-conception of Western and global society. In the kind of history that Strathern is writing, that wider context doesn’t exist. She knows that past futurism was wrong, but her understanding of that shift is simply that futurists got better at what they do, constrained their predictions more, and can now be trusted not to promise jet packs and a world free of hunger when you hire them to do projections.

The changes in the way the concept of the future was represented, imagined, used and described in the U.S. and Western Europe during the 1980s weren’t just about a reaction, a realization that older projections were factually wrong. The same goes for other moments in this history. There isn’t an unbroken line between Enlightenment conceptions of progress and high modernist futurism: the players were different, the contexts were different, the applications of the concepts were different. The historical relationship is there, but it is complicated, diffuse, a matter of influence and subtle inheritance rather than familial descent.

I think that’s the kind of intellectual history I accept from someone who wants to explore the roots of their own practices in a self-complimentary way. It’s fine to talk about influences, to look for the ways that the past has shaped your own professional and personal worlds. Influence is not ascension, however. This kind of intellectual history recognizes that any contemporary idea has junk DNA in its genes, has unacknowledged ancestral branches full of bastards and incest, that its evolutionary line is a bush and not a spine, and that what that idea is doing right here and now is as much a matter of its nurture in the bosom of the present as its inheritance of the past.

Rules of the Game

February 6th, 2008

Quick: why did you object to the way the Florida recount was handled in 2000, if you did in fact object? Was it some of the Supreme Court justices seeming to contradict their own long-standing principles? Was it the chaos of the recount itself, the lack of preparation for it on the part of Florida? The machinations of one or both parties at various moments? The sense that someone, somehow, had cheated?

I don’t think anyone viewed the way that the election was handled as ideal, but we don’t seem to have learned many general lessons from it. At least, not the lesson that even the perception that someone cheated or broke the rules can have a corrosive effect on the national legitimacy of a political leader.

I hate to keep singing the same tune, but it is incredibly important to me that we have an Administration in 2009 that will bring back a sense of playing by the rules, respecting procedures, and caring about process as much as results. I’m already skeptical about Hillary Clinton in that respect given the kind of campaign she’s run, but if the Clinton campaign continues to maneuver to claim delegates from Michigan and Florida in her column, that would be a final deal-breaker for me, in the sense of my being unable to tolerate her as the eventual nominee at all. I understand that little back-room deals are already being made for superdelegates, as well as various other shenanigans. That’s one thing, it happens, that’s politics. This is something else: going back on a very clear agreement about rules in unscrupulous pursuit of personal political advantage, very much to the detriment of the system as a whole. We’ve had enough of that for the last eight years, I think.

One-A-Day: Norman Rush, Mating

February 4th, 2008

This is an essay on Norman Rush’s Mating that I wrote up for the National Book Critics Circle Board of Directors blog, Critical Mass.

Barbarians at the Gate

February 4th, 2008

I’m not the only one to take note of the New York Times’ baffling decision to review Lee Siegel’s new anti-Internet broadside not once, but twice. Both times, moreover, the assignment was given to reviewers who were clearly predisposed to sharing Siegel’s hostility to all things online and favorable in their outlook towards the author himself. You’d think, if you’re going to review a book twice, that you’d seek a more sharply critical perspective for the second one, just to create something of a debate.

This points to two issues that the Siegel reviews raise, actually. The first is largely specific to the Times itself, and its long-standing attempts to choreograph the conversations of the highbrow American (or at least New York) intelligentsia, to treat itself as the “paper of record” in such matters. It has long-standing print rivals to this role on both sides of the Atlantic: the New York Review of Books, the Times Literary Supplement, the New Yorker, in the 1980s and 1990s the Village Voice (not so much for the last decade), a few other publications and periodicals here and there. There are long-standing ethical questions about highbrow cultural criticism that aren’t limited to the New York Times. Whom should a review editor assign to do reviews, anyway? Someone you know has a favorable take, who will protect the reputation of a favored author or performer? Someone who will do a hatchet job? I don’t think it’s stretching things to say that highbrow editorial staff and their critics have indulged in fairly corrupt answers to these questions from time to time precisely in order to extend subtle lines of authority over the entire enterprise of literary or high-art production. That’s Anton Ego’s world, only more conspiratorial and incestuous: small networks of well-connected intellectuals, artists, publishers and socialites in New York sniping and biting at one another, with the Grey Lady dispensing and withholding its favors in whatever way it deemed necessary in order to uphold its cultural capital.

What disappears as a result is much sense of which reviews might delight, amuse, instruct or provoke a wider educated American readership in far-flung communities who care little for the narrow social universe of the literati. I don’t pick up the Times looking for a hostile review of Siegel to comfort me, nor do I react against it simply because the reviews were positive. What I care about first is simply whether they’re interesting to read, whether the reviewer writes compellingly, whether there’s an original take or appreciation of the work reviewed. I care whether what is reviewed covers a broader or more interesting range of work than the conventional wisdom of a small inbred New York elite might deem worthy of attention. In that context, wasting two reviews on anything short of The Great American Novel is lamentable, even if the editorial context is one in the Sunday section, one in the regular paper. In that context, assigning a review to someone as uncurious and temporizing as John Lanchester, as the Sunday section did, seems a waste of space. Lanchester at least observes that it’s possible that the book isn’t particularly true, though he does so in the most mealy-mouthed way. The fact that he picks up on Siegel’s anger and then never really seems to think about the possibility that virtually every charge Siegel levels against online discourse could just as easily be argued to be self-portraiture struck me even more forcefully. “Why is it,” asks Lanchester, “that the Internet seems to make so many people so angry?” I don’t know. Maybe, just possibly, it doesn’t. Maybe it’s mostly Lee Siegel that it makes angry? Lee Siegel, who performed as an “isolated, elevated, asocial individual” drawn to online discourse (and anonymity)? I don’t think it’s much to ask that at least one reviewer consider more fully the lack of introspection and discovery in Siegel’s book. Maybe to do that, you’d have to be someone who knows the online world better than either of the Times reviewers do.

This goes to the second problem with the Times, one particular to these two reviews rather than generic to the ethics and aesthetics of how it assigns reviews in any case. I think it’s fairly clear by now that the New York Times sees itself as one of the leaders of the charge against new media, as the fortress of mainstream journalism. Yes, the paper has finally gotten its head on straight about its online availability, but it’s also been aggressive about handing the ink microphone over to literary lions and fellow journalists so that they can moan and complain about this brave new world of blogs and Web 2.0. I met a Times reporter last year whose work I really respect, and with whom I had a fascinating, interesting conversation. When the conversation briefly turned to the revenue situation of the major daily newspapers, I was fairly startled by the reporter’s bristling and unreserved hostility towards digital media, just because he seemed so much less reflective at that moment. I get that: people’s jobs and livelihoods are on the line, and an old industry is dying, at least in the form that most of its workers have known it. That doesn’t often allow for much perspective.

The New York Times and every other major daily is going to have to think about what its core business really is, about what it needs to be doing that no one else can do. I think that comes down to reportage. The online world can’t produce original, eye-witness accounts or in-depth research about the major and minor events of the day, for the most part. That takes money, it takes an organization, it takes experience. It takes reportage being your job, not just something you do on the side. The new online media lack all of those attributes and will continue to lack them. Readers will continue to pay for reportage. Maybe the revenue model will be different in thirty years, but there will be a market for original information.

What we won’t be paying for (at least not much) in thirty years is literary and cultural reviews and op-ed pieces. Not just because better can be had already online, in many cases, but because the old media ill-serves educated readers in those areas and has always ill-served them. This brings us back to the ethics and aesthetics of the closed world of the editorial elite and the literati that used to exist unchallenged. Now we have choices, and our choices will proliferate still further as time goes on. We don’t have to settle for the choices that come out of the small incestuous circle-jerk of New York editors, from their dispensing of favors through their immediate social networks.

That in the end is what made Lee Siegel so furious, as Ezra Klein noted at the time of the original “Sprezzatura” affair. He’d been handed a microphone, because he was an already-anointed cultural critic of note within those small social worlds. But a wider world of readers thought some of his cultural criticism to be at best silly, peripheral, oddly eccentric and strongly self-indulgent. (Siegel himself, judging from his Sprezzatura comments, imagined himself to be a strongly original, gutsy, and imaginative essayist.) He was given the stage and a big introduction, only to find that most of the audience had left the building, and those few that stayed threw rotten tomatoes. That’s a long way from getting a seat at the Algonquin Round Table.

So no wonder there are others in that small world who feel sympathy for Siegel and praise his rage against the Internet. They’ve got a union card for a closed shop that once had a monopoly, but suddenly the world is full of little entrepreneurial factories churning out commentary and reviews that are more readable, interesting and diverse than most of what the big outlets publish for commentary and cultural reviews. They have to earn their audiences for the first time in their lives, rather than just suck up to some latter-day William Shawn. So they’re not about to consider that the angriest, most isolated, most asocial person on the Internet in his day might have been Lee Siegel himself, that the skunky odor around his TNR column wasn’t generated by his detractors but wafted from the main entries, and that the main thing being destroyed, at least as far as cultural criticism goes, is a tottering, threadbare cocktail-party monopoly built on self-congratulation.

Now That I Like

January 31st, 2008

Eddie Izzard, apparently, will be voicing Reepicheep in the upcoming film version of Prince Caspian.

Process, Evidence, Closure

January 31st, 2008

There’s an interesting, complicated discussion of sexual assaults at Swarthmore in this week’s student paper, stemming partly from the case of a student last semester who was accused of having committed assaults and has not returned to campus this semester.

One of the consistent things that many students concerned with the issue say they want is a more open campus dialogue about the general issue of sexual assault but also some kind of resolution or closure on this specific case. As a contribution to dialogue, I want to point to one specific issue at the heart of much of the expressed dissatisfaction.

The students interviewed in the article seem generally satisfied with the counseling and support services provided by the college to victims of assault, and with the degree to which these services lay out some of the options and choices that victims have about how to deal with what’s been done to them. The issue for some is with what the college does in judicial or punitive terms.

In this case, it seems that the victims did not want to bring the accused into the judicial system outside of the college. As the article points out, there are a lot of reasons why they might not want to. One would be the difficulty of having to narrate and thus relive the incident itself in an institutional environment that is anything but consoling or supportive, where there is an expressly adversarial dimension to the way that their story gets authenticated as legal truth. The lawyer for the accused tries to find a way to discredit the accuser’s narrative, and the system itself by its nature is supposed to be scrupulously neutral about the truth of the narration until a jury has found it to be true beyond a reasonable doubt.

If we can step outside the usual terms of the acrimony about “political correctness” and so on, into the deeper waters of how we think about narration, memory and experience, I think just about anybody, of any political leaning, could acknowledge that stories and memories have immense human power, and that the circumstances under which we tell our stories transform us as individuals, and make our lives either deeply fulfilled or despoiled. A story I tell in an intimate moment that connects (and exposes) me to another person is very different from a story I am required by law to tell in front of an indifferent audience, even if both stories are exactly the same.

So I understand the reluctance to enter into the judicial system, or even a quasi-judicial campus system for this reason and others, including potentially a sympathy for the life and future of the accused. A victim who knew her attacker could well wish that her attacker be compelled to some kind of responsibility or understanding without wanting him to spend years of his life in jail.

But that’s the problem: we don’t have a middle ground in which we can compel an individual to do some things without having to go through a judicial process, nor should we. If we understand why it is emotionally traumatic to have to testify in front of an indifferent public institution, then we should also understand why compelling people to enter into therapy, confess their crimes, sit in workshops, or be marked forever in civic records as a transgressor against the community is serious business. You can’t say “I understand why you don’t want to testify in front of a court” and then say, “It’s only a workshop, it’s only therapy, it’s only a notation on a transcript: it’s not a big deal. It’s not jail”. It cuts both ways: it is a big deal. For the same reason that the trauma of having to narrate private experience in public is a big deal: because all of those things forge a compulsory relation between private selfhood and a public transcript of experience. It is something to which someone is subjected.

If those consequences are a big deal, then there’s got to be a due process for arriving at truth that allows for public scrutiny, that protects the rights of the accused as well as the accuser, that starts as neutral towards the question of what happened, if not at all neutral in its views of the meaning of the crime itself. E.g., an institution can have a prior belief that sexual assault is an extraordinarily serious violation which it will pursue vigorously, but when there is a question of consequences for an accused person, it has to start as neutral towards whether that accusation in that instance is true and it has to ask of the accused that they be willing to step outside of the private support system into a public judicial system charged with making a finding of truth.

Maybe the formal judicial system in the United States isn’t the best model for determining that kind of truth. There are a lot of reasons to question it. But some of its basic requirements strike me as indispensable: that its workings be public, that it be neutrally disposed at the outset towards the truth of any given accusation, that it have persistent rather than ad hoc procedures. I don’t have any problem with a private college privately deciding that it no longer wants to enroll a given student for almost any reason. That comes along with deciding freely whom you admit and don’t admit. But if what people want instead is a permanent note on a transcript, a compulsory requirement to attend counseling, a judicial-type consequence, then there’s got to be an established judicial procedure which by its character is necessarily completely different than a support system. On one level, I can’t help but feel that some of the concerned students want the college to impose judicial-type consequences for assault without the accusers having to undergo judicial-type scrutiny, to derive punishment and consequences out of the necessarily private, non-judgmental, emotionally healing logics of a counseling and support system.

There are really good reasons not to get those two things intertwined, even when the only judicial consequences being sought involve counseling, therapy or even mandatory dialogue between the accused and accuser. I think that one of the victims is right when she says of the accused, “this kid needs some serious counseling”. But I wish she could see why administrators said that there wasn’t any way to make that come about. That can’t happen unless a serious judicial procedure happens, and if the accused isn’t here, there isn’t any way to compel him to come here and be a part of that procedure without turning to the court system outside the college. Because even in a very small way, that deprives that person of his liberty. There isn’t any way to investigate that doesn’t require the public collection of testimony, and public scrutiny of the same. The victim observes that the college told her that either she and the other accusers “do something or nothing happens”. Yes, that’s right. That’s how it needs to be.

One-A-Day: David Weinberger, Everything is Miscellaneous

January 31st, 2008

Cory Doctorow makes a lot of sales to me through his recommendations on Boing Boing. He tends to have an eye for things that I at least think I’m interested in. Sometimes, though, I feel a bit let down, more like “I gave a little bit of money to one of Cory’s friends (which seems an ok thing to do)” than “He’s right, this book or graphic novel is really compelling”.

Weinberger’s Everything is Miscellaneous feels to me more like one of the former than the latter. The book strains to say something new about digital search and digital knowledge. It also has the obsession that some of the digerati have with proclaiming the digital as a utopian revolution against an old order. It’s not really thinking in original ways about the history of categories, typologies, information hierarchies, catalogs and so on: we get the obligatory fly-by of Plato and Aristotle, sure, but not much in-between. It’s only at the end that Weinberger even asks the question: if knowledge is intrinsically miscellaneous, why have we had such a long interregnum of typology, taxonomy, and classification? What he offers is a three-page potted Enlightenment-style fable about how we fell from miscellaneous Eden through the original sin of some old thinkers and now can glimpse utopia once again.

It feels as if he’s selling something. This is one thing that really wearies me about a certain kind of writing by the digerati. It often reads as if the child-catcher from Chitty Chitty Bang Bang has just pulled up and invited me to hop on his caged bandwagon. On one level, Weinberger is just preaching the gospel of Web 2.0, and I’m pretty much inside that revival tent myself. When he describes four key strategies (filter on the way out, not the way in; associate a piece of information with as many classification systems as possible; everything is metadata and everything can be a label; give up control), I pretty much agree with them all as approaches. I just don’t accept them as inevitable, universal and ubiquitous.

What irritates me is the either/or character of his presentation, which is one of the basic attributes of digerati manifestos. You’re in or you’re out. This is the way to do it, all other ways are bad. Thank god technology is at last liberating information and knowledge. There are no choices to be made, only discoveries of the one true way. People who feel confused or alienated by a Web 2.0 environment are just fossils. Information is miscellaneous, in Weinberger’s description. Reality is being unveiled at last.

It reminds me a bit of naive holism, of flip dismissals of “reductionism” in knowledge systems. Saying that you’re against reductionism or for it is like saying that you’re against breathing out carbon dioxide but very much in favor of breathing in oxygen. We confine or expand the questions we’re asking of the world situationally, in dynamic relationship to other people’s questions and our own purposes of the moment. Today I may compress some heuristic boundary I’m using, tomorrow I may discard it, and I’ll be perfectly right to do so both times. The same goes for information and knowledge as “miscellaneous”. Today I may classify from the top, tomorrow I may tag from below. Today I may want to be in a narrow, tightly-bounded conversation with a limited number of specialists who are following disciplinary constraints; tomorrow I may want to drift on the ocean of humanity’s digital sea, seeing what I pull up in my net.

The problem with old expert-driven bibliographic control or academic disciplinarity is that its strong correspondence with institutional organization made it seem both natural and essential to its practitioners, rather than a strategic tool adopted at certain moments to heighten the generativity or focus of knowledge-production, and it encouraged highly specialized knowledge practices to claim the right to dominate public decision-making and everyday forms of knowing as their birthright. Old practices of cataloging and disciplinarity kept scholars and experts from remembering that those practices were provisional, tactical responses to knowledge production.

Weinberger is right that everything can be and often should be miscellaneous, as he describes it. The problem is that he goes well beyond that to proclaim this as manifest destiny: “traditional trees”, as he puts it, have been “useful”, but it’s rather the same way that we might say that horses were useful for getting around before the internal combustion engine. Knowledge, in his view, is not organized. This is a Platonic claim about the essential character of all knowledge, at all times. When it’s organized prior to use, that’s a false, misshapen imposition. No capacities, abilities, or possibilities are lost in the recognition of the truth of knowledge’s miscellaneous character.

“In the miscellanized world, every idea is discussed, so no idea remains simple for long.” (p. 213) Doesn’t that just warm your heart? I feel as if Tiny Tim is about to yell out, “God bless us, everyone”. But when there isn’t any discussion anywhere of the disadvantages, the problems, the practical challenges, the downsides of what Weinberger calls “the third order”, when there isn’t any kind of sophisticated investigation of why past systems for organizing knowledge came to exist, then I’m uneasy even if I’m interested in and open to what he’s peddling.

The Loneliness of the Long-Distance Independent

January 29th, 2008

In American politics, independent voters are magical creatures. They are invoked, summoned, conjured with. Ritual sacrifices are made in their name. Candidates rub the fictive head of the Independent Voter for good luck. Never so much during primary season that the candidate loses the fidelity of his party constituency, of course. Unless it’s an open primary.

In reality, of course, the Independent Voter isn’t any single thing. There are voters who are unaffiliated out of casual disengagement, and those who adamantly refuse party affiliation as if it were poison. There are conservative independents, liberal independents, radical independents, libertarian independents. There are independents who barely bother to vote and have no real interest in the political system and independents who are as passionate about American politics as the most dedicated ward captain.

The independents whom some of the candidates are courting, though, do strike me as having some roughly similar views about politics, if not specific policy or ideological positions. As an independent myself, I’m conscious that I have some of these root-level, basically emotional, orientations. At least since the mid-1970s, these independents have been more attracted to the personal character, leadership style, and rhetorical mode of a candidate than to a match between the independent’s own specific convictions and the candidate’s declared positions. If a candidate seems fearless, bluntly honest, open-minded, willing to buck the conventional wisdom (particularly party-line ideology), they’re attractive to this kind of independent sensibility.

This explains why independents are easily seduced and abandoned, and stumble from one jilted political relationship to the next. (I very much include myself in this indictment.) Pundits have been using the “wouldya like to have a beer with the candidate?” test to evaluate the general likeability of political figures. For some independents, that’s not the test. Instead, it’s about looking in the mirror and asking our reflections, “Who is the fairest of them all?”. We flatter ourselves and imagine that we are idiosyncratic in our loyalties, persuadable by rational argument, willing to side with truth wherever we find it, uncorrupted in a world full of corruption. And so we ask of a candidate: are you like that, too? A lot of independents are looking for the latter-day fantasy version of Harry Truman, so different from his historical reality.

Garry Wills in Nixon Agonistes scored a direct hit on the “objective” liberal intellectuals of the 1960s and 1970s, and the manner in which they delivered obsequious compliments to their own judiciousness and lack of ideology. We all have ideology, even the independent, and ideology is not some shameful fetish that one keeps in a locked box in the closet. If you don’t like that word, try philosophy or theory. We all have first principles from which we reason or feel or find our way to specific convictions and outcomes. The person of pure reason who comes to every issue as innocent as a newborn is not just a fiction, it’s a stupid and unattractive fiction.

That said, I do like the idea of being able to change one’s mind, of being open to unexpected evidence, of seeing things from multiple perspectives. I tend to look at politics the same way Jane Jacobs looked at cities, as something that grows organically out of experience and usage. The strong party or movement loyalist looks at politics the way that Le Corbusier looked at cities: as a thing to be built by rigid principles, and damn people if they’re too stupid or recalcitrant to live in the city of tomorrow the way that they’re supposed to.

So I’m an independent, and I want my preferred candidates to exhibit some of the outlook of the independent. If I turn on a candidate I’ve liked, it’ll often first be because they violated some crucial part of my expectation for independent behaviors. Take John McCain. I actually did like him somewhat in 2000, so I didn’t care so much about the fact that his open, declared political convictions were a million miles away from mine, and likely to violate many things that I believe in. But he first lost any hope that I might ever vote for him not because I woke up and paid attention to what he actually might do as President but because he fawned and cringed like a whipped dog at the feet of his master, because he went crawling to people and interests who had treated him so badly. The independent fantasizes that his ideal candidate will stand proud even if that means not winning, at least if the stakes are high enough and the principle important enough. When you get on a stage and give an enthusiastic hug to a man who slandered your family and yourself, not to mention a man who is dragging your country’s reputation down into the mud, you lose the independent voter.

This of course is all stupidly macho as a way to think about politics: it’s some kind of ur-brain thing about honor and loyalty and courage and so on working its way up into precincts that should be making other kinds of judgments. If McCain’s 2000 candidacy had ever gotten past the point of the protective cloud of media hype that burnished these images to a sheen, I think I would have rejected him for other reasons, namely, the things he’d actually do as President. That’s something else that happens to independents: they fall for image, and then when they get a look at reality, they realize that they’re better off with some dull old political hack. Because the person they thought they liked is either an extremist or a screaming lunatic or just a mildly charismatic hack. Or is just a propped-up media darling whose missteps and dirty laundry are being obligingly hidden by reporters who also have the same fetishes as independents.

A strong party-line voter seems to have an easier job of it, at least on the surface. You get out your checklist, run down it, and when you have the best match, you’ve got your candidate. In reality, it doesn’t work out that way. The independent has to work backward from what they perceive to be the personal and ethical qualifications of a candidate to a match on political positions. The party-line voter has to work from a match on positions to whether or not the candidate has the skills, charisma and ethical consistency to carry out the political program. If you’re a progressive at the left end of the spectrum, you might find Kucinich matching you closely on paper, but judge that in the end, he wouldn’t be able to institute any of the positions you value. Maybe for the same reason, you end up backing Clinton even though on paper, she’s not really that progressive on many issues.

I don’t think that my type of independent is wrong to want what we want (in either ourselves or our candidates) even if we overestimate its value and our own personal ability to live up to that expectation. It is important to think past the image. It is important not to fall too deeply into man-on-a-white-horse messianism, to forget that a candidate is going to have to govern within a two-party system, to play small ball as well as throw a few Hail Mary passes. It is important not to forget about the value of commitments to core political values.

But I think it’s worth remaining an independent in sensibility and outlook, too. Not just because the party-line candidate is business as usual in a time when we need something else (don’t we always need something else?). But also because the party-line candidate can’t even be counted on to deliver business as usual as a function of their superior ideological discipline or specificity. How much has a Democratic Congress delivered in the last two years in terms of Democratic positions? (Assuming there is such a thing amid the contradictions of the contemporary Democratic Party.) I keep hearing that building a big tent is a fool’s errand, that persuasion is for chumps, and that all we need is a leader with the correct checklist and the will to fight without compromise or hesitation for it. If that were sufficient, things would already be far different than they are. Independents have one set of illusions, party loyalists and activists another.

Grubeus Shagrid, At Your Service

January 28th, 2008

Low-energy day today: I spent a good part of yesterday playing the part of Shagrid, distant cousin to Hagrid of Harry Potter fame, convening an American expansion of the famous Hogwarts School. This was the consequence of my daughter’s request for a themed birthday party. One thing I discovered: it’s hard to find a fake beard in the middle of January. Another thing I discovered: if you spray-paint a grey wig brown, it will smell so toxic even after drying for three days that you won’t be able to wear it. A third thing: magic potions made from vinegar and baking soda are surprisingly sticky when they overflow and then dry on the dining room floor. But it was good fun.

My mom happened to bring along some of my old schoolwork from first through third grades. Reading through a stapled-together volume of “Monster Stories” I wrote when I was nine, I came across the following, in between various stories about monsters robbing banks, pushing other monsters off cliffs, and getting into arguments with witches. See if you can guess what year it was.

——————–

Monster News

MONSTERGATE

President Monster has tapes!
They could be the answer.