God of the Machine – Culling my readers to a manageable elite since 2002.
Jul 08 2003
 

Two of my favorite libertarian bloggers have squared off over the meaning of the Commerce Clause. In this corner, from the Cato Institute, Radley Balko:

Nearly every libertarian interpretation of the Constitution I’ve read says that the intent of the Commerce Clause was to facilitate commerce between the states, not to inhibit it. It was meant to set up a kind of “free trade zone” between the states. So if Mississippi, for example, wanted to tax every boat carrying cotton not grown in Mississippi traveling down the Mississippi River, Congress would have the authority to intervene. I’ve never read a libertarian interpretation of the Commerce Clause that says it should be interpreted to mean that Congress can tell businesses how they can or can’t solicit customers.

In fact, most libertarians agree that the only Supreme Court case to correctly interpret the Commerce Clause was the very first to come across its desk — Gibbons v. Ogden in 1824. There the Court struck down a New York law attempting to establish a monopoly on steamships traveling between New York and New Jersey. Chief Justice Marshall recognized that the Commerce Clause applies only to the trafficking of goods between two or more states, and also that Congress had no power to regulate commerce within a state (he refused to allow Congress to enforce quarantine laws before or after a steamship docked within a particular state, for example).

And in this corner, late of Cato, now with Reason magazine, Julian “the Apostate” Sanchez:

Well, you know, I don’t much care about the “libertarian” or “non-libertarian” interpretation; I’m more curious about the correct interpretation. And if the “libertarian” interpretation insists that, contrary to appearances, the Commerce Clause does not empower the federal government to “regulate Commerce… among the several States,” then the “libertarian interpretation” is wrong.

Oh, I fully agree that the abuses Radley goes on to cite, wherein “commerce” is read to mean “manufacturing” or “anything that might affect commerce” or “anything Congress feels like passing a law about” are ultra vires. But that’s not what we’re talking about here, is it? This isn’t someone growing wheat on his own farm, or insisting on a 50 hour work week in a local factory. We’re talking about folks in one state calling up folks in another state to carry out a business transaction. If that’s not “commerce… among the several States” I don’t know what is.

The ostensible bone of contention here is “don’t-call list” anti-telemarketing laws, which Radley opposes and Julian doesn’t object to very vociferously, but the more important issue, as they both recognize, is the scope of the Commerce Clause. What we really have on display, however, is two conflicting theories of Constitutional interpretation.

Radley, as he acknowledges offhandedly, subscribes to original intent, appealing to “the intent” of the Commerce Clause to facilitate rather than inhibit commerce between the states. Julian is a textualist: he argues that the power “to regulate commerce among the several States” is, well, exactly that.

Now much as I would like to read the Constitution as debarring all Federal economic regulation whatsoever, I’m afraid Radley’s dog won’t hunt. Laws and constitutions are written by committees, and ascribing intent to groups is a dubious process indeed. If Jefferson, Madison and Hamilton quarreled over the meaning of a particular clause, as they often did, whose “intent” carries the day?

Original intent theory, if we assume that “intent” can be established at all, perversely privileges thought over deed. We are supposed to concern ourselves not with what the Constitution actually says, but with what the Founders thought it said. Even as a principle of literary interpretation, where we usually have only one author to concern ourselves with, this is unsustainable.

I of course agree with Radley and Julian that the Commerce Clause is a painfully slender reed to suspend a full-featured welfare state from, or, as Julian puts it, “that commerce means, as it’s been read to mean, ‘anything that seems like it might possibly marginally affect commerce if you squint real hard.'” Radley complains, “How can you say, then, that Congress doesn’t have the Constitutional authority to regulate the airlines, broadcast media, or any business, really, that has franchises in more than one state, or that does business with other businesses in more than one state, or that does anything at all that even remotely affects commerce in more than one state?” The answer is twofold. First, there is a principled difference between regulating interstate commerce and regulating anything that might conceivably affect interstate commerce. The Supreme Court eventually agreed in Lopez that there are some interventions that even the Commerce Clause cannot justify, like federal laws against guns near school grounds. Second, some laws have to be fought on substantive rather than Constitutional grounds. As Eve Tushnet likes to say, Repeat after me: Not all bad laws are unconstitutional… not all bad laws are unconstitutional…

(Update: Jonathan Wilde comments.)

Jul 07 2003
 

(Warning: Meta-content ahead.)

I feel guilty when I go a day without posting, and I’m not the only one. My friend Mark Riebling says that he considers skipping a day on his blog a moral failure, which gives him a lot to answer for, although in Mark’s defense he has book contracts to fulfill and a full-time job to hold down. Say what you like about Bill Buckley’s writing — the man, well into his 70s, still hies himself to the keyboard seven days a week to knock out the daily theme. We can all admire him for that, if nothing else.

Now why should I feel guilty when I don’t blog? Why, when my girlfriend comes home from work and asks, “Did you blog today?” do I feel compelled to mumble something about “working on a couple big posts,” all the while feeling cut to the quick?

It’s not as if I owe my readers anything. Don’t get me wrong: I love, I positively adore, every last one of you, but you get what you pay for, after all. Character is habit, as Aristotle says, and the task is the thing. You set yourself to write a blog. Mirabile dictu, a few people come to read it, but that’s beside the point. Maybe other bloggers are different, but I would feel the same way if I resolved to keep a journal and then didn’t write in it every day (and I have) or if I tried to put up a shelf or assemble a piece of furniture and wound up punching some holes where none should have been (and I have). It doesn’t matter if nobody reads the journal or sees the holes. You know the holes are there, and it bothers you, or it should.

So will I be posting every day? Not necessarily. Poor Brian Micklethwait must rue the day he made that deal with the devil, although he has kept his end up impressively, if imperfectly, considering he runs another blog and contributes to a few more into the bargain. But I do promise you this: I will be wracked with guilt on the days I don’t.

Jul 04 2003
 

Of the numerous cyber-eulogies for Katharine Hepburn, one of the best is Colby Cosh’s, describing her beauty as “harsh,” which is exactly right, and the sense she gave of being “bound by no known rules, certainly not those of fashion or politesse,” which is true but incomplete.

Watching Hepburn in comedy is like watching a great athlete. Kobe, Jordan, Gretzky seem to occupy some interstice of time, inaccessible to the rest of us, that gives them an extra half-second to decide what to do. Hepburn, in the same way, always seems to buzz in some strange interstice of social relations, as if she already knows what someone is going to say, pronounces herself bored with it, and goes off on a tangent before he even opens his mouth, leaving him gasping for air. (I’m thinking of Bringing Up Baby and, especially, Holiday.) The conventional characters call her dizzy, when in fact she is dizzying.

Colby also amusingly cites a report from a local AM station that Audrey Hepburn had died, for the second time. Well, I used to know someone who thought there were three Hepburn sisters — Audrey, Katharine, and Tracy.

Speaking of Tracy, I have to take issue with Colby’s parenthetical remark that he could be “trusted to be big-hearted enough to slump back in his chair and enjoy the show. He seemed perfectly comfortable in the presence of a female superior.” This is exactly backwards. No male lead could be perfectly comfortable in Hepburn’s presence, and Tracy least of all. The comedy, on the contrary, derives from his acute discomfort, from Hepburn’s awareness of it, and from her futile attempts to mollify him, like her disastrous essay in making breakfast in Woman of the Year. Tracy and Hepburn always make up in the end, of course, but it is an uneasy alliance, and Guess Who’s Coming to Dinner gives a nice picture of what their marriage might be like, twenty years on.

Of her male leads it is Cary Grant, not Tracy, who is, if not exactly comfortable with Hepburn, at least insouciant and urbane. He makes a few half-hearted attempts to restore sanity to Bringing Up Baby, but about halfway through decides to throw up his hands and just watch the show, and by The Philadelphia Story he has stopped trying altogether. Grant is a peculiarly affectless male lead, always giving the impression that sex would be fine, only it’s so much bother and he might muss his hair. This lends a certain chilliness to his collaborations with Hepburn, brilliant as they are, and is why they will never be beloved, as Tracy’s are. He is only outrun, while Tracy, the endearing palooka, is outclassed, but neither one could keep up with her. No one ever could.

Jul 02 2003
 

It has come to my attention that in certain unswept corners of the Internet I have acquired a reputation for knowing everything. This is untrue. I have forgotten the specific gravity of feldspar, and I never learned how to program COBOL. My Welsh is also terribly rusty. Everything else, I know.

Jul 01 2003
 

When a true cult appears in the world, you may know it by this infallible sign: that it sells taped lectures to the faithful at exorbitant prices. Literary critics, who usually lecture for a living, are the curious exception, lacking the shrewd understanding of price elasticity that the religious cults, the philosophical cults, and the buy-real-estate-with-no-money-down cults all seem to share. Maybe C.P. Snow had a point about the rift between the Two Cultures, at least between literature and economics. Maybe the cult critics simply didn’t care for money. But they missed out on a serious marketing opportunity. Who among the acolytes of F.R. Leavis, John Crowe Ransom, T.S. Eliot or Yvor Winters wouldn’t shell out the big bucks for the lectures of their favorite on cassette?

What becomes a cult critic? Evaluation, above all. For most of the last century instruction in literature aimed at producing someone like the befuddled art critic in the old New Yorker cartoon who says, “I know all about art, but I don’t know what I like.” It was possible, in my student days twenty years ago, to major in English without once being told why we were reading the writers we were, instead of some others. One of the epigraphs to Leavis’s The Common Pursuit is from Robert Graves:

At the end of my first term’s work I attended the usual college board to give an account of myself. The spokesman coughed and said a little stiffly, “I understand, Mr. Graves, that the essays that you write for your English tutor are, shall we say, a trifle temperamental. It appears, indeed, that you prefer some authors to others.”

Cult critics distinctly prefer some authors to others. They usually arrive on the scene by dynamiting an established reputation. Ransom lays waste to Shakespeare’s sonnets (the whole essay isn’t online, but an excerpt, on Sonnet 73, is here). Leavis writes that Milton “has forgotten how to use the English language.” Winters reads nearly the entire 18th and most of the 19th century out of the poetic canon. English students are starved for this sort of thing, and they flock.

Some of the best passages in the cult critics are the demolition jobs. Winters on Yeats, for instance:

Yeats’s concept of what would be the ideal society is also important. Such a society would be essentially agrarian, with as few politicians and tradesmen as possible. The dominant class would be the landed gentry; the peasants would also be important, but would stay in their place; a fair sprinkling of beggars (some of them mad), of drunkards, and of priests would make the countryside more picturesque. The gentlemen should be violent and bitter, patrons of the arts, and the maintainers of order; they should be good horsemen, preferably reckless horsemen (if the two kinds may exist in one); and they should be fond of fishing. The ladies should be beautiful and charming, should be gracious hostesses…, should if possible be musicians, should drive men mad, love, marry, and produce children, should not be interested in ideas, and should ride horseback, preferably to hounds. So far as I can recollect, the ladies are not required to go fishing.

Eliot, who is temperamentally incapable of such viciousness, must be read out of the ranks of the true cult critics on that account. He sets himself up as a defender of “tradition” and can scarcely bring himself to pronounce that certain works that have been read for a long time are just plain bad. Calling Milton “magniloquent” is as much vitriol as he can muster. Too much hedging will never gather you a proper cult, and when it comes to hedging Eliot had no peer.

Cult critics are all hedgehogs, not foxes; they have one big idea and they beat it senseless. Leavis takes dibs on “life,” Winters “moral judgment,” and poor Ransom is left with “structure [the argument] and texture [the images],” which is dualistic, to begin with, and dualism is no way to run a cult. In any case it bears too much resemblance to the ancient Horatian formula that a poem must “teach and delight” to excite the unquestioning allegiance that the true cult critic demands. Ransom was also an extremely polite Southerner, and politeness, in this league, will never do.

This leaves only Leavis and Winters standing as the preeminent cult critics of the 20th century. They have in common a finely honed sense of persecution at the hands of academia. Although Leavis spent most of his career at Cambridge and Winters at Stanford, each considered himself disastrously underappreciated, and with reason. Leavis was well past 40 before he secured a permanent position, despite an impressive list of publications. “They say I have persecution mania,” he remarked. “Comes of being persecuted, you know.” Winters’ plaint at the end of his last book, Forms of Discovery, could serve almost as the cult critic’s motto:

It has been a common practice for years for casual critics to ridicule my students in a parenthesis; this has been an easy way to ridicule me. And the sneer is the easiest of all weapons to employ; it costs the user no labor, no understanding, and I should judge that it raises him in his own estimation. But I think the time has come when my faithful reader may as well face certain facts, no matter how painful the experience: namely, that I know a great deal about the art of poetry, theoretically, historically, and practically; that a great many talented people have come to Stanford to work with me; that I have been an excellent teacher; that six or seven of my former students are among the best poets of this century; that some of these and a few others are distinguished scholars.

Loyalty, clearly, flows top-down as well as bottom-up. Winters was very near death when he wrote this, and it’s true, actually. It’s true! His students included J.V. Cunningham, Edgar Bowers, Thom Gunn, Scott Momaday, and a host of minor figures. Still, your impulse is to close the book out of embarrassment.

Cult leadership is lonely work, and Leavis and Winters were both blessed with helpfully literary wives. Mrs. Winters was Janet Lewis, a distinguished poet and novelist (The Return of Martin Guerre) who didn’t care much for disputation but reliably backed her husband in public. The famously truculent Mrs. Leavis, known to her husband as Trixie, to the Leavisites as Queenie, and to the reading public as Q.D., was another matter. Her Ph.D. thesis, Fiction and the Reading Public, is still cited today. With her husband, she co-edited Scrutiny, the house organ of the Leavisites, for its entire 20-year run, and she was widely considered the more terrifying of the couple. Truly a match made in — truly a match.

Now, a confession: I am a Winters cultist myself, as my regular readers will have gathered by now. Winters, too, had his own, more modest version of Scrutiny, a little magazine called The Gyroscope. Four issues, with the approximate production values of a high-school literary magazine of the pre-PC era, were published in 1929 and 1930, and I own, at vast expense, the complete run (cf. cassette tapes).

There is an old Matt Groening cartoon that lists the Six Types of College Professors. One of them is “The One-Idea-To-Explain-Everything Maniac,” and there is a footnote: “Warning: Idea might be true.” So it is with Winters. Poems really are, largely considered, moral judgments about a human experience. Ben Jonson and Greville really are superior to Spenser and Sidney, Wordsworth and Shelley really are bad jokes, and 1700-1850 really is a trough in the history of English and American poetry. I urge any of my readers who have made it this far to go look up his books, especially the omnibus In Defense of Reason and Forms of Discovery; you will learn more about poetry than you ever thought possible.

Leavis, on the other hand, was spotty. He is a sensitive reader, especially of Shakespeare, but a lousy theoretician — “life” can take you only so far — and his considered judgments are unlikely to stand the test of time. (D.H. Lawrence, for the record, was not the greatest novelist of the 20th century. If Lawrence survives for anything, it will be, ironically, a work of criticism, the splenetic curiosity Studies in Classic American Literature.) None of Leavis’s epigones will be remembered. And Leavis, unlike Winters, was no poet himself, and incapable of the close metrical analysis that is one of the distinctive features of Winters’ criticism.

This, for the budding cult critic, is the most inspiring lesson of all. You will need feral energy, a boundless capacity for holding grudges, and barking monomania. What you won’t need, necessarily, is to be a good critic.

(Update: Michael Blowhard comments. And Jim Henley has some especially interesting remarks.)

Jun 27 2003
 

Richard Dawkins will stop at nothing. Not content with foisting on the Internet the SARS-like “meme” — which doesn’t mean what you think, look it up sometime — he plumps for “Bright” to describe “a naturalistic worldview…absent any presumption of forces or entities beyond what can be observed/measured.” Few things inspire in me a sympathy for the religious; here is one.

To begin with, there are obviously forces that are far from mystical yet cannot be measured: human ends, for instance, which are notoriously ordinal, not cardinal. Ends can be observed, but not directly, only in their manifestations. Unfortunately our Bright employs a slash, so we cannot be sure whether he meant “and” or “or,” which demonstrates the same feeling for language that “Bright” itself does.

As Andrea Harris points out, “bright” is, in ordinary usage, the antonym of “clever.” It describes children who get A’s in Deportment (do they still give grades for Deportment?) and Play Well With Others. It is a word from which any genuinely intelligent child instinctively recoils. This was as true in Dawkins’ time as in my own; he must have forgotten that “bright boy” is a term of abuse, and not the way “geek” and “grind” are either.

He may intend to hijack the word, the way statists hijacked “liberal” and radical homosexuals hijacked “queer.” If he succeeds, it will merely impoverish the language. There are perfectly good English words available to describe a naturalistic worldview. Rationalist, scientific, and non-religious have all performed this homely service adequately for quite some time.

Most offensively, it is a transparent attempt to win an argument by changing the terminology, which is as unscientific a procedure as can be imagined. You may as well adopt the word “right” to describe your worldview. What does that make your opponents? Wrong, of course! Dawkins is quite frank about this, imagining the following bright snatch of dialogue. There is really no other word for it, and if the Brights have their way, there will be no word for it at all.

“Well, some brights are happy to call themselves atheists. Some brights call themselves agnostics. Some call themselves humanists, some free thinkers. But all brights have a world view that is free of supernaturalism and mysticism.”

“Oh, I get it. It’s a bit like ‘gay’. So, what’s the opposite of a bright? What would you call a religious person?”

“What would you suggest?”

Count me dim.

(Update: Andrea Harris comments. Jonathan Wilde comments. Mark Wickens defends Dawkins doggedly but not altogether convincingly.)

Jun 26 2003
 

Hollywood has much to teach us.

Windtalkers — In this World War II John Woo gorefest Nicolas Cage, more cross-eyed and sullen than usual, plays a lieutenant assigned to a Navajo “codetalker.” His mission is to “protect the code,” that is, shoot the Navajo if he is in danger of falling into enemy hands. Now there actually was a field code, based on Navajo, the most obscure of Indian languages. It was never broken, which was more a testament to the steadfast loyalty of the codetalkers, and sheer dumb luck, than to sound cryptographic principles. If a single Navajo is captured, no more code. Even worse, if one sells out, the code has been broken and you don’t know it. Let’s face it, if you have to keep your radiomen under 24-hour armed guard, maybe you got a little cryptography problem.

The Matrices — Ah, grasshopper. Whereof one cannot speak, thereof one must be silent.

Pretty Woman — Conglomerateurs often look like Richard Gere. Street whores often look like Julia Roberts. This is why when conglomerateurs need an escort for the week they cruise the streets to find one.

Risky Business — Hey kids! Despite mediocre grades, by donning a pair of Wayfarers and running a cathouse for a weekend, you too can be admitted to Princeton and get your ticket punched for a rewarding career in investment banking!

So I can’t speak for the rest of you, but I sure haven’t been wasting my time, oh no.

Jun 24 2003
 

I can’t hope to match the peerless coverage of the Bollinger cases by Team Volokh, but a few thoughts:

O’Connor’s majority opinion in Grutter admits that the Court is obligated to find, under Adarand Constructors, which subjects racial categories to strict scrutiny, a “compelling state interest” in affirmative action. It finds this interest in diversity. It nowhere troubles to tell us what diversity actually is. Thomas, less shy, defines “diversity,” tersely and accurately, as “classroom aesthetics.”

The majority opinion also states, “the Law School frequently accepts nonminority applicants with grades and test scores lower than underrepresented minority applicants (and other nonminority applicants) who are rejected.” How frequently, do you suppose? The Court, alas, declines to provide statistics, but I’d wager the rent money that “frequently” is in the single digits for, say, the Law School in any given calendar year.

Sandra Day O’Connor may be the Lewis Powell of her generation, but Thomas, as Volokh points out, is living in a dream world when he declares in his dissent, “I agree with the Court’s holding that racial discrimination in higher education admissions will be illegal in 25 years.” The majority opinion reads, “The Court expects that, 25 years from now, the use of racial preferences will no longer be necessary.” Sure; just as rent control in New York City was expected to be a temporary wartime measure. That war was World War II.

In a perfect world the University of Michigan would be able to admit anyone they pleased, corporations would be able to hire anyone they pleased, and there would be no “prohibited categories” of discrimination, or indeed, any anti-discrimination laws at all. So I can’t get too upset about affirmative action. The people who get really exercised about it tend to be old ACLU civil-rights lefties, like David Horowitz, or John Rosenberg, who runs the excellent anti-affirmative-action blog Discriminations, or my father. Old lefties retain a touching faith in the government’s ability to make a better society. They believe in integration and public education. They marched for civil rights in the South and sought to eliminate discrimination against blacks not just in law, by taking Jim Crow off the books, but in fact. They really believed in the government’s willingness and ability to wipe out racism, and they feel, with some justice, that they were sold a bill of goods. Communism made the first generation of neo-conservatives: affirmative action made the second generation.

Jun 23 2003
 

This is a test of the emergency broadcast system. Colby Cosh is now officially unemployed and needs your money, for which he is too proud to beg more than twice. Yes, we want Colby to have plenty of leisure for blogging, but we also want to keep him in food, cigs, and an Internet connection, and I’ve heard tell that those Alberta summers can be pretty harsh. If you haven’t been reading Colby, you’d best start, and if you have, then you know how consistently good he is, so cough up. This has been a test of the emergency broadcast system. If this were a real emergency, you would be instructed to give me money instead.

Jun 22 2003
 

Eddie Thomas has a longish and interesting post up about “Whiteness Studies,” in which he is characteristically more generous and fair-minded than I’m about to be. Eddie is firmly anti, but one of his commentators, Ted Hinchman, makes the best case in their defense:

What exactly is supposed to be wrong with inquiry into the formation and career of the concept of racial whiteness?

It seems obvious that the normative concept of whiteness had and still has as its core function the justification of a species of social prejudice — once the concept is in hand, you can call this ‘racial’ prejudice. Of course, it doesn’t follow that you can’t use the concept in other ways. When you say ‘White folks are sometimes plagued by racial guilt,’ you obviously aren’t justifying racial prejudice. But you’re using a concept that wouldn’t exist were it not for others’ use of it to justify racial prejudice. And it seems obvious that the justificatory use must be what gave the concept currency…

One might teach a course on the history of the concept of gravitational collapse without provoking hue and cry in the blogosphere. Or of the concept of evolution. Or of the concept of time (I don’t mean the rough draft of Sein und Zeit). So why not a course on the history of the concept of whiteness?

Now this is an argument, for good or ill, but try as you might, you just cannot expand an argument into a curriculum. In my day you had to take eight courses in your major; what might those be, for the aspiring Whiteness Studies major? (Rest assured that today’s course will be tomorrow’s department.) You have the two-semester intro on the social construction of everything, the sophomore-year history of slavery course, with special emphasis, naturally, on the United States, a couple of senior seminars on mortgage discrimination and the difficulty of getting a cab — somebody help me out here. There remains post-graduate work, which I can’t even fathom; doubtless this indicates my own insensitivity.

It isn’t much of an argument either. The concept of “whiteness” may have originated in the well-grounded observation that some people have fairer skin than others. It is obvious to Ted Hinchman that “justificatory use must be what gave the concept currency”; it’s far from obvious to me. “Normative,” then, colossally begs the question. If you wish to demonstrate the social construction of race, then you must demonstrate it, not assume it.

Whiteness Studies advocates insist on the one hand that race “is based on a fantasy” and on the other that everything be viewed through the lens of this fantasy. This WaPo story notes that “most [advocates of whiteness studies] are white liberals who hope to dismantle notions of race.” Of course people who really want to “dismantle notions of race” do not invent an academic discipline entirely devoted to such notions. Professor Gregory Jay of the University of Wisconsin encapsulates this cognitive dissonance in a single, convenient web page. He surrounds race with quotation marks, then asks his students, “How long can one watch television or read a newspaper or magazine without encountering anything but white people, or mostly white people?” I’m not sure: how do you tell?

The central premise of Whiteness Studies, and all social construction arguments, is that one’s thought is somehow externally constrained. This is our old friend, the prisoner of consciousness, which has a long and disreputable history. Plato shackles us in the cave, which keeps the Forms forever inaccessible; Kant in our faculties, which distort the true, “noumenal” world; Marx, himself bourgeois to the core, in our “class,” which renders us incapable of seeing that our arguments are mere bourgeois apologetics. Plato, Kant, and Marx granted themselves special get-out-of-jail-free cards, necessarily, to permit them to make such arguments. Such cards, however, are now for sale, like indulgences: whites will be permitted, with the aid of other, more tutored whites, to transcend their white consciousness for a modest tuition fee. Not so modest at Princeton, one of 30 universities that currently offer instruction in Whiteness Studies; but hey, who said enlightenment comes cheap?