Jan 16 2004

Michael Blowhard hypothesizes two concert attendees, one of a Black Sabbath show (Oz-era, one hopes), the other of Pollini playing Chopin. They both report that the show was “great,” and Michael has a few questions, which I will number for convenience:

1. Knowing nothing else about these two people, would you feel capable of saying that one of them had a “greater” experience than the other?
2. What is the relationship between the greatness of a given work and the greatness of the experience a spectator/consumer/user has? Does any such relationship exist, necessarily, at all?
3. Is it possible, or even semi-possible, to assert that greater works deliver greater experiences?
4. By what measure and on whose authority?
5. And to what extent does the answer depend on who’s doing the experiencing?

Let’s get one thing straight first: the aesthetic experience, like all experiences, exists entirely in the mind of the audience. Of course the stimulus, the work, is real, but no aesthetic experience is possible without an audience. Art is art by virtue of communicating something to someone else. The viewer recreates the work for himself, which can be done well or badly, and it is only the recreation that counts. The viewer is his own artist, not in the deconstructionist sense of all interpretations being equally valid, but in the sense that only he is responsible for bringing the art back to life. As J.V. Cunningham puts it:

Poets survive in fame.
But how can substance trade
The body for a name
Wherewith no soul’s arrayed?

No form inspires the clay
Now breathless of what was,
Save the imputed sway
Of some Pythagoras,

Some man so deftly mad
His metamorphosed shade,
Leaving the flesh it had,
Breathes on the words they made.

To experience a work of art well is to see what is there and nothing that is not. Let’s simplify Michael’s hypo a bit, controlling for the variables. Imagine the same person reading the same book twice, say five years apart. Everyone has reread a book and said to himself, “My God, how did I miss that the first time?” This happened to me with Portrait of a Lady the third time I read it, a couple years ago. Finally I’d had enough adult experience, and paid close enough attention, to understand, among other things, the solipsism that makes Gilbert Osmond such a monster, his insistence that everything in his universe reflect his pinched self. I can judge the aesthetic value of my three readings because I have direct introspective access to all of them. The third reading was deeper, broader, more complete — greater, in a word.

The answer to Question 1, however, is no. In the trivial case the Chopin fan may be a chronic liar who slept through the concert and praises it to impress his friends. More seriously, there is something to be got out of Black Sabbath, though less, perhaps, than what can be got out of Chopin. Quite possibly the Sabbath fan has concentrated so much better than the Chopin fan that he has had a superior aesthetic experience, though from an inferior work of art.

Which brings us to Question 2. Greater art offers the potential, but only the potential, for a greater experience. The viewer must realize that potential, and the work lies fallow unless he’s up to the task. Intensity, I should emphasize, is not the standard, but propriety. People are often deeply moved by bad art because it happens to accord with their prejudices. This is like being deeply moved by the sound of your own voice. We have all reread books that mattered to us as adolescents, only to be horrified at how bad they really are. The first reading sours in retrospect, and it should. The experience was meretricious: you were taken in.

To Question 3, which is the guts of the matter, we’d better be able to answer yes, otherwise critics, English teachers, and culturebloggers may as well hang up their collective spurs right now. Fortunately we can. There are things to notice in works of art. It is better to notice them than not to. Great works of art simply have more to notice. Nobody’s in charge, of course, and there is no quantitative standard (Question 4). We argue about what is there and what is not, and the best argument, if we’re lucky, carries the day.

If the answer to Question 5 isn’t obvious by now, imagine that the Sabbath fan, who hates classical music, was forced to attend the Chopin recital and the Chopin fan, who hates metal, was forced to go to the Sabbath show. Wouldn’t both of their experiences suffer in consequence? Sure they would.

Aren’t you sorry you asked?

(Update: James Joyner comments. Will Duquette comments. John Venlet comments. George Hunka comments at length. Lynn Sislo comments, and meta-comments. )

Jan 14 2004

Several bloggers are reading my homeboy, the late American poet and critic Yvor Winters, which pleases me greatly, and misreading him, which comes with the territory.

I first encountered Winters’ name in the back pages of The New Republic, where the reviewer, discussing someone else, referred to him slightingly as “opposed to everything the 20th century stood for.” Ah ha, thought I, there’s the critic for me. I dug up a copy of Forms of Discovery — which is not the best place to start; the novice should try In Defense of Reason instead — and soon hoovered up everything he wrote.

Eliot is still generally regarded as the most influential critic of the century; history will judge that it was Winters. He taught poetry and American literature at Stanford for forty years; among his students were dozens of distinguished poets and scholars, including J.V. Cunningham, Edgar Bowers, Thom Gunn, and Scott Momaday, who will eventually be numbered, along with Winters himself, among the finest poets of the 20th century. (That’s two appeals to posterity in two sentences if you’re scoring at home.) Winters personally introduced several poems to the canon, including George Herbert’s Church Monuments, Robert Bridges’ The Affliction of Richard, and F.G. Tuckerman’s The Cricket (with help from Witter Bynner and Edmund Wilson). His reevaluation of Elizabethan poetry, upgrading Wyatt, Jonson, Greville and Gascoigne, and downgrading Spenser and Sidney, is now well regarded, if not yet the standard view; in the early 1960s an English department hack published an anthology of Elizabethan poetry that plagiarized Winters’ choices, extremely eccentric at the time, almost exactly, without so much as mentioning his name. Other causes of his, like Jones Very, Charles Churchill, and Sturge Moore, have met with less success: then again nobody reads Lancelot Andrewes on Eliot’s account either.

Winters campaigns, in a phrase, against emotion for its own sake. He insists that indulgence in emotion without adequate motive leads to sloppy writing, sloppy thinking, and sloppy living. This leaves him hostile in philosophy to the Transcendentalists and to their 18th century continental predecessors like Shaftesbury, who find wisdom in impulse. And it leaves him hostile in poetry to the British romantics especially, who constantly fall upon the thorns of life and bleed, without troubling to tell the reader anything about the thorns or even why they are thorny at all.

One suspects most anti-romantic critics, like Irving Babbitt or Paul Elmer More, of being insensible to the considerable lure of romanticism, of priggishly denouncing vices by which they were never tempted. Winters, on the other hand, was very nearly seduced. His early poems, like those of William Carlos Williams (think of the fire engine and the red wheelbarrow) and like American Indian verse, which influenced him greatly, derive their power from an intense focus on tiny particulars that borders on the maniacal. His first book of poems was called “Diadems and Faggots,” after a line from an Emerson poem that David Fiore, ironically, quotes against him. He finally concluded that he had to think better, and use better methods, to write better poetry, and retain his sanity. He deliberately sacrificed intensity for balance, lest he end up, as he put it, “a minor disciple of W.C. Williams.” This felt experience gives his criticism a uniquely charged earnestness. Winters takes Emerson far more seriously than Emerson ever took himself.

Winters is most notorious for his oft-repeated pronouncement that a poem is “a moral judgment of a human experience.” His contemporaries, Cleanth Brooks and John Crowe Ransom for instance, commonly translated this as a demand for a sort of propositional poetry, and the misapprehension persists in Lawrence White:

I read Winters as an undergraduate. He was my teacher’s teacher, & I thought it’d help me figure out what was going on. I learned a lot, but I always stumbled over the “poetry is the highest thought” thing. Man, like, I was reading Kant at the time! I think Fulke Greville is an awesome poet, but a thinker? … Kant is 1,000 times more exacting, more exquisite, more voluptuous a thinker than any poet. For proof, compare his reasoning ability to the reasoning of Winters (the latter being the rational synopsis of the poetry). Not that Winters is by any means a fool, but he’d have a hard time getting a PhD in philosophy from the work he’s submitted so far.

This reminds me of my father’s remark, when I showed him J.V. Cunningham’s poem on the Central Limit Theorem, that he preferred the Central Limit Theorem. Winters does not ask for the Critique of Pure Reason in verse, and the phrase “poetry is the highest thought” appears nowhere in his work to my knowledge. He interests himself in the relationship between the paraphrasable content and the emotion the poem provokes. In the precise adjustment of this relationship, through various technical means of which rhyme and meter are only the crudest, lies the judgment. He praises such poems as Rimbaud’s Larme, which has no paraphrase to speak of, Allen Tate’s The Subway, in which the paraphrase is mad (much like Winters’ own Danse Macabre), and Elizabeth Daryush’s Still-Life, in which the moral judgment differs entirely from the paraphrase.

Here, in a nutshell, is the problem with Yeats, with whom this tempest began. Yeats is not just foolish, he is resoundingly foolish. He affects a vatic tone, insisting that the reader treat his risible ideas seriously. The resounding is far more irksome than the foolishness. A poet’s got to know his limitations, and Yeats was never too clear on his. Winters understood his very well:

What was all the talk about?
This was something to decide.
It was not that I had died.
Though my plans were new, no doubt,
There was nothing to deride.

I had grown away from youth,
Shedding error where I could;
I was now essential wood,
Concentrating into truth;
What I did was small but good.

Orchard tree beside the road,
Bare to core, but living still!
Moving little was my skill.
I could hear the farting toad
Shifting to observe the kill,

Spotted sparrow, spawn of dung,
Mumbling on a horse’s turd,
Bullfinch, wren, or mockingbird
Screaming with a pointed tongue
Objurgation without word.

Jan 10 2004

The students

Angelina Staccato: Michele Catalano
Forrest Swisher: Agenda Bender
Wing-Ding Weisenheimer: Ken Goldstein
Gilbert Scrabbler: Steven Den Beste
Woolworth Van Husen III: Felix Salmon
Purdy Spackle: Kim du Toit
Chuck Farley: Stephen Green
French Lambretta: Pejman Yousefzadeh (fantasy)
Franklin Furter: Pejman Yousefzadeh (reality)
Madison Jones: Eddie Thomas
Faun Rosenberg: Eve Tushnet
Amana Peppridge: Glenn Reynolds
“Alphabits”: Salam Pax
Howard Havermeyer: Cinderella Bloggerfeller
Lawrence Kroger: James Lileks

The teachers

Mr. Vernon Wormer: Emperor Misha
Mr. Dwight Mannsburden: A.C. Douglas
Miss Dolores Panatella: Susannah Breslin
Miss Violet Coolidge: Mika Cooper
Miss Marilyn Armbruster: Dr. Weevil

(Update: Wing-Ding comments.)

Jan 09 2004

David Fiore is back for a third helping (or fourth or fifth, I’ve lost count by now). His erudite reply to my Professor X piece investigates various ancillary points of Emerson scholarship, like his relationship to Coleridge and the important question of whether the notorious transparent eyeball can see itself. David is terrifyingly well-informed on these matters, which fortunately need not concern us here. The question was whether Emerson advocates surrender to emotion. David, to his credit, does not attempt to deny this, and really it would be impossible to deny; every second page of Emerson contains passages to this effect. He takes a different approach:

Having read some of Winters, I see now, Aaron, why you place so much emphasis upon the logical consequences of philosophical positions. But you cannot deal with Emerson (or me!) this way. For Winters, Crane is a superior Emersonian, because he is “not content to write in a muddling manner about the Way; he is concerned primarily with the End.” But this is precisely what makes him such a failure as an Emersonian–and a sane human being. Life is a problem. People, like works of art, are alive so long as they maintain their ideas in tension. To long for the resolution of these tensions, as you do Aaron, is to long for catastrophe. [Italics his.]

Since David has many distinguished predecessors in this view, like “Negative Capability” Keats, who can be excused on grounds of extreme youth, and F. Scott “Opposed Ideas in the Mind at the Same Time” Fitzgerald, I may be forgiven for insisting on some obvious points. Life is indeed a problem, many problems, which one does one’s best to solve, through exercise of the rational faculty. Man acts and chooses: each choice excludes many others. Some choices are wise, others foolish; some conduce to his well-being, others to his destruction. One can no more hold an idea and its opposite at the same time — what, in this case, could “hold” possibly mean? — than one can act on an idea and its opposite at the same time. In the face of these difficulties, Emerson recommends abdication.

Emerson sprang from the dominant 19th-century intellectual tradition in America, New England Nonconformist. It is best represented by the Holmes family (Oliver Wendell Sr. and Jr.) and the James family (Henry Sr., William, Henry, and Alice). Its products include Emily Dickinson, Nathaniel Hawthorne, and Herman Melville. Today New England Nonconformism is extinct; Katharine Hepburn (b. 1907) was perhaps its last degenerate scion.

New England Nonconformists, with very few exceptions, were hobbyists. They liked to toy with ideas, often radical ideas and often very brilliantly. They filled the ranks of the Abolitionists and suffragettes; but they tended not to reason to these positions but intuit them. Their motto could have been Holmes Jr.’s frequent remark that he hated facts, that the chief end of man was to form general propositions, and that no general proposition was worth a damn. Holmes père et fils, Emerson, and William James were all radical skeptics philosophically who conducted themselves personally with exemplary rectitude. What constrained them was a deep prudence and moral sense, informed by the Calvinism of Jonathan Edwards, the doctrine that although good works and success on earth technically avail one nought, as all seats in the Kingdom of Heaven are reserved, they yet demonstrate one’s fitness for Election. Yvor Winters calls this a “New England emotional coloration,” accurately. To put it flippantly, the vote for women was all very well, but “never dip into capital” was a real rule to live by. (On the other hand, in the dominant 20th-century American intellectual tradition, the New York Jewish, ideas became the ticket to success.) Henry James’ American characters act not on ideas but on an inarticulable “moral sense.” This moral sense attenuated as its doctrinal background exerted less and less direct influence, until it finally vanished altogether.

This is why Emerson died rich, old, and in bed, and Hart Crane jumped off an ocean liner.

Jan 08 2004

It turns out that my laws of blog comments were incomplete. I forgot this:

7. Anyone who posts to a dead thread is insane.

To take a few recent instances:

Here, on a casual aside about Bill Buckley and joint sizes:

you are all stupied you have no clue about the weed world its alot more complaicated so shut the hell up [From “reefer king”.]

Here, on an article on the history of cryptography:

I have noticed any strengths and weakness.please commnt on this. [From “yathish”.]

Here, on a philosophical discussion of slippery slopes:

It is written in the big book, that,God said” The world is going to be rich and there will be plenty of everything. But there are people out there who are simply fell bad just becouse poor’s are not next to them.” In other word ‘s they don’t won’t to illiminate suffering.Let God bless wellestone for his “we can do it” kind of attitude.And his positivity for every race,including the poor immigerants.If you think all this terrorist and iraq game is about anything else ,you must be in slippery slope.I say it is about there is enough money in America today which can make every body rich .but if every good looking immegrants get rich out there,the ugly’s going to be exposed and they can’t compitate any more.”IT is ugly’s world baby” [From “Ambaye T kassay”.]

Sic, sic, sic.

Jan 03 2004

After sixteen months, and a few late fits and starts, Cinderella Bloggerfeller has finally decided to hang up his glass slipper. Don’t go over there and encourage him to come back. I used to do that when my favorite bloggers retired until I realized how tiresome it is. There are other joys in life, and if you’re sick of blogging, quit. Then, if you were any good, you get to read your own obituaries.

The worst name in blogging history was the least of Cinderella’s distinctions. He was among the wittiest of bloggers; sample his account of his birth, his New Apocalypse Review, his one-entry series of Inspired Misspellings of the Blogosphere, his brief and expanded eulogies for Edward Said, and his nifty lift from Swift.

I can count on the fingers of one hand the blogs that are irreplaceable: Cinderella’s was one. He specialized in translations from Polish, French, Spanish, and Italian. Some were literary, some philological, some political. Cinderella followed assiduously the nasty goings-on in Transcaucasia, of which most Westerners are at best dimly aware. (This entertaining Insider’s Guide to the Stans should get you started.) He consistently dug up useful articles and interviews that we monoglot Americans wouldn’t have found or been able to read if we had.

Glenn Reynolds does what he does supremely well, but if he were to disappear tomorrow, some pocket-Reynolds, like James Joyner, would spring up to take his place. There are no pocket-Cinderellas, and in his absence we shall simply have to do without.

Jan 01 2004

Sixty years ago Yvor Winters wrote a moving essay on Hart Crane called “What Are We To Do With Professor X?” Crane and Winters were correspondents and friends for several years; they broke over Winters’ largely hostile review of “The Bridge” in 1930; Crane jumped off an ocean liner two years later. Winters charges Crane’s suicide to his belief in Emersonian advocacy of instinct over intellect and change for its own sake. (To anyone who doubts that this is in fact Emerson’s philosophy I suggest reading “The Oversoul,” “Self-Reliance,” “Art,” or “Spiritual Laws” straight through, instead of the little snippets from them that are so frequently quoted.) He contrasts Crane, “a saint of the wrong religion,” who took those ideas with literally deadly seriousness, with genteel Professor X, who holds the same ideas but would not dream of actually practicing them:

Professor X can be met four or five times on the faculty of nearly every university in the country: I have lost count of the avatars in which I have met him. He usually teaches American literature or American history, but he may teach something else. And he admires Emerson and Whitman.

He says that Emerson in any event did not go mad and kill himself; the implication is that Emerson’s doctrines do not lead to madness and suicide. But in making this objection, he neglects to restate and defend Emerson’s doctrines as such, and he neglects to consider the historical forces which restrained Emerson and which had lost most of their power of restraint in Crane’s time and part of the country. [Crane was born in Cleveland in 1899.] … The Emersonian doctrine, which is merely the romantic doctrine with a New England emotional coloration, should naturally result in madness if one really lived it; it should result in literary confusion if one really wrote it. Crane accepted it; he lived it; he wrote it; and we have seen what he was and what he wrote.

Professor X says, or since he is a gentleman and a scholar, he implies, that Crane was merely a fool, that he ought to have known better. But the fact of the matter is, that Crane was not a fool. I knew Crane, as I know Professor X, and I am reasonably certain that Crane was incomparably the more intelligent man. As to Crane’s ideas, they were merely those of Professor X, neither better nor worse; and for the rest, he was able to write great poetry. In spite of popular or even academic prejudices to the contrary, it takes a very highly developed intelligence to write great poetry, even a little of it. So far as I am concerned, I would gladly emulate Odysseus, if I could, and go down to the shadows for another hour’s conversation with Crane on the subject of poetry; whereas, politeness permitting, I seldom go out of my way to discuss poetry with Professor X.

In the role of Professor X today is David Fiore, who is pleased that PETA exists. I have made my objections to the concept of animal rights elsewhere and will not rehearse them here; they are beside my point. Now PETA has been excoriated, properly and often, for its advocacy and funding of violence and terrorism. It is less often noted that these follow necessarily from its position. If you believe, like Ingrid Newkirk, that a rat is a pig is a dog is a boy, then fire-bombing a laboratory is a small price to pay to stop what, by your lights, is mass murder. I can respect this view even as I wish to jail anyone who tries to put it into practice.

David begins courageously enough: “I’ve made a radical choice. So have you.” But he fails to comprehend just how radical the choice is: “And certainly, I don’t condone any acts of violence Animal Rights people might commit. That’s just insanity, you don’t make change by terrorizing the majority. Change happens when the majority assents to it… Moreover, I don’t have the slightest desire to “convert” anyone, I like just about everybody, and I’m not suited to delivering harangues…” David has, and can have, no moral objection to violence on behalf of the bunny rabbits; it is a mere question of tactics: “you don’t make change by terrorizing the majority.” Winters writes that Professor X “once reproved me for what he considered my contentiousness by telling me that he himself had yet to see the book that he would be willing to quarrel over.” And so David, who likes just about everybody, prefers that PETA deliver the harangues on his behalf.

Sometimes hypocrisy is, as La Rochefoucauld says, the tribute vice pays to virtue; sometimes, as in this case, the tribute fanaticism pays to sanity. A significant minority of Americans believes that abortion is murder. Yet in their next breath they will condemn clinic bombers — because they are hypocrites, fortunately. In a society of mass murderers, armed resistance becomes a perfectly logical, even admirable, response.

The most shocking thing about 9/11 wasn’t the deaths, or the image of the World Trade Towers collapsing. It was the realization that some people are willing to die for their ideas, foolish as they are, while most of us treat ideas like shiny playthings that you can put back in the toy chest when you’re finished with them. I have friends who say the trouble nowadays is that no one takes ideas seriously. They should thank their lucky stars. When nearly everyone thinks as badly as possible, Professor X may be the best we can hope for.

(Update: David Fiore replies on his blog, and in the comments.)

Dec 29 2003

So much to skip, so little time.

I begin with myself, having sucked a bit of late. This and this were too twee and precious for words. This wasn’t really very funny. This was half-right but embarrassingly wrong in several details, and God of the Machine is supposed to be in the details. This and this provoked squalls of irrelevant commentary. This was weird but informative. This was just weird.

The laziest organizing principle in prose is the list. (This post, for instance.) The cowardly lister postpones his imposition of a Few of My Favorite Things on the world until the end of the year, when everyone else is doing it and he has cover. The busy reader will naturally avoid such things. This goes double for that most elaborate of self-congratulation rituals, year-end awards. “Prizes,” said Ezra Pound, “are always a snare.” Besides, nobody ever nominates me for anything.

Suppose you edited a web magazine with open submissions, and you were obliged to publish whatever you received. You’d have The Carnival of the Vanities, now in its 66th tiresome edition, to which Instapundit still links dutifully every week (Glenn has a keyboard macro for “rich, bloggy goodness”). Which beats reading it, I can assure you. Good writers are often bad: bad writers are never, ever good. I confess that I often enjoy the summaries, in which the host of the week endeavors to say something kind about every submission. This testifies to my somewhat sadistic taste in humor.

The Type 1 political blog post cites an anecdotal news item that confirms his biases, whereupon the blogger crows that he was right and this proves it. Degree-of-difficulty: 0.0, since most of us obtain our news from like-minded sources. (Explaining away an item that conflicts with one’s biases, which would be far more interesting, is naturally far less common.) The Type 2 political blog post scours the Internet for the weakest possible opponent of his views and demolishes him line by line. No poliblog is complete without a healthy dose of Type 1 and Type 2, and many poliblogs consist of nothing but. If you devoted the time you’ve spent reading Type 1 and Type 2 to a more constructive activity, like exercising your abs, you might have that eight-pack you’ve always wanted by now.

Tolkien loses me about when Betamillion is making his way through Gallimaufria to secure the Ring of Fire and win the hand of fair Neuralgiel, or something. The pros established an early lead for dullest Lord of the Rings commentary, with the antis now closing fast. I’ll give you the gist here, with a spoiler-laden review of the trilogy:

Good triumphs over evil.

Not having finished any of the books or seen any of the movies, I admit that’s a wild guess.

Dec 26 2003

Another bulletin from the Dept. of Almost Right: Larry Ribstein, blogging on the Forbes list of top ten business movies:

Forbes story on the Ten Greatest Business Movies and related stories on Forbes.com, says a lot about films’ attitude toward business. The top ten were: Citizen Kane, The Godfather: Part II, It’s a Wonderful Life, The Godfather, Network, The Insider, Glengarry Glen Ross, Wall Street, Tin Men, Modern Times… This film list provides new fodder for my theory. My thesis, again, is that, while films usually portray business in a bad light, they do not really say that business is bad. After all, the films most of us see are produced by big businesses. More precisely, films are made by people working in these businesses. Filmmakers see themselves as artists, the latest in a long line from cave painters through Michelangelo. Yet, unlike many artists, filmmakers’ art is so costly that films cannot get made without lots of money. Filmmakers must get this money from capitalists, who, in turn, must sell tickets. Because film artists resent their shackles, they often show struggling workers, greedy capitalists, and heroic artists. “Good” businesses are those where the artistic types have the upper hand, and bad businesses are those where the artists have lost. In other words, films see firms from the cramped perspective of the assembly line or the cubicle. From way out in Hollywood, firms often seem like beehives or rabbit warrens, unfit for human habitation.

Larry’s point needs to be sharpened up a bit. All things being equal, people prefer good merchandise to bad, and they make exceptionally fine discriminations. Gillette mightily outsells Schick because its razor blades are better, not a lot, just enough. There are a few exceptions to this rule, mostly in aesthetic products, notably Hollywood itself. Bad art makes more money than good art, in general because bad taste is more prevalent than good taste, and in the specific case of movies because the audience for them is overwhelmingly young, and the taste of the average adolescent is even worse than that of the average adult. These are depressing facts if you work in the taste business. “From way out in Hollywood” it is Hollywood that looks “unfit for human habitation.” A screenwriter might rashly conclude that schlock always trumps quality; and in fact, as a survey of Hollywood movies about business shows, it usually does.

The anti-business movies deal overwhelmingly with schlock purveyors: yellow journalists (Citizen Kane), swampland peddlers (Glengarry Glen Ross), penny stock hustlers (Boiler Room), shady aluminum siding salesmen (Tin Men), and out-and-out gangsters (The Godfather). It’s a Wonderful Life gestures half-heartedly toward the notion of quality as good business, as in the scene where Mr. Potter’s rental agent lectures him on how all the nice houses in Bailey Park are killing his real estate business. But mostly it’s more people vs. profits hoo-rah.

In a “pro-business” movie like Executive Suite, our hero, William Holden, is the research chief for the furniture company, and in his big speech, as he ascends to the chairmanship, he tells the board that the company will never sacrifice quality, profits be damned. That it might actually be more profitable to manufacture good furniture does not cross the screenwriter’s mind. (Holden figures prominently in several famous business movies, Network of course and also the most authentically pro-business movie out of Hollywood that I know, Sabrina, which is disguised as a love story. He was, perhaps coincidentally, Ronald Reagan’s best man.)

Or consider Tucker, a garish and tasteless but ostensibly pro-business movie. Jeff Bridges plays the real-life car designer Preston Tucker, who sets out to build a revolutionary automobile, and succeeds, only to be squelched by a conspiracy of the government with the Big Three. This happens to be pretty much true; but out of this pregnant material the director, Francis Ford Coppola, fashions only another morality tale of how, as Larry would say, the good company, in which the artist, Tucker, is in charge, goes down to defeat, or, as I would say, the evil capitalists foist off shoddy merchandise on an unsuspecting public. Hollywood doesn’t hate business. It just hates businesses that act all businesslike.

(Update: Larry Ribstein replies. Michael Williams comments.)

Dec 26 2003

A while ago Brian Micklethwait had a bit about discomfiture in art to which many bloggers linked approvingly:

As for the endlessly repeated claim that art is supposed to make you feel uncomfortable, I don’t buy that. And I don’t believe the people who say that they do buy it are being honest. I think that a picture which they have no problem with, but which they believe makes other people whom they disapprove of uncomfortable, makes them very comfortable indeed, and that that is the kind of discomfort (i.e. not discomfort at all, for them) which they like, and are referring to with all this discomfort propaganda. They no more like being genuinely discomforted by art than I do.

As a psychological observation this is acute. No one ever talks this way about art that discomfits him, personally. And good art is never described in these terms — only épater le bourgeois stuff, which of course discomfits no one, certainly not the people who describe it as discomfiting, and not the people it’s supposed to discomfit either. “Discomfort” is the last-ditch argument of bad artists or their flaks, like museum directors.

So I almost agree with Brian, except I’d lop off the first sentence. Have you ever talked at length with someone who was far more intelligent than you? Such a person seems armed with all of your thoughts and experiences and much more besides; he answers objections that you have formed fuzzily or not at all. You get the most out of it by shunting aside your own prejudices, as best you can, and following him as he elaborates on his, which are more interesting. Later on you go back and reintroduce yourself, as it were, to your original prejudices, and compare and contrast. The experience is, in a word, discomfiting, not because your interlocutor tries to shock you like a cheap artist, but because he says things that have not occurred to you, and novelty is always unsettling.

Great art is like that, except that its commerce is with a mind greater than any you know personally and on a subject on which it has meditated deeply and you may not have thought at all. Henry James goes so far as to say “it is a very obvious truth that the deepest quality of a work of art will always be the mind of the producer… No good novel ever proceeded from a superficial mind,” and he’s talking to you, Charles Dickens.

Brian discusses visual art and music mostly, and I’m talking about literature, being wary of generalizations about all arts, although I seem to make them often enough. So maybe we are talking at cross purposes. But for all arts (oops, I did it again) the ideal aesthetic attitude is receptiveness — a provisional acceptance of the author’s cultural situation, the benefit of the doubt. You have to be willing to check your damaged self at the door. Many respectable aesthetic theories, like Coleridge’s “suspension of disbelief” and the “pseudo-belief” of T.S. Eliot and I.A. Richards, reasonably begin with the attempt to inculcate this attitude: we need to read Christians and pagans without being either. So yes, good art makes you uncomfortable, but only incidentally, and anyone who makes a big point of the fact is a bad artist.

(Update: David Fiore comments. Great artists aren’t just different, David, they’re better. Get over it.)