READING SUBTLY

This was the domain of my Blogger site from 2009 to 2018, when I moved to this domain and started The Storytelling Ape. The search option should help you find any of the old posts you're looking for.

Dennis Junk

I am Jack’s Raging Insomnia: The Tragically Overlooked Moral Dilemma at the Heart of Fight Club

There’s a lot of weird theorizing about what the movie Fight Club is really about and why so many men find it appealing. The answer is actually pretty simple: the narrator can’t sleep because his job has him doing something he knows is wrong, but he’s so emasculated by his consumerist obsessions that he won’t risk confronting his boss and losing his job. He needs someone to teach him to man up, so he creates Tyler Durden. Then Tyler gets out of control.

Image by Canva’s Magic Media

[This essay is a brief distillation of ideas explored in much greater depth in Hierarchies in Hell and Leaderless Fight Clubs: Altruism, Narrative Interest, and the Adaptive Appeal of Bad Boys, my master’s thesis]

If you were to ask one of the millions of guys who love the movie Fight Club what the story is about, his answer would most likely emphasize the violence. He might say something like, “It’s about men returning to their primal nature and getting carried away when they find out how good it feels.” Actually, this is an answer I would expect from a guy with exceptional insight. A majority would probably just say it’s about a bunch of guys who get together to beat the crap out of each other and pull a bunch of pranks. Some might remember all the talk about IKEA and other consumerist products. Our insightful guy may even connect the dots and explain that consumerism somehow made the characters in the movie feel emasculated, and so they had to resort to fighting and vandalism to reassert their manhood. But, aside from ensuring they would know what a duvet is—“It’s a fucking blanket”—what is it exactly about shopping for household décor and modern conveniences that makes men less manly?

Maybe Fight Club is just supposed to be fun, with all the violence, and the weird sex scene with Marla, and all the crazy mischief the guys get into, but also with a few interesting monologues and voiceovers to hint at deeper meanings. And of course there’s Tyler Durden—fearless, clever, charismatic, and did you see those shredded abs? Not only does he not take shit from anyone, he gets a whole army to follow his lead, loyal to the death. On the other hand, there’s no shortage of characters like this in movies, and if that’s all men liked about Fight Club they wouldn’t sit through all the plane flights, support groups, and soap-making. It just may be that, despite the rarity of fans who can articulate what they are, the movie actually does have profound and important resonances.

If you recall, the Edward Norton character, whom I’ll call Jack (following the convention of the script), decides that his story should begin with the advent of his insomnia. He goes to the doctor but is told nothing is wrong with him. His first night’s sleep comes only after he goes to a support group and meets Bob, he of the “bitch tits,” and cries a smiley face onto his t-shirt. But along comes Marla, who, like Jack, is visiting support groups but is not in fact recovering, sick, or dying. She is another tourist. As long as she's around, he can’t cry, and so he can’t sleep. Soon after Jack and Marla make a deal to divide the group meetings and avoid each other, Tyler Durden shows up and we’re on our way to Fight Clubs and Project Mayhem. Now, why the hell would we accept these bizarre premises and continue watching the movie unless at some level Jack’s difficulties, as well as their solutions, make sense to us?

So why exactly was it that Jack couldn’t sleep at night? The simple answer, the one that Tyler gives later in the movie, is that he’s unhappy with his life. He hates his job. Something about his “filing cabinet” apartment rankles him. And he’s alone. Jack’s job is to fly all over the country to investigate accidents involving his company’s vehicles and to apply “the formula.” I’m going to quote from Chuck Palahniuk’s book:

You take the population of vehicles in the field (A) and multiply it by the probable rate of failure (B), then multiply the result by the average cost of an out-of-court settlement (C).

A times B times C equals X. This is what it will cost if we don’t initiate a recall.

If X is greater than the cost of a recall, we recall the cars and no one gets hurt.

If X is less than the cost of a recall, then we don’t recall (30).

Palahniuk's inspiration for Jack's job was an actual case involving the Ford Pinto. What this means is that Jack goes around trying to protect his company's bottom line to the detriment of people who drive his company's cars. You can imagine the husband or wife or child or parent of one of these accident victims hearing about this job and asking Jack, "How do you sleep at night?"
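
The cynicism of the formula stands out even more starkly when its logic is laid out as code. Below is a minimal sketch in Python of the decision rule exactly as the quote describes it; the function name and every number in the example are hypothetical placeholders, not figures from the novel or from the Pinto case.

```python
# A minimal sketch of Jack's recall formula. All values below are
# hypothetical placeholders, not figures from the novel or the Pinto case.

def recall_decision(vehicles_in_field, failure_rate, avg_settlement, recall_cost):
    """Apply the formula: X = A * B * C, then compare X to the cost of a recall."""
    x = vehicles_in_field * failure_rate * avg_settlement  # expected settlement payout
    # The company recalls only when paying settlements would cost more than recalling.
    return "recall" if x > recall_cost else "no recall"

# Example: 500,000 cars in the field, a 0.01% failure rate, and $1 million per
# out-of-court settlement give X = $50 million. Against a $100 million recall,
# the cars stay on the road.
print(recall_decision(500_000, 0.0001, 1_000_000, 100_000_000))  # -> no recall
```

Notice what never enters the calculation: the injuries and deaths themselves figure only as settlement costs. That, presumably, is the thing Jack knows and can't stop knowing at three in the morning.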

Going to support groups makes life seem pointless, short, and horrible. Ultimately, we all have little control over our fates, so there's no good reason to take responsibility for anything. When Jack bursts into tears as Bob pulls his face into his enlarged breasts, he's relinquishing all accountability; he's, in a sense, becoming a child again. Accordingly, he's able to sleep like a baby. When Marla shows up, not only is he forced to confront the fact that he's healthy and perfectly able to behave responsibly, but he is also provided with an incentive to grow up because, as his fatuous grin informs us, he likes her. And, even though the support groups eventually fail to assuage his guilt, they do inspire him with the idea of hitting bottom, losing all control, losing all hope.

Here’s the crucial point: If Jack didn't have to worry about losing his apartment, or losing all his IKEA products, or losing his job, or falling out of favor with his boss, well, then he would be free to confront that same boss and tell him what he really thinks of the operation that has supported and enriched them both. Enter Tyler Durden, who systematically turns all these conditionals into realities. In game theory terms, Jack is both a first-order and a second-order free rider because he both gains at the expense of others and knowingly allows others to gain in the same way. He carries on like this because he's more motivated by comfort and safety than he is by any assurance that he's doing right by other people.

This is where Jack being of "a generation of men raised by women" becomes important (50). Fathers and mothers tend to treat children differently. A study that functions well symbolically in this context examined the ways moms and dads tend to hold their babies in pools. Moms hold them facing themselves. Dads hold them facing away. Think of the way Bob's embrace of Jack changes between the support group and the fight club. When picked up by moms, babies' breathing and heart rates slow. Just the opposite happens when dads pick them up—they get excited. And if you inventory the types of interactions that go on between the two parents, it's easy to see why.

Not only do dads engage children in more rough-and-tumble play; they are also far more likely to encourage children to take risks. In one study, fathers who were told they'd have to observe their child climbing a slope from a distance, making any kind of rescue impossible in the event of a fall, set the slope at a much steeper angle than mothers in the same setup did.

Fight Club isn't about dominance or triumphalism or white males' reaction to losing control; it's about men learning that they can't really live if they're always playing it safe. Jack actually says at one point that winning or losing doesn't much matter. Indeed, one of the homework assignments Tyler gives everyone is to start a fight and lose. The point is to be willing to risk a fight when it's necessary—i.e., when someone attempts to exploit or seduce you based on the assumption that you'll always act according to your rational self-interest.

And the disturbing truth is that we are all lulled into hypocrisy and moral complacency by the allures of consumerism. We may not be "recall campaign coordinators" like Jack. But do we know or care where our food comes from? Do we know or care how our soap is made? Do we bother to ask why Disney movies are so devoid of the gross mechanics of life? We would do just about anything for comfort and safety. And that is precisely how material goods and material security have emasculated us. It's easy to imagine Jack's mother soothing him to sleep some night, saying, "Now, the best thing to do, dear, is to sit down and talk this out with your boss."

There are two scenes in Fight Club that I can't think of any word to describe but sublime. The first is when Jack finally confronts his boss, threatening to expose the company's practices if he is not allowed to leave with full salary. At first, his boss reasons that Jack's threat is not credible, because bringing his crimes to light would hurt Jack just as much. But the key element to what game theorists call altruistic punishment is that the punisher is willing to incur risks or costs to mete out justice. Jack, having been well-fathered, as it were, by Tyler, proceeds to engage in costly signaling of his willingness to harm himself by beating himself up, literally. In game theory terms, he's being rationally irrational, making his threat credible by demonstrating he can't be counted on to pursue his own rational self-interest. The money he gets through this maneuver goes, of course, not into anything for Jack, but into Fight Club and Project Mayhem.

The second sublime scene, and for me the best in the movie, is the one in which Jack is himself punished for his complicity in the crimes of his company. How can a guy with stitches in his face and broken teeth, a guy with a chemical burn on his hand, be punished? Fittingly, he lets Tyler get them both in a car accident. At this point, Jack is in control of his life; he's no longer emasculated. And Tyler flees.

One of the confusing things about the movie is that it has two overlapping plots. The first, which I've been exploring up to this point, centers on Jack's struggle to man up and become an altruistic punisher. The second is about the danger of violent reactions to the murder machine of consumerism. The male ethic of justice through violence can all too easily morph into fascism. And so, once Jack has created this father figure and been initiated into manhood by him, he then has to rein him in—specifically, he has to keep him from killing Marla. This second plot entails what anthropologist Christopher Boehm calls a "domination episode," in which an otherwise egalitarian group gets taken over by a despot who must then be defeated. Interestingly, only Jack knows for sure how much authority Tyler has, because Tyler seemingly undermines that authority by giving contradictory orders. But by now Jack is well schooled on how to beat Tyler—pretty much the same way he beat his boss.

It's interesting to think about possible parallels between the way Fight Club ends and what happened a couple years later on 9/11. The violent reaction to the criminal excesses of consumerism and capitalism, as it actually occurred, wasn't homegrown. And it wasn't inspired by any primal notion of manhood but by religious fanaticism. Still, in the minds of the terrorists, the attacks were certainly a punishment, and there's no denying the cost to the punishers.

Also read:
WHAT MAKES "WOLF HALL" SO GREAT?

HOW VIOLENT FICTION WORKS: ROHAN WILSON’S “THE ROVING PARTY” AND JAMES WOOD’S SANGUINARY SUBLIME FROM CONRAD TO MCCARTHY

THE ADAPTIVE APPEAL OF BAD BOYS

Dennis Junk

Bad Men

There’s a lot not to like about the AMC series Mad Men, but somehow I found the show riveting despite all of its myriad shortcomings. Critic Daniel Mendelsohn offered up a theory to explain the show’s appeal: that it simply inspires nostalgia for all our lost childhoods. As intriguing as I always find Mendelsohn’s writing, though, I don’t think his theory holds up.

Though I had mixed feelings about the first season of Mad Men, which I picked up at Half Price Books for a steal, I still found enormous appeal in the more drawn out experience of the series unfolding. Movies lately have been leaving me tragically unmoved, with those in the action category being far too noisy and preposterous and those in the drama category too brief to establish any significant emotional investment in the characters. In a series, though, especially those in the new style pioneered by The Sopranos, which eschew efforts to wrap up their plots by the end of each episode, viewers get a chance to follow characters as they develop, and the resultant investment in them makes even the most underplayed and realistic violence among them excruciatingly riveting. So, even though I found Pete Campbell, an account executive at the ad agency Sterling Cooper, the main setting for Mad Men, annoying instead of despicable, and the treatment of what we would today call sexual harassment in the office crude, self-congratulatory, and overdone, by the time I had finished watching the first season I was eager to get my hands on the second. I’ve now seen the first four seasons.

Reading up on the show on Wikipedia, I came across a few quotes from Daniel Mendelsohn’s screed against the series, “The Mad Men Account,” in the New York Review of Books, and since Mendelsohn is always fascinating even when you disagree with him, I made a point of reading his review after I’d finished the fourth season. His response was similar to mine in that he found himself engrossed in the show despite himself. There’s so much hoopla. But there’s so much wrong with the show. Allow me a longish quote:

The writing is extremely weak, the plotting haphazard and often preposterous, the characterizations shallow and sometimes incoherent; its attitude toward the past is glib and its self-positioning in the present is unattractively smug; the acting is, almost without exception, bland and sometimes amateurish.

Worst of all—in a drama with aspirations to treating social and historical ‘issues’—the show is melodramatic rather than dramatic. By this I mean that it proceeds, for the most part, like a soap opera, serially (and often unbelievably) generating, and then resolving, successive personal crises (adulteries, abortions, premarital pregnancies, interracial affairs, alcoholism and drug addiction, etc.), rather than exploring, by means of believable conflicts between personality and situation, the contemporary social and cultural phenomena it regards with such fascination: sexism, misogyny, social hypocrisy, racism, the counterculture, and so forth.

I have to say Mendelsohn is right on the mark here—though I will take issue with his categorical claims about the acting—leaving us with the question of why so many of us, me and Mendelsohn included, find the show so fascinating. Reading the review I found myself wanting to applaud at several points as it captures so precisely, and even artistically, the show’s failings. And yet these failings seem to me mild annoyances marring the otherwise profound gratification I get from watching. Mendelsohn lights on an answer for how it can be good while being so bad, one that squares the circle by turning the shortcomings into strengths.

If the characters are bland, stereotypical sixties people instead of individuals, if the issues are advertised rather than dramatized, if everyone depicted is hopelessly venal while evincing a smug, smiling commitment to decorum, well, it’s because the show’s creator, Matthew Weiner, was born in 1965, and he’s trying to recreate the world of his parents. Mendelsohn quotes Weiner:

part of the show is trying to figure out—this sounds really ineloquent—trying to figure out what is the deal with my parents. Am I them? Because you know you are…. The truth is it’s such a trope to sit around and bash your parents. I don’t want it to be like that. They are my inspiration, let’s not pretend.

Mendelsohn’s clever solution to the Mad Men puzzle is that its appeal derives from its child’s-eye view of the period during which its enthusiasts’ parents were in their ascendancy. The characters aren’t deep because children wouldn’t have the wherewithal to appreciate their depth. The issues aren’t explored in all their complexity because children are only ever vaguely aware of them. For Mendelsohn, the most important characters are the Drapers’ daughter, Sally, and the neighbor kid, Glen, who first has a crush on Don’s wife, Betty, and then falls for Sally herself. And it turns out Glen is played by Weiner’s own son.

I admit the episodes that portrayed the Drapers’ divorce struck me as poignant to the point of being slightly painful, resonating as they did with my memories of my own parents’ divorce. But that was in the ’80s, not the ’60s. And Glen is, at least for me, one of the show’s annoyances, not by any means its main appeal. His long, unblinking stares at Betty, which Mendelsohn sees as so fraught with meaning, I can’t help finding creepy. The kid makes my skin crawl, much the way Pete Campbell does. I’m forced to consider that Mendelsohn, as astute as he is about a lot of the scenes and characters, is missing something, or getting something really wrong.

In trying to account for the show’s overwhelming appeal, I think Mendelsohn is a bit too clever. I haven’t done a survey but I’d wager the results would be pretty simple: it’s Don Draper, stupid. While I agree that much of the characterization and background of the central character is overwrought and unsubtle (“meretricious,” “literally,” the reviewer jokes, assuming we all know the etymology of the word), I would suggest this only makes the question of his overwhelming attractiveness all the more fascinating. Mendelsohn finds him flat. But, at least in his review, he overlooks all the crucial scenes and instead, understandably, focuses on the lame flashbacks that supposedly explain his bad behavior.

All the characters are racist, Mendelsohn charges. But in the first scene of the first episode Don notices that the black busser clearing his table is smoking a rival brand of cigarettes—that he’s a potential new customer for his clients—and casually asks him what it would take for him to switch brands. When the manager arrives at the table to chide the busser for being so talkative, Don is as shocked as we are. I can’t recall a single scene in which Don is overtly racist.

Then there’s the relationship between Don and Peggy, which, as difficult as it is to believe for all the other characters, is never sexual. Everyone is sexist, yet in the first scene bringing together Don, Peggy, and Pete, our protagonist ends up chiding the younger man, who has been giving Peggy a fashion lesson, for being disrespectful. In season two, we see Don in an elevator with two men, one of whom is giving the raunchy details of his previous night’s conquest and doesn’t bother to pause the recounting when a woman enters. Her face registers something like terror; Don’s, unmistakable disgust. “Take your hat off,” he says to the offender, and for a brief moment you wonder if the two men are going to tear into him. Then Don reaches over, unchecked, removes the man’s hat, and shoves it into his chest, rendering both men silent for the duration of the elevator ride. I hate to be one of those critics who reflexively resort to their pet theory, but my enjoyment of the scene long preceded my realization that it entailed an act of altruistic punishment.

The opening credits say it all, as we see a silhouetted man, obviously Don, walking into an office that begins to collapse before the shot cuts to him falling through the sky against the backdrop of skyscrapers with billboards and snappy slogans. How far will Don fall? For that matter, how far will Peggy? Their experiences oddly mirror each other, and it becomes clear that while Don barks denunciations at the other members of his creative team, he often goes out of his way to mentor Peggy. He’s the one, in fact, who recognizes her potential and promotes her from a secretary to a copywriter, a move which so confounds all the other men that they conclude he must have knocked her up.

Mendelsohn is especially disappointed in Mad Men’s portrayal of, or rather its failure to portray, the plight of closeted gays. He complains that when Don witnesses Sal Romano kissing a male bellhop in a hotel on a business trip, the revelation “weirdly” “has no repercussions.” But it’s not weird at all because we experience some of Sal’s anxiety about how Don will react. On the plane home, Sal is terrified, but Don rather subtly lets him know he has nothing to worry about. Don can sympathize about having secrets. We can just imagine if one of the characters other than Don had been the one to discover Sal’s homosexuality—actually we don’t have to imagine it because it happens later.

Unlike the other characters’, Don’s vices, chief among them his philandering, are timeless (except his chain-smoking) and universal. And though we can’t forgive him for what he does to Betty (another annoying character, who, like some women I’ve dated, uses the strategy of being constantly aggrieved to trick you into being nice to her, which backfires because the suggestion that your proclivities aren’t nice actually provokes you), we can’t help hoping that he’ll find a way to redeem himself. As cheesy as they are, the scenes that have Don furrowing his brow and extemporizing on what people want and how he can turn it into a marketing strategy, along with the similar ones in which he feels the weight of his crimes against others, are my favorites. His voice has the amazing quality of being authoritative and yet at the same time signaling vulnerability. This guy should be able to get it. But he’s surrounded by vipers. His job is to lie. His identity is a lie he can’t escape. How will he preserve his humanity, his soul? Or will he? These questions, and similar ones about Peggy, are what keep me watching.

Don Draper, then, is a character from a long tradition of bad boys who give contradictory signals of their moral worth. Milton inadvertently discovered how powerful these characters are when Satan turned out to be by far the most compelling character in Paradise Lost. (Byron understood why immediately.) George Lucas made a similar discovery when Han Solo stole the show from Luke Skywalker. From Tom Sawyer to Jack Sparrow and Tony Soprano (Weiner was also a writer on that show), the fascination with these guys savvy enough to get away with being bad but sensitive and compassionate enough to feel bad about it has been taking a firm grip on audiences’ sympathies since long before Don Draper put on his hat.

A couple final notes on the show's personal appeal for me: given my interests and education, marketing and advertising would be a natural fit for me, absent my moral compunctions about deceiving people to their detriment to enrich myself. Still, it's nice to see a show focusing on the processes behind creativity. Then there's the scene in season four in which Don realizes he's in love with his secretary because she doesn't freak out when his daughter spills her milkshake. Having spent too much of my adult life around women with short fuses, and so much of my time watching Mad Men being annoyed with Betty, I laughed until I teared up.

Also read:
SYMPATHIZING WITH PSYCHOS: WHY WE WANT TO SEE ALEX ESCAPE HIS FATE AS A CLOCKWORK ORANGE

THE ADAPTIVE APPEAL OF BAD BOYS

SABBATH SAYS: PHILIP ROTH AND THE DILEMMAS OF IDEOLOGICAL CASTRATION

Dennis Junk

Review of "Building Great Sentences," a "Great Courses" Lecture Series by Brooks Landon

Brooks Landon’s course is well worth the effort and cost, but I take issue with some of his suggestions about what constitutes a great sentence. To him, greatness has to do with the structure of the language, but truly great sentences—truly great writing—get their power from their role as conveyances of meaning, i.e., the words’ connection to the real world.

You’ve probably received catalogues in the mail advertising “Great Courses.” I’ve been flipping through them for years thinking I should try a couple but have always been turned off by the price. Recently, I saw that they were on sale, and one in particular struck me as potentially worthwhile. “Building Great Sentences: Exploring the Writer’s Craft” is taught by Brooks Landon, who is listed as part of the faculty at the University of Iowa. It turns out, however, he’s not in any way affiliated with the august Creative Writing Workshop, and though he uses several example sentences from literature I’d say his primary audience is people interested in Rhetoric and Composition—and that makes the following criticisms a bit unfair. So let me first say that I enjoyed the lectures and think it well worth the money (about thirty bucks) and time (twenty-four half-hour-long lectures).

            Landon is obviously reading from a teleprompter, and he’s standing behind a lectern in what looks like Mr. Rogers’s living room decked out to look scholarly. But he manages nonetheless to be animated, enthusiastic, and engaging. He gives plenty of examples of the principles he discusses, all of which appear in text form and are easy to follow—though they do at times veer toward the eye-glazingly excessive.

            The star of the show is what Landon calls “cumulative sentences,” those long developments from initial capitalized word through a series of phrases serving as free modifiers, each building on its predecessor, focusing in, panning out, or taking it as a point of departure as the writer moves forward into unexplored territory. After watching several lectures, I went to the novel I’m working on and indeed discovered more than a few instances where I’d seen fit to let my phrases accumulate into a stylistic flourish. The catch is that these instances were distantly placed from one another. Moving from my own work to some stories in the Summer Fiction Issue of The New Yorker, I found the same trend. The vast majority of sentences follow Strunk and White’s dictum to be simple and direct, a point Landon acknowledges. Still, for style and rhetorical impact, the long sentences Landon describes are certainly effective.

            Landon and I part ways, though, when it comes to “acrobatic” sentences which “draw attention to themselves.” Giving William Gass a high seat in his pantheon of literary luminaries, Landon explains that “Gass always sees language as a subject every bit as interesting and important as is the referential world his language points to, invokes, or stands for.” While this poststructuralist sentiment seems hard to object to, it misses the point of what language does and how it works. Sentences can call attention to themselves for performing their functions well, but calling attention to themselves should never be one of their functions.

            Writers like Gass and Pynchon and Wallace fail in their quixotic undertakings precisely because they perform too many acrobatics. While it is true that many readers, particularly those who appreciate literary as opposed to popular fiction—yes, there is a difference—are attuned to the pleasures of language, luxuriating in precise and lyrical writing, there’s something perverse about fixating on sentences to the exclusion of things like character. Great words in great sentences incorporating great images and suggestive comparisons can make the world in which a story takes place come alive—so much so that the life of the story escapes the page and transforms the way readers see the world beyond it. But the prompt for us to keep reading is not the promise of more transformative language; it’s the anticipation of transforming characters. Great sentences in literature owe their greatness to the moments of inspiration, from tiny observation to earth-shattering epiphany, experienced by the people at the heart of the story. Their transformations become our transformations. And literary language may seem to derive whatever greatness it achieves from precision and lyricism, but at a more fundamental level of analysis the writing must be precise and lyrical in its detailing of the thoughts and observations of the characters readers seek to connect with. This takes us to a set of considerations that transcend the workings of any given sentence.

            Landon devotes an entire lecture to the rhythm of prose, acknowledging it must be thought of differently from meter in poetry, but failing to arrive at an adequate, objective definition. I wondered all the while why we speak about rhythm at all when we’re discussing passages that don’t follow one. Maybe the rhythm is variable. Maybe it’s somehow progressive and evolving. Or maybe we should simply find a better word to describe this inscrutable quality of impactful and engaging sentences. I propose grace. True, a singer demonstrates grace by adhering to a precisely measured series of vocal steps. Noting a similar type of grace in writing, we’re tempted to hear it as rhythmical, even though its steps are in no way measured. Grace is that quality of action that leaves audiences with an overwhelming sense of its having been well-planned and deftly executed, well-planned because its deft execution appeared so effortless—but with an element of surprise just salient enough to suggest spontaneity. Grace is a delicate balance between the choreographed and the extemporized.

            Grace in writing is achieved insofar as the sequential parts—words, phrases, clauses, sentences, paragraphs, sections, chapters—meet the demands of their surroundings, following one another seamlessly and coherently, performing the function of conveying meaning, in this case of connecting the narrator’s thoughts and experiences to the reader. A passage will strike us as particularly graceful when it conveys a great deal of meaning in a seemingly short chain of words, a feat frequently accomplished with analogies (a point on which Landon is eloquent), or when it conveys a complex idea or set of impressions in a way that’s easily comprehended. I suspect Landon would agree with my definition of grace. But his focus on lyrical or graceful sentences, as opposed to sympathetic or engaging characters—or any of the other aspects of literary writing—precludes him from lighting on the idea that grace can be strategically laid aside for the sake of more immediate connections with the people and events of the story, connections functioning in real-time as the reader’s eyes take in the page.

            Sentences in literature like to function mimetically, though this observation goes unmentioned in the lectures. Landon cites the beautifully graceful line from Gatsby,

Slenderly, languidly, their hands set lightly on their hips, the two young women preceded us out onto a rosy-colored porch, open toward the sunset, where four candles flickered on the table in the diminished wind (16).

The multiple L’s roll out at a slow pace, mimicking the women and the scene being described. This is indeed a great sentence. But so too is the later sentence in which Nick Carraway recalls being chagrined upon discovering the man he’s been talking to about Gatsby is in fact Gatsby himself.

Nick describes how Gatsby tried to reassure him: “He smiled understandingly—much more than understandingly.” The first notable thing about this sentence is that it stutters. Even though Nick is remembering the scene at a more comfortable future time, he re-experiences his embarrassment, and readers can’t help but sympathize. The second thing to note is that this one sentence, despite serving as a crucial step in the development of Nick’s response to meeting Gatsby and forming an impression of him, is just that, a step. The rest of the remarkable passage comes in the following sentences:

It was one of those rare smiles with a quality of eternal reassurance in it, that you may come across four or five times in life. It faced—or seemed to face—the whole external world for an instant, and then concentrated on you with an irresistible prejudice in your favor. It understood you just so far as you wanted to be understood, believed in you as you would like to believe in yourself and assured you that it had precisely the impression of you that, at your best, you hoped to convey. Precisely at that point it vanished—and I was looking at an elegant young rough-neck, a year or two over thirty, whose elaborate formality of speech just missed being absurd. Some time before he introduced himself I’d got a strong impression that he was picking his words with care (52-3).

            Beginning with a solecism (“reassurance in it, that…”) that suggests Nick’s struggle to settle on the right description, moving on to another stutter (or seemed to face) which indicates his skepticism creeping in beside his appreciation of the regard, the passage then moves into one of those cumulative passages Landon so appreciates. But then there’s the jarring incongruity of the smile’s vanishing. This is, as far as I can remember, the line that sold me on the book when I first read it. You can really feel Nick’s confusion and astonishment. And the effect is brought about by sentences, an irreducible sequence of them, that are markedly ungraceful. (Dashes are wonderful for those break-ins so suggestive of spontaneity and advance in real time.)

Also read:

POSTSTRUCTURALISM: BANAL WHEN IT'S NOT BUSY BEING ABSURD

WHAT TO LEAVE OUT: MINIMALISM AND THE HEMINGWAY MYSTIQUE

Dennis Junk

Kayaking on a Wormhole

Being in a kayak on the creek, away from civilization even while you’re smack in the middle of it, works some tricky magic on your sense of time. This is my remembrance and reflection on one particular trip with a friend.

We’d been on the water for quite a while, neither of us at all sure just how much longer we’d be on it before reaching the Hursh Road Bridge, cattycorner to which, in a poorly tended gravel parking lot marking the head of a trail through a nature preserve, I’d parked Kevin's work truck before transferring vehicles to ride with him in his wife’s truck, kayaks strapped to the roof, to Cook’s Landing, another park situated in the shadow of a bridge, this one for Coldwater Road just north of Shoaf on the way to Garret. After maybe an hour of paddling and floating, it occurred to me to start attending to our banter and assessing how faithfully some of the dialogue between friends in my stories mimicked it.

            “What the hell kind of bird is that?”

            “Probably an early bird.”

            “Probably a dirty bird.”

            “Oh yeah, it’s filthy.”

            At one point, after posing a series of questions about what I’d rather have fall on me from the trees—a cricket or a spider?; a spider or a centipede?; a spider or a snake?—followed by the question of what I’d do if I saw a giant snake someone had let loose slither into the water after me, he began a story about a herpetologist in Brazil: “Did you hear what happened?” Of course, I hadn’t heard; the story was from a show on cable about anacondas. This hundred-and-fifty-pound woman was walking through the marshes, tracking a snake, which turned out to be about twenty-eight feet long and five hundred pounds, to study it.

            “She’s following the track it left in the tall grass, and then she senses that there’s something watching her. When she turns around, she sees that it’s reared up”—he held up his arm with his fist bent forward—“so it’s just looking at her at eye level.”

            “Did it say, ‘Who da fuck is you?’”

            “Snakes don’t generally creep me out, but I don’t like the idea of it, like, following her and rearing up like that.”

            “Yeah, I’ve never heard of an anaconda doing that. You hear of cobras doing it. Did she say, ‘Dere’s snakes out here dis big!?’”—my impression of Ice Cube in the movie Anaconda.

            When I asked what she did, he said he didn’t remember. The snake had attacked her, lunging at her face, but she must’ve escaped somehow because she was being interviewed for the show. At a couple points in his recounting of the story, I thought how silly it was. For one thing, it’s impossible to sense something watching you. For another, her being a herpetologist doesn’t rule out the possibility that she was embellishing. And yet I couldn’t help picturing the encounter, vividly, as I paddled my kayak.

            The story was oddly appropriate. Every time we put in on Cedar Creek and make it some distance from the roads, we get the sense that we’re closer to the jungle than we are to civilization. Kevin knows snakes are my fear totem. At times, scenes from the Paul Bowles collection The Delicate Prey, or from Heart of Darkness, or even from Huckleberry Finn would drift into my mind. We did briefly discuss some paleoanthropology—recent discoveries in Dmanisi, Georgia suggesting a possible origin of Homo erectus in Western Asia rather than Africa—but, for the most part, for the duration of our sojourn on the river, we may as well have been two prepubescent boys. Compared to the way we were talking, the dialogue between friends in my stories is far too sophisticated.

            But that’s really not how we normally talk. As our time on the water accrued long past our upper estimates, and as the fallen-tree-strewn stretches got more and more tricky to traverse, that sense of being far from civilization, far from our lives, our adult lives, became ever more profound. The gnats and mosquitoes and splendidly black dragonflies, their wings tipped with blue, swarmed us whenever we lolled in the shade, getting more bold as more of our bug spray got washed away. We talked about all the ways we’d heard of that Native Americans and other indigenous peoples avoided bug bites and poison plants. The shores were lousy with poison ivy. It was easy to lose your identity. We could’ve been any two guys in the world, at any time in history. The phones were locked away in a waterproof box. The kayaks could’ve been made of anything; the plastic was adventitious. Out here, with the old-growth trees and the ghostly shadows of quickly glimpsed fish, it was the big concrete bridges and the exiguous houses and yards backing up to the creek that seemed impermanent, unreal. Even the human trash washing into the leafy and wooden detritus gathering against the smoothed-over bark of collapsed trees was being dulled and stripped of all signs of cleverness.

            “Do you ever have déjà vu?” Kevin asked after we’d passed all the expected landmarks and gotten over our astonishment at how drastically we’d underestimated the length of the journey down the creek. “Because the first time we kayaked here, I was completely sure I’d had a dream about it—but I had the dream before I’d ever been here.”

            “I think it’s something about the river,” I said, recalling several instances that day when I had experienced an emptying of mind, something I’ve often strived for while meditating but seldom even come close to achieving. You find yourself being carried downstream, lulled, quieted, your gaze focused on the busy motion of countless tiny bugs on a swatch of surface gilt with sunlight. Their coordinated pattern dazzles you. It’s the only thing remotely resembling a thought. “There’s something about the motion of the water and the way it has you slowly moving along. I keep laying back and watching the undersides of the leaves move over me. It puts you in a trance. It’s hypnotic.”

            “You’re right. It is, like, mesmerizing.” He knew what I was talking about, but we kept shuffling through our stock of words because none of them seemed to get it quite right.

            So there we were, a couple of nameless, ageless guys floating down the river, leaning back to watch the trees slide away upstream, soft white clouds in a soft blue sky, riotous distant stars shattering the immense dark of some timeless night.

            “I forgive you, river, for making me drag my boat through all those nettles.”

            “I’ll reserve my forgiveness until I find out if I have poison ivy.”

Also read:

THE GHOST HAUNTING 710 CROWDER COURT

GRACIE - INVISIBLE FENCES

PERCY FAWCETT’S 2 LOST CITIES

Dennis Junk

Taking the GRE again after 10 Years

I aced the verbal reasoning section of the GRE the first time I took it. It ended up being, not the worst thing that ever happened to me, but… distracting. Eleven years later, I had to take the test again to start trying to make my way back into school. How could I compete with my earlier perfection?

            I had it timed: if I went to the bathroom at 8:25, I’d be finishing up the essay portion of the test about ten minutes after my bladder was full again. Caffeine being essential for me to get into the proper state of mind for writing, I’d woken up to three cans of Diet Mountain Dew and two and a half rather large cups of coffee. I knew I might not get called in to take the test precisely at 8:30, but I figured I could handle the pressure, as it were. The clock in the office of the test center read 8:45 when I walked in. Paperwork, signatures, getting a picture taken, turning out all my pockets (where I managed to keep my three talismans concealed)—by the time I was sitting down in the carrel—in a room that might serve as a meeting place for prisoners and their lawyers—it was after 9:00. And there were still more preliminaries to go through.

            Test takers are allotted 45 minutes for an essay on the “Issue Topic” prompted by a short quote. The “Analysis of an Argument” essay takes a half hour. The need to piss got urgent with about ten minutes left on the clock for the issue essay. By the end of the second essay, I was squirming and dancing and pretty desperate. Of course, I had to wait for our warden to let me out of the testing room. And then I had to halt midway through the office to come back and sign myself out. Standing at the urinal—and standing and standing—I had plenty of time to consider how poorly designed my strategy had been. I won’t find out my scores for the essay portion for ten or so days.

**********************************

            I’ve been searching my apartment for the letter with my official scores from the first time I took the GRE about ten years ago. I’d taken it near the end of the summer, at one of those times in life of great intellectual awakening. With bachelor’s degrees in both anthropology and psychology, and with only the most inchoate glimmerings of a few possible plans for the future, I lived in my dad’s enormous house with some roommates and my oldest brother, who had returned after graduating from Notre Dame and was now taking graduate courses at IPFW, my alma mater. I delivered pizzas in the convertible Mustang I bought as a sort of hand-me-down from that same brother. And I spent hours every day reading.

            I’m curious about the specific date of the test because it would allow me to place it in the context of what I was reading. It would also help me ascertain the amount of time I spent preparing. If memory serves, I was doing things like poring over various books by Stephen Jay Gould and Richard Dawkins, trying to decide which one of them knew the real skinny on how evolution works. I think by then I’d read Frank Sulloway’s Born to Rebel, in which he applied complex statistics to data culled from historical samples and concluded that later-born siblings tend to be less conscientious but more open to new ideas and experiences. I was delighted to hear that the former president had read Jared Diamond’s Guns, Germs, and Steel, and thought it tragically unimaginable that the current president would ever read anything like that. At some point, I began circling words I didn’t recognize or couldn’t define so when I was finished with the chapter I could look them up and make a few flashcards.

            I’m not even sure the flashcards were in anticipation of the GRE. Several of my classmates in both the anthropology and psychology departments had spoken to me by then of their dejection upon receiving their scores. I was scared to take it. The trend seemed to be that everyone was getting about a hundred points less on this test than they did on the SAT. I decided I only really cared about the verbal reasoning section, and a 620 on that really wasn’t acceptable. Beyond the flashcards, I got my hands on a Kaplan CD-ROM from a guy at school and started doing all the practice tests on it. The scores it gave me hovered in the mid-600s. It also gave me scads of unfamiliar words (like scad) to put in my stack of flashcards, which grew, ridiculously, to the height of about a foot.

            I don’t remember much about the test itself. It was at a Sylvan Learning Center that closed a while back. One of the reading comprehension excerpts was on chimpanzees, which I saw as a good sign. When I was done, there was a screen giving me a chance to admit I cheated. It struck me as odd. Then came the screen with my scores—800 verbal reasoning. I looked around the room and saw nothing but the backs of silent test-takers. Could this be right? I never ace anything. It sank in when I was sitting down in the Mustang. Driving home on I-69, I sang along to “The Crush” by Dave Matthews, elated.

            I got accepted into MIT’s program in science writing based on that score and a writing sample in which I defended Frank Sulloway’s birth order theory against Judith Rich Harris, the author of The Nurture Assumption, another great book. But Harris’s arguments struck me as petty and somewhat disgraceful. She was engaging in something akin to a political campaign against a competing theory, rather than making a good faith effort to discover the truth. Anyway, the article I wrote got long and unwieldy. Michael Shermer considered it for publication in Skeptic but ultimately declined because I just didn’t have my chops up when it came to writing about science. By then, I was a writer of fiction.

            That’s why, upon discovering how expensive a year in Cambridge would be and how little financial aid I’d be getting, I declined MIT's invitation to attend their program. If being a science writer was my dream, I’d have gone. But I decided to hold out for an acceptance to an MFA program in creative writing. I’d already applied two years in a row before stretching my net to include science writing. But the year I got accepted at MIT ended up being the third year of summary rejection on the fiction front. I had one more year before that perfect GRE score expired.

**********

            Year four went the same way all the other years had gone. I was in my late twenties now and had the feeling whatever opportunities were once open to me had slipped away. Next came a crazy job at a restaurant—Lucky’s—and a tumultuous relationship with the kitchen manager. After I had to move out of the apartment I shared with her in the wake of our second breakup (there would be a third), I was in a pretty bad place. But I made the smartest decision I’d made in a while and went back to school to get my master’s in English at IPFW.

            The plan was to improve my qualifications for creative writing programs. And now that I’m nearly finished with the program, I put re-taking the GRE at the top of my list of things to do this summer. In the middle of May, I registered to take it on June 22nd. I’d been dreading it ever since my original score expired, but now I was really worried. What would it mean if I didn’t get an 800 again? What if I got significantly lower than that? The MFA programs I’ll be applying to are insanely competitive: between five hundred and a thousand applicants for less than a dozen spaces. At the same time, though, there was a sense that a lower score would serve as this perfect symbol for just how far I’d let my life go off-track.

            Without much conscious awareness of what I was doing, I started playing out a Rocky narrative, or some story like Muhammad Ali making his comeback after losing his boxing license for refusing to serve in Vietnam. I would prove I wasn’t a has-been, that whatever meager accomplishments I had under my belt weren’t flukes. Last semester I wrote a paper on how to practice to be creative, and one of the books I read for it was K. Anders Ericsson’s The Road to Excellence. So, after signing up for the test, I created a regimen of what Ericsson calls “deliberate practice,” based on anticipation and immediate feedback. I got my hands on as many sample items and sample tests as I could find. I made little flashcards with the correct answers on them to make the feedback as close as possible to the hazarded answer. I put hours and hours into it. And I came up with a strategy for each section, and for every possible contingency I could think of. I was going to beat the GRE, again, through sheer force of will.

***********

            The order of the sections is variable. Ideally, the verbal section would have come right after the essay section so I wouldn’t have to budget my stores of concentration. But sitting down again after relieving my bladder I saw the quantitative section appear before me on the screen. Oh well, I planned for this too, I thought. I adhered pretty well to my strategy of working for a certain length of time to see if I could get the answer and then guessing if it didn’t look promising. And I achieved my goal for this section by not embarrassing myself. I got a 650.

            The trouble began almost immediately when the verbal questions started coming. The strategy for doing analogies, the questions I most often missed in practice, was to work out the connection between the top words, “the bridge,” before considering the five word pairs below to see which one has the same bridge. But because the screen was so large, and because I was still jittery from the caffeine, I couldn’t read the first word pair without seeing all the others. I abandoned the strategy with the first question.

            Then disaster struck. I’d anticipated only two sets of reading comprehension questions, but then, with the five-minute warning already having passed, another impossibly long blurb appeared. I resigned myself at that point to having to give up my perfect score. I said to myself, “Just read it quick and give the best answers you can.” I finished the section with about twenty seconds left. At least all the antonyms had been easy. Next came an experimental section I agreed to take since I didn’t need to worry about flagging concentration anymore. For the entire eighteen minutes it took, I sat there feeling completely defeated. I doubt my answers for that section will be of much use.

            Finally, I was asked if I wanted to abandon my scores—a ploy, I’m sure, to get skittish people to pay to take the test twice. I said no, and clicked to see and record my scores. There it was at the top of the screen, my 800. I’d visualized the moment several times. I was to raise one arm in victory—but I couldn’t because the warden would just think I was raising my hand to signal I needed something. I also couldn’t because I didn’t feel victorious. I still felt defeated. I was sure all the preparation I’d done had been completely pointless. I hadn’t boxed. I’d clenched my jaw, bunched up my fist, and brawled.

            I listened to “The Crush” on the way home again, but as I detoured around all the construction downtown I wasn’t in a celebratory mood. I wasn’t elated. I was disturbed. The experience hadn’t been at all like a Rocky movie. It was a lot more like Gattaca. I’d come in, had my finger pricked so they could read my DNA, and had the verdict delivered to me. Any score could have come up on the screen. I had no control over it. That it turned out to be the one I was after was just an accident. A fluke.

**************

            The week before I took the test, I’d met a woman at Columbia Street who used to teach seventh graders. After telling her I taught Intro Comp at IPFW, we discussed how teaching is a process of translation from how you understand something into a language that will allow others who lack your experience and knowledge to understand it. Then you have to add some element of entertainment so you don’t lose their attention. The younger the students, the more patience it takes to teach them. Beginning when I was an undergrad working in the Writing Center, but really picking up pace as I got more and more experience as a TA, the delight I used to feel in regard to my own cleverness was being superseded by the nagging doubt that I could ever pass along the method behind it to anyone.

            When you’re young (or conservative), it’s easy to look at people who don’t do as well as you with disdain, as if it’s a moral failing on their part. You hold the conviction deep in your gut that if they merely did what you’ve done they’d have what you have or know what you know. Teaching disabuses you of this conviction (which might be why so many teachers are liberal). How many times did I sit with a sharp kid in the writing center trying to explain some element of college writing to him or her, trying to think back to how I had figured it out, and realizing either that I’d simply understood it without much effort or arrived at an understanding through a process that had already failed this kid? You might expect such a realization would make someone feel really brilliant. But in fact it’s humbling. You wonder how many things there are, fascinating things, important things, that despite your own best effort you’ll never really get. Someone, for instance, probably “just gets” how to relay complex information to freshman writers—just gets teaching.

            And if, despite your efforts, you’re simply accorded a faculty for perceiving this or understanding that, if you ever lose it your prospects for recreating the same magic are dismal. What can be given can be taken away. Finally, there’s the question of desert. That I can score an 800 on the verbal reasoning section of the GRE is not tied to my effort or to my will. I like to read, always have. It’s not work to me. My proficiency is morally arbitrary. And yet everyone will say about my accomplishments and accolades, “You deserve it.”

            Really, though, this unsettled feeling notwithstanding, this is some stupid shit to complain about. I aced the GRE—again. It’s time to celebrate.

Also read:

GRACIE - INVISIBLE FENCES

SECRET DANCERS

THE GHOST HAUNTING 710 CROWDER COURT

KAYAKING ON A WORMHOLE

Dennis Junk

Art as Altruism: Lily Briscoe and the Ghost of Mrs. Ramsay in To the Lighthouse, Part 1 of 2

Woolf’s struggle with her mother, and its manifestation as Lily’s struggle with Mrs. Ramsay, represents a sort of trial in which the younger living woman defends herself against a charge of selfishness leveled by her deceased elder. And since Woolf’s obsession with her mother ceased upon completion of the novel, she must have been satisfied that she had successfully exonerated herself.

Virginia Woolf underwent a transformation in the process of writing To the Lighthouse, the nature of which has been the subject of much scholarly inquiry. At the center of the novel is the relationship between the beautiful, self-sacrificing, and yet officious Mrs. Ramsay, and the retiring, introverted artist Lily Briscoe. “I wrote the book very quickly,” Woolf recalls in “Sketch of the Past,” “and when it was written, I ceased to be obsessed by my mother. I no longer hear her voice; I do not see her.” Quoting these lines, biographer Hermione Lee suggests the novel is all about Woolf’s parents, “a way of pacifying their ghosts” (476). But how exactly did writing the novel function to end Woolf’s obsession with her mother? And, for that matter, why would she, at forty-four, still be obsessed with a woman who had died when she was only thirteen? Evolutionary psychologist Jesse Bering suggests that while humans are uniquely capable of imagining the inner workings of each other’s minds, the cognitive mechanisms underlying this capacity, which psychologists call “theory of mind,” simply fail to comprehend the utter extinction of those other minds. However, the lingering presence of the dead is not merely a byproduct of humans’ need to understand and communicate with other living humans. Bering argues that the watchful gaze of disembodied minds—real or imagined—serves a type of police function, ensuring that otherwise selfish and sneaky individuals cooperate and play by the rules of society. From this perspective, Woolf’s struggle with her mother, and its manifestation as Lily’s struggle with Mrs. Ramsay, represents a sort of trial in which the younger living woman defends herself against a charge of selfishness leveled by her deceased elder. And since Woolf’s obsession with her mother ceased upon completion of the novel, she must have been satisfied that she had successfully exonerated herself.

Woolf made no secret of the fact that Mr. and Mrs. Ramsay were fictionalized versions of her own parents, and most critics see Lily as a stand-in for the author—even though she is merely a friend of the Ramsay family. These complex relationships between author and character, and between daughter and parents, lie at the heart of a dynamic which readily lends itself to psychoanalytic explorations. Jane Lilienfeld, for instance, suggests Woolf created Lily as a proxy to help her accept her parents, both long dead by the time she began writing, “as monumental but flawed human beings,” whom she both adored and detested. Having reduced the grand, archetypal Mrs. Ramsay to her proper human dimensions, Lily is free to acknowledge her own “validity as a single woman, as an artist whose power comes not from manipulating others’ lives in order to fulfill herself, but one whose mature vision encapsulates and transcends reality” (372). But for all the elaborate dealings with mythical and mysterious psychic forces, the theories of Freud and Jung explain very little about why writers write and why readers read. And they explain very little about how people relate to the dead, or about what role the dead play in narrative. Freud may have been right about humans’ intense ambivalence toward their parents, but why should this tension persist long after those parents have ceased to exist? And Jung may have been correct in his detection of mythic resonances in his patients’ dreams, but what accounts for such universal narrative patterns? What do they explain?

Looking at narrative from the perspective of modern evolutionary biology offers several important insights into why people devote so much time and energy to, and get so much gratification from, immersing themselves in the plights and dealings of fictional characters. Anthropologists believe the primary concern for our species at the time of its origin was the threat of rival tribes vying for control of limited resources. The legacy of this threat is the persistent proclivity for tribal—us versus them—thinking among modern humans. But alongside our penchant for dehumanizing members of out-groups arose a set of mechanisms designed to encourage—and when necessary to enforce—in-group cooperation for the sake of out-competing less cohesive tribes. Evolutionary literary theorist William Flesch sees in narrative a play of these cooperation-enhancing mechanisms. He writes, “our capacity for narrative developed as a way for us to keep track of cooperators” (67), and he goes on to suggest we tend to align ourselves with those we perceive as especially cooperative or altruistic while feeling an intense desire to see those who demonstrate selfishness get their comeuppance. This is because “altruism could not sustain an evolutionarily stable system without the contribution of altruistic punishers to punish the free-riders who would flourish in a population of purely benevolent altruists” (66). Flesch cites the findings of numerous experiments which demonstrate people’s willingness to punish those they see as exploiting unspoken social compacts and implicit rules of fair dealing, even when meting out that punishment involves costs or risks to the punisher (31-34). Child psychologist Karen Wynn has found that even infants too young to speak prefer to play with puppets or blocks with crude plastic eyes that have in some way demonstrated their altruism over the ones they have seen behaving selfishly or aggressively (557-560). Such experiments lead Flesch to posit a social monitoring and volunteered affect theory of narrative interest, whereby humans track the behavior of others, even fictional others, in order to assess their propensity for altruism or selfishness and are anxious to see that the altruistic are vindicated while the selfish are punished. In responding thus to other people’s behavior, whether they are fictional or real, the individual signals his or her own propensity for second- or third-order altruism.

The plot of To the Lighthouse is unlike anything else in literature, and yet a great deal of information is provided regarding the relative cooperativeness of each of the characters. Foremost among them in her compassion for others is Mrs. Ramsay. While it is true from the perspective of her own genetic interests that her heroic devotion to her husband and their eight children can be considered selfish, she nonetheless extends her care beyond the sphere of her family. She even concerns herself with the tribulations of complete strangers, something readers discover early in the novel, as

she ruminated the other problem, of rich and poor, and the things she saw with her own eyes… when she visited this widow, or that struggling wife in person with a bag on her arm, and a note-book and pencil with which she wrote down in columns carefully ruled for the purpose wages and spendings, employment and unemployment, in the hope that thus she would cease to be a private woman whose charity was half a sop to her own indignation, half relief to her own curiosity, and become what with her untrained mind she greatly admired, an investigator, elucidating the social problem. (9)

No sooner does she finish reflecting on this social problem than she catches sight of her husband’s friend Charles Tansley, who is feeling bored and “out of things,” because no one staying at the Ramsays’ summer house likes him. Regardless of the topic Tansley discusses with them, “until he had turned the whole thing around and made it somehow reflect himself and disparage them—he was not satisfied” (8). And yet Mrs. Ramsay feels compelled to invite him along on an errand so that he does not have to be alone. Before leaving the premises, though, she has to ask yet another houseguest, Augustus Carmichael, “if he wanted anything” (10). She shows this type of exquisite sensitivity to others’ feelings and states of mind throughout the first section of the novel.

Mrs. Ramsay’s feelings about Lily, another houseguest, are at once dismissive and solicitous. Readers are introduced to Lily only through Mrs. Ramsay’s sudden realization, after prolonged absentmindedness, that she is supposed to be holding still so Lily can paint her. Mrs. Ramsay’s son James, who is sitting with her as he cuts pictures out of a catalogue, makes a strange noise she worries might embarrass him. She turns to see if anyone has heard: “Only Lily Briscoe, she was glad to find; and that did not matter.” Mrs. Ramsay is doing Lily the favor of posing, but the gesture goes no further than mere politeness. Still, there is a quality the younger woman possesses that she admires. “With her little Chinese eyes,” Mrs. Ramsay thinks, “and her puckered-up face, she would never marry; one could not take her painting very seriously; she was an independent little creature, and Mrs. Ramsay liked her for it” (17). Lily’s feelings toward her hostess, on the other hand, though based on a similar recognition that the other enjoys aspects of life utterly foreign to her, are much more intense. At one point early in the novel, Lily wonders, “what could one say to her?” The answer she hazards is “I’m in love with you?” But she decides that is not true and settles on, “‘I’m in love with this all,’ waving her hand at the hedge, at the house, at the children” (19). What Lily loves, and what she tries to capture in her painting, is the essence of the family life Mrs. Ramsay represents, the life Lily herself has rejected in pursuit of her art. It must be noted too that, though Mrs. Ramsay is not related to Lily, Lily has only an elderly father, and so some of the appeal of the large, intact Ramsay family to Lily is the fact that she has long been without a mother.

Apart from admiring in the other what each lacks herself, the two women have little in common. The tension between them derives from Lily’s having resigned herself to life without a husband, life in the service of her art and caring for her father, while Mrs. Ramsay simply cannot imagine how any woman could be content without a family. Underlying this conviction is Mrs. Ramsay’s unique view of men and her relationship to them:

Indeed, she had the whole of the other sex under her protection; for reasons she could not explain, for their chivalry and valour, for the fact that they negotiated treaties, ruled India, controlled finance; finally for an attitude towards herself which no woman could fail to feel or to find agreeable, something trustful, childlike, reverential; which an old woman could take from a young man without loss of dignity, and woe betide the girl—pray Heaven it was none of her daughters!—who did not feel the worth of it, and all that it implied, to the marrow of her bones! (6)

In other words, woe betide Lily Briscoe. Anthropologists Peter Richerson and Robert Boyd, whose work on the evolution of cooperation in humans provides the foundation for Flesch’s theory of narrative, put forth the idea that culture functions simultaneously to maintain group cohesion and to help the group adapt to whatever environment it inhabits. “Human cultures,” they point out, “can change even more quickly than the most rapid examples of genetic evolution by natural selection” (43). What underlies the divergence of views about women’s roles between the two women in Woolf’s novel is that their culture is undergoing major transformations owing to political and economic upheaval in the lead-up to the First World War.

Lily has no long-established tradition of women artists in which to find solace and guidance; rather, the most salient model of womanhood is the family-minded, self-sacrificing Mrs. Ramsay. It is therefore to Mrs. Ramsay that Lily must justify her attempt at establishing a new tradition. She reads the older woman as making the implicit claim that “an unmarried woman has missed the best of life.” In response, Lily imagines how

gathering a desperate courage she would urge her own exemption from the universal law; plead for it; she liked to be alone; she liked to be herself; she was not made for that; and so have to meet a serious stare from eyes of unparalleled depth, and confront Mrs. Ramsay’s simple certainty… that her dear Lily, her little Brisk, was a fool. (50)

Living alone, being herself, and refusing to give up her time or her being to any husband or children strikes even Lily herself as both selfish and illegitimate, lacking cultural sanction and therefore doubly selfish. Trying to figure out the basis of her attraction to Mrs. Ramsay, beyond her obvious beauty, Lily asks herself, “did she lock up within her some secret which certainly Lily Briscoe believed people must have for the world to go on at all? Every one could not be as helter skelter, hand to mouth as she was” (50). Lily’s dilemma is that she can either be herself, or she can be a member of a family, because being a member of a family means she cannot be wholly herself; like Mrs. Ramsay, she would have to make compromises, and her art would cease to have any more significance than the older woman’s note-book with all its writing devoted to social problems. But she must justify devoting her life only to herself. Meanwhile, she’s desperate for some form of human connection beyond the casual greetings and formal exchanges that take place under the Ramsays’ roof.

Lily expresses a desire not just for knowledge from Mrs. Ramsay but for actual unity with her because what she needs is “nothing that could be written in any language known to men.” She wants to be intimate with the “knowledge and wisdom… stored up in Mrs. Ramsay’s heart,” not any factual information that could be channeled through print. The metaphor Lily uses for her struggle is particularly striking for anyone who studies human evolution.

How then, she had asked herself, did one know one thing or another thing about people, sealed as they were? Only like a bee, drawn by some sweetness or sharpness in the air intangible to touch or taste, one haunted the dome-shaped hive, ranged the wastes of the air over the countries of the world alone, and then haunted the hives with their murmurs and their stirrings; the hives, which were people. (51)

According to evolutionary biologist David Sloan Wilson, bees are one of only about fifteen species of social insect that have crossed the “Cooperation Divide,” beyond which natural selection at the level of the group supersedes selection at the level of the individual. “Social insect colonies qualify as organisms,” Wilson writes, “not because they are physically bounded but because their members coordinate their activities in organ-like fashion to perpetuate the whole” (144). The main element that separates humans from their ancestors and other primates, he argues, “is that we are evolution’s newest transition from groups of organisms to groups as organisms. Our social groups are the primate equivalent of bodies and beehives” (154). The secret locked away from Lily in Mrs. Ramsay’s heart, the essence of the Ramsay family that she loves so intensely and feels compelled to capture in her painting, is that human individuals are adapted to life in groups of other humans who together represent a type of unitary body. In trying to live by herself and for herself, Lily is going not only against the cultural traditions of the previous generation but even against her own nature.

Part 2.

Dennis Junk

How to Read Stories--You're probably doing it wrong

Your efforts to place each part into the context of the whole will, over time, as you read more stories, give you a finer appreciation for the strategies writers use to construct their work, one scene or one section at a time. And as you try to anticipate the parts to come from the parts you’ve read you will be training your mind to notice patterns, laying down templates for how to accomplish the types of effects—surprise, emotional resonance, lyricism, profundity—the author has accomplished.

There are whole books out there about how to read like a professor or a writer, or how to speed-read and still remember every word. For the most part, you can discard all of them. Studies have shown speed readers are frauds—the faster they read the less they comprehend and remember. The professors suggest applying the wacky theories they use to write their scholarly articles, theories which serve to cast readers out of the story into some abstract realm of symbols, psychological forces, or politics. I find the endeavor offensive.

Writers writing about how to read like a writer are operating in good faith. They just tend to be a bit deluded. Literature is very much like a magic trick, but of course it’s not real magic. They like to encourage people to stand in awe of great works and great passages—something I frankly don’t need any encouragement to do (what is it about the end of “Mr. Sammler’s Planet”?). But to get to those mystical passages you have to read a lot of workaday prose, even in the work of the most lyrical and crafty writers. Awe simply can’t be used as a reading strategy.

Good fiction is like a magic trick because it’s constructed of small parts that our minds can’t help responding to holistically. We read a few lines and all of a sudden we have a person in mind; after a few pages we find ourselves caring about what happens to this person. Writers often avoid talking about the trick and the methods and strategies that go into it because they’re afraid once the mystery is gone the trick will cease to convince. But even good magicians will tell you well-performed routines frequently astonish even the one performing them. Focusing on the parts does not diminish appreciation for the whole.

The way to read a piece of fiction is to use the information you've already read in order to anticipate what will happen next. Most contemporary stories are divided into several sections, which offer readers the opportunity to pause after each, reflecting on how it may fit into the whole of the work. The author had a purpose in including each section: furthering the plot, revealing the character’s personality, developing a theme, or playing with perspective. Practice posing these questions to yourself at the end of each section: What has the author just done, and what does it suggest she’ll likely do in the sections to come?

In the early sections, questions will probably be general: What type of story is this? What type of characters are these? But by the time you reach about the two-thirds point they will be much more specific: What’s the author going to do with this character? How is this tension going to be resolved? Efforts to classify and anticipate the elements of the story will, if nothing else, lead to greater engagement with it. Every new character should be memorized—even if doing so requires a mnemonic (practice coming up with one on the fly).

The larger goal, though, is a better understanding of how the type of fiction you read works. Your efforts to place each part into the context of the whole will, over time, as you read more stories, give you a finer appreciation for the strategies writers use to construct their work, one scene or one section at a time. And as you try to anticipate the parts to come from the parts you’ve read you will be training your mind to notice patterns, laying down templates for how to accomplish the types of effects—surprise, emotional resonance, lyricism, profundity—the author has accomplished.

By trying to get ahead of the author, as it were, you won’t be learning to simply reproduce the same effects. By internalizing the strategies, making them automatic, you’ll be freeing up your conscious mind for new flights of creative re-working. You’ll be using the more skilled author’s work to bootstrap your own skill level. But once you’ve accomplished this there’ll be nothing stopping you from taking your own writing to the next level. Anticipation makes reading a challenge in real time—like a video game. And games can be conquered.

Finally, if a story moves you strongly, re-read it immediately. And then put it in a stack for future re-reading.

Also read:

PUTTING DOWN THE PEN: HOW SCHOOL TEACHES US THE WORST POSSIBLE WAY TO READ LITERATURE

Dennis Junk

Productivity as Practice: An Expert Performance Approach to Creative Writing Pedagogy Part 1

Psychologist K. Anders Ericsson’s central finding in his research on expert achievement is that what separates those who attain a merely sufficient level of proficiency in a performance domain from those who reach higher levels of excellence is the amount of time devoted over the course of training to deliberate practice. But, in a domain with criteria for success that can only be abstractly defined, like creative writing, what would constitute deliberate practice is difficult to define.

            Much of the pedagogy in creative writing workshops derives solely from tradition and rests on the assumption that the mind of the talented writer will adopt its own learned practices in the process of writing. The difficult question of whether mastery, or even expertise, can be inculcated through any process of instruction, and the long-standing tradition of assuming the answer is an only somewhat qualified “no,” together constitute just one of several impediments to developing an empirically supported set of teaching methods for aspiring writers. Even the phrase, “empirically supported,” conjures for many the specter of formula, which they fear students will be encouraged to apply to their writing, robbing the products of some mysterious and ineffable quality of freshness and spontaneity. Since the criterion of originality is only one of several that are much easier to recognize than they are to define, the biggest hindrance to moving traditional workshop pedagogy onto firmer empirical ground may be the intractability of the question of what evaluative standards should be applied to student writing. Psychologist K. Anders Ericsson’s central finding in his research on expert achievement is that what separates those who attain a merely sufficient level of proficiency in a performance domain from those who reach higher levels of excellence is the amount of time devoted over the course of training to deliberate practice. But, in a domain with criteria for success that can only be abstractly defined, like creative writing, what would constitute deliberate practice is as difficult to describe in any detail as the standards by which work in that domain is evaluated.

            Paul Kezle, in a review article whose title, “What Creative Writing Pedagogy Might Be,” promises more than the conclusions deliver, writes, “The Iowa Workshop model originally laid out by Paul Engle stands as the pillar of origination for all debate about creative writing pedagogy” (127). This model, which Kezle describes as one of “top-down apprenticeship,” involves a published author who’s achieved some level of acclaim—usually commensurate with the prestige of the school housing the program—whose teaching method consists of little more than moderating evaluative class discussions on each student’s work in turn. The appeal of this method is two-fold. As Shirley Geok-lin Lim explains, it “reliev[es] the teacher of the necessity to offer teacher feedback to students’ writing, through editing, commentary, and other one-to-one, labor intensive, authority-based evaluation” (81), leaving the teacher more time to write his or her own work as the students essentially teach each other and, hopefully, themselves. This aspect of self-teaching is the second main appeal of the workshop method—it bypasses the pesky issue of whether creative writing can be taught, letting the gates of the sacred citadel of creative talent remain closed. Furthermore, as is made inescapably clear in Mark McGurl’s book The Program Era, which tracks the burgeoning of creative writing programs as their numbers go from fewer than eighty in 1975 to nearly nine hundred today, the method works, at least in terms of its own proliferation.

            But what, beyond enrolling in a workshop, can a writer do to get better at writing? The answer to this question, assuming it can be reliably applied to other writers, holds the key to answering the question of what creative writing teachers can do to help their students improve. Lim, along with many other scholars and teachers with backgrounds in composition, suggests that pedagogy needs to get beyond “lore,” by which she means “the ad hoc strategies composing what today is widely accepted as standard workshop technique” (79). Unfortunately, the direction these theorists take is forbiddingly abstruse, focusing on issues of gender and ethnic identity in the classroom, or the negotiation of power roles (see Russel 109 for a review). Their prescription for creative writing pedagogy boils down to an injunction to introduce students to poststructuralist ways of thinking and writing. An example sentence from Lim will suffice to show why implementing this approach would be impractical:

As Kalamaras has argued, however, collective identities, socially constructed, historically circumscribed, uniquely experienced, call for a “socially responsible” engagement, not only on the level of theme and content but particularly on that of language awareness, whether of oral or dialectic-orthographic “voice,” lexical choice, particular idiolect features, linguistic registers, and what Mikhail Bakhtin called heteroglossic characteristics. (86)

Assuming the goal is not to help marginalized individuals find a voice and communicate effectively and expressively in society but rather to help a group of students demonstrating some degree of both talent and passion in the realm of creative writing to reach the highest levels of success possible—or even simply to succeed in finding a way to get paid for doing what they love—arcane linguistic theories are unlikely to be of much use. (Whether they’re of any real use even for the prior goal is debatable.)

            Conceiving of creative writing as the product of a type of performance demanding several discrete skills, at least some of which are improvable through training, brings it into a realm that has been explored with increasing comprehensiveness and with ever more refined methods by psychologists. While University of Chicago professor Mihaly Csikszentmihalyi writes about the large group of highly successful people in creative fields interviewed for his book Creativity: Flow and the Psychology of Discovery and Invention as if they were a breed apart, even going so far as to devote an entire chapter to “The Creative Personality,” and in so doing reinforcing the idea that creative talent is something one is simply born with, he does manage to provide several potentially useful strategies for “Enhancing Personal Creativity” in a chapter by that name. “Just as a physician may look at the physical habits of the most healthy individuals,” Csikszentmihalyi writes, “to find in them a prescription that will help everyone else to be more healthy, so we may extract some useful ideas from the lives of a few creative persons about how to enrich the lives of everyone else” (343). The aspirant creative writer must understand, though, that “to move from personal to cultural creativity one needs talent, training, and an enormous dose of good luck” (344). This equation, as it suggests only one variable amenable to deliberate effort, offers a refinement to the question of what an effective creative writing pedagogy might entail. How does one train to be a better writer? Training as a determining factor underlying exceptional accomplishments is underscored by Ericsson’s finding that “amount of experience in a domain is often a weak predictor of performance” (20). Simply writing poems and stories may not be enough to ensure success in the realm of creative writing, especially considering the intense competition evidenced by those nearly nine hundred MFA programs.

            Because writing stories and poems seldom entails a performance in real time, but instead involves multiple opportunities for inspiration and revision, the distinction Ericsson found between simply engaging in an activity and training for it may not be as stark for creative writing. Writing and training may overlap if the tasks involved in writing meet the requirements for effective training. Having identified deliberate practice as the most important predictor of expert performance, Ericsson breaks the concept down into three elements: “a well-defined task with an appropriate level of difficulty for the particular individual, informative feedback, and opportunities for repetition and corrections of errors” (21). Deliberate practice requires immediate feedback on performance. In a sense, success can be said to multiply in direct proportion to the accumulation of past failures. But how is a poet to know if the line she’s just written constitutes a success or failure? How does a novelist know if a scene or a chapter bears comparison to the greats of literature?

            One possible way to get around the problem of indefinable evaluative standards is to focus on quantity instead of quality. Ericsson’s colleague, Dean Simonton, studies people in various fields in which innovation is highly valued in an attempt to discover what separates those who exhibit “received expertise,” mastering and carrying on dominant traditions in arts or sciences, from those who show “creative expertise” (228) by transforming or advancing those traditions. Contrary to the conventional view that some individuals possess a finely attuned sense of how to go about producing a successful creative work, Simonton finds that what he calls “the equal odds rule” holds in every creative field he’s studied. What the rule suggests is “that quality correlates positively with quantity, so that creativity becomes a linear statistical function of productivity” (235). Individuals working in creative fields can never be sure which of their works will have an impact, so the creators who have the greatest impact tend to be those who produce the greatest number of works. Simonton has discovered that this rule holds at every stage in the individual’s lifespan, leading him to conclude that success derives more from productivity and playing the odds than from sure-footed and far-seeing genius. “The odds of hitting a bull’s eye,” he writes, “is a probabilistic function of the number of shots” (234). Csikszentmihalyi discovered a similar quantitative principle among the creative people he surveyed; part of creativity, he suggests, is having multiple ideas where only one seems necessary, leading him to the prescription for enhancing personal creativity, “Produce as many ideas as possible” (368).
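Simonton’s rule invites a quick back-of-the-envelope check. The sketch below is only a toy illustration, not Simonton’s actual model: it assumes, hypothetically, that every work a creator produces has the same small, independent chance of becoming a hit. Under that assumption, expected hits grow linearly with productivity, just as the equal odds rule describes, and the odds of at least one bull’s-eye rise with the number of shots.

```python
# Toy model of the "equal odds rule" (an illustrative assumption,
# not Simonton's data): every work has the same independent
# probability of becoming a hit.

def expected_hits(n_works: int, p_hit: float) -> float:
    """Expected number of hits: linear in productivity (n * p)."""
    return n_works * p_hit

def prob_at_least_one_hit(n_works: int, p_hit: float) -> float:
    """Probability of at least one hit among n independent attempts."""
    return 1 - (1 - p_hit) ** n_works

if __name__ == "__main__":
    p = 0.05  # hypothetical: a 1-in-20 chance that any single work is a hit
    for n in (10, 50, 100):
        print(f"{n:>3} works: expected hits {expected_hits(n, p):.1f}, "
              f"P(at least one hit) {prob_at_least_one_hit(n, p):.2f}")
```

On these made-up numbers, the writer who produces a hundred works doesn’t just have ten times the expected hits of the one who produces ten; her chance of landing at least one rises from about forty percent to better than ninety-nine.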

Part 2 of this essay.

Dennis Junk

Magic, Fiction, and the Illusion of Free Will Part 1 of 2

It’s nothing new for people with a modicum of familiarity with psychology that there’s an illusory aspect to all our perceptions, but in reality it would be more accurate to say there’s a slight perceptual aspect to all our illusions. And one of those illusions is our sense of ourselves.

E.M. Forster famously wrote in his book Aspects of the Novel that what marks a plot resolution as gratifying is that it is both surprising and seemingly inevitable. Many have noted the similarity of this element of storytelling to riddles and magic tricks. “It’s no accident,” William Flesch writes in Comeuppance, “that so many stories revolve around riddles and their solutions” (133). Alfred Hitchcock put it this way: “Tell the audience what you’re going to do and make them wonder how.” In an ever-more competitive fiction market, all the lyrical prose and sympathy-inspiring characterization a brilliant mind can muster will be for naught if the author can’t pose a good riddle or perform some eye-popping magic.

Neuroscientist Stephen L. Macknik and his wife Susana Martinez-Conde turned to magicians as an experiment in thinking outside the box, hoping to glean insights into how the mind works from those following a tradition which takes advantage of its shortcuts and blind spots. The book that came of this collaboration, Sleights of Mind: What the Neuroscience of Magic Reveals about our Everyday Deceptions, is itself both surprising and seemingly inevitable. What a perfect blend of methods and traditions in the service of illuminating the mysteries of human perception and cognition. The book begins somewhat mundanely, with descriptions of magic tricks and how they’re done interspersed with sections on basic neuroscience. Readers of Skeptic Magazine or any of the works in the skeptical tradition will likely find the opening chapters hum-drum. But the sections have a cumulative effect.

The hook point for me was the fifth chapter, “The Gorilla in Your Midst,” which takes its title from the famous experiment conducted by Daniel Simons and Christopher Chabris in which participants are asked to watch a video of a group of people passing a basketball around and count the number of passes. A large portion of the participants are so engrossed in the task of counting that they miss a person walking onto the scene in a gorilla costume, who moves to the center of the screen, pounds on his chest, and then walks off camera. A subsequent study by Daniel Memmert tracked people’s eyes while they were watching the video and found that their failure to notice the gorilla wasn’t attributable to the focus of their gaze. Their eyes were directly on it. The failure to notice was a matter of higher-order brain processes: they weren’t looking for a gorilla, so they didn’t see it, even though their eyes were on it. Macknik and Martinez-Conde like to show the video to their students and ask the ones who do manage to notice the gorilla how many times the ball was passed. They never get the right answer. Of course, magicians exploit this limitation in our attention all the time. But we don’t have to go to a magic show to be exploited—we have marketers, PR specialists, the entertainment industry.

At the very least, I hoped Sleights of Mind would be a useful compendium of neuroscience concepts—a refresher course—along with some basic magic tricks that might help make the abstract theories more intuitive. At best, I hoped to glean some insight into how to arrange a sequence of events to achieve that surprising and inevitable effect in the plots of my stories. Some of the tricks might even inspire a plot twist or two. The lesser hope has been gratified spectacularly. It’s too soon to assess whether the greater one will be satisfied. But the book has impressed me on another front I hadn’t anticipated. Having just finished the ninth of twelve chapters, I’m left both disturbed and exhilarated in a way similar to how you feel reading the best of Oliver Sacks or Steven Pinker. There’s some weird shit going on in your brain behind the scenes of the normal stuff you experience in your mind. It’s nothing new for people with a modicum of familiarity with psychology that there’s an illusory aspect to all our perceptions, but in reality it would be more accurate to say there’s a slight perceptual aspect to all our illusions. And one of those illusions is our sense of ourselves.

I found myself wanting to scan the entire seventh chapter, “The Indian Rope Trick,” so I could send a pdf file to everyone I know. It might be the best summation I’ve read of all the ways we overestimate the power of our memories. So many people you talk to express an unwillingness to accept well established findings in psychology and other fields of science because the data don’t mesh with their experiences. Of course, we only have access to our experiences through memory. What those who put experience before science don’t realize is that memories aren’t anything like direct recordings of events; they’re bricolages of impressions laid down prior to the experience, a scant few actual details, and several impressions received well afterward. Your knowledge doesn’t arise from your experiences; your experiences arise from your knowledge. The authors write:

As the memory plays out in your mind, you may have the strong impression that it’s a high-fidelity record, but only a few of its contents are truly accurate. The rest of it is a bunch of props, backdrops, casting extras, and stock footage your mind furnishes on the fly in an unconscious process known as confabulation (119).

The authors go on to explore how confabulation creates the illusion of free will in the ninth chapter, “May the Force be with You.” Petter Johansson and Lars Hall discovered a phenomenon they call “choice blindness” by presenting participants in an experiment with photographs of two women of about equal attractiveness and asking them to choose which one they preferred. In a brilliant mesh of magic with science, the researchers then passed the picture over to the participant and asked him or her to explain their choice—only they used sleight of hand to switch the pictures. Most of them didn’t notice, and they went on to explain why they chose the woman in the picture that they had in fact rejected. The explanations got pretty elaborate too.

Part 2 of this essay.

Dennis Junk

They Comes a Day: Celebrating Cooperation in A Gathering of Old Men and Horton Hears a Who! Part 1

The appeal of stories like Horton Hears a Who! and A Gathering of Old Men lies in our strong human desire to see people who are willing to cooperate, even at great cost to themselves, prevail over those who behave only on their own behalves.

            Ernest Gaines opens his novel A Gathering of Old Men with a young boy named Snookum being sent on an errand to tell a group of men to come together in defense of an individual named Mathu, a black man who readers are led to believe has shot and killed a white man on a post-civil rights era Louisiana plantation still carrying on the legacy of Jim Crow. But the goal of protecting Mathu from revenge at the hands of the white man’s family gets subsumed by a greater cause, that of ensuring all the gathered men be treated as men and not like slaves. Though it may seem a flippant comparison, there are many parallels between Gaines’s novel and the children’s classic Horton Hears a Who! by Dr. Seuss, which likewise features a gathering of threatened people who can only save themselves by collectively calling for recognition of their personhood. Evolutionary critics, who see in narratives a play of evolved psychological mechanisms, would view this resemblance as more than coincidence.

            Brian Boyd examines Horton in his book On the Origin of Stories, juxtaposing it with Homer’s epic The Odyssey to demonstrate that both the children’s story and the ageless classic for adults engage emotional adaptations shared by all humans. Boyd’s theoretical framework incorporates a wide array of findings from both evolutionary and cognitive science. Though much of his thinking overlaps with the ideas William Flesch puts forth in Comeuppance: Costly Signaling, Altruistic Punishment, and Other Biological Components of Fiction, Flesch’s theory of narrative is at once more focused and multidimensional. Flesch theorizes that our thoughts and feelings are engaged while reading a story because we’ve evolved to monitor others—even fictional others—for signals of altruism and to emotionally favor those who emit them, while at the same time wanting to see those who behave selfishly get punished. He arrives at this social monitoring and volunteered affect model using research into the evolution of cooperation in humans, research which Boyd likewise refers to in explaining universal narrative themes. Though Flesch’s ideas are more compelling because he focuses more on the experience of reading stories than on their thematic content, both authors would agree that the appeal of stories like Horton and Gathering lies in our strong human desire to see people who are willing to cooperate, even at great cost to themselves, prevail over those who behave only on their own behalves.

            Though her research was published too late to be included in either Flesch’s or Boyd’s book, Karen Wynn, a Yale psychologist who studies the development of social behavior in children, has conducted experiments that highlight how integral the task of separating selfish actors from cooperators is even for children too young to speak. In one setup, infants watch a puppet show that features a small white tiger who wants to play ball and two rabbits, each of whom responds quite differently to the tiger’s overtures. One rabbit, distinguished by a green jacket, rudely steals off with the ball after the tiger has rolled it over. But when the other rabbit, this one in an orange jacket, receives the ball from the tiger, the two end up playfully rolling it back and forth to each other. The young children attend to these exchanges with rapt interest, and when presented with a choice afterward of which rabbit to play with they almost invariably choose the one with the orange jacket, the cooperative one. This preference extends even to wooden blocks with nothing but crude eyes to suggest they’re living beings. When Wynn’s colleagues stage a demonstration in which one block hinders another’s attempt to climb a hill, and then subsequently a third block helps the climber, children afterward overwhelmingly choose the helper to play with. Wynn concludes that “preverbal infants assess individuals on the basis of their behavior toward others” (557). Evolutionary game theorists, who use mathematical models to simulate encounters between individuals relying on varying strategies for dealing with others in an attempt to determine how likely each strategy is to evolve, call the behavior Wynn and her colleagues observed strong reciprocity, which Flesch explains occurs when “the strong reciprocator punishes or rewards others for their behavior toward any member of the social group, and not just or primarily for their individual interactions with the reciprocator” (22).

            Children reading Horton—or having it read to them—probably become engaged initially because they appreciate Horton’s efforts to protect the speck of dust on which he hears a voice calling for help. But that’s only the beginning of the elephant’s struggle to keep the microscopic creatures called the Whos safe. At one point, after chasing the eagle Vlad Vlad-i-koff, who has stolen a clover Horton has placed the Whos’ speck of dust on, all through the night over absurdly rugged terrain, the elephant has to pick through a field with millions of nearly identical clovers before recovering the one with the Whos on it. The accompanying illustration of the slumped and bedraggled elephant shows beyond doubt the lengths to which Horton is willing to go on behalf of his friends. And, as Boyd points out, “we all love an altruist. As game theory simulations of cooperation show, any participant in a social exchange benefits when the other partner is an altruist. And Horton’s altruism is as colossal as his physique” (375). But Flesch would emphasize that we don’t favor Horton merely because he would be a good exchange partner for each of us to deal with directly; rather, we can signal our own altruism by volunteering affect on behalf of someone who has clearly demonstrated his own. He writes that

Among the kinds of behavior that we monitor through tracking or through report, and that we have a tendency to punish or reward, is the way others monitor behavior through tracking or through report, and the way they manifest a tendency to punish and reward (50).

So, even as we’re assessing someone to determine how selfish or altruistic he or she is, others are assessing us to see how we respond to what we discover. Favoring an altruist (or showing disfavor for a selfish actor) is itself a signal of altruism. In game theory terms, witnesses can become second-order altruists, or third-order, and so on. But how could this propensity toward monitoring and cooperation have evolved in a Darwinian world of intense competition for survival and reproduction?

            The main conceptual tool used by game theorists to see how various strategies for dealing with others fare when pitted against each other is a scenario called The Prisoner’s Dilemma. Imagine two criminals are arrested and taken to separate rooms to be interrogated without being given a chance to consult with one another. If both criminals keep their mouths shut and confess to nothing, then they will both serve a prison sentence of one year. So even mutual cooperation carries a cost, though it is the best outcome the pair can achieve together. However, if both criminals confess, the outcome is a longer, five-year sentence. What makes the scenario useful in understanding how cooperation could have evolved is the condition that if just one criminal confesses—if he or she takes advantage of the fellow prisoner’s cooperation—the confessor goes free without spending any more time in custody. Meanwhile, the criminal who doesn’t confess, but whose partner does, gets a sentence of twenty years. The idea is that small benefits accrue over time to cooperators, but there’s always temptation for individuals to act for their own short-term benefit to their partners’ detriment (Flesch 23; Boyd 56 uses slightly different numbers but to the same effect).
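The arithmetic of the dilemma is easy to verify directly. Here is a minimal sketch using the sentence lengths Flesch cites (the move names and the code itself are mine, purely for illustration): it shows that confessing is the better response to either choice the partner might make, even though mutual silence leaves the pair with the fewest total years.

```python
# Payoffs from the scenario above: years in prison keyed by
# (my move, partner's move); "cooperate" means keeping quiet.
YEARS = {
    ("cooperate", "cooperate"): 1,   # both stay silent: one year each
    ("cooperate", "confess"):  20,   # I stay silent, my partner talks
    ("confess", "cooperate"):   0,   # I talk, my partner stays silent
    ("confess", "confess"):     5,   # both talk: five years each
}

def best_response(partner_move: str) -> str:
    """My move that minimizes my own years, given the partner's move."""
    return min(("cooperate", "confess"),
               key=lambda my_move: YEARS[(my_move, partner_move)])

def total_years(a: str, b: str) -> int:
    """The pair's combined sentence if one plays a and the other plays b."""
    return YEARS[(a, b)] + YEARS[(b, a)]

if __name__ == "__main__":
    # Confessing dominates: it beats staying silent whatever the partner does.
    for partner in ("cooperate", "confess"):
        print(f"partner's move: {partner:9} -> best response: {best_response(partner)}")
    # Yet the pair is collectively best off when both stay silent.
    print("total years, both cooperate:", total_years("cooperate", "cooperate"))
    print("total years, both confess:  ", total_years("confess", "confess"))
```

The tension the game theorists care about falls right out of the numbers: the individually rational move is always to confess, but two prisoners who both follow that logic serve ten years between them instead of two.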

            The Prisoner’s Dilemma has several variations, and it can be scaled up to conceptualize cooperation among groups with more than two members. The single Who not shouting in Horton is an example of how even a lone free-rider, a “shirker,” can undermine group cohesion. And, mild as it is, this character gets some comeuppance when Seuss refers to him as a “twerp.” More severe punishment turns out to be unnecessary because the mayor of Who-ville impresses upon him how important his cooperation is. In Gathering, the men likewise face a prisoner’s dilemma when, having all brought their own shotguns and shown their own willingness to confess to the killing of the white man named Beau, Sheriff Mapes begins separating each of them in turn from the group gathered around Mathu’s porch and beating them when they refuse to name Mathu as the true culprit. Speaking to Mathu, Mapes says, “I know you did it… You’re the only one here man enough. But I have to hear it from one of them. One of them must say he was called here after it happened” (85). If just one man buckles under the sheriff’s abuse (enduring the beating is the cost of cooperating here, analogous to the one-year sentence in The Prisoner’s Dilemma), then all their efforts will be for naught and the men will miss out on their opportunity to stand up to their white oppressors. The gathered men face another similar dilemma when the racist Luke Will shows up with his own group to lynch Mathu; as long as the older men cooperate, they maintain an advantage over the whites who can’t imagine them standing up at all, much less standing up together.

Part 2

Dennis Junk

Defiance and Duplicity: Decoding Poe’s Attacks on Readers Part 1 of 4

To say Edgar Allan Poe had a defiant streak is an enormity of understatement. He had nothing but contempt for many of his fellow writers, and he seemed at times almost embarrassed to be writing for the types of audiences who loved his most over-the-top stories. He may have pushed some of his stories to the extreme and beyond in the spirit of satire, but that’s not how readers took them. It just may be that at some point he started sending coded messages to his readers, confident most of them would be too stupid to pick up on them.

            Modern writers and critics are never quite sure what to make of Edgar Allan Poe. Aldous Huxley famously described Poe’s verse lines as the equivalent of “the wearing of a diamond ring on every finger,” and the popular critic Harold Bloom extends the indictment to Poe’s prose, writing “Poe's awful diction… seems to demand the decent masking of a competent French translation” (2). What accounts for the stories’ continuing impact, Bloom suggests, are “the psychological dynamics and mythic reverberations of his stories” (3). Joyce Carol Oates, who of all contemporary writers might be expected to show Poe some sympathy, seems to agree with Bloom, charging that Poe’s stories are “hampered by… writerly turgidity” but nonetheless work on readers’ minds through his expert use of “surreal dream-images” (91). What these critics fail to realize is that when Poe went to excess in his prose he was doing it quite deliberately. He liked to use his stories to play games with his readers, simultaneously courting them, so that he could make a living, and signaling to them—at least the brightest and most attentive of them—that his mind was too good for the genre he was writing in, his tastes too sophisticated.

            The obliviousness of critics like Bloom notwithstanding, most Poe scholars are well aware that his writing was often intended as a satire on popular works of his day that showed the same poor tastes and the same tendency toward excess which readers today mistake for his own failures of eloquence. Indeed, some scholars have detected satiric elements in what are usually taken as Poe’s most serious works in the Gothic Horror genre. Clark Griffith, for instance, sees in the story “Ligeia” evidence of a “satiric underside” to the “Gothic overplot” (17) which arrests the attention of most readers. And, as critic Robert Regan explains, “Poe…was capable of synchronizing a multi-faceted tale of terror with a literary satire,” and he was furthermore “surely… capable of making his satiric point apparent if he had chosen to” (294). To understand what it is Poe is satirizing—and to discover why his satire is often concealed—it is important to read each of his stories not just in the context of his other stories but, because he had a proclivity toward what Regan refers to as duplicity, in that of his life beyond his work as well. Even a cursory study of his biography reveals a pattern of self-destructive defiance reflective of an incapacity to tolerate being at the slightest disadvantage. Poe, poverty-stricken for most of his adult life, even went so far as to defy the very readers he depended on for his paltry livelihood, but he did so in a manner so clever most of them caught no hint of the sneer. In point of fact, he more than once managed to escape detection for expressions of outright contempt for his readers by encoding them within some of the very tales—“The Tell-Tale Heart,” “The Black Cat,” “The Cask of Amontillado”—they found, and continue to find, most pleasing—and most horrifying.

            That Poe often wrote what were intended to be straightforward satires or comedies is clear in any comprehensive edition of his work. The story “How to Write a Blackwood Article,” for instance, not only quotes Cervantes, but probably owes its central conceit, a dunderheaded and ambitious writer seeking out the ridiculous, but lucrative, advice of a magazine editor, to the prologue of Part I of Don Quixote, in which the author struggles with the opening of his story until a friend comes along and advises him on how to embellish it (Levine 15). And Thomas Mabbott, a major Poe critic, has found letters in which the author describes a framing scheme for his early stories similar to Chaucer’s Canterbury Tales, in which a group, The Folio Club—“a mere Junto of Dunderheadism,” he calls them in an introduction (595)—takes turns telling stories and then votes to award a prize to the best. Poe writes: “As soon as each tale is read—the other 16 members criticise it in turn—and their criticisms are intended as a burlesque upon criticism generally” (173). The tales themselves, Poe writes, “are of a bizarre and generally whimsical character” (173). Robert Regan takes this evidence of Poe’s comedic intentions a step further, pointing out that when readers responded to “parodies or imitations of the mannered styles of fiction of his day” by giving them “praise for virtues [they] never pretended to,” the author “seems to have decided to make the best of being misunderstood: if his audience would not laugh at his clownishness, he would laugh at theirs” (281).

            Poe was not opposed to writing stories and poems he knew would be appealing to a popular audience—he couldn’t afford to be. But, as in every other sphere of his life, Poe chafed under this dependence, and was eager to signal his contempt whenever he could. This is most explicitly demonstrated in his treatment of stories published in—or associated with—Blackwood’s Edinburgh Magazine in “How to Write a Blackwood Article.” The story features the fictional Signora Psyche Zenobia—known to her enemies as Suky Snobbs—seeking the advice of Mr. Blackwood, who at one point gives her some examples of real works in the style he’s prescribing. He first praises “The Dead Alive,” saying “You would have sworn that the writer had been born and brought up in a coffin.” Then, of “Confessions of an Opium-eater,” he says

fine, very fine!—glorious imagination—deep philosophy—acute speculation—plenty of fire and fury, and a good spicing of the decidedly unintelligible. That was a nice bit of flummery, and went down the throats of the people delightfully. They would have it that Coleridge wrote that paper—but not so. It was composed by my pet baboon, Juniper. (176)

            Of course, Poe himself would go on to write a story titled “The Premature Burial,” so his attitude toward this type of writing was more complicated than Juniper’s authorship would suggest. Indeed, the story Signora Psyche Zenobia writes in the manner advocated by Mr. Blackwood, “A Predicament,” or “The Scythe of Time” as it was originally titled, is not fundamentally different from the style in which Poe wrote his more serious tales. It seems, rather, to have resulted from a cranking up of the exaggeration dial until the burlesque that is elsewhere imperceptible to popular readers comes across as patently ridiculous. And this wasn’t the first time Poe lampooned the overblown prose and farfetched plots of what were called “sensation tales,” like those published in Blackwood’s; three years earlier, with “Berenice,” another tale of premature burial, he tried to mimic these same excesses, but no one, including Thomas White, the editor who published it, caught the joke.

            In his correspondence with White, Poe gives a sense of what he believed he was working with—and against—in the publishing world of the 1830’s and 40’s. White agreed to publish “Berenice,” in which the protagonist, in the throes of his obsession with a woman’s teeth, removes them from her corpse, only to discover she wasn’t really dead, even though the editor suspected the story was in bad taste. Poe apparently agreed. “The subject is by far too horrible,” he writes, “and I confess that I hesitated in sending it you especially as a specimen of my capabilities” (597). He goes on to claim that he was prompted to write the story by a wager against his ability to compose one on such a horrible subject. Napier Wilt, a critic who considers the question of Poe’s attitude toward his tales, finds this claim “somewhat dubious” (102). But it seems such a bet would have been the very type of challenge to Poe’s genius he could not back down from. And Poe goes on: “The history of all Magazines shows plainly that those which have attained celebrity were indebted for it to articles similar in nature—to Berenice—although, I grant you, far superior in style and execution” (597). That admission of the superiority of others’ work is more dubious by far than that the story was conceived from a bet.

            Poe then explicates to White precisely how the pieces he refers to, which are responsible for the success of the magazines that publish them, handle their topics. The way he describes them provides a lens through which to view works of his beyond just the one he is defending. The public, he writes, likes works consisting in “the ludicrous heightened into the grotesque: the fearful coloured into the horrible: the witty exaggerated into the burlesque: the singular wrought out into the strange and mystical” (597). He goes on to list several examples of stories that adhere to this formula, and Wilt attests that he could have made the list much longer: “Even a casual study of the early nineteenth-century English and American magazines yields hundreds of such tales” (103). So it seems that even as Poe was contemptuous of the writers of these articles—as well as the audiences who ensured their proliferation—he determined to write his own semi-serious stories in a similar vein, only to turn around some time later and poke fun at the style much less subtly in “How to Write a Blackwood Article.” He sums up his motivation later in the letter to White. “To be appreciated,” he writes, “one must be read” (597).

            It cannot be concluded, though, that Poe was a sellout who slavishly pandered to the blinkered sensibilities of the reading public—because he never acted slavishly toward anyone. He grew up the foster son of John Allan, a well-to-do Southern merchant. In the course of his upbringing he somehow, in G.R. Thompson’s words, came to “expect the life of the son of a Virginia gentleman—or, if not quite an aristocrat, the next best to that—the son of a prosperous merchant” (xxi). But by the time Poe was attending the University of Virginia, Allan was growing weary of underwriting the profligate Poe’s excesses, such as his habit of losing enormous sums at gambling tables, and cut him off. His foster mother tried to arrange a reconciliation but Poe was too proud to grovel. He did prevail upon Allan, though, years later to help finance his education at West Point. Unfortunately, he blew this too when he wrote to a creditor explaining that his foster father had not yet sent him the money he needed to pay off the debt, further suggesting that the reason for the delay was that “Mr. A. is not very often sober” (Thompson xxiii). The letter was then enclosed along with a demand for payment sent directly to said Mr. A. Rather than face the humiliation of being dismissed from West Point for being unable to pay his expenses, Poe decided to get himself kicked out by disobeying orders. There is even a story—possibly apocryphal—of the cadet showing up in formation naked (Thompson xxxiii).

Part 2

Dennis Junk

What to Leave Out: Minimalism and the Hemingway Mystique

Much of the impact of Hemingway’s work comes from the mystique surrounding the man himself. Hemingway was a brand even before his influence had reached its zenith. So readers can’t really come to his work without letting their views about the author fill in the blanks he so expertly left empty. That’s probably why feelings about it tend to be so polarized.

“The Snows of Kilimanjaro” features a man who is lamenting his lost opportunities to write a bunch of stories he’s been saving up in his mind as he lies wounded and dying from an injury he suffered while on safari in Africa. It turns out Hemingway himself once suffered an injury while on safari in Africa; but of course he survived to write about the ordeal. Several of his other stories likewise feature fictionalized versions of himself. In fact, there are very few works in the Hemingway oeuvre that aren’t at least obliquely about Hemingway.

The success of the famous “iceberg theory” of writing, which has the author refrain from explicit statements about important elements of the characters’ minds, histories, and motivations, probably relied in large part on readers’ suspicion that the stories they were reading were true. In Death in the Afternoon, Hemingway explained,

If a writer of prose knows enough of what he is writing about he may omit things that he knows and the reader, if the writer is writing truly enough, will have a feeling of those things as strongly as though the writer had stated them. The dignity of movement of an ice-berg is due to only one-eighth of it being above water. A writer who omits things because he does not know them only makes hollow places in his writing.

The prose style of this theorizing on prose style is markedly unlike Hemingway’s usual “short, declarative statements.” And it is remarkably revealing. It almost seems as though Hemingway is boasting about being able to get away with leaving out as many of the details as he does in his stories because he’s so familiar with the subjects of which he writes. And what exciting and fascinating subjects they are—wars, romances, travels, brushes with death, encounters with man-eating beasts. Yet readers coming to the stories with romantic visions of Hemingway’s adventures are quickly disappointed by the angst, insecurity, and fear of the actual Hemingway experience.

Stories like “The Short Happy Life of Francis Macomber” unsettle us because there is a truth in them that not many people are given to exploring (even in America, land of “happily ever after,” marriages entail struggles for dominance). But much of the impact of Hemingway’s work comes from the mystique surrounding the man himself. Hemingway was a brand even before his influence had reached its zenith. So readers can’t really come to his work without letting their views about the author fill in the blanks he so expertly left empty. That’s probably why feelings about it tend to be so polarized.

If you take Hemingway’s celebrity out of the equation, though, you’re still left with a formidable proposition: fiction works not by detailing the protagonist’s innermost thoughts and finding clever metaphors for his or her feelings; rather the goal is to describe the scene in enough detail, to render the circumstances so thoroughly, that the reader doesn’t need to be told how the character feels because the reader can imagine for him or herself what it would be like to inhabit a real-life version of the story. This proposition may have begun as far back as Proust; taken in a completely new direction by Hemingway, it reached something of an apotheosis in the minimalism of such authors as Raymond Carver. Of course, Carver’s more domestic dramas rely on a common stock of experience in place of the celebrity of the author, but the effect is one of even greater revelation—or perhaps recognition is a better word.

Really, though, if you take this theory of storytelling to its logical endpoint you arrive at film, and you’ve lost the element that makes fiction writing unique—the space for interiority. It’s no coincidence that the best candidate for the Hemingway mantle today—at least in his most recent works—has had his two latest books adapted into films within a couple years of their publication. Cormac McCarthy hasn’t been able to rely on any public notion of his interchangeability with his characters; nor does he write about experiences he can count on his readers to recognize. Instead, he builds his stories up from the ground of popular genres we’re most familiar with from our lifetime love affair with cinema.

No Country for Old Men reads so much like Hemingway at points that you wonder if McCarthy took frequent breaks from the writing to dip into the icon’s complete short stories. At the same time, the novel reads so much like a script you wonder if the movie rights were sold before or after he began writing it.

If Hemingway could only write about things he’d actually experienced, if Carver could only write about experiences similar to those his readers have actually had, and if McCarthy is dependent on our familiarity with popular genres, it seems the theory of omission or minimalism either runs up against a wall or gets stuck in an infinite regress. The possibility of discovery, the author going somewhere new and taking his readers along for the ride, recedes farther and farther into the distance. These limitations are real, of course, no matter what style you’re writing in; all writers must follow the injunction to write what they know—at least up to a point. I think the lesson to take from Hemingway and his followers is that the emotion inferred is often more poignant than the emotion described, but inference is only one of many tools in the writer’s toolbox.

Also read:

LIFE'S WHITE MACHINE: JAMES WOOD AND WHAT DOESN'T HAPPEN IN FICTION


Eric Harris: Antisocial Aggressor or Narcissistic Avenger?

Conventional wisdom after the Columbine shooting was that the perpetrators had lashed out after being bullied. They were supposed to have had low self-esteem, and they came to represent the dangers of letting kids go about feeling bad about themselves. But is it possible at least one of the two was in fact a narcissist? Eric Harris definitely had some fantasies of otherworldly grandeur.

Coincident with my writing a paper defending Gabriel Conroy in James Joyce’s story “The Dead” from charges of narcissism leveled by Lacanian critics, my then girlfriend was preparing a presentation on the Columbine shooter Eric Harris, which had her trying to determine whether he would have better fit the DSM-IV diagnostic criteria for Narcissistic or for Antisocial Personality Disorder. Everything about Harris screamed narcissist, but there was a deal-breaker for the diagnosis: people who hold themselves in astronomical esteem seem unlikely candidates for suicide, and Harris turned his gun on himself as the culmination of his murder spree.

Clinical diagnoses are mere descriptive categorizations which don’t in any way explain behavior; at best, they may pave the way for explanations by delineating the phenomenon to be explained. Yet the nature of Harris’s thinking about himself has important implications for our understanding of other types of violence. Was he incapable of empathizing with others, unable to see and unwilling to treat them as feeling, sovereign beings, in keeping with an antisocial diagnosis? Or did he instead believe himself to be so superior to his peers that they simply didn’t merit sympathy or recognition, suggesting narcissism? His infamous journals suggest pretty unequivocally that the latter was the case. But again we must ask: would a real narcissist kill himself?

This seeming paradox was brought to my attention again this week as I was reading 50 Great Myths of Popular Psychology: Shattering Widespread Misconceptions about Human Behavior (about which I will very likely be writing more here). Myth #33 is that “Low Self-Esteem Is a Major Cause of Psychological Problems” (162). The authors make use of the common misconception that the two boys responsible for the shootings were meek and shy and got constantly picked on until their anger boiled over into violence. (It turns out the boiling-over metaphor is wrong too, as explained under Myth #30: “It’s Better to Express Anger to Others than to Hold It in.”) The boys were indeed teased and taunted, but the experience didn’t seem to lower their view of themselves. “Instead,” the authors write, “Harris and Klebold’s high self-esteem may have led them to perceive the taunts of their classmates as threats to their inflated sense of self-worth, motivating them to seek revenge” (165).

Narcissists, they explain, “believe themselves deserving of special privileges” or entitlements. “When confronted with a challenge to their perceived worth, or what clinical psychologists term a ‘narcissistic injury,’ they’re liable to lash out at others” (165). We usually think of school shootings as random acts of violence, but maybe the Columbine massacre wasn’t exactly random. It may rather have been a natural response to perceived offenses—just one that went atrociously beyond the realm of what anyone would consider fair. If what Harris did on that day in April of 1999 was not an act of aggression but one of revenge, it may be useful to consider it in terms of costly punishment, a special instance of costly signaling.

The strength of a costly signal is commensurate with its cost, so Harris’s willingness both to kill and to die might have been his way of insisting that the offense he was punishing was deathly serious. What the authors of 50 Great Myths argue is that the perceived crime consisted of his classmates’ failure to properly recognize and defer to his superiority. Far from contradicting the idea that Harris held himself in great esteem, then, his readiness to die for the sake of his message demonstrates just how superior he believed himself to be: in his mind the punishment was justified by the offense, and the seriousness with which he took his classmates’ slights can be read as an index of the gulf he perceived between their worth and his. The greater the difference in relative worth between Harris and his schoolmates, the greater the injustice.

Perceived relative status plays a role in all punishments. Between two people of equal status, such factors as any uncertainty regarding guilt, mitigating circumstances surrounding the offense, and concern for making the punishment equal to the crime will enter into any consideration of just deserts. But the degree to which these factors are ignored can be used as an index of the size of the power differential between the two individuals—or at least of the perceived power differential. Someone who feels infinitely superior will be willing to dish out infinite punishment. Absent a truly horrendous crime, revenge is a narcissistic undertaking.

Also read:

SYMPATHIZING WITH PSYCHOS: WHY WE WANT TO SEE ALEX ESCAPE HIS FATE AS A CLOCKWORK ORANGE

And:

THE MENTAL ILLNESS ZODIAC: WHY THE DSM 5 WON'T BE ANYTHING BUT MORE PSEUDOSCIENCE


Getting Gabriel Wrong: Part 1 of 3

The theories put forth by evolutionary critics, particularly the Social Monitoring and Volunteered Affect Theory formulated by William Flesch, highlight textual evidence in Joyce’s story “The Dead” that undermines readings by prominent Lacanians—evidence which has likely led generations of readers to a much more favorable view of Joyce’s protagonist.

Soon after arriving with his wife at his aunts’ annual celebration of Christmas, Gabriel Conroy, the protagonist of James Joyce’s “The Dead,” has an awkward encounter. Lily, “the caretaker’s daughter” (175), has gone with Gabriel into a pantry near the entrance to help him off with his coat. They exchange a few polite words before Gabriel broaches the topic of whether Lily might be engaged, eliciting from her a bitter remark about the nature of men, which in turn causes him to blush. After nervously adjusting his attire, Gabriel gives her a coin and rushes away to join the party. How readers interpret this initial scene, how they assess Gabriel’s handling of it, has much bearing on how they will experience the entire story. Many psychoanalytic critics, particularly followers of Jacques Lacan, read the encounter as evidence of Gabriel’s need to control others, especially women. According to this approach, the rest of the story consists of Gabriel’s further frustrations at the hands of women until he ultimately succumbs and adopts a more realistic understanding of himself. But the Lacanian reading is cast into severe doubt by an emerging field of narrative studies based on a more scientific view of human psychology. The theories put forth by these evolutionary critics, particularly the Social Monitoring and Volunteered Affect Theory formulated by William Flesch, highlight textual evidence that undermines readings by prominent Lacanians—evidence which has likely led generations of readers to a much more favorable view of Joyce’s protagonist.

At a glance, two disparate assessments of Gabriel seem equally plausible. Is he solicitous or overbearing? Fastidious or overweening? Self-conscious or narcissistic? While chatting with Lily, he smiles “at the three syllables she had given his surname” (177). He subsequently realizes that he “had known her when she was a child and used to sit on the lowest step nursing a rag doll” (177). Lacanian critic James Kelley asserts that these lines describe Gabriel “reveling in his position of superiority” (202). When Gabriel goes on to inquire about Lily’s schooling and, upon learning that she is no longer a student, whether he can expect to be attending her wedding sometime soon, she “glanced back at him over her shoulder and said with great bitterness: / —The men that is now is only all palaver and what they can get out of you” (177). Joyce leaves unanswered two questions in this scene: why does Lily respond “with great bitterness”? And why does Gabriel lose his composure over it? Kelley suggests that Lily is responding to Gabriel’s “attitude of superiority” (202). Gary Leonard, another Lacanian, sees the encounter similarly, charging Gabriel with the offense of “asking a real woman a question better asked of a little girl in a fairy tale (of course she is that unreal to him)” (458). But Leonard is holding Gabriel to a feminist standard that had not come into existence yet. And Gabriel’s smiling and reminiscing are just as likely reflective of a fatherly as they are of a patriarchal attitude.

One of the appeals of Flesch’s Social Monitoring and Volunteered Affect (SMVA) theory, which views the experience of fiction as a process of tracking characters for signals of altruism and favoring those who emit them, is that it relies on no notional story between the lines of the actual story. When encountering the scene in which Lily responds bitterly to Gabriel’s small talk, readers need not, for instance, be looking at a symbol of class conflict (Marxism) or gender oppression (Feminism) or one consciousness trying to wrest psychic unity from another (Lacanianism). Evidence can be culled from the scene to support the importance of these symbols and dynamics, along with countless others. But their importance rests solely in the mind of the critic serving as advocate for this or that theory. As Flesch’s fellow evolutionary critic Brian Boyd explains in his book On the Origin of Stories: Evolution, Cognition, and Fiction, “Such critics assume that if they can ‘apply’ the theory, if they can read a work in its light, they thereby somehow ‘prove’ it, even if the criteria of application and evidence are loose” (387). Unfortunately, those trained in the application of one of these theories can become so preoccupied with the task of sifting between the lines that their experience—to say nothing of their enjoyment—of the lines themselves gets short shrift. And, as Boyd points out, “We learn more when evidence against a reading surfaces, since it forces us to account for a richer stock of information” (387, emphasis in original).

The distorting effect of theory can be seen in the Lacanian critics’ obliviousness toward several aspects of “The Dead” Joyce is trying to draw their attention to. Why, for instance, would Gabriel be embarrassed by Lily’s bitter response when there are no witnesses? She is just a lowly caretaker’s daughter; that he would be at all concerned with what she says or how she says it tells readers something about him. Joyce makes a point of suggesting that Gabriel’s blush is not born of embarrassment, but rather of his shame at offending Lily, accidental though the offense may have been. He “coloured as if he felt he had made a mistake” (178), and that mistake was supposing Lily would be getting married soon. That Lily’s bitterness at this supposition has anything to do with Gabriel as opposed to the one or more men who have frustrated her in love is unlikely (unless she has a crush on him). Concerning her “three mistresses,” she knows, for instance, that “the only thing they would not stand was back answers” (176), implying that she has ventured some in the past and been chastised for them. Aunt Kate even complains later about Lily’s recent behavior: “I’m sure I don’t know what has come over her lately. She’s not the girl she was at all” (181). This may or may not be enough evidence to support the idea that Lily’s recent transformation resulted from a courtship with one of those men who is all palaver, but it certainly goes a long way toward undermining readings like Kelley’s and Leonard’s.

When Gabriel thrusts a coin into Lily’s hand, his purpose may be “to reestablish his superiority” (202), as Kelley argues, but that assumes both that he has a sense of his own superiority and that he is eager to maintain it. If Gabriel feels so superior, though, why would he respond charitably rather than getting angry? Why, when Aunt Kate references Lily’s recent change, does he make ready “to ask his aunt some questions on this point” (181) instead of making a complaint or insisting on a punishment? A simpler and much more obvious motivation for the thrusting of the coin is to make amends for the accidental offense. But, astonishingly, Leonard concludes solely from Joyce’s use of the word “thrusting” that Lily and Gabriel’s “social intercourse is terminated in a manner that mimics sexual intercourse” (458). Yet another Lacanian, Ivan Trujillo, takes this idea a step further: having equated Gabriel’s blush with an orgasm, he sees the coin as the culmination of an act of prostitution, as paying her back “for his orgasmic feeling of shame” (3).

From the perspective of SMVA theory, laid out in Flesch’s Comeuppance: Costly Signaling, Altruistic Punishment, and Other Biological Components of Fiction, Gabriel’s blushing, which recurs later in the story in his encounter with Molly Ivors, suggests something far removed from sexual arousal. “Blushing is an honest signal of how one feels,” Flesch writes. “It is honest because we would suppress it if we could” (103). But what feeling might Gabriel be signaling? Flesch offers a clue when he explains,

Being known through hard-to-fake or costly or honest signaling to have the emotional propensity to act against our own rational interests helps those who receive our signals to solve the problem of whether they can trust us. Blushing, weeping, flushing with rage, going livid with shock: all these are reliable signals, not only of how we feel in a certain situation but of the fact that we generally emit reliable signals. It pays to be fathomable. People tend to trust those who blush (106, emphasis in original).

The most obvious information readers of “The Dead” can glean from Gabriel’s blushing at Lily’s response to his questions is that he is genuinely concerned that he may have offended her. Lacanians might counter that his real concern is with the authentication of his own sense of superiority, but again if he really felt so superior why would he care about offending the lowly caretaker’s daughter in an exchange with no witnesses? In fact, Lily’s back is turned, so even she misses the blush. It can be read as a signal from Joyce to readers to let them know a little about what kind of character Gabriel is.

Part 2.


Postmodernism and the Commodification of Authenticity

Our rebel cry against consumerism has long since been packaged and sold back to us. So while reading we’re not learning about others, certainly not learning about ourselves, and at the same time we’re fueling the machine we set out to sabotage.

Inspired by poststructuralism’s dropping of the referent—the conviction that language allows for no access to reality—authors, along with editors and publishers, have instigated a trend away from fiction with authoritative third-person narrators. Authority in general is now something of a bugaboo, a tired and transparent disguise for advocates of exclusionary and oppressive ideologies or discourses. The good intention behind this movement is to give voice to hitherto powerless minorities. But what it means to be powerless and what it means to be a minority are complicated matters. A different breed of truth has taken over. A capturing of, or a representing of, the true experience of this or that minority has become the ascendant pursuit. Authenticity has supplanted authority as the guiding principle of fiction.

One way to achieve authenticity is to be a minority and to tell stories, preferably through first-person narration, that are culled from your actual experiences—or at least figurative representations of those experiences. White guys have a harder time for obvious reasons. To represent their experiences with authenticity usually means portraying their characters as demented. But the biggest problem for attempts at authenticity through first-person narratives is that no human consciously attends to the quantities of detail that make up the most effective, the most vivid scenes in literature. Instead of the voices of real people in real, underrepresented circumstances, readers encounter bizarre, hyper-articulate dialect savants, hybrids of the socially impoverished and the mentally rich, as if liberal documentary filmmakers had been shrunk and surgically implanted into the brains of poor immigrants.

To say that such narrators are like no one you’ll ever meet only addresses the most superficial of the dilemmas inherent in the quest for authenticity. For narrators to tell their stories in obsessive detail they must have some reason to do so; they must be anticipating some effect on their readers. They must have an agenda, which seems to be nothing other than to advertise their own and indirectly the author’s authenticity. Take J. Lo singing “I’m real” as a refrain to one of her songs and a reprisal of the main theme of them all. It raises the question: if you’re so real, why must you so compulsively insist on it? The answer—and J. Lo knows it—is that being real has become a commodity.

Ironically, this emphasis on the experiences of individuals is well in keeping with the so-called western tradition. One of the appeals of postmodernism is that in privileging subjectivity and entirely ruling out objectivity it implies an unbridgeable chasm between one individual and another. In a world of nearly seven billion, this affirmation of uniqueness comes as a welcome assurance. No overlap between individuals means no redundancy, no superfluity. It may also suggest that we can never really know each other, but that’s okay as long as we accept each other as real. But real as opposed to what?

Being real means not being manufactured, artificial, mass produced. It means not being a poseur. Authenticity is our clarion call of resistance to industrialization, commercialism, globalization—even tourism. No wonder so many marketing firms have embraced it. This paradox has placed both writers and readers in an awkward position vis-à-vis the purpose fiction has traditionally served. We can’t truly understand other people. And our rebel cry against consumerism has long since been packaged and sold back to us. So while reading we’re not learning about others, certainly not learning about ourselves, and at the same time we’re fueling the machine we set out to sabotage.

Since poststructuralism’s original case for dropping the referent was pathetic to begin with, bringing reality, even the reality of a common humanity, back into the purview of literature suggests itself as a possible escape from this self-defeating solipsism.

Also read:
FROM DARWIN TO DR. SEUSS: DOUBLING DOWN ON THE DUMBEST APPROACH TO COMBATTING RACISM

SABBATH SAYS: PHILIP ROTH AND THE DILEMMAS OF IDEOLOGICAL CASTRATION

HOW TO BE INTERESTING: DEAD POETS AND CAUSAL INFERENCES


Absurdities and Atrocities in Literary Criticism

Poststructuralists believe that everything we see is determined by language, which encapsulates all of culture, so our perceptions are hopelessly distorted. What can be done then to arrive at the truth? Well, nothing—all truth is constructed. All that effort scientists put into actually testing their ideas is a waste of time. They’re only going to “discover” what they already know.

All literary theories (except formalism) share one common attraction—they speak to the universal fantasy of being able to know more about someone than that person knows about him- or herself. If you happen to be a feminist critic, for instance, then you will examine some author’s work and divine his or her attitude toward women. Because feminist theory insists that all or nearly all texts exemplify patriarchy if they’re not enacting some sort of resistance to it, the author in question will invariably be exposed as either a sexist or a feminist, regardless of whether or not that author intended to make any comment about gender. The author may complain of unfair treatment; indeed, there really is no clearer instance of unchecked confirmation bias. The important point, though, is that the writer of the text supposedly knows little or nothing about how the work functions in the wider culture, what really inspired it at an unconscious level, and what readers will do with it. Substitute bourgeois hegemony for patriarchy in the above formula and you have Marxist criticism. Deconstruction exposes hidden hierarchies. New Historicism teases out dominant and subversive discourses. And none of them flinches at objections from authors that their work has been completely misunderstood.

This has led to a sad, self-righteous state of affairs in English departments. The first wrong turn was taken by Freud when he introduced the world to the unconscious and subsequently failed to come up with a method that could bring its contents to light with any reliability whatsoever. It’s hard to imagine how he could’ve been more wrong about the contents of the human mind. As Voltaire said, “He who can make you believe absurdities can make you commit atrocities.” No sooner did Freud start writing about the unconscious than he began arguing that men want to kill their fathers and have sex with their mothers. Freud and his followers were fabulists who paid lip service to the principles of scientific epistemology even as they flouted them. But then came the poststructuralists to muddy the waters even more. When Derrida assured everyone that meaning derived from the play of signifiers, which actually meant meaning is impossible, and that referents—to the uninitiated, referents mean the real world—must be denied any part to play, he was sounding the death knell for any possibility of a viable epistemology. And if truth is completely inaccessible, what’s the point of even trying to use sound methods? Anything goes.

Since critics like to credit themselves with having good political intentions like advocating for women and minorities, they are quite adept at justifying their relaxing of the standards of truth. But just as Voltaire warned, once those standards are relaxed, critics promptly turn around and begin making accusations of sexism and classism and racism. And, since the accusations aren’t based on any reasonable standard of evidence, the accused have no recourse to counterevidence. They have no way of defending themselves. Presumably, their defense would be just another text into which the critics could read still more evidence of whatever crime they’re primed to find.

The irony here is that the scientific method was first proposed, at least in part, as a remedy for confirmation bias, as can be seen in this quote from Francis Bacon’s 1620 treatise Novum Organum:

The human understanding is no dry light, but receives infusion from the will and affections; whence proceed sciences which may be called “sciences as one would.” For what a man had rather were true he more readily believes. Therefore he rejects difficult things from impatience of research; sober things, because they narrow hope; the deeper things of nature, from superstition; the light of experience, from arrogance and pride; things commonly believed, out of deference to the opinion of the vulgar. Numberless in short are the ways, and sometimes imperceptible, in which the affections color and infect the understanding.

Poststructuralists believe that everything we see is determined by language, which encapsulates all of culture, so our perceptions are hopelessly distorted. What can be done then to arrive at the truth? Well, nothing—all truth is constructed. All that effort scientists put into actually testing their ideas is a waste of time. They’re only going to “discover” what they already know.

But wait: if poststructuralism posits that discovery is impossible, how do its adherents account for airplanes and nuclear power? Just random historical fluctuations, I suppose.

The upshot is that, having declared confirmation bias inescapable, critics embraced it as their chief method. You have to accept their relaxed standard of truth to accept their reasoning about why we should do away with all standards of truth. And you just have to hope like hell they never randomly decide to set their sights on you or your work. We’re lucky as hell the legal system doesn’t work like this. And we can thank those white boys of the Enlightenment for that.

Also read:

CAN’T WIN FOR LOSING: WHY THERE ARE SO MANY LOSERS IN LITERATURE AND WHY IT HAS TO CHANGE

WHY SHAKESPEARE NAUSEATED DARWIN: A REVIEW OF KEITH OATLEY'S "SUCH STUFF AS DREAMS"

SABBATH SAYS: PHILIP ROTH AND THE DILEMMAS OF IDEOLOGICAL CASTRATION


Poststructuralism: Banal When It's Not Busy Being Absurd

The dominant epistemology in the humanities, and increasingly the social sciences, is postmodernism, also known as poststructuralism—though pedants will object to the labels. It’s the idea that words have shaky connections to the realities they’re supposed to represent, and they’re shot through with ideological assumptions serving to perpetuate the current hegemonies in society. As a theory of language and cognition, it runs counter to nearly all the evidence that’s been gathered over the past century.

            Reading the chapter in one of my textbooks on Poststructuralism, I keep wondering why this paradigm has taken such a strong hold of scholars' minds in the humanities. In a lot of ways, the theories that fall under its aegis are really simple--overly simple in fact. The structuralism that has since been posted was the linguistic theory of Ferdinand de Saussure, who held that words derive their meanings from their relations to other, similar words. Bat means bat because it doesn't mean cat. Simple enough, but Saussure had to gussy up his theory by creating a more general category than "words," which he called signs. And, instead of talking about words and their meanings, he asserted that every sign is made up of a signifier (word) and a signified (concept or meaning).

            What we don't see much of in Saussure's formulation of language is its relation to objects, actions, and experiences. These he labeled referents, and he didn't think they play much of a role. And this is why structuralism is radical. The common-sense theory of language is that a word's meaning derives from its correspondence to the object it labels. Saussure flipped this understanding on its head, positing a top-down view of language. What neither Saussure nor any of his acolytes seemed to notice is that structuralism can only be an incomplete description of where meaning comes from because, well, it doesn't explain where meaning comes from--unless all the concepts, the signifieds, are built into our brains. (Innate!)

            Saussure's top-down theory of language has been, unbeknownst to scholars in the humanities, thoroughly discredited by research in developmental psychology going back to Jean Piaget, which shows that children's language acquisition begins very concretely and only later in life enables them to deal in abstractions. According to our best evidence, the common-sense, bottom-up theory of language is correct. But along came Jacques Derrida to put the post to structuralism--and make it even more absurd. Derrida realized that if words' meanings come from their relations to similar words, then discerning any meaning at all from any given word is an endlessly complicated endeavor. Bat calls to mind not just cat, but also mat, and cad, and cot, ad infinitum. Now, it seems to me that this is a pretty effective refutation of Saussure's theory. But Derrida didn't scrap the faulty premise; instead he drew an amazing conclusion from it: that meaning is impossible.

            Now, to round out the paradigm, you have to import some Marxism. Logically speaking, such an importation is completely unjustified; in fact, it contradicts the indeterminacy of meaning, making poststructuralism fundamentally unsound. But poststructuralists believe all ideas are incoherent, so this doesn't bother them. The Marxist element is the idea that there is always a more powerful group foisting its ideology on the less powerful. Derrida spoke of binaries like man and woman--a man is a man because he's not a woman--and black and white--blacks are black because they're not white. We have to ignore the obvious objection that some things can be defined according to their own qualities without reference to something else. Derrida's argument is that in creating these binaries to structure our lives we always privilege one side over the other (men and whites of course--even though Saussure and Derrida were both). So literary critics inspired by Derrida "deconstruct" texts to expose the privileging they take for granted and perpetuate. This gives wonks the gratifying sense of being engaged in social activism.

            Is it the esotericism of these ideas that makes them so appealing to humanities scholars--the conviction that they possess an understanding that supposedly discredits what the hoi polloi, or even scientists and historians and writers of actual literature, know? Really poststructuralism is nonsense on stilts riding a unicycle. It's banal in that it takes confirmation bias as a starting point, but it's absurd in that it insists this makes knowledge impossible. The linguist founders were armchair obscurantists whose theories have been disproved. But because of all the obscurantism, learning the banalities and catching out the absurdities takes a lot of patient reading. So is the effort invested in learning the ideas a factor in making them hard to discount outright? After all, that would mean a lot of wasted effort.

Also read:

PUTTING DOWN THE PEN: HOW SCHOOL TEACHES US THE WORST POSSIBLE WAY TO READ LITERATURE

THE SELF-RIGHTEOUSNESS INSTINCT: STEVEN PINKER ON THE BETTER ANGELS OF MODERNITY AND THE EVILS OF MORALITY

FROM DARWIN TO DR. SEUSS: DOUBLING DOWN ON THE DUMBEST APPROACH TO COMBATTING RACISM


From Rags to Republican

Written at a time in my life when I wanted to argue politics and religion with anyone who was willing—and some who weren’t—this essay starts with the observation that when you press conservatives for evidence or logic to support their economic theories, they’ll instead tell you how their own personal stories somehow prove it’s possible to start with little and beat the odds.

One of the dishwashers at the restaurant where I work likes to light-heartedly discuss politics with me. “How are things this week on the left?” he might ask. Not even in his twenties yet, he can impressively explain why it’s wrong to conflate communism with Stalinism. He believes the best government would be a communist one, but until we figure out how to establish it, our best option is to go Republican. He loves Rush Limbaugh. One day I was talking about disparities in school funding when he began telling me why he doesn’t think that sort of thing is important. “I did horribly in school, but I decided I wanted to learn on my own.”

He went on to tell me about a terrible period he went through growing up, after his parents got divorced and his mother was left nearly destitute. The young dishwasher had pulled himself up by his own bootstraps. The story struck me because about two weeks earlier I’d been discussing politics with a customer in the dining room who told a remarkably similar one. He was eating with his wife and their new baby. When I disagreed with him that Obama’s election was a national catastrophe he began an impromptu lecture on conservative ideology. I interrupted him, saying, “I understand top-down economics; I just don’t agree with it.” But when I started to explain the bottom-up theory, he interrupted me with a story about how his mom was on food stamps and they had nothing when he was a kid, and yet here he is, a well-to-do father (he even put a number on his prosperity). “I’m walking proof that it is possible.”

I can go on and on with more examples. It seems like the moment anyone takes up the mantle of economic conservatism for the first time he (usually males) has to put together one of these rags-to-riches stories. I guess I could do it too, with just a little exaggeration. “My first memories are of living in government subsidized apartments, and my parents argued about money almost every day of my life when I was a kid, and then they got divorced and I was devastated—I put on weight until I was morbidly obese and I went to a psychologist for depression because I missed a month of school in fourth grade.” (Actually, that’s not exaggerated at all.)

The point we’re supposed to take away is that hardship is good and that no matter how bad being poor may appear it’s nothing a good work ethic can’t fix. Invariably, the Horatio Alger hero proceeds to the non sequitur that his making it out of poverty means it’s a bad idea for us as a society to invest in programs to help the poor. Push him by asking what if the poverty he experienced wasn’t as bad as the worst poverty in the country, or where that work ethic that saved him came from, and he’ll most likely shift gears and start explaining that becoming a productive citizen is a matter of incentives.

The logic runs: if you give money to people who aren’t working, you’re taking away the main incentive they had to get off their asses and go to work. Likewise, if you take money away from the people who have earned it by taxing them, you’re giving them a disincentive to continue being productive. This is a folksy version of a Skinner Box: you get the pigeons to do whatever tricks you want by rewarding them with food pellets when they get close to performing them correctly—“successive approximations” of the behavior—and punishing them by withholding food pellets when they go astray. What’s shocking is that this is as sophisticated as the great Reagan Revolution ever got. It’s a psychological theory that was recognized as too simplistic in the 1950’s, writ large to explain the economy. What if people can make money in ways other than going to work, say, by selling drugs? The conservatives’ answer—more police, harsher punishments. But what if money isn’t the only reward people respond to? And what if prison doesn’t work like it’s supposed to?

The main appeal, I think, of Skinner Box Economics is that it says, in effect, don’t worry about having more than other people because you’ve earned what you have. You deserve it. What a relief to hear that we have more because we’re just better people. We needn’t work ourselves up over the wretched plight of the have-nots; if they really wanted to, they could have everything we have. To keep this line of reasoning afloat you need to buoy it up with a bit of elitism: so maybe offering everyone the same incentives won’t make everyone rich, but the smartest and most industrious people will be alright. If you’re doing alright, then you must be smart and industrious. And if you’re filthy rich, say, Wall Street banker rich, then, well, you must be one amazing S.O.B. How much money you have becomes an index of how virtuous you are as a person. And some people are so amazing in fact that the worst thing society can do is hold them back in any way, because their prosperity is so awesome it benefits everyone—it trickles down. There you have it, a rationale for letting rich people do whatever they want, and leaving poor people to their own devices to pull up their own damn bootstraps. This is the thinking that has led even our Democratic president to believe that he needs to pander to Wall Street to save the economy. This is conservatism. And it’s so silly no adult should entertain it for more than a moment.

A philosophy that further empowers the powerful, that justifies the holding of power over the masses of the less powerful, ought to be appealing to anyone who actually has power. But it’s remarkable how well these ideas trickle down to the rest of us. One way to account for the assimilation of Skinner Box Economics among the middle class is that it is, after all, the middle class: people in it still have to justify being more privileged than those in the lower classes. But the real draw probably has little to do with any recognition of one’s actual circumstances; it relies rather on a large-scale obliviousness of them. Psychologists have been documenting for years the power of two biases we all fall prey to that have bearing on our economic thinking. The first is the self-serving bias, according to which we take credit any time we succeed at something but point to forces beyond our control whenever we fail. One of the best examples of the self-serving bias is research showing that the percentage of people who believe themselves to be better-than-average drivers is in the nineties—even among those who’ve recently been at fault in a traffic accident. (Sounds like Wall Street.) The second bias, which is the flipside of the first, is the fundamental attribution error, according to which we privilege attributions of persistent character traits to other people in explaining their behavior at the expense of external, situational factors—when someone cuts us off while we’re driving we immediately conclude that person is a jerk, even though we attribute the same type of behavior in ourselves to our being late for a meeting.

Any line of thinking that leads one away from the comforting belief in his or her own infinite capacity for self-determination will inevitably fail to take hold in the minds of those who rely on intuition as a standard of truth. That’s why the conservative ideology is such an incoherent mess: on the one hand, you’re trying to create a scientific model for how the economy works (or doesn’t), but on the other you’re trying not only to leave intact people’s faith in free will but also to bolster it, to elevate it to the status of linchpin to the entire worldview. But free will and determinism don’t mix, and unless you resort to religious concepts of non-material souls there’s no place to locate free will in the natural world. The very notion of free will is self-serving to anyone at all successful in his or her life—and that’s why self-determination, in the face of extreme adversity, is fetishized by the right. That’s why every conservative has a rags-to-riches story to offer as proof of the true nature of economic forces.

The real wonder of the widespread appeal of conservatism is the enormous capacity it suggests we all have for taking our advantages for granted. Most people bristle when you even use the words advantage or privilege—as if you’re undermining their worth or authenticity as a person. But the advantages middle class people enjoy are glaring and undeniable. Sure, many of us were raised by single mothers who went through periods of hardship. I’d say most of us, though, had grandparents around who were willing to lend a helping hand here and there. And even if these grandparents didn’t provide loans or handouts they did provide the cultural capital that makes us recognizable to other middle class people as part of the tribe. What makes conservative rags-to-riches stories impossible prima facie is that the people telling them know the plot elements so well, meaning someone taught them the virtue of self-reliance, and they tell them in standard American English, with mouths full of shiny, straight teeth, in accents that belie the story’s gist. It may not seem, in hindsight, that they were comfortably ensconced in the middle class, but at the very least they were surrounded by middle class people, and benefiting from their attention.

You might be tempted to conclude that the role of contingency is left out of conservative ideology, but that’s not really the case. Contingency in the form of bad luck is incorporated into conservative thinking through the very narratives of triumph over adversity that are offered as proof of the fatherly wisdom of the free market. In this way, the ideology is inextricably bound to the storyteller’s authenticity as a person. I suffered and toiled, the storyteller reasons, and therefore my accomplishments are genuine, my character strong. The corollary to this personal investment in what is no longer merely an economic theory is that any dawning awareness of people in worse circumstances than those endured and overcome by the authentic man or woman will be resisted as a threat to that authenticity. If they were to accept that they had it better or easier than some, then their victories would be invalidated. They are thus highly motivated to discount, or simply not to notice, contingencies like generational or cultural advantages.

I’ve yet to hear a rags-to-riches story that begins with a malnourished and overstressed mother giving birth prematurely to a cognitively impaired and immuno-compromised baby, and continues with a malnourished and neglected childhood in underperforming schools where neither a teacher nor a classmate can be found who places any real value on education, and ends with the hard-working, intelligent person you see in front of you, who makes a pretty decent income and is raising a proud, healthy family. Severely impoverished people live in a different world, and however bad we middle-class toilers think we’ve had it we should never be so callous and oblivious as to claim we’ve seen and mastered that world. But Skinner Box Economics doesn’t just fail because some of us are born less able to perform successive approximations of the various tricks of productivity; it fails because it’s based on an inadequate theory of human motivation. Rewards and punishments work to determine our behavior, to be sure, but the only people who sit around calculating outcomes and navigating incentives and disincentives with a constant eye toward the bottom line are the rich executives who benefit most from a general acceptance of supply-side economics.

The main cultural disadvantage for people growing up in poor families in poor neighborhoods is that the individuals who are likely to serve as role models there will seldom be the beacons of middle-class virtue we stupidly expect our incentive structure to produce. When I was growing up, I looked up to my older brothers, and wanted to do whatever they were doing. And I looked up to an older neighbor kid, whose influence led me to race bikes at local parks. Later my role models were Jean Claude Van Damme and Arnold Schwarzenegger, so I got into martial arts and physical fitness. Soon thereafter, I began to idolize novelists and scientists. Skinnerian behaviorism has been supplanted in the social sciences by theories emphasizing the importance of observational learning, as well as the undeniable role of basic drives like the one for status-seeking. Primatologist Frans de Waal, for instance, has proposed a theory of cultural transmission—in both apes and humans—called BIOL, for bonding- and identification-based observational learning. What this theory suggests is that our personalities are largely determined by a proclivity for seeking out high-status individuals whom we admire, assimilating their values and beliefs, and emulating their behavior. Absent a paragon of the Calvinist work ethic, no amount of incentives is going to turn a child into the type of person who tells conservative rags-to-riches stories.

The thing to take away from these stories is usually that there is a figure or two who perform admirably in them—the single mom, the determined dad, the charismatic teacher. And the message isn’t about economics at all but about culture and family. Conservatives tout the sanctity of family and the importance of good parenting, but when they come face-to-face with the products of poor parenting they see only the products of bad decisions. Middle class parents go to agonizing lengths to ensure their children grow up in good neighborhoods and attend good schools, but suggest to these same parents that how well someone behaves is a function of how much they have—how much love and attention, how much healthy food and access to doctors, how much they can count on their struggles being worthwhile—and they will warn you about the dangers of making excuses.

The real proof of how well conservative policies work is not to be found in anecdotes, no matter how numerous; it’s in measures of social mobility. The story these measures tell about the effects of moving farther to the right as a country contrasts rather starkly with all the rags-to-Republican tales of personal heroism. But then numbers aren’t really stories; there’s no authenticity and self-congratulation to be gleaned from statistics; and if it’s really true that we owe our prosperity to chance, well, that’s just depressing—and discouraging. We can take some encouragement from our stories of hardship, though. We just have to note how often the evidence they provide of poverty—food stamps, rent-controlled housing—consists of government programs to aid the impoverished. They must be working.

Also read:

WHAT'S WRONG WITH THE DARWIN ECONOMY?


A Literary Darwinist Take on Death in Venice Part 1

Aschenbach’s fate in the course of the novel can be viewed as hinging on whether he will be able, once he’s stepped away from the lonely duty of his writing, to establish intimate relationships with real humans, as he betrays a desperate longing to do. When he ultimately fails in this endeavor, largely because he fails to commit himself to it fully, Mann has the opportunity to signal to his readers how grave the danger is that every artist, including Thomas Mann, must face as he stands at the edge of the abyss.

There is comfort to be had in the orderliness of solitude, but that orderliness will be the first casualty in any encounter with other people. Such is the experience of Gustav von Aschenbach in Thomas Mann’s 1911 novel Death in Venice. Aschenbach has not, however, strived for solitude and order for the sake of comfort—at least not by his own account—but rather for the sake of his art, to which he has devoted himself single-mindedly, even monomaniacally, his whole life. Now, at age fifty, newly elevated to a titled status, Aschenbach has become acutely aware of all he has sacrificed on the altar of his accomplishment. The desire for fame, as philosopher David Hume explained, is paradoxically an altruistic one. At least in the short-term, no one has anything to gain from the dedication and toil that are the hallmark of ambition. And status will tend to be awarded to those whose services or accomplishments benefit society at large and not just any select part of it the ambitious person has special designs on or interest in. As selfish as we may seem at first glance, we humans tend to be drawn to the ambitious for the other-directedness their ambition signals.

The evolutionary literary critic William Flesch incorporates Hume’s argument into the theoretical framework he lays out in Comeuppance: Costly Signaling, Altruistic Punishment, and Other Biological Components of Fiction, in which he posits that one of our biggest joys in reading fictional narratives derives from our capacity to track characters while anticipating rewards for the altruistic and comeuppance for the selfish. With this biological perspective in mind, Aschenbach’s fate in the course of the novel can be viewed as hinging on whether he will be able, once he’s stepped away from the lonely duty of his writing, to establish intimate relationships with real humans, as he betrays a desperate longing to do. When he ultimately fails in this endeavor, largely because he fails to commit himself to it fully, Mann has the opportunity to signal to his readers how grave the danger is that every artist, including Thomas Mann, must face as he stands at the edge of the abyss.

Mann’s novel was published at an interesting time, not just geopolitically, but in the realm of literary theory as well. Most notably, the years leading up to 1911 saw the ascendancy of Freudian psychoanalysis. Mann has even suggested that Death in Venice was at least partly inspired by Freud’s ideas (Symington, 128). And it has gone on to be re-evaluated countless times in light not only of psychoanalytic developments but of those of several other newly christened and burgeoning literary theories. Readers of this nearly hundred-year-old story may rightly ask whether it has any meaning to anyone not steeped in such paradigms, especially since the value—and validity—of literary theory in general, and of psychoanalysis in particular, are being questioned in many arenas. Terry Eagleton notes in the preface to the 25th Anniversary Edition of his popular Literary Theory: An Introduction that there has been “in recent times the growth of a kind of anti-theory” (vii). In the original preface to the same work, he writes:

Some students and critics…protest that literary theory “gets in between the reader and the work.” The simple response to this is that without some kind of theory, however unreflective and implicit, we would not know what a “literary work” was in the first place, or how we were to read it (xii).

Authors like Flesch, however, along with others who subscribe to the recently developed set of theories collectively labeled Literary Darwinism, would probably insist that Eagleton vastly underestimates just how unreflective and implicit our appreciation of narrative really is.

If there are cases, though, in which Eagleton’s argument holds up, they would probably be those works which are heavily influenced by the theories that would be referenced to interpret them, and Death in Venice certainly falls into that category. But these special cases shouldn’t overshadow the fact that when Eagleton makes the seemingly obvious point that we must have some theory of literature if we’re to make any sense of our reading, he is in fact making a rather grand assumption, one in keeping with a broader poststructuralist paradigm. According to this view, objectivity is impossible because our only real contact with the world and its inhabitants is through language. This observation, which in a banal way is indisputable—if it’s not rendered linguistically we can’t speak or write about it—takes the emphasis away from any firsthand experience with either the world or the text and affords to language the utmost power in determining our beliefs, and even our perceptions. The flipside of this linguistic or discursive determinism is that any social phenomenon we examine, from a work of fiction to the institutionalized marginalization of women and minorities, is somehow encapsulated in and promulgated through the medium of language. Poststructuralism has led many to the conclusion that the most effective remedy for such inequality and injustice consists of changing the way we talk and write about people and their relations. This political program, disparaged (accurately) by conservatives with the label “political correctness,” has been singularly ineffective.

One possible explanation for this failure is that the poststructuralists’ understanding of human nature and human knowledge is grossly off the mark. Indeed, to Eagleton’s claim that we need a theory of literature or of language to get meaning out of a novel, most linguists, cognitive neuroscientists, and any other scientist involved in the study of human behavior would simply respond nonsense. Almost all of the “structures” discursive determinists insist are encapsulated in and propagated through language are to be found elsewhere in human (and sometimes non-human) cognition and in wider cultural networks. It is perhaps a partial concession to the argument that discursive determinism can only lead to infinite regresses, and that any theory of literature must be grounded in a wider understanding of human nature, that the longest chapter of Eagleton’s book is devoted to psychoanalysis. And what could Death in Venice be if not a tale about a repressed homosexual who has achieved eminence through the disciplined sublimation of his desires into literature, but who eventually buckles under the strain and succumbs to perversion and sickness? More importantly, if Freud’s model of the unconscious has been shown to be inaccurate, and repression a mere chimera, must Mann’s novel be relegated to a category of works whose interest is solely historical? (One of the most damning refutations of Eagleton’s argument for the necessity of theory is that such a category is so difficult to fill.)

If Flesch is correct in arguing that our interest in fiction is inseparable from our propensity for tracking other people, assessing their proclivity toward altruism, and anticipating the apportionment of just deserts, then Gustav von Aschenbach, who has devoted his life to solitary public service, but who over the course of the novel abandons that service and sets out on an adventure of multiple potential encounters with flesh-and-blood humans, may still attract the attention of post-Freudian (or simply non-Freudian) readers. Another way to frame the repression-sublimation-perversion dynamic central to Death in Venice is as an enactment of the benefits of an intense devotion to art being overwhelmed by its costs and risks. An excerpt that can serve as a key to unlocking the symbolism of the entire novel comes when Aschenbach is at last settled in his hotel in Venice:

The observations and encounters of a devotee of solitude and silence are at once less distinct and more penetrating than those of the sociable man; his thoughts are weightier, stranger, and never without a tinge of sadness. Images and perceptions which might otherwise be easily dispelled by a glance, a laugh, an exchange of comments, concern him unduly, they sink into mute depths, take on significance, become experiences, adventures, emotions. The fruit of solitude is originality, something daringly and disconcertingly beautiful, the poetic creation. But the fruit of solitude can also be the perverse, the disproportionate, the absurd and the forbidden (43).

Can Aschenbach, a devotee of solitude, be considered prosocial or altruistic? He can when the fruit of his solitude is the poetic creation prized by society as a whole. However, the plot of the story focuses more on the perverse and the forbidden, on the great man’s fall from grace. And yet these costs are suffered, not by society, but by the artist alone, so in the end he can be seen as even more of an altruist—he is in fact a martyr. (And many a poststructuralist critic would take this opportunity to highlight the word art in the middle of martyr.)

In the lead-up to this martyrdom, however, Aschenbach wades into the very selfish waters of pedophilia. What little suspense the plot has to offer comes from uncertainty over how far the august author will allow his obsession with the young boy Tadzio to take him. Are we monitoring Aschenbach to see if he gives in to temptation? Interestingly, his attraction to the boy is never explicitly described as sexual. There are suggestive lines, to be sure, especially those coming in the wake of Aschenbach’s discovery of the epidemic being covered up by the Venetian authorities. His response is to become elated.

For to passion, as to crime, the assured everyday order and stability of things is not opportune, and any weakening of the civil structure, any chaos and disaster afflicting the world, must be welcome to it, as offering a vague hope of turning such circumstances to its advantage (68).

This line can not only be read as proof that Aschenbach indeed has a selfish desire to satisfy, a passion awaiting the opportunity to press—or take—its advantage; it can also be seen as a piece of the puzzle of his motivation for coming to Venice in the first place. Did he gravitate to this place because of the disruption of the daily order, the chaos, it promised? Soon after the narrator refers to this “vague hope,” he reveals that Aschenbach has begun doing more than merely watching Tadzio—he’s been following him around. All the while, though, the unease over whether the devotee of solitude will ever get close enough to do any sort of harm to the object of his obsession is undercut by the great pains he takes just to keep the boy in view, juxtaposed with the fact that it never seems to occur to him simply to approach the boy and strike up a conversation.

Read part 2

Dennis Junk

Edgar Allan Poe Doesn't Like You

What does the black cat in Edgar Allan Poe’s story “The Black Cat” symbolize? The old man who’s murdered in “The Tell-Tale Heart” had one eye that stared intensely at the narrator. The black cat, Pluto, in the other story likewise ends up with a single eye. This is Poe poking fun at his half-blind readers. What does that tell us? Well, Pluto, it so happens, isn’t merely the god of the underworld. Pluto was also a god of wealth. Poe had to appeal to the half-blind to earn his wealth.

Well, it's October, and I'm reminded of a 20-page paper about Poe I wrote all the way back in the spring semester. (Seriously, that's a long time to be thinking about a paper.) It's not very often that you have the chance to make any actual discoveries when you're researching papers in English (or any other topic, for that matter). But I came up with something new--as far as I can tell.

Before you read any further, you have to have read "The Tell-Tale Heart," "The Black Cat," "The Imp of the Perverse," and "The Cask of Amontillado" to really understand what I'm about to write. If you haven't read them, get cracking here.

Now, a lot of the murders that take place in Poe stories seem to have little or no motive. The conventional way to account for this is to conclude that the murderers are simply insane, or that their motives are unconscious. But, knowing as we do that psychoanalysis is worthless pseudoscience, and that Poe predated Freud, we might want better answers. The stories above are listed in the order in which they were written, and I suggest there's a progression. In "Tell-Tale," the murder victim is an old man who is killed because of his one staring eye. In "Black Cat," the victims are a one-eyed cat (whose earlier maiming at the narrator's hands makes him a dual victim) and the murderer's/narrator's wife. The only thing mentioned about the victim in "Imp" is that he was fond of reading by candlelight. And Fortunato in "Amontillado" is a rich drunk who gets killed because he treats Montresor contemptuously.

The key to unlocking the true victim of Poe's aggression comes from an examination of these victims' names--or rather, the two of them that Poe deigns to give names. One of course is Fortunato, which is pretty suggestive in itself. The other belongs to the hapless cat, Pluto. Now, most readers with any knowledge of mythology at all see the name Pluto and think, ah, the god of the underworld. Poe's a creepy guy, so in this horror story it only fits that a cat would have a name vaguely associated with Hell. But Pluto--or Dis--has been tied to wealth throughout history. In Dante's Inferno, for instance, Fortuna and Plutus (actually a separate deity but often conflated) are in the fourth circle guarding the avaricious and prodigal. Even the name Pluto translates to "wealth-giver."

The old man in "Tell-Tale" looks at the narrator with only one eye because he's half blind. Poe intended much of his work as satire, but his readers constantly mistook his work for what it was designed to parody. The old man only saw the horror story, and was blind to the joke. The cat, Pluto, is likewise a stand-in for Poe's half-blind readership, on whom he depends for his wealth. That's why in trying to kill the cat all the narrator manages to do is kill his wife. (Poe's wife had TB, and he agonized over the shabby, unheated rooms his poverty relegated them to.) Notice that in all these stories the narrator can't help bragging about how brilliantly he planned and executed the murders--only to be found out later by some supernatural means. The dead reader in "Imp" is pretty self-explanatory, and is possibly a reference to the symbolic murder in "Black Cat." And then there's Montresor, who tells his story several years after committing the crime, because he's sure now he won't be caught--no one at that point in Poe's life had broken his coded message, and he had little confidence anyone would.

So when it seems like Poe is going over the top, as silly critics like Harold Bloom take him to task for doing, keep in mind he's doing it deliberately. He's expecting to be taken only half seriously. And if you never picked up on this he'd just as soon bury an ax in your stupid skull and leave you walled up in some cellar, never to be found.

You can read the longer essay: DEFIANCE AND DUPLICITY: DECODING POE’S ATTACKS ON READERS PART 1 OF 4
