
Nice Guys with Nothing to Say: Brett Martin’s Difficulty with “Difficult Men” and the Failure of Arts Scholarship

With his book Difficult Men: Behind the Scenes of a Creative Revolution: From “The Sopranos” and “The Wire” to “Mad Men” and “Breaking Bad”, Brett Martin shows that you can apply the whole repertoire of analytic tools furnished by contemporary scholarship in the arts to a cultural phenomenon without arriving at anything even remotely approaching an insight. Which isn’t to say the book isn’t worth reading: if you’re interested in the backstories of how cable TV series underwent their transformation to higher production quality, film-grade acting and directing, greater realism, and multiple, intricately interlocking plotlines, along with all the gossip surrounding the creators and stars, then you’ll be delighted to discover how good Martin is at delivering the dish. 

He had excellent access to some of the showrunners, seems to know everything about the ones he didn’t have access to anyway, and has a keen sense for the watershed moments in shows—as when Tony Soprano snuck away from scouting out a college with his daughter Meadow to murder a man, unceremoniously, with a smile on his face, despite the fears of HBO executives that audiences would turn against the lead character for doing so. And Difficult Men is in no way a difficult read. Martin’s prose is clever without calling too much attention to itself. His knowledge of history and pop culture rivals that of anyone in the current cohort of hipster sophisticates. And his enthusiasm for the topic radiates off the pages while not marring his objectivity with fanboyism. But if you’re more interested in the broader phenomenon of unforgivable male characters audiences can’t help loving, you’ll have to look elsewhere for any substantive discussion of it.

Brett Martin
Difficult Men would have benefited from Martin being a more difficult man himself. Instead, he seems at several points to be apologizing on behalf of the show creators and their creations, simultaneously ecstatic at the unfettering of artistic freedom and skittish whenever bumping up against questions about what the resulting shows are reflecting about artists and audiences alike. He celebrates the shows’ shucking off of political correctness even as he goes out of his way to brandish his own PC bona fides. With regard to his book’s focus on men, for instance, he writes,

Though a handful of women play hugely influential roles in this narrative—as writers, actors, producers, and executives—there aren’t enough of them. Not only were the most important shows of the era run by men, they were also largely about manhood—in particular the contours of male power and the infinite varieties of male combat.
Why that was had something to do with a cultural landscape still awash in postfeminist dislocation and confusion about exactly what being a man meant. (13)

Martin throws multiple explanations at the centrality of “male combat” in high-end series, but the basic fact that he suggests accounts for the prevalence of this theme across so many shows in TV’s Third Golden Age is that most of the artists working on the shows are afflicted with the same preoccupations.

In other words, middle-aged men predominated because middle-aged men had the power to create them. And certainly the autocratic power of the showrunner-auteur scratches a peculiarly masculine itch. (13)

Never mind that women make up a substantial portion of the viewership. If it ever occurred to Martin that this alleged “masculine itch” may have something to do with why men outnumber women in high-stakes competitive fields like TV scriptwriting, he knew better than to put the suspicion in writing.

            The centrality of dominant and volatile male characters in America’s latest creative efflorescence is in many ways a repudiation of the premises underlying the scholarship of the decades leading up to it. With women moving into the workplace after the Second World War, and with the rise of feminism in the 1970s, the stage was set for an experiment in how malleable human culture really was with regard to gender roles. How much change did society’s tastes undergo in the latter half of the twentieth century? Despite his emphasis on “postfeminist dislocation” as a factor in the appeal of TV’s latest crop of bad boys, Martin is savvy enough to appreciate these characters’ long pedigree, up to a point. He writes of Tony Soprano, for instance,

In his self-absorption, his horniness, his alternating cruelty and regret, his gnawing unease, Tony was, give or take Prozac and one or two murders, a direct descendant of Updike’s Rabbit Angstrom. In other words, the American Everyman. (84)

According to the rules of modern criticism, it’s okay to trace creative influences along their historical lineages. And Martin is quite good at situating the Third Golden Age in its historical and technological context:

The ambition and achievement of these shows went beyond the simple notion of “television getting good.” The open-ended, twelve- or thirteen-episode serialized drama was maturing into its own, distinct art form. What’s more, it had become the signature American art form of the first decade of the twenty-first century, the equivalent of what the films of Scorsese, Altman, Coppola, and others had been to the 1970s or the novels of Updike, Roth, and Mailer had been to the 1960s. (11)

What you’re not allowed to do, however—and what Martin knows better than to try to get away with—is notice that all those male filmmakers and novelists of the 60s and 70s were dealing with the same themes as the male showrunners Martin is covering. Is this pre-feminist dislocation? Mad Men could’ve featured Don Draper reading Rabbit, Run right after it was published in 1960. In fact, Don bears nearly as much resemblance to the main character of what was arguably the first novel ever written, The Tale of Genji, by the eleventh-century Japanese noblewoman, Murasaki Shikibu, as Tony Soprano bears to Rabbit Angstrom.

            Missed connections, tautologies, and non sequiturs abound whenever Martin attempts to account for the resonance of a particular theme or show, and at points his groping after insight is downright embarrassing. Difficult Men, as good as it is on history and the politicking of TV executives, can serve as a case study in the utter banality and logical bankruptcy of scholarly approaches to discussing the arts. These politically and academically sanctioned approaches can be summed up concisely, without scanting any important nuances, in the space of a paragraph. While any proposed theory about average gender differences with biological bases must be strenuously and vociferously criticized and dismissed (and its proponents demonized without concern for fairness), any posited connection between a popular theme and contemporary social or political issues is seen not just as acceptable but as automatically plausible, to the point where after drawing the connection the writer need provide no further evidence whatsoever.

One of several explanations Martin throws out for the appeal of characters like Tony Soprano and Don Draper, for instance, is that they helped liberal HBO and AMC subscribers cope with having a president like George W. Bush in office. “This was the ascendant Right being presented to the disempowered Left—as if to reassure it that those in charge were still recognizably human” (87). But most of Mad Men’s run, and Breaking Bad’s too, has been under a President Obama. This doesn’t present a problem for Martin’s analysis, though, because there’s always something going on in the world that can be said to resonate with a show’s central themes. Of Breaking Bad, he writes,

Like The Sopranos, too, it uncannily anticipated a national mood soon to be intensified by current events—in this case the great economic unsettlement of the late aughts, which would leave many previously secure middle-class Americans suddenly feeling like desperate outlaws in their own suburbs. (272)

If this strikes you as comically facile, I can assure you that were the discussion taking place in the context of an explanation proposed by a social scientist, writers like Martin would be falling all over themselves trying to be the first to explain the danger of conflating correlation with causation, whether the scientist actually made that mistake or not.

            But arts scholarship isn’t limited to this type of socio-historical loose association because at some point you simply can’t avoid bringing individual artists, characters, and behind-the-scenes players into the discussion. Even when it comes to a specific person or character’s motivation, though, it’s important to focus on upbringing in a given family and sociopolitical climate as opposed to any general trend in human psychology. This willful blindness becomes most problematic when Martin tries to identify commonalities shared by all the leading men in the shows he’s discussing. He writes, for example,

All of them strove, awkwardly at times, for connection, occasionally finding it in glimpses and fragments, but as often getting blocked by their own vanities, their fears, and their accumulated past crimes. (189-90)

This is the closest Martin comes to a valid insight into difficult men in the entire book. The problem is that the rule against recognizing trends in human nature has made him blind to the applicability of this observation to pretty much everyone in the world. You could use this passage as a cold read and convince people you’re a psychic.

            So far, our summation of contemporary arts scholarship includes a rule against referring to human nature and an injunction to focus instead on sociopolitical factors, no matter how implausible their putative influence. But the allowance for social forces playing a role in upbringing provides something of a backdoor for a certain understanding of human nature to enter the discussion. Although the academic versions of this minimalist psychology are byzantine to the point of incomprehensibility, most of the main precepts will be familiar to you from movie and book reviews and criticism: parents, whom we both love and hate, affect nearly every aspect of our adult personalities; every category of desire, interest, or relationship is a manifestation of the sex drive; and we all have subconscious desires—all sexual in one way or another—based largely on forgotten family dramas that we enjoy seeing played out and given expression in art. That’s it. 

            So, if we’re discussing Breaking Bad for instance, a critic might refer to Walt and Jesse’s relationship as either oedipal, meaning they’re playing the roles of father and son who love but want to kill each other, or homoerotic, meaning their partnership substitutes for the homosexual relationship they’d both really prefer. The special attention the show gives to the blue meth and all the machines and gadgets used to make it constitutes a fetish. And the appeal of the show is that all of us in the audience wish we could do everything Walt does. Since we must repress those desires, we come to the show because watching it effects a type of release.

            Not a single element of this theory has any scientific validity. If we were such horny devils, we could just as easily watch internet pornography as tune into Mad Men. Psychoanalysis is to modern scientific psychology what alchemy is to chemistry and what astrology is to astronomy. But the biggest weakness of Freud’s pseudo-theories from a scientific perspective is probably what has made them so attractive to scholars in the humanities over the past century: they don’t lend themselves to testable predictions, so they can easily be applied to a variety of outcomes. As explanations, they can never fail or be definitively refuted—but that’s because they don’t really explain anything. Quoting Craig Wright, a writer for Six Feet Under, Martin writes that

…the left always articulates a critique through the arts.  “But the funny part is that masked by, or nested within, that critique is a kind of helpless eroticization of the power of the Right. They’re still in love with Big Daddy, even though they hate him.”
That was certainly true for the women who made Tony Soprano an unlikely sex symbol—and for the men who found him no less seductive. Wish fulfillment has always been at the queasy heart of the mobster genre, the longing for a life outside the bounds of convention, mingled with the conflicted desire to see the perpetrator punished for the same transgression… Likewise for viewers, for whom a life of taking, killing, and sleeping with whomever and whatever one wants had an undeniable, if conflict-laden, appeal. (88)

So Tony reminds us of W. because they’re both powerful figures, and we’re interested in powerful figures because they remind us of our dads and because we eroticize power. Even if this were true, would it contribute anything to our understanding or enjoyment of the show? Are any of these characters really that much like your own dad? Tony smashes some poor guy’s head because he got in his way, and sometimes we wish we could do that. Don Draper sleeps with lots of attractive women, and all the men watching the show would like to do that too. Startling revelations, those.

What a scholar in search of substantive insights might focus on instead is the universality of the struggle to reconcile selfish desires—sex, status, money, comfort—with the needs and well-being of the groups to which we belong. Don Draper wants to sleep around, but he also genuinely wants Betty and their children to be happy. Tony Soprano wants to be feared and respected, but he doesn’t want his daughter to think he’s a murderous thug. Walter White wants to prove he can provide for his family, but he also wants Skyler and Walter Junior to be safe. These tradeoffs and dilemmas—not the difficult men themselves—are what most distinguish these shows from conventional TV dramas. In most movies and shows, the protagonist may have some selfish desires that compete with his or her more altruistic or communal instincts, but which side ultimately wins out is a foregone conclusion. “Heroes are much better suited for the movies,” Martin quotes Alan Ball saying. “I’m more interested in real people. And real people are fucked up” (106).

Ball is the showrunner behind the HBO series Six Feet Under and True Blood, and though Martin gives him quite a bit of space in Difficult Men he doesn’t seem to notice that Ball’s “feminine style” (102) of showrunning undermines the theory that domineering characters are direct reflections of their domineering creators. The handful of interesting observations about what makes for a good series in Martin’s book is pretty evenly divvied up between Ball and David Simon, the creator of The Wire. Recalling his response to the episode of The Sopranos in which Tony strangles a rat while visiting a college campus with Meadow, Ball says,

I felt like I was watching a movie from the seventies. Where it was like, “You know those cartoon ideas of good and evil? Well, forget them. We’re going to address something that’s really real.” The performances were electric. The writing was spectacular. But it was the moral complexity, the complexity of the characters and their dilemmas, that made it incredibly exciting. (94-5)

Alan Ball with the actors playing his bad boy creations
The connection between us and the characters isn’t just that we have some of the same impulses and desires; it’s that we have to do similar balancing acts as we face similar dilemmas. No, we don’t have to figure out how to whack a guy without our daughters finding out, but a lot of us probably do want to shield our kids from some of the ugliness of our jobs. And most of us have to weigh career advancement against family obligations in one way or another. What makes for compelling drama isn’t our rooting for a character who knows what’s right and does it—that’s not drama at all. What pulls us into these shows is the process the characters go through of deciding which of their competing desires or obligations they should act on. If we see them do the wrong thing once in a while, well, that just ups the ante for the scenes when doing the right thing really counts.

            On the one hand, parents and sponsors want a show that has a good message, a guy with the right ideas and virtuous motives confronted with people with bad ideas and villainous motives. The good guy wins and the lesson is conveyed to the comfortable audiences. On the other hand, writers, for the most part, want to dispense with this idea of lessons and focus on characters with murderous, adulterous, or self-aggrandizing impulses, allowing for the possibility that they’ll sometimes succumb to them. But sometimes writers face the dilemma of having something they really want to say with their stories. Martin describes David Simon’s struggle to square this circle.

 As late as 2012, he would complain in a New York Times interview that fans were still talking about their favorite characters rather than concentrating on the show’s political message… The real miracle of The Wire is that, with only a few late exceptions, it overcame the proud pedantry of its creators to become one of the greatest literary accomplishments of the early twenty-first century. (135)

But then it’s Simon himself whom Martin quotes to explain how having a message to convey can get in the way of a good story.

Everybody, if they’re trying to say something, if they have a point to make, they can be a little dangerous if they’re left alone. Somebody has to be standing behind them saying, dramatically, “Can we do it this way?” When the guy is making the argument about what he’s trying to say, you need somebody else saying, “Yeah, but…” (207)

The exploration of this tension makes up the most substantive and compelling section of Difficult Men.

            Unfortunately, Martin fails to contribute anything to this discussion of drama and dilemmas beyond these short passages and quotes. And at several points he forgets his own observation about drama not being reducible to any underlying message. The most disappointing part of Difficult Men is the chapter devoted to Vince Gilligan and his show Breaking Bad. Gilligan is another counterexample to the theory that domineering and volatile men in the writer’s seat account for domineering and volatile characters in the shows; the writing room he runs gives the chapter its name, “The Happiest Room in Hollywood.” Martin writes that Breaking Bad is “arguably the best show on TV, in many ways the culmination of everything the Third Golden Age had made possible” (264). In trying to explain why the show is so good, he claims that

…whereas the antiheroes of those earlier series were at least arguably the victims of their circumstances—family, society, addiction, and so on—Walter White was insistently, unambiguously, an agent with free will. His journey became a grotesque magnification of the American ethos of self-actualization, Oprah Winfrey’s exhortation that all must find and “live your best life.” What if, Breaking Bad asked, one’s best life happened to be as a ruthless drug lord? (268)

This is Martin making the very mistake he warns against earlier in the book by finding some fundamental message at the core of the show. (Though he could simply believe that even though it’s a bad idea for writers to try to convey messages it’s okay for critics to read them into the shows.) But he’s doing the best he can with the tools of scholarship he’s allowed to marshal. This assessment is an extension of his point about post-feminist dislocation, turning the entire series into a slap in the face to Oprah, that great fount of male angst.

            To point out that Martin is perfectly wrong about Walter White isn’t merely to offer a rival interpretation. Until the end of season four, as any reasonable viewer who’s paid a modicum of attention to the development of his character will attest, Walter is far more at the mercy of circumstances than any of the other antiheroes in the Third Golden Age lineup. Here’s Walter explaining why he doesn’t want to undergo an expensive experimental cancer treatment in season one:

What I want—what I need—is a choice. Sometimes I feel like I never actually make any of my own. Choices, I mean. My entire life, it just seems I never, you know, had a real say about any of it. With this last one—cancer—all I have left is how I choose to approach this.

He’s already secretly cooking meth at this point to make money for his family, but that’s much more a matter of making the most of a bad situation than of being the captain of his own fate. Can you imagine Tony or Don saying anything like this? Even when Walt delivers his famous “I am the danger” speech in season four—which gets my vote for the best moment in TV history (or film history too for that matter)—the statement is purely aspirational; he’s still in all kinds of danger at that point. Did Martin neglect the first four seasons and pick up watching only after Walt finally killed Gus? Either way, it’s a big, embarrassing mistake.

           The dilemmas Walt faces are what make his story so compelling. He’s far more powerless than other bad boy characters at the start of the series, and he’s also far more altruistic in his motives. That’s precisely why it’s so disturbing—and riveting—to see those motives corrupted by his gradually accumulating power. It’s hard not to think of the cartel drug lords we always hear about in Mexico according to those “cartoon ideas of good and evil” Alan Ball was so delighted to see smashed by Tony Soprano. But Breaking Bad goes a long way toward bridging the divide between such villains and a type of life we have no trouble imagining. The show isn’t about free will or self-actualization at all; it’s about how even the nicest guy can be turned into one of the scariest villains by being placed in a not all that far-fetched set of circumstances. In much the same way, Martin, clearly a smart guy and a talented writer, can be made to look like a bit of an idiot by being forced to rely on a bunch of really bad ideas as he explores the inner workings of some really great shows.

            If men’s selfish desires—sex, status, money, freedom—aren’t any more powerful than women’s, their approaches to satisfying them still tend to be more direct, less subtle. But what makes it harder for a woman’s struggles with her own desires to take on the same urgency as a man’s is probably not that far removed from the reasons women are seldom as physically imposing as men. Volatility in a large man can be really frightening. Even today, men are more likely to have high-status careers like Don’s, but they’re also far more likely to end up in prison. These are pretty high stakes. And Don’s actions have ramifications not just for his own family’s well-being but for that of everyone at Sterling Cooper and their families, which is a consequence of that high status. So status works as a proxy for size. Carmela Soprano’s volatility could be frightening too, but she isn’t the time bomb Tony is. Speaking of bombs, Skyler White is an expert at bullying men, but going head-to-head with Walter she’s way overmatched. Men will always be scarier than women on average, so their struggles to rein in their scarier impulses will seem more urgent. Still, the Third Golden Age is a teenager now, and as anxious as I am to see what happens to Walter White and all his friends and family, I think the bad boy thing is getting a little stale. Anyone seen Damages?

Oh yeah--can't forget: The Adaptive Appeal of Bad Boys

Sabbath Says: Philip Roth and the Dilemmas of Ideological Castration

            Sabbath’s Theater is the type of book you lose friends over. Mickey Sabbath, the adulterous title character who follows in the long literary line of defiantly self-destructive, excruciatingly vulnerable, and off-puttingly but eloquently lustful leading males like Holden Caulfield and Humbert Humbert, strains the moral bounds of fiction and compels us to contemplate the nature of our own voyeuristic impulse to see him through to the end of the story—and not only contemplate it but defend it, as if in admitting we enjoy the book, find its irreverences amusing, and think that in spite of how repulsive he often is there still might be something to be said for poor old Sabbath, we’re confessing to no minor offense of our own. Fans and admiring critics alike can’t resist rushing to qualify their acclaim by insisting they don’t condone his cheating on both of his wives, his seduction of a handful of his students, his habit of casually violating others’ privacy, his theft, his betrayal of his lone friend, his manipulations, his racism, his caustic, often cruelly precise provocations—but by the time they get to the end of Sabbath’s debt column it’s a near certainty any list of mitigating considerations will fall short of getting him out of the red. Sabbath, once a puppeteer who now suffers crippling arthritis, doesn’t seem like a very sympathetic character, and yet we sympathize with him nonetheless. In his wanton disregard for his own reputation and his embrace, principled in a way, of his own appetites, intuitions, and human nastiness, he inspires a fascination none of the literary nice guys can compete with. So much for the argument that the novel is a morally edifying art form.

            Thus, in Sabbath, Philip Roth has created a character both convincing and compelling who challenges a fundamental—we may even say natural—assumption about readers’ (or viewers’) role in relation to fictional protagonists, one made by everyone from the snarky authors of even the least sophisticated Amazon.com reviews to the theoreticians behind the most highfalutin academic criticism—the assumption that characters in fiction serve as vehicles for some message the author created them to convey, or which some chimerical mechanism within the “dominant culture” created to serve as agents of its own proliferation. The corollary is that the task of audience members is to try to decipher what the author is trying to say with the work, or what element of the culture is striving to perpetuate itself through it. If you happen to like the message the story conveys, or agree with it at some level, then you recommend the book and thus endorse the statement. Only rarely does a reviewer realize or acknowledge that the purpose of fiction is not simply to encourage readers to behave as the protagonists behave or, if the tale is a cautionary one, to expect the same undesirable consequences should they choose to behave similarly. Sabbath does in fact suffer quite a bit over the course of the novel, and much of that suffering comes as a result of his multifarious offenses, so a case can be made on behalf of Roth’s morality. Still, we must wonder if he really needed to write a story in which the cheating husband is abandoned by both of his wives to make the message sink in that adultery is wrong—especially since Sabbath doesn’t come anywhere near to learning that lesson himself. “All the great thoughts he had not reached,” Sabbath muses in the final pages, “were beyond enumeration; there was no bottom to what he did not have to say about the meaning of his life” (779).

           Part of the reason we can’t help falling back on the notions that fiction serves a straightforward didactic purpose and that characters should be taken as models, positive or negative, for moral behavior is that our moral emotions are invariably and automatically engaged by stories; indeed, what we usually mean when we say we got into a story is that we were in suspense as we anticipated whether the characters ultimately met with the fates we felt they deserved. We reflexively size up any character the author introduces the same way we assess the character of a person we’re meeting for the first time in real life. For many readers, the question of whether a novel is any good is interchangeable with the question of whether they liked the main characters, assuming they fare reasonably well in the culmination of the plot. If an author like Roth evinces an attitude drastically different from ours toward a character of his own creation like Sabbath, then we feel that in failing to condemn him, in holding him up as a model, the author is just as culpable as his character. In a recent edition of PBS’s American Masters devoted to Roth, for example, Jonathan Franzen, a novelist himself, describes how even he couldn’t resist responding to his great forebear’s work in just this way. “As a young writer,” Franzen recalls, “I had this kind of moralistic response of ‘Oh, you bad person, Philip Roth’” (54:56).

Jonathan Franzen
            That fiction’s charge is to strengthen our preset convictions through a process of narrative tempering, thus catering to our desire for an orderly calculus of just deserts, serves as the basis for a contract between storytellers and audiences, a kind of promise on which most commercial fiction delivers with a bang. And how many of us have wanted to throw a book out of the window when we felt that promise had been broken? The goal of professional and academic critics, we may imagine, might be to ease their charges into an appreciation of more complex narrative scenarios enacted by characters who escape easy categorization. But since scholarship in the humanities, and in literary criticism especially, has been in a century-long sulk over the greater success of science and the greater renown of scientists, professors of literature have scarcely even begun to ponder what anything resembling a valid answer to the questions of how fiction works and what the best strategies for experiencing it might look like. Those who aren’t pouting in a corner about the ascendancy of science—but the Holocaust!—are stuck in the muck of the century-old pseudoscience of psychoanalysis. But the real travesty is that the most popular, politically inspired schools of literary criticism—feminism, Marxism, postcolonialism—actively preach the need to ignore, neglect, and deny the very existence of moral complexity in literature, violently displacing any appreciation of difficult dilemmas with crudely tribal formulations of good and evil.

            For those inculcated with a need to take a political stance with regard to fiction, the only important dynamics in stories involve the interplay of society’s privileged oppressors and their marginalized victims. In 1976, nearly twenty years before the publication of Sabbath’s Theater, the feminist critic Vivian Gornick lumped Roth together with Saul Bellow and Norman Mailer in an essay asking “Why Do These Men Hate Women?” because she took issue with the way women are portrayed in their novels. Gornick, following the methods standard to academic criticism, doesn’t bother devoting any space in her essay to inconvenient questions about how much we can glean about these authors from their fictional works or what it means that the case for her prosecution rests by necessity on a highly selective approach to quoting from those works. And this slapdash approach to scholarship is supposedly justified because she and her fellow feminist critics believe women are in desperate need of protection from the incalculable harm they assume must follow from such allegedly negative portrayals. In this concern for how women, or minorities, or some other victims are portrayed and how they’re treated by their notional oppressors—rich white guys—Gornick and other critics who make of literature a battleground for their political activism are making the same assumption about fiction’s straightforward didacticism as the most unschooled consumers of commercial pulp. The only difference is that the academics believe the message received by audiences is all that’s important, not the message intended by the author. The basis of this belief probably boils down to its obvious convenience.

Gornick
            In Sabbath’s Theater, the idea that literature, or art of any kind, is reducible to so many simple messages, and that these messages must be measured against political agendas, is dashed in the most spectacularly gratifying fashion. Unfortunately, the idea is so seldom scrutinized, and the political agendas are insisted on so implacably, clung to and broadcast with such indignant and prosecutorial zeal, that it seems not one of the critics, nor any of the authors, who were seduced by Sabbath was able to fully reckon with the implications of that seduction. Franzen, for instance, in a New Yorker article about fictional anti-heroes, dodges the issue as he puzzles over the phenomenon that “Mickey Sabbath may be a disgustingly self-involved old goat,” but he’s somehow still sympathetic. The explanation Franzen lights on is that

the alchemical agent by which fiction transmutes my secret envy or my ordinary dislike of “bad” people into sympathy is desire. Apparently, all a novelist has to do is give a character a powerful desire (to rise socially, to get away with murder) and I, as a reader, become helpless not to make that desire my own. (63)

If Franzen is right—and this chestnut is a staple of fiction workshops—then the political activists are justified in their urgency. For if we’re powerless to resist adopting the protagonist’s desires as our own, however fleetingly, then any impulse to victimize women or minorities must invade readers’ psyches at some level, conscious or otherwise. The simple fact, however, is that Sabbath has not one powerful desire but many competing desires, ones that shift as the novel progresses, and it’s seldom clear even to Sabbath himself what those desires are. (And is he really as self-involved as Franzen suggests? It seems to me rather that he compulsively tries to get into other people’s heads, reflexively imagining elaborate stories for them.)

            While we undeniably respond to virtuous characters in fiction by feeling anxiety on their behalf as we read about or watch them undergo the ordeals of the plot, and we just as undeniably enjoy seeing virtue rewarded and cruelty punished—the goodies prevailing over the baddies—these natural responses do not necessarily imply that stories compel our interest and engage our emotions by providing us with models and messages of virtue. Stories aren’t sermons. In his interview for American Masters, Roth explained what a writer’s role is vis-à-vis social issues.

My job isn’t to be enraged. My job is what Chekhov said the job of an artist was, which is the proper presentation of the problem. The obligation of the writer is not to provide the solution to a problem. That’s the obligation of a legislator, a leader, a crusader, a revolutionary, a warrior, and so on. That’s not the goal or aim of a writer. You’re not selling it, and you’re not inviting condemnation. You’re inviting understanding. (59:41)

Chekhov
The crucial but overlooked distinction that characters like Sabbath—but none so well as Sabbath—bring into stark relief is the one between declarative knowledge on the one hand and moment-by-moment experience on the other. Consider for a moment how many books and movies we’ve all been thoroughly engrossed in for however long it took to read or watch them, only to discover a month or so later that we can’t remember even the broadest strokes of how their plots resolved themselves—much less what their morals might have been.

            The answer to the question of what the author is trying to say is that he or she is trying to give readers a sense of what it would be like to go through what the characters are going through—or what it would be like to go through it with them. In other words, authors are not trying to say anything; they’re offering us an experience, once-removed and simulated though it may be. This isn’t to say that these simulated experiences don’t engage our moral emotions; indeed, we’re usually only as engaged in a story as our moral emotions are engaged by it. The problem is that in real time, in real life, political ideologies, psychoanalytic theories, and rigid ethical principles are too often the farthest thing from helpful. “Fuck the laudable ideologies,” Sabbath helpfully insists: “Shallow, shallow, shallow!” Living in a complicated society with other living, breathing, sick, cruel, saintly, conniving, venal, altruistic, deceitful, noble, horny humans demands not so much a knowledge of the rules as a finely honed body of skills—and our need to develop and hone these skills is precisely why we evolved to find the simulated experiences of fictional narratives both irresistibly fascinating and endlessly pleasurable. Franzen was right that desires are important: the desire to be a good person, the desire to do things others may condemn, the desire to get along with our families and friends and coworkers, the desire to tell them all to fuck off so we can be free, even if just for an hour, to breathe… or to fuck an intern, as the case may be. Grand principles offer little guidance when it comes to balancing these competing desires. This is because, as Sabbath explains, “The law of living: fluctuation. For every thought a counterthought, for every urge a counterurge” (518).

            Fiction then is not a conveyance for coded messages—how tedious that would be (how tedious it really is when writers make this mistake); it is rather a simulated experience of moral dilemmas arising from scenarios which pit desire against desire, conviction against reality, desire against conviction, reality against desire, in any and all permutations. Because these experiences are once-removed and, after all, merely fictional, and because they require our sustained attention, the dilemmas tend to play out in the vicinity of life’s extremes. Here’s how Sabbath’s Theater opens:

                        Either forswear fucking others or the affair is over.
            This was the ultimatum, the maddeningly improbable, wholly unforeseen ultimatum, that the mistress of fifty-two delivered in tears to her lover of sixty-four on the anniversary of an attachment that had persisted with an amazing licentiousness—and that, no less amazingly, had stayed their secret—for thirteen years. But now with hormonal infusions ebbing, with the prostate enlarging, with probably no more than another few years of semi-dependable potency still his—with perhaps not that much more life remaining—here at the approach of the end of everything, he was being charged, on pain of losing her, to turn himself inside out. (373)

The ethical proposition that normally applies in situations like this is that adultery is wrong, so don’t commit adultery. But these two have been committing adultery with each other for thirteen years already—do we just stop reading? And if we keep reading, maybe nodding once in a while as we proceed, cracking a few wicked grins along the way, does that mean we too must be guilty?
                               *****
Updike
            Much of the fiction written by male literary figures of the past generation, guys like Roth, Mailer, Bellow, and Updike, focuses on the morally charged dilemmas instanced by infidelity, while their Gen X and millennial successors, led by guys like Franzen and David Foster Wallace, have responded to shifting mores—and a greater exposure to academic literary theorizing—by completely overhauling how these dilemmas are framed. Whereas the older generation framed the question as how we can balance the intense physical and spiritual—even existential—gratification of sexual adventure on the one hand with our family obligations on the other, for their successors the question has become how we males can curb our disgusting, immoral, intrinsically oppressive lusting after young women inequitably blessed with time-stamped and overwhelmingly alluring physical attributes. “The younger writers are so self-conscious,” Katie Roiphe writes in a 2009 New York Times essay, “so steeped in a certain kind of liberal education, that their characters can’t condone even their own sexual impulses; they are, in short, too cool for sex.” Roiphe’s essay, “The Naked and the Confused,” stands alongside a 2012 essay in The New York Review of Books by Elaine Blair, “Great American Losers,” as the best descriptions of the new literary trend toward sexually repressed and pathetically timid male leads. The typical character in this vein, Blair writes, “is the opposite of entitled: he approaches women cringingly, bracing for a slap.”

Katie Roiphe
            The writers in the new hipster cohort create characters who bury their longings layers-deep in irony because they’ve been assured the failure on the part of men of previous generations to properly check these same impulses played some unspecified role in the abysmal standing of women in society. College students can’t make it past their first semester without hearing about the evils of so-called objectification, but it’s nearly impossible to get a straight answer from anyone, anywhere, to the question of how objectification can be distinguished from normal, non-oppressive male attraction and arousal. Even Roiphe, in her essay lamenting the demise of male sexual virility in literature, relies on a definition of male oppression so broad that it encompasses even the most innocuous space-filling lines in the books of even the most pathetically diffident authors, writing that “the sexism in the work of the heirs apparent” of writers like Roth and Updike,

is simply wilier and shrewder and harder to smoke out. What comes to mind is Franzen’s description of one of his female characters in “The Corrections”: “Denise at 32 was still beautiful.” To the esteemed ladies of the movement I would suggest this is not how our great male novelists would write in the feminist utopia.

How, we may ask, did it get to the point where acknowledging that age influences how attractive a woman is qualifies a man for designation as a sexist? Blair, in her otherwise remarkably trenchant essay, lays the blame for our oversensitivity—though paranoia is probably a better word—at the feet of none other than those great male novelists themselves, or, as David Foster Wallace calls them, the Great Male Narcissists. She writes,

Because of the GMNs, these two tendencies—heroic virility and sexist condescension—have lingered in our minds as somehow yoked together, and the succeeding generations of American male novelists have to some degree accepted the dyad as truth. Behind their skittishness is a fearful suspicion that if a man gets what he wants, sexually speaking, he is probably exploiting someone.

That Roth et al. were sexist, condescending, disgusting, narcissistic—these are articles of faith for feminist critics. Yet when we consider how expansive the definitions of terms like sexism and misogyny have become—in practical terms, they both translate to: not as radically feminist as me—and the laughably low standard of evidence required to convince scholars of the accusations, female empowerment starts to look like little more than a reserved right to stand in self-righteous judgment of men for giving voice to and acting on desires anyone but the most hardened ideologue will agree are only natural.

             The effect on writers of this ever-looming threat of condemnation is that they either allow themselves to be silenced or they opt to participate in the most undignified of spectacles, peevishly sniping at their colleagues, falling all over themselves to be granted recognition as champions for the cause. Franzen, at least early in his career, was more the silenced type. Discussing Roth, he wistfully endeavors to give the appearance of having moved beyond his initial moralistic responses. “Eventually,” he says, “I came to feel as if that was coming out of an envy: like, wow, I wish I could be as liberated of worry about other people’s opinion of me as Roth is” (55:18). We have to wonder if his espousal of the reductive theory that sympathy for fictional characters is based solely on the strength of their desires derives from this same longing for freedom to express his own. David Foster Wallace, on the other hand, wasn’t quite as enlightened or forgiving when it came to his predecessors. Here’s how he explains his distaste for a character in one of Updike’s novels, openly intimating the author’s complicity:

D.F. Wallace
It’s that he persists in the bizarre adolescent idea that getting to have sex with whomever one wants whenever one wants is a cure for ontological despair. And so, it appears, does Mr. Updike—he makes it plain that he views the narrator’s impotence as catastrophic, as the ultimate symbol of death itself, and he clearly wants us to mourn it as much as Turnbull does. I’m not especially offended by this attitude; I mostly just don’t get it. Erect or flaccid, Ben Turnbull’s unhappiness is obvious right from the book’s first page. But it never once occurs to him that the reason he’s so unhappy is that he’s an asshole.

So the character is an asshole because he wants to have sex outside of marriage, and he’s unhappy because he’s an asshole, and it all traces back to the idea that having sex with whomever one wants is a source of happiness? Sounds like quite the dilemma—and one that pronouncing the main player an asshole does nothing to solve. This passage is the conclusion to a review in which Wallace tries to square his admiration for Updike’s writing with his desire to please a cohort of women readers infuriated by the way Updike writes about—portrays—women (which raises the question of why they’d read so many of his books). The troubling implication of his compromise is that if Wallace were himself to freely express his sexual feelings, he’d be open to the charge of sexism too—he’d be an asshole. Better to insist he simply doesn’t “get” why indulging his sexual desires might alleviate his “ontological despair.” What would Mickey Sabbath make of the fact that Wallace hanged himself when he was only forty-six, eleven years after publishing that review? (This isn’t just a nasty rhetorical point; Sabbath has a fascination with artists who commit suicide.)

The inadequacy of moral codes and dehumanizing ideologies when it comes to guiding real humans through life’s dilemmas, along with their corrosive effects on art, is the abiding theme of Sabbath’s Theater. One of the pivotal moments in Sabbath’s life is when a twenty-year-old student he’s in the process of seducing leaves a tape recorder out to be discovered in a ladies’ room at the university. The student, Kathy Goolsbee, has recorded a phone sex session between her and Sabbath, and when the tape finds its way into the hands of the dean, it becomes grounds for the formation of a committee of activists against the abuse of women. At first, Kathy doesn’t realize how bad things are about to get for Sabbath. She even offers to give him a blow job as he berates her for her carelessness. Trying to impress on her the situation’s seriousness, he says,

Your people have on tape my voice giving reality to all the worst things they want the world to know about men. They have a hundred times more proof of my criminality than could be required by even the most lenient of deans to drive me out of every decent antiphallic educational institution in America. (586)

The committee against Sabbath proceeds to make the full recorded conversation available through a call-in line (the nineties equivalent of posting the podcast online). But the conversation itself isn’t enough; one of the activists gives a long introduction, which concludes,

The listener will quickly recognize how by this point in his psychological assault on an inexperienced young woman, Professor Sabbath has been able to manipulate her into thinking that she is a willing participant. (567-8)

Sabbath knows full well that even consensual phone sex can be construed as a crime if doing so furthers the agenda of those “esteemed ladies of the movement” Roiphe addresses. 

Reading through the lens of a tribal ideology ineluctably leads to the refraction of reality beyond recognizability, and any aspiring male writer quickly learns in all his courses in literary theory that the criteria for designation as an enemy to the cause of women are pretty much whatever the feminist critics fucking say they are. Wallace wasn’t alone in acquiescing to feminist rage by denying his own boorish instincts. Roiphe describes the havoc this opportunistic antipathy toward male sexuality wreaks in the minds of male writers and their literary creations:

Rather than an interest in conquest or consummation, there is an obsessive fascination with trepidation, and with a convoluted, postfeminist second-guessing. Compare [Benjamin] Kunkel’s tentative and guilt-ridden masturbation scene in “Indecision” with Roth’s famous onanistic exuberance with apple cores, liver and candy wrappers in “Portnoy’s Complaint.” Kunkel: “Feeling extremely uncouth, I put my penis away. I might have thrown it away if I could.” Roth also writes about guilt, of course, but a guilt overridden and swept away, joyously subsumed in the sheer energy of taboo smashing: “How insane whipping out my joint like that! Imagine what would have been had I been caught red-handed! Imagine if I had gone ahead.” In other words, one rarely gets the sense in Roth that he would throw away his penis if he could.

And what good comes of an ideology that encourages the psychological torture of bookish young men? It’s hard to distinguish the effects of these so-called literary theories from the hellfire scoldings delivered from the pulpits of the most draconian and anti-humanist religious patriarchs. Do we really need to ideologically castrate all our male scholars to protect women from abuse and further the cause of equality?
*****
The experience of sexual relations between older teacher and younger student in Sabbath’s Theater is described much differently when the gender activists have yet to get involved—and not just by Sabbath but by Kathy as well. “I’m of age!” she protests as he chastises her for endangering his job and opening him up to public scorn; “I do what I want” (586). Absent the committee against him, Sabbath’s impression of how his affairs with his students impact them reflects the nuance of feeling inspired by these experimental entanglements, the kind of nuance that the “laudable ideologies” can’t even begin to capture.

There was a kind of art in his providing an illicit adventure not with a boy of their own age but with someone three times their age—the very repugnance that his aging body inspired in them had to make their adventure with him feel a little like a crime and thereby give free play to their budding perversity and to the confused exhilaration that comes of flirting with disgrace. Yes, despite everything, he had the artistry still to open up to them the lurid interstices of life, often for the first time since they’d given their debut “b.j.” in junior high. As Kathy told him in that language which they all used and which made him want to cut their heads off, through coming to know him she felt “empowered.” (566)

Opening up “the lurid interstices of life” is precisely what Roth and the other great male writers—all great writers—are about. If there are easy answers to the questions of what characters should do, or if the plot entails no more than a simple conflict between a blandly good character and a blandly bad one, then the story, however virtuous its message, will go unattended.

            But might there be too much at stake for us impressionable readers to be allowed free rein to play around in imaginary spheres peopled by morally dubious specters? After all, if denouncing the dreamworlds of privileged white men, however unfairly, redounds to the benefit of women and children and minorities, then perhaps it’s to the greater good. In fact, though, right alongside the trends of increasing availability of increasingly graphic media portrayals of sex and violence have occurred marked decreases in actual violence and the abuse of women. And does anyone really believe it’s the least literate, least media-saturated societies that are the kindest to women? The simple fact is that the theory of literature subtly encouraging oppression can’t be valid. But the problem is that once ideologies are institutionalized, once a threshold number of people depend on their perpetuation for their livelihoods, people whose scholarly work and reputations are staked on them, then victims of oppression will be found, their existence insisted on, regardless of whether they truly exist.

In another scandal Sabbath was embroiled in long before his flirtation with Kathy Goolsbee, he was brought up on charges of indecency because in the course of a street performance he’d exposed a woman’s nipple. The woman herself, Helen Trumbull, maintains from the outset of the imbroglio that whatever Sabbath had done, he’d done it with her consent—just as will be the case with his “psychological assault” on Kathy. But even as Sabbath sits assured that the case against him will collapse once the jury hears the supposed victim testify on his behalf, the prosecution takes a bizarre twist:
  
In fact, the victim, if there even is one, is coming this way, but the prosecutor says no, the victim is the public. The poor public, getting the shaft from this fucking drifter, this artist. If this guy can walk along a street, he says, and do this, then little kids think it’s permissible to do this, and if little kids think it’s permissible to do this, then they think it’s permissible to blah blah banks, rape women, use knives. If seven-year-old kids—the seven nonexistent kids are now seven seven-year-old kids—are going to see that this is fun and permissible with strange women… (663-4)

Here we have Roth’s dramatization of the fundamental conflict between artists and moralists. Even if no one is directly hurt by playful scenarios, that they carry a message, one that threatens to corrupt susceptible minds, is so seemingly obvious it’s all but impossible to refute. Since the audience for art is “the public,” the acts of depravity and degradation it depicts are, if anything, even more fraught with moral and political peril than any offense against an individual victim, real or imagined.  

            This theme of the oppressive nature of ideologies devised to combat oppression, the victimizing proclivity of movements originally fomented to protect and empower victims, is most directly articulated by a young man named Donald, dressed in all black and sitting atop a file cabinet in a nurses’ station when Sabbath happens across him at a rehab clinic. Donald “vaguely resembled the Sabbath of some thirty years ago,” and Sabbath will go on to apologize for interrupting him, referring to him as “a man whose aversions I wholeheartedly endorse.” Here is what he was saying before the interruption:

“Ideological idiots!” proclaimed the young man in black. “The third great ideological failure of the twentieth century. The same stuff. Fascism. Communism. Feminism. All designed to turn one group of people against another group of people. The good Aryans against the bad others who oppress them. The good poor against the bad rich who oppress them. The good women against the bad men who oppress them. The holder of ideology is pure and good and clean and the other wicked. But do you know who is wicked? Whoever imagines himself to be pure is wicked! I am pure, you are wicked… There is no human purity! It does not exist! It cannot exist!” he said, kicking the file cabinet for emphasis. “It must not and should not exist! Because it’s a lie. … Ideological tyranny. It’s the disease of the century. The ideology institutionalizes the pathology. In twenty years there will be a new ideology. People against dogs. The dogs are to blame for our lives as people. Then after dogs there will be what? Who will be to blame for corrupting our purity?” (620-1)

It’s noteworthy that this rant is made by a character other than Sabbath. By this point in the novel, we know Sabbath wouldn’t speak so artlessly—unless he was really frightened or angry. As effective and entertaining an indictment of “Ideological tyranny” as Sabbath’s Theater is, we shouldn’t expect to encounter anywhere in a novel by a storyteller as masterful as Roth a character operating as a mere mouthpiece for some argument. Even Donald himself, Sabbath quickly gleans, isn’t simply spouting off; he’s trying to impress one of the nurses.

            And it’s not just the political ideologies that conscript complicated human beings into simple roles as oppressors and victims. The pseudoscientific psychological theories that both inform literary scholarship and guide many non-scholars through life crises and relationship difficulties function according to the same fundamental dynamic of tribalism; they simply substitute abusive family members for more generalized societal oppressors, and distorted or fabricated crimes committed in the victim’s childhood for broader social injustices. Sabbath is forced to contend with this particular brand of depersonalizing ideology because his second wife, Roseanna, picks it up through her AA meetings, and then becomes further enmeshed in it through individual treatment with a therapist named Barbara. Sabbath, who considers himself a failure, and who is carrying on an affair with the woman we meet in the opening lines of the novel, is baffled as to why Roseanna would stay with him. Her therapist provides an answer of sorts.

But then her problem with Sabbath, the “enslavement,” stemmed, according to Barbara, from her disastrous history with an emotionally irresponsible mother and a violent alcoholic father for both of whom Sabbath was the sadistic doppelganger. (454)

Roseanna’s father was a geology professor who hanged himself when she was a young teenager. Sabbath is a former puppeteer with crippling arthritis. Naturally, he’s confused by the purported identity of roles.

These connections—between the mother, the father, and him—were far clearer to Barbara than they were to Sabbath; if there was, as she liked to put it, a “pattern” in it all, the pattern eluded him.

In the midst of a shouting match, Sabbath tells his wife, “As for the ‘pattern’ governing a life, tell Barbara it’s commonly called chaos” (455). When she protests, “You are shouting at me like my father,” Sabbath asserts his individuality: “The fuck that’s who I’m shouting at you like! I’m shouting at you like myself!” (459). Whether you see his resistance as heroic or not probably depends on how much credence you give to those psychological theories.

            From the opening lines of Sabbath’s Theater, when we’re presented with the dilemma of the teary-eyed mistress demanding monogamy in their adulterous relationship, the simple response would be to stand in easy judgment of Sabbath and, as Wallace did with Updike’s character, declare him an asshole. It’s clear that he loves this woman, a Croatian immigrant named Drenka, a character who at points steals the show even from the larger-than-life protagonist. And it’s clear his fidelity would mean a lot to her. Is his freedom to fuck other women really so important? Isn’t he just being selfish? But only a few pages later our easy judgment suddenly gets more complicated:

As it happened, since picking up Christa several years back Sabbath had not really been the adventurous libertine Drenka claimed she could no longer endure, and consequently she already had the monogamous man she wanted, even if she didn’t know it. To women other than her, Sabbath was by now quite unalluring, not just because he was absurdly bearded and obstinately peculiar and overweight and aging in every obvious way but because, in the aftermath of the scandal four years earlier with Kathy Goolsbee, he’s become more dedicated than ever to marshaling the antipathy of just about everyone as though he were, in fact, battling for his rights. (394)

Christa was a young woman who participated in a threesome with Sabbath and Drenka, an encounter to which Sabbath’s only tangible contribution was to hand the younger woman a dildo.

            One of the central dilemmas for a character who loves the thrill of sex, who seeks in it a rekindling of youthful vigor—“the word’s rejuvenation,” Sabbath muses at one point (517)—is that the adrenaline boost born of being in the wrong and the threat of getting caught, what Roiphe calls “the sheer energy of taboo smashing,” becomes ever more indispensable as libido wanes with age. Even before Sabbath ever had to contend with the ravages of aging, he reveled in this added exhilaration that attends any expedition into forbidden realms. What makes Drenka so perfect for him is that she has not just a similarly voracious appetite but a similar fondness for outrageous sex and the smashing of taboo. And it’s this mutual celebration of the verboten that Sabbath is so reluctant to relinquish. Of Drenka, he thinks,

The secret realm of thrills and concealment, this was the poetry of her existence. Her crudeness was the most distinguishing force in her life, lent her life its distinction. What was she otherwise? What was he otherwise? She was his last link with another world, she and her great taste for the impermissible. As a teacher of estrangement from the ordinary, he had never trained a more gifted pupil; instead of being joined by the contractual they were interconnected by the instinctual and together could eroticize anything (except their spouses). Each of their marriages cried out for a countermarriage in which the adulterers attack their feelings of captivity. (395)

Those feelings of captivity, the yearnings to experience the flow of the old juices, are anything but adolescent, as Wallace suggests of them; adolescents have a few decades before they have to worry about dwindling arousal. Most of them have the opposite problem.

            The question of how readers are supposed to feel about a character like Sabbath doesn’t have any simple answers. He’s an asshole at several points in the novel, but at several points he’s not. One of the reasons he’s so compelling is that working out what our response to him should be poses a moral dilemma of its own. Whether or not we ultimately decide that adultery is always and everywhere wrong, the experience of being privy to Sabbath’s perspective can help us prepare ourselves for our own feelings of captivity, lusting nostalgia, and sexual temptation. Most of us will never find ourselves in a dilemma like Sabbath gets himself tangled in with his friend Norman’s wife, for instance, but it would be to our detriment to automatically discount the old hornball’s insights.

He could discern in her, whenever her husband spoke, the desire to be just a little cruel to Norman, saw her sneering at the best of him, at the very best things in him. If you don’t go crazy because of your husband’s vices, you go crazy because of his virtues. He’s on Prozac because he can’t win. Everything is leaving her except for her behind, which her wardrobe informs her is broadening by the season—and except for this steadfast prince of a man marked by reasonableness and ethical obligation the way others are marked by insanity or illness. Sabbath understood her state of mind, her state of life, her state of suffering: dusk is descending, and sex, our greatest luxury, is racing away at a tremendous speed, everything is racing off at a tremendous speed and you wonder at your folly in having ever turned down a single squalid fuck. You’d give your right arm for one if you are a babe like this. It’s not unlike the Great Depression, not unlike going broke overnight after years of raking it in. “Nothing unforeseen that happens,” the hot flashes inform her, “is likely ever again going to be good.” Hot flashes mockingly mimicking the sexual ecstasies. Dipped, she is, in the very fire of fleeting time. (651)

Welcome to messy, chaotic, complicated life.

            Sabbath’s Theater is, in part, Philip Roth’s raised middle finger to the academic moralists whose idiotic and dehumanizing ideologies have spread like a cancer into all the venues where literature is discussed and all the avenues through which it’s produced. Unfortunately, the unrecognized need for culture-wide chemotherapy hasn’t gotten any less dire in the nearly two decades since the novel was published. With literature now drowning in the devouring tide of new media, the tragic course set by the academic custodians of art toward bloodless prudery and impotent sterility in the name of misguided political activism promises to do nothing but ensure the ever greater obsolescence of epistemologically doomed and resoundingly pointless theorizing. College courses thus become the places where you go to become, at best, profoundly confused about where you stand in relation to fiction and fictional characters and, at worst, a self-righteous demagogue denouncing the chimerical evils allegedly encoded into every text or cultural artifact. All the conspiracy theorizing about the latent evil urgings of literature has amounted to little more than another reason not to read, another reason to tune in to Breaking Bad or Mad Men instead. But the only reason Roth’s novel makes such a successful case is that it at no point allows itself to be reducible to a mere case, just as Sabbath at no point allows himself to be conscripted as a mere argument. We don’t love or hate him; we love and hate him. But we sort of just love him because he leaves us free to do both as we experience his antics, once removed and simulated, but still just as complicatedly eloquent in their message of “Fuck the laudable ideologies”—or not, as the case may be.


Also read Let's Play Kill Your Brother: Fiction as a Moral Dilemma Game.

And Stories, Social Proof, and Our Two Selves.

And Can't Win for Losing: Why There Are So Many Losers in Literature and Why It Has to Change.

Freud: The Falsified Cipher

Warhol's Freud
[As I'm hard at work on a story, I thought I'd post an essay from my first course as a graduate student on literary criticism. It was in the fall of 2009, and I was shocked and appalled that not only were Freud's ideas still being taught but there was no awareness whatsoever that psychology had moved beyond them. This is my attempt at righting the record while keeping my tone in check.]

The matter of epistemology in literary criticism is closely tied to the question of what end the discipline is supposed to serve. How critics decide what standard of truth to adhere to is determined by the role they see their work playing, both in academia and beyond. Freud stands apart as a literary theorist, professing in his works a commitment to scientific rigor in a field that generally holds that even belief in the possibility of objectivity is at best naïve and at worst bourgeois or fascist. For the postmodernists, both science and literature are suspiciously shot through with the ideological underpinnings of capitalist European male hegemony, which they take as their duty to undermine. Their standard of truth, therefore, seems to be whether a theory or application effectively exposes one or another element of that ideology to “interrogation.” Admirable as the values underlying this patently political reading of texts are, the science-minded critic might worry lest such an approach merely lead straight back to the a priori assumptions from which it set forth. Now, a century after Freud introduced the theory and practice of psychoanalysis, his attempt to interpret literature scientifically seems like one possible route of escape from the circularity (and obscurantism) of postmodernism. Unfortunately, Freud’s theories have suffered multiple devastating empirical failures, and Freud himself has been shown to be less a committed scientist than an ingenious fabulist, but it may be possible to salvage from the failures of psychoanalysis some key to a viable epistemology of criticism.

            A text dating from early in the development of psychoanalysis shows both the nature of Freud’s methods and some of the most important substance of his supposed discoveries. Describing his theory of the Oedipus complex in The Interpretation of Dreams, Freud refers vaguely to “observations on normal children,” to which he compares his experiences with “psychoneurotics” to arrive at his idea that both display, to varying degrees, “feelings of love and hatred to their parents” (920). There is little to object to in this rather mundane observation, but Freud feels compelled to write that his

discovery is confirmed by a legend…a legend whose profound
and universal power to move can only be understood if the
hypothesis I have put forward in regard to the psychology of
children has an equally universal validity. (920)

He proceeds to relate the Sophocles drama from which his theory gets its name. In the story, Oedipus is tricked by fate into killing his father and marrying his mother. Freud takes this as evidence that the love and hatred he has observed in children are of a particular kind. According to his theory, any male child is fated to “direct his first sexual impulse towards his mother” and his “first murderous wish against his father” (921). But Freud originally poses this idea as purely hypothetical. What settles the issue is evidence he gleans from dream interpretations. “Our dreams,” he writes, “convince us that this is so” (921). Many men, it seems, confided to him that they dreamt of having sex with their mothers and killing their fathers.

            Freud’s method, then, was to seek a thematic confluence between men’s dreams, the stories they find moving, and the behaviors they display as children, which he knew mostly through self-reporting years after the fact. Indeed, the entire edifice of psychoanalysis is purported to have been erected on this epistemic foundation. In a later essay on “The Uncanny,” Freud makes the sources of his ideas even more explicit. “We know from psychoanalytic experience,” he writes, “that the fear of damaging or losing one’s eyes is a terrible one in children” (35). A few lines down, he claims that, “A study of dreams, phantasies and myths has taught us that anxiety about one’s eyes…is a substitute for the dread of being castrated” (36). Here he’s referring to another facet of the Oedipus complex which theorizes that the child keeps his sexual desire for his mother in check because of the threat of castration posed by his jealous father. It is through this fear of his father, which transforms into grudging respect, and then into emulation, that the boy learns his role as a male in society. And it is through the act of repressing his sexual desire for his mother that he first develops his unconscious, which will grow into a general repository of unwanted desires and memories (Eagleton 134).

            But what led Freud to this theory of repression, which suggests that we have the ability to willfully forget troubling incidents and urges, banishing them to some portion of our minds to which we have no conscious access? He must have arrived at an understanding of this process in the same stroke that led to his conclusions about the Oedipus complex, because, in order to put forth the idea that as children we all hated one parent and wanted to have sex with the other, he had to contend with the fact that most people find the idea repulsive. What accounts for the dramatic shift between childhood desires and those of adults? What accounts for our failure to remember the earlier stage? The concept of repression had to be firmly established before Freud could make such claims. Of course, he could have simply imported the idea from another scientific field, but there is no evidence he did so. So it seems that he relied on the same methods—psychoanalysis, dream interpretation, and the study of myths and legends—to arrive at his theories as he did to test them. Inspiration and confirmation were one and the same.

            Notwithstanding Freud’s claim that the emotional power of the Oedipus legend “can only be understood” if his hypothesis about young boys wanting to have sex with their mothers and kill their fathers has “universal validity,” there is at least one alternative hypothesis which has the advantage of not being bizarre. It could be that the point of Sophocles’s drama was that fate is so powerful it can bring about exactly the eventualities we most desire to avoid. What moves audiences and readers is not any sense of recognition of repressed desires, but rather compassion for the man who despite, even because of, his heroic efforts fell into this most horrible of traps. (Should we assume that the enduring popularity of W.W. Jacobs’s story, “The Monkey’s Paw,” which tells a similarly fated story about a couple who inadvertently wish their son dead, proves that all parents want to kill their children?) The story could be moving because it deals with events we would never want to happen. It is true, however, that this hypothesis fails to account for why people enjoy watching such a tragedy being enacted—but then so does Freud’s. If we have spent our conscious lives burying the memory of our childhood desires because they are so unpleasant to contemplate, it makes little sense that we should find pleasure in seeing those desires acted out on stage. And assuming this alternative hypothesis is at least as plausible as Freud’s, we are left with no evidence whatsoever to support his theory of repressed childhood desires.

            To be fair, Freud did look beyond the dreams and myths of men of European descent to test the applicability of his theories. In his book Totem and Taboo he inventories “savage” cultures and adduces the universality among them of a taboo against incest as further proof of the Oedipus complex. He even goes so far as to cite a rival theory put forth by a contemporary:

            Westermarck has explained the horror of incest on the
            ground that “there is an innate aversion to sexual
            intercourse between persons living very closely together
            from early youth, and that, as such persons are in most cases
            related by blood, this feeling would naturally display itself
            in custom and law as a horror of intercourse between near
            kin.” (152)

To dismiss Westermarck’s theory, Freud cites J. G. Frazer, who argues that laws exist only to prevent us from doing things we would otherwise do or prod us into doing what we otherwise would not. That there is a taboo against incest must therefore signal that there is no innate aversion to incest but rather a proclivity for it. Here it must be noted that the incest Freud had in mind includes not just lust for the mother but for sisters as well. “Psychoanalysis has taught us,” he writes, again vaguely referencing his clinical method, “that a boy’s earliest choice of objects for his love is incestuous and that those objects are forbidden ones—his mother and sister” (22). Frazer’s argument is compelling, but Freud’s test of the applicability of his theories is not the same as a test of their validity (though it seems customary in literary criticism to conflate the two).

Edvard Westermarck
            As linguist and cognitive neuroscientist Steven Pinker explains in How the Mind Works, in tests of validity Westermarck beats Freud hands down. Citing the research of Arthur Wolf, he explains that, without setting out to do so, several cultures have conducted natural experiments on the nature of incest aversion. Israeli kibbutzim, in which children grew up in close proximity to several unrelated agemates, and the Chinese and Taiwanese practice of adopting future brides for sons and raising them together as siblings are just two that Wolf examined. When children from the kibbutzim reached sexual maturity, even though there was no discouragement from adults for them to date or marry, they showed a marked distaste for each other as romantic partners. And compared to more traditional marriages, those in which the bride and groom grew up in conditions mimicking siblinghood were overwhelmingly “unhappy, unfaithful, unfecund, and short” (459). The effect of proximity in early childhood seems to apply to parents as well, at least when it comes to fathers’ sexual feelings for their daughters. Pinker cites research showing that the fathers who sexually abuse their daughters tend to be the ones who have spent the least time with them as infants, while the stepdads who actually do spend a lot of time with their stepdaughters are no more likely to abuse them than biological fathers are. These studies not only favor Westermarck’s theory; they also provide a counter to Frazer’s objection to it. Human societies are so complex that we often grow up in close proximity with people who are unrelated, or don’t grow up with people who are, and therefore it is necessary for there to be a cultural proscription—a taboo—against incest in addition to the natural mechanism of aversion.

Frederick Crews
            Among biologists and anthropologists, what is now called the Westermarck effect has displaced Freud’s Oedipus complex as the best explanation for incest avoidance. Since Freud’s theory of childhood sexual desires has been shown to be false, the question arises of where this leaves his concept of repression. According to literary critic—and critic of literary criticism—Frederick Crews, repression came to serve in the 1980s and ’90s a role equivalent to the “spectral evidence” used in the Salem witch trials. Several psychotherapists latched on to the idea that children can store away reliable memories of traumatic events, especially events too terrible for them to consciously handle, and recover them years later in therapy. And the testimony of these therapists has led to many convictions and prison sentences. But the evidence for this notion of repression is solely clinical—modern therapists base their conclusions on interactions with patients, just as Freud did. Unfortunately, researchers outside the clinical setting are unable to find any phenomenon answering to the description of repressed but retrievable memories. Crews points out that there are plenty of people who are known to have survived traumatic experiences: “Holocaust survivors make up the most famous class of such subjects, but whatever group or trauma is chosen, the upshot of well-conducted research is always the same” (158). That upshot:

            Unless a victim received a physical shock to the brain or
            was so starved or sleep deprived as to be thoroughly
            disoriented at the time, those experiences are typically
            better remembered than ordinary ones. (159, emphasis in
            original)

It seems here, as with incest aversion, Freud got the matter exactly wrong—and with devastating fallout for countless families and communities. But Freud was vague about whether it was memories of actual events that were repressed or merely fantasies. The crux of his argument was that we repress unacceptable and inappropriate drives and desires.

            And the concept of repressed desires is integral to the use of psychoanalysis in literary criticism. In The Interpretation of Dreams, Freud distinguishes between the manifest content of dreams and their latent content. Having been exiled from consciousness, troublesome desires press against the bounds of the ego, Freud’s notional agent in charge of tamping down uncivilized urges. In sleep, the ego relaxes, allowing the desires of the id, from whence all animal drives emerge, an opportunity for free play. Even in dreams, though, full transparency of the id would be too disconcerting for the conscious mind to accept, so the ego disguises all the elements which surface with a kind of code. Breaking this code is the work of psychoanalytic dream interpretation. It is also the basis for Freud’s analysis of myths and the underlying principle of Freudian literary criticism. (In fact, the distinction between manifest and latent content is fundamental to many schools of literary criticism, though they each have their own version of the true nature of the latent content.) Science writer Steven Johnson compares Freud’s conception of repressed impulses to compressed gas seeping through the cracks of the ego’s defenses, emerging as slips of the tongue or baroque dream imagery. “Build up enough pressure in the chamber, though, and the whole thing explodes—into uncontrolled hysteria, anxiety, madness” (191). The release of pressure, as it were, through dreams and through various artistic media, is sanity-saving.
Steven Johnson
            Johnson’s book, Mind Wide Open: Your Brain and the Neuroscience of Everyday Life, takes the popular currency of Freud’s ideas as a starting point for his exploration of modern science. The subtitle is a homage to Freud’s influential work The Psychopathology of Everyday Life. Perhaps because he is not a working scientist, Johnson is able to look past the shaky methodological foundations of psychoanalysis and examine how accurately its tenets map onto the modern findings of neuroscience. Though he sees areas of convergence, like the idea of psychic conflict and that of the unconscious in general, he has to admit in his conclusion that “the actual unconscious doesn’t quite look like the one Freud imagined” (194). Rather than a repository of repressed fantasies, the unconscious is more of a store of implicit, or procedural, knowledge. Johnson explains, “Another word for unconscious is ‘automated’—the things you do so well you don’t even notice doing them” (195). And what happens to all the pressurized psychic energy resulting from our repression of urges? “This is one of those places,” Johnson writes, “where Freud’s metaphoric scaffolding ended up misleading him” (198). Instead of a steam engine, neuroscientists view the brain as a kind of ecosystem, with modules competing for resources; if a module goes unused—its neurons failing to fire—the strength of its connections diminishes.

            What are the implications of this new conception of how the mind works for the interpretation of dreams and works of art? Without the concept of repressed desires, is it still possible to maintain a distinction between the manifest and latent content of mental productions? Johnson suggests that there are indeed meaningful connections that can be discovered in dreams and slips of the tongue. To explain them, he points again to the neuronal ecosystem, and to the theory that “Neurons that fire together wire together.” He writes:

            These connections are not your unconscious speaking in
            code. They’re much closer to free-associating. These
            revelations aren’t the work of some brilliant cryptographer
            trying to get a message to the frontlines without enemy
            detection. They’re more like echoes, reverberations. One
            neuronal group fires, and a host of others join in the
            chorus. (200-201)

Mind Wide Open represents Johnson’s attempt to be charitable to the century-old, and now popularly recognized, ideas of psychoanalysis. But in this description of the shortcomings of Freud’s understanding of the unconscious and how it reveals itself, he effectively discredits the epistemological underpinnings of any application of psychoanalysis to art. It’s not only the content of the unconscious that Freud got outrageously wrong, but the very nature of its operations. And if Freud could so confidently look into dreams and myths and legends and find in them material that simply wasn’t there, it is cause for us to marvel at the power of his preconceptions to distort his perceptions.
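The “neurons that fire together wire together” principle Johnson invokes can be sketched in a few lines of code. This is a toy illustration of the Hebbian idea, not a model of any real brain circuit: connection strengths among four hypothetical “neuronal groups” (the group sizes, rates, and constants are all made up for the example) grow when groups co-fire and passively decay otherwise, so frequently co-active groups end up tightly linked while unused connections wither away.

```python
import numpy as np

rng = np.random.default_rng(0)

n = 4                    # four toy "neuronal groups"
w = np.zeros((n, n))     # connection strengths between groups
eta, decay = 0.1, 0.02   # learning rate and passive decay (illustrative values)

for _ in range(200):
    x = np.zeros(n)
    x[0] = x[1] = 1.0    # groups 0 and 1 always fire together
    if rng.random() < 0.3:
        x[2] = 1.0       # group 2 joins the chorus only occasionally
    # Hebbian update: co-firing groups strengthen their mutual connections...
    w += eta * np.outer(x, x)
    # ...while every connection also weakens a little each step from disuse.
    w *= 1 - decay
    np.fill_diagonal(w, 0)

print(w[0, 1] > w[0, 2] > 0)  # more frequent co-firing, stronger link
print(w[0, 3])                # the永 silent group 3 never wires to anything
```

The point of the sketch is Johnson’s “echoes, reverberations” picture: association strength here is nothing but a record of co-activation frequency, with no cryptographer encoding hidden messages anywhere in the loop.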

            Ultimately, psychoanalysis failed to move from the realm of proto-science to that of methodologically well-founded science, relegated instead to the back channel of pseudoscience by the hubris of its founder. And yet, if Freud had relied on good science, his program of interpreting literature in terms of the basic themes of human nature, and even his willingness to let literature inform his understanding of those themes, might have matured into a critical repertoire free of the obscurantist excesses and reality-denying absurdities of postmodernism. (Anthropologist Clifford Geertz once answered a postmodernist critic of his work by acknowledging that perfect objectivity is indeed impossible, but then so is a perfectly germ-free operating room; that shouldn’t stop us from trying to be as objective and as sanitary as our best methods allow.)
 
            Critics could feasibly study the production of novels by not just one or a few authors, but a large enough sample—possibly extending across cultural divides—to analyze statistically. They could pose questions systematically to even larger samples of readers. And they could identify the themes in any poem or novel which demonstrate the essential (in the statistical sense) concerns of humanity that have been studied by behavioral scientists, themes like status-seeking, pair-bonding, jealousy, and even the overwhelming strength of the mother-infant bond. “The human race has produced only one successfully validated epistemology,” writes Frederick Crews (362). That epistemology encompasses a great variety of specific research practices, but they all hold as inviolable the common injunction “to make a sharp separation between hypothesis and evidence” (363). Despite his claims to scientific legitimacy, Freud failed to distinguish himself from other critical theorists: he relied too much on his own intuitive powers, a reliance that all but guarantees succumbing to the natural human tendency to discover in complex fields precisely what you’ve come to them seeking.


Also read Absurdities and Atrocities in Literary Criticism

The Mental Illness Zodiac: Why the DSM 5 Won't Be Anything But More Pseudoscience


            Thinking you can diagnose psychiatric disorders using checklists of symptoms means taking for granted a naïve model of the human mind and human behavior. How discouraging to those in emotional distress, or to those doubting their own sanity, that the guides they turn to for help and put their faith in to know what’s best for them embrace this model. The DSM has taken it for granted since its inception, and the latest version, the DSM 5, due out next year, despite all the impediments to practical usage it does away with, despite all the streamlining, and despite all the efforts to adhere to common sense, only perpetuates the mistake. That the diagnostic categories are necessarily ambiguous and can’t be tied to any objective criteria like biological markers has been much discussed, as have the corruptions of the mental health industry, including pharmaceutical companies’ reluctance to publish failed trials for their blockbuster drugs, and clinical researchers who make their livings treating the same disorders they lobby to have included in the list of official diagnoses. Indeed, there’s good evidence that prognoses for mental disorders have actually gotten worse over the past century. What’s not being discussed, however, is the propensity in humans to take on roles, to play parts, even tragic ones, even horrific ones, without being able to recognize they’re doing so.

            In his lighthearted, mildly satirical but severely important book on self-improvement 59 Seconds: Change Your Life in Under a Minute, psychologist Richard Wiseman describes an experiment he conducted for the British TV show The People Watchers. A group of students spending an evening in a bar with their friends was given access to an open bar and periodically put through a series of tests. The tests included memorizing a list of numbers, walking along a line on the floor, and catching a ruler dropped by experimenters as quickly as possible. Memory, balance, and reaction time—all areas in which performance predictably diminishes as we drink. The outcomes were well in keeping with expectations as the tests were repeated over the course of the evening. All the students did progressively worse the more they drank. And the effects of the alcohol were consistent throughout the entire group of students. It turns out, however, that only half of them were drinking alcohol.

At the start of the study, Wiseman had given half the participants a blue badge and the other half a red badge. The bartenders poured regular drinks for everyone with red badges, but for those with blue ones they made drinks which looked, smelled, and tasted like their alcoholic counterparts but were actually non-alcoholic. Now, were the students with the blue badges faking their drunkenness? They may have been hamming it up for the cameras, but that would be true of the ones who were actually drinking too. What they were doing instead was taking on the role—you might even say taking on the symptoms—of being drunk. As Wiseman explains,

Our participants believed that they were drunk, and so they thought and acted in a way that was consistent with their beliefs. Exactly the same type of effect has emerged in medical experiments when people exposed to fake poison ivy developed genuine rashes, those given caffeine-free coffee became more alert, and patients who underwent a fake knee operation reported reduced pain from their “healed” tendons. (204)

After being told they hadn’t actually consumed any alcohol, the students in the blue group “laughed, instantly sobered up, and left the bar in an orderly and amused fashion.” But not all the natural role-playing humans engage in is this innocuous and short-lived.

            In placebo studies like the one Wiseman conducted, participants are deceived. You could argue that actually drinking a convincing replica of alcohol or taking a realistic-looking pill is the important factor behind the effects. People who seek treatment for psychiatric disorders aren’t tricked in this way; so what would cause them to take on the role associated with, say, depression or bipolar disorder? But plenty of research shows that pills or potions aren’t necessary. We take on different roles in different settings and circumstances all the time. We act much differently at football games and rock concerts than we do at work or school. These shifts are deliberate, though, and we’re aware of them, at least to some degree, when they occur. But many cues are more subtle. It turns out that just being made aware of the symptoms of a disease can make you suspect that you have it. What’s called Medical Student Syndrome afflicts those studying both medical and psychiatric diagnoses. For the most part, you either have a biological disease or you don’t, so the belief that you have one is contingent on the heightened awareness that comes from studying the symptoms. But is there a significant difference between believing you’re depressed and having depression? The answer, according to checklist diagnosis, is no.

            In America, we all know the symptoms of depression because we’re bombarded with commercials, like the one that uses squiggly circle faces to explain that it’s caused by a deficit of the neurotransmitter serotonin—a theory that had already been ruled out by the time that commercial began to air. More insidious though are the portrayals of psychiatric disorders in movies, TV series, or talk shows—more insidious because they embed the role-playing instructions in compelling stories. These shows profess to be trying to raise awareness so more people will get help to end their suffering. They profess to be trying to remove the stigma so people can talk about their problems openly. They profess to be trying to help people cope. But, from a perspective of human behavior that acknowledges the centrality of role-playing to our nature, all these shows are actually doing is shilling for the mental health industry, and they are probably helping to cause much of the suffering they claim to be trying to assuage.

            Multiple Personality Disorder, or Dissociative Identity Disorder as it’s now called, was an exceedingly rare diagnosis until the late 1970s and early 1980s, when its incidence spiked drastically. Before the spike, there were only ever around a hundred cases. Between 1985 and 1995, there were around 40,000 new cases. What happened? There was a book and a miniseries called Sybil starring Sally Field that aired in 1977. Much of the real-life story on which Sybil was based has been cast into doubt through further investigation (or has been shown to be completely fabricated). But if you’re one to give credence to the validity of the DID diagnosis (and you shouldn’t), then we can look at another strange behavioral phenomenon whose incidence spiked after a certain movie hit the box office in the 1970s. Prior to the release of The Exorcist, the Catholic church had pretty much consigned the eponymous ritual to the dustbin of history. Lately, though, they’ve had to dust it off. The Skeptic’s Dictionary says of a TV series on the Sci-Fi channel devoted to the exorcism ritual, or rather to the staging of it,

The exorcists' only prop is a Bible, which is held in one hand while they talk down the devil in very dramatic episodes worthy of Jerry Springer or Jenny Jones. The “possessed” could have been mentally ill, actors, mentally ill actors, drug addicts, mentally ill drug addicts, or they may have been possessed, as the exorcists claimed. All the participants shown being exorcized seem to have seen the movie “The Exorcist” or one of the sequels. They all fell into the role of husky-voiced Satan speaking from the depths, who was featured in the film. The similarities in speech and behavior among the “possessed” has led some psychologists such as Nicholas Spanos to conclude that both “exorcist” and “possessed” are engaged in learned role-playing.

If people can somehow inadvertently fall into the role of having multiple personalities or being possessed by demons, it’s not hard to imagine them hearing about, say, bipolar, briefly worrying that they may have some of the symptoms, and then subsequently taking on the role, even the identity of someone battling bipolar disorder.

            Psychologist Dan McAdams theorizes that everyone creates his or her own “personal myth,” which serves to give life meaning and trajectory. The character we play in our own myth is what we recognize as our identity, what we think of when we try to answer the question “Who am I?” in all its profundity. But, as McAdams explains in The Stories We Live By: Personal Myths and the Making of the Self,

Stories are less about facts and more about meanings. In the subjective and embellished telling of the past, the past is constructed—history is made. History is judged to be true or false not solely with respect to its adherence to empirical fact. Rather, it is judged with respect to such narrative criteria as “believability” and “coherence.” There is a narrative truth in life that seems quite removed from logic, science, and empirical demonstration. It is the truth of a “good story.” (28-9)
Dan McAdams

The problem when it comes to diagnosing psychiatric disorders is that the checklist approach tries to use objective, scientific criteria, when the only answers they’ll ever get will be in terms of narrative criteria. But why, if people are prone to taking on roles, wouldn’t they take on something pleasant, like kings or princesses?

            Since our identities are made up of the stories we tell about ourselves—even to ourselves—it’s important that those stories be compelling. And if nothing ever goes wrong in the stories we tell, well, they’d be pretty boring. As Jonathan Gottschall writes in The Storytelling Animal: How Stories Make Us Human,

This need to see ourselves as the striving heroes of our own epics warps our sense of self. After all, it’s not easy to be a plausible protagonist. Fiction protagonists tend to be young, attractive, smart, and brave—all the things that most of us aren’t. Fiction protagonists usually live interesting lives that are marked by intense conflict and drama. We don’t. Average Americans work retail or cubicle jobs and spend their nights watching protagonists do interesting things on television. (171)

Listen to the way talk show hosts like Oprah discuss mental disorders, and count how many times in an episode she congratulates the afflicted guests for their bravery in keeping up the struggle. Sometimes, the word hero is even bandied about. Troublingly, the people who cast themselves as heroes spreading awareness, countering stigmas, and helping people cope even like to do really counterproductive things like publishing lists of celebrities who supposedly suffer from the disorder in question. Think you might have bipolar disorder? Kay Redfield Jamison thinks you're in good company. In her book Touched with Fire, she suggests everyone from rocker Kurt Cobain to fascist Mel Gibson is in that same boatful of heroes.

            The reason medical researchers insist a drug must not only be shown to make people feel better but must also be shown to work better than a placebo is that even a sham treatment will make people report feeling better between 60 and 90 percent of the time, depending on several well-documented factors. What psychiatrists fail to acknowledge is that the placebo dynamic can be turned on its head—you can give people illnesses, especially mental illnesses, merely by suggesting they have the symptoms—or even by increasing their awareness of and attention to those symptoms past a certain threshold. If you tell someone a fact about themselves, they'll usually believe it, especially if you claim a test or an official diagnostic manual allowed you to determine it. This is how frauds convince people they're psychics. An experiment you can do yourself: give horoscopes to a group of people and ask how true each reading rings. After most of them endorse their readings, reveal that you switched the labels and that everyone in fact read the wrong sign's description.

            Psychiatric diagnoses, to be considered at all valid, would need to be double-blind, just like drug trials: the patient shouldn't know the diagnosis being considered; the rater shouldn't know the diagnosis being considered; only a final scorer, who has no contact with the patient, should determine the diagnosis. The categories themselves are, however, equally problematic. To be properly established as valid, they need to have predictive power. Trials would have to be conducted in which subjects assigned to the prospective categories using double-blind protocols were monitored over long periods to see whether their behavior adheres to what's expected of the disorder. For instance, bipolar disorder is supposedly marked by cyclical mood swings. Where are the mood diary studies? (The last time I looked for them was six months ago, so if you know of any, please send a link.) Smartphones offer all kinds of possibilities for monitoring and recording behaviors. Why aren't they being used to do actual science on mental disorders?

            To research the role-playing dimension of mental illness, one (completely unethical) approach would be to design from scratch a really bizarre disorder, publicize its symptoms, maybe make a movie starring Mel Gibson, and monitor incidence rates. Let's call it Puppy Pregnancy Disorder. We all know dog saliva is chock-full of gametes, right? So, let's say the disorder is caused when a canine, in a state of sexual arousal of course, bites the victim, thus impregnating her—or even him. Let's say it affects men too. Wouldn't that be funny? The symptoms would be abdominal pain, and something just totally out there, like, say, small pieces of puppy feces showing up in your urine. Now, this might be too outlandish, don't you think? There's no way we could get anyone to believe this. Unfortunately, I didn't really make this up: there are real people in India who believe they have Puppy Pregnancy Disorder.

What's the Point of Difficult Reading?


          You sit reading the first dozen or so pages of some celebrated classic and gradually realize that having to sort out how the ends of the long sentences fit to their beginnings is taking just enough effort to distract you entirely from the setting or character you're supposed to be getting to know. After a handful of words you swear are made up and a few tangled metaphors you find yourself riddling over with nary a resolution, the dread sinks in. Is the whole book going to be like this? Is it going to be one of those deals where you get to what's clearly meant to be a crucial turning point in the plot but what for you is just another riddle without a solution, sending you paging back through the forest of verbiage in search of some key succession of paragraphs you spaced out while reading the first time through? Then you wonder if you're missing some other kind of key, like maybe the story's an allegory, a reference to some historical event like World War II or some revolution you once had to learn about but have since lost all recollection of. Maybe the insoluble similes are allusions to some other work you haven't read or can't recall. In any case, you're not getting anything out of this celebrated classic but frustration leading to the dual suspicion that you're too ignorant or stupid to enjoy great literature and that the whole "great literature" thing is just a conspiracy to trick us into feeling dumb so we'll defer to the pseudo-wisdom of Ivory Tower elites.

            If enough people of sufficient status get together and agree to extol a work of fiction, they can get almost everyone else to agree. The readers who get nothing out of it but frustration and boredom assume that since their professors or some critic in a fancy-pants magazine or the judges of some literary award committee think it's great they must simply be missing something. They dutifully continue reading it, parrot a few points from a review that sound clever, and afterward toe the line by agreeing that it is indeed a great work of literature, clearly, even if it doesn't speak to them personally. For instance, James Joyce's Ulysses, utterly nonsensical to anyone without at least a master's degree, tops the Modern Library's list of 100 best novels in the English language. Responding to the urging of his friends to write out an explanation of the novel, Joyce scoffed, boasting, "I've put in so many enigmas and puzzles that it will keep the professors busy for centuries arguing over what I meant, and that's the only way of ensuring one's immortality." He was right. To this day, professors continue to love him even as Ulysses and the even greater monstrosity Finnegans Wake do nothing but bore and befuddle everyone else—or else, more fittingly, sit inert or unchecked-out on the shelf, gathering well-deserved dust.

            Joyce’s later novels are not literature; they are lengthy collections of loosely connected literary puzzles. But at least his puzzles have actual solutions—or so I’m told. Ulysses represents the apotheosis of the tradition in literature called modernism. What came next, postmodernism, is even more disconnected from the universal human passion for narrative. Even professors aren’t sure what to do with it, so they simply throw their hands up, say it’s great, and explain that the source of its greatness is its very resistance to explanation. Jonathan Franzen, whose 2001 novel The Corrections represented a major departure from the postmodernism he began his career experimenting with, explained the following year in The New Yorker how he’d turned away from the tradition. He’d been reading the work of William Gaddis “as a kind of penance” (101) and not getting any meaning out of it. Of the final piece in the celebrated author’s oeuvre, Franzen writes,

The novel is an example of the particular corrosiveness of literary postmodernism. Gaddis began his career with a Modernist epic about the forgery of masterpieces. He ended it with a pomo romp that superficially resembles a masterpiece but punishes the reader who tries to stay with it and follow its logic. When the reader finally says, Hey, wait a minute, this is a mess, not a masterpiece, the book instantly morphs into a performance-art prop: its fraudulence is the whole point! And the reader is out twenty hours of good-faith effort. (111)

In other words, reading postmodern fiction means not only forgoing the rewards of narrative for the more taxing endeavor of solving multiple riddles in succession, but discovering that those riddles don't even have answers. What's the point of reading this crap? Exactly. Get it?

            You can dig deeper into the meaningless meanderings of pomos and discover there is in fact an ideology inspiring all the infuriating inanity. The super smart people who write and read this stuff point to the willing, eager complicity of the common reader in the propagation of all the lies that sustain our atrociously unjust society (but atrociously unjust compared to what?). Franzen refers to this as the Fallacy of the Stupid Reader,

wherein difficulty is a “strategy” to protect art from cooptation and the purpose of art is to “upset” or “compel” or “challenge” or “subvert” or “scar” the unsuspecting reader; as if the writer’s audience somehow consisted, again and again, of Charlie Browns running to kick Lucy’s football; as if it were a virtue in a novelist to be the kind of boor who propagandizes at friendly social gatherings. (109)

But if the author is worried about art becoming a commodity, does making the art shitty really amount to a solution? And if the goal is to make readers rethink something they take for granted, why not bring the matter up directly, or have a character wrestle with it, or have a character argue with another character about it? The sad fact is that these authors probably just suck, that, as Franzen suspects, "literary difficulty can operate as a smoke screen for an author who has nothing interesting, wise, or entertaining to say" (111).

            Not all difficulty in fiction is a smoke screen though. Not all the literary emperors are naked. Franzen writes that “there is no headache like the headache you get from working harder on deciphering a text than the author, by all appearances, has worked on assembling it.” But the essay, titled “Mr. Difficult,” begins with a reader complaint sent not to Gaddis but to Franzen himself. And the reader, a Mrs. M. from Maryland, really gives him the business:

Who is it that you are writing for? It surely could not be the average person who enjoys a good read… The elite of New York, the elite who are beautiful, thin, anorexic, neurotic, sophisticated, don’t smoke, have abortions tri-yearly, are antiseptic, live in penthouses, this superior species of humanity who read Harper’s and The New Yorker. (100)

In this first part of the essay, Franzen introduces a dilemma that sets up his explanation of why he turned away from postmodernism—he’s an adherent of the “Contract model” of literature, whereby the author agrees to share, on equal footing, an entertaining or in some other way gratifying experience, as opposed to the “Status model,” whereby the author demonstrates his or her genius and if you don’t get it, tough. But his coming to a supposed agreement with Mrs. M. about writers like Gaddis doesn’t really resolve Mrs. M.’s conflict with him. The Corrections, after all, the novel she was responding to, represents his turning away from the tradition Gaddis wrote in. (It must be said, though, that Freedom, Franzen’s next novel, is written in a still more accessible style.)

            The first thing we must do to respond properly to Mrs. M. is break down each of Franzen's models into two categories. The status model includes writers like Gaddis whose difficulty serves no purpose but to frustrate and alienate readers. But Franzen's own type specimen for this model is Flaubert, much of whose writing, though difficult at first, rewards any effort to re-read and further comprehend with a more profound connection. So it is for countless other writers, including the one behind number two on the Modern Library's ranking: Fitzgerald and The Great Gatsby. As for the contract model, Franzen admits,

Taken to its free-market extreme, Contract stipulates that if a product is disagreeable to you the fault must be the product’s. If you crack a tooth on a hard word in a novel, you sue the author. If your professor puts Dreiser on your reading list, you write a harsh student evaluation… You’re the consumer; you rule. (100)

Franzen, in declaring himself a "Contract kind of person," assumes that the free-market extreme can be dismissed for its extremity. But Mrs. M. would probably challenge him on that. For many, particularly right-leaning readers, the market not only can but should be relied on to determine which books are good and which ones belong in some tiny niche. When the Modern Library conducted a readers' poll to create a popular ranking to balance the one made by experts, the ballot was stuffed by Ayn Rand acolytes and Scientologists. Mrs. M. herself leaves little doubt as to her political sympathies. For her and her fellow travelers, things like literature departments, National Book Awards—like the one The Corrections won—Nobels and Pulitzers are all an evil form of intervention into the sacred workings of the divine free market, un-American, sacrilegious, communist. According to this line of thinking, authors aren't much different from whores—except of course literal whoring is condemned in the Bible (except when it isn't).

            A contract with readers who score high on the personality dimension of openness to new ideas and experiences (who tend to be liberal), those who have spent a lot of time in the past reading books like The Great Gatsby or Heart of Darkness or Lolita (the horror!), those who read enough to have developed finely honed comprehension skills—that contract is going to look quite a bit different from one with readers who attend Beck University, those for whom Atlas Shrugged is the height of literary excellence. At the same time, though, the cult of self-esteem is poisoning schools and homes with the idea that suggesting that a student or son or daughter is anything other than a budding genius is a form of abuse. Heaven forbid a young person feel judged or criticized while speaking or writing. And if an author makes you feel the least bit dumb or ignorant, well, it’s an outrage—heroes like Mrs. M. to the rescue.

            One of the problems with the cult of self-esteem is that anticipating criticism tends to make people more, not less, creative. And the link between low self-esteem and mental disorders is almost purely mythical. High self-esteem is correlated with school performance, but as far as researchers can tell it's the performance causing the esteem, not the other way around. More insidious, though, is the tendency to view anything that takes a great deal of education or intelligence to accomplish as an affront to everyone less educated or intelligent. Conservatives complain endlessly about class warfare and envy of the rich—the financially elite—but they have no qualms about decrying intellectual elites and condemning them for flaunting their superior literary achievements. They see the elitist mote in the eye of Nobel laureates without noticing the beam in their own.

         What's the point of difficult reading? Well, what's the point of running five or ten miles? What's the point of eating vegetables as opposed to ice cream or Doritos? Difficulty need not preclude enjoyment. And discipline in the present is often rewarded in the future. It very well may be that the complexity of the ideas you're capable of understanding is influenced by how many complex ideas you attempt to understand. No matter how vehemently true believers in the magic of markets insist otherwise, markets don't have minds. And though an individual's intelligence need not be fixed, a good way to ensure children never get any smarter than they already are is to make them feel fantastically wonderful about their mediocrity. We just have to hope that despite these ideological traps there are enough people out there determined to wrap their minds around complex situations depicted in complex narratives about complex people told in complex language, people who will in the process develop the types of minds and intelligence necessary to lead the rest of our lazy asses into a future that's livable and enjoyable. For every John Galt, Tony Robbins, and Scheherazade, we may need at least half a Proust. We are still, however, left with quite a dilemma. Some authors really are just assholes who write worthless tomes designed to trick you into wasting your time. But some books that seem impenetrable on the first attempt will reward your efforts to decipher them. How do we get the rewards without wasting our time?

Also read "Can't Win for Losing: Why There are so many Losers in Literature and Why It has to Change."

And: "Life's White Machine: James Wood and What doesn't Happen in Fiction."

And: Stories, Social Proof, & Our Two Selves

Madness and Bliss: Critical vs. Primitive Readings in A.S. Byatt's Possession: A Romance 2

Read part one.
            The critical responses to the challenges posed by Byatt in Possession fit neatly within the novel's satire. Louise Yelin, for instance, unselfconsciously divides the audience for the novel into "middlebrow readers" and "the culturally literate" (38), placing herself in the latter category. She overlooks Byatt's challenge to her methods of criticism and the ideologies underpinning them, for the most part, and suggests that several of the themes, like ventriloquism, actually support poststructuralist philosophy. Still, Yelin worries about the novel's "homophobic implications" (39). (A lesbian character, formerly straight, takes up with a man in the end, and Christabel LaMotte's female lover commits suicide after the dissolution of their relationship, but no one actually expresses any fear or hatred of homosexuals.) Yelin then takes it upon herself to "suggest directions that our work might take" while avoiding the "critical wilderness" Byatt identifies. She proposes a critical approach to a novel that "exposes its dependencies on the bourgeois, patriarchal, and colonial economies that underwrite" it (40). And since all fiction fails to give voice to one or another oppressed minority, it is the critic's responsibility to "expose the complicity of those effacements in the larger order that they simultaneously distort and reproduce" (41). This is not in fact a response to Byatt's undermining of critical theories; it is instead an uncritical reassertion of their importance.

            Yelin and several other critics respond to Possession as if Byatt had suggested that "culturally literate" readers should momentarily push to the back of their minds what they know about how literature is complicit in various forms of political oppression so they can get more enjoyment from their reading. This response is symptomatic of an astonishing inability to even imagine what the novel is really inviting literary scholars to imagine—that the theories implicating literature are flat-out wrong. Monica Flegel, for instance, writes that "What must be privileged and what must be sacrificed in order for Byatt's Edenic reading (and living) state to be achieved may give some indication of Byatt's own conventionalizing agenda, and the negative enchantment that her particular fairy tale offers" (414). Flegel goes on to show that she does in fact appreciate the satire on academic critics; she even sympathizes with the nostalgia for simpler times, before political readings became mandatory. But she ends her critical essay with another reassertion of the political, accusing Byatt of "replicating" through her old-fashioned novel "such negative qualities of the form as its misogyny and its omission of the lower class." Flegel is particularly appalled by Maud's treatment in the final scene, since, she claims, "stereotypical gender roles are reaffirmed" (428). "Maud is reduced in the end," Flegel alleges, "to being taken possession of by her lover…and assured that Roland will 'take care of her'" (429). This interpretation places Flegel in the company of the feminists in the novel who hiss at Maud for trying to please men, forcing her to bind her head.

            Flegel believes that her analysis proves Byatt is guilty of misogyny and mistreatment of the poor. “Byatt urges us to leave behind critical readings and embrace reading for enjoyment,” she warns her fellow critics, “but the narrative she offers shows just how much is at stake when we leave criticism behind” (429). Flegel quotes Yelin to the effect that Possession is “seductive,” and goes on to declaim that

it is naïve, and unethical, to see the kind of reading that Byatt offers as happy. To return to an Edenic state of reading, we must first believe that such a state truly existed and that it was always open to all readers of every class, gender, and race. Obviously, such a belief cannot be held, not because we have lost the ability to believe, but because such a space never really did exist. (430)

In her preening self-righteous zealotry, Flegel represents a current in modern criticism that’s only slightly more extreme than that represented by Byatt’s misguided but generally harmless scholars. The step from using dubious theories to decode alleged justifications for political oppression in literature to Flegel’s frightening brand of absurd condemnatory moralizing leveled at authors and readers alike is a short one.

            Another way critics have attempted to respond to Byatt's challenge is by denying that she is in fact making any such challenge. Christien Franken suggests that Byatt's problems with theories like poststructuralism stem from her dual identity as a critic and an author. In a lecture Byatt once gave titled "Identity and the Writer," which was later published as an essay, Franken finds what she believes is evidence of poststructuralist thinking, even though Byatt denies taking the theory seriously. Franken believes that in the essay, "the affinity between post-structuralism and her own thought on authorship is affirmed and then again denied" (18). Her explanation is that

the critic in A.S. Byatt begins her lecture “Identity and the Writer” with a recognition of her own intellectual affinity with post-structuralist theories which criticize the paramount importance of “the author.” The writer in Byatt feels threatened by the same post-structuralist criticism. (17)

Franken claims that this ambivalence runs throughout all of Byatt's fiction and criticism. But Ann Marie Adams disagrees, writing that "When Byatt does delve into poststructuralist theory in this essay, she does so only to articulate what 'threatens' and 'beleaguers' her as a writer, not to productively help her identify the true 'identity' of the writer" (349). In Adams's view, Byatt belongs in the humanist tradition of criticism going back to Matthew Arnold and the romantics. In her own response to Byatt, Adams manages to come closer than any of her fellow critics to being able to imagine that the ascendant literary theories are simply wrong. But her obvious admiration for Byatt doesn't prevent her from suggesting that "Yelin and Flegel are right to note that the conclusion of Possession, with its focus on closure and seeming transcendence of critical anxiety, affords a particularly 'seductive' and ideologically laden pleasure to academic readers" (120). And, while she seems to find some value in Arnoldian approaches, she fails to engage in any serious reassessment of the theories Byatt targets.

            Frederick Holmes, in his attempt to explain Byatt’s attitude toward history as evidenced by the novel, agrees with the critics who see in Possession clear signs of the author’s embrace of postmodernism in spite of the parody and explicit disavowals. “It is important to acknowledge,” he writes,

that the liberation provided by Roland’s imagination from the previously discussed sterility of his intellectual sophistication is never satisfactorily accounted for in rational terms. It is not clear how he overcomes the post-structuralist positions on language, authorship, and identity. His claim that some signifiers are concretely attached to signifieds is simply asserted, not argued for. (330)


While Holmes is probably mistaken in taking this absence of rational justification as a tacit endorsement of the abandoned theory, the observation is the nearest any of the critics comes to a rebuttal of Byatt’s challenge. What Holmes is forgetting, though, is that structuralist and poststructuralist theorists themselves, from Saussure through Derrida, have been insisting on the inadequacy of language to describe the real world, a radical idea that flies in the face of every human’s lived experience, without ever providing any rational, empirical, or even coherent support for the departure. The stark irony to which Holmes is completely oblivious is that he’s asking for a rational justification to abandon a theory that proclaims such justification impossible. The burden remains on poststructuralists to prove that people shouldn’t trust their own experiences with language. And it is precisely this disconnect with experience that Byatt shows to be problematic. Holmes, like the other critics, simply can’t imagine that critical theories have absolutely no validity, so he’s forced to read into the novel the same chimerical ambivalence Franken tries so desperately to prove.

Holmes writes:

Roland's dramatic alteration is validated by the very sort of emotional or existential experience that critical theory has conditioned him to dismiss as insubstantial. We might account for Roland's shift by positing, not a rejection of his earlier thinking, but a recognition that his psychological well-being depends on his living as if such powerful emotional experiences had an unquestioned reality. (330)

            Adams quotes from an interview in which Byatt discusses her inspiration for the characters in Possession, saying, “poor moderns are always asking themselves so many questions about whether their actions are real and whether what they say can be thought to be true […] that they become rather papery and are miserably aware of this” (111-2). Byatt believes this type of self-doubt is unnecessary. Indeed, Maud’s notion that “there isn’t a unitary ego” (290) and Roland’s thinking of the “idea of his ‘self’ as an illusion” (459)—not to mention Holmes’s conviction that emotional experiences are somehow unreal—are simple examples of the reductionist fallacy. While it is true that an individual’s consciousness and sense of self rest on a substrate of unconscious mental processes and mechanics that can be traced all the way down to the firing of neurons, to suggest this mechanical accounting somehow delegitimizes selfhood is akin to saying that water being made up of hydrogen and oxygen atoms means the feeling of wetness can only be an illusion.

       Just as silly are the ideas that romantic love is a “suspect ideological construct” (290), as Maud calls it, and that “the expectations of Romance control almost everyone in the Western world” (460), as Roland suggests. Anthropologist Helen Fisher writes in her book Anatomy of Love, “some Westerners have come to believe that romantic love is an invention of the troubadours… I find this preposterous. Romantic love is far more widespread” (49). After a long list of examples of love-strickenness from all over the world from west to east to everywhere in-between, Fisher concludes that it “must be a universal human trait” (50). Scientists have found empirical support as well for Roland’s discovery that words can in fact refer to real things. Psychologist Nicole Speer and her colleagues used fMRI to scan people’s brains as they read stories. The actions and descriptions on the page activated the same parts of the brain as witnessing or perceiving their counterparts in reality. The researchers report, “Different brain regions track different aspects of a story, such as a character’s physical location or current goals. Some of these regions mirror those involved when people perform, imagine, or observe similar real-world activities” (989).

       Critics like Flegel insist on joyless reading because happy endings necessarily overlook the injustices of the world. But this is like saying anyone who savors a meal is complicit in world hunger (or for that matter anyone who enjoys reading about a character savoring a meal). If feminist poststructuralists were right about how language functions as a vehicle for oppressive ideologies, then the most literate societies would be the most oppressive, instead of the other way around. Jacques Lacan is the theorist Byatt has the most fun with in Possession—and he is also the main target of the book Fashionable Nonsense: Postmodern Intellectuals' Abuse of Science by the scientists Alan Sokal and Jean Bricmont. "According to his disciples," they write, Lacan "revolutionized the theory and practice of psychoanalysis; according to his critics, he is a charlatan and his writings are pure verbiage" (18). After assessing Lacan's use of concepts in topological mathematics, like the Möbius strip, which he sets up as analogies for various aspects of the human psyche, Sokal and Bricmont conclude that Lacan's ideas are complete nonsense. They write,

The most striking aspect of Lacan and his disciples is probably their attitude toward science, and the extreme privilege they accord to “theory”… at the expense of observations and experiments… Before launching into vast theoretical generalizations, it might be prudent to check the empirical adequacy of at least some of its propositions. But, in Lacan’s writings, one finds mainly quotations and analyses of texts and concepts. (37)

Sokal and Bricmont wonder if the abuses of theorists like Lacan “arise from conscious fraud, self-deception, or perhaps a combination of the two” (6). The question resonates with the poem Randolph Henry Ash wrote about his experience exposing a supposed spiritualist as a fraud, in which he has a mentor assure her protégée, a fledgling spiritualist with qualms about engaging in deception, “All Mages have been tricksters” (444).

There's even some evidence that Byatt is right about postmodern thinking making academics into "papery" people. In a 2006 lecture titled "The Inhumanity of the Humanities," William van Peer reports on research he conducted with a former student comparing the emotional intelligence of students in the humanities to students in the natural sciences. Although the psychologists Raymond Mar and Keith Oatley (407) have demonstrated that reading fiction increases empathy and emotional intelligence, van Peer found that humanities students had no advantage in emotional intelligence over students of natural science. In fact, there was a weak trend in the opposite direction—this despite the fact that the humanities students were reading more fiction. Van Peer attributes the deficit to common academic approaches to literature:

Consider the ills flowing from postmodern approaches, the “posthuman”: this usually involves the hegemony of “race/class/gender” in which literary texts are treated with suspicion. Here is a major source of that loss of emotional connection between student and literature. How can one expect a certain humanity to grow in students if they are continuously instructed to distrust authors and texts? (8)

Whether it derives from her early reading of Arnold and his successors, as Adams suggests, or simply from her own artistic and readerly sensibilities, Byatt has an intense desire to revive that very humanity so many academics sacrifice on the altar of postmodern theory. Critical theory urges students to assume that any discussion of humanity, or universal traits, or human nature can only be exclusionary, oppressive, and, in Flegel’s words, “naïve” and “unethical.” The cognitive linguist Steven Pinker devotes his book The Blank Slate: The Modern Denial of Human Nature to debunking the radical cultural and linguistic determinism that is the foundation of modern literary theory. In a section on the arts, Pinker credits Byatt for playing a leading role in what he characterizes as a “revolt”: 

Museum-goers have become bored with the umpteenth exhibit on the female body featuring dismembered torsos or hundreds of pounds of lard chewed up and spat out by the artist. Graduate students in the humanities are grumbling in emails and conference hallways about being locked out of the job market unless they write in gibberish while randomly dropping the names of authorities like Foucault and Butler. Maverick scholars are doffing the blinders that prevented them from looking at exciting developments in the sciences of human nature. And younger artists are wondering how the art world got itself into the bizarre place in which beauty is a dirty word. (Pinker 416)

Pinker’s characterization of modern art resonates with Roland Mitchell’s complaints about how psychoanalysis transforms perceptions of landscapes. And the idea that beauty has become a dirty word is underscored by the critical condemnations of Byatt’s “fairy tale ending.” Pinker goes on to quote Byatt’s response, in The New York Times Magazine, to the question of what the best story ever told was. Her answer—The Arabian Nights:

The stories in “The Thousand and One Nights”… are about storytelling without ever ceasing to be stories about love and life and death and money and food and other human necessities. Narration is as much a part of human nature as breath and the circulation of the blood. Modernist literature tried to do away with storytelling, which it thought vulgar, replacing it with flashbacks, epiphanies, streams of consciousness. But storytelling is intrinsic to biological time, which we cannot escape. Life, Pascal said, is like living in a prison from which every day fellow prisoners are taken away to be executed. We are all, like Scheherazade, under sentences of death, and we all think of our lives as narratives, with beginnings, middles, and ends. (quoted in Pinker 419)

            Byatt’s satire of papery scholars and her portrayals of her characters’ transcendence of nonsensical theories are but the simplest and most direct ways she celebrates the power of language to transport readers—and the power of stories to possess them. Though she incorporates an array of diverse genres, from letters to poems to diaries, and though some of the excerpts’ meanings subtly change in light of discoveries about their authors’ histories, all these disparate parts nonetheless “hook together,” collaborating in the telling of this magnificent tale. This cooperation would be impossible if the postmodern truism about the medium being the message were actually true. Meanwhile, the novel’s intimate engagement with the mythologies of wide-ranging cultures thoroughly undermines the paradigm according to which myths are deterministic “repeating patterns” imposed on individuals, showing instead that these stories simultaneously emerge from and lend meaning to our common human experiences. As the critical responses to Possession make abundantly clear, current literary theories are completely inadequate to the task of understanding Byatt’s work. While new theories may be better suited to it, we should at least make a good-faith effort to imagine the possibility that true appreciation of this and other works of literature will come only after we’ve done away with theory altogether. 

Intuition vs. Science: What's Wrong with Your Thinking, Fast and Slow

From Completely Useless to Moderately Useful
            In 1955, a twenty-one-year-old Daniel Kahneman was assigned the formidable task of creating an interview procedure to assess the fitness of recruits for the Israeli army. Kahneman’s only qualification was his bachelor’s degree in psychology, but the state of Israel had only been around for seven years at the time, so the Defense Forces were forced to satisfice. In the course of his undergraduate studies, Kahneman had discovered the writings of a psychoanalyst named Paul Meehl, whose essays he would go on to “almost memorize” as a graduate student. Meehl’s work gave Kahneman a clear sense of how he should go about developing his interview technique.


If you polled psychologists today to get their predictions for how successful a young lieutenant inspired by a book written by a psychoanalyst would be in designing a personality assessment protocol—assuming you left out the names—you would probably get some dire forecasts. But Paul Meehl wasn’t just any psychoanalyst, and Daniel Kahneman has gone on to become one of the most influential psychologists in the world. The book whose findings Kahneman applied to his interview procedure was Clinical vs. Statistical Prediction: A Theoretical Analysis and a Review of the Evidence, which Meehl lovingly referred to as “my disturbing little book.” Kahneman explains,


Meehl reviewed the results of 20 studies that had analyzed whether clinical predictions based on the subjective impressions of trained professionals were more accurate than statistical predictions made by combining a few scores or ratings according to a rule. In a typical study, trained counselors predicted the grades of freshmen at the end of the school year. The counselors interviewed each student for forty-five minutes. They also had access to high school grades, several aptitude tests, and a four-page personal statement. The statistical algorithm used only a fraction of this information: high school grades and one aptitude test. (222)



Daniel Kahneman
The findings for this prototypical study are consistent with those arrived at by researchers over the decades since Meehl released his book:


The number of studies reporting comparisons of clinical and statistical predictions has increased to roughly two hundred, but the score in the contest between algorithms and humans has not changed. About 60% of the studies have shown significantly better accuracy for the algorithms. The other comparisons scored a draw in accuracy, but a tie is tantamount to a win for the statistical rules, which are normally much less expensive to use than expert judgment. No exception has been convincingly documented. (223)       
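Meehl’s result is easy to reproduce in miniature. What follows is a toy simulation of my own—not data from Meehl’s studies, and all the numbers are invented for illustration. The point it captures is that a “clinician” who uses the right cues but weights them inconsistently from case to case will lose to a crude fixed-weight rule, because inconsistency is just added noise:

```python
import random

random.seed(0)

def corr(xs, ys):
    """Pearson correlation, computed by hand."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

grades   = [random.gauss(0, 1) for _ in range(5000)]
aptitude = [random.gauss(0, 1) for _ in range(5000)]
# True outcome: mostly grades and aptitude, plus factors nobody can see.
outcome = [0.6 * g + 0.3 * a + random.gauss(0, 0.8)
           for g, a in zip(grades, aptitude)]

# The "algorithm": the same crude weighted sum, every case.
algorithm = [0.6 * g + 0.3 * a for g, a in zip(grades, aptitude)]

# The "clinician": the same cues, but weighted inconsistently from
# case to case -- which only adds noise to the prediction.
clinician = [(0.6 + random.gauss(0, 0.4)) * g +
             (0.3 + random.gauss(0, 0.4)) * a
             for g, a in zip(grades, aptitude)]

a_r, c_r = corr(algorithm, outcome), corr(clinician, outcome)
print(f"algorithm r = {a_r:.2f}, clinician r = {c_r:.2f}")  # the fixed rule wins
```

The clinician here is actually flattered by the model: the inconsistency is random rather than systematically biased, and the rule still beats it.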


            Kahneman designed the interview process by coming up with six traits he thought would have direct bearing on a soldier’s success or failure, and he instructed the interviewers to assess the recruits on each dimension in sequence. His goal was to make the process as systematic as possible, thus reducing the role of intuition. The response of the recruitment team will come as no surprise to anyone: “The interviewers came close to mutiny” (231). They complained that their knowledge and experience were being given short shrift, that they were being turned into robots. Eventually, Kahneman was forced to compromise, creating a final dimension that was holistic and subjective. The scores on this additional scale, however, seemed to be highly influenced by scores on the previous scales.


When commanding officers evaluated the new recruits a few months later, the team compared the evaluations with their predictions based on Kahneman’s six scales. “As Meehl’s book had suggested,” he writes, “the new interview procedure was a substantial improvement over the old one… We had progressed from ‘completely useless’ to ‘moderately useful’” (231).   



Amos Tversky
            Kahneman recalls this story at about the midpoint of his magnificent, encyclopedic book Thinking, Fast and Slow. This is just one in a long series of run-ins with people who don’t understand or can’t accept the research findings he presents to them, and it is neatly woven into his discussions of those findings. Each topic and each chapter feature a short test that allows you to see where you fall in relation to the experimental subjects. The remaining thread in the tapestry is the one most readers familiar with Kahneman’s work most anxiously anticipated—his friendship with Amos Tversky, his collaborator on the research that won Kahneman the Nobel Prize in economics in 2002.


Most of the ideas that led to the experiments and theories that made the two famous, and that contributed to the founding of an entire new field, behavioral economics, were born of casual but thrilling conversations both men found intrinsically rewarding. Reading this book, as intimidating as it appears at a glance, you get glimmers of Kahneman’s wonder at the bizarre intricacies of his own and others’ minds, flashes of frustration at how obstinately or casually people avoid the implications of psychology and statistics, and intimations of the deep fondness and admiration he felt toward Tversky, who died in 1996 at the age of 59.


Pointless Punishments and Invisible Statistics


            When Kahneman begins a chapter by saying, “I had one of the most satisfying eureka experiences of my career while teaching flight instructors in the Israeli Air Force about the psychology of effective training” (175), it’s hard to avoid imagining how he might have relayed the incident to Amos years later. It’s also hard to avoid speculating about what the book might’ve looked like, or if it ever would have been written, if he were still alive. The eureka experience Kahneman had in this chapter came about, as many of them apparently did, when one of the instructors objected to his assertion, in this case that “rewards for improved performance work better than punishment of mistakes.” The instructor insisted that over the long course of his career he’d routinely witnessed pilots perform worse after praise and better after being screamed at. “So please,” the instructor said with evident contempt, “don’t tell us that reward works and punishment does not, because the opposite is the case.” Kahneman, characteristically charming and disarming, calls this “a joyous moment of insight” (175).


            The epiphany came from connecting a familiar statistical observation with the perceptions of an observer, in this case the flight instructor. The problem is that we all have a tendency to discount the role of chance in success or failure. Kahneman explains that the instructor’s observations were correct, but his interpretation couldn’t have been more wrong.



Francis Galton, who first described regression to the mean
What he observed is known as regression to the mean, which in that case was due to random fluctuations in the quality of performance. Naturally, he only praised a cadet whose performance was far better than average. But the cadet was probably just lucky on that particular attempt and therefore likely to deteriorate regardless of whether or not he was praised. Similarly, the instructor would shout into the cadet’s earphones only when the cadet’s performance was unusually bad and therefore likely to improve regardless of what the instructor did. The instructor had attached a causal interpretation to the inevitable fluctuations of a random process. (175-6)


The list of domains in which we fail to account for regression to the mean is disturbingly long. Even after you’ve learned about the phenomenon, it’s still difficult to recognize the situations it applies to. Kahneman quotes the statistician David Freedman to the effect that whenever regression becomes pertinent in a civil or criminal trial, the side that has to explain it will pretty much always lose the case. Not understanding regression, and not appreciating how it distorts our impressions, has implications for even the minutest details of our daily experiences. “Because we tend to be nice to other people when they please us,” Kahneman writes, “and nasty when they do not, we are statistically punished for being nice and rewarded for being nasty” (176). Probability is a bitch.
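The flight-instructor illusion is easy to see in a toy simulation of my own devising (invented numbers, not Kahneman’s data). Each landing is fixed skill plus independent luck, and feedback does nothing whatsoever in the model—yet landings that earn praise are followed by worse ones, and landings that earn shouting are followed by better ones:

```python
import random
import statistics

random.seed(42)

def landing(skill):
    # One landing: fixed skill plus pure luck.
    return skill + random.gauss(0, 1)

deltas_after_praise, deltas_after_shouting = [], []
for _ in range(10_000):
    skill = random.gauss(0, 1)
    first, second = landing(skill), landing(skill)
    if first > skill + 1:          # unusually good -> instructor praises
        deltas_after_praise.append(second - first)
    elif first < skill - 1:        # unusually bad -> instructor shouts
        deltas_after_shouting.append(second - first)

print(statistics.mean(deltas_after_praise))    # negative: "worse" after praise
print(statistics.mean(deltas_after_shouting))  # positive: "better" after shouting
```

The instructor’s observations come out exactly as he reported them, even though praise and punishment are causally inert here—which is Kahneman’s point.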


The Illusion of Skill in Stock-Picking


            Probability can be expensive too. Kahneman recalls being invited to give a lecture to advisers at an investment firm. To prepare for the lecture, he asked for some data on the advisers’ performances and was given a spreadsheet of investment outcomes over eight years. When he compared the numbers statistically, he found that none of the advisers was consistently more successful than the others. The correlation between the outcomes from year to year was nil. When he attended a dinner the night before the lecture “with some of the top executives of the firm, the people who decide on the size of bonuses,” he knew from experience how tough a time he was going to have convincing them that “at least when it came to building portfolios, the firm was rewarding luck as if it were a skill.” Still, he was amazed by the execs’ lack of shock:

We all went on calmly with our dinner, and I have no doubt that both our findings and their implications were quickly swept under the rug and that life in the firm went on just as before. The illusion of skill is not only an individual aberration; it is deeply ingrained in the culture of the industry. Facts that challenge such basic assumptions—and thereby threaten people’s livelihood and self-esteem—are simply not absorbed. (216)


The scene that follows echoes the first chapter of Carl Sagan’s classic paean to skepticism, The Demon-Haunted World, where Sagan recounts being bombarded with questions about science by a driver who was taking him from the airport to an auditorium where he was giving a lecture. He found himself explaining to the driver again and again that what he thought was science—Atlantis, aliens, crystals—was, in fact, not. "As we drove through the rain," Sagan writes, "I could see him getting glummer and glummer. I was dismissing not just some errant doctrine, but a precious facet of his inner life" (4). In Kahneman’s recollection of his drive back to the airport after his lecture, he writes of a conversation he had with his own driver, one of the execs he’d dined with the night before. 


He told me, with a trace of defensiveness, “I have done very well for the firm and no one can take that away from me.” I smiled and said nothing. But I thought, “Well, I took it away from you this morning. If your success was due mostly to chance, how much credit are you entitled to take for it?” (216)


Blinking at the Power of Intuitive Thinking



Malcolm Gladwell
            It wouldn’t surprise Kahneman at all to discover how much stories like these resonate. Indeed, he must’ve considered it a daunting challenge to conceive of a sensible, cognitively easy way to get all of his vast knowledge of biases and heuristics and unconscious, automatic thinking into a book worthy of the science—and worthy too of his own reputation—while at the same time tying it all together with some intuitive overarching theme, something that would make it read more like a novel than an encyclopedia. Malcolm Gladwell faced a similar challenge in writing Blink: The Power of Thinking Without Thinking, but he had the advantages of a less scholarly readership, no obligation to be comprehensive, and the freedom afforded to someone writing about a field he isn’t one of the acknowledged leaders and creators of. Ultimately, Gladwell’s book painted a pleasing if somewhat incoherent picture of intuitive thinking. The power he refers to in the title is the power intuition holds over the thoughts and actions of the thinker, not, as many must have presumed, the power to arrive at accurate conclusions.


It’s entirely possible that Gladwell’s misleading title came about deliberately, since there’s a considerable market for the message that intuition reigns supreme over science and critical thinking. But there are points in his book where it seems like Gladwell himself is confused. Robert Cialdini, Steve Martin, and Noah Goldstein cover some of the same research Kahneman and Gladwell do, but their book Yes!: 50 Scientifically Proven Ways to Be Persuasive is arranged in a list format, with each chapter serving as its own independent mini-essay.



Robert Cialdini
Early in Thinking, Fast and Slow, Kahneman introduces us to two characters, System 1 and System 2, who pass the controls of our minds back and forth between themselves according to the expertise and competence demanded by the current exigency or enterprise. System 1 is the more intuitive, easygoing guy, the one who does what Gladwell refers to as “thin-slicing,” the fast thinking of the title. System 2 works deliberately and takes effort on the part of the thinker. Most people find having to engage their System 2—multiply 17 by 24—unpleasant to one degree or another.


The middle part of the book introduces readers to two other characters, ones whose very names serve as a challenge to the field of economics. Econs are the beings market models and forecasts are based on. They are rational, selfish, and difficult to trick. Humans, the other category, show inconsistent preferences, changing their minds depending on how choices are worded or presented, are much more sensitive to the threat of loss than the promise of gain, are sometimes selfless, and not only can be tricked with ease but routinely trick themselves. Finally, Kahneman introduces us to our “Two Selves,” the two ways we have of thinking about our lives, either moment-to-moment—experiences he, along with Mihaly Csikszentmihalyi (author of Flow), pioneered the study of—or in abstract hindsight. It’s not surprising at this point that there are important ways in which the two selves tend to disagree.


Intuition and Cerebration


 The Econs versus Humans distinction, with its rhetorical purpose embedded in the terms, is plenty intuitive. The two selves idea, despite being a little too redolent of psychoanalysis, also works well. But the discussions about System 1 and System 2 are never anything but ethereal and abstruse. Kahneman’s stated goal was to discuss each of the systems as if they were characters in a plot, but he’s far too concerned with scientifically precise definitions to run with the metaphor. The term system is too bloodless and too suggestive of computer components; it’s too much of the realm of System 2 to be at all satisfying to System 1. The collection of characteristics Thinking links to the first system (see a list below) is lengthy and fascinating and not easily summed up or captured in any neat metaphor. But we all know what Kahneman is talking about. We could use mythological figures, perhaps Achilles or Orpheus for System 1 and Odysseus or Hephaestus for System 2, but each of those characters comes with his own narrative baggage. Not everyone’s System 1 is full of rage like Achilles, or musical like Orpheus. Maybe we could assign our System 1s idiosyncratic totem animals.
  

Mihaly Csikszentmihalyi
But I think the most familiar and the most versatile term we have for System 1 is intuition. It is a hairy and unpredictable beast, but we all recognize it. System 2 is actually the harder to name because people so often mistake their intuitions for logical thought. Kahneman explains why this is the case—because our cognitive resources are limited, our intuition often offers up simple questions as substitutes for more complicated ones—but we still need a term that doesn’t suggest complete independence from intuition and that doesn’t imply deliberate thinking operates flawlessly, like a calculator. I propose cerebration. The cerebral cortex rests on a substrate of other complex neurological structures. It’s more developed in humans than in any other animal. And the way it rolls trippingly off the tongue is as eminently appropriate as the swish of intuition. Both terms work well as verbs too. You can intuit, or you can cerebrate. And when your intuition is working in integrated harmony with your cerebration you are likely in the state of flow Csikszentmihalyi pioneered the study of.


While Kahneman’s division of thought into two systems never really resolves into an intuitively manageable dynamic, something he does throughout the book, which I initially thought was silly, now seems a stroke of brilliance. Kahneman has no faith in our ability to clean up our thinking. He’s an expert on all the ways thinking goes awry, and even he catches himself making all the common mistakes time and again. In the introduction, he proposes a way around the impenetrable wall of cognitive illusion and self-justification. If all the people gossiping around the water cooler are well-versed in the language describing biases and heuristics and errors of intuition, we may all benefit because anticipating gossip can have a profound effect on behavior. No one wants to be spoken of as the fool.


Kahneman writes, “it is much easier, as well as far more enjoyable, to identify and label the mistakes of others than to recognize our own.” It’s not easy to tell from his straightforward prose, but I imagine him writing lines like that with a wry grin on his face. He goes on,


Questioning what we believe and want is difficult at the best of times, and especially difficult when we most need to do it, but we can benefit from the informed opinions of others. Many of us spontaneously anticipate how friends and colleagues will evaluate our choices; the quality and content of these anticipated judgments therefore matters. The expectation of intelligent gossip is a powerful motive for serious self-criticism, more powerful than New Year resolutions to improve one’s decision making at work and at home. (3)


So we encourage the education of others to trick ourselves into trying to be smarter in their eyes. Toward that end, Kahneman ends each chapter with a list of sentences in quotation marks—lines you might overhear passing that water cooler if everyone where you work read his book. I think he’s overly ambitious. At some point in the future, you may hear lines like “They’re counting on denominator neglect” (333) in a boardroom—where people are trying to impress colleagues and superiors—but I seriously doubt you’ll hear it in the break room. Really, what he’s hoping is that people will start talking more like behavioral economists. Though some undoubtedly will, Thinking, Fast and Slow probably won’t ever be as widely read as, say, Freud’s lurid, pseudoscientific The Interpretation of Dreams. That’s a tragedy.


Still, it’s pleasant to think about a group of friends and colleagues talking about something other than football and American Idol.

Characteristics of System 1 (105): Try to come up with a good metaphor.

·         generates impressions, feelings, and inclinations; when endorsed by System 2 these become beliefs, attitudes, and intentions
·         operates automatically and quickly, with little or no effort, and no sense of voluntary control
·         can be programmed by System 2 to mobilize attention when particular patterns are detected (search)
·         executes skilled responses and generates skilled intuitions, after adequate training
·         creates a coherent pattern of activated ideas in associative memory
·         links a sense of cognitive ease to illusions of truth, pleasant feelings, and reduced vigilance
·         distinguishes the surprising from the normal
·         infers and invents causes and intentions
·         neglects ambiguity and suppresses doubt
·         is biased to believe and confirm
·         exaggerates emotional consistency (halo effect)
·         focuses on existing evidence and ignores absent evidence (WYSIATI)
·         generates a limited set of basic assessments
·         represents sets by norms and prototypes, does not integrate
·         matches intensities across scales (e.g., size and loudness)
·         computes more than intended (mental shotgun)
·         sometimes substitutes an easier question for a difficult one (heuristics)
·         is more sensitive to changes than to states (prospect theory)
·         overweights low probabilities.
·         shows diminishing sensitivity to quantity (psychophysics)
·         responds more strongly to losses than to gains (loss aversion)
·         frames decision problems narrowly, in isolation from one another

The Upper Hand in Relationships

         People perform some astoundingly clever maneuvers in pursuit of the upper hand in their romantic relationships, and some really stupid ones too. They try to make their partners jealous. They feign lack of interest. They pretend to have enjoyed wild success in the realm of dating throughout their personal histories, right up until the point at which they met their current partners. The edge in cleverness, however, is usually enjoyed by women—though you may be inclined to call it subtlety, or even deviousness.

            Some of the most basic dominance strategies used in romantic relationships are based either on one partner wanting something more than the other, or on one partner being made to feel more insecure than the other. We all know couples whose routine revolves around the running joke that the man is constantly desperate for sex, which allows the woman to set the terms he must meet in order to get some. His greater desire for sex gives her the leverage to control him in other domains. I’ll never forget being nineteen and hearing a friend a few years older say of her husband, “Why would I want to have sex with him when he can’t even remember to take out the garbage?” Traditionally, men held the family purse strings, so they—assuming they or their families had money—could hold out the promise of things women wanted more. Of course, some men still do this, giving their wives little reminders of how hard they work to provide financial stability, or dropping hints of their extravagant lifestyles to attract prospective dates.

            You can also get the upper hand on someone by taking advantage of his or her insecurities. (If that fails, you can try producing some.) Women tend to be the most vulnerable to such tactics at the moment of choice, wanting their features and graces and wiles to make them more desirable than any other woman prospective partners are likely to see. The woman who gets passed up in favor of another goes home devastated, likely lamenting the crass superficiality of our culture.

            Most of us probably know a man or two who, deliberately or not, manages to keep his girlfriend or wife in constant doubt when it comes to her ability to keep his attention. These are the guys who can’t control their wandering eyes, or who let slip offhand innuendos about incremental weight gain. Perversely, many women respond by expending greater effort to win his attention and his approval.

           Men tend to be the most vulnerable just after sex, in the Was-it-good-for-you moments. If you found yourself seething at some remembrance of masculine insensitivity reading the last paragraph, I recommend a casual survey of your male friends in which you ask them how many of their past partners at some point compared them negatively to some other man, or men, they had been with prior to the relationship. The idea that the woman is settling for a man who fails to satisfy her as others have plays into the narrative that he wants sex more—and that he must strive to please her outside the bedroom.
  
          If you can put your finger on your partner’s insecurities, you can control him or her by tossing out reassurances like food pellets to a trained animal. The alternative would be for a man to be openly bowled over by a woman’s looks, or for a woman to express in earnest her enthusiasm for a man’s sexual performances. These options, since they disarm, can be even more seductive; they can be tactics in their own right—but we’re talking next-level expertise here so it’s not something you’ll see very often.

           I give the edge to women when it comes to subtly attaining the upper hand in relationships because I routinely see them using a third strategy they seem to have exclusive rights to. Being the less interested party, or the most secure and reassuring party, can work wonders, but for turning proud people into sycophants nothing seems to work quite as well as a good old-fashioned guilt-trip.

           To understand how guilt-trips work, just consider the biggest example in history: Jesus died on the cross for your sins, and therefore you owe your life to Jesus. The illogic of this idea is manifold, but I don’t need to stress how many people it has seduced into a lifetime of obedience to the church. The basic dynamic is one of reciprocation: because one partner in a relationship has harmed the other, the harmer owes the harmed some commensurate sacrifice.
  
          I’m probably not the only one who’s witnessed a woman catching on to her man’s infidelity and responding almost gleefully—now she has him. In the first instance of this I watched play out, the woman, in my opinion, bore some responsibility for her husband’s turning elsewhere for love. She was brutal to him. And she believed his guilt would only cement her ascendancy. Fortunately, around that time they both realized she must not really love him, and they divorced.
  
          But the guilt need not be tied to anything as substantive as cheating. Our puritanical Christian tradition has joined forces in America with radical feminism to birth a bastard lovechild we encounter in the form of a groundless conviction that sex is somehow inherently harmful—especially to females. Women are encouraged to carry with them stories of the traumas they’ve suffered at the hands of monstrous men. And, since men are of a tribe, a pseudo-logic similar to the Christian idea of collective guilt comes into play. Whenever a man courts a woman steeped in this tradition, he is put on early notice—you’re suspect; I’m a trauma survivor; you need to be extra nice, i.e. submissive.

           It’s this idea of trauma, which can be attributed mostly to Freud, that can really make a relationship, and life, fraught and intolerably treacherous. Behaviors that would otherwise be thought inconsiderate or rude—a hurtful word, a wandering eye—are instead taken as malicious attempts to cause lasting harm. But the most troubling thing about psychological trauma is that belief in it is its own proof, even as it implicates a guilty party who therefore has no way to establish his innocence.
  
          Over the course of several paragraphs, we’ve gone from amusing but nonetheless real struggles many couples get caught up in to some that are just downright scary. The good news is that there is a subset of people who don’t see relationships as zero-sum games. (Zero-sum is a game theory term for interactions in which every gain for one party is a loss for the other. Non-zero-sum games are those in which cooperation can lead to mutual benefits.) The bad news is that they can be hard to find.
      
            There are a couple of things you can do now, though, that will help you avoid chess match relationships—or minimize the machinations in your current romance. First, ask yourself what dominance tactics you tend to rely on. Be honest with yourself. Recognizing your bad habits is the first step toward breaking them. And remember, the question isn’t whether you use tactics to try to get the upper hand; it’s which ones you use, and how often.

           The second thing you can do is cultivate the habit and the mutual attitude of what’s good for one is good for the other. Relationship researcher Arthur Aron says that celebrating your partner’s successes is one of the most important things you can do in a relationship. “That’s even more important,” he says, “than supporting him or her when things go bad.” Watch out for zero-sum responses, in yourself and in your partner. And beware of zero-summers in the realm of dating. Ladies, you know the guys who seem vaguely resentful of the power you have over them by dint of your good looks and social graces. And, guys, you know the women who make you feel vaguely guilty and set-upon every time you talk to them. The best thing to do is stay away.
       
     But you may be tempted, once you realize a dominance tactic is being used on you, to perform some kind of countermove. It’s one of my personal failings to be too easily provoked into these types of exchanges. It is a dangerous indulgence.

Why I Am Not a Feminist—and You Shouldn’t Be Either, Part 3: Engendering Gender Madness


             "As a professional debunker I feel like I know bunk when I see it, and Wertheim has well captured the genre: 'In all likelihood there will be an abundant use of CAPITAL LETTERS and exclamation points!!! Important sections will be underlined or bolded, or circled, for emphasis.'"


This is from Skeptic editor Michael Shermer's review of a book on the demarcation problem, the thorny question of how to recognize whether ideas are revolutionary or just, well, bunk. Obviously, if someone's writing begs for attention in a way that seems meretricious or unhinged, you're likely dealing with a bunk peddler. What to make, then, of these lines, to which I have not added any formatting?

"Honestly, I can’t think of a better way to make a girl in grade school question whether she’ll have any interest in or aptitude for science than to present her with a 'science for girls' kit."

"And, science kits that police these gender stereotypes run the risk of alienating boys from science, too."

"I really don’t think that science kits should be segregated by gender, but if you are going to segregate them at least make the experiments for girls NOT SO LAME."

"If girls are at all interested in science, then it must be in a pretty, feminine way that reinforces notions of beauty. It’s mystical. The chemistry of perfumery is hidden behind 'perfection.' But boys get actual physics and chemistry—just like that, with no fancy modifiers. This division is NOT okay..."

To the first, I’d say, really? You must have a very limited imagination. To the second, I’d say, really? Isn’t “police” a strong term for science kits sold at a toy store? I agree with the third, but I think the author needs to settle down. And to the fourth, I’d say, well, if the kids really want kits of this nature—and if they don’t want them the manufacturer won’t be offering them for long—you’d have to demonstrate that they actually cause some harm before you can say, in capitals or otherwise, they’re not okay.

Were these breathless fulminations posted on the pages of some poststructuralist site for feminist rants? The first and second are from philosopher Janet Stemwedel’s blog at Scientific American. The third is from a blog hosted by the American Geophysical Union and was written by geologist Evelyn Mervine. And the fourth is from anthropologist Krystal D’Costa’s blog, also at Scientific American.

           You’d hope these blog posts, as emphatic as they are, would provide links to some pretty compelling research on the dangers of pandering to kids’ and parents’ gender stereotypes. One of the posts has a link to a podcast about research on how vaginas are supposed to smell. Another of Stemwedel’s posts on the issue links to yet another post, by Christie Wilcox, in which she not-so-gently takes the journal Nature to task for publishing what was supposed to be a humorous piece on gender differences. It’s only through this indirect route that you can find any actual evidence—in any of these posts—that stereotyping is harmful. “Reinforcing negative gender stereotypes is anything but harmless,” Wilcox declares. But does humor based on stereotypes in fact reinforce them, or does it make them seem ridiculous? How far are we really willing to go to put a stop to this type of humor? It seems to me that gender and racial and religious stereotypes are the bread and butter of just about every comedian in the business.

            The science Wilcox refers to has nothing to do with humor but instead demonstrates a phenomenon psychologists call stereotype threat. It’s a fascinating topic—really one of the most fascinating in psychology in my opinion. It may even be an important factor in the underrepresentation of women in STEM fields. Still, the connection between research on stereotypes and performance—stereotype boost has also been documented—and humor is tenuous. And the connection with pink and pretty microscopes is even more nebulous.

           Helping women in STEM fields feel more welcome is a worthy cause. Gender stereotypes probably play some role in their current underrepresentation. I take these authors at their word that they routinely experience the ill effects of common misconceptions about women’s cognitive abilities, so I sympathize with their frustration to a degree. I even have to admit that it’s a testament to the success of past feminists that the societal injustices their modern counterparts rail against are so much less overt—so subtle. But they may actually be getting too subtle; decrying them sort of resembles the righteous, evangelical declaiming of conspiracy theorists. If you can imagine a way that somebody may be guilty of reinforcing stereotypes, you no longer even have to shoulder the burden of proving they’re guilty.

          The takeaway from all this righteously indignant finger-pointing is that you should never touch anything with even a remote resemblance to a stereotype. Allow me some ironic capitals of my own: STEREOTYPES BAD!!! This message, not surprisingly, even reaches into realms where a casual dismissal of science is fashionable, and skepticism about the value of empirical research, expressed in tortured prose, is an ascendant virtue—or maybe I have the direction of the influence backward.

           On two separate occasions now, one of my colleagues in the English department has posted the story of a baby named Storm on Facebook. Storm’s parents opted against revealing the newborn’s sex to anyone but immediate family, to protect her or him from those nasty stereotypes. In the comments under these links were various commendations and expressions of solidarity. Storm’s parents, most agreed, are heroes. Parents bragged about all their own children’s androgynous behavior, expressing their desire to rub it in the faces of “gender nazis.”

From the Toronto Star
             From what I can tell, Storm’s parents had no idea the story of their unorthodox parenting would go viral, so we probably shouldn’t condemn them for using their child to get media attention. And I don’t think the “experiment,” as some have called it, poses any direct threat to Storm’s psychological well-being. But Storm’s parents are tilting at windmills. They’re assuming that gender is something imposed on children by society—those chimerical gender nazis—through a process called socialization. The really disheartening thing is that even the bloggers at Scientific American make this mistake; they assume that sparkly pink science kits that help girls explore the chemistry of lipstick and perfume send direct messages about who and what girls should be, and that the girls will receive and embrace these messages without resistance, as if the little tykes were noble savages with pristine spirits forever vulnerable to the tragic overvaluing of outward beauty.

            When they’re thinking clearly, all parents know a simple truth that gets completely discounted in discussions of gender—it’s really hard to get through to your kids even with messages you’re sending deliberately and explicitly. The notion that you can accidentally send some subtle cue that’s going to profoundly shape a child’s identity deserves a lot more skepticism than it gets (ask my conservative parents, especially my Catholic mom). This is because identity is something children actively create for themselves, not the sum total of all the cultural assumptions foisted on them as they grow up. Children’s minds are not receptacles for all our ideological garbage. They rummage around for their own ideological garbage, and they don’t just pick up whatever they find lying around.

            Psychologist John Money was a prominent advocate of the theory that gender is determined completely through socialization. So he advised the parents of a six-month-old boy whose penis had been destroyed in a botched circumcision to have the testicles removed as well and to raise the boy as a girl. The boy, David Reimer, never thought of himself as a girl, despite his parents’ and Money’s efforts to socialize him as one. Money nevertheless kept declaring success, claiming Reimer (who was called Brenda at the time) proved his theory of gender development. By age 13, however, the poor kid was suicidal. At 14, he declared himself a boy, and later went on to get further surgeries to reconstruct his genitals. In John Colapinto’s account of his story, As Nature Made Him: The Boy Who Was Raised as a Girl, Reimer says that Money’s ministrations were in no way therapeutic—they were traumatic. Having read about Reimer in Steven Pinker’s book The Blank Slate: The Modern Denial of Human Nature, I thought of John Money every time I came across the term gender nazi in the Facebook comments about Storm (though I haven’t read Colapinto’s book in its entirety and don’t claim to know the case in enough detail to support such a severe charge).

            Reimer’s case is by no means the only evidence that gender identity and gender-typical behavior are heavily influenced by hormones. Psychiatrist William Reiner and urologist John Gearhart report that raising boys (who’ve been exposed in utero to more testosterone) as girls after surgery to remove underdeveloped sex organs tends not to result in feminine behaviors—or even feminine identity. Of the 16 boys in their study, 2 were raised as boys, while 14 were raised as girls. Five of the fourteen remained female throughout the study, but 4 spontaneously declared themselves to be male, and 4 others decided they were male after being informed of the surgery they’d undergone. All 16 of the children displayed “moderate to marked” degrees of male-typical behavior. The authors write, “At the initial assessment, the parents of only four subjects assigned to female sex reported that their child had never stated a wish to be a boy.”

            An earlier study of so-called pseudo-hermaphrodites, boys with a hormone disorder who are born looking like girls but who become more virile in adolescence, revealed that of 18 participants who were raised as girls, all but one changed their gender identity to male. There is also a condition some girls are born with called Congenital Adrenal Hyperplasia (CAH), which is characterized by an increased amount of male hormones in their bodies. It often leads to ambiguous genitalia and the need for surgery. But Sheri Berenbaum and J. Michael Bailey found that in the group of girls with CAH they studied, increased levels of male-typical behavior could not be explained by the development of male genitalia or the age of surgery. The hormones themselves are the likely cause of the differences.
From Psychology Today and Satoshi Kanazawa

           One particularly fascinating finding about kids’ preferences for toys comes from the realm of ethology. It turns out that rhesus monkeys show preferences for certain types of toys depending on their sex—and they’re the same preferences you would expect. Females will play with plush dolls or with wheeled vehicles, but males are much more likely to go for the cars and trucks. And the difference is even more pronounced in vervet monkeys, with both females and males spending significantly more time with the toys we might in other contexts call “stereotypical.” There’s even some good preliminary evidence that chimpanzees play with sticks differently depending on their sex, with males using them as tools or weapons and females cradling them like babies.

            Are gender roles based solely on stereotypes and cultural contingencies? In The Blank Slate, Pinker excerpts large sections of anthropologist Donald Brown’s inventory of behaviors that have been observed by ethnographers in all cultures that have been surveyed. Brown’s book is called Human Universals, and it casts serious doubt on theories that rule out every factor influencing development except socialization. Included in the inventory: “classification of sex,” “females do more direct child care,” “male and female and adult and child seen as having different natures,” “males more aggressive,” and “sex (gender) terminology is fundamentally binary” (435-8). These are observations about whole societies, not individuals, who vary much more dramatically from one to the next. The point isn’t that genes or biology determine behavioral outcomes; the relationship between biology and behavior isn’t mechanistic—it’s probabilistic. But the probabilities tend to be much higher than anyone in English departments assumes—higher even than the bloggers at Scientific American assume.

            Interestingly, even though there are persistent differences in math test scores between boys and girls—with boys’ scores showing the same average but stretching farther at each tail of the bell curve—researchers exploring women’s underrepresentation in STEM fields have ruled out the higher aptitude of a small subset of men as the most important factor. They’ve also ruled out socialization. Reviewing multiple sources of evidence, Stephen Ceci and Wendy Williams find that

            the omnipresent claim that sex differences in mathematics result from early socialization (i.e., parents and teachers inculcating a “math is for boys” attitude) fails empirical scrutiny. One cannot assert that socialization causes girls to opt out of math and science when girls take as many math and science courses as boys in grades K–12, achieve higher grades in them, and major in college math in roughly equal numbers to males. Moreover, survey evidence of parental attitudes and behaviors undermines the socialization argument, at least for recent cohorts. (3)

If it’s not ability, and it’s not socialization, then how do we explain the greater desire on the part of men to pursue careers in math-intensive fields? Ceci and Williams believe it’s a combination of divergent preferences and the biological constraints of childbearing. Women tend to be more interested in social fields, while men like fields with a focus on objects and abstractions. However, girls with CAH show preferences closer to those of boys. (Cool, huh?)
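The earlier point about test scores—same average, but boys’ scores stretching farther into each tail—can be sketched numerically. Every number below is a hypothetical illustration (the score scale and the roughly 10% difference in spread are stand-ins, not figures from the cited research): with identical means, even a modestly larger standard deviation substantially inflates representation far out in the right tail.

```python
# Illustrative sketch: equal means, different spreads, and what that does
# to the extreme right tail. All numbers are hypothetical.
from statistics import NormalDist

girls = NormalDist(mu=500, sigma=100)  # made-up test-score scale
boys = NormalDist(mu=500, sigma=110)   # same mean, ~10% larger SD

cutoff = 700  # two girl-SDs above the shared mean

# Proportion of each group scoring above the cutoff.
p_girls = 1 - girls.cdf(cutoff)
p_boys = 1 - boys.cdf(cutoff)

# With identical averages, the wider distribution is overrepresented
# among the very highest scorers (ratio comes out around 1.5 here).
print(p_boys > p_girls)
print(p_boys / p_girls)
```

The same logic applies symmetrically at the low end, which is why a variance difference alone says nothing about which group is "better" at math—only about the tails.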

  Ceci and Williams also point out that women who excel at math tend to score highly on tests of verbal reasoning as well, giving them more fields to choose from. (Update 3-26-2013: a recent longitudinal study replicates this finding.) This is interesting to me because if women are more likely to pursue careers dealing with people and words, they’re also more likely to be exposed to the strain of feminism that views science as just another male conspiracy to justify and perpetuate the patriarchal status quo. Poststructuralism and New Historicism are all the rage in the English department I study in, and deconstructing scientific texts is de rigueur. Might Derrida, Lacan, Foucault, and all their feminist successors be at fault for women’s underrepresentation in STEM fields at least as much as toys and stereotypes?

            I have little doubt that if society were arranged to optimize women’s interest in STEM fields they would be much better represented in them. But society isn’t a very easy thing to manipulate. We have to consider the possibility that the victory would be Pyrrhic. In any case, we should avoid treating children like ideological chess pieces. There’s good evidence that we couldn’t keep little kids from seeking gender cues even if we tried, and trying strikes me as cruel. None of this is to say that biology determines everything, or that gender role development is simple. In fact, my problem with the feminist view of gender is that it’s far too crude to account for such a complex phenomenon. The feminists are armchair pontificators at best and conspiracy theorists at worst. They believe stereotypes can only be harmful. That’s akin to saying that the rules of grammar serve solely to curtail our ability to freely express ourselves. While grammar need not be as rigid as many once believed, doing away with it altogether would reduce language to meaningless babble. Humans need stereotypes and roles. We cannot live in a cultural vacuum.

            At the same time, in keeping with the general trend toward tribalism, the feminists’ complaints about pink microscopes are unfair to boys and young men. Imagine being a science-obsessed teenage boy who comes across a bunch of rants on the website for your favorite magazine. They all say, in capital and bolded letters, that suggesting to girls that trying to be pretty is a worthwhile endeavor represents some outrageous offense, that it will cause catastrophic psychological and economic harm to them. It doesn’t take a male or female genius to figure out that the main source of teenage girls’ desire to be pretty is the realization that pretty girls get more attention from hot guys. If a toy can arouse so much ire for suggesting a girl might like to be pretty, then young guys had better control their responses to hot girls—think of the message it sends. So we’re back to the idea that male attraction is inherently oppressive. Since most men can’t help being attracted to women, well, shame on them, right? 


(Full disclosure: probably as a result of a phenomenon called assortative mating, I find ignorance of science to be a huge turn-off.)
Check out part 2 on "The Objectionable Concept of Objectification."
And part 1 on earnings.
These posts have generated pretty lengthy comment threads on Facebook, so stay tuned as well for updates based on my concession of points and links to further evidence.
And, as always, tell me what you think and share this with anyone you think would rip it apart (or anyone who might just enjoy it).
Update: Just a few minutes after posting this, I came across evolutionary psychologist Jesse Bering's Facebook update saying he was being unfairly attacked by feminists for his own Scientific American blog. If you'd like to show your solidarity, go to http://blogs.scientificamerican.com/bering-in-mind/.
Go here to read my response to commenters.

Gravitating Toward Tribal: The Danger of Free-Floating Ideologies

Image from the movie Zardoz. Courtesy of Thersic.com
          Ideologies are usually conceived through a coupling of comfortable tradition with a calculation of self-interest. But they can also be born of good-faith efforts at understanding. More important than their origin and development is the degree to which they are grounded. If you work out a comprehensive and adequately complex ideology that serves to explain an otherwise incomprehensible phenomenon and possibly even offers some guidance for dealing with an otherwise chaotic and frightening dynamic, you’ve created a theory that will appeal to human minds desperate for understanding and a sense, no matter how meager, of control. But does the ideology match up with reality? That’s an entirely different question.

            Free-floating ideologies, those that persist solely owing to the comforts they provide and the conveniences they secure, survive confrontations with reality and subsist despite vast lacunae in empirical support because human perception operates through a process of cross-referencing sensory inputs with prior knowledge. What we see is largely determined by what we’re looking for, and how we see it by what we believe about it. Patterns arising in what ought to be random incidents often sustain beliefs—even though in most contexts humans are terrible at calculating probabilities. A natural confirmation bias has us perceiving and remembering all the times predictions arising from our theories come to fruition while missing or forgetting all the times they fail. And we tend to enjoy the company of like-minded others, rather idiotically finding our convictions bolstered by their acceptance among those with whom we’ve chosen to associate.

            Unmoored ideologies gravitate toward certain predictable tracks in human cognition. We like to think there’s some sort of agency behind everything, an intelligence governing the universe. To think that no one’s in charge of all the swirling and colliding galaxies is variously unsettling and terrifying to us. So we take in the sublime beauty of quiet sunsets and wonder at the beneficence of the creator. Or we note coincidences in our lives, the way they fall together in a meaningful, beneficial way, and we feel a need to express gratitude to the guiding divinity. This is mostly innocent. Though it can lead to complacence and willful ignorance of entire regions where this supposedly beneficent guide has deigned never to set foot, and it can add an extra layer of grief in response to catastrophe, the comfort of believing in an invisible protector and guide has little immediate cost.

            Much more worrying is the gravitation of free-floating ideologies toward tribalism. The pseudo-scientific cult that has arisen around certain varieties of psychotherapy has bequeathed to our culture the horrifying belief that an unknown portion of the population, predominantly male, can induce the modern equivalent of demonic possession, severe psychological trauma, through an inverted laying-on of hands. The ideology has made monsters of men. The fetishizing of free markets likewise entails a belief in a loathsome variety of sub-humans. The economy, true believers assert, is a battle between the makers and the moochers, the producers and the parasites. As a conservative friend put it, in a discussion of healthcare reform, “Giving insurance to the slugs will just make them bigger slugs.”

            If you challenge someone’s beliefs by suggesting theirs is an ideology divorced from reality, as everyone does who advocates for one set of beliefs in opposition to another, the standard response is to insist that the ideology emerged from an awareness of facts through inductive reasoning. But sunsets, no matter how sublime, don’t really provide any evidence for the existence of an intelligent agency behind the curtain of the cosmos. Troubled young women with histories of abuse don’t prove that sexual experiences in childhood cause a wild assortment of psychological maladjustments. And the higher incarceration rate for impoverished groups doesn’t in any way establish some fundamental divide between good and bad types of people.

            Once ideologies reach a certain stage of development, they become all but immune to contradictory evidence. When the facts cooperate, they are trumpeted. When they don’t, the devout have recourse to principles. I’ve referred advocates of particular varieties of psychotherapy to evidence that they’re ineffective. In response, I didn’t get references to other bodies of evidence supporting the beliefs and practices in question; rather, I got an explanation of how the therapeutic techniques were supposed to work. Present a free market purist with evidence that market competition doesn’t lead to innovation, or leads to detrimental innovations, and you’ll likely get a lecture explaining the principles behind how it’s supposed to work, according to the free market ideology, rather than evidence that it does, in fact, work in the theorized way. This convenient toggling back and forth between inductive and deductive reasoning allows us to explain away disconnects between our ideologies and the world.

            It is the tendency of free-floating ideologies toward tribalism that leads me to advocate a strict adherence to science in matters of public concern. It wasn’t merely coincidence that the Enlightenment represented the inception of both the traditions of science and universal human rights, which have suffered through a traumatic childhood of their own, and are now living out a tumultuous adolescence. The tendency toward tribalism is also why I’m wary of commercial fiction, which almost invariably makes characters represent ideas and personal qualities, only to pit the good guys against the bad. J.K. Rowling can claim all she wants that the Harry Potter books teach kids the evils of bigotry, but any work with goodies and baddies taps into tribal instincts. Literary fiction, on the other hand, at its best, is an exercise in empathy.

Beliefs that Make You Feel Good Make You Look Good Too—But You’re a Total Asshole if You Let That Influence You

Imagine you are among a group of around thirty people on an island and over the past few weeks you’ve learned of the presence of another group living on the same island, one which has been showing signs of hostility toward your own group. Because of your wisdom, your group has appointed you the task of convening a selective gathering to devise a strategy for dealing with the looming threat. Among your group there happen to be several people with military training as well as some with experience in diplomacy. There are also individuals claiming psychic powers and religious authority. You understand that the composition of the gathering will be among the most important factors determining the consensus strategy it will arrive at. Who do you invite to participate? Who do you exclude?

(Full disclosure: the first strategy that occurs to me is to find a way to get the rival group’s attention and then execute the psychics and religious authorities for them to witness, letting them know afterward this treatment is what they can expect from us should they decide to continue their hostility.)

Beliefs have consequences. A psychic in our hypothetical group may be convinced that he’s seen the future and in it the home group stands victorious, having suffered no casualties, over the rival group. This vision allows an otherwise outvoted military aggressor to persuade everyone else a violent raid is the best course of action. A religious leader may feel it incumbent on her to serve as a missionary to the savages. This may lead to an attempt at diplomacy which backfires by offending the rival group’s own religious sensibilities. The fate of the home group is at stake. Whose opinions do you seek?

This imaginary scenario is meant to illustrate the point that an individual’s beliefs inevitably contribute to the culture and ultimately influence the fate of societies. While it is true that the larger the society the smaller the impact of any one person’s ideas, it is likewise the case that through a mechanism called social proof the stated ideas of individuals have multiplier effects far beyond what any one person believes. Social norms are a major determinant of what people accept as true. And many people may not question pieces of conventional wisdom simply because it has never occurred to them to do so—at least not until they encounter someone who espouses wisdom of an unconventional strain.

This point may seem obvious enough, and yet it represents a major departure from the dominant approach to considering beliefs in American culture. When confronted with a new idea Americans automatically and unconsciously apply a rigid formula to assessing its merits: they ask, first, how would believing this idea make me feel, and, second, how would believing this idea make me look to others? The order of these questions may be reversed, but no other questions ever enter the equation. The foundation of our culture is an ethic of consumerism, and so people decide what to believe exactly the same way they decide what music they want to claim as their favorite, and the same way they decide what type of t-shirt they’ll wear to advertise their personal style.

Savvy marketers, public relations experts, and profiteering charlatan shitbags are well aware of the extent to which consumerism determines our beliefs and behaviors. There’s no shortage of people in this country who will have nothing to do with politics because the topic is just not sexy at all; they know politicians are considered dishonest, petty, and even corrupt. Who would want to associate themselves with that? This general distaste for government and its policy disputes derives much of its fuel from each party’s attempts to brand the other in as off-putting a way as possible. I haven’t seen a survey that establishes the link, but I’d wager where people fall on the political spectrum is largely determined by whether they'd find it less acceptable to be thought of as naïve and effete or to be thought of as callous and lacking in compassion.

I try, as much as possible, to adhere to the Enlightenment values of devotion to science and championing of universal human rights. When people of the consumerist mindset discuss their beliefs with me, they are often baffled as to why I would insist on scientific skepticism with regard to supernatural ideas and pop culture myths. Science is so dry and mechanical. So, when I tell people what I believe, I usually get one of three responses: the first is to assume that my knowledge about research on some issue must be completely independent of my beliefs, because beliefs are personal and science is not. “Okay, you’ve told me what you know about the results of some experiments. But what do you really believe?”

The second response, equally in keeping with the consumerist ethic, is to assume that anyone so devoted to science must be a dry and mechanical person, the type who is incapable of tapping into his intuition, who insists on cold hard facts and bloodless statistics. After all, the reasoning goes, this guy chose his beliefs based on how he wanted to represent himself, so if he’s spouting off stats and experimental results he must have a pretty limited and robotic personality. It should go without saying—but unfortunately it doesn’t—that this reasoning is based on a gross misunderstanding of science and statistics alike. But the other mistake implicit in this response is the assumption that people can only decide what to believe according to how they want to represent themselves to others.

And yet it’s the third response that’s the most troubling. When you listen to someone’s beliefs about, say, supply-side economics, or religion, or alternative medicine and then start going into detail about why those beliefs are almost certainly wrong, many people will immediately conclude that there’s an ulterior motive behind your scientific skepticism. Because you have such a strong tendency to reject other people’s beliefs, they reason, you must simply be the type of person who enjoys making other people feel and look stupid. It’s not enough to wear your own favorite brand of t-shirt; you have to ridicule other people’s fashion sense. People who respond this way—you know who you are—can be counted on to violently assert themselves when you challenge them. They take your arguments very personally.

The true reason I’m devoted to science, though, is that I take responsibility for the consequences of my beliefs. What you believe has a direct impact on the culture around you, and an indirect impact on the course of society at large. If you like the fit of supply-side economics, if you explain to anyone who’ll listen how wealth at the top trickles down, and if you vote for conservative politicians, then you’re responsible for the results, positive or negative, of the implementation of those policies. In point of fact, the most reliable outcome of these policies is greater income inequality, which is associated with a host of societal ills from increased violent crime to higher infant mortality. I would argue that those signing on to the conservative agenda after these facts were established are complicit in the perpetuation of these social problems.

The position you take on any issue with broader social implications inevitably becomes more than a personal choice. And it’s more difficult than you may assume to come up with issues that don’t have broader social implications. Where, for instance, was your t-shirt made? What were the conditions the people who made it were working under? What effects did its manufacture have on the surrounding ecosystems? The plain fact is that any pure application of the consumerist ethic, whether to your choice of clothing or to what religion or political party you support, is profoundly irresponsible.

In my first novel, which I just recently completed, the characters address issues concerning recovered memories of child abuse. This is a topic I began researching as an undergrad studying psychology. It turns out the best research rules out the theory of repressed trauma with a high degree of certainty. Now, it shouldn’t require a great deal of trust on your part to believe I have no desire to associate myself in any way with the issue of child abuse, especially in any way that entails a risk of being perceived as wanting to defend or advocate it. But there are men in prison today convicted solely on the basis of evidence from recovered memories. If I simply toed the conventional line and neglected to thoroughly research the issue, or worse, if I ignored the products of that research, I would be complicit in the imprisonment of innocent men. This complicity extends to the seemingly innocent act of remaining silent when others around me are expressing views I know to be in error.

The tendency to rely on pure consumerism to assess ideas and to fail to take responsibility for their consequences is a trap all too easy to fall into. I can almost guarantee the shirt on your back right now was made in a third-world country under conditions you’d literally kill to keep your own children safe from. But most Americans are blithely ignorant of this. And I can attest it is exceedingly difficult and prohibitively expensive to limit your purchases to products made under more humane conditions. Manufacturers depend on American consumers being ignorant and irresponsible. And yet, under some circumstances, people’s reasoning becomes eminently practical. When your child gets sick, the sexiness of holistic medicine doesn’t lure you away from doctors trained in scientific medicine—though you may backslide if that first visit fails to cure them.

But how, you may ask, do you express your individuality if you are so committed to science? Alternatively, how can others assess your personality through your beliefs if they’re all based on some scientist’s research? Well, even if research were to prove somehow that it’s better to be extroverted than introverted, people have little control over such things. So it is with most personality traits. Science may also offer some hints about characteristics I ought to look for in a romantic partner, but ultimately which woman I pair up with will be determined by factors beyond the scope of any research project. Not every personal decision you make has wider societal consequences. Anyway, there’s plenty of room for individuality even for those of us thoroughly committed to taking responsibility for our actions and beliefs.

True Love with a Bloody Twist: The Uses and Abuses of the Sympathetic Vampire

             Two strangers lock eyes from across a crowded bar. She is a small-town waitress, living with her grandmother; he is a veteran returning home. They look at each other and experience a mutual frisson of seeming recognition. First they’re intrigued with each other, then, within moments, they’re enthralled. With their long, guileless stares, the involuntary shifting of their bodies to bring themselves helplessly forward, leaning toward one another then back in an unconscious dance—as if it didn’t matter that they’ve been aware of each other’s existence for only a few minutes—their eyes begin to dart away from that requited gaze toward each other’s lips; she tilts back her head, actually closes her eyes until she remembers where she is; any second he will heed the cues and pull her to him for the first kiss. But then they’re interrupted by rowdy patrons in the next booth over.

            There’s something not right about the following scene, which has our war-weary lover lying supine as those rowdy restaurant patrons rob him outside in the parking lot. And there’s something not right about how the waitress manages to fight them off, rescuing the helpless veteran. By the end of the first episode of True Blood, though, we see the waitress, Sookie Stackhouse, playing the more familiar role of damsel in distress—just in time for the credits to roll, as if the cliffhanger left any doubt about whether the veteran, Bill Compton, would play the role of knight. And yet the familiarity of the romantic plot at the heart of the series is well subsumed within the supernatural subject matter and countless other plotlines. Bill, it turns out, is a veteran not of the Iraq or Afghanistan wars, but of the Civil War. He is a 170-plus-year-old vampire. Those patrons, having subdued him with silver chains and inserted needles into his veins, were robbing him not of his wallet but of his blood, which can be sold to humans as a potent aphrodisiac.

            In terms of pure entertainment, True Blood is the best show I’ve seen in a long time. It aspires to seriousness by allegorizing the plight of LGBT people in modern America, while featuring a cast of characters transparently designed to explode stereotypes. Tara, an angry young black woman, is constantly reading, knows legal and medical jargon, and just wants to be loved. Lafayette, Tara’s cousin, a flamboyantly gay black cook who moonlights as a drug dealer and prostitute, is a Machiavellian mastermind who can whoop some ass. Jason, Sookie’s brother, a narcissistic horn-dog jock, has a heart of gold and can’t bear to see anyone hurt who doesn’t deserve it. Sookie herself, at first blush an innocent and dizzy blonde, all smiles, quick nervous laughs, and friendly manners, is telepathic, strong-willed, and boundlessly courageous, as she displays in her rescue of Bill. But most of the show’s appeal comes from traditional—you might even say conservative—storytelling.

            Bill is played by Stephen Moyer, who was 40 when the first season of True Blood was filmed, and Sookie by Anna Paquin, who was 27. (The actors, who began dating during the first season, are now married.) Sookie apparently likes her men older: one of the central plotlines in season one is the love triangle she and Bill form with Sam Merlotte, the owner of the bar where she works, who has neat tufts of gray hair and is played by Sam Trammell, 39 at the time. In season two, despite herself, Sookie takes a shine to Eric Northman, a thousand-year-old vampire played by Alexander Skarsgard, then 31.

            If the May-December pairings seem insignificant, there’s also the throwback that Sookie, a woman in her mid-twenties, is a virgin when she meets Bill. Her telepathy supposedly explains her sexual reticence, as hearing the raunchy thoughts of men her own age inevitably precludes the budding of any intimacy—a neat little plot device, that. But there’s no doubt what the writers are really up to in the episode that has Sookie donning a billowy white dress and running barefoot just after sunset to offer herself to Bill for the first time. The encounter takes place on a velvet blanket before a fireplace with candles on the mantle. Lest we get bored with this old-fashioned scene—or embarrassed by how much we’re enjoying it—Bill’s fangs emerge. “Do it. I want you to,” Sookie says. Sure enough, he plunges them into her neck, and lovingly licks up the gusher he’s caused. But the blood drinking is merely an interlude—in fact, it’s used as another cliffhanger—and the scene ends with Sookie’s orgasm. The sixteen-year-old boy in me exulted.

            What happens next is emblematic of the show’s worst vices. Sitting in the bathtub with Bill, Sookie reveals that she was once inappropriately touched by her great uncle when she was a young girl. What actually happened is obscured by her recollection of the man’s thoughts. The only offense actually depicted is his having her sit on his lap as he helps with her math homework. “It was just touching,” she says. “It wasn’t nearly as bad as what happens to some girls.” This storyline is superfluous, even gratuitous, meant simply to signal that Sookie is deep and complex. When Bill confronts the man, wheelchair-bound and in his eighties, the encounter is disturbing for all the wrong reasons. Bill is supposed to be struggling with his vampiric urge to kill, and the writers saw this subplot as an opportunity to let him backslide in a way that would, if anything, make him more sympathetic. But in a show so proud of its own sexual openness, the unceremonious execution of a helpless old man for an unvoiced and opaquely acted-out attraction he had no control over is unsettlingly hypocritical and unenlightened. (For punishment to qualify as altruistic, it needs to entail a cost or a risk to the punisher.)

            True Blood relies far too heavily on the trope of the haunted past for characterization. Watching the show, I keep imagining a Family Guy-style cutaway to Vince Vaughn in Wedding Crashers saying in a mock-tragic tone, “We lost a lot of good men out there.” This is bad psychology and lazy storytelling. The idea that personality is reducible to background lends itself to the very urge toward stereotyping that the show delights in frustrating. And yet the show has the redeeming quality of being snarkily aware of its own reliance on pulp fiction conventions. In a scene from season one that has Sam and Tara somewhat begrudgingly allowing themselves to fall into a courtship of sorts, he asks her why she likes Sookie’s brother Jason and why she doesn’t do anything about it. She responds in her endearing-annoying rhotic twang, “It’s part of my whole fucked-up thing: low self-esteem, childhood trauma, blah, blah, snore.”

            In season two of Mad Men, Don Draper responds to an idea for a TV show, “It’s derivative with a twist, which is what they’re looking for.” Shows like True Blood, The Vampire Diaries, and Twilight certainly fall into the derivative-with-a-twist category. They’re all traditional romances jazzed up with supposed monsters who turn out to be nice guys with tragic pasts. And they all center on prototypes Anne Rice deserves credit for, even though her stories weren’t romances at all. Bill, Stefan, and Edward are really all Louis. Eric, Damon, and—sorry, Twilight is just too awful to watch—are Lestat. What makes the HBO version so much better is certainly not that it has anything like the substance of Rice’s early installments of The Vampire Chronicles, which owe much of their profundity to her abandonment—unfortunately short-lived—of Catholicism and her wrestling with existentialism; True Blood is fun because unlike the other shows in the paranormal romance genre it doesn’t take itself so damn seriously.

            Not all of the comedy comes from the characters’ snarky remarks about the ridiculous plots they find themselves in. The hectic pacing does wonders to keep the tone light, an effect that harks back to a much earlier HBO series dealing in supernatural fare, Tales from the Crypt. Watching the episodes, I almost want to pull out a stopwatch and see if the editors are allotting each subplot its portion of the show according to some preset pattern. Indeed, the intricate workings of the multiple plots suggest nothing so much as the inner mechanics of a watch. And the writers are savvy enough never to answer a question without replacing it with three others.

            Ultimately, True Blood fails to be anywhere near as progressive as it seems to want to be. For all the collective wincing among the audience every time someone speaks of favoring his or her own kind, any show that pits good guys against bad guys both panders to and promotes tribalism, however it’s defined in the narrative context. Some of the characters who appear bad at first show signs of redeemability. In fact, the writers, in making Eric so much worse than Bill before having him break down in tears at the death of the vampire who made him and become inexplicably protective of Sookie, are at risk of letting him steal the show. But there are plenty of other characters—that uncle, Maryann, Lorena—who are simply beyond sympathy.

            The show does, however, have moments when it transcends its mere functionality as pure entertainment. There’s a scene in season one, for instance, in which Bill is sitting at a table in the kitchen of a church, his hand on a bottle of synthetic blood—ironically called true blood—awaiting the arrival of all the townspeople so he can give a speech about his memories of the Civil War, and all the while listening in, with his keen vampire hearing, as everyone remarks on the potential dangers in hosting a blood-thirsty monster. Just as you find yourself desperate for him to prove them all wrong and win them over, Sookie arrives on the arm of Sam Merlotte. Meanwhile, the gears of the watch are turning: Jason is tripping on vampire blood; his friend Hoyt is tempted to taste some true blood; the cops are keeping their ears open for clues about some murders, for which both Bill and Jason are suspects; and a group of miscreants is gearing up to ruin the lecture. For all its busy distractibility, the scene is masterful. As Bill walks out, takes the American flag from where it’s been draped over a cross for his benefit, hangs it on its pole, and continues winning over the crowd, just as you’d hoped, you know that’s character, in both senses of the term.

Cults and Conversion Narratives

            There are three main positions you can take that will inevitably spark an argument where I’m from. And the people who jump up to disagree always do so with the same strategy: they tell a story. If you tell people here you’re an economic liberal, you may get a brief refresher course on supply-side theory, but when you continue to disagree after hearing it, a story will inexorably follow, one that features the storyteller as a hero battling his or her way up from poverty into the proud and comfortable middle class. The implication is supposed to be that since the storyteller made it, it must be possible for everybody to make it. Hence financial safety nets and programs for the poor funded by the rich must be misguided ideas bound to fail.

            If you tell people you don’t believe in any god, there’s a slight chance you’ll get some inarticulate rehashing of the Argument from Design, but you’re much more likely to get a story. On this topic, there’s quite a bit of variety in the stories people tell. If the storyteller doesn’t have any loved ones who have died, you’ll likely get a story about an encounter with the supernatural. These stories always end with a statement along the lines of “There’s just no way to explain that,” or “There’s no way that could’ve been a coincidence.” But if the storyteller has had a loved one die, the story will be about how that loved one managed somehow to communicate from beyond the grave to let him or her know “they’re okay,” and “they’re waiting for me.” This supposedly proves there’s life after death, which somehow supposedly establishes the fact that some all-powerful deity presides over it.

            If you tell someone you don’t accept the theory of repressed memories, or point to evidence that there’s nothing especially damaging about childhood sexual abuse when compared to any other form of child abuse, you’ll first be called some choice names, then you’ll be accused of pedophilia yourself, and then finally you’ll get the poor woman’s story. There’s a lot of variation to these stories as well. But of course they all feature a male character in the role of evildoer. And they all end with a statement about how the storyteller continues to struggle with the resulting emotional turmoil and haunting memories to this day. (Repression and Severe Personality Disturbance from CSA are myths 13 and 34, respectively, in 50 Great Myths of Pop Psychology.)

            No matter which of the three topics you’re discussing, the storyteller will feel exhilarated at first because it seldom happens that they get a chance to spout wisdom to someone so hopelessly naïve. If you hold any of these three unpopular positions, you’ll get to hear lots and lots of stories, as if each storyteller is convinced theirs will be the story that finally converts you. But when you respond to their stories with alternative theories, describe ingeniously designed experiments, rattle off statistics, they’ll start to get uncomfortable. The next stage of the discussion will invariably entail the storyteller making a straw man of you: because you don’t answer their stories with your own, it’s assumed you don’t have any, and the reason is plain—you spend all your time reading. What follows will be a disparagement of “book learning,” an angry dismissal of what “you learned in some book,” and the general suggestion that you’ve lived your life sealed up in an Ivory Tower.

            I am a humanist. I believe the best we can do for humanity is to spread enlightenment principles as far, as wide, and as deep as possible. That’s why I’m skeptical of all these conversion narratives. It’s not just that the evidence doesn’t support them. Each one of them implicitly conveys a message of tribalism. The hero of the rags-to-Republican story is suggesting that he or she made it through virtue, and that the people who don’t make it have only themselves to blame. And don’t get them started on that shadowy outgroup, the government. It’s us versus them, and we’re better. The very basis of our ideas of good and evil rests on our innate proclivity to confuse the abstract with the supernatural. If you establish that even one supernatural event has occurred, you’ve simultaneously proven that some cosmic order underlies all existence. There are believers and infidels, saints and sinners. And if nearly every young girl in the world is living in the shadow of molestation by some unredeemable male predator, then we must all mobilize to do battle against this great evil. You’re either with us or you’re one of them.

            I do not accept the idea that man is fallen, or that humans are. As a humanist, I believe that we are the most exalted beings on the planet, and quite likely among the most exalted beings in the universe. We need to act on behalf of humanity, not for some invisible entity whose interests can never be known, not for any subgroup we see as superior by dint of our individual membership in it. If you can only defend your beliefs with conversion narratives, then you are a member of a cult. And our division into such cults is precisely the impediment we need to overcome. The solution to problems like war and poverty and child abuse lies not in converting more members to this or that cult, but in our ingenuity and imagination. Just look what we’ve accomplished. Imagine what else we could accomplish.

            Are all these conversion narratives completely false then? Personality psychologist Dan McAdams, in his book The Stories We Live By: Personal Myths and the Making of the Self, describes identity as “an inner story of the self that integrates the reconstructed past, perceived present, and anticipated future to provide a life with unity, purpose, and meaning.” To an adult, childhood is a welter of floating details and vague impressions. McAdams suggests that at some point we structure all this ambiguity into a set of narratives. Every time we recall an event we reinterpret or “reconstruct” it, making our memories much more malleable than most of us are comfortable admitting. The problem comes in as we reconstruct the past into a narrative that gives us purpose. Too often that purpose consists of recognizing or acknowledging evil and thenceforth doing battle against it. What we fail to realize is that the supposed evildoers have their own narratives.

            The culture in which we develop our identities provides the raw material of wider narratives for us to sample. Sometime in our late teens or early twenties, we choose elements from one or two of these and subsequently go back in time to carve the formless block of our pasts into sculptures we want to resemble ourselves. (This happened for me when at twenty-two I read Carl Sagan’s Demon-Haunted World.) Some of our memories may better lend themselves to integration into particular narratives, so it’s not as though our pasts have no bearing on who we become. But it’s also probably true that we overestimate the significance of any given experience because it’s hard to accept how insignificant most experiences are. Such thinking leads to existentialism, a doubting of all purpose. But I have a purpose. I am a humanist. I especially enjoy a good story—just not one with good guys and bad guys.

Eric Harris: Antisocial Aggressor or Narcissistic Avenger?

Coincident with my writing a paper defending Gabriel Conroy in James Joyce’s story “The Dead” from charges of narcissism leveled by Lacanian critics, my then girlfriend was preparing a presentation on the Columbine shooter Eric Harris which had her trying to determine whether he would have better fit the DSM-IV diagnostic criteria for Narcissistic or for Antisocial Personality Disorder. Everything about Harris screamed narcissist, but there was a deal-breaker for the diagnosis: people who hold themselves in astronomical esteem seem unlikely candidates for suicide, and Harris turned his gun on himself in culmination of his murder spree.

Clinical diagnoses are mere descriptive categorizations that don’t in any way explain behavior; at best, they may pave the way for explanations by delineating the phenomenon to be explained. Yet the nature of Harris’s thinking about himself has important implications for our understanding of other types of violence. Was he incapable of empathizing with others, unable to see and unwilling to treat them as feeling, sovereign beings, in keeping with an antisocial diagnosis? Or did he instead believe himself to be so superior to his peers that they simply didn’t merit sympathy or recognition, suggesting narcissism? His infamous journals suggest pretty unequivocally that the latter was the case. But again we must ask: would a real narcissist kill himself?

This seeming paradox was brought to my attention again this week as I was reading 50 Great Myths of Popular Psychology: Shattering Widespread Misconceptions about Human Behavior (about which I will very likely be writing more here). Myth #33 is that “Low Self-Esteem Is a Major Cause of Psychological Problems” (162). The authors take on the common misconception that the two boys responsible for the shootings were meek and shy and were constantly picked on until their anger boiled over into violence. (It turns out the boiling-over metaphor is wrong too, as explained under Myth #30: “It’s Better to Express Anger to Others than to Hold It in.”) The boys were indeed teased and taunted, but the experience didn’t seem to lower their view of themselves. “Instead,” the authors write, “Harris and Klebold’s high self-esteem may have led them to perceive the taunts of their classmates as threats to their inflated sense of self-worth, motivating them to seek revenge” (165).

Narcissists, they explain, “believe themselves deserving of special privileges” or entitlements. “When confronted with a challenge to their perceived worth, or what clinical psychologists term a ‘narcissistic injury,’ they’re liable to lash out at others” (165). We usually think of school shootings as random acts of violence, but maybe the Columbine massacre wasn’t exactly random. It may rather have been a natural response to perceived offenses—just one that went atrociously beyond the realm of what anyone would consider fair. If what Harris did on that day in April of 1999 was not an act of aggression but one of revenge, it may be useful to consider it in terms of costly punishment, a special instance of costly signaling.

The strength of a costly signal is commensurate with its cost, so Harris’s willingness both to kill and to die might have been his way of insisting that the offense he was punishing was deathly serious. What the authors of 50 Great Myths argue is that the perceived crime consisted of his classmates not properly recognizing and deferring to his superiority. Instead of contradicting the idea that Harris held himself in great esteem, then, his readiness to die for the sake of his message demonstrates just how superior he thought he was: in his mind the punishment was justified by the offense, and how seriously he took the slights of his classmates can be seen as an index of how superior to them he thought he was. The greater the difference in relative worth between Harris and his schoolmates, the greater the injustice.

Perceived relative status plays a role in all punishments. Between two people of equal status, such factors as any uncertainty regarding guilt, mitigating circumstances surrounding the offense, and concern for making the punishment proportional to the crime will enter into any consideration of just deserts. But the degree to which these factors are ignored can be used as an index of the power differential between the two individuals—or at least of the perceived power differential. Someone who feels infinitely superior will be willing to dish out infinite punishment. Absent a truly horrendous crime, revenge is a narcissistic undertaking.

Also read Sympathizing with Psychos: Why We Want to See Alex Escape His Fate as a Clockwork Orange.

And: The Mental Illness Zodiac: Why the DSM V Won't Be Anything But More Pseudoscience