Tribal Feminism

Are 1 in 5 Women Really Sexually Assaulted on College Campuses?

            If you were a university administrator and you wanted to know how prevalent a particular experience was for students on campus, you would probably conduct a survey that asked a few direct questions about that experience—foremost among them the question of whether the student had at some point had the experience you’re interested in. Obvious, right? Recently, we’ve been hearing from many news media sources, and even from President Obama himself, that one in five college women experience sexual assault at some time during their tenure as students. It would be reasonable to assume that the surveys used to arrive at this ratio actually asked the participants directly whether or not they had been assaulted. 

            But it turns out the web survey that produced the one-in-five figure did no such thing. Instead, it asked students whether they had had any of several categories of experience the study authors later classified as sexual assault, or attempted sexual assault, in their analysis. This raises the important question of how we should define sexual assault when we’re discussing the issue—along with the related question of why we’re not talking about a crime that’s more clearly defined, like rape. Of course, whatever you call it, sexual violence is such a horrible crime that most of us are willing to forgive anyone who exaggerates the numbers or paints an overly frightening picture of reality in an attempt to prevent future cases. (The issue is so serious that PolitiFact refrained from applying their trademark Truth-O-Meter to the one-in-five figure.) 

            But there are four problems with this attitude. The first is that for every supposed assault there is an alleged perpetrator. Dramatically overestimating the prevalence of the crime comes with the attendant risk of turning public perception against the accused, making it more difficult for the innocent to convince anyone of their innocence. The second problem is that by exaggerating the danger in an effort to protect college students we’re sabotaging any opportunity these young adults may have to make informed decisions about the risks they take on. No one wants students to die in car accidents either, but we don’t manipulate the statistics to persuade them one in five drivers will die in a crash before they graduate from college. The third problem is that going to college and experimenting with sex are for many people a wonderful set of experiences they remember fondly for the rest of their lives. Do we really want young women to barricade themselves in their dorms? Do we want young men to feel like they have to get signed and notarized documentation of consent before they try to kiss anyone? The fourth problem I’ll get to in a bit.

            We need to strike some appropriate balance in our efforts to raise awareness without causing paranoia or inspiring unwarranted suspicion. And that balance should be represented by the results of our best good-faith effort to arrive at as precise an understanding of the risk as our most reliable methods allow. For this purpose, the Department of Justice’s Campus Sexual Assault Study, the source of the oft-cited statistic, is all but completely worthless. It has limitations, to begin with, when it comes to representativeness, since it surveyed students on just two university campuses. And, while the overall sample was chosen randomly, the 42% response rate implies a great deal of self-selection on the part of the participants. The researchers did compare late responders to early ones to see if there was a systematic difference in their responses. But this doesn’t by any means rule out the possibility that many students chose categorically not to respond because they had nothing to say, and therefore had no interest in the study. (Some may have even found it offensive.) These are difficulties common to this sort of simple web-based survey, and they make interpreting the results problematic enough to recommend against their use in informing policy decisions.
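
To get a feel for why a low response rate matters, here is a minimal simulation of nonresponse bias. Everything in it is hypothetical: the true prevalence, the response probabilities, and the population size are invented for illustration and are not drawn from the CSA data. The only point is that if students with something to report are even modestly more likely to answer, a survey with a response rate in the low forties can overstate the true rate.

```python
import random

random.seed(0)

# All numbers below are invented for illustration; none come from the CSA study.
TRUE_RATE = 0.05            # assumed true prevalence in the full student population
RESPOND_IF_AFFECTED = 0.60  # assumed chance an affected student answers the survey
RESPOND_IF_NOT = 0.40       # assumed chance an unaffected student answers

population = 100_000
respondents = []
for _ in range(population):
    affected = random.random() < TRUE_RATE
    p_respond = RESPOND_IF_AFFECTED if affected else RESPOND_IF_NOT
    if random.random() < p_respond:
        respondents.append(affected)

response_rate = len(respondents) / population
estimated_rate = sum(respondents) / len(respondents)
print(f"response rate:  {response_rate:.0%}")   # ~41%, in the neighborhood of the CSA's 42%
print(f"estimated rate: {estimated_rate:.1%}")  # ~7%, inflated relative to the assumed 5%
```

The selection could in principle cut either way, which is exactly why a 42% response rate makes the headline number so hard to interpret.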

            The biggest problems with the study, however, are not with the sample but with the methods. The survey questions appear to have been deliberately designed to generate inflated incidence rates. The basic strategy of avoiding direct questions about whether the students had been the victims of sexual assault is often justified with the assumption that many young people can’t be counted on to know what actions constitute rape and assault. But attempting to describe scenarios in survey items to get around this challenge opens the way for multiple interpretations and discounts the role of countless contextual factors. The CSA researchers write, “A surprisingly large number of respondents reported that they were at a party when the incident happened.” Cathy Young, a contributing editor at Reason magazine who analyzed the study all the way back in 2011, wrote that

the vast majority of the incidents it uncovered involved what the study termed “incapacitation” by alcohol (or, rarely, drugs): 14 percent of female respondents reported such an experience while in college, compared to six percent who reported sexual assault by physical force. Yet the question measuring incapacitation was framed ambiguously enough that it could have netted many “gray area” cases: “Has someone had sexual contact with you when you were unable to provide consent or stop what was happening because you were passed out, drugged, drunk, incapacitated, or asleep?” Does “unable to provide consent or stop” refer to actual incapacitation – given as only one option in the question – or impaired judgment?  An alleged assailant would be unlikely to get a break by claiming he was unable to stop because he was drunk.

This type of confusion is why it’s important to design survey questions carefully. That the items in the CSA study failed to make the kind of fine distinctions that would allow for more conclusive interpretations suggests the researchers had other goals in mind.

The researchers’ use of the blanket term “sexual assault,” and their grouping of attempted with completed assaults, is equally suspicious. Any survey designer cognizant of all the difficulties of web surveys would likely try to narrow the focus of the study as much as possible, and they would also try to eliminate as many sources of confusion with regard to definitions or descriptions as possible. But, as Young points out,

The CSA Study’s estimate of sexual assault by physical force is somewhat problematic as well – particularly for attempted sexual assaults, which account for nearly two-thirds of the total. Women were asked if anyone had ever had or attempted to have sexual contact with them by using force or threat, defined as “someone holding you down with his or her body weight, pinning your arms, hitting or kicking you, or using or threatening to use a weapon.” Suppose that, during a make-out session, the man tries to initiate sex by rolling on top of the woman, with his weight keeping her from moving away – but once she tells him to stop, he complies. Would this count as attempted sexual assault?

The simplest way to get around many of these difficulties would have been to ask the survey participants directly whether they had experienced the category of crime the researchers were interested in. If the researchers were concerned that the students might not understand that being raped while drunk still counts as rape, why didn’t they just ask the participants a question to that effect? It’s a simple enough question to devise.

The study did pose a follow-up question to participants it classified as victims of forcible assault, the responses to which hint at the students’ actual thoughts about the incidents. It turns out 37 percent of so-called forcible assault victims explained that they hadn’t contacted law enforcement because they didn’t think the incident constituted a crime. That bears repeating: more than a third of the students the study says were forcibly assaulted didn’t think any crime had occurred. With regard to another category of victims, those of incapacitated assault, Young writes, “Not surprisingly, three-quarters of the female students in this category did not label their experience as rape.” Of those the study classified as actually having been raped while intoxicated, only 37 percent believed they had in fact been raped. Nearly two-thirds of the women the study labels as incapacitated rape victims didn’t believe they had been raped. Why so much disagreement on such a serious issue? Of the entire incapacitated sexual assault victim category, Young writes,

Two-thirds said they did not report the incident to the authorities because they didn’t think it was serious enough. Interestingly, only two percent reported having suffered emotional or psychological injury – a figure so low that the authors felt compelled to include a footnote asserting that the actual incidence of such trauma was undoubtedly far higher.

So the largest category making up the total one-in-five statistic is predominantly composed of individuals who didn’t think what happened to them was serious enough to report. And nearly all of them came away unscathed, both physically and psychologically.

            The impetus behind the CSA study was a common narrative about a so-called “rape culture” in which sexual violence is accepted as normal and young women fail to report incidents because they’re convinced you’re just supposed to tolerate it. That was the researchers’ rationale for using their own classification scheme for the survey participants’ experiences even when it was at odds with the students’ beliefs. But researchers have been doing this same dance for thirty years. As Young writes,

When the first campus rape studies in the 1980s found that many women labeled as victims by researchers did not believe they had been raped, the standard explanation was that cultural attitudes prevent women from recognizing forced sex as rape if the perpetrator is a close acquaintance. This may have been true twenty-five years ago, but it seems far less likely in our era of mandatory date rape and sexual assault workshops and prevention programs on college campuses.

The CSA also surveyed a large number of men, almost none of whom admitted to assaulting women. The researchers hypothesize that the men may have feared the survey wasn’t really anonymous, but that would mean they knew the behaviors in question were wrong. Again, if the researchers are really worried about mistaken beliefs regarding the definition of rape, they could investigate the issue with a few added survey items.

The huge discrepancies between incidences of sexual violence as measured by researchers and as reported by survey participants become even more suspicious in light of the history of similar studies. Those campus rape studies Young refers to from the 1980s produced a ratio of one in four. Their credibility was likewise undermined by later surveys that found that most of the supposed victims didn’t believe they’d been raped, and around forty percent of them went on to have sex with their alleged assailants again. A more recent study by the CDC used similar methods—a phone survey with a low response rate—and concluded that one in five women has been raped at some time in her life. Looking more closely at this study, feminist critic and critic of feminism Christina Hoff Sommers attributes this finding as well to “a non-representative sample and vaguely worded questions.” It turns out activists have been conducting different versions of this same survey, and getting the same wildly inflated results, for decades.

            Sommers challenges the CDC findings in a video everyone concerned with the issue of sexual violence should watch. We all need to understand that well-intentioned and intelligent people can, and often do, get carried away with activism that seems to have laudable goals but ends up doing more harm than good. Some people even build entire careers on this type of crusading. And PR has become so sophisticated that we never need to let a shortage, or utter lack, of evidence keep us from advocating for our favorite causes. But there’s still a fourth problem with crazily exaggerated risk assessments—they obfuscate issues of real importance, making it more difficult to come up with real solutions. As Sommers explains,

To prevent rape and sexual assault we need state-of-the-art research. We need sober estimates. False and sensationalist statistics are going to get in the way of effective policies. And unfortunately, when it comes to research on sexual violence, exaggeration and sensation are not the exception; they are the rule. If you hear about a study that shows epidemic levels of sexual violence against American women, or college students, or women in the military, I can almost guarantee the researchers used some version of the defective CDC methodology. Now by this method, known as advocacy research, you can easily manufacture a women’s crisis. But here’s the bottom line: this is madness. First of all it trivializes the horrific pain and suffering of survivors. And it sends scarce resources in the wrong direction. Sexual violence is too serious a matter for antics, for politically motivated posturing. And right now the media, politicians, rape culture activists—they are deeply invested in these exaggerated numbers.

So while more and more normal, healthy, and consensual sexual practices are considered crimes, actual acts of exploitation and violence are becoming all the more easily overlooked in the atmosphere of paranoia. And college students face the dilemma of either risking assault or accusation by going out to enjoy themselves or succumbing to the hysteria and staying home, missing out on some of the richest experiences college life has to offer.

            One in five is a truly horrifying ratio. As conservative crime researcher Heather Mac Donald points out, “Such an assault rate would represent a crime wave unprecedented in civilized history. By comparison, the 2012 rape rate in New Orleans and its immediately surrounding parishes was .0234 percent; the rate for all violent crimes in New Orleans in 2012 was .48 percent.” I don’t know how a woman can pass a man on a sidewalk after hearing such numbers and not look at him with suspicion. Most of the reforms rape culture activists are pushing for now chip away at due process and strip away the rights of the accused. No one wants to make coming forward any more difficult for actual victims, but our first response to anyone making such a grave accusation—making any accusation—should be skepticism. Victims suffer severe psychological trauma, but then so do the falsely accused. The strongest evidence of an honest accusation is often the fact that the accuser must incur some cost in making it. That’s why we say victims who come forward are heroic. That’s the difference between a victim and a survivor.
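
To put Mac Donald's comparison in rough quantitative terms, here is a bit of back-of-the-envelope arithmetic. It is my own illustration, not hers or the study's: it assumes the one-in-five risk is spread evenly over a four-year college career, then compares that annualized figure to the 2012 New Orleans rates she quotes.

```python
# Back-of-the-envelope comparison; the even four-year spread is an assumption.
claimed_college_risk = 1 / 5        # one-in-five over an entire college career
years_in_college = 4                # assumed career length
implied_annual_risk = claimed_college_risk / years_in_college   # 5% per year

new_orleans_rape_2012 = 0.000234    # 0.0234 percent, as quoted
new_orleans_violent_2012 = 0.0048   # 0.48 percent for all violent crime, as quoted

print(f"implied annual campus risk: {implied_annual_risk:.1%}")
print(f"ratio to New Orleans rape rate:         {implied_annual_risk / new_orleans_rape_2012:.0f} to 1")
print(f"ratio to all New Orleans violent crime: {implied_annual_risk / new_orleans_violent_2012:.0f} to 1")
```

Even with generous error bars, the claimed campus rate dwarfs the urban crime rates it is implicitly being measured against, which is exactly Mac Donald's point.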

Trumpeting crazy numbers creates the illusion that a large percentage of men are monsters, and this fosters an us-versus-them mentality that obliterates any appreciation for the difficulty of establishing guilt. That would be a truly scary world to live in. Fortunately, we in the US don’t really live in such a world. Sex doesn’t have to be that scary. It’s usually pretty damn fun. And the vast majority of men you meet—the vast majority of women as well—are good people. In fact, I’d wager most men would step in if they were around when some psychopath was trying to rape someone.



The Spider-Man Stars' Dust-up over Pseudo-Sexism

A new meaning of the word sexism has taken hold in the English-speaking world, even to the point where it’s showing up in official definitions. No longer used merely to describe the belief that women are somehow inferior to men, sexism can now refer to any belief in gender differences. Case in point: when Spider-Man star Andrew Garfield fielded a question from a young boy about how the superhero came by his iconic costume by explaining that he sewed it himself, even though sewing is “kind of a feminine thing to do,” Emma Gray and The Huffington Post couldn’t resist griping about Garfield’s “Casual Sexism” and celebrating his girlfriend Emma Stone’s “Most Perfect Way” of calling it out. Gray writes,

Instead of letting the comment—which assumes that there is something fundamentally female about sewing, and that doing such a “girly” thing must be qualified with a “masculine” outcome—slide, Stone turned the Q&A panel into an important teachable moment. She stopped her boyfriend and asked: “It's feminine, how?”

Those three words are underwhelming enough to warrant suspicion that Gray is really just cheerleading for someone she sees as playing for the right team.  

            A few decades ago, people would express beliefs about the proper roles and places for women quite openly in public. Outside of a few bastions of radical conservatism, you’re unlikely to hear anyone say today that women shouldn’t be allowed to run businesses or serve in high office. But rather than being leveled with decreasing frequency, the charge of sexism is now applied to a wider and more questionable assortment of ideas and statements. Surprised at having fallen afoul of this broadening definition of sexism, Garfield responded to Stone’s challenge by saying,

It’s amazing how you took that as an insult. It’s feminine because I would say femininity is about more delicacy and precision and detail work and craftsmanship. Like my mother, she’s an amazing craftsman. She in fact made my first Spider-Man costume when I was three. So I use it as a compliment, to compliment the feminine in women but in men as well. We all have feminine in us, young men.

Gray sees that last statement as a result of how Stone “pressed Garfield to explain himself.” Watch the video, though, and you’ll see she did little pressing. He seemed happy to explain what he meant. And that last line was actually a reiteration of the point he’d made originally by saying, “It’s kind of a feminine thing to do, but he really made a very masculine costume”—the line that Stone pounced on. 
  
            Garfield’s handling of both the young boy’s question and Stone’s captious interruption is far more impressive than Stone’s supposedly perfect way of calling him out. Indeed, Stone’s response was crudely ideological, implying quite simply that her boyfriend had revealed something embarrassing about himself—gotcha!—and encouraging him to expound further on his unacceptable ideas so she and the audience could chastise him. She had, like Gray, assumed that any reference to gender roles was sexist by definition. But did Garfield’s original answer to the boy’s question really reveal that he “assumes that there is something fundamentally female about sewing, and that doing such a ‘girly’ thing must be qualified with a ‘masculine’ outcome,” as Gray claims? (Note her deceptively inconsistent use of scare quotes and actual quotes.)

Garfield’s thinking throughout the exchange was quite sophisticated. First, he tried to play up Spider-Man’s initiative and self-sufficiency because he knew the young fan would appreciate these qualities in his hero. Then he seems to have realized that the young boy might be put off by the image of his favorite superhero engaging in an activity that’s predominantly taken up by women. Finally, he realized he could use this potential uneasiness as an opportunity for making the point that just because a male does something generally considered feminine that doesn’t mean he’s any less masculine. This is the opposite of sexism. So why did Stone and Gray cry foul? 

One of the tenets of modern feminism is that gender roles are either entirely chimerical or, to the extent that they exist, socially constructed. In other words, they’re nothing but collective delusions. Accepting, acknowledging, or referring to gender roles, then, especially in the presence of a young child, abets the perpetuation of these separate roles. Another tenet of modern feminism that comes into play here is that gender roles are inextricably linked to gender oppression. The only way for us as a society to move toward greater equality, according to this ideology, is for us to do away with gender roles altogether. Thus, when Garfield or anyone else refers to them as if they were real or in any way significant, he must be challenged.

One of the problems with Stone’s and Gray’s charge of sexism is that there happens to be a great deal of truth in every aspect of Garfield’s answer to the boy’s question. Developmental psychologists consistently find that young children really are preoccupied with categorizing behaviors by gender and that the salience of gender to children arises so reliably and at so young an age that it’s unlikely to stem from socialization. Studies have also consistently found that women tend to excel in tasks requiring fine motor skill, while men excel in most other dimensions of motor ability. And what percentage of men ever go beyond sewing buttons on their shirts—if they do even that? Why but for the sake of political correctness would anyone deny this difference? Garfield’s response to Stone’s challenge was also remarkably subtle. He didn’t act as though he’d been caught in a faux pas but instead turned the challenge around, calling Stone out for assuming he somehow intended to disparage women. He then proudly expounded on his original point. If anything, it looked a little embarrassing for Stone.

Modern feminism has grown over the past decade to include the push for LGBT rights. Historically, gender roles were officially sanctioned and strictly enforced, so it was understandable that anyone advocating for women’s rights would be inclined to question those roles. Today, countless people who don’t fit neatly into conventional gender categories are in a struggle with constituencies who insist their lifestyles and sexual preferences are unnatural. But even those of us who support equal rights for LGBT people have to ask ourselves if the best strategy for combating bigotry is an aggressive and wholesale denial of gender. Isn’t it possible to recognize gender differences, and even celebrate them, without trying to enforce them prescriptively? Can’t we accept the possibility that some average differences are innate without imposing definitions on individuals or punishing them for all the ways they upset expectations? And can’t we challenge religious conservatives for the asinine belief that nature sets up rigid categories, and for the idiotic assumption that biology is about order as opposed to diversity, instead of ignoring (or attacking) psychologists who study gender differences?

I think most people realize there’s something not just unbecoming but unfair about modern feminism’s anti-gender attitude. And most people probably don’t appreciate all the cheap gotchas liberal publications like The Huffington Post and The Guardian and Slate are so fond of touting. Every time feminists accuse someone of sexism for simply referring to obvious gender differences, they belie their own case that feminism is no more and no less than a belief in the equality of women. Only twenty percent of Americans identify themselves as feminists, while over eighty percent believe in equality for women. Feminism, like sexism, has clearly come to mean something other than what it used to. It may be that, just as the gender roles of the past century gradually came to be seen as too rigid, so too are that century’s ideologies increasingly coming to be seen as lacking in nuance, and their proponents as too quick to condemn. It may even be that we Americans and Brits no longer need churchy ideologies to tell us all people deserve to be treated equally.

Why I Won't Be Attending the Gender-Flipped Shakespeare Play

The Guardian’s “Women’s Blog” reports that “Gender-flips used to challenge sexist stereotypes are having a moment,” and this is largely owing, author Kira Cochrane suggests, to the fact that “Sometimes the best way to make a point about sexism is also the simplest.” This simple approach to making a point consists of taking a work of art or piece of advertising and swapping the genders featured in them. Cochrane goes on to point out that “the gender-flip certainly isn’t a new way to make a political point,” and notes that “it’s with the recent rise of feminist campaigning and online debate that this approach has gone mainstream.”

What is the political point gender-flips are making? As a dancer in a Jennifer Lopez video that reverses the conventional gender roles asks, “Why do men always objectify the women in every single video?” Australian comedian Christiaan Van Vuuren explains that he posed for a reproduction of a GQ cover originally featuring a sexy woman to call attention to the “over-sexualization of the female body in the high-fashion world.” The original cover photo of Miranda Kerr is undeniably beautiful. The gender-flipped version is funny. The obvious takeaway is that we look at women and men differently (gasp!). When women strike an alluring pose, or don revealing clothes, it’s sexy. When men try to do the same thing, it’s ridiculous. Feminists insist that this objectification or over-sexualization of women is a means of oppression. But is it? And are gender-flips simple ways of making a point, or just cheap gimmicks? 

            Tonight, my alma mater IPFW is hosting a production called “Juliet and Romeo,” a gender-flipped version of Shakespeare’s most recognizable play. The lead on the Facebook page for the event asks us to imagine that “Juliet is instead a bold Montague who courts a young, sheltered Capulet by the name of Romeo.” Lest you fear the production is just a stunt to make a political point about gender, the hosts have planned a “panel discussion focusing on Shakespeare, gender, and language.” Many former classmates and teachers, most of whom I consider friends and a couple of whom I consider good friends, are either attending or participating in the event. But I won’t be going.

I don’t believe the production is being put on in the spirit of open-minded experimentation. Like the other gender-flip examples, the purpose of staging “Juliet and Romeo” is to make a point about stereotypes. And I believe this proclivity toward using literature as fodder for ideological agendas is precisely what’s most wrong with English lit programs in today’s universities. There have to be better ways to foster interest in great works than by letting activists posing as educators use them as anvils against which to hammer agendas into students’ heads.

You may take the position that my objections would carry more weight were I to attend the event before rendering judgment on it. But I believe the way to approach literature is as an experience, not as a static set of principles or stand-alone abstractions. And I don’t want thoughts about gender politics to intrude on my experience of Shakespeare—especially when those thoughts are of such dubious merit. I want to avoid the experience of a gender-flipped production of Shakespeare because I believe scholarship should push us farther into literature—enhance our experience of it, make it more immediate and real—not cast us out of it by importing elements of political agendas and making us cogitate about some supposed implications for society of what’s going on before our eyes.

Regarding that political point, I see no contradiction in accepting, even celebrating, our culture’s gender roles while at the same time supporting equal rights for both genders. Sexism is a belief that one gender is inferior to the other. Demonstrating that people of different genders tend to play different roles in no way proves that either is being treated as inferior. As for objectification and over-sexualization, a moment’s reflection ought to make clear that the feminists are getting this issue perfectly backward. Physical attractiveness is one of the avenues through which women exercise power over men. Miranda Kerr got paid handsomely for that GQ cover. And what could be more arrantly hypocritical than Jennifer Lopez complaining about objectification in music videos? She owes her celebrity in large part to her willingness to allow herself to be objectified. The very concept of objectification is something we accept only out of long familiarity: people are sexually aroused by other people, not objects.

I’m not opposed to having a discussion about gender roles and power relations, but if you have something to say, then say it. I’m not even completely opposed to discussing gender in the context of Shakespeare’s plays. What I am opposed to is people hijacking our experience of Shakespeare to get some message across, people toeing the line by teaching that literature is properly understood by “looking at it through the lens” of one or another well-intentioned but completely unsupported ideology, and people misguidedly making sex fraught and uncomfortable for everyone. I doubt I’m alone in turning to literature, at least in part, to get away from that sort of churchy puritanism. Guilt-tripping guys and encouraging women to walk around with a chip on their shoulders must be one of the least effective ways we’ve ever come up with to get people to respect each other.

But, when you guys do a performance of the original Shakespeare, you can count on me being there to experience it. 


Update:

The link to this post on Facebook generated some heated commentary. Some were denials of ideological intentions on the part of those putting on the event. Some were mischaracterizations based on presumed “traditionalist” associations with my position. Some made the point that Shakespeare himself played around with gender, so it should be okay for others to do the same with his work. In the end, I did feel compelled to attend the event because I had taken such a strong position. Having flip-flopped and attended the event, I have to admit I enjoyed it. All the people involved were witty, charming, intellectually stimulating, and pretty much all-around delightful.

But, just as I had originally complained, it was quite clear—and at two points explicitly stated—that the "experiment" entailed using the play as a springboard for a discussion of current issues like marriage rights. Everyone, from the cast to audience members, was quick to insist after the play that they felt it was completely natural and convincing. But gradually more examples of "awkward," "uncomfortable," or "weird" lines or scenes came up. Shannon Bischoff, a linguist one commenter characterized as the least politically correct guy I’d ever meet, did in fact bring up a couple of aspects of the adaptation that he found troubling. But even he paused after saying something felt weird, as if to say, "Is that alright?" (Being weirded out about a 15-year-old Romeo being pursued by a Juliet in her late teens was okay because it was about age, not gender.)

The adapter himself, Jack Cant, said at one point that though he was tempted to rewrite some of the parts that seemed really strange he decided to leave them in because he wanted to let people be uncomfortable. The underlying assumption of the entire discussion was that gender is a "social construct" and that our expectations are owing solely to "stereotypes." And the purpose of the exercise was for everyone to be brought face-to-face with their assumptions about gender so that they could expiate them. I don't think any fair-minded attendee could deny the agreed-upon message was that this is a way to help us do away with gender roles—and that doing so would be a good thing. (If there was any doubt, Jack’s wife eliminated it when she stood up from her seat in the audience to say she wondered if Jack had learned enough from the exercise to avoid applying gender stereotypes to his nieces.) And this is exactly what I mean by ideology. Sure, Shakespeare played around with gender in As You Like It and Twelfth Night. But he did it for dramatic or comedic effect primarily, and to send a message secondarily—or more likely not at all.

For the record, I think biology plays a large (but of course not exclusive) part in gender roles, I enjoy and celebrate gender roles (love being a man; love women who love being women), but I also support marriage rights for homosexuals and try to be as accepting as I can of people who don't fit the conventional roles.

To make one further clarification: whether you support an anti-gender agenda and whether you think Shakespeare should be used as a tool for this or any other ideological agenda are two separate issues. I happen not to support anti-genderism. My main point in this post, however, is that ideology—good, bad, valid, invalid—should not play a part in literature education. Because, for instance, while students are being made to feel uncomfortable about their unexamined gender assumptions, they're not feeling uncomfortable about, say, whether Romeo might be rushing into marriage too hastily, or whether Juliet will wake up in time to keep him from drinking the poison—you know, the actual play. 

Whether Shakespeare was sending a message or not, I'm sure he wanted first and foremost for his audiences to respond to the characters he actually created. And we shouldn't be using "lenses" to look at plays; we should be experiencing them. They're not treatises. They're not coded allegories. And, as old as they may be to us, every generation of students gets to discover them anew. 

We can discuss politics and gender or whatever you want. There's a time and a place for that and it's not in a lit classroom. Sure, let's encourage students to have open minds about gender and other issues, and let's help them to explore their culture and their own habits of thought. There are good ways to do that—ideologically adulterated Shakespeare is not one of them.





The Self-Righteousness Instinct: Steven Pinker on the Better Angels of Modernity and the Evils of Morality

Steven Pinker is one of the few scientists who can write a really long book and still expect a significant number of people to read it. But I have a feeling many who might be vaguely intrigued by the buzz surrounding his 2011 book The Better Angels of Our Nature: Why Violence Has Declined wonder why he had to make it nearly seven hundred outsized pages long. Many curious folk likely also wonder why a linguist who proselytizes for psychological theories derived from evolutionary or Darwinian accounts of human nature would write a doorstop drawing on historical and cultural data to describe the downward trajectories of rates of the worst societal woes. The message that violence of pretty much every variety is at unprecedentedly low rates comes as quite a shock, as it runs counter to our intuitive, news-fueled sense of being on a crash course for Armageddon. So part of the reason behind the book’s heft is that Pinker has to bolster his case with lots of evidence to get us to rethink our views. But flipping through the book you find that somewhere between half and a third of its mass is devoted, not to evidence of the decline, but to answering the questions of why the trend has occurred and why it gives every indication of continuing into the foreseeable future. So is this a book about how evolution has made us violent or about how culture is making us peaceful?

The first thing that needs to be said about Better Angels is that you should read it. Despite its girth, it’s at no point the least bit cumbersome to read, and at many points it’s so fascinating that, weighty as it is, you’ll have a hard time putting it down. Pinker has mastered a prose style that’s simple and direct to the point of feeling casual without ever wanting for sophistication. You can also rest assured that what you’re reading is timely and important because it explores aspects of history and social evolution that impact pretty much everyone in the world but that have gone ignored—if not censoriously denied—by most of the eminences contributing to the zeitgeist since the decades following the last world war.

            Still, I suspect many people who take the plunge into the first hundred or so pages are going to feel a bit disoriented as they try to figure out what the real purpose of the book is, and this may cause them to falter in their resolve to finish reading. The problem is that the resistance Better Angels devotes such a prodigious page count to anticipating and answering doesn’t come from the news media or from the blinkered celebrities in the carnivals of sanctimonious imbecility that are political talk shows. It comes from Pinker’s fellow academics. The overall point of Better Angels remains obscure owing to some deliberate caginess on the author’s part when it comes to identifying the true targets of his arguments.

            This evasiveness doesn’t make the book difficult to read, but a quality of diffuseness to the theoretical sections, a multitude of strands left dangling, does at points make you doubt whether Pinker had a clear purpose in writing, which makes you doubt your own purpose in reading. With just a little tying together of those strands, however, you start to see that while on the surface he’s merely righting the misperception that over the course of history our species has been either consistently or increasingly violent, what he’s really after is something different, something bigger. He’s trying to instigate, or at least play a part in instigating, a revolution—or more precisely a renaissance—in the way scholars and intellectuals think not just about human nature but about the most promising ways to improve the lot of human societies.

The longstanding complaint about evolutionary explanations of human behavior is that by focusing on our biology as opposed to our supposedly limitless capacity for learning they imply a certain level of fixity to our nature, and this fixedness is thought to further imply a limit to what political reforms can accomplish. The reasoning goes, if the explanation for the way things are is to be found in our biology, then, unless our biology changes, the way things are is the way they’re going to remain. Since biological change occurs at the glacial pace of natural selection, we’re pretty much stuck with the nature we have. 

            Historically, many scholars have made matters worse for evolutionary scientists today by applying ostensibly Darwinian reasoning to what seemed at the time obvious biological differences between human races in intelligence and capacity for acquiring the more civilized graces, making no secret of their conviction that the differences justified colonial expansion and other forms of oppressive rule. As a result, evolutionary psychologists of the past couple of decades have routinely had to defend themselves against charges that they’re secretly trying to advance some reactionary (or even genocidal) agenda. Considering Pinker’s choice of topic in Better Angels in light of this type of criticism, we can start to get a sense of what he’s up to—and why his efforts are discombobulating.

If you’ve spent any time on a university campus in the past forty years, particularly if it was in a department of the humanities, then you have been inculcated with an ideology that was once labeled postmodernism but that eventually became so entrenched in academia, and in intellectual culture more broadly, that it no longer requires a label. (If you took a class with the word "studies" in the title, then you got a direct shot to the brain.) Many younger scholars actually deny any espousal of it—“I’m not a pomo!”—with reference to a passé version marked by nonsensical tangles of meaningless jargon and the conviction that knowledge of the real world is impossible because “the real world” is merely a collective delusion or social construction put in place to perpetuate societal power structures. The disavowals notwithstanding, the essence of the ideology persists in an inescapable but unremarked obsession with those same power structures—the binaries of men and women, whites and blacks, rich and poor, the West and the rest—and the abiding assumption that texts and other forms of media must be assessed not just according to their truth content, aesthetic virtue, or entertainment value, but also with regard to what we imagine to be their political implications. Indeed, those imagined political implications are often taken as clear indicators of the author’s true purpose in writing, which we must sniff out—through a process called “deconstruction,” or its anemic offspring “rhetorical analysis”—lest we complacently succumb to the subtle persuasion.

In the late nineteenth and early twentieth centuries, faith in what we now call modernism inspired intellectuals to assume that the civilizations of Western Europe and the United States were on a steady march of progress toward improved lives for all their own inhabitants as well as the world beyond their borders. Democracy had brought about a new age of government in which rulers respected the rights and freedom of citizens. Medicine was helping ever more people live ever longer lives. And machines were transforming everything from how people labored to how they communicated with friends and loved ones. Everyone recognized that the driving force behind this progress was the juggernaut of scientific discovery. But jump ahead a hundred years to the early twenty-first century and you see a quite different attitude toward modernity. As Pinker explains in the closing chapter of Better Angels,

A loathing of modernity is one of the great constants of contemporary social criticism. Whether the nostalgia is for small-town intimacy, ecological sustainability, communitarian solidarity, family values, religious faith, primitive communism, or harmony with the rhythms of nature, everyone longs to turn back the clock. What has technology given us, they say, but alienation, despoliation, social pathology, the loss of meaning, and a consumer culture that is destroying the planet to give us McMansions, SUVs, and reality television? (692)

The social pathology here consists of all the inequities and injustices suffered by the people on the losing side of those binaries all us closet pomos go about obsessing over. Then of course there’s industrial-scale war and all the other types of modern violence. With terrorism, the War on Terror, the civil war in Syria, the Israel-Palestine conflict, genocides in the Sudan, Kosovo, and Rwanda, and the marauding bands of drugged-out gang rapists in the Congo, it seems safe to assume that science and democracy and capitalism have contributed to the construction of an unsafe global system with some fatal, even catastrophic design flaws. And that’s before we consider the two world wars and the Holocaust. So where the hell is this decline Pinker refers to in his title?
            One way to think about the strain of postmodernism or anti-modernism with the most currency today (and if you’re reading this essay you can just assume your views have been influenced by it) is that it places morality and politics—identity politics in particular—atop a hierarchy of guiding standards above science and individual rights. So, for instance, concerns over the possibility that a negative image of Amazonian tribespeople might encourage their further exploitation trump objective reporting on their culture by anthropologists, even though there’s no evidence to support those concerns. And evidence that the disproportionate number of men in STEM fields reflects average differences between men and women in lifestyle preferences and career interests is ignored out of deference to a political ideal of perfect parity. The urge to grant moral and political ideals veto power over science is justified in part by all the oppression and injustice that abounds in modern civilizations—sexism, racism, economic exploitation—but most of all it’s rationalized with reference to the violence thought to follow in the wake of any movement toward modernity. Pinker writes,

“The twentieth century was the bloodiest in history” is a cliché that has been used to indict a vast range of demons, including atheism, Darwin, government, science, capitalism, communism, the ideal of progress, and the male gender. But is it true? The claim is rarely backed up by numbers from any century other than the 20th, or by any mention of the hemoclysms of centuries past. (193)

He gives the question even more gravity when he reports that all those other areas in which modernity is alleged to be such a colossal failure tend to improve in the absence of violence. “Across time and space,” he writes in the preface, “the more peaceable societies also tend to be richer, healthier, better educated, better governed, more respectful of their women, and more likely to engage in trade” (xxiii). So the question isn’t just about what the story with violence is; it’s about whether science, liberal democracy, and capitalism are the disastrous blunders we’ve learned to think of them as or whether they still just might hold some promise for a better world.
*******
            It’s in about the third chapter of Better Angels that you start to get the sense that Pinker’s style of thinking is, well, way out of style. He seems to be marching to the beat not of his own drummer but of some drummer from the nineteenth century. In the previous chapter, he drew a line connecting the violence of chimpanzees to that in what he calls non-state societies, and the images he’s left you with are savage indeed. Now he’s bringing in the philosopher Thomas Hobbes’s idea of a government Leviathan that, once established, immediately works to curb the violence that characterizes us humans in states of nature and anarchy. According to sociologist Norbert Elias’s The Civilizing Process (first published in 1939 and translated into English in 1969), a work whose thesis plays a starring role throughout Better Angels, the consolidation of a Leviathan in England set in motion a trend toward pacification, beginning with the aristocracy no less, before spreading down to the lower ranks and radiating out to the countries of continental Europe and onward thence to other parts of the world. You can measure your feelings of unease in response to Pinker’s civilizing scenario as a proxy for how thoroughly steeped you are in postmodernism.
            The two factors missing from his account of the civilizing pacification of Europe that distinguish it from the self-congratulatory and self-exculpatory sagas of centuries past are the innate superiority of the paler stock and the special mission of conquest and conversion commissioned by a Christian god. In a later chapter, Pinker violates the contemporary taboo against discussing—or even thinking about—the potential role of average group (racial) differences in a propensity toward violence, but he concludes the case for any such differences is unconvincing: “while recent biological evolution may, in theory, have tweaked our inclinations toward violence and nonviolence, we have no good evidence that it actually has” (621). The conclusion that the Civilizing Process can’t be contingent on congenital characteristics follows from the observation of how readily individuals from far-flung regions acquire local habits of self-restraint and fellow-feeling when they’re raised in modernized societies. As for religion, Pinker includes it in a category of factors that are “Important but Inconsistent” with regard to the trend toward peace, dismissing the idea that atheism leads to genocide by pointing out that “Fascism happily coexisted with Catholicism in Spain, Italy, Portugal, and Croatia, and though Hitler had little use for Christianity, he was by no means an atheist, and professed that he was carrying out a divine plan.” Though he cites several examples of atrocities incited by religious fervor, he does credit “particular religious movements at particular times in history” with successfully working against violence (677).

            Despite his penchant for blithely trampling on the taboos of the liberal intelligentsia, Pinker refuses to cooperate with our reflex to pigeonhole him with imperialists or far-right traditionalists past or present. He continually holds up to ridicule the idea that violence has any redeeming effects. In a section on the connection between increasing peacefulness and rising intelligence, he suggests that our violence-tolerant “recent ancestors” can rightly be considered “morally retarded” (658). He singles out George W. Bush as an unfortunate and contemptible counterexample in a trend toward more complex political rhetoric among our leaders. And if either gender comes out looking less than virtuous in Better Angels, it ain’t the distaff one. Pinker is difficult to categorize politically because he’s a scientist through and through. What he’s after are reasoned arguments supported by properly weighed evidence.

But there is something going on in Better Angels beyond a mere accounting of an ongoing decline in violence most of us don’t even realize we’re the beneficiaries of. For one, there’s a challenge to the taboo status of topics like genetic differences between groups, or differences between individuals in IQ, or differences between genders. And there’s an implicit challenge as well to the complementary premises he took on more directly in his earlier book The Blank Slate: that biological theories of human nature always lead to oppressive politics and that theories of the infinite malleability of human behavior always lead to progress (communism relies on a blank slate theory, and it inspired guys like Stalin, Mao, and Pol Pot to murder untold millions). But the most interesting and important task Pinker has set for himself with Better Angels is a restoration of the Enlightenment, with its twin pillars of science and individual rights, to its rightful place atop the hierarchy of our most cherished guiding principles, the position we as a society misguidedly allowed to be usurped by postmodernism, with its own dual pillars of relativism and identity politics.

 But, while the book succeeds handily in undermining the moral case against modernism, it does so largely by stealth, with only a few explicit references to the ideologies whose advocates have dogged Pinker and his fellow evolutionary psychologists for decades. Instead, he explores how our moral intuitions and political ideals often inspire us to make profoundly irrational arguments for positions that rational scrutiny reveals to be quite immoral, even murderous. As one illustration of how good causes can be taken to silly, but as yet harmless, extremes, he gives the example of how “violence against children has been defined down to dodgeball” (415) in gym classes all over the US, writing that

The prohibition against dodgeball represents the overshooting of yet another successful campaign against violence, the century-long movement to prevent the abuse and neglect of children. It reminds us of how a civilizing offensive can leave a culture with a legacy of puzzling customs, peccadilloes, and taboos. The code of etiquette bequeathed to us by this and other Rights Revolutions is pervasive enough to have acquired a name. We call it political correctness. (381)

Such “civilizing offensives” are deliberately undertaken counterparts to the fortuitously occurring Civilizing Process Elias proposed to explain the jagged downward slope in graphs of relative rates of violence beginning in the Middle Ages in Europe. The original change Elias describes came about as a result of rulers consolidating their territories and acquiring greater authority. As Pinker explains,

Once Leviathan was in charge, the rules of the game changed. A man’s ticket to fortune was no longer being the baddest knight in the area but making a pilgrimage to the king’s court and currying favor with him and his entourage. The court, basically a government bureaucracy, had no use for hotheads and loose cannons, but sought responsible custodians to run its provinces. The nobles had to change their marketing. They had to cultivate their manners, so as not to offend the king’s minions, and their empathy, to understand what they wanted. The manners appropriate for the court came to be called “courtly” manners or “courtesy.” (75)

And this higher premium on manners and self-presentation among the nobles would lead to a cascade of societal changes.

            Elias first lighted on his theory of the Civilizing Process as he was reading some of the etiquette guides that survived from that era. It’s striking to us moderns to see that knights of yore had to be told not to dispose of their snot by shooting it into their host’s tablecloth, but that simply shows how thoroughly people today internalize these rules. As Elias explains, they’ve become second nature to us. Of course, we still have to learn them as children. Pinker prefaces his discussion of Elias’s theory with a recollection of his bafflement at why it was so important for him as a child to abstain from using his knife as a backstop to help him scoop food off his plate with a fork. Table manners, he concludes, reside on the far end of a continuum of self-restraint at the opposite end of which are once-common practices like cutting off the nose of a dining partner who insults you. Likewise, protecting children from the perils of flying rubber balls is the product of a campaign against the once-common custom of brutalizing them. The centrality of self-control is the common underlying theme: we control our urge to misuse utensils, including their use in attacking our fellow diners, and we control our urge to throw things at our classmates, even if it’s just in sport. The effect of the Civilizing Process in the Middle Ages, Pinker explains, was that “A culture of honor—the readiness to take revenge—gave way to a culture of dignity—the readiness to control one’s emotions” (72). In other words, diplomacy became more important than deterrence.

            What we’re learning here is that even an evolved mind can adjust to changing incentive schemes. Chimpanzees have to control their impulses toward aggression, sexual indulgence, and food consumption in order to survive in hierarchical bands with other chimps, many of whom are bigger, stronger, and better-connected. Much of the violence in chimp populations takes the form of adult males vying for positions in the hierarchy so they can enjoy the perquisites males of lower status must forgo to avoid being brutalized. Lower-ranking males meanwhile bide their time, delaying gratification in the hope that they will eventually grow stronger or the alpha will grow weaker. In humans, the capacity for impulse-control and the habit of delaying gratification are even more important because we live in even more complex societies. Those capacities can either lie dormant or they can be developed to their full potential depending on exactly how complex the society is in which we come of age. Elias noticed a connection between the move toward more structured bureaucracies, less violence, and an increasing focus on etiquette, and he concluded that self-restraint in the form of adhering to strict codes of comportment was both an advertisement of, and a type of training for, the impulse-control that would make someone a successful bureaucrat.

            Aside from children who can’t fathom why we’d futz with our forks trying to capture recalcitrant peas, we normally take our society’s rules of etiquette for granted, no matter how inconvenient or illogical they are, seldom thinking twice before drawing unflattering conclusions about people who don’t bother adhering to them, the ones for whom they aren’t second nature. And the importance we place on etiquette goes beyond table manners. We judge people according to the discretion with which they dispose of any and all varieties of bodily effluent, as well as the delicacy with which they discuss topics sexual or otherwise basely instinctual. 

            Elias and Pinker’s theory is that, while the particular rules are largely arbitrary, the underlying principle of transcending our animal nature through the application of will, motivated by an appreciation of social convention and the sensibilities of fellow community members, is what marked the transition of certain constituencies of our species from a violent non-state existence to a relatively peaceful, civilized lifestyle. To Pinker, the uptick in violence that ensued once the counterculture of the 1960s came into full blossom was no coincidence. The squares may not have been as exciting as the rock stars who sang their anthems to hedonism and the liberating thrill of sticking it to the man. But a society of squares has certain advantages—a lower probability for each of its citizens of getting beaten or killed foremost among them.

            The Civilizing Process, as Elias and Pinker, along with Immanuel Kant, understand it, picks up momentum as levels of peace conducive to increasingly complex forms of trade are achieved. To understand why the move toward markets or “gentle commerce” would lead to decreasing violence, we pomos have to swallow—at least momentarily—our animus toward Wall Street and all the corporate fat cats in the top one percent of the wealth distribution. The basic dynamic underlying trade is that one person has access to more of something than they need, but less of something else, while another person has the opposite balance, so a trade benefits them both. It’s a win-win, or a positive-sum game. The hard part for educated liberals is to appreciate that economies work to increase the total wealth; there isn’t a set quantity everyone has to divvy up in a zero-sum game, an exchange in which every gain for one is a loss for another. And Pinker points to another benefit:

Positive-sum games also change the incentives for violence. If you’re trading favors or surpluses with someone, your trading partner suddenly becomes more valuable to you alive than dead. You have an incentive, moreover, to anticipate what he wants, the better to supply it to him in exchange for what you want. Though many intellectuals, following in the footsteps of Saints Augustine and Jerome, hold businesspeople in contempt for their selfishness and greed, in fact a free market puts a premium on empathy. (77)

The Occupy Wall Street crowd will want to jump in here with a lengthy list of examples of businesspeople being unempathetic in the extreme. But Pinker isn’t saying commerce always forces people to be altruistic; it merely encourages them to exercise their capacity for perspective-taking. Discussing the emergence of markets, he writes,

The advances encouraged the division of labor, increased surpluses, and lubricated the machinery of exchange. Life presented people with more positive-sum games and reduced the attractiveness of zero-sum plunder. To take advantage of the opportunities, people had to plan for the future, control their impulses, take other people’s perspectives, and exercise the other social and cognitive skills needed to prosper in social networks. (77)

And these changes, the theory suggests, will tend to make merchants less likely on average to harm anyone. As bad as bankers can be, they’re not out sacking villages.
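
To put the arithmetic of that zero-sum/positive-sum distinction in concrete terms, here is a minimal Python sketch with entirely made-up numbers of my own, not anything drawn from Pinker: each trader values what they lack more than what they hold in surplus, so a voluntary swap leaves both better off and raises the total instead of merely divvying it up.

# A toy illustration (invented numbers, not Pinker's) of why voluntary trade is a
# positive-sum game: each side values what it receives more than what it gives up,
# so each person's subjective total rises, and so does the sum for both together.

def total_value(holdings, values):
    """Sum one person's subjective valuation of the goods they hold."""
    return sum(values[good] * count for good, count in holdings.items())

# Alice has surplus grain and no fish; Bob has the reverse.
alice, bob = {"grain": 10, "fish": 0}, {"grain": 0, "fish": 10}

# Each values the scarce good three times as much as the surplus good.
alice_values, bob_values = {"grain": 1, "fish": 3}, {"grain": 3, "fish": 1}

before = total_value(alice, alice_values) + total_value(bob, bob_values)

# They swap five units of grain for five units of fish.
alice, bob = {"grain": 5, "fish": 5}, {"grain": 5, "fish": 5}
after = total_value(alice, alice_values) + total_value(bob, bob_values)

print(before, after)  # 20 40: both parties gain, and nothing was taken from anyone

In a zero-sum framing, by contrast, the total is fixed, and any gain in one column has to come out of the other.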

            Once you have commerce, you also have a need to start keeping records. And once you start dealing with distant partners, it helps to have a mode of communication that travels. As writing moved out of the monasteries, and as technological advances in transportation brought more of the world within reach, ideas and innovations collided to inspire sequential breakthroughs and discoveries. Every advance could be preserved, dispersed, and ratcheted up. Pinker focuses on two relatively brief historical periods that witnessed revolutions in the way we think about violence, and both came in the wake of major advances in the technologies involved in transportation and communication. The first is the Humanitarian Revolution that occurred in the second half of the eighteenth century, and the second covers the Rights Revolutions in the second half of the twentieth. The Civilizing Process and gentle commerce weren’t sufficient to end age-old institutions like slavery and the torture of heretics. But then came the rise of the novel as a form of mass entertainment, and with all the training in perspective-taking readers were undergoing, the hitherto unimagined suffering of slaves, criminals, and swarthy foreigners became intolerably imaginable. People began to agitate, and change ensued.

            The Humanitarian Revolution occurred at the tail end of the Age of Reason and is recognized today as part of the period known as the Enlightenment. According to some scholarly scenarios, the Enlightenment, for all its successes like the American Constitution and the abolition of slavery, paved the way for all those allegedly unprecedented horrors in the first half of the twentieth century. Notwithstanding all this ivory tower traducing, the Enlightenment emerged from dormancy after the Second World War and gradually gained momentum, delivering us into a period Pinker calls the New Peace. Just as the original Enlightenment was preceded by increasing cosmopolitanism, improving transportation, and an explosion of literacy, the transformations that brought about the New Peace followed a burst of technological innovation. For Pinker, this is no coincidence. He writes,

If I were to put my money on the single most important exogenous cause of the Rights Revolutions, it would be the technologies that made ideas and people increasingly mobile. The decades of the Rights Revolutions were the decades of the electronics revolutions: television, transistor radios, cable, satellite, long-distance telephones, photocopiers, fax machines, the Internet, cell phones, text messaging, Web video. They were the decades of the interstate highway, high-speed rail, and the jet airplane. They were the decades of the unprecedented growth in higher education and in the endless frontier of scientific research. Less well known is that they were also the decades of an explosion in book publishing. From 1960 to 2000, the annual number of books published in the United States increased almost fivefold. (477)

Violence got slightly worse in the 60s. But the Civil Rights Movement was underway, Women’s Rights were being extended into new territories, and people even began to acknowledge that animals could suffer, prompting arguments that we shouldn’t make them suffer needlessly. Today the push for Gay Rights continues. By 1990, the uptick in violence was over, and so far the move toward peace is looking like an ever greater success. Ironically, though, all the new types of media bringing images from all over the globe into our living rooms and pockets contribute to the sense that violence is worse than ever.
*******

            Three factors, then, brought about a reduction in violence over the course of history: strong government, trade, and communications technology. These factors had the impact they did because they interacted with two of our innate propensities, impulse-control and perspective-taking, giving individuals both the motivation and the wherewithal to develop each of them to ever greater degrees. It’s difficult to draw a clear line between developments that were driven by chance or coincidence and those driven by deliberate efforts to transform societies. But Pinker does credit political movements based on moral principles with having played key roles:

Insofar as violence is immoral, the Rights Revolutions show that a moral way of life often requires a decisive rejection of instinct, culture, religion, and standard practice. In their place is an ethics that is inspired by empathy and reason and stated in the language of rights. We force ourselves into the shoes (or paws) of other sentient beings and consider their interests, starting with their interest in not being hurt or killed, and we ignore superficialities that may catch our eye such as race, ethnicity, gender, age, sexual orientation, and to some extent, species. (475)

Some of the instincts we must reject in order to bring about peace, however, are actually moral instincts.

Pinker is setting up a distinction here between different kinds of morality. The kind he describes as based on perspective-taking—which, evidence he presents later suggests, inspires sympathy—and as “stated in the language of rights” is the kind he credits with transforming the world for the better. Of the idea that superficial differences shouldn’t distract us from our common humanity, he writes,

This conclusion, of course, is the moral vision of the Enlightenment and the strands of humanism and liberalism that have grown out of it. The Rights Revolutions are liberal revolutions. Each has been associated with liberal movements, and each is currently distributed along a gradient that runs, more or less, from Western Europe to the blue American states to the red American states to the democracies of Latin America and Asia and then to the more authoritarian countries, with Africa and most of the Islamic world pulling up the rear. In every case, the movements have left Western cultures with excesses of propriety and taboo that are deservedly ridiculed as political correctness. But the numbers show that the movements have reduced many causes of death and suffering and have made the culture increasingly intolerant of violence in any form. (475-6)

So you’re not allowed to play dodgeball at school or tell off-color jokes at work, but that’s a small price to pay. The most remarkable part of this passage, though, is the gradient he describes; it suggests the most violent regions of the globe are also the ones where people are the most obsessed with morality, with things like Sharia and so-called family values. It also suggests that academic complaints about the evils of Western culture are unfounded and startlingly misguided. As Pinker casually points out in his section on Women’s Rights, “Though the United States and other Western nations are often accused of being misogynistic patriarchies, the rest of the world is immensely worse” (413).
Jonathan Haidt
            The Better Angels of Our Nature came out about a year before Jonathan Haidt’s The Righteous Mind, but Pinker’s book beats Haidt’s to the punch by identifying a serious flaw in his reasoning. The Righteous Mind explores how liberals and conservatives conceive of morality differently, and Haidt argues that each conception is equally valid so we should simply work to understand and appreciate opposing political views. It’s not like you’re going to change anyone’s mind anyway, right? But the liberal ideal of resisting certain moral intuitions tends to bring about a rather important change wherever it’s allowed to be realized. Pinker writes that

right or wrong, retracting the moral sense from its traditional spheres of community, authority, and purity entails a reduction of violence. And that retraction is precisely the agenda of classical liberalism: a freedom of individuals from tribal and authoritarian force, and a tolerance of personal choices as long as they do not infringe on the autonomy and well-being of others. (637)

Classical liberalism—which Pinker distinguishes from contemporary political liberalism—can even be viewed as an effort to move morality away from the realm of instincts and intuitions into the more abstract domains of law and reason. The perspective-taking at the heart of Enlightenment morality can be said to consist of abstracting yourself from your identifying characteristics and immediate circumstances to imagine being someone else in unfamiliar straits. A man with a job imagines being a woman who can’t get one. A white man on good terms with law enforcement imagines being a black man who gets harassed. This practice of abstracting experiences and distilling individual concerns down to universal principles is the common thread connecting Enlightenment morality to science.

            So it’s probably no coincidence, Pinker argues, that as we’ve gotten more peaceful, people in Europe and the US have been getting better at abstract reasoning as well, a trend which has been going on for as long as researchers have had tests to measure it. Psychologists over the course of the twentieth century have had to renorm IQ tests (the average is always set at 100) by a few points every generation because raw scores on certain subsets of questions have kept going up. The steady rise in scores is known as the Flynn Effect, after James Flynn, one of the first researchers to recognize that the trend was more than methodological noise. Having posited a possible connection between scientific and moral reasoning, Pinker asks, “Could there be a moral Flynn Effect?” He explains,

We have several grounds for supposing that enhanced powers of reason—specifically, the ability to set aside immediate experience, detach oneself from a parochial vantage point, and frame one’s ideas in abstract, universal terms—would lead to better moral commitments, including an avoidance of violence. And we have just seen that over the course of the 20th century, people’s reasoning abilities—particularly their ability to set aside immediate experience, detach themselves from a parochial vantage point, and think in abstract terms—were steadily enhanced. (656)
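
To make the renorming arithmetic behind the Flynn Effect concrete, here is a rough Python sketch; the three-points-per-decade drift is my own assumed figure, roughly the magnitude commonly cited for the effect, not a number taken from Pinker or Flynn.

# A toy sketch of IQ renorming (the drift rate is an assumption for illustration,
# not a figure from Pinker). IQ is defined against the current cohort's average,
# which is always reset to 100, so the same raw performance earns a lower score
# when it is graded against newer, tougher norms.

DRIFT_PER_DECADE = 3      # assumed raw-score gain per decade, in IQ-point units
REFERENCE_YEAR = 2020     # year whose average raw performance we peg at 100

def cohort_mean_raw(year):
    """Average raw performance of the cohort tested in a given year."""
    return 100 + DRIFT_PER_DECADE * (year - REFERENCE_YEAR) / 10

def iq_on_norms(raw_score, norm_year):
    """Grade a raw performance against the norms established in norm_year."""
    return 100 + (raw_score - cohort_mean_raw(norm_year))

average_today = cohort_mean_raw(2020)      # 100.0 by construction
print(iq_on_norms(average_today, 2020))    # 100.0 on today's norms
print(iq_on_norms(average_today, 1990))    # 109.0 on norms from a generation ago

Someone exactly average today would have scored well above average against the norms of a generation ago, which is why the tests have to keep being re-anchored at 100.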

Pinker cites evidence from an array of studies showing that high-IQ people tend to have high moral IQs as well. One of them, an infamous study by psychologist Satoshi Kanazawa based on data from over twenty thousand young adults in the US, found that exceptionally intelligent people tend to hold a particular set of political views. And just as Pinker finds it necessary to distinguish between two different types of morality, he suggests we also need to distinguish between two different types of liberalism:

Intelligence is expected to correlate with classical liberalism because classical liberalism is itself a consequence of the interchangeability of perspectives that is inherent to reason itself. Intelligence need not correlate with other ideologies that get lumped into contemporary left-of-center political coalitions, such as populism, socialism, political correctness, identity politics, and the Green movement. Indeed, classical liberalism is sometimes congenial to the libertarian and anti-political-correctness factions in today’s right-of-center coalitions. (662)

And Kanazawa’s findings bear this out. It’s not liberalism in general that increases steadily with intelligence, but a particular kind of liberalism, the type focusing more on fairness than on ideology.
*******

Following the chapters devoted to historical change, from the early Middle Ages to the ongoing Rights Revolutions, Pinker includes two chapters on psychology, the first on our “Inner Demons” and the second on our “Better Angels.” Ideology gets some prime real estate in the Demons chapter, because, he writes, “the really big body counts in history pile up” when people believe they’re serving some greater good. “Yet for all that idealism,” he explains, “it’s ideology that drove many of the worst things that people have ever done to each other.” Christianity, Nazism, communism—they all “render opponents of the ideology infinitely evil and hence deserving of infinite punishment” (556). Pinker’s discussion of morality, on the other hand, is more complicated. It begins, oddly enough, in the Demons chapter, but stretches into the Angels one as well. This is how the section on morality in the Angels chapter begins:

The world has far too much morality. If you added up all the homicides committed in pursuit of self-help justice, the casualties of religious and revolutionary wars, the people executed for victimless crimes and misdemeanors, and the targets of ideological genocides, they would surely outnumber the fatalities from amoral predation and conquest. The human moral sense can excuse any atrocity in the minds of those who commit it, and it furnishes them with motives for acts of violence that bring them no tangible benefit. The torture of heretics and conversos, the burning of witches, the imprisonment of homosexuals, and the honor killing of unchaste sisters and daughters are just a few examples. (622)

The postmodern push to give precedence to moral and political considerations over science, reason, and fairness may seem like a good idea at first. But political ideologies can’t be defended on the grounds of their good intentions—they all have those. And morality has historically caused more harm than good. It’s only the minimalist, liberal morality that has any redemptive promise:

Though the net contribution of the human moral sense to human well-being may well be negative, on those occasions when it is suitably deployed it can claim some monumental advances, including the humanitarian reforms of the Enlightenment and the Rights Revolutions of recent decades. (622)

            One of the problems with ideologies that Pinker explores is that they lend themselves too readily to for-us-or-against-us divisions, which piggyback on all our tribal instincts, leading to the dehumanization of opponents as a step along the path to unrestrained violence. But, we may ask, isn’t the Enlightenment just another ideology? If not, is there some reliable way to distinguish an ideological movement from a “civilizing offensive” or a “Rights Revolution”? Pinker doesn’t answer these questions directly, but it’s in his discussion of the demonic side of morality that Better Angels offers its most profound insights—and it’s also where we start to be able to piece together the larger purpose of the book. He writes,

In The Blank Slate I argued that the modern denial of the dark side of human nature—the doctrine of the Noble Savage—was a reaction against the romantic militarism, hydraulic theories of aggression, and glorification of struggle and strife that had been popular in the late 19th and early 20th centuries. Scientists and scholars who question the modern doctrine have been accused of justifying violence and have been subjected to vilification, blood libel, and physical assault. The Noble Savage myth appears to be another instance of an antiviolence movement leaving a cultural legacy of propriety and taboo. (488)

Pinker figured that what he and his fellow evolutionary psychologists kept running up against was akin to the repulsion people feel toward poor table manners or kids winging balls at each other in gym class, so he reasoned that he ought to be able simply to explain to the critics that evolutionary psychologists have no intention of justifying, or even encouraging complacency toward, the dark side of human nature. “But I am now convinced,” he writes after more than a decade of trying to explain himself, “that a denial of the human capacity for evil runs even deeper, and may itself be a feature of human nature” (488). That feature, he goes on to explain, makes us feel compelled to label as evil anyone who tries to explain evil scientifically—because evil as a cosmic force beyond the reach of human understanding plays an indispensable role in group identity.

Roy Baumeister
            Pinker began to fully appreciate the nature of the resistance to letting biology into discussions of human harm-doing when he read about the work of psychologist Roy Baumeister exploring the wide discrepancies in accounts of anger-inducing incidents between perpetrators and victims. The first studies looked at responses to minor offenses, but Baumeister went on to present evidence that the pattern, which Pinker labels the “Moralization Gap,” can be scaled up to describe societal attitudes toward historical atrocities. Pinker explains,

The Moralization Gap consists of complementary bargaining tactics in the negotiation for recompense between a victim and a perpetrator. Like opposing counsel in a lawsuit over a tort, the social plaintiff will emphasize the deliberateness, or at least the depraved indifference, of the defendant’s action, together with the pain and suffering the plaintiff endures. The social defendant will emphasize the reasonableness or unavoidability of the action, and will minimize the plaintiff’s pain and suffering. The competing framings shape the negotiations over amends, and also play to the gallery in a competition for their sympathy and for a reputation as a responsible reciprocator. (491)

Another of the Inner Demons Pinker suggests plays a key role in human violence is the drive for dominance, which he explains operates not just at the level of the individual but at that of the group to which he or she belongs. We want our group, however we understand it in the immediate context, to rest comfortably atop a hierarchy of other groups. What happens is that the Moralization Gap gets mingled with this drive to establish individual and group superiority. You see this dynamic playing out even in national conflicts. Pinker points out,

The victims of a conflict are assiduous historians and cultivators of memory. The perpetrators are pragmatists, firmly planted in the present. Ordinarily we tend to think of historical memory as a good thing, but when the events being remembered are lingering wounds that call for redress, it can be a call to violence. (493)

Name a conflict, and with little effort you’ll likely be able to recall disputes over the historical record associated with it.

            When the Moralization Gap is scaled up to the level of group history, the outcome is what Pinker and Baumeister call the “Myth of Pure Evil.” Harm-doing narratives start to take on religious overtones as what began as a conflict between ordinary humans pursuing or defending their interests, in ways they probably reasoned were just, transforms into an eternal struggle against inhuman and sadistic agents of chaos. And Pinker has come to realize that it is this Myth of Pure Evil that behavioral scientists ineluctably end up blaspheming:

Baumeister notes that in the attempt to understand harm-doing, the viewpoint of the scientist or scholar overlaps with the viewpoint of the perpetrator. Both take a detached, amoral stance toward the harmful act. Both are contextualizers, always attentive to the complexities of the situation and how they contributed to the causation of the harm. And both believe that the harm is ultimately explicable. (495)

This is why evolutionary psychologists who study violence inspire what Pinker in The Blank Slate called “political paranoia and moral exhibitionism” (106) on the part of us naïve pomos, ravenously eager to showcase our valor by charging once more into the breach against the mythical malevolence. All the while, our impregnable assurance of our own righteousness is born of the conviction that we’re standing up for the oppressed. Pinker writes,

The viewpoint of the moralist, in contrast, is the viewpoint of the victim. The harm is treated with reverence and awe. It continues to evoke sadness and anger long after it was perpetrated. And for all the feeble ratiocination we mortals throw at it, it remains a cosmic mystery, a manifestation of the irreducible and inexplicable existence of evil in the universe. Many chroniclers of the Holocaust consider it immoral even to try to explain it. (495-6)

We simply can’t help inflating the magnitude of the crime in our attempt to convince our ideological opponents of their folly—though what we’re really inflating is our own, and our group’s, glory—and so we can’t abide anyone puncturing our overblown conception, because doing so lends credence to the opposition and makes us look a bit foolish for all our exaggerations.

            Reading Better Angels, you get the sense that Pinker experienced some genuine surprise and some real delight in discovering more and more corroboration for the idea that rates of violence have been trending downward in nearly every domain he explored. But things get tricky as you proceed through the pages because many of his arguments take on opposing positions he avoids naming. He seems to have seen the trove of evidence for declining violence as an opportunity to outflank the critics of evolutionary psychology in leftist, postmodern academia (to use a martial metaphor). Instead of calling them out directly, he circles around to chip away at the moral case for their political mission. We see this, for example, in his discussion of rape, which psychologists get into all kinds of trouble for trying to explain. After examining how scientists seem to be taking the perspective of perpetrators, Pinker goes on to write,

The accusation of relativizing evil is particularly likely when the motive the analyst imputes to the perpetrator appears to be venial, like jealousy, status, or retaliation, rather than grandiose, like the persistence of suffering in the world or the perpetuation of race, class, or gender oppression. It is also likely when the analyst ascribes the motive to every human being rather than to a few psychopaths or to the agents of a malignant political system (hence the popularity of the doctrine of the Noble Savage). (496)

In his earlier section on Women’s Rights and the decline of rape, he attributed the difficulty in finding good data on the incidence of the crime, as well as some of the “preposterous” ideas about what motivates it, to the same kind of overextensions of anti-violence campaigns that lead to arbitrary rules about the use of silverware and proscriptions against dodgeball:

Common sense never gets in the way of a sacred custom that has accompanied a decline in violence, and today rape centers unanimously insist that “rape or sexual assault is not an act of sex or lust—it’s about aggression, power, and humiliation, using sex as the weapon. The rapist’s goal is domination.” (To which the journalist Heather MacDonald replies: “The guys who push themselves on women at keggers are after one thing only, and it’s not a reinstatement of the patriarchy.”) (406)

Jumping ahead to Pinker’s discussion of the Moralization Gap, we see that the theory that rape is about power, as opposed to the much more obvious theory that it’s about sex, is an outgrowth of the Myth of Pure Evil, an inflation of the mundane drives that lead some pathetic individuals to commit horrible crimes into eternal cosmic forces, inscrutable and infinitely punishable.

            When feminists impute political motives to rapists, they’re crossing the boundary from Enlightenment morality to the type of moral ideology that inspires dehumanization and violence. The good news is that it’s not difficult to distinguish between the two. From the Enlightenment perspective, rape is indefensibly wrong because it violates the autonomy of the victim—it’s an act of violence perpetrated by one individual against another. From the ideological perspective, every rape must be understood in the context of the historical oppression of women by men; it transcends the individuals involved as a representation of a greater evil. The rape-as-a-political-act theory also comes dangerously close to implying a type of collective guilt, which is a clear violation of individual rights.

Scholars already make the distinction between three different waves of feminism. The first two fall within Pinker’s definition of Rights Revolutions; they encompassed pushes for suffrage, marriage rights, and property rights, and then the rights to equal pay and equal opportunity in the workplace. The third wave is avowedly postmodern, its advocates committed to the ideas that gender is a pure social construct and that suggesting otherwise is an act of oppression. What you come away from Better Angels realizing, even though Pinker doesn’t say it explicitly, is that somewhere between the second and third waves feminists effectively turned against the very ideas and institutions that had been most instrumental in bringing about the historical improvements in women’s lives from the Middle Ages to the turn of the twenty-first century. And so it is with all the other ideologies on the postmodern roster.

Another misguided propaganda tactic that dogged Pinker’s efforts to identify historical trends in violence can likewise be understood as an instance of inflating the severity of crimes on behalf of a moral ideology—and of the taboo against puncturing the bubble, against vitiating the purity of evil with evidence and theories of venial motives. As he explains in the preface, “No one has ever recruited activists to a cause by announcing that things are getting better, and bearers of good news are often advised to keep their mouths shut lest they lull people into complacency” (xxii). Here again the objective researcher can’t escape the appearance of trying to minimize the evil, and therefore risks being accused of looking the other way, or even of complicity. But in an earlier section on genocide Pinker provides the quintessential Enlightenment rationale for the clear-eyed scientific approach to studying even the worst atrocities. He writes,

The effort to whittle down the numbers that quantify the misery can seem heartless, especially when the numbers serve as propaganda for raising money and attention. But there is a moral imperative in getting the facts right, and not just to maintain credibility. The discovery that fewer people are dying in wars all over the world can thwart cynicism among compassion-fatigued news readers who might otherwise think that poor countries are irredeemable hellholes. And a better understanding of what drove the numbers down can steer us toward doing things that make people better off rather than congratulating ourselves on how altruistic we are. (320)

This passage can be taken as the underlying argument of the whole book. And it gestures toward some far-reaching ramifications of the idea that exaggerated numbers are a product of the same impulse that causes us to inflate crimes to the status of pure evil.

Could it be that the nearly universal misperception that violence is getting worse all over the world, that we’re doomed to global annihilation, and that everywhere you look is evidence of the breakdown in human decency—could it be that the false impression Pinker set out to correct with Better Angels is itself a manifestation of a natural urge in all of us to seek out evil and aggrandize ourselves by unconsciously overestimating it? Pinker himself never goes as far as suggesting the mass ignorance of waning violence is a byproduct of an instinct toward self-righteousness. Instead, he writes of the “gloom” about the fate of humanity,

I think it comes from the innumeracy of our journalistic and intellectual culture. The journalist Michael Kinsley recently wrote, “It is a crushing disappointment that Boomers entered adulthood with Americans killing and dying halfway around the world, and now, as Boomers reach retirement and beyond, our country is doing the same damned thing.” This assumes that 5,000 Americans dying is the same damned thing as 58,000 Americans dying, and that a hundred thousand Iraqis being killed is the same damned thing as several million Vietnamese being killed. If we don’t keep an eye on the numbers, the programming policy “If it bleeds it leads” will feed the cognitive shortcut “The more memorable, the more frequent,” and we will end up with what has been called a false sense of insecurity. (296)

Pinker probably has a point, but the self-righteous undertone of Kinsley’s “same damned thing” is unmistakable. He’s effectively saying, I’m such an outstanding moral being the outrageous evilness of the invasion of Iraq is blatantly obvious to me—why isn’t it to everyone else? And that same message seems to underlie most of the statements people make expressing similar sentiments about how the world is going to hell.

            Though Pinker neglects to tie all the strands together, he still manages to suggest that the drive for dominance, ideology, tribal morality, and the Myth of Pure Evil are all facets of the same disastrous flaw in human nature—an instinct for self-righteousness. Progress on the moral front—real progress like fewer deaths, less suffering, and more freedom—comes from something much closer to utilitarian pragmatism than to activist idealism. Yet the activist tradition is so thoroughly enmeshed in our university culture that we’re taught to exercise our powers of political righteousness even while engaging in tasks as mundane as reading books and articles.

            If the decline in violence and the improvement of the general weal in various other areas are attributable to the Enlightenment, then many of the assumptions underlying postmodernism are turned on their heads. If social ills like warfare, racism, sexism, and child abuse exist in cultures untouched by modernity—and in fact they not only exist but tend to be much worse—then science can’t be responsible for creating them. Indeed, if they’ve all trended downward with the historical development of all the factors associated with male-dominated Western culture, including strong government, market economies, runaway technology, and scientific progress, then postmodernism not only has everything wrong but threatens the progress achieved by the very institutions it depends on, emerged from, and squanders innumerable scholarly careers maligning.

Of course some Enlightenment figures and some scientists do evil things. Of course living even in the most Enlightened of civilizations is no guarantee of safety. But postmodernism is an ideology based on the premise that we ought to discard a solution to our societal woes for not working perfectly and immediately, substituting instead remedies that have historically caused orders of magnitude more problems than they solved. The argument that there’s a core to the Enlightenment that some of its representatives have been faithless to when they committed atrocities may seem reminiscent of apologies for Christianity based on the fact that Crusaders and Inquisitors weren’t loving their neighbors as Christ enjoined. The difference is that the Enlightenment works—in just a few centuries it’s transformed the world and brought about a reduction in violence no religion has been able to match in millennia. If anything, the big monotheistic religions brought about more violence.

Embracing Enlightenment morality or classical liberalism doesn’t mean we should give up our efforts to make the world a better place. As Pinker describes the transformation he hopes to encourage with Better Angels,

As one becomes aware of the decline of violence, the world begins to look different. The past seems less innocent; the present less sinister. One starts to appreciate the small gifts of coexistence that would have seemed utopian to our ancestors: the interracial family playing in the park, the comedian who lands a zinger on the commander in chief, the countries that quietly back away from a crisis instead of escalating to war. The shift is not toward complacency: we enjoy the peace we find today because people in past generations were appalled by the violence in their time and worked to reduce it, and so we should work to reduce the violence that remains in our time. Indeed, it is a recognition of the decline of violence that best affirms that such efforts are worthwhile. (xxvi)

Since our task for the remainder of this century is to extend the reach of science, literacy, and the recognition of universal human rights farther and farther along the Enlightenment gradient until they're able to grant the same increasing likelihood of a long peaceful life to every citizen of every nation of the globe, and since the key to accomplishing this task lies in fomenting future Rights Revolutions while at the same time recognizing, so as to be better equipped to rein in, our drive for dominance as manifested in our more deadly moral instincts, I for one am glad Steven Pinker has the courage to violate so many of the outrageously counterproductive postmodern taboos while having the grace to resist succumbing himself, for the most part, to the temptation of self-righteousness.




Sabbath Says: Philip Roth and the Dilemmas of Ideological Castration

            Sabbath’s Theater is the type of book you lose friends over. Mickey Sabbath, the adulterous title character who follows in the long literary line of defiantly self-destructive, excruciatingly vulnerable, and offputtingly but eloquently lustful leading males like Holden Caulfield and Humbert Humbert, strains the moral bounds of fiction and compels us to contemplate the nature of our own voyeuristic impulse to see him through to the end of the story—and not only contemplate it but defend it, as if in admitting we enjoy the book, find its irreverences amusing, and think that in spite of how repulsive he often is there still might be something to be said for poor old Sabbath we’re confessing to no minor offense of our own. Fans and admiring critics alike can’t resist rushing to qualify their acclaim by insisting they don’t condone his cheating on both of his wives, the seduction of a handful of his students, his habit of casually violating others’ privacy, his theft, his betrayal of his lone friend, his manipulations, his racism, his caustic, often cruelly precise provocations—but by the time they get to the end of Sabbath’s debt column it’s a near certainty any list of mitigating considerations will fall short of getting him out of the red. Sabbath, once a puppeteer who now suffers crippling arthritis, doesn’t seem like a very sympathetic character, and yet we sympathize with him nonetheless. In his wanton disregard for his own reputation and his embrace, principled in a way, of his own appetites, intuitions, and human nastiness, he inspires a fascination none of the literary nice guys can compete with. So much for the argument that the novel is a morally edifying art form.

            Thus, in Sabbath, Philip Roth has created a character both convincing and compelling who challenges a fundamental—we may even say natural—assumption about readers’ (or viewers’) role in relation to fictional protagonists, one made by everyone from the snarky authors of even the least sophisticated Amazon.com reviews to the theoreticians behind the most highfalutin academic criticism—the assumption that characters in fiction serve as vehicles for some message the author created them to convey, or which some chimerical mechanism within the “dominant culture” created to serve as agents of its own proliferation. The corollary is that the task of audience members is to try to decipher what the author is trying to say with the work, or what element of the culture is striving to perpetuate itself through it. If you happen to like the message the story conveys, or agree with it at some level, then you recommend the book and thus endorse the statement. Only rarely does a reviewer realize or acknowledge that the purpose of fiction is not simply to encourage readers to behave as the protagonists behave or, if the tale is a cautionary one, to expect the same undesirable consequences should they choose to behave similarly. Sabbath does in fact suffer quite a bit over the course of the novel, and much of that suffering comes as a result of his multifarious offenses, so a case can be made on behalf of Roth’s morality. Still, we must wonder if he really needed to write a story in which the cheating husband is abandoned by both of his wives to make the message sink in that adultery is wrong—especially since Sabbath doesn’t come anywhere near to learning that lesson himself. “All the great thoughts he had not reached,” Sabbath muses in the final pages, “were beyond enumeration; there was no bottom to what he did not have to say about the meaning of his life” (779).

           Part of the reason we can’t help falling back on the notions that fiction serves a straightforward didactic purpose and that characters should be taken as models, positive or negative, for moral behavior is that our moral emotions are invariably and automatically engaged by stories; indeed, what we usually mean when we say we got into a story is that we were in suspense as we anticipated whether the characters ultimately met with the fates we felt they deserved. We reflexively size up any character the author introduces the same way we assess the character of a person we’re meeting for the first time in real life. For many readers, the question of whether a novel is any good is interchangeable with the question of whether they liked the main characters, assuming they fare reasonably well in the culmination of the plot. If an author like Roth evinces an attitude drastically different from ours toward a character of his own creation like Sabbath, then we feel that in failing to condemn him, in holding him up as a model, the author is just as culpable as his character. In a recent edition of PBS’s American Masters devoted to Roth, for example, Jonathan Franzen, a novelist himself, describes how even he couldn’t resist responding to his great forebear’s work in just this way. “As a young writer,” Franzen recalls, “I had this kind of moralistic response of ‘Oh, you bad person, Philip Roth’” (54:56).

Jonathan Franzen
            That fiction’s charge is to strengthen our preset convictions through a process of narrative tempering, thus catering to our desire for an orderly calculus of just deserts, serves as the basis for a contract between storytellers and audiences, a kind of promise on which most commercial fiction delivers with a bang. And how many of us have wanted to throw a book out of the window when we felt that promise had been broken? The goal of professional and academic critics, we may imagine, might be to ease their charges into an appreciation of more complex narrative scenarios enacted by characters who escape easy categorization. But since scholarship in the humanities, and in literary criticism especially, has been in a century-long sulk over the greater success of science and the greater renown of scientists, professors of literature have scarcely even begun to ponder what a valid answer to the questions of how fiction works and how best to experience it might look like. Those who aren’t pouting in a corner about the ascendancy of science—but the Holocaust!—are stuck in the muck of the century-old pseudoscience of psychoanalysis. But the real travesty is that the most popular, politically inspired schools of literary criticism—feminism, Marxism, postcolonialism—actively preach the need to ignore, neglect, and deny the very existence of moral complexity in literature, violently displacing any appreciation of difficult dilemmas with crudely tribal formulations of good and evil.

            For those inculcated with a need to take a political stance with regard to fiction, the only important dynamics in stories involve the interplay of society’s privileged oppressors and their marginalized victims. In 1976, nearly twenty years before the publication of Sabbath’s Theater, the feminist critic Vivian Gornick lumped Roth together with Saul Bellow and Norman Mailer in an essay asking “Why Do These Men Hate Women?” because she took issue with the way women are portrayed in their novels. Gornick, following the methods standard to academic criticism, doesn’t bother devoting any space in her essay to inconvenient questions about how much we can glean about these authors from their fictional works or what it means that the case for her prosecution rests by necessity on a highly selective approach to quoting from those works. And this slapdash approach to scholarship is supposedly justified because she and her fellow feminist critics believe women are in desperate need of protection from the incalculable harm they assume must follow from such allegedly negative portrayals. In this concern for how women, or minorities, or some other victims are portrayed and how they’re treated by their notional oppressors—rich white guys—Gornick and other critics who make of literature a battleground for their political activism are making the same assumption about fiction’s straightforward didacticism as the most unschooled consumers of commercial pulp. The only difference is that the academics believe the message received by audiences is all that’s important, not the message intended by the author. The basis of this belief probably boils down to its obvious convenience.

Gornick
            In Sabbath’s Theater, the idea that literature, or art of any kind, is reducible to so many simple messages, and that these messages must be measured against political agendas, is dashed in the most spectacularly gratifying fashion. Unfortunately, the idea is so seldom scrutinized, and the political agendas are insisted on so implacably, clung to and broadcast with such indignant and prosecutorial zeal, that it seems not one of the critics, nor any of the authors, who were seduced by Sabbath was able to fully reckon with the implications of that seduction. Franzen, for instance, in a New Yorker article about fictional anti-heroes, dodges the issue as he puzzles over the phenomenon that “Mickey Sabbath may be a disgustingly self-involved old goat,” but he’s somehow still sympathetic. The explanation Franzen lights on is that

the alchemical agent by which fiction transmutes my secret envy or my ordinary dislike of “bad” people into sympathy is desire. Apparently, all a novelist has to do is give a character a powerful desire (to rise socially, to get away with murder) and I, as a reader, become helpless not to make that desire my own. (63)

If Franzen is right—and this chestnut is a staple of fiction workshops—then the political activists are justified in their urgency. For if we’re powerless to resist adopting the protagonist’s desires as our own, however fleetingly, then any impulse to victimize women or minorities must invade readers’ psyches at some level, conscious or otherwise. The simple fact, however, is that Sabbath has not one powerful desire but many competing desires, ones that shift as the novel progresses, and it’s seldom clear even to Sabbath himself what those desires are. (And is he really as self-involved as Franzen suggests? It seems to me rather that he compulsively tries to get into other people’s heads, reflexively imagining elaborate stories for them.)

            While we undeniably respond to virtuous characters in fiction by feeling anxiety on their behalf as we read about or watch them undergo the ordeals of the plot, and we just as undeniably enjoy seeing virtue rewarded alongside cruelty being punished—the goodies prevailing over the baddies—these natural responses do not necessarily imply that stories compel our interest and engage our emotions by providing us with models and messages of virtue. Stories aren’t sermons. In his interview for American Masters, Roth explained what a writer’s role is vis-à-vis social issues.

My job isn’t to be enraged. My job is what Chekhov said the job of an artist was, which is the proper presentation of the problem. The obligation of the writer is not to provide the solution to a problem. That’s the obligation of a legislator, a leader, a crusader, a revolutionary, a warrior, and so on. That’s not the goal or aim of a writer. You’re not selling it, and you’re not inviting condemnation. You’re inviting understanding. (59:41)

Chekhov
The crucial but overlooked distinction that characters like Sabbath—but none so well as Sabbath—bring into stark relief is the one between declarative knowledge on the one hand and moment-by-moment experience on the other. Consider for a moment how many books and movies we’ve all been thoroughly engrossed in for however long it took to read or watch them, only to discover a month or so later that we can’t remember even the broadest strokes of how their plots resolved themselves—much less what their morals might have been.

            The answer to the question of what the author is trying to say is that he or she is trying to give readers a sense of what it would be like to go through what the characters are going through—or what it would be like to go through it with them. In other words, authors are not trying to say anything; they’re offering us an experience, once-removed and simulated though it may be. This isn’t to say that these simulated experiences don’t engage our moral emotions; indeed, we’re usually only as engaged in a story as our moral emotions are engaged by it. The problem is that in real-time, in real life, political ideologies, psychoanalytic theories, and rigid ethical principles are too often the farthest thing from helpful. “Fuck the laudable ideologies,” Sabbath helpfully insists: “Shallow, shallow, shallow!” Living in a complicated society with other living, breathing, sick, cruel, saintly, conniving, venal, altruistic, deceitful, noble, horny humans demands not so much a knowledge of the rules as a finely honed body of skills—and our need to develop and hone these skills is precisely why we evolved to find the simulated experiences of fictional narratives both irresistibly fascinating and endlessly pleasurable. Franzen was right that desires are important, the desire to be a good person, the desire to do things others may condemn, the desire to get along with our families and friends and coworkers, the desire to tell them all to fuck off so we can be free, even if just for an hour, to breathe… or to fuck an intern, as the case may be. Grand principles offer little guidance when it comes to balancing these competing desires. This is because, as Sabbath explains, “The law of living: fluctuation. For every thought a counterthought, for every urge a counterurge” (518).

            Fiction then is not a conveyance for coded messages—how tedious that would be (how tedious it really is when writers make this mistake); it is rather a simulated experience of moral dilemmas arising from scenarios which pit desire against desire, conviction against reality, desire against conviction, reality against desire, in any and all permutations. Because these experiences are once-removed and, after all, merely fictional, and because they require our sustained attention, the dilemmas tend to play out in the vicinity of life’s extremes. Here’s how Sabbath’s Theater opens:

                        Either forswear fucking others or the affair is over.
            This was the ultimatum, the maddeningly improbable, wholly unforeseen ultimatum, that the mistress of fifty-two delivered in tears to her lover of sixty-four on the anniversary of an attachment that had persisted with an amazing licentiousness—and that, no less amazingly, had stayed their secret—for thirteen years. But now with hormonal infusions ebbing, with the prostate enlarging, with probably no more than another few years of semi-dependable potency still his—with perhaps not that much more life remaining—here at the approach of the end of everything, he was being charged, on pain of losing her, to turn himself inside out. (373)

The ethical proposition that normally applies in situations like this is that adultery is wrong, so don’t commit adultery. But these two have been committing adultery with each other for thirteen years already—do we just stop reading? And if we keep reading, maybe nodding once in a while as we proceed, cracking a few wicked grins along the way, does that mean we too must be guilty?
*******
Updike
            Much of the fiction written by male literary figures of the past generation, guys like Roth, Mailer, Bellow, and Updike, focuses on the morally charged dilemmas posed by infidelity, while their Gen X and millennial successors, led by guys like Franzen and David Foster Wallace, have responded to shifting mores—and a greater exposure to academic literary theorizing—by completely overhauling how these dilemmas are framed. Whereas the older generation framed the question as how can we balance the intense physical and spiritual—even existential—gratification of sexual adventure on the one hand with our family obligations on the other, for their successors the question has become how can we males curb our disgusting, immoral, intrinsically oppressive lusting after young women inequitably blessed with time-stamped and overwhelmingly alluring physical attributes. “The younger writers are so self-conscious,” Katie Roiphe writes in a 2009 New York Times essay, “so steeped in a certain kind of liberal education, that their characters can’t condone even their own sexual impulses; they are, in short, too cool for sex.” Roiphe’s essay, “The Naked and the Conflicted,” stands alongside a 2012 essay in The New York Review of Books by Elaine Blair, “Great American Losers,” as the best descriptions of the new literary trend toward sexually repressed and pathetically timid male leads. The typical character in this vein, Blair writes, “is the opposite of entitled: he approaches women cringingly, bracing for a slap.”

Katie Roiphe
            The writers in the new hipster cohort create characters who bury their longings layers-deep in irony because they’ve been assured the failure on the part of men of previous generations to properly check these same impulses played some unspecified role in the abysmal standing of women in society. College students can’t make it past their first semester without hearing about the evils of so-called objectification, but it’s nearly impossible to get a straight answer from anyone, anywhere, to the question of how objectification can be distinguished from normal, non-oppressive male attraction and arousal. Even Roiphe, in her essay lamenting the demise of male sexual virility in literature, relies on a definition of male oppression so broad that it encompasses even the most innocuous space-filling lines in the books of even the most pathetically diffident authors, writing that “the sexism in the work of the heirs apparent” of writers like Roth and Updike,

is simply wilier and shrewder and harder to smoke out. What comes to mind is Franzen’s description of one of his female characters in “The Corrections”: “Denise at 32 was still beautiful.” To the esteemed ladies of the movement I would suggest this is not how our great male novelists would write in the feminist utopia.

            How, we may ask, did it get to the point where acknowledging that a woman’s age influences her attractiveness qualifies a man for designation as a sexist? Blair, in her otherwise remarkably trenchant essay, lays the blame for our oversensitivity—though paranoia is probably a better word—at the feet of none other than those great male novelists themselves, or, as David Foster Wallace calls them, the Great Male Narcissists. She writes,

Because of the GMNs, these two tendencies—heroic virility and sexist condescension—have lingered in our minds as somehow yoked together, and the succeeding generations of American male novelists have to some degree accepted the dyad as truth. Behind their skittishness is a fearful suspicion that if a man gets what he wants, sexually speaking, he is probably exploiting someone.

That Roth et al. were sexist, condescending, disgusting, narcissistic—these are articles of faith for feminist critics. Yet when we consider how expansive the definitions of terms like sexism and misogyny have become—in practical terms, they both translate to: not as radically feminist as me—and the laughably low standard of evidence required to convince scholars of the accusations, female empowerment starts to look like little more than a reserved right to stand in self-righteous judgment of men for giving voice to and acting on desires anyone but the most hardened ideologue will agree are only natural.

             The effect on writers of this ever-looming threat of condemnation is that they either allow themselves to be silenced or they opt to participate in the most undignified of spectacles, peevishly sniping at their colleagues, falling all over themselves to be granted recognition as champions for the cause. Franzen, at least early in his career, was more the silenced type. Discussing Roth, he wistfully endeavors to give the appearance of having moved beyond his initial moralistic responses. “Eventually,” he says, “I came to feel as if that was coming out of an envy: like, wow, I wish I could be as liberated of worry about other people’s opinion of me as Roth is” (55:18). We have to wonder if his espousal of the reductive theory that sympathy for fictional characters is based solely on the strength of their desires derives from this same longing for freedom to express his own. David Foster Wallace, on the other hand, wasn’t quite as enlightened or forgiving when it came to his predecessors. Here’s how he explains his distaste for a character in one of Updike’s novels, openly intimating the author’s complicity:

D.F. Wallace
It’s that he persists in the bizarre adolescent idea that getting to have sex with whomever one wants whenever one wants is a cure for ontological despair. And so, it appears, does Mr. Updike—he makes it plain that he views the narrator’s impotence as catastrophic, as the ultimate symbol of death itself, and he clearly wants us to mourn it as much as Turnbull does. I’m not especially offended by this attitude; I mostly just don’t get it. Erect or flaccid, Ben Turnbull’s unhappiness is obvious right from the book’s first page. But it never once occurs to him that the reason he’s so unhappy is that he’s an asshole.

So the character is an asshole because he wants to have sex outside of marriage, and he’s unhappy because he’s an asshole, and it all traces back to the idea that having sex with whomever one wants is a source of happiness? Sounds like quite the dilemma—and one that pronouncing the main player an asshole does nothing to solve. This passage is the conclusion to a review in which Wallace tries to square his admiration for Updike’s writing with his desire to please a cohort of women readers infuriated by the way Updike writes about—portrays—women (which raises the question of why they’d read so many of his books). The troubling implication of his compromise is that if Wallace were himself to freely express his sexual feelings, he’d be open to the charge of sexism too—he’d be an asshole. Better to insist he simply doesn’t “get” why indulging his sexual desires might alleviate his “ontological despair.” What would Mickey Sabbath make of the fact that Wallace hanged himself when he was only forty-six, eleven years after publishing that review? (This isn’t just a nasty rhetorical point; Sabbath has a fascination with artists who commit suicide.)

The inadequacy of moral codes and dehumanizing ideologies when it comes to guiding real humans through life’s dilemmas, along with their corrosive effects on art, is the abiding theme of Sabbath’s Theater. One of the pivotal moments in Sabbath’s life is when a twenty-year-old student he’s in the process of seducing leaves a tape recorder out to be discovered in a ladies’ room at the university. The student, Kathy Goolsbee, has recorded a phone sex session between her and Sabbath, and when the tape finds its way into the hands of the dean, it becomes grounds for the formation of a committee of activists against the abuse of women. At first, Kathy doesn’t realize how bad things are about to get for Sabbath. She even offers to give him a blow job as he berates her for her carelessness. Trying to impress on her the situation’s seriousness, he says,

Your people have on tape my voice giving reality to all the worst things they want the world to know about men. They have a hundred times more proof of my criminality than could be required by even the most lenient of deans to drive me out of every decent antiphallic educational institution in America. (586)

The committee against Sabbath proceeds to make the full recorded conversation available through a call-in line (the nineties equivalent of posting the podcast online). But the conversation itself isn’t enough; one of the activists gives a long introduction, which concludes,

The listener will quickly recognize how by this point in his psychological assault on an inexperienced young woman, Professor Sabbath has been able to manipulate her into thinking that she is a willing participant. (567-8)

Sabbath knows full well that even consensual phone sex can be construed as a crime if doing so furthers the agenda of those “esteemed ladies of the movement” Roiphe addresses. 

Reading through the lens of a tribal ideology ineluctably leads to the refraction of reality beyond recognition, and any aspiring male writer quickly learns in all his courses in literary theory that the criteria for designation as an enemy to the cause of women are pretty much whatever the feminist critics fucking say they are. Wallace wasn’t alone in acquiescing to feminist rage by denying his own boorish instincts. Roiphe describes the havoc this opportunistic antipathy toward male sexuality wreaks in the minds of male writers and their literary creations:

Rather than an interest in conquest or consummation, there is an obsessive fascination with trepidation, and with a convoluted, postfeminist second-guessing. Compare [Benjamin] Kunkel’s tentative and guilt-ridden masturbation scene in “Indecision” with Roth’s famous onanistic exuberance with apple cores, liver and candy wrappers in “Portnoy’s Complaint.” Kunkel: “Feeling extremely uncouth, I put my penis away. I might have thrown it away if I could.” Roth also writes about guilt, of course, but a guilt overridden and swept away, joyously subsumed in the sheer energy of taboo smashing: “How insane whipping out my joint like that! Imagine what would have been had I been caught red-handed! Imagine if I had gone ahead.” In other words, one rarely gets the sense in Roth that he would throw away his penis if he could.

And what good comes of an ideology that encourages the psychological torture of bookish young men? It’s hard to distinguish the effects of these so-called literary theories from the hellfire scoldings delivered from the pulpits of the most draconian and anti-humanist religious patriarchs. Do we really need to ideologically castrate all our male scholars to protect women from abuse and further the cause of equality?
*****
The experience of sexual relations between older teacher and younger student in Sabbath’s Theater is described much differently when the gender activists have yet to get involved—and not just by Sabbath but by Kathy as well. “I’m of age!” she protests as he chastises her for endangering his job and opening him up to public scorn; “I do what I want” (586). Absent the committee against him, Sabbath’s impression of how his affairs with his students impact them reflects the nuance of feeling inspired by these experimental entanglements, the kind of nuance that the “laudable ideologies” can’t even begin to capture.

There was a kind of art in his providing an illicit adventure not with a boy of their own age but with someone three times their age—the very repugnance that his aging body inspired in them had to make their adventure with him feel a little like a crime and thereby give free play to their budding perversity and to the confused exhilaration that comes of flirting with disgrace. Yes, despite everything, he had the artistry still to open up to them the lurid interstices of life, often for the first time since they’d given their debut “b.j.” in junior high. As Kathy told him in that language which they all used and which made him want to cut their heads off, through coming to know him she felt “empowered.” (566)

Opening up “the lurid interstices of life” is precisely what Roth and the other great male writers—all great writers—are about. If there are easy answers to the questions of what characters should do, or if the plot entails no more than a simple conflict between a blandly good character and a blandly bad one, then the story, however virtuous its message, will go unattended.

            But might there be too much at stake for us impressionable readers to be allowed free rein to play around in imaginary spheres peopled by morally dubious specters? After all, if denouncing the dreamworlds of privileged white men, however unfairly, redounds to the benefit of women and children and minorities, then perhaps it’s to the greater good. In fact, though, marked decreases in actual violence and in the abuse of women have occurred right alongside the trend toward ever more available and ever more graphic media portrayals of sex and violence. And does anyone really believe it’s the least literate, least media-saturated societies that are the kindest to women? The simple fact is that the theory of literature subtly encouraging oppression can’t be valid. But the problem is that once ideologies are institutionalized, once a threshold number of people depend on their perpetuation for their livelihoods, people whose scholarly work and reputations are staked on them, then victims of oppression will be found, their existence insisted on, regardless of whether they truly exist.

In another scandal, one Sabbath was embroiled in long before his flirtation with Kathy Goolsbee, he was brought up on charges of indecency because in the course of a street performance he’d exposed a woman’s nipple. The woman herself, Helen Trumbull, maintains from the outset of the imbroglio that whatever Sabbath had done, he’d done it with her consent—just as will be the case with his “psychological assault” on Kathy. But even as Sabbath sits assured that the case against him will collapse once the jury hears the supposed victim testify on his behalf, the prosecution takes a bizarre turn:
  
In fact, the victim, if there even is one, is coming this way, but the prosecutor says no, the victim is the public. The poor public, getting the shaft from this fucking drifter, this artist. If this guy can walk along a street, he says, and do this, then little kids think it’s permissible to do this, and if little kids think it’s permissible to do this, then they think it’s permissible to blah blah banks, rape women, use knives. If seven-year-old kids—the seven nonexistent kids are now seven seven-year-old kids—are going to see that this is fun and permissible with strange women… (663-4)

Here we have Roth’s dramatization of the fundamental conflict between artists and moralists. Even if no one is directly hurt by playful scenarios, the idea that they carry a message, one that threatens to corrupt susceptible minds, seems so obvious it’s all but impossible to refute. Since the audience for art is “the public,” the acts of depravity and degradation it depicts are, if anything, even more fraught with moral and political peril than any offense against an individual victim, real or imagined.  

            This theme of the oppressive nature of ideologies devised to combat oppression, the victimizing proclivity of movements originally fomented to protect and empower victims, is most directly articulated by a young man named Donald, dressed in all black and sitting atop a file cabinet in a nurse’s station when Sabbath happens across him at a rehab clinic. Donald “vaguely resembled the Sabbath of some thirty years ago,” and Sabbath will go on to apologize for interrupting him, referring to him as “a man whose aversions I wholeheartedly endorse.” What he was saying before the interruption:

“Ideological idiots!” proclaimed the young man in black. “The third great ideological failure of the twentieth century. The same stuff. Fascism. Communism. Feminism. All designed to turn one group of people against another group of people. The good Aryans against the bad others who oppress them. The good poor against the bad rich who oppress them. The good women against the bad men who oppress them. The holder of ideology is pure and good and clean and the other wicked. But do you know who is wicked? Whoever imagines himself to be pure is wicked! I am pure, you are wicked… There is no human purity! It does not exist! It cannot exist!” he said, kicking the file cabinet for emphasis. “It must not and should not exist! Because it’s a lie. … Ideological tyranny. It’s the disease of the century. The ideology institutionalizes the pathology. In twenty years there will be a new ideology. People against dogs. The dogs are to blame for our lives as people. Then after dogs there will be what? Who will be to blame for corrupting our purity?” (620-1)

It’s noteworthy that this rant is made by a character other than Sabbath. By this point in the novel, we know Sabbath wouldn’t speak so artlessly—unless he was really frightened or angry. As effective and entertaining an indictment of “Ideological tyranny” as Sabbath’s Theater is, we shouldn’t expect to encounter anywhere in a novel by a storyteller as masterful as Roth a character operating as a mere mouthpiece for some argument. Even Donald himself, Sabbath quickly gleans, isn’t simply spouting off; he’s trying to impress one of the nurses.

            And it’s not just the political ideologies that conscript complicated human beings into simple roles as oppressors and victims. The pseudoscientific psychological theories that both inform literary scholarship and guide many non-scholars through life crises and relationship difficulties function according to the same fundamental dynamic of tribalism; they simply substitute abusive family members for more generalized societal oppression and distorted or fabricated crimes committed in the victim’s childhood for broader social injustices. Sabbath is forced to contend with this particular brand of depersonalizing ideology because his second wife, Roseanna, picks it up through her AA meetings, and then becomes further enmeshed in it through individual treatment with a therapist named Barbara. Sabbath, who considers himself a failure, and who is carrying on an affair with the woman we meet in the opening lines of the novel, is baffled as to why Roseanna would stay with him. Her therapist provides an answer of sorts.

But then her problem with Sabbath, the “enslavement,” stemmed, according to Barbara, from her disastrous history with an emotionally irresponsible mother and a violent alcoholic father for both of whom Sabbath was the sadistic doppelganger. (454)

Roseanna’s father was a geology professor who hanged himself when she was a young teenager. Sabbath is a former puppeteer with crippling arthritis. Naturally, he’s confused by the purported identity of roles.

These connections—between the mother, the father, and him—were far clearer to Barbara than they were to Sabbath; if there was, as she liked to put it, a “pattern” in it all, the pattern eluded him.

In the midst of a shouting match, Sabbath tells his wife, “As for the ‘pattern’ governing a life, tell Barbara it’s commonly called chaos” (455). When she protests, “You are shouting at me like my father,” Sabbath asserts his individuality: “The fuck that’s who I’m shouting at you like! I’m shouting at you like myself!” (459). Whether you see his resistance as heroic or not probably depends on how much credence you give to those psychological theories.

            From the opening lines of Sabbath’s Theater, where we’re presented with the dilemma of the teary-eyed mistress demanding monogamy in their adulterous relationship, the simple response would be to stand in easy judgment of Sabbath and, as Wallace did with Updike’s character, declare him an asshole. It’s clear that he loves this woman, a Croatian immigrant named Drenka, a character who at points steals the show even from the larger-than-life protagonist. And it’s clear his fidelity would mean a lot to her. Is his freedom to fuck other women really so important? Isn’t he just being selfish? But only a few pages later our easy judgment suddenly gets more complicated:

As it happened, since picking up Christa several years back Sabbath had not really been the adventurous libertine Drenka claimed she could no longer endure, and consequently she already had the monogamous man she wanted, even if she didn’t know it. To women other than her, Sabbath was by now quite unalluring, not just because he was absurdly bearded and obstinately peculiar and overweight and aging in every obvious way but because, in the aftermath of the scandal four years earlier with Kathy Goolsbee, he’d become more dedicated than ever to marshaling the antipathy of just about everyone as though he were, in fact, battling for his rights. (394)

Christa was a young woman who participated in a threesome with Sabbath and Drenka, an encounter to which Sabbath’s only tangible contribution was to hand the younger woman a dildo.

            One of the central dilemmas for a character who loves the thrill of sex, who seeks in it a rekindling of youthful vigor—“the word’s rejuvenation,” Sabbath muses at one point (517)—is that the adrenaline boost born of being in the wrong and the threat of getting caught, what Roiphe calls “the sheer energy of taboo smashing,” becomes ever more indispensable as libido wanes with age. Even before Sabbath ever had to contend with the ravages of aging, he reveled in this added exhilaration that attends any expedition into forbidden realms. What makes Drenka so perfect for him is that she has not just a similarly voracious appetite but a similar fondness for outrageous sex and the smashing of taboo. And it’s this mutual celebration of the verboten that Sabbath is so reluctant to relinquish. Of Drenka, he thinks,

The secret realm of thrills and concealment, this was the poetry of her existence. Her crudeness was the most distinguishing force in her life, lent her life its distinction. What was she otherwise? What was he otherwise? She was his last link with another world, she and her great taste for the impermissible. As a teacher of estrangement from the ordinary, he had never trained a more gifted pupil; instead of being joined by the contractual they were interconnected by the instinctual and together could eroticize anything (except their spouses). Each of their marriages cried out for a countermarriage in which the adulterers attack their feelings of captivity. (395)

Those feelings of captivity, the yearnings to experience the flow of the old juices, are anything but adolescent, whatever Wallace suggests; adolescents have a few decades before they have to worry about dwindling arousal. Most of them have the opposite problem.

            The question of how readers are supposed to feel about a character like Sabbath doesn’t have any simple answers. He’s an asshole at several points in the novel, but at several points he’s not. One of the reasons he’s so compelling is that working out what our response to him should be poses a moral dilemma of its own. Whether or not we ultimately decide that adultery is always and everywhere wrong, the experience of being privy to Sabbath’s perspective can help us prepare ourselves for our own feelings of captivity, lusting nostalgia, and sexual temptation. Most of us will never find ourselves in a dilemma like the one Sabbath gets himself tangled in with his friend Norman’s wife, for instance, but it would be to our detriment to automatically discount the old hornball’s insights.

He could discern in her, whenever her husband spoke, the desire to be just a little cruel to Norman, saw her sneering at the best of him, at the very best things in him. If you don’t go crazy because of your husband’s vices, you go crazy because of his virtues. He’s on Prozac because he can’t win. Everything is leaving her except for her behind, which her wardrobe informs her is broadening by the season—and except for this steadfast prince of a man marked by reasonableness and ethical obligation the way others are marked by insanity or illness. Sabbath understood her state of mind, her state of life, her state of suffering: dusk is descending, and sex, our greatest luxury, is racing away at a tremendous speed, everything is racing off at a tremendous speed and you wonder at your folly in having ever turned down a single squalid fuck. You’d give your right arm for one if you are a babe like this. It’s not unlike the Great Depression, not unlike going broke overnight after years of raking it in. “Nothing unforeseen that happens,” the hot flashes inform her, “is likely ever again going to be good.” Hot flashes mockingly mimicking the sexual ecstasies. Dipped, she is, in the very fire of fleeting time. (651)

Welcome to messy, chaotic, complicated life.

            Sabbath’s Theater is, in part, Philip Roth’s raised middle finger to the academic moralists whose idiotic and dehumanizing ideologies have spread like a cancer into all the venues where literature is discussed and all the avenues through which it’s produced. Unfortunately, the unrecognized need for culture-wide chemotherapy hasn’t gotten any less dire in the nearly two decades since the novel was published. With literature now drowning in the devouring tide of new media, the tragic course set by the academic custodians of art toward bloodless prudery and impotent sterility in the name of misguided political activism promises only to ensure the ever greater obsolescence of epistemologically doomed and resoundingly pointless theorizing. It makes of college courses the places where you go to become, at best, profoundly confused about where you should stand in relation to fiction and fictional characters, and, at worst, a self-righteous demagogue denouncing the chimerical evils allegedly encoded into every text or cultural artifact. All the conspiracy theorizing about the latent evil urgings of literature has amounted to little more than another reason not to read, another reason to tune in to Breaking Bad or Mad Men instead. But the only reason Roth’s novel makes such a successful case is that it at no point allows itself to be reducible to a mere case, just as Sabbath at no point allows himself to be conscripted as a mere argument. We don’t love or hate him; we love and hate him. But we sort of just love him because he leaves us free to do both as we experience his antics, once removed and simulated, but still just as complicatedly eloquent in their message of “Fuck the laudable ideologies”—or not, as the case may be. 


Also read Let's Play Kill Your Brother: Fiction as a Moral Dilemma Game.

And Stories, Social Proof, and Our Two Selves.

And Can't Win for Losing: Why There Are So Many Losers in Literature and Why It Has to Change.

The STEM Fempire Strikes Back: Feminists’ Desperate Search for Smoking Guns

Sean Carroll, Your Disdain for Your Own Readers is Showing
            Bad news: lots of research points to the inescapable conclusion that you, Dear Reader, whether you’re a man or a woman, are a sexist. You may be inclined to reject this label. You may even try to insist that you don’t in fact believe that one sex is inferior to the other. But it doesn’t matter, because the research suggests that what you claim to believe about the relative statuses of the genders doesn’t align with how quickly you attach positive or negative labels to pictures of women and men in a task called the Implicit Association Test. Your sexism is “subtle,” “implicit,” “unconscious.” If this charge irks you or if you feel it’s completely unfair, that probably means you’re even more of a sexist than we might have originally assumed. You can try to find fault with the research that demonstrates you’re a sexist, or offer alternative interpretations of the findings, but why would you do that unless you’re a sexist and trying to cover it up—unless, that is, you’re secretly harboring and seeking to rationalize hostile feelings toward women? Sexism is like original sin. It’s in us whether we like it or not, so we must work hard to avoid succumbing to it. We must abase ourselves before the altar of gender equality.
At least, this is what the feminists involved in the controversy over women’s underrepresentation in STEM fields—the STEM fems—would have us believe. Responding to the initial fifty-eight comments to his blog post “Scientists, Your Gender Bias is Showing,” in which he discusses a new study that found significant bias in ratings of competence and hireability depending on the sex of unseen applicants to a lab manager’s position, physicist Sean Carroll ungraciously—you might even say unbecomingly—writes, “At least the trolls have moved on from ‘there is no discrimination’ to ‘discrimination is rationally justified.’ Progress!”
By Carroll’s accounting, I am a troll (by mine, he’s a toady) because I happen to believe gender-based discrimination accounts for a very modest portion of career segregation and pay differentials in industrialized societies—and it may not account for any. And, this latest study notwithstanding, nearly all the available evidence suggests the underrepresentation of women in STEM fields is based on the fact that men and women, on average, prefer to pursue different types of careers. Indeed, the study Carroll so self-righteously trumpets, which didn’t track any actual hirings but only asked participants to treat application materials hypothetically, may have produced the findings it did because its one hundred and twenty-seven participants were well aware of these different preferences.
The underrepresentation of women in science, technology, engineering, and mathematics fields is taken by feminists as self-evident proof of discrimination. Since most people who work or teach in these areas understand that sexism is wrong—or at least recognize that it’s thought to be wrong by an influential if possibly misguided majority—not many of them openly admit to deliberately discriminating against women. Yet the underrepresentation continues, ergo the discrimination still exists. That’s why in the past decade there’s been so much discussion of unacknowledged or unconscious bias. Anyone who points out that there is another possible explanation—women and men are essentially (in the statistical sense) different—is accused of being a biological determinist, being a misogynist, having a reactionary political agenda, or some combination of the three.
Now, “essentially different” isn’t all that far from “naturally different,” which is of course part of the formula for sexism, since the belief that one sex is inferior assumes they are somehow inherently different. (I’m excluding genders besides male and female not as a statement but for simplicity’s sake.) But the idea that the sexes tend to be different need not imply either is inferior. Historically, women were considered less intelligent by most men (fewer records exist of what women thought of men), but most educated people today realize this isn’t the case. The important differences are in what men and women tend to find interesting and in what types of careers they tend to prefer (note the word “tend”).
So we have two rival theories. The STEM fems explain career segregation and pay gaps with the theory of latent sexism and rampant discrimination. My fellow trolls and I explain them with the theory that women disproportionately prefer careers focusing on people as opposed to objects and abstractions, while also prioritizing time with family over the achievement of higher rank and higher pay. The fems believe that gender roles, including those associated with career trajectories, are a bad thing, that they limit freedom, and that they are imposed on people, sometimes violently, by a patriarchal society. We preference theory folk, on the other hand, believe that gender begins with individuals, that it is expressed and enacted freely, and that the structure of advanced civilizations, including career segregation and a somewhat regular division of labor with regard to family roles, emerges from the choices and preferences of these individuals.
The best case that can be made for the feminist theory is historical. In the past, women were forbidden to work in certain careers. They were kept out of higher education. They were tethered with iron bonds to their children and their husbands’ homes. Men, meanwhile, had to live with the same type of rigid gender definitions, but at least they had some freedom to choose their careers, could count on their wives tending to the children, and enjoyed the highest position of authority in their families. So we can assume, the reasoning goes, that when we look at society today and find income inequality and segregation what we’re seeing is a holdover from this patriarchal system of the past. From this perspective, the idea that the different outcomes for each gender could possibly emerge from choices freely made is anathema because it seems similar to the rationalizations for the rigid roles of yore. Women naturally want to be mothers and homemakers? Anyone who dares make such a claim belongs in the 1950s, right?  
Though this take on history is a bit of a caricature (class differences were much more significant than gender ones), it has been easy, until recently, to take as self-evident the notion that gender roles erode in lockstep with the advance of civilization toward ever greater individual freedom for ever greater numbers.  Still, tying modern preference theory to policies of the past is nothing but evil rhetoric (almost as evil as accusations of unconscious thought crimes). No one wants to bring back educational and professional barriers to women. The question is whether in the absence of those barriers career segregation and differences in income between the genders will disappear altogether or if women will continue to disproportionally occupy certain professions and continue to spend more time on average with their families than men.
Catherine Hakim, a former Senior Research Fellow at the London School of Economics, and the mind behind preference theory, posits that economic sex differences emerge from what she calls work-life preferences. She has devised three categories that can be used to describe individuals: work-centered people prioritize their careers, adaptive people try to strike some kind of balance between employment and family work, and home- or family-centered people prefer to give priority to private or family life after they get married. In all the western democracies that have been surveyed, most men but only a small minority of women fit into the work-centered category, while the number of women who are home-centered drastically exceeds the number of men. This same pattern emerges in the US even in samples restricted to elite math and science students. In 2001, David Lubinski and his colleagues reported that in their surveys of high-achieving students 31% of females said that working part-time for some limited period in their careers was either important or extremely important, compared to only 9% of males. Nineteen percent of the females said the same about a permanent part-time career, compared to 9% for males.
Careers in science and math are notoriously demanding. You have to be a high achiever and a fierce competitor to even be considered for a position, so the fact that men disproportionately demonstrate work-centered priorities goes some way toward explaining the underrepresentation of women. Another major factor that researchers have identified is that women and men tend to be interested in different types of careers, with women preferring jobs that focus on people and men preferring those that focus on things. A 2009 meta-analysis carried out by Rong Su, James Rounds, and Patrick Ian Armstrong compiled data from over 500,000 surveys of vocational interests and found that gender differences on the Things-People dimension produce an effect size that is probably larger than any other in research on gender and personality. Once differences in work-life preferences and vocational interests are taken into consideration, there is probably very little left to explain.
Feminism is a social movement that has many admirable goals, most of which I share. But it is also an ideology that has a fitful relationship with science. Unfortunately, the growing body of evidence that gender segregation and pay gaps emerge from choices freely made by individuals based on preferences that fit reliable patterns in societies all over the world hasn’t done much to end the furor over discrimination. The study on that front that Sean Carroll insists is so damning, “Science Faculty’s Subtle Gender Biases Favor Male Students,” by Corinne A. Moss-Racusin, John F. Dovidio, Victoria L. Brescoll, Mark J. Graham, and Jo Handelsman, is the most substantial bit of actual evidence the STEM fems have been able to marshal in support of their cause in some time. Covering the study in her Scientific American blog, Ilana Yurkiewicz writes,
Whenever the subject of women in science comes up, there are people fiercely committed to the idea that sexism does not exist. They will point to everything and anything else to explain differences while becoming angry and condescending if you even suggest that discrimination could be a factor. But these people are wrong. This data shows they are wrong. And if you encounter them, you can now use this study to inform them they’re wrong. You can say that a study found that absolutely all other factors held equal, females are discriminated against in science. Sexism exists. It’s real. Certainly, you cannot and should not argue it’s everything. But no longer can you argue it’s nothing.
What this rigorous endorsement reveals is that prior to Moss-Racusin et al.’s study there was only weak evidence backing up the STEM fems’ conviction that sexism was rampant in science departments all over the country and the world. You can also see that Yurkiewicz takes this debate very personally. It’s really important to her that women who complain about discrimination be vindicated. I suppose that makes sense, but I wonder if she realizes that the point she’s so desperately trying to prove is intrinsically insulting to her male colleagues—to all male scientists. I also wonder if in any other scientific debate she would be so quick to declare the matter settled based on a single study that sampled only 127 individuals. 
The preference theorists have some really good reasons to be skeptical of the far-reaching implications many are claiming for the study. Most importantly, the authors’ conclusions contradict the findings of a much larger study that measured the key variables more directly. In 2010, the National Academies Press published the findings of a task force that was
asked by Congress to ‘conduct a study to assess gender differences in the careers of science, engineering, and mathematics (SEM) faculty, focusing on four-year institutions of higher education that award bachelor’s and graduate degrees. The study will build on the National Academies’ previous work and examine issues such as faculty hiring, promotion, tenure, and allocation of institutional resources including (but not limited to) laboratory space. (VII)
The report, Gender Differences at Critical Transitions in the Careers of Science, Engineering, and Mathematics Faculty, surprised nearly everyone because it revealed no evidence of gender-based discrimination. After reviewing records for 500 academic departments and conducting surveys with 1,800 faculty members (a larger sample than Moss-Racusin et al.’s study by more than an order of magnitude), the National Academies committee concluded,
For the most part, male and female faculty in science, engineering, and mathematics have enjoyed comparable opportunities within the university, and gender does not appear to have been a factor in a number of important career transitions and outcomes. (Bolded in original, 153)
But the two studies were by no means identical, so it’s important to compare the specific findings of one to the other.
            Moss-Racusin and her colleagues sent application materials to experienced members of science faculties at research-intensive institutions. Sixty-three of the packets showed the name John and listed the sex as male; sixty-four had the name Jennifer and the sex as female. The study authors gave the participants the cover story that their answers to several questionnaire items about the applications would be used to develop a mentoring program for undergraduate science students. The questions focused on the applicant’s competence, hireability, and likeability, along with how likely the rater would be to mentor the applicant and how much the rater would offer to pay the applicant. The participants rating applications from females tended to give them better scores for likeability but lower ones for competence and hireability. The participants, whether male or female themselves, also showed less willingness to mentor females and indicated they would offer females lower salaries. So there you have it: the participants didn’t dislike the female applicants—they weren’t hostile or “old-fashioned” sexists. But you can see how women forced to deal with this type of bias might be discouraged. To me, the lower salary offers are the most striking. Even so, a difference in medians between $30,200 and $26,500 doesn’t seem that big when you consider that the overall spread was between $15,000 and $45,000, that there was no attempt to control for differences in average salary between universities, and that the sample size is really small.
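To see why that gap may be less dramatic than it first appears, here is a quick back-of-envelope calculation in Python using only the rounded figures cited above (a rough gauge of scale, not a substitute for a proper analysis of the study’s raw data):

```python
# Back-of-envelope look at the salary-offer gap, using only the rounded
# figures cited in this post (not recomputed from the study's raw data).
male_offer, female_offer = 30_200, 26_500   # reported central offers
low, high = 15_000, 45_000                  # reported overall spread of offers

gap = male_offer - female_offer
spread = high - low
print(f"gap: ${gap:,} ({gap / spread:.0%} of the ${spread:,} spread in offers)")
# -> gap: $3,700 (12% of the $30,000 spread in offers)
```

On those figures, the gap amounts to roughly an eighth of the range of offers the raters produced, which is why the absence of controls and the small sample matter so much here.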

            Moss-Racusin et al. also had the participants complete the Modern Sexism Scale, which was designed as an indirect measure of gender attitudes. On the supporting information page for the study, the authors describe the scale,
Items included: On average, people in our society treat husbands and wives equally; Discrimination against women is no longer a problem in the United States; and Over the past few years, the government and news media have been showing more concern about the treatment of women than is warranted by women’s actual experiences (α = 0.92). Items were averaged to form the gender attitudes scale, with higher numbers indicating more negative attitudes toward women.
Aside from the fact that it defines a lack of support for feminism as sexism (and the middle item, which bears directly on the third, is precisely the matter the study is attempting to treat empirically), this so-called sexism scale introduces the second of two possible confounds. The first is that the cover story may have encouraged many of the participants to answer even direct questions about their own responses as if they were answering questions about how they believed most other people in their position would answer them. And the second problem is that for obvious reasons it’s important that the participants not know the true purpose of the study, which the authors insist was “double-blind.” But we must wonder what conclusions the participants might have drawn about the researchers’ goals when they came across the “Modern Sexism Scale,” a really odd set of questions about the responders’ own views in a survey of their thoughts about an applicant.

           We also need to distinguish sexism—the belief that one sex is inferior—from biased behavior. Bias can be based on several factors besides sexism—but the feminists fail to acknowledge this. The authors of the study explain the (modest) difference in ratings for wholly imaginary applicants as the result of arbitrary, sexist stereotypes that have crept into people’s minds. (They of course ignore the sexist belief that men are less likeable—rightly so because the methods don't allow them to identify that belief.) The alternative explanation is that the bias is based on actual experiences with real people: the evaluators may have actually known more men who wanted lab management positions, more men who had successfully worked in that role, and/or more females who didn't work out in it. The conflating of sexism (or racism) with bias is akin to saying anyone who doesn't forget everything they’ve experienced with different types of people when making hiring decisions is guilty of perpetrating some injustice.
            In a live chat hosted on Science’s webpage, one of the study authors, Jo Handelsman, writes, “We know from a lot of research that people apply more bias in decision making when they have less information, so I think this type of quick review is the most prone to ‘gut level’ decisions, which are colored by bias.” Implicit or gut-level reactions are notoriously sensitive to things like the way questions are framed, the order in which information is presented, and seemingly irrelevant or inconsequential cues. This sensitivity makes complex results from studies of implicit associations extremely difficult to interpret. Handelsman and her colleagues tried to control for extraneous factors by holding the conditions of their study constant for all participants, with the sole difference being the name and sex on the forms. But if I’m a scientist who’s agreed to assess an application in a hypothetical hiring situation for the purpose of helping to design a mentoring program, I would very likely be primed to provide information that I believe might give the students who are the beneficiaries of the research some useful guidance. I might, for instance, want to give female scientists a heads-up about some of the obstacles they might encounter—especially if in the course of the survey I’m reminded of the oppression of wives by husbands, discrimination in society at large, and the fact that some people are so callous as to not even want to hear about how bad women have it.
Another possibility is that the omnipresent and inescapable insistence of STEM fems that sexism is rampant is actually creating some of the bias the studies by STEM fems then turn around and measure. Since Moss-Racusin et al. report that high scores on the so-called Modern Sexism Scale correlated with lower ratings for females’ competence and hireability, we have to ask if the study participants might have been worried about women primed to make excuses for themselves, or if they might have been reluctant to hire someone with an ideologically inspired chip on her shoulder who would be ready to cry gender discrimination at the first whiff of rough treatment. Such alternative interpretations may seem like special pleading. But the discrepancy between the findings of this study and those of the National Academies committee, which, again, were based on a sample that was more than ten times larger and measured the variables directly, calls out for an explanation.
Perhaps the most troubling implication of the study is that women applicants to scientific positions will be less likely to make it to the interview stage of the hiring process, so all the implicit stereotypes about women being less competent will never be overridden with more information. However, the National Academies committee found that in actuality, “The percentage of women who were interviewed for tenure-track or tenured positions was higher than the percentage of women who applied” (157). Unless we assume males tend to be worse candidates for some reason—sexism against men?—this finding rules out the possibility that women are discriminated against for interviews. Are the women who make it to the interview stage thought to be less competent and hireable than their male counterparts? According to the committee report, “For all disciplines the percentage of tenure-track women who received the first job offer was greater than the percentage in the interview pool.” This finding suggests that for some reason women are thought to be better, not worse, candidates for academic positions. If there’s any discrimination, it’s against men.
It could still be argued that the Moss-Racusin et al. study suggests that the reason fewer women apply for positions in science and math fields is that they get less encouragement to do so, because participants said they were less likely to mentor female applicants for a hypothetical position. But how do we square this finding with the National Academies’ finding that “Female tenure-track and tenured faculty reported that they were more likely to have mentors than male faculty. In the case of tenure-track faculty, 57 percent of women had mentors compared to 49 percent of men” (159)? Well, even if women are more successful at finding mentors, it could still be argued that they would be discouraged by offers of lower starting salaries. But how would they know, unless they read the study, that they can expect lower offers? And is it even true that women in science positions are paid less than men? In its review of the records of 500 academic departments, the National Academies study determined that “Men and women seem to have been treated equally when they were hired. The overall size of start-up packages and the specific resources of reduced initial teaching load, travel funds, and summer salary did not differ between male and female faculty” (158).
Real world outcomes seem to be completely at odds with the implications of the new study, and at odds too with the STEM fems’ insistence that discrimination accounts for a major portion of women’s underrepresentation in math and science careers. The National Academies study did, however, offer some strong support for preference theory. It turns out that women are more likely to turn down job offers, and the reason they cite is telling.
In 95 percent of the tenure-track and 100 percent of the tenured positions where a man was the first choice for a position, a man was ultimately hired. In contrast, in cases where a woman was the first choice, a woman was ultimately hired in only 70 percent of the tenure-track and 77 percent of the tenured positions. When faculty were asked what factors they considered when selecting their current position, the effect of gender was statistically significant for only one factor—“family-related reasons.”

The Moss-Racusin et al. study was probably conceived of as a response to another article published in the same journal, the Proceedings of the National Academy of Sciences, in February of 2011. In “Understanding Current Causes of Women’s Underrepresentation in Science,” authors Stephen Ceci and Wendy Williams examine evidence from a vast array of research and write, “We find the evidence for recent sex discrimination—when it exists—is aberrant, of small magnitude, and is superseded by larger, more sophisticated analyses showing no bias, or occasionally, bias in favor of women” (1-2). That Moss-Racusin et al.’s study will likewise be superseded seems quite likely—in fact, it already has been superseded by the NAS study. Ceci and Williams' main conclusion from their review is a good summary of preference theory:
Despite frequent assertions that women’s current underrepresentation in math-intensive fields is caused by sex discrimination by grant agencies, journal reviewers, and search committees, the evidence shows women fare as well as men in hiring, funding, and publishing (given comparable resources). That women tend to occupy positions offering fewer resources is not due to women being bypassed in interviewing and hiring or being denied grants and journal publications because of their sex. It is due primarily to factors surrounding family formation and childrearing, gendered expectations, lifestyle choices, and career preferences—some originating before or during adolescence. (5)
Moss-Racusin et al.’s study should not be summarily dismissed—that’s not what I’m arguing. It is suggestive, and the proverbial further studies should be conducted. But let’s not claim it’s more important than it really is just because it produced the results the STEM fems were hoping for. And let’s quit acting like every study that produces evidence of gender discrimination is a victory for the good guys. Far too many people assume that feminism can only be good for women and good for science. But if discrimination really doesn’t play that big a role for women in science—which everyone should acknowledge the current weight of evidence suggests is the case—the infusion of gender politics has the potential to cause real harm. The standing accusation of sexism may not in the end lead to better treatment of women—it may lead to resentment. And the suggestion that every male scientist is the beneficiary of unfair hiring practices will as likely as not lead to angry defiance and increasing tension.

           To succeed in the most elite fields, you have to be cut-throat. It would be surprising if science and math careers turned out to be peopled with the nicest, most accommodating individuals. Will the young woman scientist who has a run-in with a jerk frame the encounter as just that—a run-in with an individual who happens to be a jerk—or will she see it as a manifestation of patriarchal oppression? It seems to me the latter response embodies the same type of prejudice the STEM fems claim to be trying to end.

Read Catherine Hakim's Feminist Myths and Magic Medicine

And my series of posts on "Why I Am Not a Feminist"
 

A Crash Course in Multilevel Selection Theory part 2: Steven Pinker Falls Prey to the Averaging Fallacy Sober and Wilson Tried to Warn Him about

Read Part 1

            If you were a woman applying to graduate school at the University of California at Berkeley in 1973, you would have had a 35 percent chance of being accepted. If you were a man, your chances would have been significantly better. Forty-four percent of male applicants got accepted that year. Apparently, at this early stage of the feminist movement, even a school as notoriously progressive as Berkeley still discriminated against women. Not surprisingly, when confronted with these numbers, the women of the school were ready to take action to right the supposed injustice. After a lawsuit was filed charging admissions offices with bias, however, a department-by-department examination produced a curious finding: not a single department admitted a significantly higher percentage of men than women. In fact, there was a small but significant trend in the opposite direction—a bias against men.
What this means is that somehow the aggregate probability of being accepted into grad school was dramatically different from the probabilities worked out through disaggregating the numbers with regard to important groupings, in this case the academic departments housing the programs assessing the applications. This discrepancy called for an explanation, and statisticians had had one on hand since 1951.
This paradoxical finding fell into place when it was noticed that women tended to apply to departments with low acceptance rates. To see how this can happen, imagine that 90 women and 10 men apply to a department with a 30 percent acceptance rate. This department does not discriminate and therefore accepts 27 women and 3 men. Another department, with a 60 percent acceptance rate, receives applications from 10 women and 90 men. This department doesn’t discriminate either and therefore accepts 6 women and 54 men. Considering both departments together, 100 men and 100 women applied, but only 33 women were accepted, compared with 57 men. A bias exists in the two departments combined, despite the fact that it does not exist in any single department, because the departments contribute unequally to the total number of applicants who are accepted. (25)
This is how the counterintuitive statistical phenomenon known as Simpson’s Paradox is explained by philosopher Elliott Sober and biologist David Sloan Wilson in their 1998 book Unto Others: The Evolution and Psychology of Unselfish Behavior, in which they argue that the same principle can apply to the relative proliferation of organisms in groups with varying percentages of altruists and selfish actors. In this case, the benefit to the group of having more altruists is analogous to the higher acceptance rates for grad school departments which tend to receive a disproportionate number of applications from men. And the counterintuitive outcome is that, in an aggregated population of groups, altruists have an advantage over selfish actors—even though within each of those groups selfish actors outcompete altruists.  
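Sober and Wilson’s hypothetical numbers are easy to check for yourself. Here is a minimal Python sketch of their two-department example (the figures are theirs; only the code is mine), showing how an aggregate gap appears even though neither department treats the sexes differently:

```python
# Sober and Wilson's hypothetical two-department illustration of Simpson's Paradox.
# Each entry: (acceptance rate, women applicants, men applicants).
departments = [
    (0.30, 90, 10),  # selective department; mostly women apply
    (0.60, 10, 90),  # permissive department; mostly men apply
]

women_accepted = men_accepted = women_total = men_total = 0
for rate, women, men in departments:
    # Neither department discriminates: the same rate applies to both sexes.
    women_accepted += rate * women
    men_accepted += rate * men
    women_total += women
    men_total += men

print(f"Women: {women_accepted:.0f} of {women_total} accepted "
      f"({women_accepted / women_total:.0%})")
print(f"Men:   {men_accepted:.0f} of {men_total} accepted "
      f"({men_accepted / men_total:.0%})")
# -> Women: 33 of 100 accepted (33%)
# -> Men:   57 of 100 accepted (57%)
```

The aggregate disparity is produced entirely by where the applications go, not by any difference in standards, which is exactly the distinction the group-selection analogy turns on.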
            Sober and Wilson caution that this assessment is based on certain critical assumptions about the population in question. “This model,” they write, “requires groups to be isolated as far as the benefits of altruism are concerned but nevertheless to compete in the formation of new groups” (29). It also requires that altruists and nonaltruists somehow “become concentrated in different groups” (26) so the benefits of altruism can accrue to one while the costs of selfishness accrue to the other. One type of group that follows this pattern is a family, whose members resemble each other in terms of their traits—including a propensity for altruism—because they share many of the same genes. In humans, families tend to be based on pair bonds established for the purpose of siring and raising children, forming a unit that remains stable long enough for the benefits of altruism to be of immense importance. As the children reach adulthood, though, they disperse to form their own family groups. Therefore, assuming families live in a population with other families, group selection ought to lead to the evolution of altruism.
(Figure from p. 24 of Unto Others: the darker area, representing altruists, shrinks in both groups, but notice the right circle gets bigger.)
            Sober and Wilson wrote Unto Others to challenge the prevailing approach to solving mysteries in evolutionary biology, which was to focus strictly on competition between genes. In place of this exclusive focus on gene selection, they advocate a pluralistic approach that takes into account the possibility of selection occurring at multiple levels, from genes to individuals to groups. This is where the term multilevel selection comes from. In certain instances, focusing on one level instead of another amounts to a mere shift in perspective. Looking at families as groups, for instance, leads to many of the same conclusions as looking at them in terms of vehicles for carrying genes. William D. Hamilton, whose thinking inspired both Richard Dawkins’ Selfish Gene and E.O. Wilson’s Sociobiology, long ago explained altruism within families by setting forth the theory of kin selection, which posits that family members will at times behave in ways that benefit each other even at their own expense because the genes underlying the behavior don’t make any distinction between the bodies which happen to be carrying copies of themselves. Sober and Wilson write,
As we have seen, however, kin selection is a special case of a more general theory—a point that Hamilton was among the first to appreciate. In his own words, “it obviously makes no difference if altruists settle with altruists because they are related… or because they recognize fellow altruists as such, or settle together because of some pleiotropic effect of the gene on habitat preference.” We therefore need to evaluate human social behavior in terms of the general theory of multilevel selection, not the special case of kin selection. When we do this, we may discover that humans, bees, and corals are all group-selected, but for different reasons. (134)
A general proclivity toward altruism based on selection at the level of family groups may look somewhat different from kin-selected altruism targeted solely at those who are recognized as close relatives. For obvious reasons, the possibility of group selection becomes even more important when it comes to explaining the evolution of altruism among unrelated individuals.
            We have to bear in mind that Dawkins’s selfish genes are selfish only in the sense that they concern themselves with nothing but ensuring their own continued existence—by calling them selfish he never meant to imply they must always be associated with selfishness as a trait of the bodies they provide the blueprints for. Selfish genes, in other words, can sometimes code for altruistic behavior, as in the case of kin selection. So the question of what level selection operates on is much more complicated than it would be if the gene-focused approach predicted selfishness while the multilevel approach predicted altruism. But many strict gene selection advocates argue that because selfish gene theory can account for altruism in myriad ways there’s simply no need to resort to group selection. Evolution is, after all, changes over time in gene frequencies. So why should we look to higher levels?
            Sober and Wilson demonstrate that if you focus on individuals in their simple model of predominantly altruistic groups competing against predominantly selfish groups you will conclude that altruism is adaptive because it happens to be the trait that ends up proliferating. You may add the qualifier that it’s adaptive in the specified context, but the upshot is that from the perspective of individual selection altruism outcompetes selfishness. The problem is that this is the same reasoning underlying the misguided accusations against Berkeley; for any individual in that aggregate population, it was advantageous to be a male—but there was never any individual selection pressure against females. Sober and Wilson write,
The averaging approach makes “individual selection” a synonym for “natural selection.” The existence of more than one group and fitness differences between the groups have been folded into the definition of individual selection, defining group selection out of existence. Group selection is no longer a process that can occur in theory, so its existence in nature is settled a priori. Group selection simply has no place in this semantic framework. (32)
Thus, a strict focus on individuals, though it may appear to fully account for the outcome, necessarily obscures a crucial process that went into producing it. The same logic might be applicable to any analysis based on gene-level accounting. Sober and Wilson write that
if the point is to understand the processes at work, the resultant is not enough. Simpson’s paradox shows how confusing it can be to focus only on net outcomes without keeping track of the component causal factors. This confusion is carried into evolutionary biology when the separate effects of selection within and between groups are expressed in terms of a single quantity. (33)
They go on to label this approach “the averaging fallacy.” Acknowledging that nobody explicitly insists that group selection is somehow impossible by definition, they still find countless instances in which it is defined out of existence in practice. They write,
Even though the averaging fallacy is not endorsed in its general form, it frequently occurs in specific cases. In fact, we will make the bold claim that the controversy over group selection and altruism in biology can be largely resolved simply by avoiding the averaging fallacy. (34)
            Unfortunately, this warning about the averaging fallacy continues to go unheeded by advocates of strict gene selection theories. Even intellectual heavyweights of the caliber of Steven Pinker fall into the trap. In a severely disappointing essay published just last month at Edge.org called “The False Allure of Group Selection,” Pinker writes
If a person has innate traits that encourage him to contribute to the group’s welfare and as a result contribute to his own welfare, group selection is unnecessary; individual selection in the context of group living is adequate. Individual human traits evolved in an environment that includes other humans, just as they evolved in environments that include day-night cycles, predators, pathogens, and fruiting trees.
Steven Pinker
Multilevel selectionists wouldn’t disagree with this point; they would readily explain traits that benefit everyone in the group at no cost to the individuals possessing them as arising through individual selection. But Pinker here shows his readiness to fold the process of group competition into some generic “context.” The important element of the debate, of course, centers on traits that benefit the group at the expense of the individual. Pinker writes,
Except in the theoretically possible but empirically unlikely circumstance in which groups bud off new groups faster than their members have babies, any genetic tendency to risk life and limb that results in a net decrease in individual inclusive fitness will be relentlessly selected against. A new mutation with this effect would not come to predominate in the population, and even if it did, it would be driven out by any immigrant or mutant that favored itself at the expense of the group.
But, as Sober and Wilson demonstrate, those self-sacrificial traits wouldn’t necessarily be selected against in the population. In fact, self-sacrifice would be selected for if that population is an aggregation of competing groups. Pinker fails to even consider this possibility because he’s determined to stick with the definition of natural selection as occurring at the level of genes.
            Indeed, the centerpiece of Pinker’s argument against group selection in this essay is his definition of natural selection. Channeling Dawkins, he writes that evolution is best understood as competition between “replicators” to continue replicating. The implication is that groups, and even individuals, can’t be the units of selection because they don’t replicate themselves. He writes,
The theory of natural selection applies most readily to genes because they have the right stuff to drive selection, namely making high-fidelity copies of themselves. Granted, it's often convenient to speak about selection at the level of individuals, because it’s the fate of individuals (and their kin) in the world of cause and effect which determines the fate of their genes. Nonetheless, it’s the genes themselves that are replicated over generations and are thus the targets of selection and the ultimate beneficiaries of adaptations.
The underlying assumption is that, because genes rely on individuals as “vehicles” to replicate themselves, individuals can sometimes be used as shorthand for genes when discussing natural selection. Since gene competition within an individual would be to the detriment of all the genes that individual carries and strives to pass on, the genes collaborate to suppress conflicts amongst themselves. The further assumption underlying Pinker’s and Dawkins’s reasoning is that groups make for poor vehicles because suppressing within group conflict would be too difficult. But, as Sober and Wilson write,
This argument does not evaluate group selection on a trait-by-trait basis. In addition, it begs the question of how individuals became such good vehicles of selection in the first place. The mechanisms that currently limit within-individual selection are not a happy coincidence but are themselves adaptations that evolved by natural selection. Genomes that managed to limit internal conflict presumably were more fit than other genomes, so these mechanisms evolve by between-genome selection. Being a good vehicle as Dawkins defines it is not a requirement for individual selection—it’s a product of individual selection. Similarly, groups do not have to be elaborately organized “superorganisms” to qualify as a unit of selection with respect to particular traits. (97)
The idea of a “trait-group” is exemplified by the simple altruistic group versus selfish group model they used to demonstrate the potential confusion arising from Simpson’s paradox. As long as individuals with the altruism trait interact with enough regularity for the benefits to be felt, they can be defined as a group with regard to that trait.
            Pinker makes several other dubious points in his essay, most of them based on the reasoning that group selection isn’t “necessary” to explain this or that trait, justifying his prejudice in favor of gene selection only with reference to the selfish gene definition of evolution. Of course, it may be possible to imagine gene-level explanations for behaviors humans engage in predictably, like punishing cheaters in economic interactions even when doing so means the punisher incurs some cost to him or herself. But Pinker is so caught up with replicators he overlooks the potential of this type of punishment to transform groups into functional vehicles. As Sober and Wilson demonstrate, group competition can lead to the evolution of altruism on its own. But once altruism reaches a certain threshold, group selection can become even more powerful because the altruistic group members will, by definition, be better at behaving as a group. And one of the mechanisms we might expect to evolve through an ongoing process of group selection would operate to curtail within-group conflict and exploitation. The costly punishment Pinker dismisses as possibly explicable through gene selection is much more likely to have arisen through group selection. Sober and Wilson delight in the irony that, “The entire language of social interactions among individuals in groups has been borrowed to describe genetic interactions within individuals; ‘outlaw’ genes, ‘sheriff’ genes, ‘parliaments’ of genes, and so on” (147).
            Unto Others makes such a powerful case against strict gene-level explanations and for the potentially crucial role of group selection that anyone who undertakes to argue that the appeal of multilevel selection theory is somehow false without even mentioning it risks serious embarrassment. Published fourteen years ago, it still contains a remarkably effective rebuttal to Pinker’s essay:  
In short, the concept of genes as replicators, widely regarded as a decisive argument against group selection, is in fact totally irrelevant to the subject. Selfish gene theory does not invoke any processes that are different from the ones described in multilevel selection theory, but merely looks at the same processes in a different way. Those benighted group selectionists might be right in every detail; group selection could have evolved altruists that sacrifice themselves for the benefit of others, animals that regulate their numbers to avoid overexploiting their resources, and so on. Selfish gene theory calls the genes responsible for these behaviors “selfish” for the simple reason that they evolved and therefore replicated more successfully than other genes. Multilevel selection theory, on the other hand, is devoted to showing how these behaviors evolve. Fitness differences must exist somewhere in the biological hierarchy—between individuals within groups, between groups in the global population, and so on. Selfish gene theory can’t even begin to explore these questions on the basis of the replicator concept alone. The vehicle concept is its way of groping toward the very issues that multilevel selection theory was developed to explain. (88)
Sober and Wilson, in opening the field of evolutionary studies to forces beyond gene competition, went a long way toward vindicating Stephen Jay Gould, who throughout his career held that selfish gene theory was too reductionist—he even incorporated their arguments into his final book. But Sober and Wilson are still working primarily in the abstract realm of evolutionary modeling, although in the second half of Unto Others they cite multiple psychological and anthropological sources. A theorist even more after Gould’s own heart, one who synthesizes both models and evidence from multiple fields, from paleontology to primatology to ethnography, into a hypothetical account of the natural history of human evolution, from the ancestor we share with the great apes to modern nomadic foragers and beyond, is the anthropologist Christopher Boehm, whose work we’ll be exploring in part 3.
Read Part 1 of A Crash Course in Multilevel Selection Theory: The Groundwork Laid by Dawkins and Gould
And Part 3: The People Who Evolved Our Genes for Us: Christopher Boehm on Moral Origins.

Can’t Win for Losing: Why There Are So Many Losers in Literature and Why It Has to Change

Doris Lessing
            Ironically, the author of The Golden Notebook, which celebrates its 50th anniversary this year and is considered by many a “feminist bible,” happens to be an outspoken critic of feminism. When asked in a 1982 interview with Lesley Hazelton about her response to readers who felt some of her later works were betrayals of the women whose cause she once championed, Doris Lessing replied,

What the feminists want of me is something they haven't examined because it comes from religion. They want me to bear witness. What they would really like me to say is, ‘Ha, sisters, I stand with you side by side in your struggle toward the golden dawn where all those beastly men are no more.’ Do they really want people to make oversimplified statements about men and women? In fact, they do. I've come with great regret to this conclusion.

Lessing has also been accused of being overly harsh—“castrating”—to men, too many of whom she believes roll over a bit too easily when challenged by women aspiring to empowerment. As a famous novelist who would go on to win the Nobel Prize in Literature in 2007, however, she got to visit a lot of schools, and it gradually dawned on her that it wasn’t so much that men were rolling over but rather that they were being trained from childhood to be ashamed of their maleness. In a lecture she gave at the Edinburgh book festival in 2001, she said,

Great things have been achieved through feminism. We now have pretty much equality at least on the pay and opportunities front, though almost nothing has been done on child care, the real liberation. We have many wonderful, clever, powerful women everywhere, but what is happening to men? Why did this have to be at the cost of men? I was in a class of nine- and 10-year-olds, girls and boys, and this young woman was telling these kids that the reason for wars was the innately violent nature of men. You could see the little girls, fat with complacency and conceit while the little boys sat there crumpled, apologising for their existence, thinking this was going to be the pattern of their lives.

Lessing describes how the teacher kept casting glances expectant of her approval as she excoriated these impressionable children. 

           Elaine Blair, in “Great American Losers,” an essay that’s equal parts trenchant and infuriatingly obtuse, describes a dynamic in contemporary fiction that’s similar to the one Lessing saw playing out in the classroom.

The man who feels himself unloved and unlovable—this is a character that we know well from the latest generation or two of American novels. His trials are often played for sympathetic laughs. His loserdom is total: it extends to his stunted career, his squalid living quarters, his deep unease in the world.

At the heart of this loserdom is his self-fulfilling conviction that women don’t like him. As opposed to men of earlier generations who felt entitled to a woman’s respect and admiration, Blair sees this modern male character as being “the opposite of entitled: he approaches women cringingly, bracing for a slap.” This desperation on the part of male characters to avoid offending women, to prove themselves capable of sublimating their own masculinity so they can be worthy of them, finds its source in the authors themselves. Blair writes,

Our American male novelists, I suspect, are worried about being unloved as writers—specifically by the female reader. This is the larger humiliation looming behind the many smaller fictional humiliations of their heroes, and we can see it in the way the characters’ rituals of self-loathing are tacitly performed for the benefit of an imagined female audience.
D.F. Wallace, courtesy of infinitesummer.org

           Blair quotes a review David Foster Wallace wrote of a John Updike novel to illustrate how conscious males writing literature today are of their female readers’ hostility toward men who write about sex and women without apologizing for liking sex and women—sometimes even outside the bounds of caring, committed relationships. Labeling Updike as a “Great Male Narcissist,” a distinction he shares with writers like Philip Roth and Norman Mailer, Wallace writes,

Most of the literary readers I know personally are under forty, and a fair number are female, and none of them are big admirers of the postwar GMNs. But it’s John Updike in particular that a lot of them seem to hate. And not merely his books, for some reason—mention the poor man himself and you have to jump back:
“Just a penis with a thesaurus.”
“Has the son of a bitch ever had one unpublished thought?”
“Makes misogyny seem literary the same way Rush [Limbaugh] makes fascism seem funny.”
And trust me: these are actual quotations, and I’ve heard even worse ones, and they’re all usually accompanied by the sort of facial expressions where you can tell there’s not going to be any profit in appealing to the intentional fallacy or talking about the sheer aesthetic pleasure of Updike’s prose.

Since Wallace is ready to “jump back” at the mere mention of Updike’s name, it’s no wonder he’s given to writing about characters who approach women “cringingly, bracing for a slap.”

Blair goes on to quote from Jonathan Franzen’s novel The Corrections, painting a plausible picture of male writers who fear not only that their books will be condemned if too misogynistic—a relative term which has come to mean "not as radically feminist as me"—but that they themselves will be rejected. In Franzen’s novel, Chip Lambert has written a screenplay and asked his girlfriend Julia to give him her opinion. She holds off doing so, however, until after she breaks up with him and is on her way out the door. “For a woman reading it,” she says, “it’s sort of like the poultry department. Breast, breast, breast, thigh, leg” (26). Franzen describes his character’s response to the critique:

It seemed to Chip that Julia was leaving him because “The Academy Purple” had too many breast references and a draggy opening, and that if he could correct these few obvious problems, both on Julia’s copy of the script and, more important, on the copy he’d specially laser-printed on 24-pound ivory bond paper for [the film producer] Eden Procuro, there might be hope not only for his finances but also for his chances of ever again unfettering and fondling Julia’s own guileless, milk-white breasts. Which by this point in the day, as by late morning of almost every day in recent months, was one of the last activities on earth in which he could still reasonably expect to take solace for his failures. (28)

If you’re reading a literary work like The Corrections, chances are you’ve at some point sat in a literature class—or even a sociology or cultural studies class—and been instructed that the proper way to fulfill your function as a reader is to critically assess the work in terms of how women (or minorities) are portrayed. Both Chip and Julia have sat through such classes. And you’re encouraged to express disapproval, even outrage, if something like a traditional role is enacted—or, gasp, objectification occurs. Blair explains how this affects male novelists:

When you see the loser-figure in a novel, what you are seeing is a complicated bargain that goes something like this: yes, it is kind of immature and boorish to be thinking about sex all the time and ogling and objectifying women, but this is what we men sometimes do and we have to write about it. We fervently promise, however, to avoid the mistake of the late Updike novels: we will always, always, call our characters out when they’re being self-absorbed jerks and louts. We will make them comically pathetic, and punish them for their infractions a priori by making them undesirable to women, thus anticipating what we imagine will be your judgments, female reader. Then you and I, female reader, can share a laugh at the characters’ expense, and this will bring us closer together and forestall the dreaded possibility of your leaving me.

In other words, these male authors are the grownup versions of those poor school boys Lessing saw forced to apologize for their own existence. Indeed, you can feel this dynamic, this bargain, playing out when you’re reading these guys’ books. Blair’s description of the problem is spot on. Her theory of what caused it, however, is laughable.

Because of the GMNs, these two tendencies—heroic virility and sexist condescension—have lingered in our minds as somehow yoked together, and the succeeding generations of American male novelists have to some degree accepted the dyad as truth. Behind their skittishness is a fearful suspicion that if a man gets what he wants, sexually speaking, he is probably exploiting someone.

The dread of slipping down the slope from attraction to exploitation has nothing to do with John Updike. Rather, it is embedded in terms at the very core of feminist ideology. Misogyny, for instance, is frequently deemed an appropriate label for men who indulge in lustful gazing, even in private. And the term objectification implies that the female whose subjectivity isn’t being properly revered is the victim of oppression. The main problem with this idea—and there are several—is that objectification, so defined, is synonymous with attraction. The deluge of details about the female body in fiction by male authors can just as easily be seen as a type of confession, an unburdening of guilt by the offering up of sins. The female readers respond by assigning the writers some form of penance, like never again writing, or even thinking, like that without flagellating themselves.

           The conflict between healthy male desire and disapproving feminist prudery doesn’t just play out in the tortured psyches of geeky American male novelists. A.S. Byatt, in her Booker Prize-winning novel Possession, satirizes scholars steeped in literary theory as “papery” and sterile. But the novel ends with a male scholar named Roland overcoming his theory-induced self-consciousness to initiate sex with another scholar named Maud. Byatt describes the encounter:

And very slowly and with infinite gentle delays and delicate diversions and variations of indirect assault Roland finally, to use an outdated phrase, entered and took possession of all her white coolness that grew warm against him, so that there seemed to be no boundaries, and he heard, towards dawn, from a long way off, her clear voice crying out, uninhibited, unashamed, in pleasure and triumph. (551)

The literary critic Monica Flegel cites this passage as an example of how Byatt’s old-fashioned novel features “such negative qualities of the form as its misogyny and its omission of the lower class.” Flegel is particularly appalled by how “stereotypical gender roles are reaffirmed” in the sex scene. “Maud is reduced in the end,” Flegel alleges, “to being taken possession of by her lover…and assured that Roland will ‘take care of her.’” How, we may wonder, did a man assuring a woman he would take care of her become an act of misogyny?
Martin Amis

            Perhaps critics like Flegel occupy some radical fringe; Byatt’s book was after all a huge success with audiences and critics alike, and it did win Byatt the Booker. The novelist Martin Amis, however, isn’t one to describe his assaults as indirect. He routinely dares to feature men who actually do treat women poorly in his novels—without any authorial condemnation. Martin Goff, the non-intervening director of the Booker Prize committee, tells the story of the 1989 controversy over whether or not Amis’s London Fields should be on the shortlist. Maggie Gee, a novelist, and Helen McNeil, a professor, simply couldn’t abide Amis’s treatment of his women characters. “It was an incredible row,” says Goff.

Maggie and Helen felt that Amis treated women appallingly in the book. That is not to say they thought books which treated women badly couldn't be good, they simply felt that the author should make it clear he didn't favour or bless that sort of treatment. Really, there was only two of them and they should have been outnumbered as the other three were in agreement, but such was the sheer force of their argument and passion that they won. David [Lodge] has told me he regrets it to this day, he feels he failed somehow by not saying, “It's two against three, Martin's on the list”.

In 2010, Amis explained his career-spanning failure to win a major literary award, despite enjoying robust book sales, thus:

There was a great fashion in the last century, and it's still with us, of the unenjoyable novel. And these are the novels which win prizes, because the committee thinks, “Well it's not at all enjoyable, and it isn't funny, therefore it must be very serious.”

Brits like Hilary Mantel and especially Ian McEwan are working to turn this dreadful trend around. But when McEwan dared to write a novel about a neurosurgeon who prevails in the end over an afflicted, less privileged tormentor, he was condemned by critic Jennifer Szalai in the pages of Harper’s Magazine for his “blithe, bourgeois sentiments.” If you’ve read Saturday, you know the sentiments are anything but blithe, and if you read Szalai’s review you’ll be taken aback by her articulate blindness.

           Amis is probably right in suggesting that critics and award committees have a tendency to mistake misery for profundity. But his own case, along with several others like it, hints at something even more disturbing: a shift in the very idea of what role fictional narratives play in our lives. The sad new reality is that, owing to the growing influence of ideologically extreme and idiotically self-righteous activist professors, literature is no longer read for pleasure and enrichment—it’s no longer even read as a challenging exercise in outgroup empathy. Instead, reading literature is supposed by many to be a ritual of male western penance. Prior to taking an interest in literary fiction, you must first be converted to the proper ideologies, made to feel sufficiently undeserving yet privileged, the beneficiary of a long history of theft and population displacement, the scion and gene-carrier of rapists and genocidaires—the horror, the horror. And you must be taught to systematically overlook and remain woefully oblivious of all the evidence that the Enlightenment was the best fucking thing that ever happened to the human species. Once you’re brainwashed into believing that so-called western culture is evil and that you’ve committed the original sin of having been born into it, you’re ready to perform your acts of contrition by reading horrendously boring fiction that forces you to acknowledge and reflect upon your own fallen state.
"In his new self-lacerating 'Memoir', J.M. Coetzee portrays
himself as a loser with no sexual presence." Here he is at the
Nobel ceremony.

           Fittingly, the apotheosis of this new literary tradition won the Booker in 1999, and its author, like Lessing, is a Nobel laureate. J.M. Coetzee’s Disgrace chronicles in exquisite free indirect discourse the degradation of David Lurie, a white professor in Cape Town, South Africa, beginning with his somewhat pathetic seduction of a black student, a crime for which he pays with the loss of his job, his pension, and his reputation, and moving on to the aftermath of his daughter’s rape at the hands of three black men who proceed to rob her, steal his car, douse him with spirits and light him on fire. What’s unsettling about the novel—and it is a profoundly unsettling novel—is that its structure implies that everything that David and Lucy suffer flows from his original offense of lusting after a young black woman. This woman, Melanie, is twenty years old, and though she is clearly reluctant at first to have sex with her teacher there’s never any force involved. At one point, she shows up at David’s house and asks to stay with him. It turns out she has a boyfriend who is refusing to let her leave him without a fight. It’s only after David unheroically tries to wash his hands of the affair to avoid further harassment from this boyfriend—while stooping so low as to insist that Melanie make up a test she missed in his class—that she files a complaint against him.

            David immediately comes clean to university officials and admits to taking advantage of his position of authority. But he stalwartly refuses to apologize for his lust, or even for his seduction of the young woman. This refusal makes him complicit, the novel suggests, in all the atrocities of colonialism. As he’s awaiting a hearing to address Melanie’s complaint, David gets a message:

On campus it is Rape Awareness Week. Women Against Rape, WAR, announces a twenty-four-hour vigil in solidarity with “recent victims”. A pamphlet is slipped under his door: ‘WOMEN SPEAK OUT.’ Scrawled in pencil at the bottom is a message: ‘YOUR DAYS ARE OVER, CASANOVA.’ (43)

During the hearing, David confesses to doctoring the attendance ledgers and entering a false grade for Melanie. As the attendees become increasingly frustrated with what they take to be evasions, he goes on to confess to becoming “a servant of Eros” (52). But this confession only enrages the social sciences professor Farodia Rassool:

Yes, he says, he is guilty; but when we try to get specificity, all of a sudden it is not abuse of a young woman he is confessing to, just an impulse he could not resist, with no mention of the pain he has caused, no mention of the long history of exploitation of which this is part. (53)

There’s also no mention, of course, of the fact that David has already gone through more suffering than Melanie has, or that her boyfriend deserves a great deal of the blame, or that David is an individual, not a representative of his entire race who should be made to answer for the sins of his forefathers.
From the movie version of Disgrace

            After resigning from his position in disgrace, David moves out to the country to live with his daughter on a small plot of land. The attack occurs only days after he’s arrived. David wants Lucy to pursue some sort of justice, but she refuses. He wants her to move away because she’s clearly not safe, but she refuses. She even goes so far as to accuse him of being in the wrong for believing he has any right to pronounce what happened an injustice—and for thinking it is his place to protect his daughter. And if there’s any doubt about the implication of David’s complicity, she clears it up. As he’s pleading with her to move away, they begin talking about the rapists’ motivation. Lucy says to her father,

When it comes to men and sex, David, nothing surprises me anymore. Maybe, for men, hating the woman makes sex more exciting. You are a man, you ought to know. When you have sex with someone strange—when you trap her, hold her down, get her under you, put all your weight on her—isn’t it a bit like killing? Pushing the knife in; exiting afterwards, leaving the body behind covered in blood—doesn’t it feel like murder, like getting away with murder? (158)

The novel is so engrossing and so disturbing that it’s difficult to tell what the author’s position is vis à vis his protagonist’s degradation or complicity. You can’t help sympathizing with him and feeling his treatment at the hands of Melanie, Farodia, and Lucy is an injustice. But are you supposed to question that feeling in light of the violence Melanie is threatened with and Lucy is subjected to? Are you supposed to reappraise altogether your thinking about the very concept of justice in light of the atrocities of history? Are we to see David Lurie as an individual or as a representative of western male colonialism, deserving of whatever he’s made to suffer and more?
From The Crucible

            Personally, I think David Lurie’s position in Disgrace is similar to that of John Proctor in The Crucible (although this doesn’t come out nearly as much in the movie version). And it’s hard not to see feminism in its current manifestations—along with Marxism and postcolonialism—as a pernicious new breed of McCarthyism infecting academia and wreaking havoc with men and literature alike. It’s really no surprise that the most significant developments in the realm of narratives lately haven’t occurred in novels at all. Insofar as the cable series contributing to the new golden age of television can be said to adhere to a formula, it’s this: begin with a badass male lead who doesn’t apologize for his own existence and has no qualms about expressing his feelings toward women. As far as I know, these shows are just as popular with women viewers as they are with the guys.

            When David first arrives at Lucy’s house, they take a walk and he tells her a story about a dog he remembers from a time when they lived in a neighborhood called Kenilworth.

It was a male. Whenever there was a bitch in the vicinity it would get excited and unmanageable, and with Pavlovian regularity the owners would beat it. This went on until the poor dog didn’t know what to do. At the smell of a bitch it would chase around the garden with its ears flat and its tail between its legs, whining, trying to hide…There was something so ignoble in the spectacle that I despaired. One can punish a dog, it seems to me, for an offence like chewing a slipper. A dog will accept the justice of that: a beating for a chewing. But desire is another story. No animal will accept the justice of being punished for following its instincts.

Lucy breaks in, “So males must be allowed to follow their instincts unchecked? Is that the moral?” David answers,

No, that is not the moral. What was ignoble about the Kenilworth spectacle was that the poor dog had begun to hate its own nature. It no longer needed to be beaten. It was ready to punish itself. At that point it would be better to shoot it.

“Or have it fixed,” Lucy offers. (90)

Also read The Adaptive Appeal of Bad Boys

Madness and Bliss: Critical vs Primitive Readings in A.S. Byatt's Possession: A Romance, Part 2

Read part one.
            The critical responses to the challenges posed by Byatt in Possession fit neatly within the novel’s satire. Louise Yelin, for instance, unselfconsciously divides the audience for the novel into “middlebrow readers” and “the culturally literate” (38), placing herself in the latter category. She overlooks Byatt’s challenge to her methods of criticism and the ideologies underpinning them, for the most part, and suggests that several of the themes, like ventriloquism, actually support poststructuralist philosophy. Still, Yelin worries about the novel’s “homophobic implications” (39). (A lesbian, formerly straight character takes up with a man in the end, and Christabel LaMotte’s female lover commits suicide after the dissolution of their relationship, but no one actually expresses any fear or hatred of homosexuals.) Yelin then takes it upon herself to “suggest directions that our work might take” while avoiding the “critical wilderness” Byatt identifies. She proposes a critical approach to a novel that “exposes its dependencies on the bourgeois, patriarchal, and colonial economies that underwrite” it (40). And since all fiction fails to give voice to one or another oppressed minority, it is the critic’s responsibility to “expose the complicity of those effacements in the larger order that they simultaneously distort and reproduce” (41). This is not in fact a response to Byatt’s undermining of critical theories; it is instead an uncritical reassertion of their importance.

            Yelin and several other critics respond to Possession as if Byatt had suggested that “culturally literate” readers should momentarily push to the back of their minds what they know about how literature is complicit in various forms of political oppression so they can get more enjoyment from their reading. This response is symptomatic of an astonishing inability to even imagine what the novel is really inviting literary scholars to imagine—that the theories implicating literature are flat-out wrong. Monica Flegel for instance writes that “What must be privileged and what must be sacrificed in order for Byatt’s Edenic reading (and living) state to be achieved may give some indication of Byatt’s own conventionalizing agenda, and the negative enchantment that her particular fairy tale offers” (414). Flegel goes on to show that she does in fact appreciate the satire on academic critics; she even sympathizes with the nostalgia for simpler times, before political readings became mandatory. But she ends her critical essay with another reassertion of the political, accusing Byatt of “replicating” through her old-fashioned novel “such negative qualities of the form as its misogyny and its omission of the lower class.” Flegel is particularly appalled by Maud’s treatment in the final scene, since, she claims, “stereotypical gender roles are reaffirmed” (428). “Maud is reduced in the end,” Flegel alleges, “to being taken possession of by her lover…and assured that Roland will ‘take care of her’” (429). This interpretation places Flegel in the company of the feminists in the novel who hiss at Maud for trying to please men, forcing her to bind her head.

            Flegel believes that her analysis proves Byatt is guilty of misogyny and mistreatment of the poor. “Byatt urges us to leave behind critical readings and embrace reading for enjoyment,” she warns her fellow critics, “but the narrative she offers shows just how much is at stake when we leave criticism behind” (429). Flegel quotes Yelin to the effect that Possession is “seductive,” and goes on to declaim that

it is naïve, and unethical, to see the kind of reading that Byatt offers as happy. To return to an Edenic state of reading, we must first believe that such a state truly existed and that it was always open to all readers of every class, gender, and race. Obviously, such a belief cannot be held, not because we have lost the ability to believe, but because such a space never really did exist. (430)

In her preening self-righteous zealotry, Flegel represents a current in modern criticism that’s only slightly more extreme than that represented by Byatt’s misguided but generally harmless scholars. The step from using dubious theories to decode alleged justifications for political oppression in literature to Flegel’s frightening brand of absurd condemnatory moralizing leveled at authors and readers alike is a short one.

            Another way critics have attempted to respond to Byatt’s challenge is by denying that she is in fact making any such challenge. Christien Franken suggests that Byatt’s problems with theories like poststructuralism stem from her dual identity as a critic and an author. In a lecture Byatt once gave titled “Identity and the Writer,” which was later published as an essay, Franken finds what she believes is evidence of poststructuralist thinking, even though Byatt denies taking the theory seriously. Franken believes that in the essay, “the affinity between post-structuralism and her own thought on authorship is affirmed and then again denied” (18). Her explanation is that

the critic in A.S. Byatt begins her lecture “Identity and the Writer” with a recognition of her own intellectual affinity with post-structuralist theories which criticize the paramount importance of “the author.” The writer in Byatt feels threatened by the same post-structuralist criticism. (17)

Franken claims that this ambivalence runs throughout all of Byatt’s fiction and criticism. But Ann Marie Adams disagrees, writing that “When Byatt does delve into poststructuralist theory in this essay, she does so only to articulate what ‘threatens’ and ‘beleaguers’ her as a writer, not to productively help her identify the true ‘identity’ of the writer” (349). In Adams’s view, Byatt belongs in the humanist tradition of criticism going back to Matthew Arnold and the romantics. In her own response to Byatt, Adams manages to come closer than any of her fellow critics to being able to imagine that the ascendant literary theories are simply wrong. But her obvious admiration for Byatt doesn’t prevent her from suggesting that “Yelin and Flegel are right to note that the conclusion of Possession, with its focus on closure and seeming transcendence of critical anxiety, affords a particularly ‘seductive’ and ideologically laden pleasure to academic readers” (120). And, while she seems to find some value in Arnoldian approaches, she fails to engage in any serious reassessment of the theories Byatt targets.

            Frederick Holmes, in his attempt to explain Byatt’s attitude toward history as evidenced by the novel, agrees with the critics who see in Possession clear signs of the author’s embrace of postmodernism in spite of the parody and explicit disavowals. “It is important to acknowledge,” he writes,

that the liberation provided by Roland’s imagination from the previously discussed sterility of his intellectual sophistication is never satisfactorily accounted for in rational terms. It is not clear how he overcomes the post-structuralist positions on language, authorship, and identity. His claim that some signifiers are concretely attached to signifieds is simply asserted, not argued for. (330)


While Holmes is probably mistaken in taking this absence of rational justification as a tacit endorsement of the abandoned theory, the observation is the nearest any of the critics comes to a rebuttal of Byatt’s challenge. What Holmes is forgetting, though, is that structuralist and poststructuralist theorists themselves, from Saussure through Derrida, have been insisting on the inadequacy of language to describe the real world, a radical idea that flies in the face of every human’s lived experience, without ever providing any rational, empirical, or even coherent support for the departure. The stark irony to which Holmes is completely oblivious is that he’s asking for a rational justification to abandon a theory that proclaims such justification impossible. The burden remains on poststructuralists to prove that people shouldn’t trust their own experiences with language. And it is precisely this disconnect with experience that Byatt shows to be problematic. Holmes, like the other critics, simply can’t imagine that critical theories have absolutely no validity, so he’s forced to read into the novel the same chimerical ambivalence Franken tries so desperately to prove. He writes,

Roland’s dramatic alteration is validated by the very sort of emotional or existential experience that critical theory has conditioned him to dismiss as insubstantial. We might account for Roland’s shift by positing, not a rejection of his earlier thinking, but a recognition that his psychological well-being depends on his living as if such powerful emotional experiences had an unquestioned reality. (330)

            Adams quotes from an interview in which Byatt discusses her inspiration for the characters in Possession, saying, “poor moderns are always asking themselves so many questions about whether their actions are real and whether what they say can be thought to be true […] that they become rather papery and are miserably aware of this” (111-2). Byatt believes this type of self-doubt is unnecessary. Indeed, Maud’s notion that “there isn’t a unitary ego” (290) and Roland’s thinking of the “idea of his ‘self’ as an illusion” (459)—not to mention Holmes’s conviction that emotional experiences are somehow unreal—are simple examples of the reductionist fallacy. While it is true that an individual’s consciousness and sense of self rest on a substrate of unconscious mental processes and mechanics that can be traced all the way down to the firing of neurons, to suggest this mechanical accounting somehow delegitimizes selfhood is akin to saying that water being made up of hydrogen and oxygen atoms means the feeling of wetness can only be an illusion.
Helen Fisher

       Just as silly are the ideas that romantic love is a “suspect ideological construct” (290), as Maud calls it, and that “the expectations of Romance control almost everyone in the Western world” (460), as Roland suggests. Anthropologist Helen Fisher writes in her book Anatomy of Love, “some Westerners have come to believe that romantic love is an invention of the troubadours… I find this preposterous. Romantic love is far more widespread” (49). After a long list of examples of love-strickenness from all over the world from west to east to everywhere in-between, Fisher concludes that it “must be a universal human trait” (50). Scientists have found empirical support as well for Roland’s discovery that words can in fact refer to real things. Psychologist Nicole Speer and her colleagues used fMRI to scan people’s brains as they read stories. The actions and descriptions on the page activated the same parts of the brain as witnessing or perceiving their counterparts in reality. The researchers report, “Different brain regions track different aspects of a story, such as a character’s physical location or current goals. Some of these regions mirror those involved when people perform, imagine, or observe similar real-world activities” (989).

       Critics like Flegel insist on joyless reading because happy endings necessarily overlook the injustices of the world. But this is like saying anyone who savors a meal is complicit in world hunger (or for that matter anyone who enjoys reading about a character savoring a meal). If feminist poststructuralists were right about how language functions as a vehicle for oppressive ideologies, then the most literate societies would be the most oppressive, instead of the other way around. Jacques Lacan is the theorist Byatt has the most fun with in Possession—and he is also the main target of the book Fashionable Nonsense: Postmodern Intellectuals’ Abuse of Science by the scientists Alan Sokal and Jean Bricmont. “According to his disciples,” they write, Lacan “revolutionized the theory and practice of psychoanalysis; according to his critics, he is a charlatan and his writings are pure verbiage” (18). After assessing Lacan’s use of concepts in topological mathematics, like the Möbius strip, which he sets up as analogies for various aspects of the human psyche, Sokal and Bricmont conclude that Lacan’s ideas are complete nonsense. They write,

Bricmont and Sokal
The most striking aspect of Lacan and his disciples is probably their attitude toward science, and the extreme privilege they accord to “theory”… at the expense of observations and experiments… Before launching into vast theoretical generalizations, it might be prudent to check the empirical adequacy of at least some of its propositions. But, in Lacan’s writings, one finds mainly quotations and analyses of texts and concepts. (37)

Sokal and Bricmont wonder if the abuses of theorists like Lacan “arise from conscious fraud, self-deception, or perhaps a combination of the two” (6). The question resonates with the poem Randolph Henry Ash wrote about his experience exposing a supposed spiritualist as a fraud, in which he has a mentor assure her protégée, a fledgling spiritualist with qualms about engaging in deception, “All Mages have been tricksters” (444).

There’s even some evidence that Byatt is right about postmodern thinking making academics into “papery” people. In a 2006 lecture titled “The Inhumanity of the Humanities,” William van Peer reports on research he conducted with a former student comparing the emotional intelligence of students in the humanities to that of students in the natural sciences. Although the psychologists Raymond Mar and Keith Oatley (407) have demonstrated that reading fiction increases empathy and emotional intelligence, van Peer found that humanities students had no advantage in emotional intelligence over students of natural science. In fact, there was a weak trend in the opposite direction—this despite the fact that the humanities students were reading more fiction. Van Peer attributes the deficit to common academic approaches to literature:

Consider the ills flowing from postmodern approaches, the “posthuman”: this usually involves the hegemony of “race/class/gender” in which literary texts are treated with suspicion. Here is a major source of that loss of emotional connection between student and literature. How can one expect a certain humanity to grow in students if they are continuously instructed to distrust authors and texts? (8)

Steven Pinker
Whether it derives from her early reading of Arnold and his successors, as Adams suggests, or simply from her own artistic and readerly sensibilities, Byatt has an intense desire to revive that very humanity so many academics sacrifice on the altar of postmodern theory. Critical theory urges students to assume that any discussion of humanity, or universal traits, or human nature can only be exclusionary, oppressive, and, in Flegel’s words, “naïve” and “unethical.” The cognitive scientist Steven Pinker devotes his book The Blank Slate: The Modern Denial of Human Nature to debunking the radical cultural and linguistic determinism that is the foundation of modern literary theory. In a section on the arts, Pinker credits Byatt for playing a leading role in what he characterizes as a “revolt”:

Museum-goers have become bored with the umpteenth exhibit on the female body featuring dismembered torsos or hundreds of pounds of lard chewed up and spat out by the artist. Graduate students in the humanities are grumbling in emails and conference hallways about being locked out of the job market unless they write in gibberish while randomly dropping the names of authorities like Foucault and Butler. Maverick scholars are doffing the blinders that prevented them from looking at exciting developments in the sciences of human nature. And younger artists are wondering how the art world got itself into the bizarre place in which beauty is a dirty word. (Pinker 416)

Pinker’s characterization of modern art resonates with Roland Mitchell’s complaints about how psychoanalysis transforms perceptions of landscapes. And the idea that beauty has become a dirty word is underscored by the critical condemnations of Byatt’s “fairy tale ending.” Pinker goes on to quote Byatt’s response in New York Times Magazine to the question of what the best story ever told was. Her answer—The Arabian Nights:

The stories in “The Thousand and One Nights”… are about storytelling without ever ceasing to be stories about love and life and death and money and food and other human necessities. Narration is as much a part of human nature as breath and the circulation of the blood. Modernist literature tried to do away with storytelling, which it thought vulgar, replacing it with flashbacks, epiphanies, streams of consciousness. But storytelling is intrinsic to biological time, which we cannot escape. Life, Pascal said, is like living in a prison from which every day fellow prisoners are taken away to be executed. We are all, like Scheherazade, under sentences of death, and we all think of our lives as narratives, with beginnings, middles, and ends. (quoted in Pinker 419)

            Byatt’s satire of papery scholars and her portrayals of her characters’ transcendence of nonsensical theories are but the simplest and most direct ways she celebrates the power of language to transport readers—and the power of stories to possess them. Though she incorporates an array of diverse genres, from letters to poems to diaries, and though some of the excerpts’ meanings subtly change in light of discoveries about their authors’ histories, all these disparate parts nonetheless “hook together,” collaborating in the telling of this magnificent tale. This cooperation would be impossible if the postmodern truism about the medium being the message were actually true. Meanwhile, the novel’s intimate engagement with the mythologies of wide-ranging cultures thoroughly undermines the paradigm according to which myths are deterministic “repeating patterns” imposed on individuals, showing instead that these stories simultaneously emerge from and lend meaning to our common human experiences. As the critical responses to Possession make abundantly clear, current literary theories are completely inadequate in any attempt to arrive at an understanding of Byatt’s work. While new theories may be better suited to the task, it is incumbent on us to put forth a good faith effort to imagine the possibility that true appreciation of this and other works of literature will come only after we’ve done away with theory altogether. 

Madness and Bliss: Critical versus Primitive Readings in A.S. Byatt’s Possession: a Romance

                                                                                Part 1 of 2
“You have one of the gifts of the novelist at least,” Christabel LaMotte says to her cousin Sabine de Kercoz in A.S. Byatt’s Possession: a Romance, “you persist in undermining facile illusions” (377). LaMotte is staying with her uncle and cousin, Sabine later learns, because she is carrying the child of the renowned, and married, poet Randolph Henry Ash. The affair began when the two met at a breakfast party where they struck up an impassioned conversation that later prompted Ash to instigate a correspondence. LaMotte too was a poet, so each turned out to be an ideal reader for the other’s work. Just over a hundred years after this initial meeting, in the present day of Byatt’s narrative, the literary scholar Roland Mitchell finds two drafts of Ash’s first letter to LaMotte tucked away in the pages of a book he’s examining for evidence about the great poet’s life, and the detective work begins.

Roland, an unpaid research assistant financially dependent on the girlfriend he’s in a mutually unfulfilling relationship with, is overtaken with curiosity and embarks on a quest to piece together the story of what happened between LaMotte and Ash. Knowing next to nothing about LaMotte, Mitchell partners with the feminist scholar Maud Bailey, who one character describes as “a chilly mortal” (159), and a stilted romance develops between them as they seek out the clues to the earlier, doomed relationship. Through her juxtaposition of the romance between the intensely passionate, intensely curious nineteenth century couple and the subdued, hyper-analytic, and sterile modern one, the novelist Byatt does some undermining of facile illusions of her own.
A.S. Byatt

       Both of the modern characters are steeped in literary theory, but Byatt’s narrative suggests that their education and training is more a hindrance than an aid to true engagement with literature, and with life. It is only by breaking with professional protocol—by stealing the drafts of the letter from Ash to LaMotte—and breaking away from his mentor and fellow researchers that Roland has a chance to read, and experience, the story that transforms him. “He had been taught that language was essentially inadequate, that it could never speak what was there, that it only spoke itself” (513). But over the course of the story Roland comes to believe that this central tenet of poststructuralism is itself inadequate, along with the main tenets of other leading critical theories, including psychoanalysis. Byatt, in a later book of criticism, counts herself among the writers of fiction who “feel that powerful figures in the modern critical movements feel almost a gladiatorial antagonism to the author and the authority the author claims” (6).  Indeed, Possession can be read as the novelist’s narrative challenge to the ascendant critical theories, an “undermining of facile illusions” about language and culture and politics—a literary refutation of current approaches to literary criticism. In the two decades since the novel’s publication, critics working in these traditions have been unable to adequately respond to Byatt’s challenge because they’ve been unable to imagine that their ideas are not simply impediments to pleasurable reading but that they’re both wrong and harmful to the creation and appreciation of literature.

       The possession of the title refers initially to how the story of LaMotte and Ash’s romance takes over Maud and Roland—in defiance of the supposed inadequacy of language. If words only speak themselves, then true communication would be impossible. But, as Roland says to Maud after they’ve discovered some uncanny correspondences between each of the two great poets’ works and the physical setting the modern scholars deduce they must’ve visited together, “People’s minds do hook together” (257). This hooking-together is precisely what inspires them to embark on their mission of discovery in the first place. “I want to—to—follow the path,” Maud says to Roland after they’ve read the poets’ correspondence together.

I feel taken over by this. I want to know what happened, and I want it to be me that finds out. I thought you were mad when you came to Lincoln with your piece of stolen letter. Now I feel the same. It isn’t professional greed. It’s something more primitive. (239)

Roland interrupts to propose the label “Narrative curiosity” for her feeling of being taken over, to which she responds, “Partly” (239). Later in the story, after several more crucial discoveries, Maud proposes revealing all they’ve learned to their academic colleagues and returning to their homes and their lives. Roland worries doing so would mean going back “Unenchanted.” “Are we enchanted?” Maud replies. “I suppose we must start thinking again, sometime” (454). But it’s the primitive, enchanted, supposedly unthinking reading of the biographical clues about the poets that has brought the two scholars to where they are, and their journey ends up resulting in a transformation that allows Maud and Roland to experience the happy ending LaMotte and Ash were tragically deprived of.

            Before discovering and being possessed by the romance of the nineteenth century poets, both Maud and Roland were living isolated and sterile lives. Maud, for instance, always has her hair covered in a kind of “head-binding” and twisted in tightly regimented braids that cause Roland “a kind of sympathetic pain on his own skull-skin” (282). She later reveals that she has to cover it because her fellow feminists always assume she’s “dyeing it to please men.” “It’s exhausting,” Roland has just said. “When everything’s a deliberate political stance. Even if it’s interesting” (295). Maud’s bound head thus serves as a symbol (if read in precisely the type of way Byatt’s story implicitly admonishes her audience to avoid) of the burdensome and even oppressive nature of an ideology that supposedly works for the liberation and wider consciousness of women.

            Meanwhile, Roland is troubling himself about the implications of his budding romantic feelings for Maud. He has what he calls a “superstitious dread” of “repeating patterns,” a phrase he repeats over and over again throughout the novel. Thinking of his relations with Maud, he muses,

“Falling in love,” characteristically, combs the appearances of the world, and of the particular lover’s history, out of a random tangle and into a coherent plot. Roland was troubled that the opposite might be true. Finding themselves in a plot, they might suppose it appropriate to behave as though it was a sort of plot. And that would be to compromise some kind of integrity they had set out with. (456)

He later wrestles with the idea that “a Romance was one of the systems that controlled him, as the expectations of Romance control almost everyone in the Western world” (460). Because of his education, he cannot help doubting his own feelings, suspecting that giving in to their promptings would have political implications, and worrying that doing so would result in a compromising of his integrity (which he must likewise doubt) and his free will. Roland’s self-conscious lucubration forms a stark contrast to what Randolph Henry Ash wrote in an early letter to his wife Ellen: “I cannot get out of my mind—as indeed, how should I wish to, whose most ardent desire is to be possessed entirely by the pure thought of you—I cannot get out of my mind the entire picture of you” (500). It is only by reading letters like this, and by becoming more like Ash, turning away in the process from his modern learning, that Roland can come to an understanding of himself and accept his feelings for Maud as genuine and innocent.

            Identity for modern literary scholars, Byatt suggests, is a fraught and complicated issue. At different points in the novel, both Maud and Roland engage in baroque, abortive efforts to arrive at a sense of who they are. Maud, reflecting on how another scholar’s writing about Ash says more about the author than about the subject, meditates,

Narcissism, the unstable self, the fractured ego, Maud thought, who am I? A matrix for a susurration of texts and codes? It was both a pleasant and an unpleasant idea, this requirement that she think of herself as intermittent and partial. There was the question of the awkward body. The skin, the breath, the eyes, the hair, their history, which did seem to exist. (273)

Roland later echoes this head-binding poststructuralist notion of the self as he continues to dither over whether or not he should act on his feelings for Maud.

Roland had learned to see himself, theoretically, as a crossing-place for a number of systems, all loosely connected. He had been trained to see his idea of his “self” as an illusion, to be replaced by a discontinuous machinery and electrical message-network of various desires, ideological beliefs and responses, language forms and hormones and pheromones. Mostly he liked this. He had no desire for any strenuous Romantic self-assertion. (459)

But he mistakes that lack of desire for self-assertion for something genuine, when in fact it is born of his theory-induced self-doubt. He will have to discover in himself that very desire to assert or express himself if he wants to escape his lifeless, menial occupation and end his sexless isolation. He and Maud both have to learn how to integrate their bodies and their desires into their conceptions of themselves.
Yorkshire Moors courtesy of Park Benches and Book Ends

            Unfortunately, thinking about sex is even more fraught with exhausting political implications for Byatt’s scholars than thinking about the self. While on a trek to retrace the steps they believe LaMotte and Ash took in the hills of Yorkshire, Roland considers the writing of a psychoanalytic theorist. Disturbed, he asks Maud, “Do you never have the sense that our metaphors eat up our world?” (275). He goes on to explain that no matter what they try to discuss,

It all reduced like boiling jam to—human sexuality… And then, really, what is it, what is this arcane power we have, when we see everything is human sexuality? It’s really powerlessness… We are so knowing… Everything relates to us and so we’re imprisoned in ourselves—we can’t see things. (276)

The couple is coming to realize that they can in fact see things, the same things that the couple whose story they're tracking down saw over a century ago. This budding realization inspires in Roland an awareness of how limiting, even incapacitating, the dubious ideas of critical theorizing can be. Through the distorting prism of psychoanalysis, “Sexuality was like thick smoked glass; everything took on the same blurred tint through it. He could not imagine a pool with stones and water” (278).

The irony is that, for all the faux sophistication of psychoanalytic sexual terminology, it engenders in both Roland and Maud nothing but bafflement and an aversion to actual sex. Roland highlights this paradox later, thinking,

They were children of a time and culture that mistrusted love, “in love,” romantic love, romance in toto, and which nevertheless in revenge proliferated sexual language, linguistic sexuality, analysis, dissection, deconstruction, exposure. (458)

Maud sums up the central problem when she says to Roland, “And desire, that we look into so carefully—I think all the looking-into has some very odd effects on the desire” (290). In that same scene, while still in Yorkshire trying to find evidence of LaMotte’s having accompanied Ash on his trip, the two modern scholars discover they share a fantasy, not a sexual fantasy, but one involving “An empty clean bed,” “An empty bed in an empty room,” and they wonder if “they’re symptomatic of whole flocks of exhausted scholars and theorists” (290-1).

            Guided by their intense desire to be possessed by the two poets of the previous century, Maud and Roland try to imagine how they would have seen the world, and in so doing they try to imagine what it would be like not to believe in the poststructuralist and psychoanalytic theories they’ve been inculcated with. At first Maud tells Roland, “We live in the truth of what Freud discovered. Whether or not we like it. However we’ve modified it. We aren’t really free to suppose—to imagine—he could possibly have been wrong about human nature” (276). But then they discover a cave with a pool whose reflected light looks like white fire, a metaphor both LaMotte and Ash used in poems written around the time they would have visited that very place, and Maud proclaims, “She saw this. I’m sure she saw this” (289). After that, the two begin trying in earnest to imagine what it would be like to live without their theories. Maud explains to Roland,

We know all sorts of things, too—about how there isn’t a unitary ego—how we’re made up of conflicting, interacting systems of things—and I suppose we believe that? We know we’re driven by desire, but we can’t see it as they did, can we? We never say the word Love, do we—we know it’s a suspect ideological construct—especially Romantic Love—so we have to make a real effort of imagination to know what it felt like to be them, here, believing in these things—Love—themselves—that what they did mattered—(290)

       Though many critics have pointed out how the affair between LaMotte and Ash parallels the one between Maud and Roland, in some ways the trajectories of the two relationships run in opposite directions. For instance, LaMotte leaves Ash as even more of a “chilly mortal” (310) than she was when she first met him. It turns out the term derives from a Mrs. Cammish, who lodged LaMotte and Ash while they were on their trip, and was handed down to the Lady Bailey, Maud’s relative, who applies it to her in a conversation with Roland. And whereas the ultimate falling-out between LaMotte and Ash comes in the wake of Ash exposing as a fraud a spiritualist in whose ideas and abilities LaMotte had invested a great deal of faith, Roland’s counterpart disillusionment, his epiphany that literary theory as he has learned it is a fraud, is what finally makes the consummation of his relationship with Maud possible. Maud too has to overcome, to a degree, her feminist compunctions to be with Roland. Noting how this chilly mortal is warming over the course of their quest, Roland reflects that “It was odd to hear Maud Bailey talking wildly of madness and bliss” (360). But at last she lets her hair down.

Brittany Coast
Sabine’s journal of the time her cousin Christabel stayed with her and her father on the Brittany coast, where she’d sought refuge after discovering she was pregnant, offers Roland and Maud a glimpse of how wrongheaded it can be to give precedence to their brand of critical reading over what they would consider a more primitive approach. Ironically, it is the young aspiring writer who gives them this glimpse as she chastises her high-minded poet cousin for her attempts to analyze and explain the meanings of the myths and stories she’s grown up with. “The stories come before the meanings,” Sabine insists to Christabel. “I do not believe all these explanations. They diminish. The idea of Woman is less than brilliant Vivien, and the idea of Merlin will not allegorise into male wisdom. He is Merlin” (384). These words come from the same young woman whom LaMotte earlier credited for her persistence “in undermining facile illusions” (377).

Readers of Byatt’s novel, though not Maud and Roland, both of whom likely already know of the episode, learn about how Ash attended a séance and, reaching up to grab a supposedly levitating wreath, revealed it to be attached to a set of strings connected to the spiritualist. In a letter to Ruskin read for Byatt’s readers by another modern scholar, Ash expresses his outrage that someone would exploit the credulity and longing of the bereaved, especially mothers who’ve lost children. “If this is fraud, playing on a mother’s harrowed feelings, it is wickedness indeed” (423). He also wonders what the ultimate benefit would be if spiritualist studies into other realms proved to be valid. “But if it were so, if the departed spirits were called back—what good does it do? Were we meant to spend our days sitting and peering into the edge of the shadows?” (422). LaMotte and Ash part ways for good after his exposure of the spiritualist as a charlatan because she is so disturbed by the revelation. And, for the reader, the interlude serves as a reminder of past follies that today are widely acknowledged to have depended on trickery and impassioned credulity. So it might be for the ideas of Freud and Derrida and Lacan.

Roland arrives at the conclusion that this is indeed the case. Having been taught that language is inadequate, that it only speaks itself and cannot speak about what really exists in the world, he gradually comes to realize that this idea is nonsense, that he has in fact been disabused of it. “What happened to him was that the ways in which it could be said had become more interesting than the idea that it could not” (513). He has learned through his quest to discover what had occurred between LaMotte and Ash that “It is possible for a writer to make, or remake at least, for a reader, the primary pleasures of eating, or drinking, or looking on, or sex.” People’s minds do in fact “hook together,” as he’d observed earlier, and they do it through language. The novel’s narrator intrudes near the end of the book to explain what Roland is coming to understand.

 Now and then there are readings that make the hairs on the neck, the non-existent pelt, stand on end and tremble, when every word burns and shines hard and clear and infinite and exact, like stones of fire, like points of stars in the dark—readings when the knowledge that we shall know the writing differently or better or satisfactorily, runs ahead of any capacity to say what we know, or how. In these readings, a sense that the text has appeared to be wholly new, never before seen, is followed, almost immediately, by the sense that it was always there, that we the readers, knew it was always there, and have always known it was as it was, though we have now for the first time recognised, become fully cognisant of, our knowledge. (512) (Neuroscientists agree.)

The recognition the narrator refers to—which Roland is presumably experiencing in the scene—is of a shared human nature, and shared human experience, the notions of which are considered by most literary critics to be politically reactionary.

Though he earlier claimed to have no desire to assert himself, Roland discovers he has a desire to write poetry. He decides to turn away from literary scholarship altogether and become a poet. He also asserts himself by finally taking charge and initiating sex with Maud.

And very slowly and with infinite gentle delays and delicate diversions and variations of indirect assault Roland finally, to use an outdated phrase, entered and took possession of all her white coolness that grew warm against him, so that there seemed to be no boundaries, and he heard, towards dawn, from a long way off, her clear voice crying out, uninhibited, unashamed, in pleasure and triumph. (551)

This is in fact, except for a postscript focusing on Ash, the final scene of the novel, and it represents Roland’s total, and Maud’s partial, transcendence of the theories and habits that had hitherto made their lives so barren and lonely.

The Upper Hand in Relationships

         People perform some astoundingly clever maneuvers in pursuit of the upper hand in their romantic relationships, and some really stupid ones too. They try to make their partners jealous. They feign lack of interest. They pretend to have enjoyed wild success in the realm of dating throughout their personal histories, right up until the point at which they met their current partners. The edge in cleverness, however, is usually enjoyed by women—though you may be inclined to call it subtlety, or even deviousness.

            Some of the most basic dominance strategies used in romantic relationships are based either on one partner wanting something more than the other, or on one partner being made to feel more insecure than the other. We all know couples whose routine revolves around the running joke that the man is constantly desperate for sex, which allows the woman to set the terms he must meet in order to get some. His greater desire for sex gives her the leverage to control him in other domains. I’ll never forget being nineteen and hearing a friend a few years older say of her husband, “Why would I want to have sex with him when he can’t even remember to take out the garbage?” Traditionally, men held the family purse strings, so they—assuming they or their families had money—could hold out the promise of things women wanted more. Of course, some men still do this, giving their wives little reminders of how hard they work to provide financial stability, or dropping hints of their extravagant lifestyles to attract prospective dates.

            You can also get the upper hand on someone by taking advantage of his or her insecurities. (If that fails, you can try producing some.) Women tend to be the most vulnerable to such tactics at the moment of choice, wanting their features and graces and wiles to make them more desirable than any other woman prospective partners are likely to see. The woman who gets passed up in favor of another goes home devastated, likely lamenting the crass superficiality of our culture.

            Most of us probably know a man or two who, deliberately or not, manages to keep his girlfriend or wife in constant doubt when it comes to her ability to keep his attention. These are the guys who can’t control their wandering eyes, or who let slip offhand innuendos about incremental weight gain. Perversely, many women respond by expending greater effort to win his attention and his approval.

           Men tend to be the most vulnerable just after sex, in the Was-it-good-for-you moments. If you found yourself seething at some remembrance of masculine insensitivity reading the last paragraph, I recommend a casual survey of your male friends in which you ask them how many of their past partners at some point compared them negatively to some other man, or men, they had been with prior to the relationship. The idea that the woman is settling for a man who fails to satisfy her as others have plays into the narrative that he wants sex more—and that he must strive to please her outside the bedroom.
  
          If you can put your finger on your partner’s insecurities, you can control him or her by tossing out reassurances like food pellets to a trained animal. The alternative would be for a man to be openly bowled over by a woman’s looks, or for a woman to express in earnest her enthusiasm for a man’s sexual performances. These options, since they disarm, can be even more seductive; they can be tactics in their own right—but we’re talking next-level expertise here so it’s not something you’ll see very often.

           I give the edge to women when it comes to subtly attaining the upper hand in relationships because I routinely see them using a third strategy they seem to have exclusive rights to. Being the less interested party, or the most secure and reassuring party, can work wonders, but for turning proud people into sycophants nothing seems to work quite as well as a good old-fashioned guilt-trip.

           To understand how guilt-trips work, just consider the biggest example in history: Jesus died on the cross for your sins, and therefore you owe your life to Jesus. The illogic of this idea is manifold, but I don’t need to stress how many people it has seduced into a lifetime of obedience to the church. The basic dynamic is one of reciprocation: because one partner in a relationship has harmed the other, the harmer owes the harmed some commensurate sacrifice.
  
          I’m probably not the only one who’s witnessed a woman catching on to her man’s infidelity and responding almost gleefully—now she has him. In the first instance of this I watched play out, the woman, in my opinion, bore some responsibility for her husband’s turning elsewhere for love. She was brutal to him. And she believed his guilt would only cement her ascendancy. Fortunately, around that time they both realized she must not really love him, and they divorced.
  
          But the guilt need not be tied to anything as substantive as cheating. Our puritanical Christian tradition has joined forces in America with radical feminism to birth a bastard lovechild we encounter in the form of a groundless conviction that sex is somehow inherently harmful—especially to females. Women are encouraged to carry with them stories of the traumas they’ve suffered at the hands of monstrous men. And, since men are of a tribe, a pseudo-logic similar to the Christian idea of collective guilt comes into play. Whenever a man courts a woman steeped in this tradition, he is put on early notice—you’re suspect; I’m a trauma survivor; you need to be extra nice, i.e. submissive.

           It’s this idea of trauma, which can be attributed mostly to Freud, that can really make a relationship, and life, fraught and intolerably treacherous. Behaviors that would otherwise be thought inconsiderate or rude—a hurtful word, a wandering eye—are instead taken as malicious attempts to cause lasting harm. But the most troubling thing about psychological trauma is that belief in it is its own proof, even as it implicates a guilty party who therefore has no way to establish his innocence.
  
          Over the course of several paragraphs, we’ve gone from amusing but nonetheless real struggles many couples get caught up in to some that are just downright scary. The good news is that there is a subset of people who don’t see relationships as zero-sum games. (Zero-sum is a game theory term for interactions in which every gain for one party is a loss for the other. Non-zero-sum games are those in which cooperation can lead to mutual benefits.) The bad news is that they can be hard to find.
      
            There are a couple of things you can do now, though, that will help you avoid chess-match relationships—or minimize the machinations in your current romance. First, ask yourself what dominance tactics you tend to rely on. Be honest with yourself. Recognizing your bad habits is the first step toward breaking them. And remember, the question isn’t whether you use tactics to try to get the upper hand; it’s which ones you use, and how often.

           The second thing you can do is cultivate the habit and the mutual attitude of what’s good for one is good for the other. Relationship researcher Arthur Aron says that celebrating your partner’s successes is one of the most important things you can do in a relationship. “That’s even more important,” he says, “than supporting him or her when things go bad.” Watch out for zero-sum responses, in yourself and in your partner. And beware of zero-summers in the realm of dating. Ladies, you know the guys who seem vaguely resentful of the power you have over them by dint of your good looks and social graces. And, guys, you know the women who make you feel vaguely guilty and set-upon every time you talk to them. The best thing to do is stay away.
       
     But you may be tempted, once you realize a dominance tactic is being used on you, to perform some kind of countermove. It’s one of my personal failings to be too easily provoked into these types of exchanges. It is a dangerous indulgence.

Is Patriarchy Even a Real Thing?

            One of the definitions of patriarchy is male control of families and government. But what many are referring to when they use the term is a culture and socialization process that privileges men and boys while oppressing, disadvantaging, and subjugating women and girls. In practice, patriarchy often means the simple assumption that males have it better than females and that they work, often deviously, often with the complicity of blinkered females, to maintain their advantage.
 
            I'm skeptical that there even is such a thing as patriarchy, at least in that latter sense of the term. I can imagine my feminist friends reading that line and getting set to unload a barrage of anecdotes and snippets of history lessons. Before you begin your attempts at setting me straight, let me be clear about what exactly I'm suggesting. It's undeniable that the treatment of women in third-world countries is often abysmal. It's undeniable that some cultures—usually the religious sectors in particular—explicitly preach that women are to be submissive to men. Those explicit teachings are rightly called patriarchy.

            There is an important distinction to be made, however, between a system of family hierarchy and social governance on the one hand, and the suggestion on the other hand that an entire culture is biased in favor of men. Keeping score on both sides of the gender divide by adding up all the miseries and subtracting all the privileges to see who has it worst is exactly the type of tribal behavior that makes this sort of politics so divisive and incendiary. So let me just point out that there are far more people monitoring the travails of women than there are people willing to consider for a second that men might be going through things that are just as bad or worse. Often small groups of men prey on women and other men alike. And women often enjoy certain advantages over men, especially if they're intelligent or attractive or both, even in third-world countries. (The case of third-world countries, incidentally, ought to give feminists pause before they spout off about the evils of civilization.)

            I'm not a Pollyanna. There really are groups who suffer from severe societal and generational disadvantages, even in this first-world country. In fact, their plight offers a helpful template for how we should expect oppression to show up in various measures of well-being. Here, for instance, is a graph of how African Americans and whites have responded to surveys investigating their subjective well-being since the early 1970s.
Source: "Subjective and Objective Indicators of Racial Progress," by Betsey Stevenson and Justin Wolfers
            Two things stand out in this graph. One is that whites have a significant advantage in terms of subjective well-being, just as we might expect. The other is that the gap between the races has been narrowing, albeit at a disconcertingly sluggish pace, over the past forty years. This is probably due in large part to the victories of the civil rights movement, and other deliberate social efforts to right injustices.

            Based on the conventional wisdom regarding the plight of women, we might expect the happiness divide between the sexes to demonstrate pretty much the same trend. But it turns out the two graphs look nothing alike. 
Source: "The Paradox of Declining Female Happiness," by Betsey Stevenson and Justin Wolfers

            First, men didn't start out with a clear advantage. Second, it seems women have actually gotten slightly less happy over the past forty years.

            Yes, men continue to make more money, and men continue to hold more positions of power. So you could argue that men are still privileged and women are just disappointed that they haven't made much progress. And I bet at least a few of you reading this are wondering how girls might be socialized to claim to be as happy as men even when they're not. But neither of these special pleadings really accounts for the pattern anyway.

            We need an alternative hypothesis. Maybe it is human nature to develop different roles for women and men. These roles may even be influenced by regular developmental differences, like the production of hormones, and then over time become somewhat exaggerated through a process of observational learning and norm generation. Insofar as this is the case, it’s simply wrong to point to the different roles and claim their existence is proof of one side’s privilege.

            With each role comes a set of privileges and burdens, and maybe, just maybe, the two cancel out pretty well. Many men probably experience the need to make money as a burden. Many women probably experience the greater burden of child-rearing placed on them as a privilege.

            None of this necessarily works as evidence for the superiority of traditional sex roles—and I certainly don’t advocate any enforcement of them. Indeed, we need to do our best to support people who for whatever reason want to step outside the bounds of our common expectations. But we also have to be prepared to accept the conclusion—should it be arrived at with a threshold degree of certainty—that people are happier when they embrace their differences, whether those differences are the product of biology, culture, or both.

            And, if you are wont to insist on the existence of male privilege, how will you demonstrate it? How can you be sure it isn’t limited to circumscribed domains? How would you convince a reasonable and informed skeptic that patriarchy is a real problem?

What Would Make Me a Feminist: Response to Comments and Criticisms

            My argument is that there is an important distinction between women’s rights as a goal and feminism as an ideology. I support women’s rights, though I prefer to advocate for universal human rights without exclusion or demarcation. The feminist ideology, however, is far too problematic for me to identify with. I write this fully aware that feminism has various strains.

            Feminism, in my experience, relies on an insultingly crude dialectic: men, women, and patriarchy. This formula leads to some facile and incendiary assumptions and claims. In my first post, I argued that too many feminists fly into rages over the income gap, even though differences in wages are the result of many complex factors and the role of discrimination may be vanishingly small. There is a report of a study that found that 6.9% of the income gap in 2004 remained unaccounted for after controlling for factors stemming from different preferences. But eight months after the paper was covered, it has yet to be published, suggesting it failed to make it through peer review. (It may still appear, but we should reserve judgment.)

            If the 7% figure holds up, I admit I’ll be surprised. But I doubt there will be many feminists who look at the twenty percent income gap and rush to remind everyone that over ninety percent of the difference is attributable to divergent preferences regarding fields, working conditions, and family management. (For a more sober discussion of the pay gap from a staunch conservative--strange bedfellow--go here.) I have to emphasize that my argument is not based on a complete absence of any pay gap; it focuses rather on the assumptions feminists make about it. Again, the vast majority of it can be attributed to choices freely made.
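For readers who want the arithmetic behind that "over ninety percent" claim, here is the calculation the paragraph implies. It rests on my assumption that the 6.9% figure refers to the share of the gap left unexplained, not to percentage points of earnings:

$$0.069 \times 20 \text{ points} \approx 1.4 \text{ points unexplained} \quad\Longrightarrow\quad \frac{20 - 1.4}{20} \approx 93\% \text{ attributable to measured differences in preferences and circumstances}$$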

            My second post took on the feminist tendency to conflate male attraction with oppression. Many complained that the idea of “objectification,” though perhaps untrue, was nevertheless useful. Men who engage in harassment or sexual violence, they maintain, are not recognizing their victims’ humanity. These commenters are mistaking familiarity for usefulness. If objectification theory actually did identify factors that make violence more likely, then we could conclude it was useful. But it simply doesn’t. The theory points to media portrayals of women that emphasize body parts (objects) over emotions or intelligence and suggests such portrayals encourage men to dehumanize women, which might lead to violence. Psychological experiments find this not to be the case at all. Further, as the availability and consumption of pornography have exploded over the past decade, sexual violence has actually decreased. Objectification is an invalid theory with offensive implications about men. And there are better factors to target—like economic inequality—to address the issue of violence.

            My third post took on the ridiculously facile assumption that all gender differences stem from stereotypes and socialization. Many feminists charge anyone who suggests natural differences in behavior or career preference with essentialism. This is nonsense stemming from scientific ignorance. None of the commenters brought up any challenges that require addressing.

            While my problem with feminism begins with the term itself—because it comes freighted with tribal implications—I accept that unconditional opposition would be pretty much meaningless. So here are some things that would make me more accepting of feminism:

-         Evidence that people learning about feminism are systematically warned of the dangers of demonizing or vilifying men.

-         Evidence that feminism, as an ideology and not as an extension of Enlightenment ideals regarding human rights, has contributed valid or useful insights that have advanced science or benefited society.

-         Evidence that being a feminist has beneficial effects for individuals.

On this last point, a study that was published with much fanfare in 2007 reported that feminism had no ill effects on romantic relationships and that men in a relationship with a feminist were more likely to say their sex lives were satisfactory. The study, published in the journal Sex Roles, is titled, “The Interpersonal Power of Feminism: Is Feminism Good for Romantic Relationships?” The authors, Laurie Rudman and Julie Phelan, leave little doubt regarding the purpose of their study:

What is particularly disturbing is that, by eschewing feminism, women themselves may be participating in backlash. Thus, it is important to understand the reasons why women today tend not to embrace feminism.

It’s amazing to me that something this blatantly ideological got published in a scientific journal. What the media coverage failed to mention is that the study discovered that the female participants who labeled themselves as feminists actually reported higher levels of conflict within their relationships. Rudman and Phelan felt this was a statistical artifact, though, and dismissed it. I don’t understand their reasoning, and I have to assume that if it weren’t valid the reviewers would have picked it up. But it does suggest the methods they used might not have been sufficiently sensitive.

            The biggest issue I have with the study, however, is that it willfully conflates support for career women with feminism; in this study, I would have been counted as a strong feminist. The authors justify the move by pointing to a strong correlation between the actual label and attitudes toward women in high-powered positions. But the self-reports didn’t match up in many cases—as they wouldn’t have in mine. Using attitudes toward working women as a stand-in for feminism also opens the door to confounds like higher education and the benefits to a household of having two incomes.

            With regard to my concerns about ideological feminism, the Rudman and Phelan study is completely meaningless. Some studies I’d like to see: a comparison between academic departments measuring relationship satisfaction and stability; some objective measure of women’s support for feminist ideology, like knowledge of prominent authors, compared with attitudes toward men as measured by self-report and results from Implicit Association Tests; an objective measure of men’s exposure to feminism, like a test of their knowledge of feminist authors, and both their attitudes toward women and the satisfaction and stability of their relationships. I would also like to know more about how ideological feminism impacts young girls and boys.

            We shouldn’t keep giving this tribal ideology a free pass because we assume it’s in the service of a good cause. We shouldn’t celebrate studies designed to produce congenial results. Feminism, like any other idea, needs to pass empirical muster if it is to be taken seriously. Unfortunately, policies inspired by it continue to be implemented in the absence of any tests or challenges.

Why I Am Not a Feminist—and You Shouldn’t Be Either part 3: Engendering Gender Madness


             "As a professional debunker I feel like I know bunk when I see it, and Wertheim has well captured the genre: 'In all likelihood there will be an abundant use of CAPITAL LETTERS and exclamation points!!! Important sections will be underlined or bolded, or circled, for emphasis.'"


This is from Skeptic editor Michael Shermer's review of a book on the demarcation problem, the thorny question of how to recognize whether ideas are revolutionary or just, well, bunk. Obviously, if someone's writing begs for attention in a way that seems meretricious or unhinged, you're likely dealing with a bunk peddler. What to make, then, of these lines, to which I have not added any formatting?

"Honestly, I can’t think of a better way to make a girl in grade school question whether she’ll have any interest in or aptitude for science than to present her with a 'science for girls' kit."

"And, science kits that police these gender stereotypes run the risk of alienating boys from science, too."

"I really don’t think that science kits should be segregated by gender, but if you are going to segregate them at least make the experiments for girls NOT SO LAME."

"If girls are at all interested in science, then it must be in a pretty, feminine way that reinforces notions of beauty. It’s mystical. The chemistry of perfumery is hidden behind 'perfection.' But boys get actual physics and chemistry—just like that, with no fancy modifiers. This division is NOT okay..."

To the first, I’d say, really? You must have a very limited imagination. To the second, I’d say, really? Isn’t “police” a strong term for science kits sold at a toy store? I agree with the third, but I think the author needs to settle down. And to the fourth, I’d say, well, if the kids really want kits of this nature—and if they don’t want them the manufacturer won’t be offering them for long—you’d have to demonstrate that they actually cause some harm before you can say, in capitals or otherwise, they’re not okay.

Were these breathless fulminations posted on the pages of some poststructuralist site for feminist rants? The first and second are from philosopher Janet Stemwedel’s blog at Scientific American. The third is from a blog hosted by the American Geophysical Union and was written by geologist Evelyn Mervine. And the fourth is from anthropologist Krystal D’Costa’s blog, also at Scientific American.

           You’d hope these blog posts, as emphatic as they are, would provide links to some pretty compelling research on the dangers of pandering to kids’ and parents’ gender stereotypes. One of the posts has a link to a podcast about research on how vaginas are supposed to smell. Another of Stemwedel’s posts on the issue links to yet another post, by Christie Wilcox, in which she not-so-gently takes the journal Nature to task for publishing what was supposed to be a humorous piece on gender differences. It’s only through this indirect route that you can find any actual evidence—in any of these posts—that stereotyping is harmful. “Reinforcing negative gender stereotypes is anything but harmless,” Wilcox declares. But does humor based on stereotypes in fact reinforce them, or does it make them seem ridiculous? How far are we really willing to go to put a stop to this type of humor? It seems to me that gender and racial and religious stereotypes are the bread-and-butter of just about every comedian in the business. 

            The science Wilcox refers to has nothing to do with humor but instead demonstrates a phenomenon psychologists call stereotype threat. It’s a fascinating topic—really one of the most fascinating in psychology in my opinion. It may even be an important factor in the underrepresentation of women in STEM fields. Still, the connection between research on stereotypes and performance—stereotype boost has also been documented—and humor is tenuous. And the connection with pink and pretty microscopes is even more nebulous.

           Helping women in STEM fields feel more welcome is a worthy cause. Gender stereotypes probably play some role in their current underrepresentation. I take these authors at their word that they routinely experience the ill effects of common misconceptions about women’s cognitive abilities, so I sympathize with their frustration to a degree. I even have to admit that it’s a testament to the success of past feminists that the societal injustices their modern counterparts rail against are so much less overt—so subtle. But they may actually be getting too subtle; decrying them sort of resembles the righteous, evangelical declaiming of conspiracy theorists. If you can imagine a way that somebody may be guilty of reinforcing stereotypes, you no longer even have to shoulder the burden of proving they’re guilty.

          The takeaway from all this righteously indignant finger-pointing is that you should never touch anything with even a remote resemblance to a stereotype. Allow me some ironic capitals of my own: STEREOTYPES BAD!!! This message, not surprisingly, even reaches into realms where a casual dismissal of science is fashionable, and skepticism about the value of empirical research, expressed in tortured prose, is an ascendant virtue—or maybe I have the direction of the influence backward.

           On two separate occasions now, one of my colleagues in the English department has posted the story of a baby named Storm on Facebook. Storm’s parents opted against revealing the newborn’s sex to friends and to anyone outside the immediate family, to protect her or him from those nasty stereotypes. In the comments under these links were various commendations and expressions of solidarity. Storm’s parents, most agreed, are heroes. Parents bragged about all their own children’s androgynous behavior, expressing their desire to rub it in the faces of “gender nazis.”

From the Toronto Star
             From what I can tell, Storm’s parents had no idea the story of their unorthodox parenting would go viral, so we probably shouldn’t condemn them for using their child to get media attention. And I don’t think the “experiment,” as some have called it, poses any direct threat to Storm’s psychological well-being. But Storm’s parents are tilting at windmills. They’re assuming that gender is something imposed on children by society—those chimerical gender nazis—through a process called socialization. The really disheartening thing is that even the bloggers at Scientific American make this mistake; they assume that sparkly pink science kits that help girls explore the chemistry of lipstick and perfume send direct messages about who and what girls should be, and that the girls will receive and embrace these messages without resistance, as if the little tykes were noble savages with pristine spirits forever vulnerable to the tragic overvaluing of outward beauty.

            When they’re thinking clearly, all parents know a simple truth that gets completely discounted in discussions of gender—it’s really hard to get through to your kids even with messages you’re sending deliberately and explicitly. The notion that you can accidentally send some subtle cue that’s going to profoundly shape a child’s identity deserves a lot more skepticism than it gets (ask my conservative parents, especially my Catholic mom). This is because identity is something children actively create for themselves, not the sum total of all the cultural assumptions foisted on them as they grow up. Children’s minds are not receptacles for all our ideological garbage. They rummage around for their own ideological garbage, and they don’t just pick up whatever they find lying around.

            Psychologist John Money was a prominent advocate of the theory that gender is determined completely through socialization. So he advised the parents of a six-month-old boy whose penis had been destroyed in a botched circumcision to have the testicles removed as well and to raise the boy as a girl. The boy, David Reimer, never thought of himself as a girl, despite his parents’ and Money’s efforts to socialize him as one. Money nevertheless kept declaring success, claiming Reimer (who was called Brenda at the time) proved his theory of gender development. By age 13, however, the poor kid was suicidal. At 14, he declared himself a boy, and later went on to get further surgeries to reconstruct his genitals. In the account of his case written by journalist John Colapinto, As Nature Made Him: The Boy Who Was Raised as a Girl, Reimer says that Money’s ministrations were in no way therapeutic—they were traumatic. Having read about Reimer in Steven Pinker’s book The Blank Slate: The Modern Denial of Human Nature, I thought of John Money every time I came across the term gender nazi in the Facebook comments about Storm (though I haven’t read Colapinto’s book in its entirety and don’t claim to know the case in enough detail to support such a severe charge).

            Reimer’s case is by no means the only evidence that gender identity and gender-typical behavior are heavily influenced by hormones. Psychiatrist William Reiner and urologist John Gearhart report that raising boys (who’ve been exposed in utero to more testosterone) as girls after surgery to remove underdeveloped sex organs tends not to result in feminine behaviors—or even feminine identity. Of the sixteen boys in their study, two were raised as boys, while fourteen were raised as girls. Five of the fourteen remained female throughout the study, but four spontaneously declared themselves to be male, and four others decided they were male after being informed of the surgery they’d undergone. All sixteen of the children displayed “moderate to marked” degrees of male-typical behavior. The authors write, “At the initial assessment, the parents of only four subjects assigned to female sex reported that their child had never stated a wish to be a boy.”

            An earlier study of so-called pseudo-hermaphrodites, boys with a hormone disorder who are born looking like girls but who become more virile in adolescence, revealed that of 18 participants who were raised as girls, all but one changed their gender identity to male. There is also a condition some girls are born with called Congenital Adrenal Hyperplasia (CAH), which is characterized by an increased amount of male hormones in their bodies. It often leads to ambiguous genitalia and the need for surgery. But Sheri Berenbaum and J. Michael Bailey found that in the group of girls with CAH they studied, increased levels of male-typical behavior could not be explained by the development of male genitalia or the age of surgery. The hormones themselves are the likely cause of the differences.
From Psychology Today and Satoshi Kanazawa

           One particularly fascinating finding about kids’ preferences for toys comes from the realm of ethology. It turns out that rhesus monkeys show preferences for certain types of toys depending on their sex—and they’re the same preferences you would expect. Females will play with plush dolls or with wheeled vehicles, but males are much more likely to go for the cars and trucks. And the difference is even more pronounced in vervet monkeys, with both females and males spending significantly more time with the toys we might in other contexts call “stereotypical” for their sex. There’s even some good preliminary evidence that chimpanzees play with sticks differently depending on their sex, with males using them as tools or weapons and females cradling them like babies.

            Are gender roles based solely on stereotypes and cultural contingencies? In The Blank Slate, Pinker excerpts large sections of anthropologist Donald Brown’s inventory of behaviors that have been observed by ethnographers in all cultures that have been surveyed. Brown’s book is called Human Universals, and it casts serious doubt on theories that rule out every factor influencing development except socialization. Included in the inventory: “classification of sex,” “females do more direct child care,” “male and female and adult and child seen as having different natures,” “males more aggressive,” and “sex (gender) terminology is fundamentally binary” (435-8). These observations are based on societies, not individuals, and individuals vary much more dramatically from one to the next. The point isn’t that genes or biology determine behavioral outcomes; the relationship between biology and behavior isn’t mechanistic—it’s probabilistic. But the probabilities tend to be much higher than anyone in English departments assumes—higher even than the bloggers at Scientific American assume.

            Interestingly, even though there are resilient differences in math test scores between boys and girls—with boys’ scores showing the same average but stretching farther at each tail of the bell curve—researchers exploring women’s underrepresentation in STEM fields have ruled out the higher aptitude of a small subset of men as the most important factor. They’ve also ruled out socialization. Reviewing multiple sources of evidence, Stephen Ceci and Wendy Williams find that

the omnipresent claim that sex differences in mathematics result from early socialization (i.e., parents and teachers inculcating a “math is for boys” attitude) fails empirical scrutiny. One cannot assert that socialization causes girls to opt out of math and science when girls take as many math and science courses as boys in grades K–12, achieve higher grades in them, and major in college math in roughly equal numbers to males. Moreover, survey evidence of parental attitudes and behaviors undermines the socialization argument, at least for recent cohorts. (3)

If it’s not ability, and it’s not socialization, then how do we explain the greater desire on the part of men to pursue careers in math-intensive fields? Ceci and Williams believe it’s a combination of divergent preferences and the biological constraints of childbearing. Women tend to be more interested in social fields, while men like fields with a focus on objects and abstractions. However, girls with CAH show preferences closer to those of boys. (Cool, huh?)
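As an aside on the earlier point about boys’ scores stretching farther into the tails: a quick normal-curve sketch shows how identical averages can still produce lopsided extremes. The 15 percent difference in standard deviations used below is purely illustrative, my own assumption rather than a figure from Ceci and Williams:

$$\frac{P(Z > 2/1.15)}{P(Z > 2)} \approx \frac{P(Z > 1.74)}{P(Z > 2)} \approx \frac{0.041}{0.023} \approx 1.8$$

In other words, even with equal means, a modestly wider male distribution would predict nearly twice as many boys as girls scoring two standard deviations above the mean, and the ratio grows the farther out the cutoff sits. That is exactly the kind of tail effect the researchers cited above have nonetheless ruled out as the most important factor.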

  Ceci and Williams also point out that women who excel at math tend to score highly in tests of verbal reasoning as well, giving them more fields to choose from. (Update, 3-26-2013: a recent longitudinal study replicates this finding.) This is interesting to me because if women are more likely to pursue careers dealing with people and words, they’re also more likely to be exposed to the strain of feminism that views science as just another male conspiracy to justify and perpetuate the patriarchal status quo. Poststructuralism and New Historicism are all the rage in the English department I study in, and deconstructing scientific texts is de rigueur. Might Derrida, Lacan, Foucault, and all their feminist successors be at fault for women’s underrepresentation in STEM fields at least as much as toys and stereotypes?

            I have little doubt that if society were arranged to optimize women’s interest in STEM fields they would be much better represented in them. But society isn’t a very easy thing to manipulate. We have to consider the possibility that the victory would be Pyrrhic. In any case, we should avoid treating children like ideological chess pieces. There’s good evidence that we couldn’t keep little kids from seeking gender cues even if we tried, and trying strikes me as cruel. None of this is to say that biology determines everything, or that gender role development is simple. In fact, my problem with the feminist view of gender is that it’s far too crude to account for such a complex phenomenon. The feminists are armchair pontificators at best and conspiracy theorists at worst. They believe stereotypes can only be harmful. That’s akin to saying that the rules of grammar serve solely to curtail our ability to freely express ourselves. While grammar need not be as rigid as many once believed, doing away with it altogether would reduce language to meaningless babble. Humans need stereotypes and roles. We cannot live in a cultural vacuum.

            At the same time, in keeping with the general trend toward tribalism, the feminists’ complaints about pink microscopes are unfair to boys and young men. Imagine being a science-obsessed teenage boy who comes across a bunch of rants on the website for your favorite magazine. They all say, in capital and bolded letters, that suggesting to girls that trying to be pretty is a worthwhile endeavor represents some outrageous offense, that it will cause catastrophic psychological and economic harm to them. It doesn’t take a male or female genius to figure out that the main source of teenage girls’ desire to be pretty is the realization that pretty girls get more attention from hot guys. If a toy can arouse so much ire for suggesting a girl might like to be pretty, then young guys had better control their responses to hot girls—think of the message it sends. So we’re back to the idea that male attraction is inherently oppressive. Since most men can’t help being attracted to women, well, shame on them, right? 


(Full disclosure: probably as a result of a phenomenon called assortative mating, I find ignorance of science to be a huge turn-off.)
Check out part 2 on "The Objectionable Concept of Objectification."
And part 1 on earnings.
These posts have generated pretty lengthy comment threads on Facebook, so stay tuned as well for updates based on my concession of points and links to further evidence.
And, as always, tell me what you think and share this with anyone you think would rip it apart (or anyone who might just enjoy it).
Update: Just a few minutes after posting this, I came across evolutionary psychologist Jesse Bering's Facebook update saying he was being unfairly attacked by feminists for his own Scientific American blog. If you'd like to show your solidarity, go to http://blogs.scientificamerican.com/bering-in-mind/.
Go here to read my response to commenters.

Why I Am Not a Feminist - and You Shouldn't Be Either part 2: The Objectionable Concept of Objectification

From eatingdisordersfacts.org
          Feminists theorize that one of the ways men subjugate women is by objectifying them. The idea is that a man, as part of the wider male conspiracy, makes a point of letting girls and young women know they’re constantly being ogled by people who are evaluating them and comparing them to other women—based solely on their physical features. Even compliments can contribute to this heightened awareness and concern for appearance, since they let women know what aspects of their persons are attention-worthy. The most heinous example of objectification is the casual dismissal of a woman’s ideas in the workplace and the substitution of some remark about her appearance in place of the serious consideration her idea deserves.


            Or maybe the most heinous example of objectification is the parading of impossibly attractive actresses and dangerously thin models all over the media landscape, setting the standards of beauty so high young women can never even hope to compete. In Hollywood, directors are fond of lovingly sweeping their cameras over their favorite parts of a female’s anatomy to let every young woman viewing the films know precisely what men find most appealing. The lustful male gaze is thus a powerful tool of oppression because it causes women to feel self-conscious and insecure—or so the feminist theory suggests (or, rather, one of the feminist theories).

            Looking through the ever-looming feminist lens at statistics about how much more common self-esteem issues and eating disorders are among girls has a predictable impact on how we view boys. “Inevitably, boys are resented,” writes Christina Hoff Sommers in her book The War Against Boys: How Misguided Feminism Is Harming Our Young Men, setting them up to be

seen both as the unfairly privileged gender and as obstacles on the path to gender justice for girls. There is an understandable dialectic: the more girls are portrayed as diminished, the more boys are regarded as needing to be taken down a notch and reduced in importance. (23-4) (excerpt)

The effect on young boys of being taught this theory of oppression by objectification must be akin to the effect of Catholic preachings about the fallen state of man and the danger to their souls of succumbing to the temptations of carnal desire. At some point, they’re going to start experiencing that desire, they’re not going to be able to do anything about it, and it’s going to make them feel pretty guilty. It’s a bit similar as well to what young homosexuals must experience growing up with families who believe attraction toward members of the same sex is sinful and unnatural.

            Men look at women and assess their attractiveness. They even get aroused merely from the sight of women who have certain features. Movie-makers and marketers know all about men’s fondness for checking out women. I’m not going to cite any of the research from the field of evolutionary psychology that explores whether or not men’s passion for beautiful women is something that occurs reliably in diverse cultures, or whether or not there are certain features that are considered beautiful by men all over the world. I’m not going to recite the logic of natural selection as it pertains to mate selection and the relative cost of reproduction. You can find that stuff anywhere, and you’ve probably already got some response to it worked out.

I’m going to do my best to explain, at as purely practical a level as I can manage, why objectification can’t be a valid theory and why it doesn’t in any way establish the need for a social and political movement pitting the genders against each other.

Objectification goes wrong before even getting beyond the term itself. Men aren’t—humans aren’t—with a few rare exceptions, attracted to or sexually aroused by objects. By being attracted to or sexually aroused by a woman, a man is in fact acknowledging her humanity. We humans are physical beings, and sex is a physical act. It stands to reason that in assessing a potential sexual partner’s compatibility, we focus a great deal on physical attributes. We obviously have to distinguish humans from objects, and we have to have some further criteria on which to base our decisions about whom to couple with. For one thing, we need a way to figure out whether the prospective partner is mature enough for sex—so features signaling sexual maturity tend to be seen as attractive. And, since most people prefer to couple with members of one sex over the other, features signaling that membership will also tend to be seen as attractive.

Individualist feminist (there’s got to be a better term) Wendy McElroy, in a defense of pornography, points out the flaw in thinking of objectification as automatically and invariably degrading, using logic very similar to mine:


The assumed degradation is often linked to the 'objectification' of women: that is, porn converts them into sexual objects. What does this mean? If taken literally, it means nothing because objects don't have sexuality; only beings do. But to say that porn portrays women as 'sexual beings' makes for poor rhetoric. Usually, the term 'sex objects' means showing women as 'body parts', reducing them to physical objects. What is wrong with this? Women are as much their bodies as they are their minds or souls. No one gets upset if you present women as 'brains' or as 'spiritual beings'. If I concentrated on a woman's sense of humor to the exclusion of her other characteristics, is this degrading? Why is it degrading to focus on her sexuality?

Few women, as far as I know, complain about being treated as sexual beings by men they happen to be attracted to. The trouble arises when they’re treated that way when it’s inappropriate, as in the work situation I’ve described. The problem in such situations—and of course I agree it’s a problem—isn’t that the woman is seen as an object; it’s not even that she’s being recognized as attractive; it’s that someone is refusing to see her as more than merely a sexual being.

            But why do men have to be so obsessed with sex? And why does it seem like a woman’s role as a sexual being takes precedence over her other roles so frequently? Practically speaking, if two people who don’t know each other are going to begin a physical relationship, at least one of them must be motivated to pursue and get to know the other. Since the pursuer doesn’t yet know anything about the pursued, all there is to go on is physical appearance. Think about this for a second or two and you’ll come to a realization most women take for granted and, as long as it’s not in the context of a discussion about gender oppression, freely admit: Being the one who is the most motivated to pursue a relationship puts you at a disadvantage. An attractive woman has the power to accept or reject overtures from any of her suitors—and the more attractive she is the more of them she’ll have to choose from.
From CDC

            It’s just as legitimate to read the numbers of women who suffer from eating disorders or undergo risky surgeries to improve their looks as evidence of an intense desire on the part of females to have the upper hand over men. The problem young girls face is the same problem young boys face: competition for attractive partners is unavoidable. Judging from suicide statistics, the consequences of this competition are even more dire for the boys. The likeliest explanation for girls’ increasing self-consciousness, and for their resorting to ever more extreme measures, is the simple fact that media technology has opened the world up to everyone like never before, so the standards of beauty are now set by a contest with a much larger pool of contestants—not to mention the technological wonders of digital alteration.

            All the panic notwithstanding, this wider field of competition may actually be a societal boon. Some people of both genders harm themselves trying to be thin or athletic. At the same time, though, the obesity epidemic is doing even more harm. It’s easy to find stats and figures on anorexia, but how many people, after seeing a Victoria’s Secret model or that Twilight kid with his shredded abs, simply forgo that extra helping they were tempted to devour? And the competition extends beyond the realm of physical appearance. We don’t usually complain about how the work of geniuses makes it difficult for us to say anything interesting—even though we have to assume many first dates end in disappointment owing to lackluster conversation. What’s so special about attractiveness that it calls for protection from high standards? (This is not to say that there aren't plenty of other good reasons not to watch crap TV and read glitzy crap magazines.)

            Even if women admit that they like sex and that male attention is flattering, most of them will still attest to having experienced unwanted or inappropriate sexual attention or commentary at some point. While a lot of these complaints are probably bragging in disguise, it’s undeniable that male attention can at times be downright scary or just outrageously inappropriate. Still, women have to keep in mind that men like to tease their friends, often aggressively, and the point at which intimate liberty-taking shades into something more malicious is often ambiguous.

            And if you think a workplace dominated by females would be some kind of peaceful utopia, you probably haven’t spent much time around groups of women. If a man has a problem with you, he’s much more likely to tell you directly. Women, on the other hand, are much more likely to smile to your face and then attack your reputation when your back is turned. This is one of those patterns that emerge reliably across cultures; psychologists call it indirect aggression. I’m citing it because it’s not about beauty standards or male desire—and because it underscores the point that when a man makes some comment about a woman’s "proper role" it’s an act of aggression perpetrated by an individual, not an act of political or economic oppression for which the entire gender is guilty.
From shortsupport.org
            Those perpetrators are also much more likely to be at the bottom of the workplace hierarchy than at the top. Studies of natural hierarchy formation find that self-sacrifice and altruism are key determiners of status. There is also strong evidence that people resort to aggression primarily to compensate for low status. Although unwanted sexual advances aren’t acts of aggression, a rejected man’s effort to save face can certainly be frightening. The important thing to keep in mind, though, is that even these face-saving measures aren’t politically motivated. The guy’s not belittling the woman from a position of power; his position is in fact pitiable. (I’ll even make a prediction: the guy who’s bugging you is short, isn’t he?)

            What do neuroscientists and psychologists say about the nature of men’s lustful gazes? A small preliminary imaging study presented at an AAAS meeting in Chicago by Susan Fiske seemed to offer some support for the idea of objectification. When men were put in scanners and allowed to look at pictures of women, the region of the brain that motivates and manages male conspiracies lit up like a Christmas tree (sorry, couldn’t help myself). Here is the claim Fiske actually made:

I’m not saying that they literally think these photographs of women are photographs of tools per se, or photographs of non-humans, but what the brain imaging data allow us to do is to look at it as scientific metaphor. That is, they are reacting to these photographs as people react to objects. (Quoted here)

However, Fiske goes on to say that when she matched the scans with surveys of attitudes, she discovered that “the hostile sexists were likely to deactivate the part of the brain that thinks about other people’s intentions.” In other words, along with activating the part of the brain associated with using tools, men who aren’t “hostile sexists” do in fact think about naked people’s intentions. This finding has actually been replicated.

            The most comprehensive study to date on how people’s attitudes are affected by viewing pictures of scantily clad women and men concludes that while seeing skin does lead to diminished assessments of agency, it also leads to increased assessments of the capacity to experience pleasure or pain. The authors write:

To the extent that this modified framework concerning perceptions of the mind and body turns out to be correct, it is inaccurate to describe the body focus as inducing “objectification.” People who seem especially embodied are not treated as mere physical objects but, instead, like nonhuman animals, as beings who are less capable of thinking or reasoning but who may be even more capable of desires, sensations, emotions, and passions. (12)

Looking at other humans as if they’re animals isn’t much better than looking at them as if they’re objects—but the study was of people looking at pictures of individuals they’d never met. Assuming a capacity for desires, sensations, emotions, and passions is, at least in my opinion, a really good start, considering the pictures are of naked people in sexually suggestive poses; people with more clothes were perceived to be more like robots. (So show more skin to hide your agendas, as if you didn’t already know.) The authors not only take issue with the term objectification; they also found no justification for thinking that the changes in attitudes toward strangers based on how much skin they’re showing are experienced only by men:

Objectification is often discussed in terms of men objectifying women …, but we found that both men and women strip agency and confer experience to both men and women when a bodily focus is induced. (11)

This study’s findings dovetail almost perfectly with those of a study which found that men who watch a moderate amount of pornography demonstrate less sexist attitudes in general, and that when sexism does emerge in relation to porn it tends toward so-called “benevolent sexism,” the supposedly paternalistic, protective, and worshipful variety (the measures for which are shot through with dubious feminist assumptions).

            Benevolent or not, men’s feelings toward women in porn are probably the starkest proof that objectification is a nonsensical idea: if men were aroused by objects or instruments, the women in x-rated videos would be passive and inert as often as they are active and enthusiastic. I don’t have any numbers to cite on this, but I’d say most men, by far, cringe at the thought of taking pleasure without reciprocating. Advocates of objectification theory seem to worry that someone will sneak up behind a man and slap him on the back while he’s looking at a woman as a sexual being, causing his mind to get stuck that way. I can’t be the only man who on more than one occasion has had sex with one woman only to drive to work a short time afterward and speak to other women in a purely professional capacity. Guys look at porn and then go to work; it has to be happening millions of times a day. People shift modes all the time.
  
            The study that questions the term objectification is titled “More Than a Body: Mind Perception and the Nature of Objectification.” Tellingly, when Piercarlo Valdesolo reported on it for Scientific American, the headline read “How Our Brains Turn Women Into Objects.” In my future post on the hysteria (yes, I’m using this term with a sexist etymology ironically) over the “gendering” of children, I’m going to point out how this flagship publication for popular science seems to be bowing to pressure to be more feminist-friendly. Valdesolo, to be fair, did include a subheading: “There is, it turns out, more than one kind of ‘objectification’.” Those quotation marks notwithstanding, I still have to object—no, in fact, there aren’t any kinds of objectification. (A later “60-Second Mind” podcast has a much more accurate title and subheading: “How We View Half-Naked Men and Women: Research finds that scantily-clad women and men are judged in similar ways.”)

            Make no mistake, those hostile sexists are out there. But not all of them are men. Some people, women and men, are hell-bent on plunging this country back into the dark ages and on dispelling all the evolution craziness that gets taught in schools, all the global warming crap, all the godlessness. These people are sure to belittle and disparage anyone, woman or man, with more liberal or libertarian leanings (and we them). Make no mistake on this point either: while conflating attraction with objectification is wrong and offensive, there are acts that really do deny the humanity and sovereignty of women and men. In America, we can be glad that it’s overwhelmingly more likely for the most disadvantaged people to be either the perpetrators or the victims of such acts. I believe, nonetheless, that by targeting the forces behind their disadvantage we can and should be doing more to prevent such acts.

            The stats on part 1, “Why I Am Not a Feminist: Earnings,” are still blowing up. But the comments have stopped coming in. Please let me know what you think. Feel free as well to share this post with anyone you think can tear it apart.
Read part 3: Engendering Gender Madness
Read my response to commenters.

Why I Am Not a Feminist—and You Shouldn’t Be Either part 1: Earnings

From a Georgetown University study called "Education, Occupation, and Lifetime Earnings"
            In order to establish beyond all doubt the continuing urgency of the battle for women’s equality, feminists rely heavily on data demonstrating an earnings discrepancy between the genders. Women make less money in America, and therefore women are not yet equal. If women aren’t making as much as men who work in the same industry, if women aren’t making as much as men with the same education level, isn’t that an injustice? So how can I claim something is wrong with feminism, a movement seeking equal rights and equal treatment and equal pay for half the population of the country?


            There’s a point at which dwelling on the crimes committed against a group of people becomes a subtle form of bigotry toward other groups. Jews like to rehearse their long history of persecution for a reason. Focusing on anti-Semitism can bolster solidarity among Jews—if for no other reason than that it fosters suspicion of gentiles. This is not to minimize the true horrors and hatreds faced by God’s chosen people, but rather to point out that no matter how horrible their past is it doesn’t justify atrocities against other groups of people.

            I’m not writing merely to bemoan male-bashing, and I'm not suggesting feminists are guilty of atrocities (though a case could be made that they are). I’m writing because the good cause of equal rights and equal pay shades with distressing frequency into sloppy thinking and unscientific, perfervid preaching. Feminism has become a free-floating ideology, a cause inspiring blind frenzies and impassioned pronouncements about mysterious evils unlikely to exist in the world of living, breathing humans. And, yes, it is unfair to men, mean to boys, and counterproductive to women.

            I am an advocate of universal human rights, and many of my positions overlap with those of feminists. A pregnant woman has the right to choose whether or not to carry her baby to term. Any type of legal or educational enforcement of gender roles is a violation of the right of individuals to choose their own lifestyles, educational trajectories, careers, and the nature of their relationships. But this freedom in regard to gender roles also means that girls and boys, women and men, have just as much of a right to choose to be traditional or stereotypical in any of these domains. Any law or educational policy that goes after any aspect of gender freely chosen or naturally occurring is just as much of an injustice as one that forces individuals to take on roles that don’t fit them.
From a 2011 Gallup Report
           
          If it were true that the figures showing earnings discrepancies in fact represented compelling evidence of hiring or promoting biases favoring men, I would support the cause of reform—not in the name of women’s rights, but in the name of human rights, in the name of fairness. As stark an image as they paint, however, the results of the studies these figures come from are no more proof of bias than a study showing boys win more often in school sports would be proof of cheating. Just as you would have to address the question of how many girls are even playing sports, you have to ask how many women are applying for top-paying positions. Fortunately, several studies have looked at the application and hiring process directly—at least in academic fields.
From a CDC 2011 Report
            Before discussing those results, though, I’d like to point out (only somewhat flippantly) that earnings aren’t the only area in which reliable gender differences occur. Men have more heart attacks than women. And men tend to die at an earlier age than women, heart disease being the single most common cause of death. One of the main concerns of feminists is the so-called objectification of women and, more specifically, the theory that media portrayals of underweight actresses and models instill in young girls the conviction that they must be dangerously skinny to be attractive. Might it also be the case that media portrayals of extremely wealthy men instill in boys the notion that in order to be attractive they must make extremely large incomes, incomes they go to dangerous lengths to secure, say, by working long hours, spending little time with family and friends, ignoring their health, stressing themselves out, and working themselves into early graves?

            A 2010 study published in the Proceedings of the National Academy of Sciences by Daniel Kahneman and Angus Deaton begins its discussion of results thus:

More money does not necessarily buy more happiness, but less money is associated with emotional pain. Perhaps $75,000 is a threshold beyond which further increases in income no longer improve individuals’ ability to do what matters most to their emotional well-being, such as spending time with people they like, avoiding pain and disease, and enjoying leisure. According to the ACS, mean (median) US household income was $71,500 ($52,000) in 2008, and about a third of households were above the $75,000 threshold. It also is likely that when income rises beyond this value, the increased ability to purchase positive experiences is balanced, on average, by some negative effects. [A] recent psychological study using priming methods provided suggestive evidence of a possible association between high income and a reduced ability to savor small pleasures. (4)


Perhaps a monomaniacal lusting after money is a pathology, one that men suffer from in much greater numbers than women. But my point isn’t that I think we should try to do something to protect these men from harm; it’s rather that income is not necessarily an absolute good. So why should it be a benchmark for women’s rights that they make dollar for dollar what men make? We have to at least consider the possibility that women already have it as good as or better than men.

            Still, if a woman wants to go toe-to-toe with her male counterparts to see who can earn more, there should be no institutional barriers hampering her ability to compete. Before we look at those earnings charts and imagine sinister cabals of Scotch-swigging conspirators, however, we must determine whether or not the numbers result from choices freely made by women. “Gender Differences at Critical Transitions in the Careers of Science, Math, and Engineering Faculty” is the 2010 report of a task force established to investigate this very question. The main finding:


For the most part, male and female faculty in science, engineering, and mathematics have enjoyed comparable opportunities within the university, and gender does not appear to have been a factor in a number of important career transitions and outcomes. (153)


How does the study account for the underrepresentation of women in these fields? “Women accounted for about 17 percent of applications for both tenure-track and tenured positions in the departments surveyed” (154). So the plain fact is that women apply for these positions less frequently. Could it be because they despair of their chances for getting an interview? It turns out that “The percentage of women who were interviewed for tenure-track or tenured positions was higher than the percentage of women who applied” (157), which does sound a bit like discrimination—against men. And it gets better (or worse): “For all disciplines the percentage of tenure-track women who received the first job offer was greater than the percentage in the interview pool” (157). Fewer women applying to positions in these fields, not discriminatory hiring or promoting, explains their underrepresentation.

            Reviewing this and several other research programs, Stephen Ceci and Wendy Williams, in a report likewise published in the Proceedings of the National Academy of Sciences titled "Understanding current causes of women's underrepresentation in science", explain that 

Despite frequent assertions that women’s current underrepresentation in math-intensive fields is caused by sex discrimination by grant agencies, journal reviewers, and search committees, the evidence shows women fare as well as men in hiring, funding, and publishing (given comparable resources). That women tend to occupy positions offering fewer resources is not due to women being bypassed in interviewing and hiring or being denied grants and journal publications because of their sex. It is due primarily to factors surrounding family formation and childrearing, gendered expectations, lifestyle choices, and career preferences—some originating before or during adolescence—and secondarily to sex differences at the extreme right tail of mathematics performance on tests used as gateways to graduate school admission. As noted, women in math-intensive fields are interviewed and hired slightly in excess of their representation among PhDs applying for tenure-track positions. The primary factors in women’s underrepresentation are preferences and choices—both freely made and constrained: “Women choose at a young age not to pursue math-intensive careers, with few adolescent girls expressing desires to be engineers or physicists, preferring instead to be medical doctors, veterinarians, biologists, psychologists, and lawyers. Females make this choice despite earning higher math and science grades than males throughout schooling”. (5)

These "math-intensive" fields (Wall Street?) are central to our economy and accordingly tend to mean higher pay for those who choose them. Since the study that compared incomes by gender and education level failed to account for what field the education or career was in, the differences in fields chosen probably explain the difference in pay. The PNAS study authors cite a Government Accountability Office report whose findings accord well with this explanation. Ceci and Williams write that

the GAO report mentions studies of pay differentials, demonstrating that nearly all current salary differences can be accounted for by factors other than discrimination, such as women being disproportionately employed at teaching-intensive institutions paying less and providing less time for research. (4)

Conservatives are fond of the principle that equality of opportunity doesn’t mean equality of outcome. Though they apply it in demonstrably wrong ways when it comes to economic inequality in general (since inequality and mobility are negatively correlated), the principle itself is sound. I have no doubt that some men are barring the doors of employment to some women in America today. There are probably places where the reverse is true as well. But feminism is a body of facile assumptions that leads to ready conclusions of questionable validity. The assumption of discrimination when faced with earnings discrepancies is just one example.

Feminism is the political and social effort to attain equality between the sexes. While this sounds perfectly innocuous, even admirable, it frames relations between women and men as fundamentally antagonistic; it’s us versus them. Even a whiff of tribalism tends to make otherwise admirable efforts take tragic turns. How many relationships have been undermined by the idea that difference means inequality means oppression, by the notion that within every man lurks the impulse to dehumanize and dominate women?

In future posts, I’m going to look at the faulty assumptions inspired by feminism in the realms of sex and attraction—i.e., the bizarre notion of objectification—and in the upbringing of children, where so much pointless hand-wringing takes place over whether gender stereotypes are being subtly imposed. For now, I’m going to close with some questions from a graduate-level textbook, Theory into Practice: An Introduction to Literary Criticism by Ann Dobie. They’re from a section devoted to helping burgeoning scholars learn to write feminist essays about literature. The idea is to pose these questions to yourself as you’re reading. See if you can spot the assumptions. See if you think they’re valid or fair.

-What stereotypes of women do you find? Are they oversimplified, demeaning, untrue? For example, are all blondes understood to be dumb?
-Examine the roles women play in a work. Are they minor, supportive, powerless, obsequious? Or are they independent and influential?
-How do the male characters talk about the female characters?
-How do the male characters treat the female characters?
-How do the female characters act toward the male characters?
-Who are the socially and politically powerful characters?
-What attitudes toward women are suggested by the answers to these questions?
-Do the answers to these questions indicate that the work lends itself more naturally to a study of differences between the male and female characters, a study of power imbalances between the sexes, or a study of unique female experience? (121-2)

In case you missed it, let me quote from the first page of the chapter: "The premise that unites those who call themselves feminist critics is the assumption that Western culture is fundamentally patriarchal, creating an imbalance of power which marginalizes women and their work" (104). While I acknowledge the assumption was historically justified, I have a feeling people will keep making it long after its promise of a better tomorrow is exhausted.
Read part 2: The Objectionable Concept of Objectification
and part 3: Engendering Gender Madness
Read my response to commenters.

Gravitating Toward Tribal: The Danger of Free-Floating Ideologies

Image from the movie Zardoz. Courtesy of Thersic.com
            Ideologies are usually conceived through a coupling of comfortable tradition with a calculation of self-interest. But they can also be born of good-faith efforts at understanding. More important than their origin and development is the degree to which they are grounded. If you work out a comprehensive and adequately complex ideology that serves to explain an otherwise incomprehensible phenomenon, and possibly even offers some guidance for dealing with an otherwise chaotic and frightening dynamic, you’ve created a theory that will appeal to human minds desperate for understanding and a sense, no matter how meager, of control. But does the ideology match up with reality? That’s an entirely different question.

            Free-floating ideologies, those that persist solely owing to the comforts they provide and the conveniences they secure, survive confrontations with reality and subsist despite vast lacunae in empirical support because human perception operates through a process of cross-referencing sensory inputs with prior knowledge. What we see is largely determined by what we’re looking for, and how we see it by what we believe about it. Patterns arising in what ought to be random incidents often sustain beliefs—even though in most contexts humans are terrible at calculating probabilities. A natural confirmation bias has us perceiving and remembering all the times predictions arising from our theories come to fruition while missing or forgetting all the times they fail. We tend to enjoy the company of like-minded others, and, rather idiotically, we have our convictions bolstered when they’re shared by those with whom we’ve chosen to associate.

            Unmoored ideologies gravitate toward certain predictable tracks in human cognition. We like to think there’s some sort of agency behind everything, an intelligence governing the universe. To think that no one’s in charge of all the swirling and colliding galaxies is variously unsettling and terrifying to us. So we take in the sublime beauty of quiet sunsets and wonder at the beneficence of the creator. Or we note coincidences in our lives, the way they fall together in a meaningful, beneficial way, and we feel a need to express gratitude to the guiding divinity. This is mostly innocent. Though it can lead to complacence and willful ignorance of entire regions where this supposedly beneficent guide has deigned never to set foot, and it can add an extra layer of grief in response to catastrophe, the comfort of believing in an invisible protector and guide has little immediate cost.

            Much more worrying is the gravitation of free-floating ideologies toward tribalism. The pseudo-scientific cult that has arisen around certain varieties of psychotherapy has bequeathed to our culture the horrifying belief that an unknown portion of the population, predominantly male, can induce the modern equivalent of demonic possession, severe psychological trauma, through an inverted laying-on of hands. The ideology has made monsters of men. The fetishizing of free markets likewise entails a belief in a loathsome variety of sub-humans. The economy, true believers assert, is a battle between the makers and the moochers, the producers and the parasites. As a conservative friend put it in a discussion of healthcare reform, “Giving insurance to the slugs will just make them bigger slugs.”

            If you challenge someone’s beliefs by suggesting theirs is an ideology divorced from reality, as everyone does who advocates for one set of beliefs in opposition to another, the proper response is to insist that the ideology emerged from an awareness of facts through inductive reasoning. But sunsets, no matter how sublime, don’t really provide any evidence for the existence of an intelligent agency behind the curtain of the cosmos. Troubled young women with histories of abuse don’t prove that sexual experiences in childhood cause a wild assortment of psychological maladjustments. And the higher incarceration rate for impoverished groups doesn’t in any way establish some fundamental divide between good and bad types of people.

            Once ideologies reach a certain stage of development, they become all but immune to contradictory evidence. When the facts cooperate, they are trumpeted. When they don’t, the devout have recourse to principles. I’ve referred advocates of particular varieties of psychotherapy to evidence that they’re ineffective. In response, I didn’t get references to other bodies of evidence supporting the beliefs and practices in question; rather, I got an explanation of how the therapeutic techniques were supposed to work. Present a free market purist with evidence that market competition doesn’t lead to innovation, or leads to detrimental innovations, and you’ll likely get a lecture explaining the principles behind how it’s supposed to work, according to the free market ideology, rather than evidence that it does, in fact, work in the theorized way. This convenient toggling back and forth between inductive and deductive reasoning allows us to explain away any disconnect between our ideologies and the world.

            It is the tendency of free-floating ideologies toward tribalism that leads me to advocate a strict adherence to science in matters of public concern. It wasn’t merely coincidence that the Enlightenment marked the inception of both the tradition of science and that of universal human rights, which have suffered through a traumatic childhood of their own and are now living out a tumultuous adolescence. The tendency toward tribalism is also why I’m wary of commercial fiction, which almost invariably makes characters represent ideas and personal qualities, only to pit the good guys against the bad. J.K. Rowling can claim all she wants that the Harry Potter books teach kids the evils of bigotry, but any work with goodies and baddies taps into tribal instincts. Literary fiction, on the other hand, at its best, is an exercise in empathy.

Beliefs that Make You Feel Good Make You Look Good Too—But You’re a Total Asshole if You Let That Influence You

Imagine you are among a group of around thirty people on an island, and over the past few weeks you’ve learned of the presence of another group living on the same island, one which has been showing signs of hostility toward your own. Because of your wisdom, your group has assigned you the task of convening a selective gathering to devise a strategy for dealing with the looming threat. Among your group there happen to be several people with military training as well as some with experience in diplomacy. There are also individuals claiming psychic powers and religious authority. You understand that the composition of the gathering will be among the most important factors determining the consensus strategy it arrives at. Who do you invite to participate? Who do you exclude?

(Full disclosure: the first strategy that occurs to me is to find a way to get the rival group’s attention and then execute the psychics and religious authorities for them to witness, letting them know afterward this treatment is what they can expect from us should they decide to continue their hostility.)

Beliefs have consequences. A psychic in our hypothetical group may be convinced that he’s seen the future and in it the home group stands victorious, having suffered no casualties, over the rival group. This vision allows an otherwise outvoted military aggressor to persuade everyone else a violent raid is the best course of action. A religious leader may feel it incumbent on her to serve as a missionary to the savages. This may lead to an attempt at diplomacy which backfires by offending the rival group’s own religious sensibilities. The fate of the home group is at stake. Whose opinions do you seek?

This imaginary scenario is meant to illustrate the point that an individual’s beliefs inevitably contribute to the culture and ultimately influence the fate of societies. While it is true that the larger the society the smaller the impact of any one person’s ideas, it is likewise the case that, through a mechanism called social proof, the stated ideas of individuals have multiplier effects far beyond what any one person believes. Social norms are a major determiner of what people accept as true. And many people may never question a piece of conventional wisdom simply because it has never occurred to them to do so—at least not until they encounter someone who espouses wisdom of an unconventional strain.

This point may seem obvious enough, and yet it represents a major departure from the dominant approach to considering beliefs in American culture. When confronted with a new idea Americans automatically and unconsciously apply a rigid formula to assessing its merits: they ask, first, how would believing this idea make me feel, and, second, how would believing this idea make me look to others? The order of these questions may be reversed, but no other questions ever enter the equation. The foundation of our culture is an ethic of consumerism, and so people decide what to believe exactly the same way they decide what music they want to claim as their favorite, and the same way they decide what type of t-shirt they’ll wear to advertise their personal style.

Savvy marketers, public relations experts, and profiteering charlatan shitbags are well aware of the extent to which consumerism determines our beliefs and behaviors. There’s no shortage of people in this country who will have nothing to do with politics because the topic is just not sexy at all; they know politicians are considered dishonest, petty, and even corrupt. Who would want to associate themselves with that? This general distaste for government and its policy disputes derives much of its fuel from each party’s attempts to brand the other in as off-putting a way as possible. I haven’t seen a survey that establishes the link, but I’d wager where people fall on the political spectrum is largely determined by whether they'd find it less acceptable to be thought of as naïve and effete or to be thought of as callous and lacking in compassion.

I try, as much as possible, to adhere to the Enlightenment values of devotion to science and championing of universal human rights. When people of the consumerist mindset discuss their beliefs with me, they are often baffled as to why I would insist on scientific skepticism with regard to supernatural ideas and pop culture myths. Science is so dry and mechanical. So, when I tell people what I believe, I usually get one of three responses: the first is to assume that my knowledge about research on some issue must be completely independent of my beliefs, because beliefs are personal and science is not. “Okay, you’ve told me what you know about the results of some experiments. But what do you really believe?”

The second response, equally in keeping with the consumerist ethic, is to assume that anyone so devoted to science must be a dry and mechanical person, the type who is incapable of tapping into his intuition, who insists on cold hard facts and bloodless statistics. After all, the reasoning goes, this guy chose his beliefs based on how he wanted to represent himself, so if he’s spouting off stats and experimental results he must have a pretty limited and robotic personality. It should go without saying—but unfortunately it doesn’t—that this reasoning is based on a gross misunderstanding of science and statistics alike. But the other mistake implicit in this response is the assumption that people can only decide what to believe according to how they want to represent themselves to others.

And yet it’s the third response that’s the most troubling. When you listen to someone’s beliefs about, say, supply-side economics, or religion, or alternative medicine and then start going into detail about why those beliefs are almost certainly wrong, many people will immediately conclude that there’s an ulterior motive behind your scientific skepticism. Because you have such a strong tendency to reject other people’s beliefs, they reason, you must simply be the type of person who enjoys making other people feel and look stupid. It’s not enough to wear your own favorite brand of t-shirt; you have to ridicule other people’s fashion sense. People who respond this way—you know who you are—can be counted on to violently assert themselves when you challenge them. They take your arguments very personally.

The true reason I’m devoted to science, though, is that I take responsibility for the consequences of my beliefs. What you believe has a direct impact on the culture around you, and an indirect impact on the course of society at large. If you like the fit of supply-side economics, if you explain to anyone who’ll listen how wealth at the top trickles down, and if you vote for conservative politicians, then you’re responsible for the results, positive or negative, of the implementation of those policies. In point of fact, the most reliable outcome of these policies is greater income inequality, which is associated with a host of societal ills from increased violent crime to higher infant mortality. I would argue that those signing on to the conservative agenda after these facts were established are complicit in the perpetuation of these social problems.

The position you take on any issue with broader social implications inevitably becomes more than a personal choice. And it’s more difficult than you may assume to come up with issues that don’t have broader social implications. Where, for instance, was your t-shirt made? What were the conditions the people who made it were working under? What effects did its manufacture have on the surrounding ecosystems? The plain fact is that any pure application of the consumerist ethic, whether to your choice of clothing or to what religion or political party you support, is profoundly irresponsible.

In my first novel, which I recently completed, the characters address issues concerning recovered memories of child abuse. This is a topic I began researching as an undergrad studying psychology. It turns out the best research rules out the theory of repressed trauma with a high degree of certainty. Now, it shouldn’t require any great deal of trust on your part to believe I have no desire to associate myself in any way with the issue of child abuse, especially in any way that entails a risk of being perceived as wanting to defend or advocate it. But there are men in prison today convicted solely on the basis of evidence from recovered memories. If I simply toed the conventional line and neglected to thoroughly research the issue, or worse, if I ignored the products of that research, I would be complicit in the imprisonment of innocent men. This complicity extends to the seemingly innocent act of remaining silent when others around me are expressing views I know to be in error.

The tendency to rely on pure consumerism to assess ideas and to fail to take responsibility for their consequences is a trap all too easy to fall into. I can almost guarantee the shirt on your back right now was made in a third-world country under conditions you’d literally kill to keep your own children safe from. But most Americans are blithely ignorant of this. And I can attest it is exceedingly difficult and prohibitively expensive to limit your purchases to products made under more humane conditions. Manufacturers depend on American consumers being ignorant and irresponsible. And yet, under some circumstances, people’s reasoning becomes eminently practical. When your child gets sick, the sexiness of holistic medicine doesn’t lure you away from doctors trained in scientific medicine—though you may backslide if that first visit fails to cure them.

But how, you may ask, do you express your individuality if you are so committed to science? Alternatively, how can others assess your personality through your beliefs if they’re all based on some scientist’s research? Well, even if research were to prove somehow that it’s better to be extroverted than introverted, people have little control over such things. So it is with most personality traits. Science may also offer some hints about characteristics I ought to look for in a romantic partner, but ultimately which woman I pair up with will be determined by factors beyond the scope of any research project. Not every personal decision you make has wider societal consequences. Anyway, there’s plenty of room for individuality even for those of us thoroughly committed to taking responsibility for our actions and beliefs.

Objectification isn't a Valid Complaint

Just as some attractive women become conditioned to rely on their looks to achieve status and other social ends, many educated women deem it degrading to go out of their way to turn on a man, or at least they claim as much in polite conversation. This is probably owing to their exposure to feminism, which has women shoring up their reservations and inhibitions in the name of dignity, lest they be objectified—even though objectification is a nonsensical concept. Men are not typically aroused by inanimate objects; they’re turned on by flesh-and-blood women (unless they’re homosexual, of course). Even for fetishists, the object of their desire arouses them by dint of its association with living, human persons. One hears much more often of fetishes for high-heeled shoes than for, say, tables or clocks.

Objectification owes its name to the idea that men, left to their own patriarchal devices, won’t pay proper heed to a woman’s subjectivity, to what’s going on in her head. But if you look at the epitome of the supposed crime, you’ll see this idea is pretty easily ruled out. If men were given to ignoring what women were subjectively experiencing, then pornography would feature just as many women lying inert while doing the deed, or having the deed done to them, as women who, shall we say, overact. Even when men appear eager to discount women’s subjectivity, the performance underscores that they couldn’t ignore women’s experience if they tried—they’re preoccupied with it. You can still argue that men one-dimensionalize women, but then there are plenty of one-dimensional people of both genders out there.


Genuinely celebrating one another’s sexual dimension in no way precludes a deep appreciation for your partner’s other qualities; if anything, that appreciation grows as the sex gets better. But what ends up happening is that nonsense about objectification gives women an excuse not to make any attempts in a realm where failure is resoundingly and lastingly mortifying—welcome to men’s world.


Is it a problem that men can be sexually aroused by features of women that have nothing to do with their characters? If so, it must be borne in mind that men aren’t the only ones at fault. But the real problem is that some people are so zealous in their efforts to politicize any and every aspect of relations between men and women that they’ve long since ceased to care whether their ideas have any validity or whether they lead to any greater happiness or fulfillment in the lives of those influenced by them. Sex has political implications, but it’s a physical act. And if a man got turned on by his partner’s lofty orations about her master’s thesis, that would probably be offensive in its own right.

Cults and Conversion Narratives

            There are three main positions you can take that will inevitably spark an argument where I’m from. And the people who jump up to disagree always do so with the same strategy: they tell a story. If you tell people here you’re an economic liberal, you may get a brief refresher course on supply-side theory, but when you continue to disagree after hearing it, a story will inexorably follow which features the storyteller as a hero battling his or her way up from poverty into the proud and comfortable middle class. The implication is supposed to be that since the storyteller made it, it must be possible for everybody to make it. Hence financial safety nets and programs for the poor funded by the rich must be misguided ideas bound to fail.

            If you tell people you don’t believe in any god, there’s a slight chance you’ll get some inarticulate rehashing of the Argument from Design, but you’re much more likely to get a story. On this topic, there’s quite a bit of variety in the stories people tell. If the storyteller doesn’t have any loved ones who have died, you’ll likely get a story about an encounter with the supernatural. These stories always end with a statement along the lines of “There’s just no way to explain that,” or “There’s no way that could’ve been a coincidence.” But if the storyteller has had a loved one die, the story will be about how that loved one managed somehow to communicate with him or her from beyond the grave to let them know “they’re okay” and “they’re waiting for me.” This supposedly proves there’s life after death, which somehow establishes that some all-powerful deity presides over it.

            If you tell someone you don’t accept the theory of repressed memories, or point to evidence that there’s nothing especially damaging about childhood sexual abuse when compared to any other form of child abuse, you’ll first be called some choice names, then you’ll be accused of pedophilia yourself, and then finally you’ll get the poor woman’s story. There’s a lot of variation to these stories as well. But of course they all feature a male character in the role of evildoer. And they all end with a statement about how the storyteller continues to struggle with the resulting emotional turmoil and haunting memories to this day. (Repression and Severe Personality Disturbance from CSA are myths 13 and 34, respectively, in 50 Great Myths of Pop Psychology.)

            No matter which of the three topics you’re discussing, the storyteller will feel exhilarated at first because it seldom happens that they get a chance to spout wisdom to someone so hopelessly naïve. If you hold any of these three unpopular positions, you’ll get to hear lots and lots of stories, as if each storyteller is convinced theirs will be the story that finally converts you. But when you respond to their stories with alternative theories, describe ingeniously designed experiments, rattle off statistics, they’ll start to get uncomfortable. The next stage of the discussion will invariably entail the storyteller making a straw man of you: because you don’t answer their stories with your own, it’s assumed you don’t have any, and the reason is plain—you spend all your time reading. What follows will be a disparagement of “book learning,” an angry dismissal of what “you learned in some book,” and the general suggestion that you’ve lived your life sealed up in an Ivory Tower.

            I am a humanist. I believe the best we can do for humanity is to spread enlightenment principles as far and wide and as in depth as possible. That’s why I’m skeptical of all these conversion narratives. It’s not just that the evidence doesn’t support them. Each one of them implicitly conveys a message of tribalism. The hero of the rags-to-republican story is suggesting he or she made it because they were virtuous, that the people who don’t make it have only themselves to blame. And don’t get them started on that shadowy outgroup, the government. It’s us versus them and we’re better. The very basis of our ideas of good and evil rests on our innate proclivity to confuse the abstract with the supernatural. If you establish that even one supernatural event has occurred, you’ve simultaneously proven that some cosmic order underlies all existence. There are believers and infidels, saints and sinners. And if nearly every young girl in the world is living in the shadow of molestation by some unredeemable male predator then we must all mobilize to do battle against this great evil. You’re either with us or you’re one of them.

            I do not accept the idea that man is fallen, or that humans are. As a humanist, I believe that we are the most exalted beings on the planet, and quite likely among the most exalted beings in the universe. We need to act on behalf of humanity, not for some invisible entity whose interests can never be known, not for any subgroup we see as superior by dint of our individual membership in it. If you can only defend your beliefs with conversion narratives, then you are a member of a cult. And our division into such cults is precisely the impediment we need to overcome. The solution to problems like war and poverty and child abuse lies not in converting more members to this or that cult, but in our ingenuity and imagination. Just look what we’ve accomplished. Imagine what else we could accomplish.

            Are all these conversion narratives completely false then? Personality psychologist Dan McAdams, in his book The Stories We Live By: Personal Myths and the Making of the Self, describes identity as “an inner story of the self that integrates the reconstructed past, perceived present, and anticipated future to provide a life with unity, purpose, and meaning.” To an adult, childhood is a welter of floating details and vague impressions. McAdams suggests that at some point we structure all this ambiguity into a set of narratives. Every time we recall an event we reinterpret or “reconstruct” it, making our memories much more malleable than most of us are comfortable admitting. The problem comes when we reconstruct the past into a narrative that gives us purpose. Too often that purpose consists of recognizing or acknowledging evil and thenceforth doing battle against it. What we fail to realize is that the supposed evildoers have their own narratives.

            The culture in which we develop our identities provides the raw material of wider narratives for us to sample. Sometime in our late teens or early twenties, we choose elements from one or two of these and subsequently go back in time to carve the formless block of our pasts into sculptures resembling the selves we want to be. (This happened for me when, at twenty-two, I read Carl Sagan’s The Demon-Haunted World.) Some of our memories may better lend themselves to integration into particular narratives, so it’s not as though our pasts have no bearing on who we become. But it’s also probably true that we overestimate the significance of any given experience because it’s hard to accept how insignificant most experiences are. Such thinking leads to existentialism, a doubting of all purpose. But I have a purpose. I am a humanist. I especially enjoy a good story—just not one with good guys and bad guys.