Tuesday, January 27, 2015

Typhoid Mary and the Anti-Vaxxers

The measles outbreak in Southern California has been generously made possible by California law's "personal belief exemption," which allows adults to refuse vaccinations for their children or themselves based on their so-called "personal belief" that vaccines cause autism. Here "personal belief" is stretched to cover not simply religious and moral teachings -- the issue isn't that some religion teaches that vaccination is morally wrong -- but factual errors. This allows people in Orange County to construe medical fallacies as "belief." So it's time for a short history lesson about Mary Mallon, who went down in history as Typhoid Mary, and her commitment to her personal medical beliefs.

About a hundred years ago, some doctors told Mary Mallon that she was carrying the germs that cause typhoid fever. She had no symptoms, because she herself was immune to the disease (she might have had it when young and survived). But the doctors told Mary she was a typhoid carrier, one of the first ever discovered: someone who was infected with the disease and could infect others, even if she never felt sick herself. Mary didn't believe them.

After all, she wasn't sick. She was never sick. She certainly didn't have typhoid fever. So how could other people get typhoid fever, which she didn't have, from her? It made no sense. She preferred that the doctors go about their own business and let Mary get back to her own. Mary didn't believe in the idea of a "disease carrier." More broadly, she didn't really accept the whole germ theory of disease. It just didn't make sense to her.

So Mary just kept on doing what she was doing. Which was working as a cook.

Of course, everywhere Mary worked, numbers of people who'd eaten her food began coming down with life-threatening cases of typhoid, and a few of them actually died. This was how the doctors had originally found Mary and diagnosed her as a disease carrier: she was the one person who had worked in every kitchen involved in a mysterious string of dangerous typhoid outbreaks. But what could Mary do? She didn't understand why this kept happening, but it clearly wasn't her. She was healthy as a horse. She just needed to keep looking for another kitchen job. She kept finding them.

Did I mention that Mary didn't believe in washing her hands before preparing food? Mary didn't see the point. She wasn't sick, so what could happen?

Eventually, Mary was put in enforced medical isolation; the legal mechanism might have been a little hinky, but eventually the authorities couldn't let her keep going from cooking job to cooking job and infecting people. (At least three people Mary cooked for over the course of her career died; there may have been more.) They finally decided that Mary Mallon did not have the freedom to disbelieve the doctors if she was putting public health at risk. Her personal belief that she was not infectious was outweighed by the fact that she kept infecting people.

After a few years of forced isolation, they let Mary out. They had trained her for a new job, as a laundress, which was basically safe. As long as Mary didn't prepare food for people, everything would be okay.

But Mary preferred cooking, and it paid better than the laundry did. So after a while she took an assumed name and began hiring herself out as a cook.

After continued outbreaks, they put Mary back in isolation for the rest of her life. Was this an infringement of her liberty? Certainly. Her liberty was taken away from her entirely, because she insisted on endangering other people. What Mary believed, or refused to believe, was ultimately not the point.

I've been thinking about Mary a lot lately, because of the anti-vaccine movement. Our culture gives a lot of deference and liberty to people's beliefs, and rightly so. But refusal to believe a scientific or medical fact is not a belief. You can believe that God loves everyone, or that the good in human nature outweighs the bad. You can believe that God doesn't want you to eat cheeseburgers or shellfish. But you are not free to believe that mental illness is caused by sleeping in the moonlight. You are not free to believe that eating pork causes leprosy, or that fluoride in the municipal water supply is a mind-control drug. You are not free to treat your child's case of flu with bleeding or leeches. These are not beliefs. These are mistakes. They might be harmless mistakes. But if they grow to the point that they endanger others around you, you lose any right to them. You are not free to smoke in an enclosed public space because you believe that smoking has nothing to do with cancer. You are not free to have unprotected sex after an AIDS diagnosis because you don't believe that AIDS is sexually transmitted. You are not free to drive your infant around in a car without a car seat; medical evidence has accumulated to the point where that decision has been legally taken out of parents' hands.

There is a deep American conviction that we are entitled to our beliefs. But this is true only of things that are genuinely beliefs, because they cannot be tested for truth or falsehood. "Jesus loves me" or "Our people were singled out by God" are not testable claims in the conventional sense. They are choices of perspective. "MMR vaccine causes autism" is not a belief of this kind. It is a claim of fact that can be tested. And those tests have proved it false. We are entitled to our own values. We are not entitled to simply make things up. That was Typhoid Mary's mistake.

cross-posted from Dagblog

Monday, January 05, 2015

Your New Year's Public Domain Report, 2015


I'm late with my annual public-domain update this year. But that's okay, because yet again this year, nothing new entered the public domain this January 1. That's right: because of repeated extensions of the copyright laws in the US, no copyrights expired this year. Or last year. Or the year before. Almost none have since January 1, 1998.

American copyright law started out by specifying a 14-year term, renewable once to provide 28 years of exclusive protection. That was very much in line with the original 18th-century copyright laws in Britain. By 1976, that 28 years had crept up to 56. But that year Congress passed a new copyright act, extending terms to either fifty years after the author's death or (in the case of previously existing copyrights) 75 years from the work's publication. The law didn't go into effect until 1978, and it didn't revive copyrights that had already expired. So under the new 75-year term, works published in 1922 entered the public domain on January 1, 1998. Works published in 1923 did not, and still haven't.

Even with that extension, those works from 1923 would have become public on January 1, 1999. But in 1998 Congress passed another extension, the Copyright Term Extension Act, nicknamed the Sonny Bono Act (after one of its sponsors), which added another 20 years to copyright terms. Now previously-copyrighted works stayed in copyright for 95 years. As we get closer to 2019, we can expect intense lobbying by large media companies to pass yet another extension, defying the Constitution's mandate that intellectual property be protected only "for limited times." (Article I, section 8, clause 8.)
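For the arithmetically inclined, the pattern behind all of these dates can be captured in a few lines of Python. This is my own sketch, not anything official: it uses the simplified rule that a work enters the public domain on January 1 after its term of protection runs out, glossing over the statute's finer details.

```python
# A sketch of the copyright-term arithmetic, using the simplified rule
# that a work enters the public domain on January 1 after its term ends.

def public_domain_year(published: int, term: int) -> int:
    """Year a work published in `published` enters the public domain
    under a copyright term of `term` years."""
    return published + term + 1

# 1923 works under the 1976 Act's 75-year term:
print(public_domain_year(1923, 75))   # 1999
# ...and under the 1998 extension's 95-year term:
print(public_domain_year(1923, 95))   # 2019
# Batman (1939) under the 56-year maximum in force at his debut:
print(public_domain_year(1939, 56))   # 1996
# 1939 works under the current 95-year term:
print(public_domain_year(1939, 95))   # 2035
# 1958 works (Things Fall Apart, Vertigo) under the 95-year term:
print(public_domain_year(1958, 95))   # 2054
```

The same one-line function reproduces every date in this post; only the term length changes, which is the whole story of copyright extension.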

So there's nothing under our public-domain tree this morning. But let's look at what would have become public domain if not for these laws.

If not for the Sonny Bono Copyright Term Extension Act:

The major headline this New Year's Day would have been Batman's entry into the public domain. Batman would be on his own for a while, without famous supporting characters like Robin, Catwoman, or the Joker, but they would be entering the public domain in 2016 and 2017. For this year, it would just be Batman and Commissioner Gordon. Under the laws in force when he debuted, of course, Batman would have become public domain in 1996. I'd like to say better late than never, but late is threatening to turn into never.

At the movies, Gone with the Wind, The Wizard of Oz, and Mr. Smith Goes to Washington would all become public domain this year. So would the rest of the many, many great films produced in 1939, including Stagecoach, Of Mice and Men, Goodbye, Mr. Chips, and Dark Victory. And let's not forget Beau Geste, Babes in Arms, At the Circus with the Marx Brothers, Gunga Din, Each Dawn I Die, The Hunchback of Notre Dame with Charles Laughton, Ninotchka, Intermezzo, The Women, Son of Frankenstein, old Dr. Cleveland favorite The Roaring Twenties with James Cagney, and of course Laurence Olivier in Wuthering Heights. Installments in the Thin Man, Andy Hardy, Charlie Chan, and Mr. Moto series would leave copyright, as would classic serials starring Zorro, Dick Tracy, Buck Rogers, and the Lone Ranger.

In music, "God Bless America" should enter the public domain this week, as should another important American classic: Billie Holiday's "Strange Fruit." Also in popular music, "Back in the Saddle," "All the Things You Are," "At the Woodchopper's Ball," "Brazil," "Go Fly a Kite," "Heaven Can Wait," "I Get Along Without You Very Well," "In the Mood," "The Lamp Is Low," "Lydia the Tattooed Lady," "Over the Rainbow," "Moonlight Serenade," "South of the Border," "When You Wish Upon a Star," and "Tuxedo Junction" would all enter public domain. So would "Darn That Dream" and Cole Porter's "Give Him the Ooh-La-La," "I've Got My Eyes on You," and "Well, Did You Evah?" -- not necessarily Porter's best year, but pretty good for the rest of us. In classical music, Shostakovich's 6th Symphony, Prokofiev's Alexander Nevsky, and William Walton's Violin Concerto would all be leaving copyright.


Finnegans Wake, The Snows of Kilimanjaro, and The Grapes of Wrath should be entering the public domain. So should Johnny Got His Gun, The Big Sleep, The Day of the Locust, Goodbye to Berlin, Tarzan the Magnificent, Pale Horse, Pale Rider, At Swim-Two-Birds, and Saint-Exupéry's Wind, Sand and Stars. Mystery novels by Eric Ambler, Dorothy Sayers, Ellery Queen, Rex Stout, and three by Agatha Christie would become free from copyright. Brecht's Galileo, Hellman's The Little Foxes, and Eliot's The Family Reunion would become free for all to perform, as would The Man Who Came to Dinner, The Time of Your Life, Arsenic and Old Lace, and The Philadelphia Story. Poems by Frost, Auden, May Sarton, Edna St. Vincent Millay, Dylan Thomas, Archibald MacLeish, Muriel Rukeyser, and Louis MacNeice would enter public domain, as would the last of Yeats's poems and Eliot's Old Possum's Book of Practical Cats, the children's book that became the musical Cats.

According to Congress, no one has had a fair chance to make a profit off these works yet, and they will stay in copyright until at least 2035.

If not for the 1976 Copyright Act:


Chinua Achebe's classic Things Fall Apart would be entering public domain, as would Suddenly, Last Summer, Krapp's Last Tape, The Unnamable, The Dharma Bums, Breakfast at Tiffany's, Playback, Our Man in Havana, Pinter's The Birthday Party, and of course Dr. No. So would poems by William Carlos Williams, e e cummings, Lawrence Ferlinghetti, Gregory Corso, Muriel Rukeyser, Theodore Roethke, John Berryman, John Betjeman, and Djuna Barnes.

Among the films newly available in public domain would be Vertigo, Gigi, Cat on a Hot Tin Roof, South Pacific, No Time for Sergeants, Auntie Mame, and The Vikings. Also entering public domain would be The Blob, Attack of the 50 Foot Woman, Hercules, Kurosawa's The Hidden Fortress, The Fly, Touch of Evil, The Revenge of Frankenstein, and Run Silent, Run Deep.


It would be a banner year for fans of early rock and roll, with new public-domain hits like "Johnny B. Goode," "Chantilly Lace," "16 Candles," "All I Have to Do Is Dream," "Donna," "Do You Want to Dance?," "Maybe Baby," "Yakety Yak," "Sweet Little Sixteen," "Summertime Blues," and of course, "The Purple People Eater." Folkies would get public-domain access to Pete Seeger's "If I Had a Hammer" and "Kumbaya." Also in pop music, "Volare" and the themes from Rawhide and Peter Gunn would enter public domain. So would classical pieces by Benjamin Britten, Dmitri Shostakovich, and John Cage.

However, all of those works will remain in private hands, usually meaning in the practical control of large corporations engaging in rent-seeking behavior, until 2054 at the earliest. Apparently, none of them count as classics yet. If you don't want to wait even longer than 2054, tell Congress next time copyright-extension time comes along.

cross-posted from Dagblog

Wednesday, December 31, 2014

Shakespeare "Authorship Debates" and Amateur Scholarship

So, just in time to ruin my New Year's celebrations, Newsweek has seen fit to publish a credulous article trumpeting the old who-wrote-Shakespeare conspiracy theories. I won't give Newsweek a link, but you can click through Amanda Marcotte's smart takedown at Rawstory if you're curious. The original piece is full of breathless non-facts like "Nobody ever recognised Shakespeare as a writer during his lifetime" [except for at least three dozen separate individuals, writing both in print and manuscript, because Shakespeare was famous], "and when he died, in 1616, no one seemed to notice" [except the six different poets who wrote memorial verses for him]. Apparently you can always say "there's no evidence," even when there is evidence.

Now, I'm on record about this question on this blog, and under my professional name, and I've been quoted about it in a major newspaper, so I don't want to belabor the key facts here. As the above example suggests, this isn't really a debate about facts anyway. But this phony debate often gets cast as insiders vs. outsiders, the stuffy Shakespeare establishment, with all the PhDs and whatever vs. the free-thinking, imaginative amateur scholars. So I'd like to clarify a few things about how academic and amateur Shakespeareans work.

1. Professional Shakespeareans constantly argue with each other and are rewarded for new ideas.

The standard position of the Francis Bacon/Earl of Oxford/etc./etc. fans is that "orthodox" Shakespeareans are all sticking together because we are afraid of new ideas. This ignores the fact that academic Shakespeare scholars argue with each other constantly about any question that can reasonably be disputed. Winning arguments with each other is how we get ahead in our careers. And winning an argument that brings in a big new idea, or overturns an important old idea, is the gold standard. The academic Shakespeare establishment isn't a conspiracy. It's a boxing ring.

This is one of the reasons that academic writing can be hard for general readers to enjoy: it focuses on highlighting the new idea that the writer is putting forward, rather than the ideas that the reader might find most interesting. Something that's interesting to you as a reader but that every scholar has agreed on for the last fifty years won't get much attention, while today's new idea, even if it's quite small, will get the most attention. And because every argument a scholar puts forward is liable to be torn apart by other scholars, scholarly writing tends to be carefully hedged, shoring up even pretty small points so that they don't give another critic an opening. That's another reason academese is hard to read.

I don't write my scholarship to highlight how much I agree with more established Shakespeareans. It's just the reverse. I once criticized something written by the then-head of the Shakespeare Birthplace Trust (whom many Oxfordians especially dislike) so, ah, energetically that I was publicly accused, in print, of having been "unfair" to him. (Of course, I don't think that I was unfair, but hey, to offend and judge are distinct offices.) Scholarly writing demands pointing out where other scholars are wrong.

A member of the "Shakespeare establishment" who could make a strong case that Shakespeare's works had been written by someone else would stand to benefit enormously. Even if it weren't a completely bullet-proof case, the rewards for making a reasonably strong case, opening room for legitimate doubt, would be huge. You'd immediately become a major player in the field. If I thought I had the evidence to back up a case like that, you'd better believe that I would make it. And so would a lot of other people like me. Yes, that would mean publicly disagreeing with many important senior scholars; that would only make it sweeter.

(On the other hand, the reward for believing Shakespeare wrote Shakespeare is nothing, just like the reward for believing that the sky is blue and water is wet is nothing. No one beats someone else out for a job because they both believe the same thing that no one else doubts. One of the frustrations many literary scholars have teaching beginning undergraduates is those students' deep commitment to arguing things that are so obviously true that they're not worth bringing up; making arguments like that is not what professional academics value at all.)

The reason I don't make a case for someone else writing Shakespeare is that I can't. The reason that a large group of other people inside the academic world haven't done it is that they can't either. If there were evidence to make a good case, someone would certainly be ambitious enough to make it. But it never happens.

2. Amateur scholars are welcome in academic debates.

One of my generation's two greatest historians of Shakespeare's theater is an independent scholar named Dave Kathman, who doesn't have a university job or a PhD in literature. Dave works as a financial analyst in Chicago, and does the Shakespeare-theater-history thing as a hobby. But he's enormously productive and valuable as a scholar. There's only one PhD-holder in my generation who's more important to that specific field than Dave is. (That scholar is an Oxford professor, very much part of the establishment.) Dave has found original documents that we had not known about, because he looked in archives people had not thought about trying. So suddenly, thanks to Dave, we have apprenticeship records for Shakespeare's boy actors. We can prove when they joined the company, and we can closely estimate their ages. It used to be we knew very little about the boys who played female parts, but now we know more about them than we know about some of the adult actors.

Dave doesn't get turned away because he doesn't have a PhD in our field, or because he doesn't teach college. He's been welcomed and valued, because he makes important contributions. He has also made a strong argument that changed the way we think about an important primary document from theater history, a piece of old paper that's obscure to outsiders but which turns out to underwrite a lot of other theories about what was going on in the 1590s. Dave made the case that the document dates from a different year than we thought, and belongs to a different acting company. This, of course, led to a debate. Shakespeareans debate things. And Dave was opposed by some very high-profile senior scholars who were committed to the old way of looking at that document. But they didn't pull rank on him. No one said, "I teach at an Ivy and you don't have a PhD in English, so you're wrong." They had to meet him on the facts, and some eventually had to concede that he was right.

We don't turn amateurs away because they're amateurs. An amateur who makes a strong case can win the day.

3. Shakespeare "authorship disputes" are actually OLDER than professional Shakespeare scholarship. 

In fact, the "authorship controversy" started in the days when every Shakespearean was an amateur. It didn't start until the 19th century, long enough after Shakespeare's death for such questions to get traction. (No one in the 16th, 17th, or 18th centuries expressed any doubts. But sometime after Shakespeare had been dead for 200 years, a few people suddenly decided that it was impossible that he wrote his works.) But university courses on Shakespeare came even later, as did doctoral degrees in English literature. Those didn't get underway until the second half of the 19th century.

So this didn't start as an argument between professors and outsiders. There were no professors of Shakespeare. Everyone was an amateur (and that includes some of the greatest Shakespeare scholars who have ever lived).

But when literature departments got organized and people started writing research dissertations on Shakespeare, none of the maybe-someone-else-wrote-it stuff got used by the new group of pros. It wasn't because people conspired to exclude it. Someone who could prove that case in 1865 or 1915 would have been highly rewarded, the same way someone would be for proving it in 2015. But the evidence for other candidates has never been there. And you can't get away with telling your PhD adviser bullshit like "No one ever mentioned Shakespeare as a writer during his lifetime." Your adviser will know that's a lie.

The "Shakespeare authorship" arguments are like astrology: an old idea that professionals working in the field have outgrown but that stays popular with a slice of the general public. Like astrology, the Shakespeare-authorship game has trouble generating new hypotheses that can stand up to a rigorous test. And so authorship debates, like astrology, tend to recycle old claims over and over again, giving them a certain time-in-a-bottle quality. I'm having trouble finding anything in that Newsweek story that you couldn't find somewhere else by, say, 1940. In the academic world, a piece that just repeats things from decades ago is completely unpublishable. But the authorship hobbyists are more than happy to dish out the same old cabbage, no matter how many times it's been served before.

Journalists writing "news" stories about these conspiracy theories need to spin the Shakespeare-not-Shakespeare idea as somehow, well, new. But it's not new. It's a very old idea, nearly two hundred years old at this point, and it hasn't made any progress in a long time.

cross-posted from (and comments welcome at) Dagblog



Tuesday, December 23, 2014

Confidence, Rejection, and Criticism: Advice from Actors to Academics, Part Three

Christmas week is especially hard for young academics trying to get a job, especially in literary studies. The annual rhythm of the job search means that most first-round interviews (the interviews that take place at major disciplinary conferences over the winter) get scheduled during the first half of December. By this time of year, grad students (and recent PhDs) looking for a job are counting the meager number of schools where their applications are still active; they may have applied to dozens of jobs and gotten one or two first-round interviews to show for it. Worse yet, many people are getting nothing but the long and lengthening silence that tells them, day by painful day, that nothing is happening for them this winter.

And then many of those eager job-seekers, who've been at the top of their classes all their lives, have to fly home for Christmas and explain to their families that they're not getting a job this year, that they're not even in the running for a job this year. The darkest time of year is pretty dark for people in my business.

So, since it's that time of the year, it's time for a third installment of "Career Advice from Actors to Academics," inspired by Robert Cohen's classic book of advice Acting Professionally. As I wrote in part one, an academic career has become like a career in the arts because of the scarcity of work and the pervasive rejection. So academics, especially newer academics, can learn from our brothers and sisters in the theater (and in the other arts as well), who've been dealing with rejection and penury since back in the day.

But the hard truth is that artists are better prepared for rejection than scholars are. Few arts careers have the kind of heartbreaking schedule that the academic job search demands, where two-thirds of your job prospects for the year can evaporate within a window of a few weeks. Actors get rejected all year round. So do writers, dancers, sculptors, painters, and filmmakers. A stand-up comic who lives in the right city and works hard enough can get rejected every single night of the week. But none of those artists have to watch most of their chances for the year slip away just before Christmas. Artists get to space out the rejections better, and to inure themselves on a daily and weekly basis. (When I was an undergraduate theater bum, every show for the semester did make its casting decisions during the first week of the term, which is the only thing in my life that even remotely prepared me for the annual MLA conference.)

More importantly, almost every working artist, including those with an MFA, has been racking up rejections since their early twenties at least, while scholars don't begin to experience the hard knocks until a point in their careers for which they have been systematically unprepared. Every actor gets turned down at auditions constantly, and most who enter a graduate acting program have already been turned down plenty of times before they get to school. The same goes for writers and other artists. After all, if people were already putting you on Broadway, you wouldn't go to acting school. But in fact, people are not putting you on Broadway. So actors who get a graduate degree have had to toughen up at least a little; they're prepared for the hard realities of the market because they've already tested them.

The scholar's career path is completely different. The rejections don't start until after you get your PhD. Graduate school is tough in all kinds of ways, but it promises to reward all of the deserving, as the job market never will. If you do the work, you will get the degree. The working life of many doctoral students is a long series of A grades, scholarships, fellowships, and departmental awards until graduating with the doctorate. Then that working life suddenly turns into a long, bitter round of pummeling rejection. Most people aren't ready for that at all. How could they be?

The only thing that will get you through is confidence, which is not at all the same as ego. Let's go to Cohen's working definition:

Confidence is the power a person has over his or her own personality; it allows the person to accept criticism and at the same time rise above it.

Note the importance here of accepting criticism without feeling belittled by it. The false confidence of a heavily-defended ego cannot take criticism at all. But an ego like that can never survive an artistic, or an academic, career. You have to have a healthy perspective that can take criticism on board (although you don't necessarily allow any given critic to override your judgment completely) and find ways to use it.

It's important here to distinguish between criticism and rejection. If you have a fragile ego, those two things sound the same, or nearly the same. In fact, they are fundamentally different. Criticism, even if it happens to be mistaken, is almost always a gift. Someone has taken time and effort, neither of which is in great supply, in order to help you improve. Some criticism is not helpful, but all criticism is an attempt to be helpful.

Rejection does not come with criticism. It just says no, and moves along. The eerie silence that haunts some job-seekers the week before Christmas, or the brief formal rejections that some hiring departments send, don't offer any tips on how to improve.

You also need confidence to deal with the polite silence of rejection. Here is Cohen again:

[Confidence] allows an actor to believe in the reality of his or her performance even when no one else does. A person may have all kinds of doubts about his or her potential for career success, but may not doubt that "he is an actor," that "she can act."

Note that confidence is not a prediction about results. "I will be a star someday" is not confidence. "I am an actor," even when you've been turned down for the fifteenth audition in a row, is confidence.

In the same way, your conviction that you will get an incredible and shiny job this time out is not confidence. (In most cases, it is simply magical thinking, which is unhealthy.) Confidence, in the sense Cohen is using it, is not about predicting career outcomes. These businesses are too precarious for predictions like that. Confidence is not your will to believe that the plum Ivy League job on this fall's listings will be yours. Confidence is the belief that your work is valid, that your scholarship is actually a contribution. You might note the distinction here is very close to a distinction between believing in "yourself" and believing in your work as something separate from you, which I think is largely a healthy distinction.

Confidence, in Cohen's sense, is about believing that the work you do is useful and worth reading: that what you do is scholarship, no matter what the external rewards are.

I would add one further thing about weathering rejection. In all of the arts, the psychologically healthy rule of thumb is that successful artists (read: academics) get rejected all the time, and that they are rewarded only sometimes. To put it another way: you will be rejected even if your work is good, indeed no matter how good your work is, but you will never be accepted at all unless your work is good.

That means that you should usually only attach meaning to the good results, and write the rejections off as the normal cost of doing business. That would be an irrational approach in most other endeavors, but in the arts, including the scholarly arts, that approach accurately describes the real facts on the ground. If the talented are only rewarded one time in ten or twenty, then the single success is more meaningful than the nine or nineteen rejections.

If you are a struggling new academic and you are finding that the rejection is truly universal -- if, for example, you applied for three dozen jobs this fall and none of them even asked you for an additional writing sample -- then you probably need to change something about what you're doing. Most likely, there is some qualification that you need and don't have, something that you need to add to your CV in order to enter the pool of viable candidates.

But if you are getting small flashes of encouragement inside dark, watery depths of rejection, then you are perfectly rational to take the encouragement, and not the rejection, as meaningful. No one can afford to reward all of the deserving job-seekers. There are too many good people to offer them all even a first-round interview. This is the literal truth. But at the same time, no one has any need or reason to waste their own time with someone who isn't, on some level, a viable candidate. You only get asked for extra materials if your CV is up to snuff. You only get a conference interview if the committee thinks they might actually hire you. And even if you don't progress to the next round of interviews, you should remember that the people who interviewed you saw you as a professional doing real work.

There are more good people than there are rewards for good people. So even the good are only rewarded rarely. But only the good are rewarded at all. Forget the failures, because everyone fails. Remember the successes, because there is only one explanation for success.

cross-posted from (and comments welcome at) Dagblog

Sunday, December 21, 2014

Police, Danger, and the Social Contract

I was blogging about the police tonight, and about the responses to protests of police brutality. Then I heard about the shooting of two police officers in New York City, so the rest of that post (and some of the others I have been working on) will have to wait.

The first thing I want to say is that absolutely nothing justifies this. Nothing justifies the murder. And if the murderer committed his crime in the service of any reasonable cause, he has set back that cause tonight. You don't get justice for Eric Garner, or for anyone else, by vigilante revenge. That only makes the problem worse.

We live in a culture where the police (who face real danger in their work) have been taught -- indeed, actively trained -- to be excessively fearful, to the point where some officers will put citizens' lives and safety at substantial risk rather than face some very small risk themselves. (For example, they might apply a chokehold to a suspect who's already being held on every side by several officers, and who can't free either of his arms. What danger that was meant to eliminate is hard to say.) Basically, the logic is that every possible trace of danger should be eliminated, which is impossible. So the effort to eliminate all danger generally means being extremely aggressive in situations that aren't actually that dangerous.

But that culture of fear is not helped by randomly killing police officers. It's fear that's driving the aggression, and the fear is fed by the potential randomness of the danger. Cops being killed without seeing the danger coming translates, in our current atmosphere, into cops being hyper-aggressive in situations of minimal danger because, "You never know."

Killing two cops in an ambush won't break up the mindset that killed Eric Garner. Killing two cops at random feeds the mindset that killed Eric Garner. When the police have been convinced that they could die at any moment, they take crazy and dangerous steps against people who actually pose no threat to them.

But pointing to this murder as justification for that bad and crazy policing is a mistake. This doesn't justify anything. And the NYPD's aggression on the street will not, cannot, protect them from things like this. Saying they need to get tough with non-violent offenders in misdemeanor arrests because they could be killed in an ambush makes no sense. That strategy creates new problems without solving the old one.

The ugly truth is, police officers (like every other human being) are extremely vulnerable to a surprise ambush with a gun. Two police officers were killed this way in Las Vegas this year, gunned down by anti-government nuts while eating lunch. And obviously, the Tsarnaev brothers ambushed and killed an MIT campus cop last year. The method is the same every time: come up behind a police officer and shoot. The attacks in all three cases came out of the blue, and there was nothing that the officers could have done to protect themselves. It's not a question of police tactics. None of those cops had a chance.

But this is where the current approach to police work, the attempt to eliminate any and all potential danger, breaks down. You can never eliminate all danger. You can never even eliminate all mortal danger. Every police officer -- and every police officer's family -- has to live with that small, terrible chance. (I am a police lieutenant's son myself; I know exactly what this feels like.) There is always a little danger that you can't foresee or protect yourself from. But you definitely can't get rid of that uncontrollable danger by getting extra tough when there is no danger. Someone with a gun could always come up behind you. You can't protect yourself from that by choking an unarmed guy who's selling loosies on the street. That only creates more problems.

Least of all should the Mayor of New York, or other people who have legitimately criticized police tactics, be blamed for a crime against police officers. Police work is too important to be shielded from any criticism, and the difference between good and bad police work is much, much too important for bad cops to get a free pass. When police work becomes so recklessly bad that unarmed civilians are getting killed, when the police have actually become a cause of violence on the street, then the civil authorities have a duty to look into that. They would be derelict if they did NOT investigate.

In fact, the police are safest when they have strong civic oversight. In the end, the police's greatest protection is the social contract, which they are meant to enforce. The public supports the police because the police protect them. If they endanger the public instead, the social contract breaks down. If ordinary citizens know there's a legitimate grievance process that works, and that they are safe from needless aggression by the police, the cops are safest and most respected. But when those legitimate outlets do not exist, or break down, then people are wrongly tempted to seek redress by illegitimate means. A breakdown of the social contract leads to unpredictable violence. And that puts everyone in more danger.

cross-posted from Dagblog