So, just in time to ruin my New Year's celebrations, Newsweek has seen fit to publish a credulous article trumpeting the old who-wrote-Shakespeare conspiracy theories. I won't give Newsweek a link, but you can click through Amanda Marcotte's smart takedown at Rawstory if you're curious. The original piece is full of breathless non-facts like "Nobody ever recognised Shakespeare as a writer during his lifetime" [except for at least three dozen separate individuals, writing both in print and manuscript, because Shakespeare was famous] "and when he died, in 1616, no one seemed to notice" [except the six different poets who wrote memorial verses for him]. Apparently you can always say, "there's no evidence" even when there is evidence.
Now, I'm on record about this question on this blog, and under my professional name, and I've been quoted about it in a major newspaper, so I don't want to belabor the key facts here. As the above example suggests, this isn't really a debate about facts anyway. But this phony debate often gets cast as insiders vs. outsiders, the stuffy Shakespeare establishment, with all the PhDs and whatever vs. the free-thinking, imaginative amateur scholars. So I'd like to clarify a few things about how academic and amateur Shakespeareans work.
1. Professional Shakespeareans constantly argue with each other and are rewarded for new ideas.
The standard position of the Francis Bacon/Earl of Oxford/etc./etc. fans is that "orthodox" Shakespeareans are all sticking together because we are afraid of new ideas. This ignores the fact that academic Shakespeare scholars argue with each other constantly about any question that can reasonably be disputed. Winning arguments with each other is how we get ahead in our careers. And winning an argument that brings in a big new idea, or overturns an important old idea, is the gold standard. The academic Shakespeare establishment isn't a conspiracy. It's a boxing ring.
This is one of the reasons that academic writing can be hard for general readers to enjoy: it focuses on highlighting the new idea that the writer is putting forward, rather than the ideas that the reader might find most interesting. Something that's interesting to you as a reader but that every scholar has agreed on for the last fifty years won't get much attention, while today's new idea, even if it's quite small, will get the most. And because every argument a scholar puts forward is liable to be torn apart by other scholars, scholarly writing tends to be carefully hedged, shoring up even fairly small points so that they don't give another critic an opening. That's another reason academese is hard to read.
I don't write my scholarship to highlight how much I agree with more established Shakespeareans. It's just the reverse. I once criticized something written by the then-head of the Shakespeare Birthplace Trust (whom many Oxfordians especially dislike) so, ah, energetically that I was publicly accused, in print, of having been "unfair" to him. (Of course, I don't think that I was unfair, but hey, to offend and judge are distinct offices.) Scholarly writing demands pointing out where other scholars are wrong.
A member of the "Shakespeare establishment" who could make a strong case that Shakespeare's works had been written by someone else would stand to benefit enormously. Even if it weren't a completely bullet-proof case, the rewards for making a reasonably strong case, opening room for legitimate doubt, would be huge. You'd immediately become a major player in the field. If I thought I had the evidence to back up a case like that, you'd better believe that I would make it. And so would a lot of other people like me. Yes, that would mean publicly disagreeing with many important senior scholars; that would only make it sweeter.
(On the other hand, the reward for believing Shakespeare wrote Shakespeare is nothing, just like the reward for believing that the sky is blue and water is wet is nothing. No one beats someone else out for a job because they both believe the same thing that no one else doubts. One of the frustrations many literary scholars have teaching beginning undergraduates is those students' deep commitment to arguing things that are so obviously true that they're not worth bringing up; making arguments like that is not what professional academics value at all.)
The reason I don't make a case for someone else writing Shakespeare is that I can't. The reason that a large group of other people inside the academic world haven't done it is that they can't either. If there were evidence to make a good case, someone would certainly be ambitious enough to make it. But it never happens.
2. Amateur scholars are welcome in academic debates.
One of my generation's two greatest historians of Shakespeare's theater is an independent scholar named Dave Kathman, who doesn't have a university job or a PhD in literature. Dave works as a financial analyst in Chicago, and does the Shakespeare-theater-history thing as a hobby. But he's enormously productive and valuable as a scholar. There's only one PhD-holder in my generation who's more important to that specific field than Dave is. (That scholar is an Oxford professor, very much part of the establishment.) Dave has found original documents we didn't know existed, because he looked in archives no one else had thought to search. So suddenly, thanks to Dave, we have apprenticeship records for Shakespeare's boy actors. We can prove when they joined the company, and we can closely estimate their ages. We used to know very little about the boys who played female parts; now we know more about them than we know about some of the adult actors.
Dave doesn't get turned away because he doesn't have a PhD in our field, or because he doesn't teach college. He's been welcomed and valued, because he makes important contributions. He has also made a strong argument that changed the way we think about an important primary document from theater history, a piece of old paper that's obscure to outsiders but which turns out to underwrite a lot of other theories about what was going on in the 1590s. Dave made a strong case that the document dates from a different year than we had thought, and belongs to a different acting company. This, of course, led to a debate. Shakespeareans debate things. And Dave was opposed by some very high-profile senior scholars who were committed to the old way of looking at that document. But they didn't pull rank on him. No one said, "I teach at an Ivy and you don't have a PhD in English, so you're wrong." They had to meet him on the facts, and some eventually had to concede that he was right.
We don't turn amateurs away because they're amateurs. An amateur who makes a strong case can win the day.
3. Shakespeare "authorship disputes" are actually OLDER than professional Shakespeare scholarship.
In fact, the "authorship controversy" started in the days when every Shakespearean was an amateur. It didn't begin until the 19th century, a gap long enough after Shakespeare's death to raise difficult questions about the doubts themselves. (No one in the 16th, 17th, or 18th centuries expressed any doubts. But sometime after Shakespeare had been dead for 200 years, a few people suddenly decided that it was impossible that he wrote his works.) But university courses on Shakespeare come even later still, as do doctoral degrees in English literature. Those don't get underway until the second half of the 19th century.
So this didn't start as an argument between professors and outsiders. There were no professors of Shakespeare. Everyone was an amateur (and that includes some of the greatest Shakespeare scholars who have ever lived).
But when literature departments got organized and people started writing research dissertations on Shakespeare, none of the maybe-someone-else-wrote-it stuff got used by the new group of pros. It wasn't because people conspired to exclude it. Someone who could have proved that case in 1865 or 1915 would have been highly rewarded, the same way someone would be for proving it in 2015. But the evidence for other candidates has never been there. And you can't get away with telling your PhD adviser bullshit like "No one ever mentioned Shakespeare as a writer during his lifetime." Your adviser will know that's a lie.
The "Shakespeare authorship" arguments are like astrology: an old idea that professionals working in the field have outgrown but that stays popular with a slice of the general public. Like astrology, the Shakespeare-authorship game has trouble generating new hypotheses that can stand up to a rigorous test. And so authorship debates, like astrology, tend to recycle old claims over and over again, giving them a certain time-in-a-bottle quality. I'm having trouble finding anything in that Newsweek story that you couldn't find somewhere else by, say, 1940. In the academic world, a piece that just repeats things from decades ago is completely unpublishable. But the authorship hobbyists are more than happy to dish out the same old cabbage, no matter how many times it's been served before.
Journalists writing "news" stories about these conspiracy theories need to spin the Shakespeare-not-Shakespeare idea as somehow, well, new. But it's not new. It's a very old idea, nearly two hundred years old at this point, and it hasn't made any progress in a long time.
cross-posted from (and comments welcome at) Dagblog
Tuesday, December 23, 2014
Confidence, Rejection, and Criticism: Advice from Actors to Academics, Part Three
Christmas week is especially hard for young academics trying to get a job, especially in literary studies. The annual rhythm of the job search means that most first-round interviews (the interviews that take place at major disciplinary conferences over the winter) get scheduled during the first half of December. By this time of year, grad students (and recent PhDs) looking for a job are counting the meager number of schools where their applications are still active; they may have applied to dozens of jobs and gotten one or two first-round interviews to show for it. Worse yet, many people are getting nothing but the long and lengthening silence that tells them, day by painful day, that nothing is happening for them this winter.
And then many of those eager job-seekers, who've been tops of their classes all of their lives, have to fly home for Christmas and explain to their family that they're not getting a job this year, that they're not even in the running for a job this year. The darkest time of year is pretty dark for people in my business.
So, since it's that time of the year, it's time for a third installment of "Career Advice from Actors to Academics," inspired by Robert Cohen's classic book of advice Acting Professionally. As I wrote in part one, an academic career has become like a career in the arts because of the scarcity of work and the pervasive rejection. So academics, especially newer academics, can learn from our brothers and sisters in the theater (and in the other arts as well), who've been dealing with rejection and penury since back in the day.
But the hard truth is that artists are better prepared for rejection than scholars are. Few arts careers have the kind of heartbreaking schedule that the academic job search demands, where two-thirds of your job prospects for the year can evaporate within a window of a few weeks. Actors get rejected all year round. So do writers, dancers, sculptors, painters, and filmmakers. A stand-up comic who lives in the right city and works hard enough can get rejected every single night of the week. But none of those artists have to watch most of their chances for the year slip away just before Christmas. Artists get to space out the rejections better, and to inure themselves on a daily and weekly basis. (When I was an undergraduate theater bum, every show for the semester did make its casting decisions during the first week of the term, which is the only thing in my life that even remotely prepared me for the annual MLA conference.)
More importantly, almost every working artist, including those who've gotten an MFA, has been racking up rejections since their early twenties at least, while scholars don't begin to experience the hard knocks until after graduate school has systematically unprepared them for it. Every actor gets turned down at auditions constantly, and many who enter a graduate acting program have been turned down many times before they get to school. The same goes for writers and other artists. After all, if people were already putting you on Broadway, you wouldn't go to acting school. But in fact, people are not putting you on Broadway. So actors who get a graduate degree have had to toughen up at least a little; they're prepared for the hard realities of the market because they've already tested them.
The scholar's career path is completely different. The rejections don't start until after you get your PhD. Graduate school is tough in all kinds of ways, but it promises to reward all of the deserving, as the job market will never do. If you earn a degree, you will get that degree. The working life of many doctoral students is a long series of A grades, scholarships, fellowships, and departmental awards until graduating with the doctorate. Then the working life suddenly turns into a long, bitter round of pummeling rejection. Most people aren't ready for that at all. How could they be?
The only thing that will get you through is confidence, which is not at all the same as ego. Let's go to Cohen's working definition:
Confidence is the power a person has over his or her own personality; it allows the person to accept criticism and at the same time rise above it.
Note the importance here of accepting criticism without feeling belittled by it. The false confidence of a heavily-defended ego cannot take criticism at all. But an ego like that can never survive an artistic, or an academic, career. You have to have a healthy perspective that can take criticism on board (although you don't necessarily allow any given critic to override your judgment completely) and find ways to use it.
It's important here to distinguish between criticism and rejection. If you have a fragile ego, those two things sound the same, or nearly the same. In fact, they are fundamentally different. Criticism, even if it happens to be mistaken, is almost always a gift. Someone has taken time and effort, neither of which are in great supply, in order to help you improve. Some criticism is not helpful, but all criticism is an attempt to be helpful.
Rejection does not come with criticism. It just says no, and moves along. The eerie silence that haunts some job-seekers the week before Christmas, or the brief formal rejections that some hiring departments send, don't offer any tips on how to improve.
You also need confidence to deal with the polite silence of rejection. Here is Cohen again:
[Confidence] allows an actor to believe in the reality of his or her performance even when no one else does. A person may have all kinds of doubts about his or her potential for career success, but may not doubt that "he is an actor," that "she can act."
Note that confidence is not a prediction about results. "I will be a star someday" is not confidence. "I am an actor," even when you've been turned down for the fifteenth audition in a row, is confidence.
In the same way, your conviction that you will get an incredible and shiny job this time out is not confidence. (In most cases, it is simply magical thinking, which is unhealthy.) Confidence, in the sense Cohen is using it, is not about predicting career outcomes. These businesses are too precarious for predictions like that. Confidence is not your will to believe that the plum Ivy League job on this fall's listings will be yours. Confidence is the belief that your work is valid, that your scholarship is actually a contribution. You might note the distinction here is very close to a distinction between believing in "yourself" and believing in your work as something separate from you, which I think is largely a healthy distinction.
Confidence, in Cohen's sense, is about believing that the work you do is useful and worth reading: that what you do is scholarship, no matter what the external rewards are.
I would add one further thing about weathering rejection. In all of the arts, the psychologically healthy rule of thumb is that successful artists (read: academics) get rejected all the time, and that they are rewarded only sometimes. To put it another way: you will be rejected even if your work is good, indeed no matter how good your work is, but you will never be accepted at all unless your work is good.
That means that you should usually only attach meaning to the good results, and write the rejections off as the normal cost of doing business. That would be an irrational approach in most other endeavors, but in the arts, including the scholarly arts, that approach accurately describes the real facts on the ground. If the talented are only rewarded one time in ten or twenty, then the single success is more meaningful than the nine or nineteen rejections.
If you are a struggling new academic and you are finding that the rejection is truly universal -- if, for example, you applied for three dozen jobs this fall and none of them even asked you for an additional writing sample -- then you probably need to change something about what you're doing. Most likely, there is some qualification that you need and don't have, something that you need to add to your CV in order to enter the pool of viable candidates.
But if you are getting small flashes of encouragement inside dark, watery depths of rejection, then you are perfectly rational to take the encouragement, and not the rejection, as meaningful. No one can afford to reward all of the deserving job-seekers. There are too many good people to offer them all even a first-round interview. This is the literal truth. But at the same time, no one has any need or reason to waste their own time with someone who isn't, on some level, a viable candidate. You only get asked for extra materials if your CV is up to snuff. You only get a conference interview if the committee thinks they might actually hire you. And even if you don't progress to the next round of interviews, you should remember that the people who interviewed you saw you as a professional doing real work.
There are more good people than there are rewards for good people. So even the good are only rewarded rarely. But only the good are rewarded at all. Forget the failures, because everyone fails. Remember the successes, because there is only one explanation for success.
cross-posted from (and comments welcome at) Dagblog
Sunday, December 21, 2014
Police, Danger, and the Social Contract
I was blogging about the police tonight, and about the responses to protests of police brutality. Then I heard about the shooting of two police officers in New York City, so the rest of that post (and some of the others I have been working on) will have to wait.
The first thing I want to say is that absolutely nothing justifies this. Nothing justifies the murder. And if the murderer committed his crime in the service of any reasonable cause, he has set back that cause tonight. You don't get justice for Eric Garner, or for anyone else, by vigilante revenge. That only makes the problem worse.
We live in a culture where the police (who face real danger in their work) have been taught -- indeed, actively trained -- to be excessively fearful, to the point where some officers will put citizens' lives and safety at substantial risk rather than face some very small risk themselves. (For example, they might apply a chokehold to a suspect who's already being held on every side by several officers, and who can't free either of his arms. What danger that was meant to eliminate is hard to say.) Basically, the logic is that every possible trace of danger should be eliminated, which is impossible. So the effort to eliminate all danger generally means being extremely aggressive in situations that aren't actually that dangerous.
But that culture of fear is not helped by randomly killing police officers. It's fear that's driving the aggression, and the fear is fed by the potential randomness of the danger. Cops being killed without seeing the danger coming translates, in our current atmosphere, into cops being hyper-aggressive in situations of minimal danger because, "You never know."
Killing two cops in an ambush won't break up the mindset that killed Eric Garner. Killing two cops at random feeds the mindset that killed Eric Garner. When the police have been convinced that they could die at any moment, they take crazy and dangerous steps against people who actually pose no threat to them.
But pointing at this murder as justification for that bad and crazy policing is a mistake. This doesn't justify anything. And the NYPD's aggression on the street will not, cannot, protect them from things like this. Saying they need to get tough with non-violent offenders in misdemeanor arrests because they could be killed in an ambush makes no sense. That strategy creates new problems without solving the old one.
The ugly truth is, police officers (like every other human being) are extremely vulnerable to a surprise ambush with a gun. Two police officers were killed this way in Las Vegas this year, gunned down by anti-government nuts while eating lunch. And obviously, the Tsarnaev brothers ambushed and killed an MIT campus cop last year. The method is the same every time: come up behind a police officer and shoot. The attacks in all three cases came out of the blue, and there was nothing that the officers could have done to protect themselves. It's not a question of police tactics. None of those cops had a chance.
But this is where the current approach to police work, the attempt to eliminate any and all potential danger, breaks down. You can never eliminate all danger. You can never even eliminate all mortal danger. Every police officer -- and every police officer's family -- has to live with that small, terrible chance. (I am a police lieutenant's son myself; I know exactly what this feels like.) There is always a little danger that you can't foresee or protect yourself from. But you definitely can't get rid of that uncontrollable danger by getting extra tough when there is no danger. Someone with a gun could always come up behind you. You can't protect yourself from that by choking an unarmed guy who's selling loosies on the street. That only creates more problems.
Least of all should the Mayor of New York, or other people who have legitimately criticized police tactics, be blamed for a crime against police officers. Police work is too important to be shielded from any criticism, and the difference between good and bad police work is much, much too important for bad cops to get a free pass. When police work becomes so recklessly bad that unarmed civilians are getting killed, when the police have actually become a cause of violence on the street, then the civil authorities have a duty to look into that. They would be derelict if they did NOT investigate.
In fact, the police are safest when they have strong civic oversight. In the end, the police's greatest protection is the social contract, which they are meant to enforce. The public support the police because the police protect them. If they endanger the public instead, the social contract breaks down. If ordinary citizens know there's a legitimate grievance process that works, and that they are safe from needless aggression by the police, the cops are safest and most respected. But when those legitimate outlets do not exist, or break down, then people are wrongly tempted to seek redress by illegitimate means. A breakdown of the social contract leads to unpredictable violence. And that puts everyone in more danger.
cross-posted from Dagblog
Tuesday, November 25, 2014
The Fire This Time
There is one truth that we need to face today, after the grand jury's decision in Ferguson. And that truth is simple. No one can live like this.
No one can live with an arrangement where their sons can be killed with impunity. No one can make their peace with that. No one can accept that. No one can live like this.
You cannot tell someone, "Look, some things are better for you than they have been. And other things are not so bad. Your son may go out for a walk one day and never come home, because he has been shot in the street: even if he is unarmed, even if he is no danger. But other than that one thing life is good." You cannot say that. And that is why people try to say it without putting it in exactly those words. But that is not something you can ask of anyone.
Calling for peace in the aftermath is an insult. There is no peace. If some people are allowed to be shot dead and others are allowed to get away with that, then there is no peace. There is no public safety, because not all of the public are safe. The old saying "No justice, no peace," is relevant here, but it's even starker than that. No peace, no peace.
Pretending that this is not about race does not help. No one would defend a Darren Wilson if they did not think it was about race, because no one would defend Darren Wilson if they really thought he was equally likely to kill their own son or grandson or nephew. If it weren't about race, Darren Wilson would already be under the jail. I do not want to live in a country where the police shoot unarmed teenagers dead. That they only kill unarmed teenagers from a specific ethnic group does not make it better. No one can live like this.
Rationalizing every crime, blaming the victim, does not help. It adds insult to injury. Unarmed teenagers can be shot dead in the street, and the killer will get away with it. Then, to comfort themselves, many people will defame the murdered boy. That is an act of aggression. Killing someone and then claiming it was their own fault does not make it better. It makes it worse. No one can live with that.
Forget the fire next time. Think about the fire this time. No one can live with a system that murders their sons without bothering much for a reason. No one can accept a system that allows that. There can be no peace, no justice, in our country unless that peace and justice extends to all alike. Economic inequality is cruel. Racial inequality is unjust. But inequality in matters of life and death is unbearable. No one will bear it, because no one can. Darren Wilson didn't just kill one unarmed man. He put yet another bullet in the body of the American Way.
cross-posted (and comments welcome) at Dagblog
Monday, November 17, 2014
Turning Down the Imaginary Car (Advice from Actors to Academics, Part 2)
I blogged earlier about how the academic job search can be framed like the search for an acting job (where the odds are incredibly steep, rejection is pervasive, and the stakes feel deeply personal). Today's post is a second installment of advice from Robert Cohen's classic Acting Professionally, a very career-specific book of advice that I have found applicable to other careers. Cohen's maxim that "Children are rewarded for being good" while "Adults are rewarded for being useful" has stuck with me and proved invaluable. So has his point about what I will call Turning Down the Imaginary Car, a thing that plagues budding academics as much as would-be actors.
Cohen writes that many acting students begin (or once began), with a fairly naive and juvenile fantasy of acting success leading to vast fame and fortune. Hollywood! Broadway! Ten million dollars a picture! A hundred million fans! Marrying Brad and/or Angelina! And that's perfectly natural. Even people who never set foot on a stage have that Hollywood-star fantasy, and of course people motivated enough to pursue an acting career seriously usually started out with that fantasy. The question is how you mature out of it.
So, Cohen writes, many acting students (and here we're not talking about undergrads, but people in competitive graduate programs) move past that initial fantasy to a point where they say that they could be happy without fame, fortune, and international stardom. They just want a good living in the theater, just steady work in some repertory company. They just want to practice their craft in interesting ways. This looks like a realistic lowering of sights, but in fact it is -- as Cohen points out -- another fantasy. "Just" making a living by acting is winning a huge brass ring. As Cohen puts it:
Too often the actor who "rejects" Hollywood thinks that by dint of that rejection regular repertory work will materialize somewhere else. It is as if scorning an unoffered Mercedes-Benz somehow entitled us to a Honda Civic.
Turning down the imaginary car disguises itself as a realistic adjustment of expectations, so the person doing it doesn't have to face actual reality. But in fact, it is the form of magical thinking called bargaining: "if I give up daydream A, I will magically be given daydream A-minus." It is a way of conning yourself into thinking that you already deserve something so that you don't have to earn it.
The graduate student/job-seeker version of this is to say, "I don't want a job in the Ivy League. I'd be happy with a job at [Michigan/UCLA/Williams College/an R1 university/on the tenure-track with a 3-3 load/on the tenure-track]." But not aiming for a gold medal doesn't guarantee you a silver or a bronze. In fact, everyone who wins silver or bronze does so by striving like hell for the gold.
You will not get a job because you view that particular job as humbling, or because you view yourself as humble for being willing to accept it. That unglamorous job in an unglamorous location may have "only" 175 other applicants, instead of 300. But that hardly makes it a consolation prize. You may think that you're not asking for much, but hundreds of other people are asking for the same thing as you are, and most of them are at least as deserving as you are.
The most pernicious effect of imagining some jobs, any jobs, as automatic consolation prizes is that it leads you to underestimate those jobs' actual requirements. The most common version of this problem is to lowball the amount of research that a school doing the hiring expects. Telling yourself that you don't need to publish more because you don't want one of the fancy jobs is self-destructive. Telling yourself that the two book reviews you've published should be good enough for a place like Unglamorous State is a huge mistake. The research expectations at every school, from the top to the bottom, have risen steadily over the past decades, and that school you think of as humble doesn't hire people who won't publish enough to make tenure there.
In fact, even if the amount of research a university expects you to do for tenure is low, that means some of the people competing for that job will already be close to having enough published to get tenure, maybe more than halfway to the local standard. That's a nice proposition for the hiring committee. If a school really doesn't prioritize research but, for example, expects two peer-reviewed articles for tenure, and some of the applicants for that job already have two articles... well, if they hire one of those people, the school doesn't have to worry about them publishing enough for tenure. And it doesn't have to make time for them to keep publishing. That beats hiring you without any articles, giving you course releases, and crossing their fingers that you'll get across the finish line.
If you think that you shouldn't need to have publications just to get a job at X State, then you are turning down the imaginary car. The question isn't what you think should be expected of you. It is what your competitors for that job are already offering.
On the flip side, if you're coming from a high-powered PhD program with a load of publications under your belt, and you get a whiff of the big, shiny jobs, that doesn't mean schools further down the prestige chain will be grateful to have you. They're not your consolation prize, either. Getting interviewed by an Ivy that doesn't hire you counts for nothing at a "lesser" school. A school full of big shots might be more willing to hire a promising researcher with less teaching experience, or less experience teaching low-level classes. But when you apply to X State you will be in a pool where other applicants are almost as well-published as you are but have much more teaching experience. Less glamorous jobs are often different jobs, with different demands.
The lesson, which actors learned long ago and academics are now learning the hard way, is that any gig is hard to get, and precious. They all require hard work and good luck. You have to take them all seriously. And if a job doesn't seem flashy enough for you to work hard for, there are people more talented than you are who don't feel that way. It's not about the dream job. Making a living at your calling is living the dream.
cross-posted from, and comments welcome at, Dagblog
Thursday, October 30, 2014
Thinking Like the Plague
The Ebola panic in the American media seems uncannily familiar to me, in the worst possible way. Anyone who studies Renaissance literature for a living has read many accounts of terrible epidemics, and many stories of epidemic hysteria. (In fact, some people have written learned and illuminating books about literary responses to the plague; I can't pretend to be one of them.) Smallpox is a terrible affliction. Bubonic plague is worse. But human responses to those diseases often made them more dangerous, just as today's hysteria about Ebola threatens to make Ebola more dangerous.

Of course, dangerous diseases require precautions. But there is a panicked mindset that poses as a defense against the plague but makes it worse. It is plaguethink: the plague's herald and accomplice. Even when the disease itself is controllable, plaguethink can lay entire communities to waste.
Plaguethink has two basic precepts:
1. A conviction that nothing can be done to stop the disease.
2. The idea that you can save yourself by abandoning the sick.
From those two ideas often come two emotional responses:
3. A phobic horror of the infected, leading to stigmatization and poor treatment.
and, often but not always:
4. A tendency to interpret the disease as a carrier of religious or moral meaning.
The important thing to remember is that none of these ideas has EVER been true. There has NEVER been a completely unstoppable and invincible disease. If there had been, we would not be here. There have been terrible, terrible diseases. But the "superbug" is a fantasy. Even before modern medicine was developed (and believe me, European Renaissance medicine could be spectacularly ineffective), there has NEVER been a disease where there was absolutely nothing you could do.
An epidemiologist once told me that most bubonic plague patients were not contagious. (The minority who were contagious were very contagious, but most sufferers were basically not contagious at all.) And most of those people would have recovered and lived if they just got basic nursing care, by which I mean basic 14th-century nursing care: someone to give them food and water and occasionally change their sheets. For most people that didn't happen because people decided, incorrectly, that fighting the disease was hopeless and that abandoning the sick is the way to safety.
Now, the abandon-the-sick idea perverts a common-sense idea (you need to take steps to avoid contagion) into something inhumane and destructive. "Try not to catch the disease" is reasonable. "Save yourself and let the sick die" is something else entirely. Leaving the sick to die alone, and running out of town to keep yourself safe, is unnecessary and unhelpful.
It is not even a plan. "Just don't get it yourself" is not a plan, and it will not keep you safe. It leaves the disease unfought, which keeps the disease alive and dangerous. Letting the disease flourish but hoping it stays away from you will NOT work over time. You cannot keep out disease with a wall, or a moat, or a retreat to your country house, or with a border. The disease will get around all of that sooner or later. You cannot keep yourself safe by sacrificing other people to the illness. The outbreak itself has to be defeated, or no one is safe.
I mean, Elizabeth I, who theoretically owned everything she saw unless she went to the beach, actually came down with smallpox. (Her doctors nursed her through it, and she rode it out.) Even the most powerful person in the country could not throw up real barriers against contagion.
During outbreaks of bubonic plague, people would leave the sick to die alone in their houses, and abandon that house, or that neighborhood, or simply flee town. What do you think that did? It left the disease alive and kicking, ready for the country-house crowd when they got back. And, well, that epidemic was spread by rats. Deserted neighborhoods full of dead bodies didn't make that problem any better. Plaguethink helped the disease spread. It always has.
During the 1980s, you could hear people talk about AIDS with the same terrified plaguethink. Put all of the infected on an island somewhere! That would have been as pointless as it was inhumane, but the people who said those crazy things weren't thinking of fighting the disease. They were offering it a sacrifice to appease it.
Today, the voices of plaguethink are roaring in the media every day. Travel ban! Stop the flights! But those measures are counterproductive. They will not stop Ebola. They will let it plague us. You cannot keep out a sub-cellular organism with airport screenings. Of course you can't.
If we actually want to be safe from Ebola, we have to stop the outbreak in West Africa. Letting the outbreak flourish, because we've deluded ourselves that it's hopeless to fight it, will let it remain a danger forever. And the idea that we can't fight the outbreak, which people in the media take for granted, is an obvious lie. Nigeria has contained the outbreak in Nigeria. It can be done. And in this country it really is under control, no matter how it's being spun. The epidemiological forest fire in Liberia needs to be extinguished, and that will require outside help from the United States and Europe. But it can be done, and it has to be. Letting Ebola run amok in West Africa while trying to keep it out of this country is hopeless, especially when you define "keeping it out" as zero cases a year. You wouldn't build a firebreak to keep a wildfire away from your house but have no one fight the wildfire itself. That would eventually fail. So the travel ban, and the stigmatization of health workers who fight Ebola in Africa and come home, is the worst possible thing. It is the 21st-century equivalent of letting rats feed on dead plague victims.
Worst of all is the stigmatization, the ritual humiliation, of health workers who have put themselves at risk. It is a disgraceful instance of brave and mature people being attacked by the childish and terrified. Those health workers are not a danger. They are our best hope. "Quarantining" them in medical tents without a toilet or shower makes no medical sense. (Ebola spreads through body fluids, jerks: anyone who might be carrying it shouldn't be kept away from a toilet.) Worse, it actively takes the disease's side against the health care workers. It is a declaration of unthinking allegiance to the plague.
Ebola needs to be fought. But it is not a terrible god. It is a pest. It cannot be appeased; it can only be fed and allowed to flourish. It is not a messenger of divine or immanent truth. It is a sub-cellular parasite, a strand of RNA with an adjustment problem. It is not a great danger to the United States. But plaguethink could make it one.
Cross-posted from Dagblog
Wednesday, October 15, 2014
Career Advice from Actors to Academics
It's that cruelest of seasons again for young scholars: job search season. In an annual fall ritual I've discussed in previous years, the list of jobs for new professors beginning next fall has recently been published, and people who want those jobs are now laboring over complicated job applications. As has been the case for many years, and especially since the Great Recession began, there are far fewer jobs than there are talented and qualified applicants. A job in the humanities typically gets more than a hundred or two hundred applications (sometimes more than three or four hundred), while there are only a few dozen job openings across the country in an individual's field. (If I were starting out looking for my first teaching job today, there would be only 22 jobs I could apply to in the US; by the end of November that number might swell to 30.) What this means is that no one, no matter how gifted and deserving, gets an assistant professorship without a whole lot of good luck. Talent isn't enough. Hard work isn't enough. Merit isn't enough. There are many, many more talented, hard-working and meritorious people than there are jobs. You have to be talented AND hard-working AND lucky. In other words, getting a job in the academy has become like getting a job in the theater, and it needs to be approached in the same way.
The best book of career advice I've ever read, hands down, is Robert Cohen's classic Acting Professionally, which I first read in my teens. I still have a copy, because some of its advice turns out to be applicable to things outside acting. It's especially relevant to the strange little world of academia, which is like the strange world of the theater in that work is incredibly scarce, rejection is pervasive, and success or failure can feel like a judgment of you as a person. And much of the book is devoted to explaining how terribly hard it is, very much in the manner of today's "don't go to grad school unless you know the facts" talk in a faculty office. (In the edition I read in the 1980s, Cohen cautions aspiring actors that they might literally be one of a hundred people up for a single acting job. Oh, Bob. If only it were that easy.) Somewhere along the line, teaching college turned into an arts job, like acting or sculpting. That's not good, but for now it's the reality, and it has to be dealt with.
So, with the indulgence of my fellow Dagbloggers, I'd like to devote this post (and maybe two or three more) to sharing Cohen's lessons with younger academics.
The first thing to make clear is that advice is not enough. You can get the best advice possible, and follow it, and still not get a job. In this way, being an academic job-seeker is exactly like being an actor. Good advice is not always enough, because doing everything right is not always enough. Some career advice to academic job-seekers is offered, or taken, in the spirit of telling job-seekers that they will get a job if they do the right things; but there's no way to promise that. Advice isn't sufficient, but it's still necessary. It keeps you from taking yourself out of the running.
If you want to be in movies, you basically have to move to Los Angeles, where the casting happens. Moving to LA won't get you a job in movies; far from it. But if you don't move to LA, you won't have a Hollywood career. If you don't move to New York, you won't be on Broadway. If you want to be a working actor, you need a set of recent, professional headshots. The best set of headshots in the world won't get you work on its own. But not having those photos to give casting directors will ENSURE that you don't get work. So will getting amateur headshots that one of your friends took with a smartphone, or using old photos that show you with a hairline or waistline that you haven't had for five years. In the same way, the most immaculately prepared job materials won't get you a job, but careless or unprofessional job materials will make sure that you never get one. Having an article, or even two articles, accepted for publication in good journals won't guarantee you a job, because most applicants for most jobs will also have a publication or two. But if all the serious applicants for a job have those publications and you don't, you are not a serious applicant for that job.
The most important piece of advice Cohen gives, which has stayed with me for decades, is this:
Children are rewarded for being good. Adults are rewarded for being useful.
Children are (or should be) rewarded because they deserve rewards. Learn your algebra, get your A. But adults are hired because they are useful to their employers. The question is not what the job-seeker deserves. It is what the employer needs. Abstract merit is less important than how an applicant fits the needs of a particular job. For "children" we could read "students," and for "adults" we could read "professionals." What you do as a student is about you. What other people hire you to do is ultimately about them.
If the two best actors who show up at an audition are both competing for the same role, only one of those actors is likely to get hired, because they can't necessarily fit other parts. Say three brilliant twenty-something actresses all try out for the romantic lead, and any one of them would be great. In fact, all three are better actors, in terms of overall talent and skill, than anyone who tries out for any of the other roles. You can't cast the runner-up for the female romantic lead as the seventy-year-old grandfather, even if she's a much "better" actor than all of the older men who've auditioned. The producers will cast the best grandfather-type as the grandfather, and the best ingenue as the ingenue. Likewise, if three brilliant old stage veterans turn up to read for the grandfather, and the best actress reading for the ingenue role is just okay, the just-okay actress will get hired and two of the silver-haired virtuosos won't. The actors who don't get hired deserve jobs. The show just can't use them.
In the same way, there are brilliant character actors who make a living as supporting players in big Hollywood movies. (And there are many other brilliant character actors who don't make a living at all.) You can often see those brilliant actors playing opposite leading actors who are less talented ... sometimes much less talented. The actor playing the villain or the sidekick may be a far better actor, as an actor, than the leading man. But the film would almost certainly be a flop if the character actor were put in the lead. ("Stanley Tucci is ... Batman.") Yes, there are always exceptions. But they're exceptions. And while I might pay good money to see Nathan Lane as The Mighty Thor, most people wouldn't. Some actors are more useful in supporting roles. Others play the lead role or don't get a part at all.
In the same way, academic jobs are about a variety of different needs, and something that helps your chances for one job might hurt your chances for another. This is not because those things are good or bad, but because they make you more or less useful for that specific job. Jobs require different balances of teaching and research. They require different kinds of teaching. Some jobs want to hire someone to cover an entire specialty by her- or himself, and prize breadth. Some are hiring someone to join an existing group of specialists, and may be looking for people who complement the existing faculty members, or for people who would be especially good collaborators with them. (Some departments want the new person to bring something new to the table. Some are trying to build up a critical mass of people doing overlapping work.) And here's the thing: all of these questions can work for or against you no matter what you do. Teaching lots of beginning classes might help you get a job where you'll teach those classes, but not a job where you'd only teach advanced courses. Doing research that overlaps a potential colleague's can sink your application ("Do we need another person doing Shakespeare and Renaissance science?") or move it to the top of the pile ("We want to become a center for studying Renaissance literature and science."). This is about their needs, not your merit.
Many small liberal-arts colleges favor applicants who went to small liberal-arts colleges themselves. The thinking is that alumni of small colleges have a feel for the kind of community experience that those schools work to provide, and that it sometimes takes people who were undergrads at big research universities a longer time to grasp what a place like Williams or Carleton is about. They don't think that people who went to small colleges are better or smarter than people who went to big universities. Arguing that Yale is harder to get into than Williams is beside the point. Small-college graduates aren't necessarily better than Ivy League graduates, but they bring something to the table that hiring committees see as useful.
So what to do with this lesson? Two things. The first is only psychological, but it's crucial: do NOT read the academic job market as a reflection of your professional worth. It is not that. It cannot be that. It does not judge your merit, but only your usefulness, and your usefulness to any particular employer is highly circumstantial.
When hiring committees talk about "fit," this is what they mean: your usefulness within the idiosyncratic terms of a given job. Some job seekers have taken a great dislike to the term "fit," which they find unhelpful. But what "fit" means is: it's not about you. Instead of being angry about that, take it as permission not to beat yourself up.
The second application of the rewarded-for-being-useful lesson is to the job market itself. As far as is within your power, you should craft your job materials to appeal to the demands of the particular job. And as far as is within your power, you should direct your professional energies toward the activities that qualify you for the kind of job you want.
There are limits to this. You should never say explicitly, "I think I meet your needs in X and Y way." They know their needs better than you do, and don't need to be told. And, as Flavia points out, the academic job letter is a fairly constrained genre whose limits you should definitely not break. But what you emphasize should generally be things that suit you for THAT job. If you are applying for a job teaching English literature at a place where you won't be expected to teach composition, that one 200-level literature section you once taught is at least as important as the fifteen sections of composition you've taught over the past four years. If you're applying for a job where half your teaching load would be comp, you should give your composition experience more play. If you were an actor going on auditions, you'd bring a prepared monologue that fit your skills, but also fit the part you were auditioning for. If you're auditioning for the funny best friend in a Wendy Wasserstein play, you don't give them your all-time-most-favorite monologue from Miss Julie. You don't give them a Neil Simon monologue if you're auditioning for Iago. Apply to the job they're offering.
In the longer term, if you want to get a certain kind of job, you should work to qualify yourself for those jobs in specific ways. This is easier said than done early in your career, when you don't necessarily get to choose teaching assignments and when you need to keep the wolf from the door. And qualifying for a job that already has a flood of qualified and over-qualified applicants doesn't guarantee you that job. It just allows you to get your application in past the first round of review, so that luck, fit, and other unpredictable forces can come into play. (If you can act but you can't sing or dance, no amount of luck will get you cast in a musical. If you're a great teacher with no publications, no amount of luck will get you a job at a research university.)
If you've taught a lot of intro-level courses, look for a chance to teach a more advanced class. That is a meaningful improvement to your CV. If you want a job in a department with a doctoral program, you should try to publish something in one of the top journals in your subfield; those departments will eventually evaluate you on your scholarly reputation as well as your productivity, so you need to show the hiring committee that you can publish in the influential, highly competitive venues. For those schools, two or three things published in less selective journals do not add up to one article published in a flagship. If you'd be happier with a job where research is a smaller part of the mix, and where your scholarship will be counted more quantitatively, then two articles add up to more than one fancy article. The strategy there would be to focus on places where you can have your article accepted more quickly, and on journals with higher acceptance rates. (It goes both ways; if the stress of submitting to a journal with a tiny acceptance rate and inscrutable requests for revision makes you too crazy, then a research-intensive job will also bring miserable stress.) None of these things are easy to do. And none of them guarantees you anything. But you are not completely powerless. You have useful skills, and there are ways to increase your odds.
cross-posted from, and comments welcome at, Dagblog.
Monday, October 13, 2014
Stop Worrying About Ebola
Hi. I'm at Logan Airport in Boston. Unfortunately, CNN is on in the departure lounge. They are raving (indeed, nearly foaming at the mouth) about Ebola. And it seems, according to CNN, that the CDC has quarantined a plane from Liberia where some passengers have fallen ill. They have quarantined that plane here at, well, Boston's Logan Airport.
Should you be worried about Ebola? Let's put it this way: should I be worried about Ebola? No, and no.
There may be actively ill patients elsewhere in the airport where I am sitting. I am in no danger. And unless you're actively nursing an infected victim, neither are you.
Now, STOP WORRYING ABOUT EBOLA. Seriously. Worry about texting and driving, because that is much more likely to kill you. Ebola is a genuine medical problem and it requires a response, but people sitting safely in the United States should not be freaking out about it. Really. Really, really.
This has been a message from the real world, Boston Logan Airport division. Thanks.
Cross-posted from Dagblog
Thursday, September 18, 2014
Who Lost Scotland?
Today Scotland votes on independence: a fifty-fifty referendum on leaving the United Kingdom. It's gone from a long shot to a statistical dead heat, and nobody can say for sure how the vote will go. But what's certain is that Scotland's old relationship with the rest of Britain is finished. The Scottish independence movement will not just go away if they come up a couple percent short; they're never going to give up now that they've gotten this close. And if a united United Kingdom squeaks by, Scotland will expect to be given much more autonomy than it's had so far. In fact, this week the leaders of all three major parties have had to promise them that autonomy. So no matter how the vote goes, it's fair to say that David Cameron and his Conservative Party have managed to lose Scotland. They should pay a price for that.
The format of the vote is Cameron's fault. Cameron insisted that the most popular middle-ground option, so-called "max devo" or maximum devolution, which would have kept Scotland inside the United Kingdom but given it more power over its own affairs, be kept OFF the ballot. He made sure that it was an all-or-nothing vote: accept the status quo or leave the nation entirely.
I'm sure Cameron viewed this as masterful strategy: getting what he wanted by allowing no other workable option. You can choose between having it David Cameron's way and having this delicious shit sandwich. But it's backfired. Given a choice between a radical break and Cameron's status quo, many Scots would clearly prefer a radical break. Some of the most persuasive arguments I've heard for a "Yes" vote on independence have been from people who said that what they really wanted was max devo, and that they were given no choice.
Pro tip to David Cameron: when people would rather eat a shit sandwich than spend time in your company, you're in no position to play the tough guy.
Now, of course, the danger of secession is so high that Cameron has had to troop up to Scotland with the Labour and Liberal party leaders and promise something close to max devo anyway. But many Yes voters hear that as an empty promise. For good reason, too: there are no specifics about what these "new powers for Scotland" would mean, and it's a promise to do something the voters want if the voters agree to give up all their leverage first. A promise like that isn't worth the paper it's not written on.
On the other hand, if No squeaks by, Cameron is in the position of having more or less promised to give Scotland the thing that he didn't want to give them and that he made sure was not on the ballot. So instead of exactly what he wants or an unpalatable alternative, he now faces a choice between exactly what he doesn't want and an unpalatable alternative. It's a kind of strategic masterpiece, carefully orchestrating his own defeat. It's a shit sandwich David Cameron prepared for himself, with his own two hands.
Now, most of the Scottish voters are far to Cameron's left, and he may think his Conservatives will gain politically if a whole region of Labour voters leaves the country. But that's almost the definition of short-sightedness, and Conservatives who collude, even indirectly, in the breakup of the United Kingdom have failed at everything their party stands for. No one will admire a Conservative Party that allowed the dissolution of Great Britain. How could they? Churchill famously said that he hadn't become Prime Minister to preside over the dissolution of the British Empire. David Cameron now risks being the Tory Prime Minister who presided over the dissolution of Britain itself.
I'll admit that in my heart I'm hoping for a No, and a continued Great Britain. That's not because I'm a great Anglophile. (I'm from Boston, after all, where declaring independence from Britain is considered a heroic tradition.) But history, as best I understand it, suggests that Scotland will be dominated by its larger, wealthier southern neighbor no matter what, simply because that neighbor is larger and wealthier. Union, on balance, probably allows Scotland better terms in that relationship.
Remember how England took over Scotland: the King of Scotland inherited the English throne. After many decades of anxiety that the King of England would somehow get the Scottish throne and take over the country, the reverse happened. The King of Scotland took over England, and that ultimately put Scotland under England's power. For the last four hundred and eleven years, captive England has led conquering Scotland in chains, because the fundamental power difference is about things that no treaty can change. It's political gravity: the smaller country falls into the larger one's orbit. That underlying fact won't change with today's vote. But the strength of England's hold on Scotland will, win or lose.
cross-posted at Dagblog
Thursday, September 11, 2014
Obama's Mission
Barack Obama was elected because the American people were tired of being bogged down in unwinnable foreign wars. He was elected because a majority of American voters had come to view the Iraq war as a mistake. This is a basic, bottom-line political fact. Therefore, it is not (and cannot be) Official Beltway Wisdom.
Obama also had a mandate to save the country after the economic crash. And he had some mandate to fix health care, which he had campaigned on doing, although this was not nearly as important as he thought. A lot of Obama's early political problems can be ascribed to the fact that he overestimated how much the country cared about health care and underestimated how much the country cared about financial reform and getting the troops home from Iraq. He would have been better served with bolder steps on the economy and a quicker timetable to get out of Iraq and Afghanistan. But even when he has misunderstood the voters' exact priorities at a particular moment, the voters' priorities have been real.
President Obama's address to the nation Wednesday night shows that he still remembers his mission. We're going into Syria to fight ISIS, but only with an air campaign and not with ground troops. Obama was immediately criticized by various talking heads and political opponents (in fact, even sooner than immediately, because the complaining started in advance of the speech), who insisted that Obama ought to commit ground troops, or at least not rule out committing them, right away. They complained that Obama needs to be Serious, which means putting American soldiers and Marines in harm's way. But the American people made Barack Obama President specifically so he would not send troops to this kind of war. He is carrying out the mission we gave him.
There's been a lot of criticism in Washington about Obama's strategic maxim "Don't do stupid stuff." Hillary Clinton, who would be President if she had not voted to let George W. Bush do stupid stuff, has joined the criticism. But all these wise Washingtonians miss the basic fact. Obama was elected to keep the country from doing stupid stuff. And most of what passes for strategic wisdom in Washington these days is pretty stupid.
Committing ground troops to Syria is stupid. It is not even remotely a strategy. Sending our troops into a war zone with no plan for getting them out, or even a picture of what victory would look like, is not strategy but stupidity. And we've already lost too many American lives to stupidity like that.
People who want to invade Syria argue that supporting the moderate rebels is not enough, because the moderate Syrian rebels are not strong enough to win. Let me point out that if there is no existing force on the ground in Syria strong enough to beat ISIS even with our air support, then there is no force on the ground for us to hand Syria over to when our troops leave. It is the same problem as Iraq and Afghanistan. Going in with ground troops means going into a situation that will collapse again shortly after our ground troops leave. Staying in Syria until Syria is stabilized means occupying Syria forever.
If we don't have an ally that can win without our ground troops, then we don't have an exit strategy for our ground troops. Don't do stupid stuff.
More importantly, don't get American soldiers and Marines killed doing stupid stuff. That is our Commander in Chief's mission. Let him do it.
cross-posted at Dagblog
Monday, August 18, 2014
Let's Review the Michael Brown Case
Let's review some basics from the Michael Brown case:
If a police officer kills an unarmed person for jaywalking, that is murder.
If a police officer kills an unarmed person for shoplifting five bucks' worth of cigars, that is murder.
If a police officer kills an unarmed person who had smoked marijuana sometime that week, that is murder.
If a police officer kills an unarmed person who turns out to have wanted to be a rapper, that is murder.
If a police officer kills an unarmed person who has given the police officer some lip, that is murder.
If a police officer kills an unarmed person who is running away from him, that is murder.
If a police officer kills an unarmed person who tried and failed to get the officer's gun before running away, that is murder.
I think you might detect a pattern here. The point is that killing someone who is not a clear (as in obvious) and present (meaning immediate) danger to someone else's life and safety is murder.
No one has suggested anything close to that kind of situation. The Ferguson Police Chief, who will clearly do everything and anything in his power to make excuses for his officer, has not been able to say that the shooter was in danger of his life. And there is no other excuse.
Can I imagine circumstances in which a police officer might use deadly force? You bet I can. But I don't even need to. I was raised by a police officer from a police family. I grew up around lots of police officers. And I do know a police officer who has killed someone in the line of duty (or rather, who was among the officers who killed someone in the line of duty; I don't think any of them want to know who fired the fatal bullet.) Why did they do it? Because a suspect was shooting at them and trying to kill them.
That is what we're talking about. That is justification for using your weapon. None of this other stuff is even on the same planet as a real reason.
Almost every day we hear some fresh "revelation" about the young man killed by the police in Ferguson. Every day that revelation is offered up as if it changes the question of whether his murder was justified. And every day that revelation is utterly ridiculous. It says nothing about the real questions. It does say a lot about the moral compass of the person bringing it up.
If you're discussing an unarmed and completely defenseless man being shot to death and you bring up five dollars worth of stolen cigars, what you are saying is that you are too morally depraved, your moral judgment too impaired, to understand the value of human life.
If you bring up marijuana residue or rap music, same thing. You have announced your idiocy and depravity for all to hear. And you have insulted your listeners by presuming that they too were moral idiots.
(Remember the Eighties, when you kept hearing stories about how young black gang members were so morally bankrupt that they would shoot someone to death for a pair of sneakers? Shooting someone for a hundred-dollar pair of shoes would mean your moral compass was broken. But what would shooting someone over five dollars' worth of merchandise mean?)
Mike Brown was endowed by his creator with certain inalienable rights, among them life, liberty, and the pursuit of happiness. All three were taken away from him on the street, with no process of any kind, by a paid officer of the law.
Michael Brown had a right to due process. He had a right to his life. There are no other questions. Whether or not you would have liked Mike Brown is not the issue. Whether or not you approved of Mike Brown is not the issue. Mike Brown's right to his life was not conditional on your approval, or mine, or any government authority's. He could only forfeit that right by endangering another life, and even then only while he posed an active danger. But Mike Brown was no danger to any living soul when he was killed. He had nothing in his hands but his own life. That was given to him by God. It was not for anyone else to take.
If you ask yourself whether or not Mike Brown deserved life, you are a lost soul. No one has set you to judge who should live and die. No one will and no one should. Mike Brown was a citizen like you, a human being like you. His rights are not subject to your little moods. If you will not defend his right to live, then you are no longer a citizen. I leave the question of your humanity to another judge.
Tuesday, August 12, 2014
Robin Williams and Making Live Comedy Live
Robin Williams was funny, lightning fast, and a gifted improviser, but what really set him apart as a comic was that he let his audiences share the experience of what doing standup comedy feels like. He didn't do that explicitly. It probably can't be done explicitly. But he did it, maybe better than anyone else ever has. It was the core of his gift, because a great comedian is not merely funny. A great comedian creates a relationship with the audience, and the relationship Williams created with his live audiences was something fundamental and profound.
Performing standup is a frightening and disorienting thing, even for pros. Standups talk about their art form as analogous to boxing, saying that if you don't stay in training you can't -- don't dare -- get into the ring. A live performance is always in danger of spinning out of control. You can lose the audience in a split second. Any comedian who's performed enough to learn even the basics of the craft has had the experience of bombing out in front of a live audience -- dying, as comics always put it -- dozens and dozens of times. An extremely original comedian has died even more often. That is a miserable experience. And even experienced pros, even stars, still sometimes have a performance come totally unglued. They have all learned to keep that from happening by maintaining firm control of the performance at all times.
Comics learn, gradually and painstakingly, to conceal their fear and anxiety from the audience. And it is right that they should. Watching a comedian fail on stage is depressing and embarrassing, without any hope of insight or catharsis. Comedians do their best to shield themselves from that public humiliation. They learn to project confidence to the crowd and to keep their failures of confidence hidden. The audience should never catch any scent of flop sweat, no whiff of the performers' insecurities or fears of humiliation. The art form, like every art form, works best when it is grounded in emotional truth, but creating comedy requires concealing the emotional truth of how creating comedy feels.
Williams was absolutely in control of the room. Audiences ate out of his hand. Watching him was nothing like watching an open-mike novice falling apart. But watching him live, when he first emerged on the standup scene in the late 1970s, was also a bewildering and disorienting experience. The speed at which he changed direction, leaping from one bit to another and then back, was then something totally new and unexpected. People were often under the impression that his entire act, every single word, was improvised. (Of course, it wasn't.) It can be hard to remember, thirty-five or nearly forty years after Williams emerged, how radically new he seemed. But he did. It was like he was free associating at lightning speed.
Robin Williams wasn't the first comic to improvise on stage. He wasn't the first to do strange or emotionally raw material on stage. To be honest, his success was never about the material per se; there were much better joke-writers in his generation. And he was definitely not the only 1970s comic disguising his act's formal structure; that had been going on for decades. But what Williams's performances did was turn the basic relationship of live stand-up inside out. His disorienting speed and rapid changes of direction created an exhilarating and slightly scary experience for the crowd. They became the ones who had to live with their fears and accept that the room was out of their control. But they also got to feel the energy of that, too, the nervous excitement that performers channel into their stage act. Watching Robin Williams in person was basically sharing his performance-night adrenaline high.
One thing this did was free Williams to admit his own anxieties, the worry driving the comedy, without relinquishing control. If you listen carefully to his classic Live at the Met album, the phrase you will hear him say most is "Oh, no!" He says it dozens of times in that set, as a segue, as a punctuation mark, as a space-holder to cover an audience laugh. But what he is saying, over and over, is still, "Oh, no!" (The second most common phrase is "Don't you see?") During Williams's first appearance on The Tonight Show (then an important rite of passage for any comedian), he openly talked himself through his anxieties between doing bits. ("Okay ... you're on television ... he [Johnny Carson] means you no harm.") The streak of anxiety in Williams's comedy was never a secret. He was sharing it with us all along.
But the more important thing was that Williams's approach allowed him to build a deep emotional bond with the audience. Live comedy is about a relationship between the comic and the crowd, because the crowd is a crucial element of the performance. A standup act is not the same if it is done for only one person. A tiny audience mutes the comedian's effectiveness. But as the crowd grows larger, so does its power, and the more audience members there are, the more they can set each other off laughing. Standup is a fundamentally social art form. When a comedian has successfully worked a crowd, it creates a powerful feedback loop, with the audience's laughter feeding the comedian energy and confidence, which she or he uses to make the audience laugh harder, until the laughter becomes irresistibly contagious. The comedian has a microphone, but the audience is the amplifier.
Great comedians bond with the audience on an emotional as well as an intellectual level. Williams created an exceptionally deep bond with his audiences, because he shared with them a core truth, the scary excitement of performing live comedy, that other comics had to deny. Williams could not talk about that directly either, but he communicated it to his audience by making them feel the same things he did. That was what made him electrifying on stage. His entire act was about the experience of performing. He was the livest of all live comedians.
And what Williams's act implicitly said was, This is a little frightening, but it's fun. And here we are doing it! He created an act that felt unpredictable and kept the audience off balance, but also created the sense that if they didn't know where any of this was going they were still all in it together. That is a powerful and intimate bond. Williams's live act, in his younger days, felt utterly chaotic, but audiences gave themselves permission to enjoy it, because Williams made the chaos feel safe. He was the benign lunatic. He could do anything on stage, because he had earned the audience's absolute trust.
In his early days, Williams used to close his shows with a quote from the great cult comedian Lord Buckley, who had used it to close his own act:
People are the true flowers of life, and it has been a most magnificent pleasure to have temporarily walked in your garden.
I wish I had the chance to tell Robin Williams the same thing tonight. Thank you, Mr. Williams, and rest in peace.
Tuesday, August 05, 2014
The Other Two Sides in Israel and Palestine
It is not only hard to write about the bloodshed in Israel and Palestine without taking sides. It is impossible for most people to read about the violence in Israel and Palestine without taking sides. So the debate bogs down into questions of justification and self-defense and proportionality: that is, into the utterly useless question of whether Israel or Hamas is more in the wrong. It may well be that one side or the other is more justified, or more culpable. But since answering that question will not prevent even a single death, the question is meaningless. Taking the Israeli side or the Palestinian side does not matter, the real merits of those causes notwithstanding, because the conflict that matters is not between the Israelis and the Palestinians. Neither side can actually win that conflict, and everything those two sides are doing right now puts resolution further out of reach. The two sides that actually matter are not the Israelis and the Palestinians but the peacemakers and the warmakers. That struggle can be won, but not by the side that's currently winning.
Instead of thinking of two ethnic peoples, we can think of the Israel/Palestine conflict as a contest between the negotiators and the escalators. There are negotiators and escalators in both camps. The negotiators want to end the violence and reach a peaceful long-term solution. Various individuals envision different versions of that settlement, and the Israeli and Palestinian negotiators each want their own ethnic group to get the maximally advantageous deal. But the goal is still a deal.
The negotiators have been on a long losing streak, and their position is incredibly weak at the moment. But even at their weakest, there is a single fundamental advantage that cannot be taken from them. They are the only side that can win. There is no military solution to the Israeli/Palestinian problem. There is no endgame in which either group wins by sheer force of arms.
Neither side can wipe out the other. That is not militarily feasible, politically viable, or morally acceptable. And no one is going anywhere. Israel is not going to be swept into the sea. If your goal is to undo 1948 and make it as if Israel never existed, then your goal fundamentally cannot be achieved. Nor are the Palestinians going to be expelled. If you think that a nation founded in part by Holocaust survivors can solve its security problems through ethnic cleansing, you need to face basic reality. No One. Is Going. Anywhere.
In the long run, a negotiated settlement is the only endgame possible. But the escalators (who, like the negotiators, exist on both sides of the ethnic divide) are dedicated to prolonging the war as long as possible. Not to win it. Winning is objectively impossible. The real objective is the continuation of the war itself. If military victory were the actual goal, much of the behavior we see on the ground would be futile or even counter-productive. (Hamas's rocket attacks, for instance, don't make a lot of sense as an attempt to weaken Israel's military. But they are not an attempt to weaken Israel's military.) If we understand the real goal to be provocation, the behavior becomes easily explicable. The violence is not an attempt to defeat the other military, but an attempt to provoke further military action by the opposition. A sudden big offensive is not an attempt to end the fight once and for all. It is an attempt to ensure that the fight does not end.
Some of the escalators are simply refusing to accept military reality, and delude themselves with dreams of victory. Some are driven by their personal ideology or personal hatreds. Some are not thinking straight at all. And some have a vested interest in keeping the hostilities going. Any conflict that goes on for as long as the Israel/Palestine conflict becomes institutionalized to some degree. Structured organizations, both official and unofficial, emerge specifically to wage that particular war. Careers are built around that war. There are wealthy and influential people who rely on the war for their wealth and influence, and power brokers who rely on the war for their power.
There are political figures, Israeli and Palestinian, whose careers are built on taking a harder-line position than their domestic political opponents, no matter how hard a line those opponents take. There are political leaders, Israeli and Palestinian, whose relationship with their constituents is founded on their constituents' fears. There are figures within the Palestinian leadership who have gotten seats at the table by making themselves indispensable to the war effort: the recruiters, the warlords, the money people. At least some of those people suspect that peace would make them dispensable. And on the Israeli side, in somewhat subtler ways (subtler, of course, because the Israeli state is more bureaucratically developed than the Palestinian movement), there are people who prosper in various ways from the militarization of the conflict.
I am not claiming that both sides are equally culpable, or morally equivalent, or any of that. I am not interested in arguing about right and wrong here. Arguments about right and wrong have led to piles of dead bodies. I am interested in arguing cause and effect.
That there are entrenched interests who benefit from the hostilities, on both sides, is not primarily a symptom of individual bad character. It is the inevitable result of a conflict that has gone on this long. A war that lasts two generations stops being just a war. It becomes a way of life. And people will fight to defend their way of life.
The escalators can always keep the war going by provoking the other ethnic group. When the opposite side retaliates, it is a pretext for further escalation, and pretty soon peace talks are out of the question again. Whenever things get too quiet, you convince yourself that the enemy is vulnerable and it's time to take advantage. Then, when the enemy strikes back, everyone on your side of the line has to rally to the fight. Things too quiet? Kidnap some hitchhikers. Build some settlements on the wrong side of the treaty line. Fire some rockets. Break a cease-fire. Sure, some of the people on your own official side of the conflict will tell you not to do these things, but once you've done them the other side will come on the attack and then the people who wanted to restrain you will have no choice but to back you.
The thing to realize here is that the Israeli escalators and the Palestinian escalators, while fighting each other on the battlefield, are also working together. They are both struggling to continue and escalate the war. You don't attack Israeli civilians and expect to get away with it. You don't kill Palestinian civilians in your reprisal attack and expect that this will calm the Palestinian side down. Bringing on the other side's reprisal is the goal. It is never stated that way. It could not be. But that is what is actually happening.
The problem is not just that IDF expeditions into Gaza will not stop the rocket attacks. It's that the point of the rocket attacks is to bring the IDF into Gaza. Why would Hamas, or elements of Hamas, want the IDF to invade Gaza? Several reasons, but one of them is that when the Israeli military is on the move, the people of Gaza have no choice but to stand by Hamas. There is no middle ground on a battlefield. And the escalators' main goal is to make negotiating impossible. Their war is against the middle ground.
Most of the struggle between the negotiators and the escalators is political; it is about whose faction is in ascendance, whose policy wins the debate, and whose orders get obeyed or ignored on the ground. But sometimes things actually flare up into intra-Israeli or intra-Palestinian violence. Fatah and Hamas have sometimes exchanged gunfire. An Israeli Prime Minister has been murdered by an Israeli fanatic because a final peace deal started to seem plausible. If the doves get too close to a deal, the hawks on their own side sometimes try to kill them.
But the hawks haven't needed to do anything so blatant lately, because the party of war has been on a roll. The Israeli and Palestinian hawks have worked together in a masterpiece of unspoken coordination, a long series of seamless no-look passes. In this, the escalators have a massive advantage over the negotiators. The Israeli and Palestinian doves need to communicate explicitly with one another, and they need to trust each other. They have to hold talks. In short, they have to actually negotiate. The Israeli and Palestinian hawks don't need to communicate with each other at all. They can simply act. They know what will happen if they provoke the other side. They can count on it. It's not about trust. It's about predictability.
Worse yet, the doves need unity and discipline on their own side in order to function. They need to deliver on their deals. But the hawks can disrupt things through insubordination or disobedience. They can, to various degrees, freelance. Settlers can disobey the Israeli government, but know that the state and the army will eventually have to back them. Palestinians can initiate attacks on Israelis without necessarily clearing it all the way to the top, and some of the people doing the attacks are not necessarily inside any real chain of command. ("Let's go kidnap a few teenaged Israeli hitchhikers" is not a plan hatched at the top level of leadership.) But the people who go ahead with those attacks know the leadership will not disavow them. An IDF commander can promise his superior that he will use restraint, and then use harsh and provocative tactics once an operation starts. A Palestinian who doesn't like a cease-fire can break it with just a few like-minded accomplices. Last Friday's cease-fire was broken almost immediately by a small group of armed Palestinians. That was not a real attempt to take military advantage, which would require a coordinated set of attacks by a large group. That was freelancing, one small unit or cell just going out on its own. Your leaders agree to a cease-fire, you go out and shoot at Israelis, cease-fire over. That wasn't a side effect. That was the main point of the attack.
As long as this behavior goes on (and it goes on, to different degrees, on both sides), the war will never end. The hawks cannot defeat each other, and on some level aren't even trying. But they are committed to driving any hope of peace from the field. And they are willing to frag the doves when necessary. As long as those seeking to escalate the war can continue defying restraints imposed by their own side, the war will go on forever. And that is really the goal.
cross-posted from Dagblog
Tuesday, July 29, 2014
Snobs vs. the Ivy League (or, The Question of Bill Deresiewicz's Character)
There is nothing a snobbish Ivy Leaguer likes better than putting down the Ivy League. It's an easy way to signal that you are above your own Ivy League school and the privilege it confers -- all a big humbug that your superior perspective sees right through -- while holding on to every last scrap of that privilege. It allows you to position yourself as not only 1. better than people who didn't get into Harvard, Princeton, or Yale, but 2. the benevolent champion of those little people who didn't get in and also 3. better than everyone else who did get into your school and who, unlike you, need to take the place seriously. It's a time-honored game for the insider's insiders, and William Deresiewicz plays it like an old hand in the latest New Republic, with an article titled "Don't Send Your Kid to the Ivy League."
Even the title of that article is disingenuous. William Deresiewicz has never studied or worked outside the Ivy League. He has three degrees from Columbia. He taught for ten years at Yale. Public colleges, and the students at public colleges, are merely rhetorically convenient symbols for him. He displays no understanding of, and no curiosity about, what those places and people are actually like.
Is going to an Ivy League school worth it? Unless you are already a person of enormous inherited privilege, the question is disingenuous. Of course it is. This question is like the popular media question, "Is going to college worth it?" No one asking that question honestly believes that they would have been better off not going to college; they would not be writing in whatever magazine is asking the question this week if they had not gone to college. And none of them would be willing for their own children not to go to college. Asking the question is an act of dishonesty. The writer is at the very least deceiving him- or herself.
Deresiewicz argues that one should turn down admission to an Ivy League school and go to a public university, where you will build superior character. So, if you get into Harvard you should go to the University of Massachusetts instead. Let me say, as a proud alumnus of both Harvard and U. Mass.: don't be ridiculous.
And yes, I learned to think at Harvard. Of course. Were some of my classmates careerists who resisted genuine introspection? Yes, surely a few. But no institution teaches students to love thinking. Only another person can teach you that. The Harvard I went to abounded with such people.
I should certainly not have turned down Harvard when I was 17 and gone to U. Mass. instead. That would have been crazy. And anyone telling a young person in my position to do that isn't striking a blow against elitism. They're just trying to keep less-privileged people out of the elite.
I said yes to Harvard for a simple reason: I could not afford not to. I grew up comfortably middle-class. But we certainly weren't the upper middle class. (One of my parents was a high school teacher, the other a police lieutenant.) I could not turn down a break like getting into Harvard. I could not count on getting another break like that again. Anyone who tells a kid like me to turn down Harvard is doing that kid wrong.
Any 18-year-old who gets a chance to go to school with people smarter than she or he is should take that opportunity. Knowing that nearly all of your classmates know interesting things that you don't is a gift that only a fool would refuse. I am grateful that I was given that opportunity; there is no stronger expression of entitlement than ingratitude.
I have three university degrees: two from world-famous universities and one from a state school. I have spent the last ten years teaching in a public university. I think it's fair to say that I have seen both sides of this question. And I am absolutely committed to public education. In fact, what makes me angriest about Deresiewicz is the way his phony, patronizing praise for public universities helps paper over the crisis that public schools are in. Public universities have been bleeding support for years, with our resources falling further and further behind those of the wealthy private colleges, and Deresiewicz knows it. The endless budget problems interfere, inevitably, with the education we can provide our students. Disguising that basic and terrible fact is a bad thing.
Let me confess here that this is personal. In an earlier article on this theme, Deresiewicz claimed that students were better off going to the university where I teach than going to Yale. He named us specifically and repeatedly. We have wonderful students and I am proud of them, but telling people to turn down Yale for us is insane. But still more insane was Deresiewicz's reason: you see, when Yale students struggle, they have enormous resources to help them: a small but well-trained army of tutors and counselors. My students don't have that. We have some tutors and some counselors, but when our students hit trouble (and my students as a group have far, far more troubles than Yale students), they are mostly on their own. Deresiewicz feels that this is a great thing. You see, it builds character. Isn't it better to be at a poor school, struggling?
I don't feel great about that. I long for the resources I used to be able to call in to help Stanford undergrads when they were in trouble. When my students get in trouble, I don't have those people to call, and that is a terrible, terrible feeling.
My students need that help and they don't get it. Deresiewicz applauds that. In fact, Deresiewicz, avowed anti-elitist, applauds struggling poor kids not getting help. That itself is outrageous. But Deresiewicz's cheap rhetorical ploy had real-world effects at my school, because it served as an excuse for not providing the help that our students need. When Bill Deresiewicz says it's great that my students don't get help, he does my students wrong. It will take me a long time to forgive him.
But what about character? What about elitism and snobbery? It is true that elite schools are full of students who are already from various elites. That is the nature of the beast. But, whether Deresiewicz realizes it or not, the class system is alive and well among the students at public universities. Rich students and poor students have very different experiences at those places, to the ultimate detriment of both. Skim the annual lists of famous party schools: it's not the kids who need to work for their tuition money who are throwing those parties. Drinking your way through school while studying the absolute minimum is one of the oldest ways for students to express their wealth and privilege, and it is now perversely easier to get away with that game at Flagship State than at one of the Ivies.
I cannot deny that elite universities have more than their healthy share of the arrogant, the entitled, and the egotistical. No one who has spent time at one of those places could deny that. There are a lot of big egos at Harvard and at Stanford both. But my experience of the world is that there are some arrogant and entitled people everywhere. Those people don't always base their sense of superiority on going to a fancy school, or on any educational achievement or talent. In fact, many people who feel superior to others don't base their conviction of superiority on anything that anyone else can detect. Arrogance and entitlement are their own reasons. And if you want to prevent a bright teenager from becoming an arrogant jerk, sending her to a school where there are always three smarter people in the room is not a bad idea. Most of what I know about humility I learned at Harvard.
Deresiewicz wants to discuss character, and I don't want to impugn his. My spouse knows him from her own Yale days, and speaks well of him as a teacher. I am certainly willing to hope that his character is better than his essay makes it appear.
So let's make it personal, Bill. You speak of character. Why not apply for a teaching job outside the Ivy League? If you believe in the mission of public education, why not be part of that mission? Romanticizing my students' poverty is not good for them, and not healthy for you either. But you have the tools to help fight their disadvantage. It is not an easy job. It is much harder, in most ways, than teaching at Yale. And it will sorely test your spirit, because teaching a full range of college students means that at least a few of your students will not succeed, no matter what you do, because things outside school prevent them. Knowing that you cannot get them all through is a bitter thing. Knowing that another budget cut is coming, sooner or later, is hard on the spirit. But you wanted to build character, didn't you? You can use the privileges that you have been given to help those who have been shut out. Ranting about how awful Yale is helps no one, and it is a waste of your talents. Our country is full of less privileged schools, with less privileged students. Get a job at one. It is a chance to do something good, and something useful, in the real world.
Cross-posted from Dagblog