So, just in time to ruin my New Year's celebrations, Newsweek has seen fit to publish a credulous article trumpeting the old who-wrote-Shakespeare conspiracy theories. I won't give Newsweek a link, but you can click through Amanda Marcotte's smart takedown at Rawstory if you're curious. The original piece is full of breathless non-facts like "Nobody ever recognised Shakespeare as a writer during his lifetime" [except for at least three dozen separate individuals, writing both in print and manuscript, because Shakespeare was famous] "and when he died, in 1616, no one seemed to notice" [except the six different poets who wrote memorial verses for him]. Apparently you can always say, "there's no evidence" even when there is evidence.
Now, I'm on record about this question on this blog, and under my professional name, and I've been quoted about it in a major newspaper, so I don't want to belabor the key facts here. As the above example suggests, this isn't really a debate about facts anyway. But this phony debate often gets cast as insiders vs. outsiders, the stuffy Shakespeare establishment, with all the PhDs and whatever vs. the free-thinking, imaginative amateur scholars. So I'd like to clarify a few things about how academic and amateur Shakespeareans work.
1. Professional Shakespeareans constantly argue with each other and are rewarded for new ideas.
The standard position of the Francis Bacon/Earl of Oxford/etc./etc. fans is that "orthodox" Shakespeareans are all sticking together because we are afraid of new ideas. This ignores the fact that academic Shakespeare scholars argue with each other constantly about any question that can reasonably be disputed. Winning arguments with each other is how we get ahead in our careers. And winning an argument that brings in a big new idea, or overturns an important old idea, is the gold standard. The academic Shakespeare establishment isn't a conspiracy. It's a boxing ring.
This is one of the reasons that academic writing can be hard for general readers to enjoy: it focuses on highlighting the new idea that the writer is putting forward, rather than the ideas that the reader might find most interesting. Something that's interesting to you as a reader but that every scholar has agreed on for the last fifty years won't get much attention, while today's new idea, even if it's quite small, will get the most attention. And because every argument a scholar puts forward is liable to be torn apart by other scholars, scholarly writing tends to be carefully hedged and to shore up even pretty small issues so that they don't give another critic an opening. That's another reason academese is hard to read.
I don't write my scholarship to highlight how much I agree with more established Shakespeareans. It's just the reverse. I once criticized something written by the then-head of the Shakespeare Birthplace Trust (whom many Oxfordians especially dislike) so, ah, energetically that I was publicly accused, in print, of having been "unfair" to him. (Of course, I don't think that I was unfair, but hey, to offend and judge are distinct offices.) Scholarly writing demands pointing out where other scholars are wrong.
A member of the "Shakespeare establishment" who could make a strong case that Shakespeare's works had been written by someone else would stand to benefit enormously. Even if it weren't a completely bullet-proof case, the rewards for making a reasonably strong case, opening room for legitimate doubt, would be huge. You'd immediately become a major player in the field. If I thought I had the evidence to back up a case like that, you'd better believe that I would make it. And so would a lot of other people like me. Yes, that would mean publicly disagreeing with many important senior scholars; that would only make it sweeter.
(On the other hand, the reward for believing Shakespeare wrote Shakespeare is nothing, just like the reward for believing that the sky is blue and water is wet is nothing. No one beats someone else out for a job because they both believe the same thing that no one else doubts. One of the frustrations many literary scholars have when teaching beginning undergraduates is those students' deep commitment to arguing things so obviously true that they're not worth bringing up; making arguments like that is not what professional academics value at all.)
The reason I don't make a case for someone else writing Shakespeare is that I can't. The reason that a large group of other people inside the academic world haven't done it is that they can't either. If there were evidence to make a good case, someone would certainly be ambitious enough to make it. But it never happens.
2. Amateur scholars are welcome in academic debates.
One of my generation's two greatest historians of Shakespeare's theater is an independent scholar named Dave Kathman, who doesn't have a university job or a PhD in literature. Dave works as a financial analyst in Chicago, and does the Shakespeare-theater-history thing as a hobby. But he's enormously productive and valuable as a scholar. There's only one PhD-holder in my generation who's more important to that specific field than Dave is. (That scholar is an Oxford professor, very much part of the establishment.) Dave has found original documents that we had not known about, because he looked in archives no one had thought to search. So suddenly, thanks to Dave, we have apprenticeship records for Shakespeare's boy actors. We can prove when they joined the company, and we can closely estimate their ages. It used to be that we knew very little about the boys who played female parts; now we know more about them than we know about some of the adult actors.
Dave doesn't get turned away because he doesn't have a PhD in our field, or because he doesn't teach college. He's been welcomed and valued, because he makes important contributions. He has also made a strong argument that changed the way we think about an important primary document from theater history, a piece of old paper that's obscure to outsiders but which turns out to underwrite a lot of other theories about what was going on in the 1590s. Dave made a strong case that the document dates from a different year than we thought, and belongs to a different acting company. This, of course, led to a debate. Shakespeareans debate things. And Dave was opposed by some very high-profile senior scholars who were committed to the old way of looking at that document. But they didn't pull rank on him. No one said, "I teach at an Ivy and you don't have a PhD in English, so you're wrong." They had to meet him on the facts, and some eventually had to concede that he was right.
We don't turn amateurs away because they're amateurs. An amateur who makes a strong case can win the day.
3. Shakespeare "authorship disputes" are actually OLDER than professional Shakespeare scholarship.
In fact, the "authorship controversy" started in the days when every Shakespearean was an amateur. It didn't begin until the 19th century, long enough after Shakespeare's death to raise difficult questions for the doubters' own case. (No one in the 16th, 17th, or 18th centuries expressed any doubts. But sometime after Shakespeare had been dead for 200 years, a few people suddenly decided that it was impossible that he wrote his works.) But university courses on Shakespeare come even later still, as do doctoral degrees in English literature. Those don't get underway until the second half of the 19th century.
So this didn't start as an argument between professors and outsiders. There were no professors of Shakespeare. Everyone was an amateur (and that includes some of the greatest Shakespeare scholars who have ever lived).
But when literature departments got organized and people started writing research dissertations on Shakespeare, none of the maybe-someone-else-wrote-it stuff got used by the new group of pros. It wasn't because people conspired to exclude it. Someone who could prove that case in 1865 or 1915 would have been highly rewarded, the same way someone would be for proving it in 2015. But the evidence for other candidates has never been there. And you can't get away with telling your PhD adviser bullshit like "No one ever mentioned Shakespeare as a writer during his lifetime." Your adviser will know that's a lie.
The "Shakespeare authorship" arguments are like astrology: an old idea that professionals working in the field have outgrown but that stays popular with a slice of the general public. Like astrology, the Shakespeare-authorship game has trouble generating new hypotheses that can stand up to a rigorous test. And so authorship debates, like astrology, tend to recycle old claims over and over again, giving them a certain time-in-a-bottle quality. I'm having trouble finding anything in that Newsweek story that you couldn't find somewhere else by, say, 1940. In the academic world, a piece that just repeats things from decades ago is completely unpublishable. But the authorship hobbyists are more than happy to dish out the same old cabbage, no matter how many times it's been served before.
Journalists writing "news" stories about these conspiracy theories need to spin the Shakespeare-not-Shakespeare idea as somehow, well, new. But it's not new. It's a very old idea, nearly two hundred years old at this point, and it hasn't made any progress in a long time.
cross-posted from (and comments welcome at) Dagblog
Wednesday, December 31, 2014
Tuesday, December 23, 2014
Confidence, Rejection, and Criticism: Advice from Actors to Academics, Part Three
Christmas week is especially hard for young academics trying to get a job, especially in literary studies. The annual rhythm of the job search means that most first-round interviews (the interviews that take place at major disciplinary conferences over the winter) get scheduled during the first half of December. By this time of year, grad students (and recent PhDs) looking for a job are counting the meager number of schools where their applications are still active; they may have applied to dozens of jobs and gotten one or two first-round interviews to show for it. Worse yet, many people are getting nothing but the long and lengthening silence that tells them, day by painful day, that nothing is happening for them this winter.
And then many of those eager job-seekers, who've been tops of their classes all of their lives, have to fly home for Christmas and explain to their family that they're not getting a job this year, that they're not even in the running for a job this year. The darkest time of year is pretty dark for people in my business.
So, since it's that time of the year, it's time for a third installment of "Career Advice from Actors to Academics," inspired by Robert Cohen's classic book of advice Acting Professionally. As I wrote in part one, an academic career has become like a career in the arts because of the scarcity of work and the pervasive rejection. So academics, especially newer academics, can learn from our brothers and sisters in the theater (and in the other arts as well), who've been dealing with rejection and penury since back in the day.
But the hard truth is that artists are better prepared for rejection than scholars are. Few arts careers have the kind of heartbreaking schedule that the academic job search demands, where two-thirds of your job prospects for the year can evaporate within a window of a few weeks. Actors get rejected all year round. So do writers, dancers, sculptors, painters, and filmmakers. A stand-up comic who lives in the right city and works hard enough can get rejected every single night of the week. But none of those artists have to watch most of their chances for the year slip away just before Christmas. Artists get to space out the rejections better, and to inure themselves on a daily and weekly basis. (When I was an undergraduate theater bum, every show for the semester did make its casting decisions during the first week of the term, which is the only thing in my life that even remotely prepared me for the annual MLA conference.)
More importantly, almost every working artist, including those who've gotten an MFA, has started racking up rejections by their early twenties at least, while scholars don't begin to experience the hard knocks until after graduate school has systematically left them unprepared for them. Every actor gets turned down at auditions constantly, and many who enter a graduate acting program have been turned down many times before they get to school. The same goes for writers and other artists. After all, if people were already putting you on Broadway, you wouldn't go to acting school. But in fact, people are not putting you on Broadway. So actors who get a graduate degree have had to toughen up at least a little; they're prepared for the hard realities of the market because they've already tested them.
The scholar's career path is completely different. The rejections don't start until after you get your PhD. Graduate school is tough in all kinds of ways, but it promises to reward all of the deserving, as the job market will never do. If you earn a degree, you will get that degree. The working life of many doctoral students is a long series of A grades, scholarships, fellowships, and departmental awards until graduating with the doctorate. Then the working life suddenly turns into a long, bitter round of pummeling rejection. Most people aren't ready for that at all. How could they be?
The only thing that will get you through is confidence, which is not at all the same as ego. Let's go to Cohen's working definition:
Confidence is the power a person has over his or her own personality; it allows the person to accept criticism and at the same time rise above it.
Note the importance here of accepting criticism without feeling belittled by it. The false confidence of a heavily-defended ego cannot take criticism at all. But an ego like that can never survive an artistic, or an academic, career. You have to have a healthy perspective that can take criticism on board (although you don't necessarily allow any given critic to override your judgment completely) and find ways to use it.
It's important here to distinguish between criticism and rejection. If you have a fragile ego, those two things sound the same, or nearly the same. In fact, they are fundamentally different. Criticism, even if it happens to be mistaken, is almost always a gift. Someone has taken time and effort, neither of which are in great supply, in order to help you improve. Some criticism is not helpful, but all criticism is an attempt to be helpful.
Rejection does not come with criticism. It just says no, and moves along. The eerie silence that haunts some job-seekers the week before Christmas, or the brief formal rejections that some hiring departments send, don't offer any tips on how to improve.
You also need confidence to deal with the polite silence of rejection. Here is Cohen again:
[Confidence] allows an actor to believe in the reality of his or her performance even when no one else does. A person may have all kinds of doubts about his or her potential for career success, but may not doubt that "he is an actor," that "she can act."
Note that confidence is not a prediction about results. "I will be a star someday" is not confidence. "I am an actor," even when you've been turned down for the fifteenth audition in a row, is confidence.
In the same way, your conviction that you will get an incredible and shiny job this time out is not confidence. (In most cases, it is simply magical thinking, which is unhealthy.) Confidence, in the sense Cohen is using it, is not about predicting career outcomes. These businesses are too precarious for predictions like that. Confidence is not your will to believe that the plum Ivy League job on this fall's listings will be yours. Confidence is the belief that your work is valid, that your scholarship is actually a contribution. You might note that this is very close to the distinction between believing in "yourself" and believing in your work as something separate from you, which I think is largely a healthy distinction.
Confidence, in Cohen's sense, is about believing that the work you do is useful and worth reading: that what you do is scholarship, no matter what the external rewards are.
I would add one further thing about weathering rejection. In all of the arts, the psychologically healthy rule of thumb is that successful artists (read: academics) get rejected all the time, and that they are rewarded only sometimes. To put it another way: you will be rejected even if your work is good, indeed no matter how good your work is, but you will never be accepted at all unless your work is good.
That means that you should usually only attach meaning to the good results, and write the rejections off as the normal cost of doing business. That would be an irrational approach in most other endeavors, but in the arts, including the scholarly arts, that approach accurately describes the real facts on the ground. If the talented are only rewarded one time in ten or twenty, then the single success is more meaningful than the nine or nineteen rejections.
If you are a struggling new academic and you are finding that the rejection is truly universal -- if, for example, you applied for three dozen jobs this fall and none of them even asked you for an additional writing sample -- then you probably need to change something about what you're doing. Most likely, there is some qualification that you need and don't have, something that you need to add to your CV in order to enter the pool of viable candidates.
But if you are getting small flashes of encouragement inside dark, watery depths of rejection, then you are perfectly rational to take the encouragement, and not the rejection, as meaningful. No one can afford to reward all of the deserving job-seekers. There are too many good people to offer them all even a first-round interview. This is the literal truth. But at the same time, no one has any need or reason to waste their own time with someone who isn't, on some level, a viable candidate. You only get asked for extra materials if your CV is up to snuff. You only get a conference interview if the committee thinks they might actually hire you. And even if you don't progress to the next round of interviews, you should remember that the people who interviewed you saw you as a professional doing real work.
There are more good people than there are rewards for good people. So even the good are only rewarded rarely. But only the good are rewarded at all. Forget the failures, because everyone fails. Remember the successes, because there is only one explanation for success.
cross-posted from (and comments welcome at) Dagblog
Sunday, December 21, 2014
Police, Danger, and the Social Contract
I was blogging about the police tonight, and about the responses to protests of police brutality. Then I heard about the shooting of two police officers in New York City, so the rest of that post (and some of the others I have been working on) will have to wait.
The first thing I want to say is that absolutely nothing justifies this. Nothing justifies the murder. And if the murderer committed his crime in the service of any reasonable cause, he has set back that cause tonight. You don't get justice for Eric Garner, or for anyone else, by vigilante revenge. That only makes the problem worse.
We live in a culture where the police (who face real danger in their work) have been taught -- indeed, actively trained -- to be excessively fearful, to the point where some officers will put citizens' lives and safety at substantial risk rather than face some very small risk themselves. (For example, they might apply a chokehold to a suspect who's already being held on every side by several officers, and who can't free either of his arms. What danger that was meant to eliminate is hard to say.) Basically, the logic is that every possible trace of danger should be eliminated, which is impossible. So the effort to eliminate all danger generally means being extremely aggressive in situations that aren't actually that dangerous.
But that culture of fear is not helped by randomly killing police officers. It's fear that's driving the aggression, and the fear is fed by the potential randomness of the danger. Cops being killed without seeing the danger coming translates, in our current atmosphere, into cops being hyper-aggressive in situations of minimal danger because, "You never know."
Killing two cops in an ambush won't break up the mindset that killed Eric Garner. Killing two cops at random feeds the mindset that killed Eric Garner. When the police have been convinced that they could die at any moment, they take crazy and dangerous steps against people who actually pose no threat to them.
But pointing to this murder as a justification for that bad and crazy policing is a mistake. This doesn't justify anything. And the NYPD's aggression on the street will not, cannot, protect them from things like this. Saying they need to get tough with non-violent offenders in misdemeanor arrests because they could be killed in an ambush makes no sense. That strategy creates new problems without solving the old one.
The ugly truth is, police officers (like every other human being) are extremely vulnerable to a surprise ambush with a gun. Two police officers were killed this way in Las Vegas this year, gunned down by anti-government nuts while eating lunch. And, of course, the Tsarnaev brothers ambushed and killed an MIT campus cop last year. The method is the same every time: come up behind a police officer and shoot. The attacks in all three cases came out of the blue, and there was nothing that the officers could have done to protect themselves. It's not a question of police tactics. None of those cops had a chance.
But this is where the current approach to police work, the attempt to eliminate any and all potential danger, breaks down. You can never eliminate all danger. You can never even eliminate all mortal danger. Every police officer -- and every police officer's family -- has to live with that small, terrible chance. (I am a police lieutenant's son myself; I know exactly what this feels like.) There is always a little danger that you can't foresee or protect yourself from. But you definitely can't get rid of that uncontrollable danger by getting extra tough when there is no danger. Someone with a gun could always come up behind you. You can't protect yourself from that by choking an unarmed guy who's selling loosies on the street. That only creates more problems.
Least of all should the Mayor of New York, or other people who have legitimately criticized police tactics, be blamed for a crime against police officers. Police work is too important to be shielded from any criticism, and the difference between good and bad police work is much, much too important for bad cops to get a free pass. When police work becomes so recklessly bad that unarmed civilians are getting killed, when the police have actually become a cause of violence on the street, then the civil authorities have a duty to look into that. They would be derelict if they did NOT investigate.
In fact, the police are safest when they have strong civic oversight. In the end, the police's greatest protection is the social contract, which they are meant to enforce. People support the police because the police protect them. If they endanger the public instead, the social contract breaks down. If ordinary citizens know there's a legitimate grievance process that works, and that they are safe from needless aggression by the police, the cops are safest and most respected. But when those legitimate outlets do not exist, or break down, then people are wrongly tempted to seek redress by illegitimate means. A breakdown of the social contract leads to unpredictable violence. And that puts everyone in more danger.
cross-posted from Dagblog