Monday, October 28, 2013

Dead Man's Name Tag

I've been away at an academic conference for nearly a week, leaving blog posts unfinished, e-mail unanswered, and campus office untenanted. I had a wonderful time with a bunch of scholars and actors at the American Shakespeare Center's reproduction of Shakespeare's Blackfriars playhouse. (If you'd like to see some excellent theater, a trip to see the ASC's company in Staunton, Virginia, is a great idea.) But I also bumped up against a small problem that's begun to follow me wherever I go professionally: the problem of my (real) name.

I am not the only Shakespeare scholar with my name. That's not surprising. My real name is quite common, not "John Smith" but a dirt-common first name with a vanilla-ethnic surname, so I bump into nominal doppelgangers all the time. I've gotten other people's phone calls and in one memorable case another person's subpoena. For years I went to a father-son barber shop, right across the street from my workplace, where both barbers shared my name and the younger barber, the one who cut my hair, even shared my middle name. So I wasn't surprised to discover that there was an older Shakespearean with my name, someone who began university teaching while I was still in grade school. And I knew what to do about it.

I have always been careful to use my middle initial in my academic byline, especially in my publications but also when registering for conferences, so that conference programs and name tags identify me as "Cosimo P. Cleveland." This is the simplest (and likely the only) way to differentiate myself from the earlier scholar who published as plain "Cosimo Cleveland" or, occasionally, "Cosimo T. Cleveland." Using the middle initial feels a little fussy and overly formal, and it wouldn't be my preference in an ideal world. I certainly don't introduce myself with my middle initial when I'm shaking hands; in fact, I use a less formal nickname, equivalent to "Cozmo" or "Coz." But leaving the initial out of my byline would be sloppy and disrespectful. It would also verge on filial disrespect for my own father, a published spy novelist whom we can call "Cosimo C. Cleveland," but there's not much chance my work will be confused with Dad's; books by that Cosimo Cleveland tend to be about fearless Mossad agents and books by this one tend to be about Elizabethan actors marking up their scripts. In any case, the middle initial is on my book and on all my published articles so far, so there's no changing it. My byline is now my byline.

But when I go out in the professional world, that fussy little middle initial has been increasingly dropping away. The problem isn't that people confuse me with the other Cosimo Cleveland. The problem is that he's getting erased from history.

Cosimo T., who got his first job teaching college a quarter-century before I got mine, published much less than I have. This isn't a reflection on him, and certainly isn't a reflection on me, but an indication of how much our profession changed between his generation and mine. Professors hired in the seventies did not publish nearly as much as professors must today, because professors then were not expected to publish the way we are today. I published more articles before I got tenure than Cosimo T. published in his entire career, not because I am smarter or more industrious than Cosimo T. was, but because one of the requirements for me to get tenure in the 21st century was to out-publish the entire three-decade career he spent at a somewhat better school than mine. If I hadn't out-published him by the end of year five, I would have been fired. This is true everywhere. All academics of my generation have to produce much more research than the older generation did. Those are just the facts of our business.

And, unlike Cosimo T., I have published a book. That's partly about generational expectations as well. But it means, inevitably, that more people have heard of the younger Cosimo than of the elder. So they don't necessarily see the point of my middle initial, except as something pointlessly fussy. They don't see it as differentiating me from the other Cosimo Cleveland, because they don't know there ever was another Cosimo Cleveland.

So I unfailingly send in my conference registration paperwork as "Cosimo P.," but sometimes I show up and open my program to find "Cosimo Cleveland" on the schedule. Not always, of course, but twice in the last month. And I get a name tag identifying me as simply "Cosimo," so that I have walked around a conference hotel for a long weekend wearing a retired man's name, and now, I have come to fear, a dead man's name. I googled the other Cosimo Cleveland this morning and saw an ambiguous reference to his death. But when I search "Cosimo Cleveland Shakespeare obituary," Google just gives me a bunch of links that refer to me. "You've totally eclipsed that guy," one of my conference friends told me six months ago when I explained this problem. "Everyone who talks about 'Cosimo Cleveland' means you." But being part of someone else's eclipse, even unintentionally, is not a good feeling.

And while I can usually insist on the middle initial in print, I can't make people remember it when they cite my work. It's very common to leave out middle initials by accident, or to misremember the initial, when writing footnotes. I've made both mistakes myself, meaning no disrespect to the scholars I was quoting; before I had any work of my own to footnote, I had not thought about why people might prefer a specific form of their names. (I apologize to those scholars here, and will happily do so again in person.) There's no great conspiracy at work, but the error is clearly related to the earlier Cosimo falling out of academic memory. If my name were Stephen X. Greenblatt, I would not be having this problem.

There's no way to insist on the middle initial or to make any kind of fuss when it gets dropped. That wouldn't revive Cosimo T.'s reputation, but would surely give Cosimo P. one. All I can do is to scrupulously use that initial myself, because the fact of that earlier career deserves to be acknowledged. History is part of my work. If I spend much of my time trying to retrieve lost details four centuries gone, I should not consent to forgetting the recent history of my own guild. And as someone who will die someday, I find it sad to see another person's life being forgotten. Devouring time may blunt the lion's jaws, but it is also devouring the memory of one Cosimo Cleveland, a dedicated teacher and scholar, and in due course it will come for me. There isn't even malice involved. People simply stop knowing. The name tag hanging around my neck in this or that hotel ballroom gives no testimony to that earlier Cosimo's work or life, but I have to read it as a reminder: memento mori.

cross-posted from Dagblog

Monday, October 14, 2013

A Plague on False Centrists

“A plague on both houses!” I've seen that line from Romeo and Juliet quoted repeatedly for the last two weeks, as pundits and bloggers devoted to “balance” argue that the Democrats and Republicans share the blame for the current government shutdown and the looming threat of default. The line itself is a cliche, but quoting Shakespeare makes you sound learned, and that is too often the major aim of both-sides-do-it journalism: making the journalist seem wise and above the inconvenient facts of the fray. Shakespeare was a poet, not a pundit, more interested in dramatic complexity than sound bites, but if we’re going to mine his plays for lessons, we should remember what we’re quoting. Saying “a plague on both your houses” does not solve political conflict in Romeo and Juliet or in the real world. It accelerates a destructive feud, and it's meant to. Those who curse both houses are not trying to make peace. They are egging on a brawl.

The line “a plague on both your houses” is of course spoken by Romeo’s friend Mercutio, mortally wounded in a street duel. Taken out of context, it sounds like an accusation that the Montague and Capulet families are both equally violent and equally blameworthy. But the audience can see for itself that this is not true. Romeo steadily refuses to fight. He and the other Montague on stage, his cousin Benvolio, work tirelessly to head off the duel, and when that fails Romeo physically throws himself between the fighters to stop them. Romeo and Juliet is a play about dangerous civil disorder, and the leaders of the Montague and Capulet houses do share the blame for disrupting the peace. The senior leaders who should rein in each house’s servants and young men, the clowns and hotheads, instead actually encourage aggression. But Mercutio pretends that there is no difference between the play’s most violent characters and its handful of peacemakers, no difference between starting a fight and trying to stop it. His pretense of neutrality is worse than an empty pose; it actively promotes violent conflict.

Mercutio deliberately goads one of Juliet’s cousins to combat, and when Romeo refuses to be baited Mercutio jumps in to start the fray. He not only wants a fight; he insists. “A plague on both your houses” is the cry of a man killed with a sword in his hand. It is meant to spur Romeo to further bloodshed, accusing him of causing the death he tried to prevent, and it succeeds. Every death in Romeo and Juliet comes after the "plague on both houses" line. Blaming both sides equally undermines the peacemakers but empowers the hostile and unreasonable, freeing them from any public responsibility. If every time you kill a Montague (or a Capulet), people shake their heads and blame both the Montagues and Capulets, you might as well kill as many Montagues or Capulets as you can; your victims split the blame with you. And if you have to take half the blame every time you try to make peace and get attacked with a sword instead, you will never manage to make peace. Apportioning half the guilt for every crime to the criminal and half to the criminal’s sworn enemy is not an act of moderation. It promotes and rewards extremism.

The American media have unerringly taken the side of confrontation and brinkmanship and discord, through years of mounting political disputes and manufactured crises. The press has done this while pretending neutrality with sad, wise shakes of the head, lamenting the unreasonableness of both sides. That head-shaking is not neutrality. It is active intervention on behalf of the unreasonable. Unprecedented acts of obstructionism are treated as routine tactics. Partisans abusing the legislative process to extract concessions are awarded the same stature and coverage given to national leaders seeking compromise. Worse, those who work for conciliation and those who work against it are portrayed as equally partisan, as if deal-making and deal-breaking were simply two sides of the same coin. Individual journalists may be liberal or conservative, but the political media itself is clearly biased toward confrontation: indifferent to policy results but hungry for drama, always looking for more juicy showdowns and shutdowns and crises. So the press has played Mercutio, standing in the street hoping to see a fight, scoring the “winners” and “losers” of every pointless showdown. The media pose as objective bystanders because they forgive only half of every crime and slander only half of every good deed. But absolving the cheats and brawlers is not objectivity. Cursing the peacemakers is not standing honest witness. The press has not been a bystander. It has acted, scene after pointless scene, to build more conflict. Our political journalists have helped to write the sorry drama our nation must now play out. They should take their bows.

cross-posted from Dagblog

Thursday, October 10, 2013

How Much Do You Have to Write to Stay Sane?

Flavia has a post about her writing process, with many thought-provoking comments from her readers, and Dame Eleanor Hull posts a great deal about the academic writing life. I find that I can't give a clear account of my writing process right now, if by "writing process" we mean my composition process. But I have learned, through difficult trial and error, that I need three things to keep my writing going well:

1. Something accepted but not yet in print.

2. Something submitted but not yet accepted.

3. Something new that I'm actively working on.

I know these sound like results, or productivity targets. But I don't think of them that way. The goal isn't necessarily to have x amount of work accepted in y amount of time. When I do manage to have all three of these things at once (and I certainly have not always done so), they operate as a security blanket. They allow me to write, because they keep me from worrying about my writing.

Having at least one article in press, one out for review, and one on the boil, when I can manage that trick, keeps me from obsessing about the response to any individual piece of writing. Otherwise the danger is that too much energy goes into worrying about one specific piece. That's not healthy because no one piece of writing defines you as a writer, and not healthy because the things you're worried about are beyond your control. If you're an academic writer, your articles can get swept around in the unpredictable weather of peer review, or becalmed for months and months at a journal that's going through organizational problems. You can't control that. Once a piece gets published, people will read it or not, like it or not, cite it or not. You can't control that either. And while you're working alone at your desk, you can lose your way worrying about whether or not what you're doing is any good at all, meaning whether anyone will like it. Anyone who's written a dissertation knows how easy it is to despair over a piece of writing that you've spent too much time working on by yourself.

But if something's out in the mail and something else is in press and something is getting worked on steadily at my desk, it's a lot easier not to worry about the one in the mail, or pin too many hopes on the piece coming out next summer. And it's easier to let the thing you're working on be itself, and let the worries about venues and reviews come in due time. Most of all, no single thing starts to feel like the barometer of your success. Yes, some things, especially the book you're working on, are more important. But having more than one piece of writing at various stages is, at least for me, a wonderful psychological buffer.

My three-things rule is especially suited for academic writing, where it takes months for a response to come back after submitting an article, and often a year and more between acceptance and publication. But fiction writing works on the same schedule, and literary fiction perhaps a slower one. Literary magazines can take more than six months to give you any response. The other reason to try to have pieces in various stages of acceptance, submission, and preparation is that you cannot afford to stop working for the three to four months it takes to hear back about each piece. When an article or a story is out the door, you need to work on another article or story. You don't have that many months to waste. And if a fiction editor takes a pass on a first story but asks to see another, you had better have another story, better than the first, to show her.

You're not a writer because you have one story you're proud of, or one article you think is important, or even one book manuscript that you hope will win some prize. You can't afford to let your sense of yourself as a writer be tied to the fate of that one piece as it tries to find a home. Or maybe you can. I certainly can't. I need to feel that if I'm a writer, there's more where that came from. If I approach my work that way, I can afford setbacks to this or that particular piece, and you have to be able to afford the setbacks if you want to write, because sooner or later they're coming.

Writing is a public act performed in a private place, something you do alone at your desk for the widest audience you can manage. If you give too much weight to what other people think, or give too much weight to your own private anxieties, you will have trouble writing at all. You give up hope after too many rejections in a row, or begin undermining your work in a misguided attempt to give people what you think they want. Or you will wrestle with yourself endlessly in private and pin yourself, never putting anything in the mail because it's never "finished" and never finishing anything because you refuse to show it. Either way, you get lost, and the work suffers. Staying sane enough to write means positioning yourself somewhere between your inner voices and the outside world, where you are able to listen to both clearly, because you need to listen to both, but where neither gets the last word.

cross-posted from Dagblog

Tuesday, October 08, 2013

The Tragedy of the Will

Twenty years ago, while I was talking politics with my friend Mike, he said that Reagan's great achievement was what he called "the Nietzschification of the Right." I didn't grasp what he meant at first, since I typically encountered Nietzsche quoted by leftist literary critics. Mike's point was that Reagan had transformed American conservatism from a stodgy, rationalist enterprise into an emotional, charismatic movement like the New Left of the 1960s. Main Street conservatism gave way to Movement Conservatism, founded upon passionate emotion and conviction. I've thought of that conversation a lot over the last two decades, through the rise and fall of Newt Gingrich, the second Bush Presidency, and the flood tide of the Tea Party. Mike's case has gotten stronger year by year. Mike himself, now a government regulator, has been furloughed in the shutdown.

Part of the right wing's Nietzschification has been its emphasis on the will as the decisive force in events. The current version of conservatism has become convinced, more and more thoroughly, that any reality can be reshaped by a sufficiently powerful imposition of one's will. Nothing is impossible if you just believe. But this turns out not to be true. In the actual world, reality takes belief's lunch money on a regular basis. Movement conservatism as practiced by Reagan was still largely the art of the possible; he was empowered by his movement's fervor, but mostly did what he could get through Congress and what his military forces could manage. When he did unrealistic things, like raising the deficit sky-high with tax cuts that were sold as likely to pay for themselves, the consequences either got shunted to the future (because Reagan's huge national debt would eventually be someone else's problem, i.e. ours) or borne by people without any political muscle to fight back, such as the mentally ill or the homeless. He ignored the consequences he could afford to ignore. But when he lost some Marines in Beirut, he pulled the Marines out. He didn't try to will the situation to his preferred result.

By the second Bush presidency, much of the Republican party had lost its ability to make that distinction. The Iraq War is nothing if not the disaster of policy makers who felt they could reshape the world simply by willing it. This is the period during which a White House source talked derisively about the "reality-based community" and ranted about how the Administration was "creating new realities." That's the force-of-will worldview right there. And you heard an enormous amount about will during the Bush II years. Military strategy was often cast as about demonstrating sufficient amounts of will, as if once our enemies realized we were serious, nothing else would matter. (This of course leaves out the possibility that our military enemies might themselves bend intense willpower toward achieving their goals. Since our primary enemies were hardened religious fanatics, that was more than a possibility.) This led Matt Yglesias to coin his phrase "the Green Lantern Theory of Foreign Policy," after a comic book superhero who could do anything with sufficient willpower. The last decade demonstrated just how poorly that theory worked.

Now the conservatives in the House are not merely trying to impose their will over policy realities, but over the reality of the political process itself, as if they could guarantee a victory over Obama simply by being more committed to the goal. They have made demands and not gotten what they demanded, and they have no plan but to stick to those demands. That's it. They ultimately believe Obama will cave because the power of their belief itself will make him cave. They don't have any other plan, and they have no endgame. Recently, some Republican senators from swing states angrily asked Ted Cruz what his strategy was, and he answered, apparently unconcerned, that he did not have a strategy. When this provoked his fellow Republicans to vocal rage, Cruz allegedly responded by calling them "defeatists." Think about the mindset that reveals. Someone with no game plan at all, someone who has no idea of how to try to win, takes the suspicion that he will therefore not win as a sign of a character flaw. Those who expect to lose simply because they cannot see any possible way to win are defeatists. Winners, evidently, do not need plans in Cruz's view of the world. They just need to believe in themselves.

That the Republicans, and especially the Tea Party wing of the Republicans, might actually suffer a political defeat seems to strike them as inconceivable. Their plan is to will themselves to victory. The fiscal and political health of our nation is in the hands of people too unrealistic even to calculate their own selfish chances. They are not unrealistic by chance, but by design. They are not simply poor gamblers, bad at estimating their odds. They are opposed to realism on principle. Realism is just defeatism. They are committed, more than anything, to the primacy of will over reality. That is the beating heart of their value system. To accept facts that they cannot change would be a betrayal of their most important principle. To do so would leave them lost and rudderless. Of course they can't make concessions to reality, let alone to Barack Obama. They cannot bring themselves to concede that "reality," as we know the term, even exists.

cross-posted from Dagblog