Tuesday, September 28, 2010

The Second Amendment Right to Kill You

cross-posted at Dagblog

Someone with an assault rifle shot up the University of Texas today, and then he killed himself. I'm pretty nauseated just writing that first sentence, considering UT's history. No one is dead but the gunman.

Ten days ago someone else walked into Harvard Yard with a handgun, stood on the chapel steps, and killed himself in full view of a tour group. It was Yom Kippur. He left a nineteen-hundred-page suicide note, detailing his inability to get past sophomore philosophy questions.

In related sophomore-logic problems, one commenter on UT-Austin's emergency alert page used this to argue for more guns on campus.

Andrew Kelling said on September 28, 2010

This is exactly why we need to allow concealed carry on campus, so that if this shooter had decided to open fire, he would've been stopped long before he would have done much damage


That's one deep dedication to the counter-factual there, Andrew. Although the shooter harmed no one but himself under the current gun laws, it proves a need for even looser gun laws which "would've" hypothetically prevented him from doing any harm. Let's imagine things went even worse than they did, but then imagine that something else magically solved the problem. When your guarantee of public safety rests on not just one but a series of contingent events working out in your favor, I'd say you've just stepped down from the planning committee. But Andrew's just working with ideas that are already out there, ideas which are so committed to everyone's freedom to have an AK-47 that no real-world evidence can possibly interfere.

Apparently, we are committed to the Second Amendment, and committed to it everywhere, even in places dedicated to strengthening the mind and spirit. Thanks to the Founders' commitment to intellectual liberty and philosophical ideals, there is nowhere in this country where the life of the mind cannot be cut short with a small piece of lead.

Both Harvard and Texas are beautiful places. They've both been kind to me in what time I've spent there. They both have people I like and want to see safe.

Both of those stories are going to be spun as relatively happy endings, because neither of the unbalanced gunmen murdered anyone but himself. Pardon me if I don't see suicide as a happy ending. And I'm tired of talking about preventing murderers from getting guns. It is a terrible thing to sell a suicide the bullets.

I wish that 1,900-page suicide note had a sequel. I don't mean that as a joke. I think that every suicide note should have a sequel. I think the people who write them could eventually come through the other side of their suicidal depressions and rejoin the living world. I think that would be better for the potential suicides and better for the world.

It's easy to die. It always has been. But there is no point in making our fragile lives more fragile, making them easier to throw away in one dark moment. Death has always been stupid. But I'm sick of living in a country that makes dying so easy.

Thursday, September 23, 2010

Teaching Shakespeare and the New Normal

cross-posted at Dagblog

My profession has a lot of annual rituals, some obvious and some not, and one of them happened the week before last: the annual job list went live (and, in another annual ritual, came precariously close to crashing for the first afternoon).

The start of college classes in the fall also means the beginning of the hiring season for university faculty; most full-time jobs, and most of the good ones, are posted a full year in advance. If you want a job teaching Shakespeare to college students, you start looking at the job list in September. Ten days ago, a list of almost every college and university hiring English professors for fall of 2011 went live, and everybody trying to get one of those jobs took a look. The list gets updated throughout the next month or so, and there will be a second, shorter list in the spring. But the fall list basically gives you a sense of where the job market is this year.

I read the list every year, for a lot of reasons. When my department is hiring, I check to see who else is hiring in the same field that year. And I check the list for Renaissance literature out of professional curiosity. It tells me what kinds of jobs are out there, in what numbers, and what kinds of schools have the resources to hire people. It suggests details about some of my colleagues' careers: who is being replaced at a school they've left, who has a good shot to get a permanent job at the place where they're a visiting assistant professor. It tells me which sub-specializations are currently fashionable with hiring committees, and which are out of favor. I read the list to see what's out there for younger friends who are looking for their first jobs. And I read the list because although it's been years since I've been in those friends' shoes, looking at the list with a churning mix of anxiety, hope, and basic need for a salary, it still feels like it was only weeks ago. Three months, tops.

For years, since long before I started reading it, the list has been much, much shorter than the list of people needing or deserving jobs. My first day teaching as an assistant professor, one of my students informed me in class that a hundred and fifty people had applied for my job. I don't know what reaction he wanted, but the truth is, this is what normal has become in my profession: everyone who has a job had to beat out at least a hundred and fifty people to get it. And there certainly weren't a hundred and fifty jobs for professors of Renaissance literature that year. Far from it. The mismatch between the number of jobs and the number of (highly qualified) job-seekers comes from twenty or thirty years of colleges shrinking their faculties, replacing full-time salaried jobs with underpaid "part-time" teachers. Many of the hundred and fifty people passed over for my job became "part-timers" somewhere, usually at three or four different schools every semester. One result is that even PhDs who do well at Ivy League universities can easily end up with only piece work, or with nothing at all. The other result is that I'm the only Renaissance lit scholar at a university which once had four or five of them. The second fact helps explain the first.

Since the economy cratered, my profession hasn't even been able to sustain that threadbare and desperate version of normal. When the financial crisis hit in 2008, well after the hiring process had begun, universities behaved like any other employer. They panicked about taking on more salaries, and so jobs that were already in mid-search were canceled. Even coveted jobs at rich and famous universities had the plug pulled. What had been about five dozen jobs teaching Shakespeare or Milton became four dozen, or fewer, although there were still the same hundred and fifty or two hundred or two hundred and seventy-five people trying to get them.

We've been waiting for the rebound, like any other sector of the economy. It definitely didn't come last year, when there weren't even four dozen jobs advertised in the fall. There were still two hundred smart young Shakespeareans, Miltonists, and Tamburlaine experts out there looking for work. In fact, there were more, because the forty or so who'd gotten jobs the previous year had been replaced by two or three times that many new PhDs. But last year's national conference, where the face-to-face job interviews begin, looked like Hamelin after the Pied Piper left town: almost no young people in sight. In this economy, it's become almost impossible for new literature professors to start a career. An entire cohort of young people who might have been valuable and productive teachers and thinkers are being forced out of the profession.

At the same time, the other part of the academic job market, the hiring of senior faculty away from other institutions, stopped nearly dead. This is a much smaller market, with higher prices, for the relatively small fraction of scholars who can make salary demands or change jobs because universities compete for their services. These are the people who do high-profile research, bring in major grants or prestigious awards, and who can attract and train doctoral students. Schools that can afford it will pay a premium to make sure they have enough of those faculty. But since 2008, the recruitment of senior scholars has been on hold. No one's had the money.

This year I opened the list and thought, "At last. The rebound is here." There are several very good jobs on this fall's list, including a few outright trophy jobs, the kind a Hollywood movie would use as shorthand for success and a happy ending. And even more noticeably, there are some extremely prestigious senior jobs at major universities, which means some schools not only have the money to hire Shakespeareans again but have the budgets to bid competitively for top Shakespeareans. And since most of such senior-level hiring has always gone on without any public advertising, two or three ads for major jobs like this suggest that there's more courtship of leading figures happening behind the scenes. I thought things looked good. Then I looked again.

Most of the jobs on the list are terrific, but that's the problem. It's the ordinary jobs that are missing. The Ivy League is back to hiring the way it did before the crash, but the rest of American higher education is still on a recession budget. In fact, the less elite colleges and universities seem to be hiring even more slowly than they have over the last two years. All told, the list has barely more than two dozen entry-level jobs for Renaissance lit professors, and another ten or so generalist or open-field jobs that a Renaissance specialist could apply for (against an even larger field of competitors). More jobs will trickle in over the next few weeks, but I'm afraid it will only be a trickle: last week's update had only one new job. Maybe this is the beginning of a slow recovery, but this year the news is that the recession is only over if you're rich. The five or ten best jobs are still there, the way they were before the crash. About two thirds of the other jobs are gone. The rich are still rich, but the middle class have become poor and the poor have nothing left.

So the profession of teaching Renaissance lit, anno domini 2010, looks like the American economy at large: the investor class at the top of the pyramid has preserved its wealth, but the recession has only grown deeper for the middle and working classes. It's not an accidental parallel: the elite schools have money to spend again because they are altruistic, non-profit members of the elite investor class, funded by the returns on their large endowments. What's good for Wall Street is good for the Ivy League. I don't begrudge those rich and famous universities their resources. They have important roles to play in American education, and I'm certainly pleased that they are still hiring professors. But those few schools are not and cannot be the whole of American education; higher education needs the rest of the colleges and universities to thrive, too. A recovery that only reaches the tip of the pyramid is not a recovery.

What our "new normal" looks like in higher education is what the "new normal" looks like for the American economy at large: an accelerated version of the regressive class stratification that's been taking place in this country for thirty years. The average schools have been falling behind the top 1 or 2% of universities for at least two decades, just as middle-class incomes have been falling behind the top 1 or 2% of our economic pyramid. The Harvard and Yale English departments certainly haven't shrunk to half the size they were in 1981; plenty of others have. Two years after the crash the "new normal" means normal for that lucky one percent, and new for all the rest of us. But of course, the "new normal" for American education is a movement toward a much older set of arrangements. It is a sudden leap in the otherwise slow backwards march to undo the postwar expansion of American education.

Every year for the last twenty or thirty, American colleges and universities have inched closer to the way things were during the Depression: a handful of powerful old schools, and then a steep dropoff. Anyone who is nostalgic for the colleges of the 1920s and 1930s doesn't know much about them. American education was far more elitist in every real sense: it educated many fewer students, and educated them more poorly. Elitism did not advance anybody's education: the Ivies were more socially snobbish, but much less academically rigorous than they have since become, and the lesser schools had far fewer educational resources. There's no reason to go back to that model, ever, but we get closer to it every year.

At the same time, American society as a whole moves closer to the old, failed economic arrangements of the 1920s, with massive concentrations of wealth among a small group, low wages for most workers, and a relatively small and insecure middle class. There is no reason to go back to that either, but we get closer every day. The idea that making a small plutocracy even wealthier will benefit the rest of us is, as events have yet again made obvious, not true. That such an idea gets taken seriously is another triumph of nostalgia over history.

I'm a Renaissance scholar, after all. I spend every working day studying a society in which a tiny handful of people controlled a massive percentage of the resources while most people struggled for necessities. That is not a system for which any rational person should feel nostalgia. The art is great; the recurrent famines, not so much. I've seen what a country with an aristocracy looks like, and I don't want any part of it.

Monday, September 20, 2010

Bewitching Jesus

cross-posted at Dagblog

So, Saturday night the news was that Christine O'Donnell "dabbled into witchcraft" before becoming a hard-line evangelical Christian. And you know what? I wasn't surprised at all. Surely I wasn't surprised that a candidate like O'Donnell was attracted to the supernatural, since all of her politics are about magical thinking. I shrugged it off, and Sunday morning I went to church.

The readings at my Church were, as they so often are, about the obligations of the rich to the poor. My denomination, for all its flaws, makes sure to read the entire New Testament, and a big chunk of the rest of the Bible, on a steady three-years-of-Sundays rotation. Because the person giving the sermon doesn't get to cherry-pick the Bible for texts to preach about, issues come up on Sunday about as often as Jesus brings them up in the Gospels. Most of the hot-button culture war issues, the ones now perceived as signature "Christian" issues, almost never get mentioned. Jesus seldom talks about any of them. On the other hand, justice for the poor comes up a lot. It is on Jesus's mind all the time. He will never go more than a few weeks without coming back to the topic.

So as so often happens to me on Sunday, I was reading along with the day's Biblical passages and getting a set of pretty clear instructions that seem very different from the instructions that many of my vocal fellow-Christians in this country claim to have received. I certainly am not going to speak about their Christianity. It isn't for me to judge anyone else's faith. Nor would many of them perceive me as a "real" Christian. Frankly, America has freedom of religion precisely because Christians can't agree on what the real Christianity is; the religious disagreements are older than the country. On the other hand, at least some of my duties as a Christian seem too clear to escape. If a mob's forming to drive the outsiders out of town, I had better not be in that mob. If the poor need food, I had better not be lobbying for them to be fed even less. I'm not the Christian I'd like to be, much less the one I ought to be, and I'm not the one to explain what God wants. But I know what I feel is expected of me, and it takes me a long way from what some of the Christian political movements in this country advocate.

But as I was leaving church and thinking about the many different Christianities in this country, I thought about Christine O'Donnell again, and her travels from occultism to her specific version of Christianity and how unsurprised I was. It seemed to me of a piece. Yes, evangelical Christians view witchcraft as their absolute opposite, the other end of the spectrum. But the two camps share a lot that I don't share with either of them.

First of all, both occultists and Christians like O'Donnell believe that magic is real and powerful. Witchcraft is a scary thing to evangelicals because they believe it to be one of the major problems facing our society today. They believe that the Devil can actually use it to make inroads into human souls, and when you come right down to it, they believe that magic can do things. In fact, Michelle Malkin is defending O'Donnell for exactly this reason: because O'Donnell "learned" the true dangers of witchcraft, which helps her to understand dangerous practices such as Halloween. (No, I'm not making that up.) The two opposed camps share a mindset in which Halloween is full of actual occult power. I, to put it simply, do not.

And more to the point, the kind of Christianity O'Donnell espouses in public is essentially witchcraft by other means, a kind of magical practice that empowers and protects believers. I can't judge her actual practices and I know nothing about her private faith, but the Christianity she describes views the world in essentially magical terms. O'Donnell is on record as saying that she believes God "would provide a way" to avoid lying if Nazis were searching one's house for hidden Jews. And that's pretty much the magical view of the world: behavior is judged by how well it conforms to ritualized prescriptions and taboos, such as "never lie," rather than by the moral results of one's actions. One does not in fact make moral choices at all. The moral consequences of one's action are off-loaded onto divine Providence, which is responsible for making things work out well as long as you follow the Simple Rules.

(In fact, God did not provide any such providential assistance for the good people who protected Jews. They all had to lie. This is why Christians long ago developed the "necessary evil" or "lesser evil" principle, which allows you to fib rather than connive at genocide.)

And while, again, I am not fit to judge anyone else's Christianity, I am very fearful of the ways one can fall into what is essentially idolatry while persuading oneself that one is still a Christian. It's very easy. You just dress up your idol as Jesus. If you trade the ethical philosophy, which is complicated, for a simpler set of practices and taboos, and begin to address Jesus (or YHWH or Allah or the Tao or what have you) the way you would Mammon or Dagon, making propitiations in exchange for favors, you might as well just carve a new god for yourself out of a pumpkin. It's the same old business proposition: "I will do what you want, if you bring good things to me." Making that proposition to Jesus doesn't change its nature. I don't believe in Christianity because I believe Jesus can make good on that deal. I believe in Christianity because I believe Jesus does not make that deal.

There are no real wars between religions. There are only tribal wars that use some tribal idolatry to rally the troops, and sometimes the idolaters hijack the name of some better religion's god. The real wars are inside religions: struggles between the obligations of your religion as a set of ethical teachings, which forces you to face difficult realities, and the temptation of turning your religion into a set of magical practices that holds those unpleasant realities at bay. That struggle goes on every day, inside every major religion in the world, and always has. There is no struggle between Christianity and Islam. There is a struggle inside Christianity and a struggle inside Islam, and inside Judaism and Buddhism and Hinduism.

This struggle does not split along traditional denominational lines inside religions, either. As I was walking out of Church on Sunday, I passed a young woman touching the base of a religious statue and energetically whispering an involved prayer, perhaps having a conversation and asking for a specific favor. I don't know what she was doing, but it was something I'd be uncomfortable with myself, and I'd sat in the same pews listening to the same readings and the same sermon. I'd be pretty surprised if the pastor would encourage parishioners to use a statue to focus an intercessory prayer, but every congregation is a little multitude. There are people worshiping God through sincere ethical commitment in every Christian congregation, even the oddest-seeming ones, and people sitting in the most rational and modern congregation busily propitiating their little folk-magic idol. To tell the truth, most of the idol-worshipers don't even know it.

Thursday, September 09, 2010

Ask Tamburlaine: Burning Korans Is a Bad Idea

cross-posted at Dagblog

Who on Earth is crazy enough to burn the Koran? Until two weeks ago, my answer had always been "raging lunatics in Elizabethan drama." You know, stage characters from the age of Shakespeare, the kind of people who are prone to cutting off their own hands, biting off their tongues and spitting them on the stage, or baking their enemies in pies and serving them for dinner. The people who make Hamlet seem well-adjusted. Certainly, I didn't think of it as the kind of thing real people did.

The Koran burner my students know is Tamburlaine the Great, the world conqueror in Christopher Marlowe's two-part extravaganza Tamburlaine the Great. This was one of the great hits of its time, likely bigger than most of Shakespeare's plays, and Shakespeare's own characters quote it from the stage. But since no one teaches it in high schools, and almost no one teaches it in college, here's the basic story:

A raving megalomaniac conquers most of the world and makes speeches about it. Nobody can stop him. The crazier he gets, the more he wins. Then he burns the Koran (just because) and BAM! He's dead.

And yes, this play was written in a Christian country, for Christian audiences who tended to think of Muslims (like the Pope) as agents of the Devil. But even they thought Koran-burning was a no-no.

Now, I have actually exposed unsuspecting college students to Tamburlaine. And they all say the same thing: the dude's crazy. For ten acts, he's running around putting women and children to the sword. He's putting heads of state in cages and using them as footstools. He's making enemy kings draw his chariot around the stage, like they're horses. He cuts himself with a sword, to show one of his good-for-nothing sons who the real tough guy is. He generally behaves like Kim Jong-Il off his meds. And no matter what crazy thing he's done, he comes back later to top it with something crazier. Then, in Part Two, Act Five, he sets a Koran on fire. And that's just too much for everybody.

And what does he have to say after he does that?

But, stay. I feel myself distempered suddenly.


That's right. It's the Don't Burn People's Holy Books Flu, and it kills him within a scene.

So take it from heavily fictionalized crazy people, kids: don't burn Korans, or any other religious works that hundreds of millions of people value. It only leads to trouble. Try to behave like the saner and more rational characters from English Renaissance drama, like Titus Andronicus, Richard III, or Lady Macbeth.

I wouldn't bring this up if Tamburlaine were only an imaginary character. But of course, fictional characters have their way of influencing real people, and some of Tamburlaine's admirers were so excited by him that they set out to be little Tamburlaines themselves. But they were going to make it happen right there in London! And if Tamburlaine the Great had been a big hero by killing so many many filthy foreigners like Arabs and Egyptians and Turks, the little Tamburlaines would kill some filthy outsiders themselves! Which filthy outsiders?

The Protestant refugees from Europe who'd taken refuge in London. Like bakers and shoemakers.

Some of them left a note on a church door in 1593, promising to murder all of the refugees and their children (who were foreigners, after all):

Since words nor threats nor any other thing
Can make you to avoid this certain ill,
We'll cut your throats, in your temples praying
Not Paris massacre so much blood did spill.


It goes on and on like that for dozens of lines.

It's signed "Tamburlaine."

And there's the lesson. When you train people to focus their rage and fear on some foreign scapegoat, to imagine Muslims or Turks or some other group of "strangers" as frightening and inhuman, there can come a moment when the people you've gotten worked up unpredictably switch their fear and hate and thirst for blood to another group of "outsiders" that you didn't expect, some group that's closer at hand and easy to get at.

And once the mob forms, it's too late to say, "No no, we didn't mean them." Once you start whipping up a mob to go after those stinking foreigners, you don't get to tell them exactly who counts as a stinking foreigner and who doesn't. They know who's not one of them. Mobs don't listen to lectures about details. It's the principle of the thing they care about.

Monday, September 06, 2010

What Happened to America's Economy

cross-posted at Dagblog

We're always told that economics is a complicated science, which is true, but also that bottom-line practical economics is very simple. But the simple rules we've all learned seem to have landed us in an incomprehensible mess. Let me try to recap what's happened.

Obviously, every business needs to make a profit to survive. This is done by keeping costs lower than revenues. If you're selling something, you need to make more money selling it than you spend in making or buying the product and in paying your workers.

What every business would like to do is to spend as little as possible without cutting revenues. This makes sense. And for the last thirty years, the most popular way of keeping costs down has been to keep labor costs down. The fewer people you pay, and the less you pay for them (counting wages, benefits, and payroll taxes), the more profit you can make. This also makes sense. Every business owner wants to keep pay down and profits up.

Now, here's the problem. If you pay your employees less than all of your competitors do, but charge equivalent (or slightly lower) prices, you will make more money than all of your competitors. If you could spend 10% or 20% less on your workers than everyone else in the market spent on equivalent workers, you'd have a winning strategy. However, at this point everyone who's gone to business school for even a semester, and many people who haven't, have all been taught that the key to succeeding in business is keeping costs down. So nearly everyone is trying to keep pay as low as possible.

This becomes a big problem because ultimately the employees (not just your employees but all of the economy's employees combined) are also the bulk of the consumers in the economy. If you pay Bob Cratchit only two-thirds of the going wage, but everyone else pays their Cratchits the full wage, you will be on the cover of Forbes. But if everyone pays their Cratchits two-thirds pay, suddenly there will be a lot less money out there for people to spend on buying your patented Scrooge-o-matic this Christmas. And then your business is in trouble. You could just fire a few Cratchits to keep your profit margin up, but if everyone's firing Cratchits then there will be even fewer people buying Scrooge-o-matics. It's a classic game-theory trap.
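To make the shape of that trap concrete, here is a minimal sketch in Python. Every payoff number is invented for illustration; the only thing the sketch claims is the structure of the game, in which cutting pay is each firm's best move no matter what its rival does, yet both firms end up worse off when both cut.

```python
# A toy two-firm version of the wage-cutting trap, shaped like a
# prisoner's dilemma. All payoff numbers are invented, not real data.
#
# Each firm either PAYs full wages or CUTs them. Cutting saves on labor
# costs, but every firm that cuts also shrinks the pool of consumer
# spending that both firms' revenues depend on.

PAY, CUT = "pay", "cut"

def profit(my_choice: str, rival_choice: str) -> int:
    """Invented profits with the prisoner's-dilemma shape."""
    payoffs = {
        (PAY, PAY): 10,  # healthy wages, healthy customers
        (CUT, PAY): 14,  # I save on wages; the rival's workers still spend
        (PAY, CUT): 6,   # I pay full wages into a shrunken market
        (CUT, CUT): 7,   # everyone saves on wages; nobody can buy much
    }
    return payoffs[(my_choice, rival_choice)]

for mine in (PAY, CUT):
    for rival in (PAY, CUT):
        print(f"I {mine}, rival {rival}s: my profit = {profit(mine, rival)}")

# Whatever the rival does, cutting beats paying (14 > 10, 7 > 6), so
# both firms cut -- and both end up with 7 instead of 10.
```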

The problem is that reducing overall employee pay (either by cutting their wages or cutting taxes and benefits that employees will have to make up out of their wages) reduces overall consumer buying power. There are some classic ways to blunt, if not avoid, that problem.

The first is by increasing your national economy's exports, so that you get revenue from customers outside your own workforce. The United States has actually gone the opposite route, importing more and exporting less, so that makes the problem worse.

You can innovate in ways that allow you to produce higher-priced goods with lower overhead even when you hold employee pay more or less stable. We have done some of this, but it's a marginal benefit, and it's likely been overstated. Technological innovation has helped stave off the moment of crisis when the customers run out of money, but not much faster than the trade deficit has been hurrying that crisis toward us.

And finally, you can count on consumption by the very rich to make up for the loss of the employees' buying power. This is the "trickle-down" theory, which proposes that making the top spenders in the economy even richer, even at the expense of the lower, middle, and upper-middle spenders, will eventually help everyone. And it's true that the consumption of the very wealthy is consumption and helps as far as it goes. But the question is whether Mr. Scrooge's increased consumption when he has ten times the money in his pockets offsets the loss of consumption from a large number of Cratchits that he's fired. The result of our current real world experiment seems to be "Not even close."
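The arithmetic behind that "not even close" can be laid out explicitly. Here is a minimal sketch with invented numbers; the one load-bearing assumption (a standard one in arguments about marginal propensity to consume, though the exact figures are mine) is that a worker spends nearly every extra dollar while a very rich man spends only a fraction of his.

```python
# Toy arithmetic for the Scrooge-vs-Cratchits question. All figures are
# invented for illustration.
#
# Suppose Scrooge fires ten Cratchits at $30,000 apiece and keeps the
# $300,000 as extra income for himself. What happens to total consumer
# spending?

cratchit_salary = 30_000
num_fired = 10
transferred = cratchit_salary * num_fired  # $300,000 moves up the pyramid

# Assumed marginal propensities to consume: the share of each extra
# dollar that actually gets spent. Workers spend nearly everything;
# the very rich spend a much smaller share and save or invest the rest.
mpc_cratchit = 0.95
mpc_scrooge = 0.30

spending_lost = transferred * mpc_cratchit   # what the Cratchits stop buying
spending_gained = transferred * mpc_scrooge  # what Scrooge starts buying

print(f"Spending lost:   ${spending_lost:,.0f}")
print(f"Spending gained: ${spending_gained:,.0f}")
print(f"Net demand:      ${spending_gained - spending_lost:+,.0f}")
# Net demand: -$195,000. On these assumptions, Scrooge's extra shopping
# replaces barely a third of the buying power the fired Cratchits lose.
```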

So eventually the United States economy will reach a point when the ongoing reduction in worker pay has damaged the consumers' overall buying power so badly that people can't buy anything and businesses can't make any profit. When will we hit that point? About twenty years ago. Maybe twenty-five years ago, maybe fifteen, it's hard to say. But certainly that crisis point, when it was time to turn the ship around and plot a new course, is already many years in our past.

When American businesses neared the point where the customers run out of money, they took very careful steps to postpone that crisis without fixing the underlying problems. By that point, the principle that pay should be kept low was already fixed in business's mind as an immutable and nearly moral law. (In fact, the compensation-cutting has only accelerated over the last decade or two.) So the question was how to shore up consumers' overall buying power even while cutting their overall pay.

There were two basic short-term fixes that disguised the problems and solved them in the short term while making them worse in the long run. The first was to cut prices on lower- and middle-end consumer goods by producing them overseas and retailing them in discount chains such as Wal-Mart. People who worked for a living were making less money, but many things were cheaper, so it felt like they were more or less keeping pace. This solution seemed to work for a while, but it cuts prices by cutting pay even faster, moving manufacturing jobs offshore and cutting wages for retail sales people (like Wal-Mart's employees) as far as they could be cut. So the first part of the strategy eventually undermines itself, and ultimately makes the economy's basic problem worse by cutting the consumers' buying power even deeper.

The other short-term fix was to loosen consumer credit by an unprecedented amount and make up for the lost real buying power by allowing consumers to rack up debt. This also papered over the problem for a while. Even if consumers had less money, they felt like they had money, and more importantly from business's point of view they spent that money. Meanwhile, business (in the aggregate) made even more profit off the consumers, by charging interest on that debt. Profits grew even higher, but the consumers' real long-term buying power was cut even further by the unprecedented size of that consumer debt and the unprecedented interest rates charged on it. All of those credit cards kept the economy's wheels turning much longer than the actual fundamentals of the economy would allow, but left it in a much, much deeper hole. It's fashionable in some quarters to blame the economic downturn on the spendthrifts with the credit cards, but this misses the larger point: American business as a whole extended ridiculous amounts of credit for nearly twenty years, far in excess of borrowers' overall ability to repay, in an attempt to maintain unrealistically high corporate profits. American businesses essentially put the whole economy on Mastercard.

Now, of course, things are much, much worse than they would have been twenty or so years ago. That was when we reached the point where depressing wages begins to depress consumers' buying power and hurt profits. Now, enabled by two decades of smoke and mirrors, we've gotten a long way below that point. Why would the American business community conspire against its own long-term economic future in this way?

The answer is supply-side economics, which basically teaches that the most important thing for economic growth is the availability of capital for investment. (It is absolutely true that an economy without enough investment capital will do poorly. There needs to be enough capital to start new businesses and expand existing ones.) Therefore, the key to building the economy is imagined as business profits which can be re-invested. Supply-side, at least in the crude popular form which actually gets put into practice, doesn't worry so much about the problem of demand. It proposes that if there is enough money to invest in a good business, the market will make that business a success, and people will buy the business's product or service. Where will the customers come from? Where will they get the money? The motto of supply-side is "If you build it, they will come." The problem is that they will come without money to buy anything.

The result of supply-side thinking is a focus on making the economy good for business profits, rather than for workers. (And of course, there is such a thing as an economy where pay's too high and profit's too low. But ideological supply-siders deny that profits could be too high or wages too low.) The focus has been on keeping costs down, and we've developed a whole set of beliefs and assumptions that go along with that idea. Everybody knows that taxes are bad for the economy, and everybody knows that unions are bad for the economy, because those things cut into employer profits and increase employees' earnings and buying power. But everybody's wrong; the economy was going like gangbusters in the heavily-unionized, relatively high-taxed boom years of the Fifties and Sixties. And while there is such a thing as taxes being counter-productively high and union demands being unreasonable, unionized workers who can count on government benefits also make pretty good consumers, and the economy misses them when they're gone.

An implicit assumption of supply-side economics, at least as practiced by American policy-makers, is that there can never be too much capital available for investment. It imagines a smooth line on a graph, where the more capital there is the more the economy benefits, with no point of diminishing returns and certainly no point where the free capital becomes counterproductive. But this is demonstrably untrue. At a certain point, you get more capital than there are good investment opportunities for it, and then problems start. Ultimately, large piles of investment cash without enough real things to invest in lead to speculative bubbles. When average workers and consumers have too much cash for the amount of things that there are for them to buy, you get price inflation. When investors have too much cash for the amount of profitable enterprises that they could buy into, you get bubbles. Lots and lots of that accumulated capital gets put into investments that are unlikely to pay off, but the act of buying those ill-considered investments pushes their prices up, so they look like a huge profit opportunity, until eventually people are paying a million dollars for a tulip bulb and there's a crash. That's where the tech bubble and the resulting crash came from; that's where the housing bubble and the resulting crash came from. When the amount of profit going to the investor class becomes excessive, those big piles of cash basically start to set themselves on fire, and the fire spreads to the rest of the economy.
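That feedback loop, in which buying pushes prices up and rising prices attract still more buying until the money runs out, can be sketched in a few lines. This is a toy model with invented parameters, not a claim about any real market:

```python
# Toy positive-feedback bubble: a pile of idle capital chases an asset,
# the inflow raises the price, and the rising price attracts still more
# inflow -- until the capital is exhausted. All parameters are invented.

fundamental_value = 100.0  # what the asset's real earnings justify
price = 100.0
idle_capital = 1_000.0     # cash with nowhere productive to go

for year in range(1, 11):
    # How far the price has outrun its fundamentals; chasing this
    # "momentum" is what makes the bubble self-reinforcing.
    momentum = max(price / fundamental_value - 1.0, 0.0)
    inflow = min(idle_capital, 50.0 + 400.0 * momentum)
    idle_capital -= inflow
    price *= 1.0 + inflow / 500.0  # buying pressure moves the price
    print(f"year {year}: price = {price:7.2f}, idle capital = {idle_capital:7.2f}")
    if idle_capital <= 0:
        print("capital exhausted -- nothing left to hold the price up")
        break
```

Run it and the price climbs slowly, then explosively, then the inflows stop; the last buyers are left holding an asset priced far above anything its fundamentals can support, which is the crash in miniature.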

The problem we're in now is keeping a sufficient amount of investment capital available to build the economy back up, while restoring the buying power of the people who do most of the working and paying and living and dying around here. It's not an easy problem, but that's what the problem is.

Happy Labor Day, all!