Saturday, November 28, 2009


“Democracy” means something not very complicated: the rule of the people. The implication is that the people form a society coherent enough that its members can respect one another despite their differences and reach majority-based decisions for the greater good of the greatest number. American democracy hedges pure majoritarian rule with constitutional rights to protect individuals against the state.

But the history of this country has worked against this social coherence. Western European invaders who settled the lands later organized as the United States annihilated the continent’s indigenous peoples by disease, decimated them in warfare, exiled them, or segregated them on reservations; those peoples were the country’s last residents to be recognized as citizens. These invaders soon brought with them slaves from Africa, who, though freed centuries after their first arrival, remained segregated from whites by law, housing patterns, and employment and educational opportunities. Later immigrants from Italy, Ireland, Germany, and the countries of Eastern Europe predominated in certain states or in enclaves in the larger metropolitan cities. An essentially white, Anglo-Saxon, Protestant over-class, concealing these social divisions with the Melting Pot myth, sustained these racial and ethnic structures, and all groups sustained their prejudices.

The good news is that WASPs are losing or have lost their grip. Many differences and divisions have become less pronounced in the past half century. Few organizations with a public face restrict membership on the basis of race, religion, gender, or national origin. Despite protests against political correctness, people of prejudice rarely express publicly their true beliefs and feelings about blacks, Jews, Catholics, or the many ethnic or nationality groups. Derogatory terms—the list is long and ugly—are big-time no-no’s. Prejudice no doubt lurks in some people’s thoughts and feelings about people different from them, but much of it is restrained and losing strength. Old divisions are dying.

The bad news is that the new divide is a political one, between Democrats, frequently “liberal,” and Republicans, increasingly “conservative.” The divide is so sharp that snide or insulting characterizations and associations, not the issues, dominate debate as never before. The most prominent is Republicans’ self-identification and self-celebration as “real Americans,” and their regard for Democrats as subversive or traitorous anti- or un-Americans: atheists, elitists, bi-coastals, communists, socialists, fascists, coddlers of terrorists, flag-burners, baby-killers, sympathizers of child-molesters and other sexual perverts, and the like. The worst that I have heard Democrats say of Republicans is that they are dumb and dumber. No one accuses Independents (moderates?) of anything stronger than fence-straddling.

All of which causes me to reflect on how the Republicans will rule if they return to power in both the White House and Congress. By making every effort to delegitimize Obama and to disrespect him, and by refusing to engage as a loyal opposition, acting instead simply as the party of “no,” Republicans are setting precedents which will come back to haunt them and hurt us all.

A contrast between party behavior in the first months of the Reagan and Obama administrations is instructive and worrisome. During the 1980 Republican primaries and the campaign, Reagan pledged to introduce legislation based on supply-side, Laffer-curve economics, which claimed that tax cuts so spurred the economy as to increase government revenues. His rival in the primaries, George H. W. Bush, referred to this theory as “voodoo economics.” When Reagan assumed office, the Democrats had long controlled the House of Representatives (Republicans had just captured the Senate). Nevertheless, although they regarded Reagan’s tax proposal as absurd, they accepted that the elected president deserved a chance to enact one of his primary campaign planks. The rest is history: Democrats passed the legislation which, as they predicted, dramatically increased the federal government’s debt.

Fast forward. During the 2008 presidential campaign, George W. Bush confronted a recessionary economy diving toward depression. When Obama became president, he continued Bush’s $700-billion TARP bailout of the financial industry. He provided additional support to the financial industry, new support to the collapsing automobile industry, and a stimulus package to encourage employment through public works and other programs. Although Democrats had made gains in the 2006 and 2008 elections, recovering control of Congress in 2006, and although Obama won office by a significant majority—proof of widespread public support—Democratic efforts to improve a troubled economy inherited from Bush were vigorously opposed by Republicans. Since then, Republicans have refused to work with Obama or Congressional Democrats, have resisted legislative proposals by threatening filibusters, and have thereby set a new standard for passing legislation: not a majority vote of 51, but a supermajority vote of 60. Fine.

Actually, not fine. A Republican president inaugurated in 2013, even with a majority, but not a supermajority, of Republicans in both congressional chambers, cannot expect Democrats to agree to legislation swinging sharply in the other direction on major issues facing the country. If Democrats obstruct a Republican president and small Republican congressional majorities, what argument will Republicans make to the American people? What will they be able to say that Democrats cannot say now—that they restored white power?

The political divisions which Republicans have made impossible to bridge by their example will bring democracy as we know it to an end. The details of its demise have yet to be worked out. If Bush’s abuses of Executive Branch agencies are any indication, a future GOP coup to secure political power is not inconceivable.

Friday, November 20, 2009


For 40 years, an abstract impressionist painting by Vance Kirkland, a local Denver artist of some repute in the mid-twentieth century, has hung on my living room wall. I have rotated it 90 degrees clockwise from the orientation which he intended; I think of my re-orientation as an act of participatory art. As I have looked at it over the years, I have tried to see some pattern emerging from its whirls and blotches. I have failed, but my failure pleases me. It teaches me to look without seeing when there is nothing to see.

Likewise, when I look at Sarah Palin live on television or still in photographs, I do not “see” anything, at least nothing certain. She is obviously attractive and vivacious, smart (in a small-minded, mean-spirited way), uninformed (in a large-scale way), and smooth in most situations and getting smoother. I have heard her speak, I have read and heard about her, and I know most of the criticisms about her. She gives evidence of narcissism and paranoia, both reflected in vindictive score-settling for the slightest slights. She has been platitudinous on domestic social and economic policies, ignorant of foreign affairs, and, most unusual of all, unwilling to learn more about the issues and to become more nuanced in her thinking. But I can make nothing of the woman. Who is she? What are her motives? Her principles? Her values? I still cannot answer my questions.

However, I can almost say, good for her. I believe that she is everything which her critics in both parties say that she is. But I also believe that their criticism does not matter: the more they say it, the more they strengthen her. The more they pan her electronic twittering, her media appearances, and her platform speeches, the more her fans admire whatever it is that she represents for them. I have a grudging admiration that she has buffaloed—or should I say “moosed”?—her critics on both left and right.

They had better shape up or she will ship them out. Palin is different from the usual spokesperson from the populist fringe which emerges when times are tough. She does not advocate economic reform, financial assistance to the unemployed or needy, or job creation by public works programs; she does not deplore the abuses which have caused misery to many; she does not even sympathize with those who have lost homes, jobs, health insurance, or education. Yet she seems to have greatest appeal to those who are suffering most. So what is it?

My guess is that her appeal is the populism of a new kind of identity politics. We have long had the identity politics of race, gender, and cultural issues. Palin builds on that politics, speaks its code words to play to it, but goes beyond it to exploit the identity politics of the educational and intellectual underclass. For them, knowledge and nuance are anathema, and Palin has the smarts of her instincts to know as much; she knows that they cannot get it. She knows that “you betcha,” a wink, and a sense of grievance work—and work better than policy wonkishness and program promises—because her fans can understand them without knowledge or nuance. So she is going to upstage conventional politicians who think that criticism of her lack of knowledge or nuance matters.

I have two suggestions for those who want to minimize her appeal. One, let Palin be Palin. Her allure may fade after her allotted minutes of celebrity fame because she cannot change; what is static becomes stale, so her shtick will eventually fail to suit the craving for stimulation by the National Enquirer crowd. Two, meanwhile, play rope-a-dope (to a dope). When the opportunity presents itself, just agree with her, and let her have to fill in the silence which you leave her. If you ask her questions which require answers translating platitudes into specifics, you are inviting an aggressive but aggrieved retreat, which will revive the sympathy of her followers, who also feel put upon when challenged. Still, I have less confidence in the first than the second suggestion, but not much in either.

For the present, Palin will be the perfect storm of American anti-intellectualism and an anti-intellectual who understands that this educational and intellectual underclass can be culled and led to the polls. Right now, she gets only 30 percent of the vote—perhaps, if she does not fade from view, enough to set her up for a run in her mid-50s or her 60s, or to sustain her influence to pick a disciple. In the future, say 20 years from now, she or a candidate like her will have an excellent chance of winning. By then, we shall be dumb enough to feel a bond with someone equally dumb except for knowing how dumb we are.

Friday, November 13, 2009


I am amused and annoyed by the last quarter-century’s educational fondness for multiculturalism, a modern doctrine derived from a nineteenth-century, northwestern European idea for the bias-free study of other cultures—an idea alien to them. The irony is lost on the doctrine’s true believers, who credit its implementation in education with socially therapeutic virtue. In theory, the idea sounds good: studying and celebrating other cultures promote pride in one’s heritage and appreciation of others’. But because studying and celebrating are two different things, in practice, multiculturalism becomes superficial in serving political ends.

Sometimes multiculturalism may be harmless in its superficiality. Some educators think it is well served by textbooks with pictures of people of different colors and dress. Others include in their instruction selected artifacts of another culture without regard to their context. Years ago, a fellow teacher embarked upon a study of American Indian poems and stories. Learning that I had a record of some Navajo music, he wanted to play it for his class. I cautioned that, without a cultural context to make it meaningful, it would not prove educational; he thought otherwise. When his students became restless after about 15 seconds, he stopped playing it. I hope that his edified and enlightened students do not want Indians confined to reservations, but I worry that they think them cultural primitives having for music only grunts and yips accompanied by scratchy rattles. So much for the Yeibichei ceremony, with its dances and chants to alleviate a sufferer’s illness or pain.

Sometimes multiculturalism is not harmless in its superficiality. Not a few educators truncate or distort European and American culture because they regard it as too old, too white, too male. Thus, political correctness costs us a general appreciation—I do not mean total approbation—of the culture of our country.

When the educational purpose of multiculturalism study is to celebrate other cultures, it requires teachers to assume a positive judgment of them and to include only facts which reinforce that judgment and to exclude those which do not. The result is a distorted, sanitized view of other cultures which includes what we find aesthetically attractive like their decorative arts, dress, dance, music, and the like; and excludes what we find morally repugnant like clitorectomies, incest, and infanticide. In short, we appreciate or deprecate in other cultures what accords or not with the biases of our culture—not multiculturalism at all.

I have a fondness for Navajo art but am mindful that Navajos once trained infants not to cry by briefly, and as necessary repeatedly, smothering them when they did until they learned not to. Even when teachers explain the purpose of this practice—survival in concealment from hostile forces—many students cannot overcome their repugnance. Likewise, I admire Comanches as the best horsemen, hunters, and leatherworkers on the Great Plains after they acquired horses from the Spaniards, but remember that they were the most feared of all western tribes because of their expertise and enjoyment in torturing their captives. Teachers who praise Comanches for their adaptability to new circumstances omit their allegiance to old customs.

The antidote to cultural distance is something like immersion. My example is my teaching a long narrative poem, written about a minor event in the higher echelons of Catholic society in early eighteenth-century London, to a class of 17-year-old boys 250 years later. By snipping a lock of hair from a noblewoman whom he adored, a love-besotted baron prompted a quarrel between their families. In an effort to make peace, the poet wrote a mock epic poem to trivialize and laugh at the episode and thus humor them out of their peevishness. The poem failed to restore social peace but succeeded in establishing itself as a minor masterpiece.

Without benefit of intensive study, I read, or perhaps dozed through, it as a high school senior and as a college sophomore. But I decided to teach it to find out why it had its high reputation. I said so to my students, and I think that my forthrightness mattered to them. Then we set to work. I skip the details except to say that we learned the relevant conventions of the mock epic and the features of the heroic couplet, and found examples of them. All the while, we were slowly reading our way through the poem and learning some social and cultural history as we went. By the time we got to the line, “When lovers, or when lapdogs, breathe their last,” they laughed in appreciation of its multi-faceted humor. Among other questions, the year-end class evaluation asked students which works they had enjoyed most and least. To my astonishment, over half my juniors reported that “The Rape of the Lock” was their favorite work.

It took me a while to understand what this meant for curriculum and instruction: multiculturalism requires, not exposure to, but mastery of, the material. If it requires this much work for Vietnam-era juniors to enter into, experience, and enjoy vicariously the cultural and social world of eighteenth-century London Catholic society, perhaps, for many reasons, we should be effectively unicultural, though more inclusive as we move into the modern period, before we embark on eclectic and superficial multiculturalism.