
Reluctant, but Supporting Trump

Donald Trump can count at least one new supporter in this year’s election. “I had a close friend who’d been a business partner of Trump in the ’90s,” the critic and historian Fred Siegel tells me. “Trump ripped off a quarter of a million dollars from him. He told me this when we were discussing the election” four years ago. “Trump just said, ‘So, take me to court.’ I couldn’t vote for him.” Mr. Siegel couldn’t abide Hillary Clinton either, so he “slept through” the 2016 election. Next month he’ll be wide awake—though not woke—and will vote for Mr. Trump.

Joe Biden needn’t worry too much, perhaps. Mr. Siegel, 75, has only twice backed a winning presidential candidate since he reached voting age. But while he’s no bellwether, he does make an energetic case for the incumbent.

Mr. Siegel, a professor emeritus at New York’s Cooper Union and a senior fellow at the Manhattan Institute, says he overcame his distaste for Mr. Trump for three reasons. First, foreign policy: “Crushing ISIS, pulling us out of the Iran nuclear deal, moving our embassy to Jerusalem, and making fools of those people who insist that the Palestinian issue is at the heart of the Arab-Israeli conflict.” Second, his “ability to withstand a prolonged coup attempt by the Democrats and the media,” which started with the Steele dossier: “If I’m saying what I find impressive about Trump, it’s that he’s survived. He has an extraordinary amount of arrogance, egotism, and self-confidence.”

Mr. Siegel’s third reason goes to the heart of his own political philosophy. He sees the president as a champion of “bourgeois values,” under threat from the “clerisy,” Mr. Siegel’s word for the dominant elites who “despise” those values. He regards Mr. Biden as a “captive” of this clerisy, and running mate Kamala Harris as the “embodiment of it.”

“I don’t want to see her as president,” Mr. Siegel says of Sen. Harris. “I don’t want a San Francisco Democrat who’s likely to impose elements of the Green New Deal, which she sponsored but lied about sponsoring on television. If Biden wins, she will be president in short order. I don’t know how long Biden will last.”

In Mr. Siegel’s view, “hard work, faith, family and autonomy” have enabled America to thrive, and Mr. Trump stands for these values, even if he doesn’t always exemplify them. “The elite is largely detached from the middle class,” Mr. Siegel says. “The two major sources of wealth in the last 20 years have been finance and Silicon Valley. Neither of them has much connection to middle-class America, or Middle America.” Mr. Trump is “in favor of manufacturing jobs, which are often middle-class.” The president also “recognizes the ways in which China is a threat to the survival of middle-class life in America, directly and indirectly.”

Mr. Siegel takes heart from Mr. Trump’s hostility to political correctness. “Wokeness is a force that undermines the middle class,” he says, “and you couldn’t have had wokeness without an elite contempt for the values of the middle class.” Middle Americans see political correctness “as a threat to the democratic republic they grew up in, where people could speak their mind.” I ask Mr. Siegel to define political correctness: “The inability to speak the truth about the obvious.”

As we sit on his porch in the Brooklyn neighborhood of Ditmas Park, his opinions—unfashionable in a borough where Mrs. Clinton outpolled Mr. Trump by more than 60 points—cause passersby to turn their heads. When he offers examples of political correctness that annoy him, a young man walking by the house looks startled. “Why can’t you say ‘Wuhan virus’?” Mr. Siegel exclaims. “Why can’t you say there are two genders?” The young man scuttles past as if singed, and Mr. Siegel says, with palpable sadness, that people don’t stop to talk to him on his porch as much as they used to. Word has got around that he is “a Trump supporter, so fewer people schmooze with me.”

Mr. Siegel is the author of several books, including “The Future Once Happened Here: New York, D.C., L.A. and the Fate of America’s Big Cities” (1997) and “The Revolt Against the Masses: How Liberalism Has Undermined the Middle Class” (2014). Just out is “The Crisis of Liberalism,” a selection of his recent political essays, published by the small, independent Telos Press.

He started as a man of the left, and still describes himself as a protégé of Irving Howe, the democratic socialist literary critic. “Howe died young,” Mr. Siegel notes—in 1993, at 72. Mr. Siegel cast his first vote in 1968, while a doctoral candidate at the University of Pittsburgh studying the political economy of tobacco in Virginia: “I voted for Humphrey.” But he sat out the 1972 election. “I did not vote for McGovern or Nixon. I worked for McGovern as a spokesperson in Western Pennsylvania, and I was stunned to discover that he thought Henry Wallace had been right about a lot of things. Lightbulbs went off.”

In 1976 he voted for “Gerald Ford, the man.” Ford was “moderately competent and unpretentious. Jimmy Carter was pretentious. I thought his religiosity was painted on.” His aversion to Mr. Carter persisted, and in 1980 he backed John Anderson, a liberal Republican running as an independent.

Mr. Siegel voted for Walter Mondale over Ronald Reagan in 1984. “If anyone was going to make the Great Society work—and it was a mess by this time, a farrago—it was Mondale.” Mr. Mondale had “intelligence and knowledge,” but his defeat, and Reagan’s notable successes, made Mr. Siegel “rethink a lot of things.” A man like Mondale, he says, “would not be possible in today’s Democratic Party. There’d be no room for him.”

By the late 1980s Mr. Siegel had become “a centrist Democrat—part of a group that no longer exists.” Michael Dukakis was too liberal for Mr. Siegel, so he skipped the 1988 election. He became a fellow at the Progressive Policy Institute, the think tank of the centrist Democratic Leadership Council. He voted for Bill Clinton in 1992 and 1996 and advised—“but didn’t invoice”—Mrs. Clinton on her successful 2000 bid for Senate.

He didn’t vote in 2000 or 2004 and thinks George W. Bush “was a horrible president”: “The conduct of the Iraq war was extraordinarily inept. I supported the war initially, but I watched how it was being conducted, and I changed my mind.” The first time he voted for “the Republican Party as a party” was in 2008, by which time he had started to define himself as a conservative.

By 2012, when he voted for Mitt Romney, Mr. Siegel had developed an exceedingly low opinion of President Obama, whom he describes as “a faux intellectual with preacher’s cadences and an academic veneer.” In his opinion, “the worst thing” about Mr. Obama was “his effect on race relations. We couldn’t have the cold civil war we have now without Obama, because he, in a very cunning way, exacerbated all of our racial tensions.”

Under Mr. Obama, Mr. Siegel says, “racial grievance” took on a “new legitimacy, and it came from a president talking in asides, and saying things between the lines. He didn’t push back against anything, not even against the idea that Michael Brown said ‘Hands up, don’t shoot’ in Ferguson [Mo.], which was just a fabrication.”

Yet Mr. Siegel traces the origins of the “present-day contempt” for the middle class back a century. He cites H.L. Mencken’s demeaning of the bourgeoisie, in the celebrated editor’s coinage of “booboisie.” Mr. Siegel has written extensively on Herbert Croly, the political philosopher and co-founder of the New Republic, as well as on the novelists H.G. Wells and Sinclair Lewis (who, in 1930, became the first American to win the Nobel Prize for Literature). These three men, Mr. Siegel says, laid the foundation for an elite revolt against the American middle class that endures to this day.

“Croly’s idea was that the college-educated, the elite, should become a new aristocracy,” Mr. Siegel says. “Croly believed that the middle-class and their allies—latter-day Jeffersonians who advocated individual freedom and acted in their own self-interest—were impeding the path of the experts, who were ‘disinterested.’ ”

Wells and Lewis bolstered the view that the professional class was above the fray, giving the argument an almost aesthetic hue. “They thought the middle class was vulgar,” Mr. Siegel says. He cites a passage in Lewis’s novel “Main Street” (1920), which he regards as “a sardonic sally at the small-town American middle class and its commercial culture.” In the passage, Carol Kennicott, a young woman from the big city trapped by marriage in small-town America, describes Americans as “a savorless people, gulping tasteless food, and sitting afterward, coatless and thoughtless, in rocking chairs . . . and viewing themselves as the greatest race in the world.” In a word, deplorables.

Croly has been largely forgotten, Mr. Siegel says, because liberalism has been largely eclipsed. “Wokeism is not liberalism,” he says. “I don’t want to be unfair to liberals. I was very critical of liberals, but they were in favor of debate; they were in favor of empiricism, of open argument.” Wokeism, by contrast, is a “new secular revealed religion,” which involves no “investigation or empirical study.”

The eclipse of the old “Crolyite liberalism” began, Mr. Siegel says, in the 1980s and ’90s, with the eruption of postmodernism into American intellectual life. “There began to be an emphasis on ‘narratives’ and feeling, which undermined the Crolyite emphasis on empiricism and evidence.” Liberalism had already been weakened by Reagan’s victory in 1980. “There was questioning among liberals, and some self-doubt,” Mr. Siegel says. “But the questioning didn’t go far enough, and blame was placed squarely on Carter. He didn’t check all of Croly’s boxes; he wasn’t a natural Ivy League aristocrat. He was a farmer”—in contrast with John F. Kennedy, an archetypal Crolyite president.

There was, Mr. Siegel says, an ideological “hiatus” under Mr. Clinton, in which a party that had been “demoralized by the defeat of the technocrat Dukakis in 1988” recovered some of its mojo. But “postmodernism turning to wokeness was churning” in the 1990s. The 2000 election was “a trauma” for the Democrats, and Howard Dean’s unsuccessful candidacy for the 2004 nomination previewed “some of the craziness and hysteria that would come full-bore, on a broader scale, a decade later.” Wokeism achieved its apotheosis in 2014, in the aftermath of Michael Brown’s shooting. “Ferguson allowed Ivy League grads to assert their ‘natural leadership,’ in opposition to lowlife cops and guys with pickup trucks—again, the deplorables.”

In Mr. Siegel’s understanding, wokeism holds that “the important truths are already known, and that the American aristocracy has to impose those truths on the country.” These are “given positions”—irrefutable and sacrosanct. Wokeism, he says, is a “perilous threat” to America and particularly to the First Amendment. “It says we don’t need debate. We don’t need free speech. We don’t need freedom of religion. We need to obey.” Mr. Siegel’s vote is his personal act of disobedience.

Mr. Varadarajan is a Journal contributor and a fellow at New York University Law School’s Classical Liberal Institute.


Truth is not part of the progressive agenda or the MSM agenda.

WSJ 10/14/2020, on Shelby Steele

August was the sixth anniversary of the death of Michael Brown, the black teenager who was shot dead by a white police officer in Ferguson, Mo. The incident, and the nationwide coverage it attracted, marked the beginning of a period of mass protests against police, which culminated (let’s hope) after the tragic death of George Floyd in Minneapolis this May.

The fashionable explanation for what happened to Brown, Floyd and others—such as Freddie Gray in 2015 and Philando Castile in 2016—is so-called systemic racism. The activist left and the mainstream media insist that law enforcement targeted these men because they were black—and that if they weren’t black, they would still be alive. The truth is more complicated and less politically correct, and it’s the subject of an engrossing new documentary that is scheduled to premiere Oct. 16.


The film, titled “What Killed Michael Brown?,” is written and narrated by the noted race scholar Shelby Steele and directed by his son, Eli Steele. Readers of these pages probably know the elder Mr. Steele through his best-selling books and occasional Journal op-eds. But earlier in his career, Mr. Steele also won acclaim for his work in television. In 1990 he co-wrote and produced “Seven Days in Bensonhurst,” an Emmy-winning documentary about Yusef Hawkins, the black teenager from Brooklyn who was fatally shot in 1989 after he and some friends were attacked by a white mob.

In an interview this week, Mr. Steele, who is based at Stanford University’s Hoover Institution, explained the significance of Brown’s death and what it tells us about race relations today. “Michael Brown represented, even more so than Trayvon Martin, Freddie Gray and others, the distortion of truth, of reality,” he said. Mr. Steele added that when it comes to racial controversies, liberals have developed what he calls a “poetic truth,” which may be at complete odds with objective truth but nevertheless helps them advance a desirable narrative. In the case of Michael Brown, reality was turned on its head.

“It was almost absolute,” Mr. Steele said. “The language—he was ‘executed,’ he was ‘assassinated,’ ‘hands up, don’t shoot’—it was a stunning example of poetic truth, of the lies that a society can entertain in pursuit of power.” Despite ample forensic evidence, the grand-jury reports and the multiple Justice Department investigations clearing the police officer of any wrongdoing, “there are blacks today, right now in Ferguson, as I point out in the film, who still truly believe that Michael Brown was killed out of racial animus,” he said. “In a microcosm, that’s where race relations are today. The truth has no chance. It’s smothered by the politics of victimization.”

Yet Mr. Steele sees a better future, and the interviews highlighted in “What Killed Michael Brown?” help to explain his optimism. One of the film’s strong suits is showcasing the words and deeds of everyday community leaders in places like Ferguson, St. Louis and Chicago. These people are far more focused on black self-development than on badgering whites or blaming society for problems in poor black communities. They understand and accept objective truth but mostly toil in obscurity while liberal billionaires cut million-dollar checks to subsidize Black Lives Matter activism and antiracism gibberish from “woke” academics.

“It’s easy to say, ‘The white man, the white man,’ and point the finger,” says a pastor in the film whose church is located in one of Chicago’s most violent neighborhoods. “In reality, we have to take a very close look at ourselves.” His focus is on “the transformation of the person. And we’re telling them, hey, educationally, you gotta get it together. Economically, you gotta get it together. Family and spiritually, you gotta get it together. And you have to take responsibility.”

The president of the St. Louis NAACP chapter told Mr. Steele there was no evidence that the Ferguson protests had done anything to help the black people who live there. Property values have fallen, crime has increased, and schools continue to underperform. “Let’s be clear. The progressive agenda is not the black agenda,” he says. “The people in that community are no better off than they were prior to the death of that young black child. They’re no better off, and everybody knows it.”

Amazon, which was scheduled to stream the movie, is now having second thoughts and has placed it under “content review.” Eli Steele, the director, told me that he will resort to other streaming platforms if he has to and is referring people to the film’s website for more details on how to view it.

The progressive agenda may not be the black agenda, but it is the media’s agenda. Sadly, speaking plain truths about racial inequality in America today remains controversial.


Confederate Statues Controversy: A Historian’s View

A very well-thought-out and well-written article, which I commend to all. mrossol

7/17/2020 National Review. By Bruce Westrate

I was born in 1952, during the presidency of Harry Truman. Nine years later, this country began its centennial commemoration of the Civil War. I was completely swept up in it, writing letters to chambers of commerce all around the battle-affected states to solicit information on nearby battlefields, both decisive and inconsequential. A year later, my parents surprised me with a trip to Gettysburg in the family station wagon. The evening of August 29, 1963, I spent the night in the house used by Robert E. Lee as headquarters during that titanic battle. Upon returning home, I wrote a letter to Pulitzer Prize–winning historian Bruce Catton (A Stillness at Appomattox), gushing over the vistas I’d seen and the knowledge I’d gained. He was kind enough to respond, encouraging me to continue my historical studies.

Not surprisingly, I became a historian and teacher. I have visited battlefields all my life, captivated by the dramatic confrontations that bloodied those sites, as well as by the even-handed presentation provided by the national military parks. This, I have long assumed, owed its nature to the premium placed on reconciliation after the war, a premium that might allow a deeply wounded nation to heal, however imperfectly. I have taken my kids to Gettysburg, Antietam, and Petersburg, as well as the Confederate White House in Richmond.

Now, unhappily, I find myself consigned by the media to the ranks of would-be Nazis, mysteriously and involuntarily coupled to the most detestable ideology imaginable (fascism/Nazism), simply because I’ve always enjoyed such venues, along with the commemorative Civil War art that abounds among them. For the son of World War II vets, this is an uncomfortable fate to accept. So, over the last few days, I’ve been forced to ask myself the unavoidable query: “What’s going on here?” Why are so many young people, abetted by the feckless opportunism of politicians, turning to the likes of the Taliban for their example in ravaging parks and civic squares across the South, attempting to discredit Civil War heritage and to efface historical memory?

And what I’ve come up with is this: It’s all about safe spaces. For the last few years, bemused citizens of my vintage (67) have been treated to the spectacle of ideological self-segregation on some of America’s most elite college campuses, based largely on the proposition that, contrary to the reassuring rhyme we’d learned as children, words can indeed hurt you. Moreover, in the name of preempting “hate speech,” sticks and stones may well come in handy.

From the outset, this logic seemed preposterous to me. Words have objective meaning, after all, or at least they once did. And since the English language has more words than any other, our options in their use are virtually infinite. We write wills with words, enforce contracts through them, use them to power political, theological, and philosophic debates. Our civilization would be impossible without them. Therefore, in and of themselves, words possess not only concrete meaning, but the potential for absolute functional precision.

Yet, notwithstanding this treasured medium of language, we are now asked to believe that words have no inherent meaning at all that is independent of a recipient’s translation of them, and that there is no intrinsic truth in language that lies in the words alone. Whether a person has been offended, circa 2020, seems no longer to be a product just of the words themselves, but is instead a measure of the culture, race, or sensitivity of the person on the receiving end.


Likewise, with statues. Historical art reflects, and provides guideposts to, the culture that constructed it. There is value in that, in and of itself, irrespective of the monument in question. It provides tangible signposts on the road of our social and political evolution as a nation and a culture, a civilization.

Yet recent controversies seem to have handed the adjudication of our public taste, along with the preservation of our historical past, over to the mob: latter-day sans-culottes. Almost overnight, it seems, works of historically commemorative art, which have presided over squares and parks for a century or more, have been declared utterly and immediately intolerable, notwithstanding their provenance or antiquity.

All this would seem to suggest something other than an extemporaneous response to something real. Then again, mobs tend to specialize in faux spontaneity. So, at this point, perhaps it would be well to remember that the French Directory would have demolished Notre Dame Cathedral in 1799, but for Napoleon’s improbable rise to power. Indeed, its composite stones had already been auctioned off! Now billions are being raised for its reconstruction from last year’s fire.

It is hardly necessary, then, to catalogue the number of offensive monuments that may also have long ago reached that point of intolerability across the world. Shall we deconstruct Aztec pyramids, where so many Tlaxcalan hearts were yanked out? Shall the San Diego State football team stop calling itself Aztecs? Shall the FDR monument be removed to appease the descendants of interned Nisei Japanese or the victims of the Tokyo fire raids of 1945? Shall the Japanese imperial palace be dismantled in deference to the 17 million Chinese who perished in World War II? Whither the Colosseum? The Pyramids? Monticello? Mount Vernon? The White House? While time used to provide some (excuse the expression) “monumental” immunity, ISIS has given all of us pause with its destruction of Palmyra in Syria, as with the Taliban’s earlier obliteration (with artillery) of the colossal Buddhas carved into the cliffs at Bamyan in Afghanistan. “Who controls the past controls the future: who controls the present controls the past,” as Orwell reminds us.

And no case proves the rule more thoroughly than our current predicament. The breezy media conflation of the Confederacy with the Third Reich is so ahistorical and simple-minded that the connivance of our intelligentsia is laid bare by it. The Holocaust, while incontestably horrific, was hardly unprecedented in scale, given the grisly records of the Romans, the Mongols, the Spanish, the Soviets, the Communist Chinese, the Khmer Rouge, or the North Koreans. Yet Lenin’s statue stands unmolested in Seattle, and Karl Marx’s in Trier, Genghis Khan’s in Ulan Bator, Mao’s in Beijing, as do those of Francisco Pizarro in Lima, and Hernán Cortés in Mexico City. And, unless I am misremembering history, did not Alexander the Great own slaves? Caesar? The Medici? No less an icon than Aristotle posited that slavery was part of the natural order of things. What are we to do with him? And while it has been fashionable over the past decade to deplore the study of history as “written by the winners,” now we find an exception in the Confederacy. After all, we are now asked, “Why should losers have monuments?”

Well, there is a very good answer to the question, offered 20 years ago by Shelby Foote, author of the three-volume Random House narrative history of the American Civil War. Put simply, he speaks of a great compromise that evolved after the Civil War that was critical to speeding reconciliation and obviating a massive northern military occupation of the South. Grudgingly perhaps, Southerners publicly renounced the institution of slavery for which they had fought. In return, the North acknowledged the heroism of Confederate soldiers and the distinction of such fabled commanders as Lee and Jackson. As imperfect, indeed iniquitous, as the South became following the Reconstruction era, at least the geographic sections healed in political union. And when one contemplates the ferocity and interminability of most civil wars in history, that was no mean feat.

What is going on before our eyes, I think, is a concerted attempt to infantilize history by confecting a narrative so facile, so riven with the blithe dispensation of “virtue” and “evil,” that all complexity is lost, and context is rendered subservient to a cleansing, anodyne, apologetic narrative. And while making war on the dead through the prism of the present may provide abundant opportunities for moral preening, it renders meaningful reflection impossible, and a truer understanding of history forever beyond our reach. Future generations will be condemned to “processed” history, and the betrayal that fraud always leaves in its wake. For the facts of history are not intrinsically “good” or “bad”: They just are. And our adjudication of the past should not rest on our present-day assessment of “good” and “evil,” but on truth, at least to the extent we can determine and analyze truth objectively.

The trans-Atlantic slave trade was a horrific blot on the history of humankind, less for the uniqueness of its cruelty than for its peculiar nature: the heartless commodification of millions of human beings over three centuries. It was an atrocity driven largely by the profitability of sugar cultivation, which is why most captives ended up in either Brazil or the Caribbean rather than the future United States. Make no mistake: This was a holocaust. Over two million captives perished in the hideous “Middle Passage” from West Africa across the Atlantic, and perhaps 30 percent of those who arrived died soon after, depending on their destination. But it is also important to realize that slavery was a worldwide, institutionalized fact of life in the mid 19th century that, moreover, could not have occurred without the enthusiastic participation of many Africans themselves. Slavery had arrived with Islam in West Africa, and the depredations of slave catchers and slaving states in Africa (such as the Ashanti) to procure this human commodity, according to many scholars, may have carried away even more lives than the toll among those who subsequently died at sea or in western hemispheric captivity. No less a scholar than Henry Louis Gates, chair of Africana Studies at Harvard, put it this way in a New York Times op-ed: “90 percent of those shipped to the New World were enslaved by Africans and then sold to European traders . . . black people were just as complicit in the slave trade as whites.”

Gulled by the misrepresentation of the Roots TV series back in the 1970s, Americans have never really understood how slaves were acquired on the other side of the Atlantic. And while the trans-Atlantic trade had been largely suppressed by the Royal Navy before our Civil War, an even more vicious Arab trade in captives continued to scar most of East Africa for the rest of the century, killing over 80,000 a year, according to the renowned explorer David Livingstone. In all, perhaps 17 million Africans were enslaved by the Arabs in East Africa and the Nile Valley during the 19th century. And this tale of woe was replicated almost everywhere. Whether Russian serfs, Ottoman slaves, Chinese peasants, or Indian untouchables, all were utterly at the mercy of their “superiors.” Slavery, it must be understood, was less a black, white, brown, or yellow crime than a hideously broad-based human atrocity stretching back to far antiquity. I make this point not to absolve any civilization of responsibility for the nightmare of slavery, but to include all mankind within the indictment.

Our Constitution institutionalized slavery in its fundamental law, e.g., the notorious three-fifths compromise. That is why the Emancipation Proclamation did not apply to slaves under Union control: Lincoln had chosen to employ a war measure under his brief as commander-in-chief, and it could reach only territory in rebellion. Passage of the 13th Amendment was not completed until after the war’s conclusion. Slavery was America’s original sin. However, neither America nor Western civilization created the institution of slavery.

Great Britain, however, should at least be credited with the eventual impulse toward abolition of the trade in the 19th century, after what was perhaps the first successful instance of popular activism the world had ever known, led by the likes of Wilberforce, Clarkson, Newton, and Equiano. Further, what particularly distinguishes the United States in this story, far beyond the besmirching reality of the Confederacy, is its prosecution of a war in which the abolition of the not-so-peculiar institution became an avowed, pronounced war aim. Nearly 350,000 Union soldiers gave their lives to that end, 40,000 of them African-American.

Robert E. Lee seems a particularly curious target for the vitriol of the mob. Lee was a lifelong soldier, hero of the Mexican War, superintendent of West Point. In 1861, it was to him that Abraham Lincoln initially offered command of the entire Union Army. Lee was arguably the most brilliant tactician this nation has ever produced. His victories at Chancellorsville and Second Manassas are still studied today. Fatefully, he chose his state over his country, not to protect slavery, which he had denounced as a moral evil, but to defend his home. If not a commendable impulse in hindsight, it is perhaps an understandable one.

Yet Lee’s greatest service to our country may well have come at the end of the War, at Appomattox. One word from “Marse Robert” likely would have been enough to have sent thousands of Confederate soldiers scurrying into the hills to carry on guerrilla warfare against occupying Union forces for decades to come. Most civil wars do not end as “civilly” as ours did. Instead, Robert E. Lee told his men to go home and become citizens again. And this is the man whose image we choose to deface? This is not just censorship; it is iconoclasm.

Drew Gilpin Faust has observed dramatically in “This Republic of Suffering” that the American South suffered more per capita war trauma than any other region on earth, except for the fabled “bloodlands” situated between Germany and Russia in the 1940s. “The American Civil War,” she observes, “produced a carnage that has often been thought reserved for the combination of technological proficiency and inhumanity characteristic of a later time.”

And while the erection of commemorative statues unfortunately coincided with the emergence of the “Jim Crow” South, there are more understandable motivations that, I would argue, took precedence. These were martial creations, after all, intended to commemorate battlefield feats. Historians have long observed that veterans typically (and understandably) avoid public remembrance and consecration of battlefield combat until decades after the event. The erection of these statues coincides with the dedication of most of the larger American battlefield parks and cemeteries. So, they were aimed less at betrayed freedmen than at kindling popular remembrance of the slain, along with the suffering that wounded Confederate veterans had endured.

Was it so unnatural, then, for a society composed largely of traumatized ex-Confederates (which the southern population incontestably was after the Civil War) to commemorate those virtues which had not been completely obliterated by the misbegotten cause for which they fought: heroism, sacrifice, camaraderie, triumph (however ephemeral), as well as tragedy? These are universal impulses, which have been lauded by many other peoples, in many other places, in many other times. The sin of slavery is ineradicable in the public mind. So too must be our memorial ties to the past. That is how a civilization disavows its mistakes, reaffirms its achievements, and matures over time. It is that strand of cultural continuity connecting the past to the present and, ultimately, to posterity which, as Edmund Burke pointed out in responding to the horror of the French Revolution, allows a civilization to endure.

Saddest of all, perhaps, is the failure of so many people to reflect on the fact that many of these monuments were erected simply to honor the dead. Lots of dead. Among Confederate soldiers, only 6 percent owned any slaves. Most were dirt-poor farmers answering the call, however wrongheadedly we may construe it now, to defend their homes from outsiders.

As for all those boys in blue, without whom the institution of slavery would have lived on? Strange, isn’t it, how few in the hysterical mob seem to recall them at all. A pity, that.


Oh Yes, Ban the Redskins

Yep, this is where the spineless are “dragging” us. mrossol

7/16/2020 WSJ by Daniel Henninger

The Washington Redskins logo on the team’s home field in Landover, Md., Aug. 28, 2009.


For now, the Washington Redskins are just the Washington Something or Others, a team with no name. After holding out for years against the inertial forces of political correctness, the Washington football team caved. Hmm, maybe “caved” is inappropriate language now. They gave up.

You knew the Redskins were done as soon as Dreyer’s Grand Ice Cream said it was dropping Eskimo Pie so the company could be “part of the solution on racial equality.” When I was growing up, Eskimo Pie always made me think Eskimos were great. But what did I know?

I’ve been fighting the team-name wars for years, most recently over baseball commissioner Rob Manfred’s goofy suppression of the Cleveland Indians’ Chief Wahoo.

You have to know when you’re licked. Sorry, wrong word. I mean beaten. Double-sorry; no one should be beaten. I mean defeated. I am defeated. Instead of complaining about the Redskins, it’s time to get ahead of the logo posse and eliminate a lot of really terrible sports-team names. Many of these teams probably think there’s no way their names would offend anyone. They are about to find out how wrong they are.


First we get rid of the low-hanging, already rotting fruit: the Chicago White Sox, the Boston Red Sox and the Cleveland Browns. White, red, brown and black are unspeakable and unthinkable colors now—for anything. The Chicago Green Sox would be OK. Many pro athletes are weirdly attracted to the color pink, so the Boston Pink Sox would work.

Clevelanders will object that even if most people under 20 years old think the Cleveland Browns offends the race gods, the Browns are named after team founder Paul Brown, who, as luck would have it, was a white guy. The easiest solution would be to abolish the Browns once and for all. Who would notice?

Sing “hey hey, goodbye” to any team whose name suggests centuries of systemic privilege: the Kansas City Royals, Los Angeles Kings, Sacramento Kings, Vegas Golden Knights and Cleveland Cavaliers. And hasn’t the moment come for LeBron James to renounce “King James”?

Let’s admit it: Times have changed. The highest value in modern American life is feeling safe. Not “safe” in the sense of not being gunned down tomorrow night. I mean safe the way a college student or street protester feels “unsafe” if bad thoughts are brought to mind.

By this measure, the list of violative professional sports-team names is endless.


The Denver Broncos? Broncos are abused horses forced to buck and then submit by a Dallas Cowboy kicking them with San Antonio Spurs. They’ve all gotta go. Ford Motor just resurrected its Bronco SUV. What terrible timing. Dump it.

Too many teams are still dependent on fossil fuels: the Detroit Pistons, Edmonton Oilers and Pittsburgh Steelers. Let’s clean up the Steelers by renaming them the Pittsburgh Windmills.

The Philadelphia 76ers? Surely they’re already on their way to being rehabilitated as the Philadelphia 1619s.

The Miami Marlins shamelessly expropriated the name of a vulnerable species. They should be renamed the Miami Minnows.

Anyone who thinks names like this honor endangered species doesn’t understand why statues of George Washington have to go. The Minnesota Timberwolves should leave the wolves alone and call themselves the Minnesota Lutefisk.

Names associated with religious belief are also a problem. The New Jersey Devils imply God exists. Ditto the New Orleans Saints, and the Boston Celtics evoke Irish Catholics. Get rid of them.

The Portland Trail Blazers celebrate genocidal pioneers. The San Francisco 49ers are named after 19th-century California gold-diggers who raped the environment.

The Houston Rockets have an impossibly male-sounding name and should compensate by becoming the Houston Rockettes.

The Colorado Avalanche evokes death. The New York Rangers sound like the police. The Texas Rangers are the police. What were the San Diego Padres thinking?

The Chicago Bulls are another team named after an abused animal, not to mention the consumption of animal protein. A new name that comes to mind is the Chicago Jordans in honor of Michael, but that will remind some people of the Jordan River and the plight of the Palestinians.


Don’t get me started on teams that think they’re safe by hiding behind the names of birds or animals. The Toronto Blue Jays are named after a nasty bird. The Atlanta Hawks kill rabbits. Just the words “Miami Dolphins” make me want to cry.

The Miami Heat may be the future, invoking the problem of climate change, and we can’t be reminded of that too often. The about-to-die Cleveland Indians could become the Cleveland Cold.

The team name of the Utah Jazz never made sense to me, but it does suggest that rebranding teams as musical instruments might be safe. The New England Patriots are problematic in so many ways. Patriotism? Are you kidding me? I look forward to them coming back as the New England Trombones.

For now, Washington sits with a nameless football team. How about calling the team in the nation’s capital the Washington Nothings? That sounds like something we could all agree on.