During a recent conversation about groupthink in the arts, a playwright friend of mine told me about his experience last year of rehearsing his latest drama. A director had been enlisted to take charge, and from the outset was determined to impose his ideological values on to the production. Before long he was cutting lines that he considered ‘problematic’, and policing how these topics were discussed in rehearsal. The characters as portrayed in the script were morally ambiguous but, to the frustration of the writer, this director was adamant that the show must convey the ‘correct’ message. For him, theatre was simply another tool to spread the word of social justice.
Anyone who works in the arts will be aware of the deleterious impact that woke politics has had on creative freedom, although few will be bold enough to admit it publicly. That this director was enjoying his power was obvious, and I did suggest to my friend that perhaps a more open-minded practitioner might have improved the process. ‘That’s the trouble’, he said to me, ‘we couldn’t find one’. By the time the production’s run had started, my friend was struck by a terrible realisation. He had been treating this director as one might a dangerous dog: overly anxious not to cause him any displeasure, and always aware that he could lash out and bite at any moment.
If you have ever had dealings with identity-obsessed ideologues, this analogy will doubtless ring true. The rapid advance of the identitarian worldview in mainstream cultural, educational and political institutions, most notably via critical race theory and LGBTQIA+ activism, has been accelerated by one factor above all: intimidation. According to surveys by social scientists, the percentage of American citizens who are afraid to express their political views openly has tripled since the McCarthy era. As I have argued previously on spiked, it is self-censorship rather than state censorship that represents the most direct threat to the intellectual health of contemporary society. This is what John Stuart Mill meant when he wrote about the ‘despotism of custom’ and ‘the tyranny of the prevailing opinion and feeling’.
Ultimately, however, self-censorship is a choice, even at a time when speaking out can have ruinous personal consequences. If we allow ourselves to be intimidated into silence, there is little hope that the regressive effects of identitarian activism will be curbed. The Salem witch trials of 1692 only came to an end after the wife of the governor of the colony was implicated, and a sufficient number of villagers felt brave enough to express their doubts about the accusers. Today there are powerful ideologues who seek to foster division and undermine our liberal values in the name of social justice, and we can be sure they won’t back down while the majority remain silent. As Arthur Miller puts it in The Crucible (his dramatisation of the hysteria in Salem), ‘the little crazy children are jangling the keys of the kingdom’, safe in the knowledge that those who cross them will be the next to be condemned.
Our new culture of conformity is spreading, and its impact is being felt in all walks of life. Recently, an actor friend of mine was contacted by her agency because she had not posted anything on social media in support of the Black Lives Matter movement. She was told that she must do so immediately if she wanted casting directors to consider her for any future roles. I am hearing stories like this every day, but invariably they are communicated privately. There is a strong general feeling that to publicly object to the prevailing dogma is to jeopardise one’s livelihood and reputation.
This is the sad reality of most present-day working environments, where to utter an unfashionable opinion, to misspeak, or even to fail to show due fealty to received wisdom can be an impediment to future career prospects. As a former teacher, I am still in contact with ex-colleagues who are troubled by the sudden revisions made to curricula and pastoral policies. Under the misleading banner of ‘anti-racism’, the reactionary tenets of critical race theory are being foisted upon schools across the UK. Important struggles against genuine racism are being undermined by influential activists who insist that we live in an irredeemably white-supremacist culture. Highly contested notions such as ‘whiteness’, ‘white privilege’ and ‘white fragility’ are being treated as the gospel truth, not just at schools but throughout the public and private sectors. Many companies are forcing their employees to undergo ‘unconscious bias’ training, even though there is overwhelming evidence that such schemes are unreliable and ineffective. To raise a complaint is taken as proof of the kind of prejudice that the tests seek to expose. After all, only a witch would deny the existence of witchcraft.
It is a great irony that the strategy of smearing people as racist without cause is only successful because we live in a country that no longer considers racism in any way socially acceptable. Even the allegation is enough to make the accused unemployable. In other words, the claim that ours is an endemically racist society is fatally undermined by the very tactics that its proponents deploy. Bizarrely, the response to this reality from many leftist identitarians has been simply to deny the existence of what has become known as ‘cancel culture’. Yet we have all seen numerous examples of people who have been hounded out of their jobs for causing offence, either through misjudged jokes or ‘problematic’ views.
When a group of well-known figures – including Salman Rushdie, Noam Chomsky and Margaret Atwood – published an open letter in Harper’s Magazine decrying this new climate of intolerance, the backlash was intense. The relative privilege of the signatories was held up as evidence that cancel culture is a myth, and that these rich celebrities were simply worried about being held accountable for their views. This, of course, is to miss the point spectacularly; it was precisely their financial security that enabled them to sign the letter in the first place. As Tom Slater has pointed out, cancel culture is a phenomenon that most greatly impacts on ordinary working people. It only takes a few highly publicised instances of ‘cancellation’ to create the kind of atmosphere in which the public is intimidated into toeing the line.
As the social-justice ideology gains momentum, it will become increasingly apparent that to ignore it will only guarantee its dominance over our lives. Its advocates will continue to deny biological reality and threaten you if you will not acquiesce. They will tell you that the kind of colour-blindness advanced by Martin Luther King is a form of racism, rather than a beautiful ideal. They will bully people in the name of compassion, promote division and call it progressive, and rehabilitate a new form of racism under the guise of tolerance. They will insist that you accept their nebulous theories as fact, and couch their incoherent ideas in jargon and obfuscation. They will use inflammatory language to misrepresent your concerns, accuse you of ‘erasing’ people’s existence or of committing acts of ‘violence’ through speech. They will claim that there is no such thing as objective truth, but demand that you acknowledge the truth of their ‘lived experience’. They will carry on feeding the far right by elevating identity politics, and claim to be opposing fascism through their authoritarian methods. And if you dare to suggest that any of their demands should be subject to discussion or debate, they will not hesitate to brand you a bigot.
When this happens, it is our responsibility to take a stand. We should defend those who are the targets of bullying campaigns, whether they are being attacked for what they have said or what they refuse to say. We should not abandon the liberal values that still function as our best defence against racism and other forms of intolerance. We should not allow ourselves to be intimidated by threats, insults and false accusations. The desire for a quiet life is entirely understandable, but surely we have reached the point where the keys of the kingdom must be wrenched back from the hands of the crazy children.
Andrew Doyle is a stand-up comedian and spiked columnist.
“We are frequently informed that all areas of our lives in the West need to be “decolonised”; think about that for a moment. Have you noticed that it is only white majority societies that need to be “decolonised”? We rarely see demands for Islamic societies to decolonise, despite the long term occupation of non Islamic lands. In other words it is only a charge levelled at white people which, in essence, means that white society has to abandon its culture and deny its culture and instead be both culturally colonised and in the long term literally colonised.” ~Martel
Goldsmiths Library, which is part of the University of London, has proudly announced that it plans to ‘decolonise and diversify’ its collections. This will apparently allow it ‘to de-centre Whiteness, [and] to challenge non-inclusive structures in knowledge management and their impact on library collections, users and services’.
Of course, decolonising libraries is only one element of the broader project to decolonise the university, which also includes demands to decolonise curricula. But it is still a significant move.
At first glance, diversifying libraries sounds like a harmless idea. There is little to object to in the idea of sourcing more books from nations outside Europe. Students can benefit from being able to access books written by brilliant African authors just as they benefit from their existing access to books by European authors. After all, what matters is the quality of books, rather than their country of origin.
But even if these were well-intentioned plans, it is the unintended consequences we should be worried about. For a start, diversifying library collections is an expensive business. Many universities are already in a parlous financial position thanks to the pandemic, with student deferral rates up by 17 per cent, and the number of fee-paying international students set to plummet by nearly 50 per cent. In such challenging conditions, channelling dwindling finances into various decolonisation initiatives is only likely to result in other, arguably more important, university services being deprived of support and investment.
Moreover, libraries in universities and beyond often provide a vital service for many of the least advantaged groups in society. For those without wifi access or computers at home, they offer internet access. And for those living in crowded or noisy conditions, they provide the space and quiet to read and concentrate. And, above all, they allow many simply to find and read books that they otherwise wouldn’t be able to. That, after all, is the whole point of a library: to allow people access to books. Yet too much focus on fashionable diversity and decolonisation projects could easily see all these vital services suffer.
And what of the effect on domestic book suppliers and sellers? If a decolonised and diversified book provision is presented as virtuous, does that mean those who fail to diversify and decolonise will be condemned or, worse, cancelled? Think not just of other university libraries or local libraries, but also of small independent bookstores unable to import books from far away. Are they going to be singled out and cancelled if they cannot afford to decolonise? Something similar has already happened to The Tattered Cover, a small bookstore in Denver, US, which failed to toe the woke line on the Black Lives Matter protests earlier this summer.
There is no doubting the social and economic challenges that too many in the UK face today. Such challenges are only likely to grow in the devastating aftermath of the pandemic and lockdown. But is directing resources towards the decolonisation and diversification of library book shelves likely to help people overcome these challenges? It seems unlikely.
In order to tackle real problems, rather than make woke activists feel good about themselves, we need strong and bright young people. People, that is, who would benefit from the vital service libraries already offer: namely, access to the best that has been thought and said. The decolonisation of libraries is likely to prove a costly and damaging distraction.
Measured by his contributions to economics, political theory, and intellectual history, Thomas Sowell ranks among the towering intellects of our time. Yet, rare among such thinkers, Sowell manages never to provoke, in the reader, the feeling of being towered over. As Kevin Williamson observed, Sowell is “that rarest of things among serious academics: plainspoken.” From 1991 until 2016, his nationally syndicated column set the bar for clear writing, though the topics he covered were often complex. “Too many academics write as if plain English is beneath their dignity,” Sowell once said, “and some seem to regard logic as an unconstitutional infringement of their freedom of speech.” If academics birth needlessly complex prose, editors too often midwife it. An editor, Sowell once quipped, would probably have changed Shakespeare’s “To be or not to be, that is the question” to something awful, like “The issue is one of existence versus non-existence.”
Consider Sowell’s clear, brief explanation of the economic idea of “scarcity.” “What does ‘scarce’ mean?” he asks in his layman’s textbook, Basic Economics. “It means that what everybody wants adds up to more than there is.” Not only is pointless complexity absent from Sowell’s prose; so is the first-person perspective. The words “I” or “me” scarcely show up in his 30-odd books, but for his memoir, A Personal Odyssey.
To his critics, Sowell’s writing style is severe. But to his fan base—which includes figures as different as Steven Pinker and Kanye West—it’s a refreshing break from the self-absorbed drivel that frequently passes for cultural commentary nowadays. Pinker, a Harvard psychologist and leading public intellectual, named Sowell the most underrated writer in history. West, for his part, tweeted out a handful of Sowell quotes to millions of followers in 2018.
Sowell’s first piece of writing was published in 1950—a letter to the now-defunct Washington Star, urging the desegregation of the city’s public schools. The only hint during this period that he would someday be an economist was a budding interest in Karl Marx. For Sowell, Marx’s ideas “seemed to explain so much,” including his own “grim experience.” At the time, Sowell was a 20-year-old high school dropout, working as a clerk by day and taking classes by night—a situation that actually marked an improvement over his being unemployed and, for a time, homeless in his late teens.
Sowell’s experience had not always been so grim. Though his father died before he was born and his mother soon after, he nevertheless remembers his early childhood as a happy one. He was raised by his great-aunt in a house without electricity or hot water—typical for black North Carolinians in the 1930s. At the time, it never occurred to Sowell that they were poor; after all, they “had everything the people around [them] had.” Nor did he realize what it meant to be black in the era of Jim Crow. White people were “almost hypothetical” to him as a child. Indeed, it “came as a shock” to learn that most Americans were not black.
Sowell’s world expanded radically when his family moved to Harlem in 1939. It was the Harlem of James Baldwin (six years Sowell’s elder), and among its offerings were public libraries, which a nine-year-old Sowell gravitated to, and fistfights, which he had no choice but to engage in frequently. “At one point,” he recalls, “getting home for lunch safely became such an ordeal that a friend would lend me his jacket as a disguise, so that I could get away before anyone could spot me.”
Nor did his troubles end when he got home. With each passing year, his relationship with his great-aunt deteriorated, hitting a breaking point after he enrolled at Stuyvesant, New York City’s most prestigious public high school. An untimely illness, together with a heavy workload, conspired to make schoolwork unmanageable. Before long, Sowell was skipping class altogether, even as he and his adoptive mother engaged in internecine warfare: she threw his treasured art supplies away; he smashed her favorite vase; she called the police on trumped-up charges; he threatened to leave home.
The conflict escalated until it reached the brink of actual violence. In his memoir, Sowell recounts the painful climax:
“How long is this gonna go on, Thomas?” she asked me one day.
“Until someone cracks,” I said. “And it won’t be me.”
She tried being sanctimonious as I walked away, but I turned on her.
“You lying hypocrite!” I said, and launched into a tongue-lashing that left nothing to the imagination.
Wild with anger, she grabbed a hammer and drew it back to throw it. I was too far away to take it away from her, so I said: “Throw it—but you had better not miss.”
Trembling with anger, more so than fear, she put the hammer down. Afterwards, she seemed to understand at last the reality of our relationship, that we were simply enemies living under the same roof.
Sowell soon got himself emancipated and found a shelter for homeless youth. “It was now very clear to me that there was only one person in the world I could depend on,” he realized. “Myself.” With little more than the clothes on his back, he began a long journey that would lead him to the Marines, the Ivy League and, briefly, government service at the Department of Labor.
In another cultural milieu, Sowell’s life could be the raw material for a compelling biopic or documentary. Instead, his story languishes in relative obscurity. This is partly because Sowell, after years of being a Marxist, ended up somewhere between libertarian and conservative—an orientation decidedly unwelcome in Hollywood. But he also does not wear his life story on his sleeve, and much in our culture today values “lived experience” over logical argument. In her best-selling book, White Fragility, Robin DiAngelo advises that, when talking to black people about race, white people should avoid being silent or emotionally withdrawn—but also avoid arguing. (She considers the phrases “I disagree” and “You misunderstood me” to be off-limits, for example.) For whites, the only option left, apparently, is to agree enthusiastically with whatever a black person says. By contrast, Sowell insists that his work “stands or falls on its own merits or applicability” and is not “enhanced or reduced by [his] personal life.”
His rejection of “lived experience” as a substitute for evidence, however, should not be confused with the view that experiences do not matter. In fact, Sowell’s work at times does reflect episodes from his life—often painful ones. The most striking example concerns his son, John Sowell. John was born healthy and seemingly normal, but as time passed, it became clear that something was wrong. Well past the age when most kids begin speaking in full sentences, John would scarcely utter a word. To outsiders, and even to Sowell’s then-wife, it seemed a clear case of mental disability. Yet Sowell wasn’t convinced. Speech problems aside, John was unusually bright: he could pick child locks before he could walk, for instance. And he had a prodigious memory: he once knocked over a chessboard mid-game and put all the pieces back in their former places. Given these underlying signs of intelligence, his failure to grasp even the simplest words was all the more mystifying. Yet hope came when, around age four, John slowly started to speak, and final vindication came when he grew up to become a well-adjusted young man.
Decades later, after his son had graduated from Stanford, Sowell set out to explain the puzzle. The result: the first academic study ever to explore the phenomenon of late-talking children who are unusually bright but not autistic. Drawing on this original research, as well as anecdotes, data, and history, Sowell wrote two books: Late-Talking Children, in 1997; and The Einstein Syndrome, in 2001. The second—named after history’s most famous late talker—won praise from Steven Pinker as “an invaluable contribution to human knowledge.” But apart from child-psychology specialists like Pinker, and parents of late talkers, these books received little public notice. Yet they represent a remarkable achievement: in an era of high academic specialization, it’s vanishingly rare for a scholar to break new ground in a field in which he has no formal training.
Sowell’s books on economics, the field in which he is trained—he received his Ph.D. from the University of Chicago in 1968—form the core of his achievement. Foremost among them is Knowledge and Decisions, first published in 1980. The book draws its inspiration from Friedrich Hayek’s classic 1945 essay “The Use of Knowledge in Society.” The knowledge that concerned Hayek was not timeless, scientific knowledge of the sort discovered by Einstein, or the bureaucratic knowledge that a government agency gathers, but practical knowledge—the kind required, say, to run a deli on a particular street corner in a specific neighborhood or grow crops on a particular plot of land in a variable climate. Knowledge of this kind is both fleeting (what was true last week may not be true this week) and local (what is true on one street corner may not be true on the next). No single person can ever possess much of it.
If it were possible for the sum total of such knowledge, distributed among millions of different minds, to be collected and conveyed to a single mind in real time, then a central planner could direct the economy like a maestro conducts an orchestra. Of course, it’s not possible, but Hayek’s insight was that the price mechanism achieves the same result, anyway. If tin suddenly becomes scarcer—either because reserves have been destroyed or a new use for it has been discovered—no central planner is needed to get consumers to use less of the metal. People do not even need to know why tin has become scarcer. Armed with no information other than the increased price of tin, millions will reduce their use of it, as if directed by an omniscient force. Put another way, what would require an impossible amount of knowledge and conscious coordination in the absence of prices requires neither in their presence.
Where Hayek’s essay ends, Sowell’s magnum opus begins. As the title suggests, the book is not only about knowledge (in Hayek’s sense) but also about the decisions we make—in economics, politics, war, and much else—based on such knowledge. In a world where each person’s knowledge amounts to a speck in an ocean of ignorance, Sowell’s thesis holds that “the most fundamental decision is not what decision to make but who is to make it.” While decision makers may speak in terms of goals—ending poverty, reducing racism, spreading democracy, and so on—all they can actually do is to begin processes. Thus, when faced with the question, “Who gets to decide?,” we ought to answer not by reference to the superior goals or moral fiber of some institution or another but to the incentives and constraints facing different decision makers.
The American Revolution, with its emphasis on checks and balances, provides the classic example of Sowell’s thesis put into practice. Drawing on “knowledge derived from experience,” Sowell writes, the Founders assumed that humans are basically selfish and created a system of incentives and constraints that would impede selfish leaders from doing horrible things. By contrast, the French Revolution, based on “abstract speculation about the nature of man,” assumed the opposite—that man was perfectible and that government was the instrument of perfection. The very different consequences of these two revolutions, according to Sowell, were no accident.
The more common choice between decision makers pits the government against the market. Yet for Sowell, “the market” is “a misleading figure of speech.” Many “refer to ‘the market’ as if it were an institution parallel with, and alternative to, the government as an institution.” In reality, “the market” is not an institution; it is “nothing more than an option for each individual to choose among existing institutions, or to fashion new arrangements suited to his own situation and taste.” The need for housing, for example, “can be met by ‘the market’ in a thousand different ways chosen by each person—anything from living in a commune to buying a house, renting rooms, moving in with relatives, living in quarters provided by an employer, etc.” Market arrangements may differ, but what unites them—and separates them from government plans—is that those who make decisions experience both their costs and benefits. Their feedback mechanisms are therefore instantaneous.
Though the connection is less obvious, Knowledge and Decisions reflects Sowell’s life as much as his books on late-talking children do. Like the American Founders, Sowell came to his view of government more through experience than through philosophy. In 1960, he worked as an economist with the Labor Department. His task was to study the sugar industry in Puerto Rico, where the department enforced a minimum-wage law. Upon discovering that unemployment was rising with each increase in the minimum wage, Sowell wondered whether the law was causing the rise—as standard economic theory would predict. His coworkers had a different take: unemployment was rising because a hurricane had destroyed crops. Eventually, Sowell came up with a way to decide between the competing theories: “What we need,” he told his coworkers excitedly, “are statistics on the amount of sugarcane standing in the field before the hurricanes came through Puerto Rico.” He was met with a “stunned silence,” and his idea was dismissed out of hand. After all, administering the minimum-wage law “employed a significant fraction of all the people who worked there.”
This was not an isolated experience. In 1959, Sowell was working as a clerk-typist for the U.S. Public Health Service in Washington. One day, a man had a heart attack just outside the building. He was taken inside and asked if he was a government employee. If he had been, he could have received treatment in the same building, immediately. But he was not—so he had to be sent to a hospital across town. It was rush hour, and by the time he got there, he was dead. Sowell captured the dark irony: “He died waiting for a doctor, in a building full of doctors.” As with the Labor Department, the problem was not the employees, who “were very nice,” he remembers; it was the “nature of a bureaucracy” itself, with its bad incentives and slow feedback mechanisms.
Dark irony (usually the result of some government program) is a frequent theme in Sowell’s work. One fact referenced in Basic Economics is typical: as part of an effort to support farmers during the Great Depression, the federal government bought 6 million hogs in 1933 and destroyed them—while millions of Americans were struggling to feed themselves. Modern bureaucracies, of course, can hardly escape ridicule. During the early months of the coronavirus pandemic, common sense led many people to wear masks in public, since it was well known that the virus spread mainly through coughing. Yet for months, the World Health Organization and the Centers for Disease Control advised people not to wear masks—only reversing their advice after the pandemic had nearly reached its peak. Unlike a business owner confronting a market test, no one in these organizations will necessarily pay a price.
A Conflict of Visions (1987) represents Sowell’s best effort to put his ideas in dialogue with their opposite. He begins the book by observing a strange fact: people predictably line up on opposite sides of political issues that seemingly have nothing in common. For instance, knowing someone’s position on climate change somehow allows you to predict their views on taxing the rich, gun control, and abortion. It’s tempting to dismiss this as mere political tribalism. But Sowell contends that more is at work: that there are two fundamental ways of thinking about the social world, two sets of basic assumptions about human nature, and two conflicting “visions,” from which most political disagreements follow. He names these the constrained vision and the unconstrained vision.
The constrained vision underlies Knowledge and Decisions. It maintains that humans are inherently more flawed than perfectible, more ignorant than knowledgeable, and more prone to selfishness than altruism. Good institutions take the tragic facts of human nature as given and create incentive structures that, without requiring men and women to be saints or geniuses, still lead to socially desirable outcomes. A good example is the price mechanism as described by Hayek. Centralized power is treated with suspicion, as the humans who wield it will be self-interested, or worse. What’s more, in the constrained vision, traditions and social mores are trusted because they represent the accrued wisdom of untold generations.
As for the unconstrained vision, if humans are flawed, selfish, and ignorant, it is not due to the unchangeable facts of our nature but to the way that our society happens to be arranged. By reforming our economic system, our education system, our laws, and other institutions, it is possible to change the social world in fundamental ways—including those aspects of it purportedly fixed by human nature. Through enlightened public policy, often implemented by a central authority, evils once assumed as inevitable are revealed to be social constructs or products of outdated ideas. Traditions should receive no special reverence, in this vision, but live or die according to their rationality (or lack thereof), as judged by modern observers.
A frequent theme in Sowell’s writing is what philosophers would call reversing the explanandum—the phenomenon to be explained. Take poverty. Many observe the enormous chasm between rich and poor nations and, understandably, wonder why poverty exists. But the real question, in the constrained vision, is why wealth exists. “Standards of living far below what we would consider to be poverty have been the norm for untold thousands of years. It is not the origins of poverty which need to be explained,” Sowell writes in his recent Wealth, Poverty and Politics. “What requires explaining are the things that created and sustained higher standards of living.” In personal matters, too, he is quick to notice a mistaken explanandum. “Age 86 is well past the usual retirement age,” he noted in the final installment of his column, “so the question is not why I am quitting, but why I kept at it so long.” One major difference between the two visions is where they locate the explanandum when viewing the social world. “While believers in the unconstrained vision seek the special causes of war, poverty, and crime,” Sowell writes in Conflict, “believers in the constrained vision seek the special causes of peace, wealth, or a law-abiding society.”
Sowell’s great contribution to the study of racial inequality was to reverse the explanandum that has dominated mainstream thought for over a century. Intellectuals have generally assumed that in a fair society, composed of groups with equal inborn potential, we should see racially equal outcomes in wealth, occupational status, incarceration, and much else. That racial disparity is pervasive is seen either as proof that racial groups are not born with equal potential or that we don’t live in a fair society. The first position predominated among “progressive” intellectuals in the early twentieth century, who blamed racial disparity on genetic differences and prescribed eugenics as a cure. The second has dominated the academy since the 1960s and is now orthodoxy on the political Left. Democrats as moderate as Joe Biden have charged that America is “institutionally racist,” and when asked to prove it, the reply almost always points to statistical disparities between whites and blacks in wealth, incarceration, health, and in other areas. The suppressed premise—that statistical equality would be the norm, absent racism—is rarely stated openly or challenged.
In a dozen books, Sowell has challenged that premise more persuasively than anyone. One way he pressure-tests this assumption is by finding conditions in which we know, with near-certainty, that racial bias does not exist, and then seeing if outcomes are, in fact, equal. For example, between white Americans of French descent and white Americans of Russian descent, it’s safe to assume that neither group suffers more bias than the other—if for no other reason than that they’re hard to tell apart. Nevertheless, the French descendants earn only 70 cents for every dollar earned by the Russian-Americans. Why such a large gap? Sowell’s basic insight is that the question is posed backward. Why would we think that two ethnic groups with different histories, demographics, social patterns, and cultural values would nevertheless achieve identical results?
Sowell notes, too, cases of minority groups with no political power nevertheless outperforming the dominant majorities oppressing them. His favorite example is the successful Chinese minority in Southeast Asia. But he has also written about the Jews in Europe, the Igbos in Nigeria, the Germans in South America, the Lebanese in West Africa, and the Indians in East Africa. Perhaps the most striking American example is the Japanese. The Japanese peasant farmers who arrived on America’s western coast in the late nineteenth and early twentieth centuries faced laws barring them from landownership until 1952, in addition to suffering internment during World War II. Nevertheless, by 1960 they were outearning white Americans.
The phrase “the myth of the model minority” gets repeated so often that we mistake it for an explanation. It’s not a myth that some American minorities have higher incomes, better test scores, and lower incarceration rates than white Americans. And the most common explanation for this—that such groups come from the highly educated upper crusts of their original homelands—both explains too little and concedes too much. First, it doesn’t explain the rise of groups such as the Japanese; nor does it explain the eventual success of the Jewish migrants who left Europe around the turn of the century and settled on New York’s Lower East Side. Second, the argument implicitly concedes a part of what it seeks to refute: that the main determinants of economic success are education and skills—“human capital,” as economists call it.
One can object that the experience of black Americans is unique, and therefore incomparable with that of any other group. No other ethnic group in America was enslaved, disenfranchised, lynched, segregated, denied access to credit, mass-incarcerated, and so on. This is true enough—but only if our analysis is limited to America. What is so valuable about Sowell’s perspective is precisely its international scope. In three thick volumes published in the 1990s—Conquests and Cultures, Migrations and Cultures, and Race and Culture—he examined the role that cultural difference has played throughout world history. Sowell documents the fact that slavery, America’s “original sin,” has existed on every inhabited continent since the dawn of civilization. Without going back more than a few centuries, every race has been either slaves or enslavers—often both at once. Preferential policies provide another example. What we Americans euphemistically call “affirmative action” has existed longer in India than in America. Malaysia, Sri Lanka, China, and Nigeria have all had it, too.
How does all this apply to America? On William F. Buckley’s Firing Line, Sowell summed it up in a sentence: “I haven’t been able to find a single country in the world where the policies that are being advocated for blacks in the United States have lifted any people out of poverty.” Maybe American race relations are so unique that all historical and international comparisons are useless. But it’s far more likely that we have something important to learn from patterns that have held true around the world and throughout history.
Like others with similar views on race, Sowell has encountered countless smears, though the usual avenues of attack—accusations of racism, privilege, and all the rest—have not been available. Someone should have told Aidan Byrne, who reviewed one of Sowell’s books for the London School of Economics blog. Doubtless convinced that he was delivering a devastating blow, Byrne quipped: “easy for a rich white man to say.” It’s hard not to laugh at this hapless reviewer’s expense, but many mainstream commentators differ from Byrne only in that they usually remember to check Google Images before launching their ad hominems. The prevailing notion today is that your skin color, your chromosomes, your sexual orientation, and other markers of identity determine how you think. And it is generally those who see themselves as the most freethinking—“woke,” while the rest of us are asleep—who apply the strictest and most backward formulas.
To such people, the existence of a man like Thomas Sowell will always be a puzzle. He will always remain, in their minds, a phenomenon to be explained. But the question is not why a man who lived Sowell’s life came to hold the views that he did. The question is why one would expect a mind so brilliant to submit itself to received opinion of any kind.
An existing medicine can “downgrade” the danger-level of coronavirus to that of a common cold, a Jerusalem researcher is claiming, after testing it on infected human tissue.
Prof. Yaakov Nahmias says that his research shows that the novel coronavirus is so vicious because it causes lipids to be deposited in the lungs, and that there is a solution to undo the damage: a widely used anti-cholesterol drug called fenofibrate.
“If our findings are borne out by clinical studies, this course of treatment could potentially downgrade COVID-19’s severity into nothing worse than a common cold,” Nahmias said.
Unlike remdesivir, which is being lauded for its effect on coronavirus patients, fenofibrate, sometimes sold under the brand name Tricor, is already accredited by America’s Food and Drug Administration and is in plentiful supply. Remdesivir is in short supply and is also still pending full approval by regulators like the FDA.
Nahmias, director of Hebrew University’s Grass Center for Bioengineering, reached his conclusion in joint research with Dr. Benjamin tenOever at New York’s Mount Sinai Medical Center. Their paper has gone live on an online portal run by Cell Press, publishers of biomedical journals, for research that hasn’t yet been peer reviewed.
Nahmias and tenOever performed lab tests on human lung cells infected with SARS-CoV-2.
Nahmias said they arrived at the idea that a cholesterol drug could help after studying the way in which the novel coronavirus “hijacks” the human body.
He told The Times of Israel: “The question is why this new coronavirus is so different from its close relatives that just cause a common cold. What we see is that this virus really changes lipid metabolism in the human lungs. The new coronavirus causes tiny lipid droplets to accumulate in the lungs, something you don’t normally see in the lungs in any significant quantity.”
Similar processes, hinging on the virus depositing fats, seem to take place in other parts of the body too, such as the liver, said Nahmias.
He believes that the virus does this in order to perpetuate itself in the host, and that if this process can be stopped, it will halt the onset of problems with organs — normally the lungs — that cause the virus to badly affect patients.
He said the virus interferes with the ability of the body to break down fat, and fenofibrate jump-starts this process. “The interesting thing about our study is that fenofibrate actually binds and activates the very site on the DNA that the virus shuts down — a part of our DNA that allows our cells to burn fat,” he stated.
“Virus infection causes the lung cells to start building up fat, and fenofibrate allows the cells to burn it.”
The restart of the process is swift, he said, comparing it to “when the plug is removed from the bath tub.”
Nahmias said that the high danger level from coronavirus isn’t caused by its infectiousness or the body’s general ability to rid itself of the virus, but rather by the unique symptoms it causes. “Your body can easily deal with the virus, all we need to do is deal with the symptoms,” he said.
“We need to give the body time to clear the virus without going into respiratory failure. And it’s by doing this that I think we can transform it into something far less serious, something like the common cold.”
Author of the article: Postmedia News | Publishing date: Jul 31, 2020
Arwen~ Are you kidding me? “What remains unclear is whether Buddy succumbed to the complications of the virus, which he likely contracted from his owner, Robert Mahoney of Staten Island, N.Y., or whether the dog died from lymphoma. Mahoney tested positive for the novel coronavirus in the spring.
Two veterinarians, who were not involved in his treatment but who nonetheless reviewed Buddy’s medical records for National Geographic, told the publication the dog probably had cancer.” What an irresponsible and manipulative clickbait headline, using a beloved family dog’s death to promote fearmongering over COVID-19. This is indicative of the false stats being propagated by the media to keep this “pandemic” much larger and more dangerous than it really is, and for people to be begging for their rights and freedoms to be given up without a whimper. Unconscionable.
A canine named Buddy is the first dog to test positive for COVID-19 in the U.S., after the seven-year-old German shepherd was ill for three months, according to National Geographic.
What remains unclear is whether Buddy succumbed to the complications of the virus, which he likely contracted from his owner, Robert Mahoney of Staten Island, N.Y., or whether the dog died from lymphoma.
Mahoney tested positive for the novel coronavirus in the spring.
Two veterinarians, who were not involved in his treatment but who nonetheless reviewed Buddy’s medical records for National Geographic, told the publication the dog probably had cancer.
The dog more than lived up to his name by running through sprinklers, going on long car rides, and swimming in the lake.
Buddy would cuddle with the Mahoney family.
At Halloween, the family would dress Buddy as a bunny.
He even acted as big brother by protecting Duke, a 10-month-old German shepherd the Mahoneys also owned.
Just before his seventh birthday, Buddy experienced breathing issues.
And then came his sudden passing, with Buddy emerging as the first dog in the United States to be confirmed positive for SARS-CoV-2, the coronavirus that causes COVID-19.
City employees must now undergo an anti-bias course about ‘Internalised Racial Superiority for White People’.
Martel -“As you will all have noted, there is an article below exposing Seattle’s agenda of thought control and racist profiling (for what else is it, other than racist, to label an entire race by dint of their race?), and your correspondent wonders, in this idle moment, whether the First Amendment of the US constitution renders Seattle’s agenda unconstitutional and thus, probably, illegal.”
In one of the clearest and most dangerous examples yet of institutional wokeness, the Seattle Office of Civil Rights has come up with a ‘race and social justice’ curriculum which the city government’s 10,000 employees must undertake.