Politics and Society


1. The Death of Moderate Republicans
2. What The Right Gets Right
3. America and Class
4. The New Upper Class: Geeks
5. Hardwired Political Identity
6. Lobbying
7. Voting Lottery
8. Learning
9. War
10. Racist Hate Groups
11. Liberals and Conservatives
12. Anti-Science In US Politics
13. Cartoons and Sitcoms
14. The Bio Debate
15. Roman Republic and Campaign Spending
16. Overcoming Personal Bias
17. Social Work in the Tenderloin
18. Why Are We Deceived By Satire Sites?
19. Liberal vs Conservative
20. Nate Silver LT Interview
21. Maps
22. Euro Austerity
23. The "War" On Terror
24. Karl Marx and the Tea Party
25. Niall Ferguson on Social Inequality
26. Loss of Faith in Govts
27. Greens Are Finished
28. Changing Your Mind
29. Opposition To Fracking
30. American Bile
31. Tea Party
32. US Tax System
33. Climate Change Persuasion
34. The Tea Party Mind
35. Free Speech
36. The Tea Party and End Times
37. Long term Effects on Voting Preferences
38. Genghis Khan and Mongols
39. Foxes and Hedgehogs
40. The Vikings
41. The Georgians and Succession
42. Minimum Wage, Climate Change, Political Corruption
43. Chess and Politics
44. George Carlin
45. Closed Minds
46. Independent Scotland and Unintended Consequences
47. Revolution?
48. White Supremacists
49. FairTrade
50. Colour Blind Love
51. Quarantining The Islamic Cauldron
52. Paul Krugman on Denial Politics
53. US Politics and Money
54. Tight or Loose
55. Conservative Psychology
56. What Is The Fairest Voting System?
57. Exploiting The Tea Party
58. Inequality
59. Tribes (Seth Godin)
60. Why The Tea Party Wants Small Govt
61. The Cost of the Underclass
62. Why British Muslims Radicalise (and US ones don't)
63. AIPAC - The Israel Lobby in US Politics
64. Bill Gates Takes On The NRA
65. American Racialism
66. Rotherham and British Social Services
67. White Privilege
68. The Pyramids and Jewish Slaves
69. Homes For The Homeless 
70. Framing Persuasion
71. How To Win An Argument
72. UKIP and the Tea Party
73. UK Immigration
74. Impact of Cheaper Oil
75. Redlining
76. Height discrimination in China
77. Solution Aversion
78. Self Interest and Leadership Coups
79. Govts and New Technology
80. Walmart Buys Govt
81. Act Local
82. Faux News
83. Weather Forecasting in US
84. American Politicians and The Religious Right
85. American Conservatives
86. GOP Captured By The God Squad
87. How To Sneer At The Greens
88. The Flawed Thinking Behind Political Correctness
89. Republicans and Health Care
90. Talking About Race
91. Hispanics In America
92. Jeremy Clarkson
93. Israel and Palestine
94. Singapore
95. Fighting "Religious Freedom" Movement
96. American Inequality
97. Rand Paul
98. Govt Welfare
99. US Politics Trainset For The Uber-Rich
100. You can’t be a smart candidate in a party that wants to be stupid.
101. Big Business and Social Conservatives
102. How to Combat Distrust of Science
103. Free Immigration
104. Being Black In America
105. What GOP Hasn't Worked Out
106. How The Tories Won
107. How The Tories Won Part 2
108. Make a Conservative Friend
109. The Iraq War: the Verdict
110. Making Voting Easier
111. Two Ways of Seeing The World
112. Socialism is Dead and Buried
113. Trickle Down Doesn't Work
114. The End of the Middle Class
115. Why Dylann Roof Acted on His Racism
116. The Tea Party and Southern Racism
117. The Migrant Question
118. The EU Is Doomed
119. How Debates Shift
120. The GOP entertainment auditions
121. Counter Extremism
122. Lottery Reward For Voting
123. Perot Didn't Cost Bush Re-election
124. Harnessing Anger
125. GOP and Fascism
126. The Politics of Anger
127. End Times and the Religious Right
128. Rules Are Not Fixed
129. Drone Warfare
130. Demographic Coping
131. The Coming Labour Schism?
132. Political Activists
133. Baby Boomers Have Pillaged The Economy
134. How Corrupt Is The US?
135. How We Respond To Tragedies
136. Middle East Quagmire
137. You Are More Than 7 Times As Likely To Be Killed By A Right-Wing Extremist Than By Muslim Terrorists
138. Stochastic Terrorism
139. Stifling Free Speech
140. Twitter and Facebook Arguments
141. GOP Extremism
142. The Vital Q's To Ask In Polling
143. The Welfare State Is Breeding Losers
144. African Strongmen
145. Marxism Always Fails
146. The Lesson of Chernobyl
147. The Politics of Pragmatic Compromise




The Death of Moderate Republicans

 

What the Right Gets Right

With the competitors for the Republican presidential nomination engaged in an intriguing and unexpected debate over the dangers of capitalism's "creative destruction," this is the appropriate moment to explore the question: What does the right get right?

What insights, principles, and analyses does this movement have to offer that liberals and Democrats might want to take into account? I recently posed a question to conservative think tanks: If given a free hand, how would conservatives deal with the unemployed, those dependent on government benefits (food stamps, Medicaid), and, more generally, those who are losers in the new economy - those hurt by corporate restructuring, globalization and declining manufacturing employment?

The Heritage Foundation, rather than answer the question, sent me links to the following papers: 'Extended Unemployment Insurance Payments Do Not Benefit the Economy,' 'A Free Enterprise Prescription: Unleashing Entrepreneurs to Create Jobs,' 'Confronting the Unsustainable Growth of Welfare Entitlements: Principles of Reform and the Next Steps,' and 'An Effective Washington Jobs Program: Do Less Harm.' A conservative policy intellectual from a different think tank sent me an email suggesting that I read Paul Ryan's budget proposal, 'The Path to Prosperity: Restoring America's Promise.'

All the answers evaded the question posed and, in my view, amounted to ideological pap.

I decided it might be better to ask liberals what they liked about conservatism. I submitted a new question to a small group of academics and activists on the left: What does the right get right?

The answers they gave describing the strengths of the right were illuminating and help to explain why the Republican Party has won seven of the last eleven presidential elections; controlled the Senate from 1981 to 1987 and from 1995 to 2007; and controlled the House from 1995 to 2007 and from 2011 to 2013.

Andy Stern, former president of the Service Employees International Union (one of our era's few highly successful labor organizations) and now a senior fellow at Columbia University's Richman Center, made five points about conservatives in an email to me:

'They appreciate more instinctively the need for fiscal balance.'

'They understand people's more innate belief in hard work and individual responsibility and see government as too often lacking that understanding.'

'They are more suspicious from a philosophical point of view of big government as an answer to many issues and are suspicious of Wall Street institutionally and not just their high salaries, and bad practices.'

'They respect the need for private sector economic growth (although their prescription is lacking).'

'They are more pro-small business.'

Gary Jacobson, a political scientist at the University of California at San Diego, is the author of "A Divider, Not a Uniter," a harsh critique of the presidency of George W. Bush, whom Jacobson treats as a conservative apostate. Genuine conservatism, in Jacobson's view, has a number of strengths:

It recognizes "the importance of material incentives in shaping behavior, and the difficulty in keeping bureaucracies under control and responsive to citizens."

It is skeptical of the application of social science theories to real world problems and cognizant of human fallibility/corruptibility.

It places a high value on liberty/autonomy.

It places a similarly high value on good parenting.

It acknowledges the superiority of market systems for encouraging efficient use of resources.

Jonathan Haidt, a professor of psychology at the University of Virginia, is a liberal Democrat who has spent much of the past decade exploring the competitive strengths of conservatism. In his new book, "The Righteous Mind: Why Good People are Divided by Politics and Religion," which will be published in March, Haidt makes several points. Conservatives, he argues, "are closer to traditional ideas of liberty" like "the right to be left alone, and they often resent liberal programs that use government to infringe on their liberties in order to protect the groups that liberals care most about."

"Everyone gets angry when people take more than they deserve. But conservatives care more," Haidt writes. And social conservatives favor a vision of society "in which the basic social unit is the family, rather than the individual, and in which order, hierarchy, and tradition are highly valued."

What's more, conservatives detect threats to moral capital that liberals cannot perceive. They do not oppose change of all kinds (such as the Internet), but they fight back ferociously when they believe that change will damage the institutions and traditions that provide our moral exoskeletons (such as the family). Preserving those institutions and traditions is their most sacred value. Haidt is sharply critical of some aspects of liberalism. Liberals' determination to help victims often leads them to push for changes that weaken groups, traditions, institutions, and moral capital. For example, the urge to help the inner-city poor led to welfare programs in the 1960s that reduced the value of marriage, increased out-of-wedlock births, and weakened African American families, he suggests. It's as though liberals are trying to help a subset of bees (which really does need help) even if doing so damages the hive.

Haidt, Jacobson and Stern described the positive or 'flattering' view of conservatism; they were not asked about their opinions of conservatism's shortcomings.

Much of the 2012 general election campaign will be taken up by the struggle between Obama and Romney - and, more broadly, between Democrats and Republicans - to define conservatism and the Republican Party in either favorable or hostile terms.

Two scholars, Philip E. Tetlock, professor of management and psychology at the University of Pennsylvania's Wharton School, and Gregory Mitchell, a professor of law at the University of Virginia, have done provocative and useful work analyzing the pluses and minuses of liberalism and conservatism.

In 'Liberal and Conservative Approaches to Justice: Conflicting Psychological Portraits,' Tetlock and Mitchell argue that the liabilities of conservatism include the following:

"Conservatives are too prone to engage in zero-sum thinking (either I keep my money or the government takes it). They fail to appreciate the possibility of positive sum solutions to social conflicts."

Conservatives hold the laissez-faire 'minimal-state' view that, although we have a moral obligation to refrain from hurting others, we have no obligation to help others. Conservatives cling to the comforting moral illusion that there is a sharp distinction between allowing people to suffer and making people suffer.

Conservatives fail to recognize that even if each transaction in a free market meets their standards of fairness (exchanges between competent adults who have not been coerced or tricked into contracts), the cumulative results could be colossally unfair.

Conservatives do not understand how prevalent situational constraints on achievement are and thus commit the fundamental attribution error when they hold the poor responsible for poverty.

Conservatives overgeneralize: From a few cases of poor persons who exploit the system, they draw sweeping conclusions about all poor persons.

Chance happenings play a much greater role in success or failure than conservatives realize. People often do not control their own destinies.

The tensions between 'good' and 'bad' conservatism have already surfaced in the controversy over the corporate acquisition practices of Bain Capital when Mitt Romney was C.E.O. Both Romney and the firm are proponents of capitalism's 'gale of creative destruction.' The question is, has Bain produced enough creation to justify the destruction?

The ideological war has begun in earnest, even a little early. It pits the right, seeking to depict a conservatism that is essentially good and a liberalism that is essentially bad, against a left attempting just the opposite. Looked at another way, the two sides are fighting over what the role of government in redistributing resources from the affluent to the needy should and shouldn't be.

While neither Romney nor Obama fits comfortably into the role of doctrinaire standard bearer, they have both been shaped by political and economic pressures that have forced them into philosophical confrontation. Political campaigns, especially re-election campaigns, are highly ideological, and this one will be no exception as the nominees try to determine the direction the country will take over the next decade.

 

America and Class

Great nations eventually cease to be great, inevitably. It’s not the end of the world. Britain goes on despite the loss of its one-time geopolitical pre-eminence. France goes on despite the loss of its one-time pre-eminence in the arts. The United States will go on under many alternative futures.

“There is a great deal of ruin in a nation,” Adam Smith wisely counselled a young correspondent who feared Britain was on its last legs in the 1700s. As a great power America still has a lot of ruin left in it. But how much ruin does the American project have left? It consists of the continuing effort, begun with the founding of the United States, to demonstrate that human beings can be left free as individuals and families to live their lives as they see fit, coming together voluntarily to solve their joint problems.

The polity based on that idea led to a civic culture that was seen as exceptional by all the world. That culture was so widely shared among Americans that it amounted to a civil religion. To be an American was to be different from other nationalities in ways that Americans treasured. That culture is coming apart at the seams — not seams of race or ethnicity, but of class.

Every society more complex than bands of hunter-gatherers has had an upper class and, within that upper class, an elite that has run the key institutions. The United States has been no exception. But things are different now from how they were half a century ago. America’s new upper class is new because its members have something in common beyond the simple fact of their success.

A narrow elite existed in 1960 as in 2010, but it was not a group that had broadly shared backgrounds, tastes, preferences or culture. They were powerful people, not a class. Americans still rise to power from humble origins. John Boehner, the Speaker of the House of Representatives, was one of 12 children born to the owner of a working-class bar.

But along with the continuing individual American success stories is a growing majority of the people who run the institutions of America who do share tastes, preferences and culture. They increasingly constitute a class.

Real income for the bottom quartile of American families fell after 1970. The poor didn’t actually get poorer — the growth of in-kind benefits and earned-income tax credits more than made up for the drop in pre-tax cash income — but they didn’t improve their position much either.

Real household income for families in the middle was flat. Just about all the benefits of economic growth from 1970 to 2010 went to people in the upper half of the income distribution.

Rolling back income inequality won’t make any difference in the isolation of the new upper class from the rest of America. The new-upper-class culture is driven by the distinctive tastes and preferences that emerge when large numbers of cognitively talented people are enabled to live together in their own communities.

Their isolation involves spatial, economic, educational, cultural and, to some degree, political isolation. This growing isolation has been accompanied by growing ignorance about the country over which they have so much power. America has neighbourhoods that have been famous for a century: places like the Upper East Side in New York, Beacon Hill in Boston and the North Shore of Chicago.

To illustrate the magnitude of the changes in advanced education and income that occurred over the past half-century, I assembled data on median family income and the percentage of adults with college degrees for 14 of the most famous “best parts of town” in 1960, and on what had happened to those same indicators by 2000.

All 14 tell the same story. In 1960 college graduates were still a minority, usually a modest minority, in even the most elite places in the United States. Over the next 40 years these places were infused with new cultural resources in the form of college graduates and more money to pay for the tastes and preferences of an upper class.

These infusions were not a matter of a few percentage points or a few thousand dollars. The median income in these 14 elite towns and neighbourhoods went from $84,000 to $163,000 — almost doubling — in constant dollars. The median percentage of college graduates went from 26% to 67% — much more than doubling.

I also created a score combining education and income for each zip code in the country and found that the top 5% — the 882 SuperZips — contain a population with scores similar to that of the famous elite neighbourhoods.
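
As a rough illustration of the kind of computation involved, here is a minimal sketch in Python. The input file and column names are hypothetical, and the simple average of percentile ranks is an assumption made for illustration, not Murray's exact index.

```python
# A rough sketch, not Murray's actual method: combine each zip code's
# education and income indicators into one score by averaging their
# percentile ranks, then keep the top 5%. The input file and column names
# ("zip_indicators.csv", "median_family_income", "pct_college_grads")
# are hypothetical.
import pandas as pd

zips = pd.read_csv("zip_indicators.csv")

# Percentile rank (0-1) of each zip code on each indicator.
income_rank = zips["median_family_income"].rank(pct=True)
education_rank = zips["pct_college_grads"].rank(pct=True)

# Simple combined education/income score per zip code.
zips["score"] = (income_rank + education_rank) / 2

# "SuperZips": zip codes in the top 5% of the combined score.
superzips = zips[zips["score"] >= zips["score"].quantile(0.95)]
print(f"{len(superzips)} SuperZips")
```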

By definition, most of the people who live in SuperZips are affluent and well educated. They are more likely to be married than elsewhere, less likely to have experienced divorce and less likely to have children living in households with single mothers. The men in SuperZips are more likely to be in the labour force than other American men and less likely to be unemployed. They also work longer hours than other Americans. Crime in urban SuperZips is low and crime in suburban SuperZips is rare.

Far from the life of the SuperZips is working-class America. For most of its history, working-class America was America. In 1960, 81% of American workers were employed in low-level white-collar or technical jobs, manual and service jobs or worked on farms. Within that mass of the working population there were racial and ethnic distinctions, but not many others. At that time the poor were not seen as a class, either by other Americans or in their own eyes. The poor were working-class people who didn’t make much money. They were expected to participate in the institutions of American life just as everybody else did. When white Americans thought about the lower class, a lot of them thought in terms of race — that’s one of the bad realities of 1960. In so far as they thought of a lower class among whites, they had in mind people at the fringes of American life — the broken-down denizens of Skid Row or the people known as white trash.

In the years after 1960 America developed something new: a white lower class that did not consist of a fringe but of a substantial part of what was formerly the working-class population. The new lower class grew under the radar for a long time. In the 1960s and 1970s two groups of Americans at opposite ends of the socioeconomic spectrum notoriously defied the traditional American expectations of respectable behaviour. White youths, mostly from middle-class and upper middle-class families, formed the counterculture that blossomed in the mid-1960s and died away during the 1970s. And a small minority of the black population became so socially disorganised that by the early 1980s it had acquired the label of underclass.

During those decades — quietly, gradually, without creating obvious social problems for America as a whole — the population of white Americans who defied traditional American expectations grew in size. By the 1990s and 2000s the new lower class was a shaping force in the life of working-class America.

Until recently healthy men in the prime of life who did not work were scorned as bums. Even when the man was jobless through no fault of his own, America’s deeply rooted stigma against idleness persisted — witness the sense of guilt that gripped many men who were unemployed during the Great Depression even though they knew it wasn’t their fault. That norm has softened. Consider the strange case of workers who have convinced the government they are unable to work. The percentage of workers who actually are physically or emotionally unable to work for reasons beyond their control has necessarily gone down since 1960.

Medical care now cures or alleviates many ailments that would have prevented a person from working in 1960. Technology has produced compensations for physical handicaps and intellectual limitations. Yet the percentage of people qualifying for federal disability benefits because they are unable to work rose from 0.7% of the labour force in 1960 to 5.3% in 2010.

More evidence for the weakening of the work ethic among males comes from the data on labour force participation — the economist’s term for being available for work if anyone offers you a job. When the average in 1960–4 is compared with the rate from 2004–8 (before the recession), white male labour force participation fell across the entire age range. Among white males of 30–49 — the prime ages when men are supposed to be working — 8% were out of the labour force in 2004–8. That is more than three times the percentage of prime-age men who were out of the labour force in 1960–4.

Throughout the 1960s American white males of all educational levels inhabited the same world. Participation in the labour force was close to universal among the 30–49 age group. Starting in the 1970s and continuing up to 2008, white males with only a high school education started leaving the labour force. As of March 2008, 12% of prime-age white males with no more than a high school diploma were not in the labour force compared with 3% of college graduates.

The “American project” refers to national life based on the idea that the “sum of good government”, as Thomas Jefferson put it in his first inaugural address, is a state that “shall restrain men from injuring one another [and] shall leave them otherwise free to regulate their own pursuits of industry and improvement”.

At this point in our history more and more people, including prominent academics, the leaders of the Democratic party and some large portion of the American electorate, believe history has overtaken that original conception.

Over the course of the 20th century, western Europe developed an alternative to the American model, the advanced welfare state that provides a great deal of personal freedom in all areas of life except the economic ones.

The restrictions the European model imposes on the economic behaviour of employers and employees are substantial but, in return, the citizens of Europe’s welfare states have (so far) enjoyed economic security. I think it is a bad trade.

The European model assumes human needs can be disaggregated when it comes to choices about public policy. People need food and shelter so let us make sure that everyone has food and shelter. People may also need self-respect but that doesn’t have anything to do with whether the state provides them with food and shelter.

People may also need intimate relationships with others but that doesn’t have anything to do with policies regarding marriage and children. People may also need self-actualisation — or self-fulfilment — but that doesn’t have anything to do with policies that diminish the challenges of life.

The tacit assumption of the advanced welfare state is correct when human beings face starvation or death by exposure. Then, food and shelter are all that count. But in an advanced society the needs for food and shelter can be met in a variety of ways and at that point human needs can no longer be disaggregated. The ways in which food and shelter are obtained affect whether the other human needs are met.

People need self-respect but self-respect must be earned — it cannot be self-respect if it’s not earned — and the only way to earn anything is to achieve it in the face of the possibility of failing. People need intimate relationships with others, but intimate relationships that are rich and fulfilling need content and that content is supplied only when humans are engaged in interactions that have consequences.

People need self-actualisation, but this is not a straight road, visible in advance, running from point A to point B. Self-actualisation intrinsically requires an exploration of possibilities for life beyond the obvious and convenient. All of these good things in life require freedom in the only way that freedom is meaningful: freedom to act in all arenas of life coupled with responsibility for the consequences of those actions. The underlying meaning of that coupling — freedom and responsibility — is crucial. Responsibility for the consequences of actions is not the price of freedom but one of its rewards. Knowing we have responsibility for the consequences of our actions is a main part of what makes life worth living.

If we ask what are the domains through which humans achieve deep satisfactions in life — achieve happiness — the answer is there are just four: family, vocation, community and faith. In each of those domains, responsibility for the desired outcome is inseparable from the satisfaction. The deep satisfactions that go with raising children arise from having fulfilled your responsibility for just about the most important thing that humans do. If you’re a disengaged father who doesn’t contribute much to that effort, or a wealthy mother who has turned over most of the hard part to full-time daycare and then boarding schools, the satisfactions are diminished accordingly. The same is true if you’re a low-income parent who finds it easier to let the apparatus of an advanced welfare state take over.

In the workplace the deep satisfactions that can come from a job promotion are inextricably bound up with the sense of having done things that merited it. If you know you got the promotion just because you’re the boss’s nephew or because the civil service rules specify that you must get it, deep satisfactions are impossible.

When the government intervenes to help, whether in the European welfare state or in America’s more diluted version, it not only diminishes our responsibility for the desired outcome, it also enfeebles the institutions through which people live satisfying lives. There is no way for clever planners to avoid it.

Marriage is a strong and vital institution not because the day-to-day work of raising children and being a good spouse is so much fun, but because the family has responsibility for doing important things that won’t get done unless the family does them. Communities are strong and vital not because it’s so much fun to respond to our neighbours’ needs, but because the community has the responsibility for doing important things that won’t get done unless the community does them.

Once that imperative has been met, an elaborate web of expectations, rewards and punishments evolves over time. Together that web leads to norms of good behaviour that support families and communities in performing their functions. When the government says it will take some of the trouble out of doing the things that families and communities evolved to do, it inevitably takes some of the action away from families and communities. The web frays and eventually disintegrates.

The first two-thirds of the 20th century saw spectacular progress in reducing the problems of poverty. But when families become dysfunctional, or cease to form altogether, growing numbers of children suffer in ways that have little to do with lack of money. When communities are no longer bound by their members’ web of mutual obligations, the continuing human needs must be handed over to bureaucracies — the bluntest, clumsiest of all tools for giving people the kind of help they need. The neighbourhood becomes a sterile place to live at best and, at worst, becomes the Hobbesian all-against-all free-fire zone we have seen in some of our big cities.

These costs — enfeebling family, vocation, community and faith — are not exacted on the people of the SuperZips. The things the government does to take the trouble out of things seldom intersect with the life of a successful lawyer or executive. Rather, they intersect with life at the other end of the spectrum.

A man who is holding down a menial job and thereby supporting a wife and children is doing something authentically important with his life. He should take deep satisfaction from that and be praised by his community for doing so. If that same man lives under a system that says the children of the woman he sleeps with will be taken care of whether or not he contributes, then that status goes away.

I am not describing a theoretical outcome, but American neighbourhoods where, once, working at a menial job to provide for his family made a man proud and gave him status in his community and where now it doesn’t. Taking the trouble out of life strips people of ways in which human beings look back on their lives and say: “I made a difference.”

Europe has proved that countries with enfeebled family, vocation, community and faith can still be pleasant places to live. I am delighted when I get a chance to go to Stockholm or Paris. When I get there the people don’t seem to be groaning under the yoke of an oppressive system. On the contrary, there’s a lot to like about day-to-day life in the advanced welfare states of western Europe. They are great places to visit. But the view of life that has taken root in those same countries is problematic.

It seems to go something like this: the purpose of life is to while away the time between birth and death as pleasantly as possible and the purpose of government is to make it as easy as possible to while away the time as pleasantly as possible — the Europe Syndrome.

Europe’s short working weeks and frequent vacations are one symptom of the syndrome. The idea of work as a means of self-actualisation has faded. The view of work as a necessary evil, interfering with the higher good of leisure, dominates. To have to go out to look for a job or to have to risk being fired from a job are seen as terrible impositions.

The precipitous decline of marriage, far greater in Europe than in the United States, is another symptom. What is the point of a lifetime commitment when the state will act as surrogate spouse when it comes to paying the bills? The decline of fertility to far below replacement is another symptom. Children are seen as a burden that the state must help shoulder and even then they’re a lot of trouble that distract from things that are more fun.

The secularisation of Europe is yet another symptom. Europeans have broadly come to believe that humans are a collection of activated chemicals that, after a period of time, deactivate. If that’s the case, saying that the purpose of life is to pass the time as pleasantly as possible is a reasonable position. Indeed, taking any other position is ultimately irrational.

The alternative to the Europe Syndrome is to say your life can have transcendent meaning if it is spent doing important things — raising a family, supporting yourself, being a good friend and a good neighbour, learning what you can do well and then doing it as well as you possibly can.

Providing the best possible framework for doing those things is what the American project is all about. When I say the American project is in danger, that’s the nature of the loss I have in mind: the loss of the framework through which people can best pursue happiness.

The reasons we face the prospect of losing that heritage are many but none is more important than the twin realities that I have tried to describe. On one side of the spectrum a significant and growing portion of the American population is losing the virtues required to be functioning members of a free society. On the other side of the spectrum the people who run the country are doing just fine. Their framework for pursuing happiness is relatively unaffected by the forces that are enfeebling family, community, vocation, and faith elsewhere in the society. In fact, they have become so isolated they are often oblivious to the nature of the problems that exist elsewhere.

In the absence of some outside intervention the new lower class will continue to grow. Advocacy for that outside intervention can come from many levels of society — that much is still true in America — but eventually it must gain the support of the new upper class if it is to be ratified. Too much power is held by the new upper class to expect otherwise. What are the prospects of that happening?

Extracted from Coming Apart: The State of White America 1960-2010 by Charles Murray

 

The new upper class: genius geeks, no common sense necessary

In the early 1990s Bill Gates was asked what competitor worried him the most. "Software is an IQ business. Microsoft must win the IQ war or we won't have a future," he said. "Our competitors for IQ are investment banks such as Goldman Sachs and Morgan Stanley."

Gates's comment reflected a reality that has driven the formation of the new upper class: over the past century brains became much more valuable in the marketplace. The reasons why it happened are not mysterious.

First, the higher-tech the economy, the more it relies on people who can improve and exploit the technology, which creates many openings for people whose main asset is their exceptional cognitive ability. What was someone with exceptional mathematical ability worth on the job market 100 years ago if he had no interpersonal skills or common sense? Not much. The private sector had only a few jobs that might make him worth hiring. His options were not much wider in 1960.

What is a person with the same skill set worth today? If he is a wizard programmer, as people with exceptional mathematical ability tend to be, he is worth six figures to Microsoft or Google. If he is a fine pure mathematician, some quant funds can realistically offer him the prospect of great wealth.

Second, the more complex business decisions become, the more businesses rely on people who can navigate labyrinths that may or may not call upon common sense but certainly require advanced cognitive ability. If a lawyer can work out the multidimensional issues that enable the merger of two large corporations, he may be worth a commission of millions of dollars. The same happened in the financial industry as technology has made possible new and complex — but also fabulously profitable — financial instruments.

Third, the bigger the stakes the greater the value of marginal increments in skills. In 1960 the corporation ranked 100th on the Fortune 500 had sales of $3.2 billion. In 2010 the 100th ranked corporation had sales of $24.5 billion (in constant dollars). The dollar value of a manager who could increase his division's profitability by 10% instead of 5% escalated accordingly.

Given that backdrop it is no surprise that the people working in managerial occupations and the professions made a lot more money in 2010 than they had made in 1960 and their growing wealth enabled the most successful of them, the members of the new upper class, to isolate themselves from the rest of America in ways they formerly couldn't afford to do.

 

Hardwired Political Identity

A few weeks before the 2008 election, Democratic strategists were running out of ideas for how to help Al Franken. His race against incumbent Minnesota senator Norm Coleman was a stubborn one: Even after some of the country’s highest ever per capita spending, the contest remained close, with a small number of undecided, seemingly unbudgeable voters.

The job of pollsters in these situations is to figure out who the undecided actually are and what could make them move. Often, they focus on demographics (playing to older suburban women) or issues (talk of school reform). But one pollster working for the Democratic Senatorial Campaign Committee, Mark Mellman, felt it might pay to look for more primal distinctions.

Mellman added to his Minnesota polls a battery of questions inspired by research in psychology and neuroscience, borrowed from personality tests and designed to separate those with more rational processing systems from those who relied on emotion in their decision-making. Here polls did discern a latent split: Franken led Coleman by one point among those they identified as “feelers” but lagged by seven points among “thinkers.” The committee changed its ad strategy in response. Highly stylized television spots, like a movie spoof that showed Coleman as a fugitive fleeing George W. Bush, were replaced by messages that were “a little more flat, a little more factual, a little more sourced,” Mellman said. One defended Franken against Coleman’s charges with a calm narrator reading off a checklist of straightforward rebuttals under the words “The Truth.” Franken won, after a long recount, and in 2010 Mellman used the same battery of questions to shape media strategy for Harry Reid and Barbara Boxer.

That kind of science may seem alien to the war room, but Mellman’s hunch, that the differences in how people process politics may be more innate than we’ve thought, is becoming the default assumption in research labs worldwide. There, over the last decade or so, scientists have been extending to politics the imperious insights of neuroscience and evolutionary psychology that have so shaken other social sciences.

At the vanguard of this movement is Jonathan Haidt, a moral psychologist whose best-selling new book, The Righteous Mind, collects his own experiments—testing biases, prejudices, and preferences—and the work of like-minded colleagues to unmask much of our political “thinking” as moral instinct papered over, post facto, with ideological rationalization. We may tell ourselves that we believe welfare is just or that abortion violates the sanctity of life, but we’re really using borrowed language to express much more visceral attitudes, oriented around one of six moral dials—harm, fairness, loyalty, authority, liberty, and sanctity. Much of what passes for the daily scrum of electoral politics, he says, is merely an effort to find language that can help citizens justify these instincts. “Once people join a political team, they get ensnared in its moral matrix,” Haidt writes. “They see confirmation of their grand narrative everywhere.”

But the new science of primal politics goes quite a bit deeper than psychology. Over the past few years, researchers haven’t just tied basic character traits to liberalism and conservatism, they’ve begun to finger specific genes they say hard-wire those ideologies. If that work is to be believed, it would mean that an individual’s path to a political identity starts not with a series of choices but with long-ago genetic mutations, and that our collective experience of politics may be less a battle of ideas than a Darwinian contest in which we are all unwitting participants. After a team of geneticists claimed in a 2005 American Political Science Review article that they had evidence of DNA’s influence on politics, Duke political scientist Evan Charney rebutted that their findings “would require nothing less than a revision of our understanding of all of human history, much—if not most—of political science, sociology, anthropology, and psychology, as well as, perhaps, our understanding of what it means to be human.”

The thing most in need of revising may be our reflex for self-flattery. We revel in the idea that personal politics are perfectly deliberative, never more than in a year when Barack Obama and Mitt Romney—two dispassionate rationalists with great confidence in their skills of persuasion—will cross the country to win over their fellow citizens’ hearts and minds. But the comforting metaphor of a grand national debate to determine where the swing voters will end up has never seemed so out of sync with trending science. After all, what is the point of everything that happens between now and November 6 if our wiring dictates how we vote?

In 2006, as political scientists fixated on the country’s red-blue divide, NYU psychologist John Jost delivered a paper titled “The End of the End of Ideology.” For years, scholars had believed that the differences between liberals and conservatives were both trivial and superficial—that the two major American political parties weren’t all that far apart and that voters’ loyalties to them were simply arbitrary. Some had become convinced that people attached themselves to an ideology merely to assert their sophistication, picking a side to show they were capable of articulating a coherent worldview—that, as Haidt suggests, partisanship was merely an intellectual superstructure.

But Jost believed there was something deeper to political identity, something that might explain why there has been so much continuity, and so little shape-shifting, in the politics of the modern West. Ever since French parliamentarians decided their seating arrangements in the eighteenth century, the two-way split between left (concerned with inequality and eager for social change) and right (guardians of tradition and satisfied with an uneven spread of resources) has remained remarkably constant across time and place. As new issues have emerged, opinions on them have lined up very neatly with the old patterns—patterns so consistent they cannot be arbitrary and so peculiar it’s hard to believe they are fully rational. Why in the American system were the people who opposed the death penalty almost always the ones who believed that rich people should pay more in taxes? Why are those concerned with Net neutrality the same people obsessed with local produce? Why do support for strong regulation of abortion and weak regulation of the financial sector seem to go together?

Jost wanted to find out and, with a group of colleagues, set off to map the psychological infrastructure of politics. They didn’t bother asking people about cap-and-trade or gun control, but focused on jazz, masturbation, and gardening. What they discovered was a series of contrasts: Conservatives approved of documentaries and going to bars; liberals looked favorably upon motorcycles and singing songs. In earlier studies, liberals had been shown to be unpredictable and uncontrolled, conservatives conscientious and trustworthy. Jost found that liberals embraced those considered outside the social mainstream, like lesbians, “street people,” and atheists; conservatives approved of fraternities and sororities, politicians, and Caucasians. Conservatives were fonder of children, liberals of professors. Among women, conservatives were more into sex; among men, Jost and his colleagues found the opposite.

They also visited the rooms of 76 UC-Berkeley students, along with a series of five nearby offices, and coded nearly every item in the rooms after quizzing the spaces’ inhabitants about their attitudes. Conservatives’ bedrooms had more laundry baskets, postage stamps, and sports memorabilia. Liberals had movie tickets and larger collections of CDs and books. Conservatives had calendars, flags, and ironing boards. Liberals had international maps, art supplies, and stationery. Conservative offices were less “stylish” and “comfortable”; liberal workplaces were more colorful. When Jost and his colleagues videotaped three-minute interviews with the students, then reviewed the tapes, the liberals were chattier, the conservatives withdrawn and cautious.

Jost’s goal wasn’t to confirm the cheap ideological caricatures of columnists—although the paper does a magnificent job of it—but to see what people’s not obviously political characteristics might explain about how their minds work. “As a general rule,” the authors wrote, “liberals are more open-minded in their pursuit of creativity, novelty, and diversity, whereas conservatives seek lives that are more orderly, conventional, and better organized.” Rare midlife conversions aside, our parties are groups of two different kinds of people, they said, divided not by class or geography or education but by temperament.

Until recently, merely looking to sort people into political types was taboo, with its hint of Nazi eugenics. In the thirties, German psychologist Erich Jaensch had isolated traits—“definite, unambiguous … tough, masculine, firm”—that would help to identify good candidates for National Socialism. After the war, American researchers backed away from any suggestion that personal politics came with birth, leaving political scientists to mine social explanations instead. In so doing, political science insulated itself from the hard sciences, leaving the discipline inadvertently perched on a bizarre assumption: Politics was the only sphere of human existence immune to hereditary influence.

Other disciplines were less intimidated by taboo. In 1989, UC-Berkeley developmental psychologists Jack and Jeanne Block tracked down just over 100 23-year-olds they had closely observed two decades earlier. In the late sixties, the Blocks had identified a set of students at Bay Area nursery schools and assigned personality characteristics to each child based on his or her behavior, alone as well as in groups, as part of a study on creativity and self-esteem. Twenty years later, the Blocks went back and started talking to them about politics, comparing answers to the personality traits they had observed in their subjects as toddlers. They found that even in nursery schools, liberals had been self-reliant and resilient, able to develop close relationships and willing to easily cast off routine. The conservatives had been distrustful of others and anxious when facing uncertainty, quick to take offense and experience guilt. The Blocks felt they had found that the origins of adult partisanship manifest themselves at an age often defined by its innocence of the world of politics.

If the Blocks’ sandbox profiling is right, it would mean that ideologies are not free-floating philosophies to which free agents can attach themselves but manifestations of deeply held personality traits. Conservatism might not be that thing defined by William F. Buckley or Edmund Burke but a primal condition by which people hedge against disorder or change they can’t otherwise control. Perhaps the clearest marker that Rudy Giuliani is a conservative is the fact that he has no truck with messiness. Others on the right have made more high-minded appeals to the gross-out standard: Leon Kass, who chaired the President’s Council on Bioethics under George W. Bush, has promoted “the wisdom of repugnance” as a key value in making policy around issues like cloning. “We have basic emotions set up to deal with these challenges, whether it’s fear or anger or disgust,” says Brown University political scientist Rose McDermott. “That predisposition affects whether they’re conservative or liberal because it helps them organize the world in a way that reduces their fear.”

In 2008, Cornell psychologist David Pizarro tried to explain just how we interact with that fear. He and his colleagues had surveyed subjects about their political beliefs and then asked each how much he or she would be disgusted by the smell of urine, the sight of crawling maggots, or the knowledge that a bowl of soup had been stirred by a used flyswatter. People develop their gag reflexes long before they pick a political movement, Pizarro speculated, and many of the political debates that appear to be moral tests for adults eventually reveal themselves as little more than a measurement of childlike revulsion. Does prisoner torture or the sight of two men kissing or the nihilistic gore of Grand Theft Auto make you gag?

Those on the right were the most easily grossed out, Pizarro found, confirming our intuitive picture of live-and-let-live liberals and law-and-order conservatives. But research also showed that conservatives were not only turned off by flies, turds, and images of people fighting but that they were positively turned on by their own feelings of repugnance, especially in a related experiment conducted at the University of Nebraska–Lincoln. Researchers there outfitted subjects with an eye-tracker, which measured how and where participants focused their attention, before projecting collages mixing images that are known to trigger adverse reactions (spider, more maggots) and those that stimulate goodwill (cute rabbit, happy child). Unlike liberals’ eyes, conservatives’ eyes dwelled unusually long on images they found most repellent. Similarly, when researchers used electrodes to measure the amount of moisture released by subjects—a typical method of ascertaining emotional response—they found that conservatives were more aroused by images of politicians they disliked (the Clintons) than those they liked (Ronald Reagan, George W. Bush). Liberals were excited by the sight of those they liked.

In Man Is by Nature a Political Animal, published last year, the anthology’s editors argue that any open-minded pursuit of these questions will show that evolutionary impulses shape our political inner lives as much as our physical form. McDermott, one of the volume’s editors, predicts that within ten years saliva swabs will identify a genetic link explaining why some individuals welcome immigration while others respond violently to it. Citizens with “really strong immune systems are going to be all right with immigration,” McDermott ventures, because they’ll be less concerned with the pathogen threat that outsiders pose.

“It’s hard to find something we haven’t been able to say is significantly affected by the heritability of genes,” says James Fowler, a UC–San Diego social scientist. If genes can make someone more prone to depression or bad temper, why couldn’t they also explain his political views? And if genes were shaped over time by evolutionary pressures that drove people to protect their turf or successfully reproduce, why shouldn’t we see politics at least partly in the same terms?

“I know there’s a knee-jerk reaction that this can’t be right: ‘There’s no way there’s a gene that’s responsible for my politics,’ ” says Matthew C. Keller, a behavioral geneticist at the University of Colorado. “For me, this is a genetic IQ test. If they say that type of thing, it means they don’t understand genetics that well.”

To those immersed in the science, moral concerns have seemed to exhibit the strongest hereditary influence and to manifest themselves earliest in life. They are the most stable over a lifetime and the least susceptible to persuasion. That may explain why the most angry, permanent divisions in modern American politics have surrounded “God, guns, and gays” and why an intra-Republican truce on such cultural issues strikes nearly everyone as particularly fanciful. What if positions on these issues evoke the most primal responses because, in animal terms, they are most primal?

Such thinking would threaten the pieties of both left and right. Conservatives might have to adjust to a world in which few human failings could be fully blamed on cultural decline. At the same time, the liberal mind would be forced to rethink its posture toward cultural backwardness, and decide whether it ought to treat illiberal attributes like intolerance and racism as part of human nature. Would those who oppose discrimination against gays on the basis that sexuality is no choice still feel empowered to hate the right wing if they knew homophobes, too, were just born that way?

The question leads straight back, through behavior and heritability, to our DNA. In the mid-2000s, Fowler tried to isolate the effects of specific genes on civic engagement: Comparing the frequency of voting by identical twins (who share their DNA) against fraternal ones (who share half of it), Fowler and his team concluded that differentials in turnout patterns can be explained just as well by genetics as by learned behavior.
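
The article does not give the estimator behind that twin comparison, but the classic way to turn twin correlations into a heritability figure is Falconer's formula: identical twins share roughly twice as much segregating DNA as fraternal twins, so heritability can be approximated as twice the gap between the two groups' correlations. A minimal sketch, with hypothetical numbers rather than Fowler's actual data:

```python
# Falconer's formula as a minimal illustration of the twin-study logic
# described above (not Fowler's actual estimator or data). MZ twins share
# ~100% of their DNA, DZ twins ~50%, so the excess similarity of MZ pairs
# over DZ pairs gauges genetic influence on a trait such as turnout.

def falconer_heritability(r_mz: float, r_dz: float) -> float:
    """Heritability estimate: h^2 = 2 * (r_MZ - r_DZ)."""
    return 2.0 * (r_mz - r_dz)

# Hypothetical twin-pair correlations in voter turnout:
print(falconer_heritability(r_mz=0.71, r_dz=0.50))  # ~0.42
```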

Fowler and his colleagues then went further to see if any of the 25,000 known human genes, some of which have been connected to conditions like dyslexia and depression, could be linked to political temperament. They concentrated on one of the first genetic variations to be connected with a particular personality type: DRD4-7R, a variant of the gene that encodes the brain’s dopamine receptors that had already been linked to novelty-seeking behavior. “It’s a reward system,” says Fowler. “It lights up when we eat chocolate or have sex or do cocaine. It’s the system that goes haywire when people have gambling addictions.” Fowler thought back to the personality studies by Jost and others that presented “openness” and “novelty-seeking” as liberal traits. If dopamine levels made someone more likely to go bungee-jumping, why wouldn’t they also lead him or her to a political party less guided by tradition?

To test the hypothesis, Fowler and his colleagues had to identify a group of liberals and see whether they were more likely to carry two copies of DRD4-7R. In the data from a previous study, he found 2,574 people whose DRD4 variants had been genotyped and who had been asked about their politics and placed on a political continuum. There was not a direct connection between the gene and ideology, Fowler and his colleagues found, but when the researchers also looked at the number of friends the subjects had had in high school (one of the survey questions asked), they did find a strong tie. People with the genetic variant who had lively adolescent social lives were more likely to consider themselves liberal as young adults.
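
What the researchers describe is a textbook gene-by-environment interaction: the allele predicts ideology only in combination with a social environment. A minimal sketch of that kind of moderation analysis, with hypothetical file and column names rather than the study's actual data or code:

```python
# A sketch of a gene-by-environment moderation analysis like the one
# described above, not the study's actual code. File and column names are
# hypothetical: "ideology" (higher = more liberal), "drd4_7r_copies"
# (0, 1 or 2), "n_friends" (friendships named in adolescence).
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("respondents.csv")

# "a * b" in the formula expands to a + b + a:b, so the model includes both
# main effects and their interaction. The reported finding corresponds to a
# significant gene-by-friendships interaction term with no significant main
# effect of the gene alone.
model = smf.ols("ideology ~ drd4_7r_copies * n_friends", data=df).fit()
print(model.summary())
```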

Even so, Fowler laughs at the idea that he had isolated a single gene responsible for liberalism—an idea circulated in much of the chatter about the study. “There are hundreds if not thousands of genes that are all interacting to affect complex social behaviors,” Fowler says, and scientists have only a rough sense of that process. “There’s a really long, complex causal chain at work here,” says UC-Berkeley political scientist Laura Stoker, “and we won’t get any real understanding without hundreds and hundreds of years’ more research.”

But a century is a long time to wait. The media coverage of Fowler’s “liberal gene” (“Don’t hold liberals responsible for their opinion—they can’t help themselves,” reported Foxnews.com) demonstrated just how much of an appetite there might be for teasing out the election-year implications of the new biological determinism. Should Republican strategists be activating conservative attitudes by reminding the base of the things that repulse them (maggots, Bill Clinton)? Would Democrats be smart to run their voter-registration drives near slot machines and bungee jumps? Is there something in the biological makeup of politicians like Obama and Romney that seems to make them more malleable than the people whose votes they want to win? (Now that we have the birth certificate, we’ll need a DNA swab.) Why does Romney seem quick to shift positions on the moral issues that should be most hardwired but faithful on the economic ones with weaker genetic inheritance? Does the fact that he’s a flip-flopper also mean he’s a robot? And what’s the deal with Obama’s bipartisan fetish, anyway—is that some kind of freak genetic mutation? Or further evidence that he’s hiding, as Romney would have it, a more sinister atavistic agenda? Instead of exalting independents, should we treat their lack of discernible ideology as evidence of their underdevelopment? And do Americans not have a third party because the laws of evolution won’t let one survive?

An election season’s arrival quickly sweeps away any such sense of political fatalism. Every four years, we treat our presidential campaign as an exercise in Tocquevillean political free will, 200 million Americans questioning their individual beliefs and national priorities unencumbered by lineage or patrimony. We rearrange our civic life around the cult of the ideologically unmoored voter—once called ticket-splitters, then swing voters, now just independents. But daily shifts in Biden’s language or Romney’s policy positions or the imagery of super-PAC ads are only worth the attention we lavish on them if they’re being judged by a perfectly open-minded electorate.

“You can be the best campaign, but if someone is genetically predisposed against being affected by it, you’re not going to make much of a difference,” says Fordham political scientist Costas Panagopoulos, former editor-in-chief of the trade magazine Campaigns & Elections. Even if the genetic studies don’t suggest that votes are truly automatic, efforts to get conservative Catholic union members to vote for Obama or liberal stockbrokers for Mitt Romney may be more doomed than we want to believe. Of course, we have long appreciated the role played by one biological predictor: the gender gap, which has become the most popular way to explain Obama’s lead over Romney as we head into the fall campaign.

Indeed, whatever we make of the academic breakthroughs in understanding the role of evolutionary psychology in politics, the old heuristics may have to suffice for now. Parties and candidates have few practical tools to sort voters into new biological categories and little sophisticated understanding of how to leverage any new insight. “It’s hard to put that on a survey,” notes Will Feltus, a Republican consultant who uses statistical modeling to advise campaigns on how to target their television buys. “ ‘I just have a few questions about your genetics. Which of the following genetic sequences is closest to your own?’ ”

It is easy to imagine that more data on how brains and bodies process political messages might just gussy up the logic of the red-blue divide with a scientistic certitude. Campaigns would be even more convinced that the people who are not for them today will never be for them and redouble resources on rousing the voters they know to be on their side and give up on trying to change minds. “For what I do, finding out whether someone is the way they are from childhood doesn’t help me a whole lot. I am being paid to tell politicians where people are, how changeable they can be within a twelve-to-eighteen-month period,” says Whit Ayres, a Republican pollster who worked on Jon Huntsman’s presidential campaign. “It’s just numbers. How many people are in this group, how many people are in that group, and how many people do you have to add to get to 50 percent?”

 

Lobbyists

 

Voting Lottery

 

Learning

 

War

 

Racist Hate Groups

 

Liberals and Conservatives

Blue state, red state. Big government, big business. Gay rights, fetal rights. The United States is riven by the politics of extremes. To paraphrase humor columnist Dave Barry, Republicans think of Democrats as godless, unpatriotic, Volvo-driving, France-loving, elitist latte guzzlers, whereas Democrats dismiss Republicans as ignorant, NASCAR-obsessed, gun-fondling religious fanatics. An exaggeration, for sure, but the reality is still pretty stark. Congress is in a perpetual stalemate because of the two parties' inability to find middle ground on practically anything.

According to the experts who study political leanings, liberals and conservatives do not just see things differently. They are different - in their personalities and even their unconscious reactions to the world around them. For example, in a study published in January, a team led by psychologist Michael Dodd and political scientist John Hibbing of the University of Nebraska–Lincoln found that when viewing a collage of photographs, conservatives' eyes unconsciously lingered 15 percent longer on repellent images, such as car wrecks and excrement - suggesting that conservatives are more attuned than liberals to assessing potential threats.

Meanwhile examining the contents of 76 college students' bedrooms, as one group did in a 2008 study, revealed that conservatives possessed more cleaning and organizational items, such as ironing boards and calendars, confirmation that they are orderly and self-disciplined. Liberals owned more books and travel-related memorabilia, which conforms with previous research suggesting that they are open and novelty-seeking.

“These are not superficial differences. They are psychologically deep,” says psychologist John Jost of New York University, a co-author of the bedroom study. “My hunch is that the capacity to organize the political world into left or right may be a part of human nature.”

Although conservatives and liberals are fundamentally different, hints are emerging about how to bring them together—or at least help them coexist. In his recent book The Righteous Mind, psychologist Jonathan Haidt of the N.Y.U. Stern School of Business argues that liberals and conservatives need not revile one another as immoral on issues such as birth control, gay marriage or health care reform. Even if these two worldviews clash, they are equally grounded in ethics, he writes. Meanwhile studies by Jost and others suggest that political views reside on a continuum that is mediated in part by universal human emotions such as fear. Under certain circumstances, everyone can shift closer to the middle—or drift further apart.

The Fear Factor

Psychologists have found that conservatives are fundamentally more anxious than liberals, which may be why they typically desire stability, structure and clear answers even to complicated questions. “Conservatism, apparently, helps to protect people against some of the natural difficulties of living,” says social psychologist Paul Nail of the University of Central Arkansas. “The fact is we don't live in a completely safe world. Things can and do go wrong. But if I can impose this order on it by my worldview, I can keep my anxiety to a manageable level.”

Anxiety is an emotion that waxes and wanes in all of us, and as it swings up or down our political views can shift in its wake. When people feel safe and secure, they become more liberal; when they feel threatened, they become more conservative. Research conducted by Nail and his colleague in the weeks after September 11, 2001, showed that people of all political persuasions became more conservative in the wake of the terrorist attacks. Meanwhile, in an upcoming study, a team led by Yale University psychologist Jaime Napier found that asking Republicans to imagine that they possessed superpowers and were impermeable to injury made them more liberal. “There is some range within which people can be moved,” Jost says.

More practically, instead of trying to change people's emotional state (an effect that is temporary), astute policy makers might be able to phrase their ideas in a way that appeals to different worldviews. In a 2010 paper Irina Feygina, a social psychology doctoral student at N.Y.U. who works with Jost, found a way to bring conservatives and liberals together on global warming. She and her colleagues wondered whether the impulse to defend the status quo might be driving the conservative pooh-poohing of environmental issues.

In an ingenious experiment, the psychologists reframed climate change not as a challenge to government and industry but as “a threat to the American way of life.” After reading a passage that couched environmental action as patriotic, study participants who displayed traits typical of conservatives were much more likely to sign petitions about preventing oil spills and protecting the Arctic National Wildlife Refuge.

Environmentalism may be an ideal place to find common political ground. “Conservatives who are religious have this mind-set about being good stewards of the earth, to protect God's creation, and that is very compatible with green energy and conservation and other ideas that are usually classified as liberal,” Nail says.

Moral Scorecards

On topics where liberals and conservatives will never see eye to eye, opposing sides can try to cultivate mutual respect. In The Righteous Mind, Haidt identifies several areas of morality. Liberals, he says, tend to value two of them: caring for people who are vulnerable and fairness, which for liberals tends to mean sharing resources equally. Conservatives care about those things, too, but for them fairness means proportionality—that people should get what they deserve based on the amount of effort they have put in. Conservatives also emphasize loyalty and authority, values helpful for maintaining a stable society.

In a 2009 study Haidt and two of his colleagues presented more than 8,000 people with a series of hypothetical actions. Among them: kick a dog in the head; discard a box of ballots to help your candidate win; publicly bet against a favorite sports team; curse your parents to their faces; and receive a blood transfusion from a child molester. Participants had to say whether they would do these deeds for money and, if so, for how much - $10? $1,000? $100,000? More? Liberals were reluctant to harm a living thing or act unfairly, even for $1 million, but they were willing to betray group loyalty, disrespect authority or do something disgusting, such as eating their own dog after it dies, for cash. Conservatives said they were less willing to compromise on any of the moral categories.

Haidt has a message for both sides. He wants the left to acknowledge that the right's emphasis on laws, institutions, customs and religion is valuable. Conservatives recognize that democracy is a huge achievement and that maintaining the social order requires imposing constraints on people. Liberal values, on the other hand, also serve important roles: ensuring that the rights of weaker members of society are respected; limiting the harmful effects, such as pollution, that corporations sometimes pass on to others; and fostering innovation by supporting diverse ideas and ways of life.

Haidt is not out to change people's deepest moral beliefs. Yet he thinks that if people could see that those they disagree with are not immoral but simply emphasizing different moral principles, some of the antagonism would subside. Intriguingly, Haidt himself has morphed from liberal to centrist over the course of his research. He now finds value in conservative tenets that he used to reject reflexively: “It's yin and yang. Both sides see different threats; both sides are wise to different virtues.”

 

Anti-Science In US Politics

It is hard to know exactly when it became acceptable for U.S. politicians to be antiscience. For some two centuries science was a preeminent force in American politics, and scientific innovation has been the leading driver of U.S. economic growth since World War II. Kids in the 1960s gathered in school cafeterias to watch moon launches and landings on televisions wheeled in on carts. Breakthroughs in the 1970s and 1980s sparked the computer revolution and a new information economy. Advances in biology, based on evolutionary theory, created the biotech industry. New research in genetics is poised to transform the understanding of disease and the practice of medicine, agriculture and other fields.

The Founding Fathers were science enthusiasts. Thomas Jefferson, a lawyer and scientist, built the primary justification for the nation's independence on the thinking of Isaac Newton, Francis Bacon and John Locke—the creators of physics, inductive reasoning and empiricism. He called them his “trinity of three greatest men.” If anyone can discover the truth by using reason and science, Jefferson reasoned, then no one is naturally closer to the truth than anyone else. Consequently, those in positions of authority do not have the right to impose their beliefs on other people. The people themselves retain this inalienable right. Based on this foundation of science—of knowledge gained by systematic study and testing instead of by the assertions of ideology—the argument for a new, democratic form of government was self-evident.

Yet despite its history and today's unprecedented riches from science, the U.S. has begun to slip off of its science foundation. Indeed, in this election cycle, some 236 years after Jefferson penned the Declaration of Independence, several major party contenders for political office took positions that can only be described as “antiscience”: against evolution, human-induced climate change, vaccines, stem cell research, and more. A former Republican governor even warned that his own political party was in danger of becoming “the antiscience party.”

Such positions could typically be dismissed as nothing more than election-year posturing except that they reflect an anti-intellectual conformity that is gaining strength in the U.S. at precisely the moment that most of the important opportunities for economic growth, and serious threats to the well-being of the nation, require a better grasp of scientific issues. By turning public opinion away from the antiauthoritarian principles of the nation's founders, the new science denialism is creating an existential crisis like few the country has faced before.

In late 2007 growing concern over this trend led six of us to try to do something about it. Physicist Lawrence M. Krauss, science writer and film director Matthew Chapman (who is Charles Darwin's great–great-grandson), science philosopher Austin Dacey, science writer Chris Mooney, marine biologist Sheril Kirshenbaum and I decided to push for a presidential science debate. We put up a Web site and began reaching out to scientists and engineers. Within weeks 38,000 had signed on, including the heads of several large corporations, a few members of Congress from both parties, dozens of Nobel laureates, many of the nation's leading universities and almost every major science organization. Although presidential hopefuls Barack Obama and John McCain both declined a debate on scientific issues, they provided written answers to the 14 questions we asked, which were read by millions of voters.

In 2012 we developed a similar list, called “The Top American Science Questions,” that candidates for public office should be answering [see “Science in an Election Year” for a report card by Scientific American's editors measuring how President Obama and Governor Mitt Romney did]. The presidential candidates' complete answers, as well as the responses provided by key congressional leaders to a subset of those questions, can be found at www.ScientificAmerican.com/nov2012/science-debate and at www.sciencedebate.org/debate12.

These efforts try to address the problem, but a larger question remains: What has turned so many Americans against science—the very tool that has transformed the quality and quantity of their lives?

A Call to Reason

Today's denial of inconvenient science comes from partisans on both ends of the political spectrum. Science denialism among Democrats tends to be motivated by unsupported suspicions of hidden dangers to health and the environment. Common examples include the belief that cell phones cause brain cancer (high school physics shows why this is impossible) or that vaccines cause autism (science has shown no link whatsoever). Republican science denialism tends to be motivated by antiregulatory fervor and fundamentalist concerns over control of the reproductive cycle. Examples are the conviction that global warming is a hoax (billions of measurements show it is a fact) or that we should “teach the controversy” to schoolchildren over whether life on the planet was shaped by evolution over millions of years or an intelligent designer over thousands of years (scientists agree evolution is real). Of these two forms of science denialism, the Republican version is more dangerous because the party has taken to attacking the validity of science itself as a basis for public policy when science disagrees with its ideology.

It gives me no pleasure to say this. My family founded the Minnesota Republican Party. But much of the Republican Party has adopted an authoritarian approach that demands ideological conformity, even when contradicted by scientific evidence, and ostracizes those who do not conform. It may work well for uniform messaging, but in the end it drives diverse thinkers away—and thinkers are what we need to solve today's complex problems.

This process has left a large, silent body of voters who are fiscally conservative, who believe in science and evidence-based policies, and who are socially tolerant but who have left the party. In addition, Republican attacks on settled scientific issues - such as anthropogenic climate change and evolution - have too often been met with silence or, worse, appeasement by Democrats.

Governor Romney's path to endorsement exemplifies the problem. “I don't speak for the scientific community, of course, but I believe the world is getting warmer,” Romney told voters in June 2011 at a town hall meeting after announcing his candidacy. “I can't prove that, but I believe based on what I read that the world is getting warmer, and number two, I believe that humans contribute to that.” Four days later radio commentator Rush Limbaugh blasted Romney on his show, saying, “Bye-bye nomination. Bye-bye nomination, another one down. We're in the midst here of discovering that this is all a hoax. The last year has established that the whole premise of man-made global warming is a hoax! And we still have presidential candidates who want to buy into it.”

By October 2011 Romney had done an about-face. “My view is that we don't know what's causing climate change on this planet, and the idea of spending trillions and trillions of dollars to try and reduce CO2 emissions is not the right course for us,” he told an audience in Pittsburgh, then advocated for aggressive oil drilling. And on the day after the Republican National Convention, he tacked back toward his June 2011 position when he submitted his answers to ScienceDebate.org.

Romney is not alone in appreciating the political necessity of embracing antiscience views. House Speaker John A. Boehner, who controls the flow of much legislation through Congress, once argued for teaching creationism in science classes and asserted on national television that climate scientists are suggesting that carbon dioxide is a carcinogen. They are not. Representative Michele Bachmann of Minnesota warned in 2011 during a Florida presidential primary debate that “innocent little 12-year-old girls” were being “forced to have a government injection” to prevent infection with human papillomavirus (HPV) and later said the vaccine caused “mental retardation.” The HPV vaccine prevents the main cause of cervical cancer, but religious conservatives believe it encourages promiscuity. There is no evidence of a link to mental retardation.

In a separate debate, Republican candidate Jon Huntsman was asked about comments he had made that the Republican Party is becoming the antiscience party. “All I'm saying,” he replied, “is that for the Republican Party to win, we can't run from science.” Republican primary voters apparently disagreed. Huntsman, the lone candidate to actively embrace science, finished last in the polls.

In fact, candidates who began to lag in the GOP presidential primaries would often make antiscience statements and would subsequently rise in the polls. Herman Cain, who is well respected in business circles, told voters that “global warming is poppycock.” Newt Gingrich, who supported doubling the budget of the National Institutes of Health and who is also a supporter of ScienceDebate.org, began describing stem cell research as “killing children in order to get research material.” Candidates Rick Perry and Ron Paul both called climate change “a hoax.” In February, Rick Santorum railed that the left brands Republicans as the antiscience party. “No. No, we're not,” he announced. “We're the truth party.”

Antiscience reproductive politics surfaced again in August, this time in one of the most contested U.S. Senate races. Todd Akin, who is running in Missouri against Claire McCaskill, said that from what he understood from doctors, pregnancy from rape is extremely rare because “if it's a legitimate rape, the female body has ways to try to shut that whole thing down.” Akin sits on the House Committee on Science, Space, and Technology, which is responsible for much of the U.S. federal science enterprise, so he should be aware of what science actually says about key policy issues. In fact, studies suggest that women are perhaps twice as likely to become pregnant from rape, and, in any event, there is no biological mechanism to stop pregnancy in the case of rape. Akin's views are by no means unusual among abortion foes, who often seek to minimize what science says to politically justify a no-exception antiabortion stance, which has since become part of the 2012 national GOP platform.

A look at down-ticket races suggests that things may get worse. The large crop of antiscience state legislators elected in 2010 is likely to bring its views into mainstream politics as its members eventually run for Congress. In North Carolina this year the state legislature considered House Bill No. 819, which would have prohibited using estimates of future sea-level rise made by most scientists when planning to protect low-lying areas. (Increasing sea level is a predicted consequence of global warming.) The proposed law would have permitted planning only for a politically correct rise of eight inches instead of the three to four feet that scientists predict for the area by 2100.

Virginia Republicans took similar action in June, banning the use of the term “sea-level rise” from a government-commissioned study and instead requiring use of the phrase “recurrent flooding” because “sea-level rise” is considered “a left-wing term,” according to one of the legislators.

The Evolution of American Science Denialism

The American Antiscience Movement did not travel from the fringe to the center of society overnight. Its roots can be traced back a century to three-time Democratic candidate for president William Jennings Bryan, who ran fundamentalist campaigns against the theory of evolution, which he argued was causing moral decay in the nation's youth by undermining the authority of the Bible.

Bryan lost to proscience Republicans William McKinley and William Howard Taft, but he continued to campaign throughout the South, working to banish the scientific theory from American classrooms. Eventually Tennessee passed a law prohibiting the teaching of “any theory that denies the Story of the Divine Creation of man as taught in the Bible, and to teach instead that man has descended from a lower order of animals.” The coverage of the resulting Scopes “monkey trial” in 1925 turned the American public against religious fundamentalism for a generation, and the persistent campaigns against evolution drove most scientists into the Republican Party.

When World War II broke out, science gained new luster. President Franklin D. Roosevelt turned to science as an intellectual weapon to help win the war. FDR asked Vannevar Bush, who led what is now known as the Carnegie Institution for Science, to marshal the U.S. science enterprise. Bush's efforts succeeded, leading to the development of radar, artificial rubber, the mass production of penicillin and the atomic bomb. After the war, he convinced President Harry S. Truman that continued federal investment in science could make the U.S. into a world leader.

The investment paid off, but the steady flow of federal funding had an unanticipated side effect. Scientists no longer needed to reach out to the public or participate in the civic conversation to raise money for research. They consequently began to withdraw from the national public dialogue to focus more intently on their work and private lives. University tenure systems grew up that provided strong disincentives to public outreach, and scientists came to view civics and political involvement as a professional liability.

As the voice of science fell silent, the voice of religious fundamentalism was resurging. Moral disquietude over the atomic bomb caused many to predict the world would soon end, and a new wave of fundamentalist evangelists emerged. “All across Europe, people know that time is running out,” a charismatic young preacher named Billy Graham said in 1949. “Now that Russia has the atomic bomb, the world is in an armament race driving us to destruction.”

Increasing control over the reproductive process widened the split in the following years. Religious conservatives felt that humans should not interfere in God's plan, denouncing the growing popularity of the birth-control pill in the 1960s and debating in the 1970s whether “test-tube babies,” produced by in vitro fertilization, would have souls. They redefined pregnancy to begin at fertilization, rather than implantation in the uterine wall, and argued that abortion was murder.

Science's black eye grew with the broader public as well. In the 1950s children played in the fog of DDT as trucks sprayed neighborhoods, but with the 1962 publication of Rachel Carson's Silent Spring, we learned it was toxic. This pattern repeated over and over again as unforeseen health and environmental consequences of quickly commercialized science came to light. Similar scandals erupted over the effects of scores of industrial applications, ranging from sulfur dioxide and acid rain, to certain aerosols and the hole in the ozone layer, to leaded gas and cognitive impairment, to the granddaddy of them all, fossil fuels and global climate change.

Industrial mishaps led to new health and environmental regulatory science. The growing restrictions drove the older industries in the chemical, petroleum and pharmaceutical fields to protect their business interests by opposing new regulations. Proponents of this view found themselves in a natural alliance with the burgeoning religious fundamentalists who opposed the teaching of evolution. Industrial money and religious foot soldiers soon formed a new basis for the Republican Party: “In this present crisis, government is not the solution to our problem,” President Ronald Reagan argued in his 1981 inaugural address. “Government is the problem.” This antiregulatory-antiscience alliance largely defines the political parties today and helps to explain why, according to a 2009 survey, nine out of 10 scientists who identified with a major political party said they were Democrats.

This marriage of industrial money with fundamentalist values gave fundamentalism renewed power in the public debate, and efforts to oppose the teaching of evolution in public schools have returned in several states. Tennessee, South Dakota and Louisiana have all recently passed legislation that encourages unwarranted criticisms of evolution to be taught in the states' public schools. Evangelical state legislators and school board members mounted similar efforts this year in Oklahoma, Missouri, Kansas, Texas and Alabama, and the Texas Republican Party platform opposes “the teaching of … critical thinking skills and similar programs that … have the purpose of challenging the student's fixed beliefs and undermining parental authority.”

If both Democrats and Republicans have worn the antiscience mantle, why not just wait until the pendulum swings again and denialism loses its political potency? The case for action rests on the realization that for the first time since the beginning of the Enlightenment era in the mid-17th century, the very idea of science as a way to establish a common book of knowledge about the world is being broadly called into question by heavily financed public relations campaigns.

Ironically, the intellectual tools currently being used by the political right to such harmful effect originated on the academic left. In the 1960s and 1970s a philosophical movement called postmodernism developed among humanities professors displeased at being deposed by science, which they regarded as right-leaning. Postmodernism adopted ideas from cultural anthropology and relativity theory to argue that truth is relative and subject to the assumptions and prejudices of the observer. Science is just one of many ways of knowing, they argued, neither more nor less valid than others, like those of Aborigines, Native Americans or women. Furthermore, they defined science as the way of knowing among Western white men and a tool of cultural oppression. This argument resonated with many feminists and civil-rights activists and became widely adopted, leading to the “political correctness” justifiably hated by Rush Limbaugh and the “mental masturbation” lampooned by Woody Allen.

Acceptance of this relativistic worldview undermines democracy and leads not to tolerance but to authoritarianism. John Locke, one of Jefferson's “trinity of three greatest men,” showed why almost three centuries ago. Locke watched the arguing factions of Protestantism, each claiming to be the one true religion, and asked: How do we know something to be true? What is the basis of knowledge? In 1689 he defined what knowledge is and how it is grounded in observations of the physical world in An Essay Concerning Human Understanding. Any claim that fails this test is “but faith, or opinion, but not knowledge.” It was this idea—that the world is knowable and that objective, empirical knowledge is the most equitable basis for public policy—that stood as Jefferson's foundational argument for democracy.

By falsely equating knowledge with opinion, postmodernists and antiscience conservatives alike collapse our thinking back to a pre-Enlightenment era, leaving no common basis for public policy. Public discourse is reduced to endless warring opinions, none seen as more valid than another. Policy is determined by the loudest voices, reducing us to a world in which might makes right—the classic definition of authoritarianism.

Postmodernism infiltrated a generation of American education programs, as Allan Bloom first pointed out in The Closing of the American Mind. It also infected journalism, where the phrase “there is no such thing as objectivity” is often repeated like a mantra.

Reporters who agree with this statement will not dig to get to the truth and will tend to simply present “both sides” of contentious issues, especially if they cannot judge the validity of scientific evidence. This kind of false balance becomes a problem when one side is based on knowledge and the other is merely an opinion, as often occurs when policy problems intersect with science. If the press corps does not strive to report objective reality, for which scientific evidence is our only reliable guide, the ship of democracy is set adrift from its moorings in the well-informed voter and becomes vulnerable once again to the tyranny that Jefferson feared.

An Existential Crisis

“Facts,” John Adams argued, “are stubborn things; and whatever may be our wishes, our inclinations, or the dictates of our passion, they cannot alter the state of facts and evidence.” When facts become opinions, the collective policymaking process of democracy begins to break down. Gone is the common denominator—knowledge—that can bring opposing sides together. Government becomes reactive, expensive and late at solving problems, and the national dialogue becomes mired in warring opinions.

In an age when science influences every aspect of life—from the most private intimacies of sex and reproduction to the most public collective challenges of climate change and the economy—and in a time when democracy has become the dominant form of government on the planet, it is important that the voters push elected officials and candidates of all parties to explicitly state their views on the major science questions facing the nation. By elevating these issues in the public dialogue, U.S. citizens gain a fighting chance of learning whether those who would lead them have the education, wisdom and courage necessary to govern in a science-driven century and to preserve democracy for the next generation.

 

Cartoons and Sitcoms

Homer Simpson is electoral gold dust. White, male and blue-collar, the donut-obsessed father of three is the kind of voter who often swings American elections. Although he has often been taken for a Republican, in 2008 he joined millions of other working-class whites in pulling the lever for Barack Obama - helping the Democrat capture states such as Ohio and Indiana. But this year, Homer has confounded expectations by voting for Mitt Romney.

In a promo for the latest series of The Simpsons, Homer enters a polling booth and mulls over the options. He doesn't like the Obamas because of First Lady Michelle's anti-obesity campaign ("I already got one wife telling me to eat healthy"), so he goes for Mitt instead. The voting machine then reveals Romney's tax records, which show that the multi-millionaire Republican got a tax deduction for a "personality implant", and that the government actually pays him taxes. Homer threatens to tell the press and the machine opens up and sucks him inside. We later see that he has been "outsourced" to a flag-making factory in China.

The Simpsons skit on Obama v Romney is part of a grand tradition of US sitcoms commenting on politics. While we Brits have political satires such as The Thick of It or Yes, Minister, we don't expect to see characters in standard, family-oriented shows making gags about David Cameron or Ed Miliband. Yet in America, sitcom writers will happily plunder politics for material. With the ratings of many news shows plummeting, and those of sitcoms on the up, comedies offer an alternative barometer of public opinion. They reveal a country that's neither as straightforwardly liberal nor conservative as the presidential election suggests.

US sitcoms are written in a similar way to our soaps: week-by-week, by a team of writers over the course of a broadcast season that can last up to six months. The flexibility this lends can encourage writers to use headlines for inspiration.

The political content of US sitcoms has varied over time. In the Seventies, shows such as All in the Family or Maude tackled abortion, gay rights and racial discrimination, and always with a liberal slant. Conservatism made a comeback in the Eighties, with writers sometimes using their shows to promote the moral causes of the day. One episode of Diff'rent Strokes was graced by Nancy Reagan, who took time out of her schedule as First Lady to warn viewers of the evils of drugs.

As American politics got hotter and more fractured, TV producers tried to avoid storylines that would alienate potential viewers. The Simpsons, which started in 1989, was drawn into electoral politics by accident. Although it's now regarded as an all-American family favourite, the show was initially attacked by conservatives who thought it encouraged laziness, cynicism and disrespect for elders among the young.

In 1992, the Republican President George HW Bush said in a speech to the National Religious Broadcasters, "We need a nation closer to The Waltons than The Simpsons" - referring to the popular TV drama about a simple, wholesome, Depression-era farm family. Three days later, The Simpsons rolled out an episode in which Bart responds, "We're just like the Waltons! We're praying for an end to the Depression, too!"

Since the Nineties, animated sitcoms have been the most willing to address politics head on. That's partly because cartoon characters can do and say things that live-action actors can't. But animated sitcoms also tend to start out as cult hits that build a following among demographics with distinct political identities.

Family Guy, which is about a blue-collar family living in Rhode Island, pokes fun at conservatives on behalf of its young, liberal audience. In one episode, teenager Chris Griffin is invited by a girl he has a crush on to attend a Young Republicans social. He asks what they do there and she says, "We help people who already have the means to help themselves." By contrast, South Park has generated a more conservative philosophy, called South Park Libertarianism. Although its generous helpings of swearing, sex and violence make it unattractive to religious conservatives, its bigger target is politically correct liberalism. One episode attacks the anti-smoking lobby (staffed by vampires), and another claims that environmentalists are destroying the ozone layer by releasing toxic levels of "smug" into the atmosphere.

For live-action sitcoms, politics comes in the subtler form of cultural comment. This season's shows feature more gay people than at any other time in history. The trend started in 1997, when the comedian Ellen DeGeneres came out on her sitcom, Ellen. At the time, half of Americans still believed that gay sex should be a crime. But since then, attitudes have mellowed in parallel with sitcoms becoming more boldly pro-gay. When Vice President Joe Biden endorsed gay marriage rights in 2012, he cited the sitcom Will and Grace - which features a leading gay character - as one of the things that had changed the attitudes of voters towards homosexuality.

Whether sitcoms accentuate or simply imitate change is up for debate. But, just like UK soaps, their popularity depends on remaining contemporary. At present, the biggest sitcom hit in the US is Modern Family. It sounds as if it was scripted by a politically correct social worker. It features three families: a husband and wife with three kids, an interracial, cross-generational couple and a gay couple raising a child. However, the show has a demographic appeal that defies political stereotypes. Both presidential candidates have said it's their favourite show and Republicans are slightly more likely to watch it than Democrats.

The secret of this liberal Trojan horse is that it sells socially progressive ideas in a conservative packaging. The families go to church (as do nearly half of all Americans) and regularly learn Eighties-style moral lessons about hard work and fidelity. But they also reflect how the American family is evolving from a heterosexual, all-white norm to something more diverse and tolerant.

Today, one in four gay couples is raising a baby and one in seven new marriages is inter-racial. So are sitcom writers the vanguard of a liberal army of social reformers? Yes and no. Programme-makers know that there are some subjects the American people won't watch: abortion, teenage pregnancy and atheism rarely make it to prime time. And political balance is always respected.

The New Normal is about two gay dads raising a baby through a surrogate, and the question of whether they should or could get married inevitably leads to some reflection on the 2012 presidential election. But while the show clearly sympathises with the pro-gay marriage Democrats, the terror of losing Republican viewers ensures that it includes plenty of satire of liberal hypocrisy. When, in a recent episode, a conservative character challenges the gay couple to admit that they don't actually have any black friends, they're desperate to prove her wrong. Unable to find one, they hire a black waiter for the day to play the part. All of which might explain why Homer Simpson voted for Mitt Romney only four years after voting for Barack Obama. Of all the animated sitcoms, The Simpsons is the most aware of its family-friendly, national treasure status, so it steers clear of partisan bias. But that doesn't mean it can't be sharp.

In one episode, we get a glimpse of the party conventions. At the Republican event, the signs read: "We want what's worst for everyone!" and "We're just evil!" The banners at the Democratic convention read: "We hate life and ourselves!" and "We can't govern!" That's the choice that Americans face - and it ain't always funny.

 

The Bio Debate

Stem cells, embryo research and synthetic biology are just a few of the issues that will force strange new political alliances.

NOBODY is immune from the feeling that change is accelerating with each passing year. This sense of "future shock" is perhaps most closely associated with information technology. We've all experienced the anxiety, frustration and resentment that accompanies the introduction of a new version of software on which we depend, or the realisation that people younger than ourselves have adopted a new technology that makes their lifestyle seem very different from our own.

Worries about rapid change also bubble up in response to scientific progress, especially when it raises moral questions. We've seen this time and again with controversies over evolution, reproductive rights, the origin of the universe and nearly all issues in science that relate to human values.

Biology is an especially volatile source of sensitivities. The old biology was mainly observational, but the new biology, or biotechnology - including stem cells, embryo research, synthetic biology and reproductive technology - has unprecedented power to change basic life processes.

Such sensitivities are understandable. People rightly feel that high stakes are involved when science challenges our customary and largely workable moral framework.

And there is, of course, hyperbole associated with biotech. But even if only some of the predictions bear fruit, the new biology will challenge everything in its path, including our understanding of ourselves, our relationship with the world, our social arrangements and values and our political systems. The new biology is thus becoming part of political life. Candidates for national political office need to have staked out positions on these issues.

Biology and politics already intersect, of course. A good example is the abortion controversy, a recurrent theme in the US since the 1970s, with both sides trying to influence the decision over whether to continue a pregnancy or not.

But this issue is relatively uncomplicated compared with what is to come. The straightforwardness of the available positions (anti-abortion or pro-choice) is vastly outstripped by the scenarios that will be forced on us by the new biology.

One recent example is the controversy over the "three-parent embryo". This is a technology for avoiding mitochondrial disease whereby nuclear DNA from an egg with defective mitochondria is injected into an egg from another woman with healthy mitochondria, and the resulting egg can then be fertilised. To some this is perfectly acceptable. To others it smacks of eugenics. This is just one example of how, in the early 21st century, we are crossing the threshold to a new biopolitical world.

Already there are more protagonists than in the past. Science, the state, industry and religious organisations are just some of the parties vying for control.

What is more, familiar ideological labels are poor predictors of policy positions. US anti-genetic-engineering activist Jeremy Rifkin was perhaps the first to notice that anxieties about biology cut across the political spectrum. He noted more than a decade ago in an article for UK newspaper The Guardian: "The current debate over... biotech issues, is beginning to reshape the whole political landscape in ways no one could have imagined just a few years ago."

Rifkin was right: biopolitical issues increasingly make for strange political bedfellows and alliances of convenience as people with differing sympathies make common cause.

In one camp are bioprogressives who are supportive of the new biology from opposite sides of the traditional political divide. Those on the left emphasise regulation, equality and the common good, while those on the right emphasise free enterprise as the most reliable source of innovation.

There are also several flavours of bioconservative. Some are religious traditionalists, others are secular neoconservatives who regard science as a threat to human dignity, moral equality and human nature itself. Bioconservatives are increasingly joined by "green" progressives who harbour deep doubts about the implications of science for social justice.

The bête noire of all types of bioconservatives is the small but growing movement called transhumanism, which enthusiastically embraces technological change. Transhumanists see prospects for drastic enhancement in what bioconservatives regard as an essential human nature, one too precious and fragile to withstand manipulation.

In a foretaste of the strange new biopolitical alliances to come, consider the shortage of organs for transplant. The established view among liberals and conservatives is that virtually any incentive for donation is morally unacceptable. But some libertarians and some on the left decry the loss of thousands of lives each year while suitable organs, especially kidneys, are not made available. Although they are poles apart on most issues, they agree that policy options for incentives should be explored.

It's hard to say how great the scale of political changes wrought by the new biology will be, but there can be little doubt that we are heading into uncharted territory. We might hold out hope that all sides can be convinced that science, within carefully negotiated limits, can enhance and enrich our quality of life. But what counts as enhancement and enrichment will be a matter of negotiation. That is the subject matter of the new biopolitics.

If politics is, as I believe it is, the only alternative to violence, these matters are worthy of the best politics we can muster.

 

Political Campaign Spending Brought Down the Roman Republic

Two years after the Supreme Court’s decision in Citizens United, which allowed unlimited corporate and union money into American politics, there is one line that continues to echo: “The appearance of influence or access … will not cause the electorate to lose faith in our democracy.”

That line lasts because it’s a testable prediction. It’s not a question of precedent or constitutional interpretation, but of public opinion—and as such, we all feel competent to judge it. Loss of faith, the Supreme Court allowed, is itself an argument against our increasingly unregulated campaign spending regime. Of course, democratic faith is a slippery concept. But it is always on display in an election's aftermath. In the best case, the election's winners and losers have a shared, if grudging, agreement about the fairness of the process and its outcome. In the worst case, the winner's legitimacy is just one more "fact" to disagree about.

Does massive campaign spending move us closer to the worst case? One view of the 2012 election holds that super PACs proved far less effective than feared. "But ultimately," argues Nicholas Confessore in the New York Times, "Mr. Obama did not beat the super PACs; he joined them." His re-election, therefore, doesn’t settle the question raised by the Supreme Court; it simply postpones it.

Rather than letting the Citizens United experiment in confidence play out over the next several elections, we can find evidence now, by looking to political history. How has the “appearance of influence” affected faith in other elected governments? History tells us that such faith is far easier to tear down than to rebuild. And one of the best examples of this faith under strain comes from one of the earliest experiments in elected government: the Roman Republic.

Our political culture is saturated with historical appeals to the founders, but when the founders themselves wanted to make such appeals, they turned overwhelmingly to Rome. As inspiration and as practical example, that republic’s history is written into our own. For Alexander Hamilton, the republic represented "the utmost height of human greatness." The authors of the Federalist Papers cited the republic as an influence on the American Constitution 14 separate times. In early America, Rome before Caesar served as the quintessential republic of virtue; its collapse was the ultimate cautionary tale of political corruption.

A crucial part of that story was the corrosive influence of money in politics. To be sure, Rome was never a true democracy; its elections were always designed to heavily favor the wealthy and well-born. Further, the kind of money that consumed Roman politics—personal spending by wealthy candidates—isn’t the prime source of controversy in our time. Nevertheless, the last generation of the Republic’s politics was dominated by two trends: universal complaints about money’s corrupting effect on politics and near universal unwillingness to do anything about it.

Ancient politicians were just as skilled as modern ones at identifying and exploiting loopholes in election law. In Rome, the key loophole lay in the fuzzy distinction between ambitus (electoral bribery) and mere benignitas (generosity). Roman elections were often won on the strength of free food, drinks, entertainment, and sometimes hard cash offered directly to voters and financed by private fortunes. In fact, Roman campaign slogans were sometimes inscribed on the bottom of commemorative wine cups—you could drain the cup and find out whom to vote for. Most of the Roman elite relied on the gentleman’s agreement that the line between bribery and generosity would not be strictly patrolled. At worst, rank vote-buying was something your opponents engaged in; you, on the other hand, were simply being a good neighbor.

That explains the curious fact that continually rising penalties for corruption had almost no deterrent effect. Toward the republic’s end, the penalty for ambitus had risen to 10 years’ exile. The general Pompey, who presented himself as a clean-government advocate when he wasn’t buying elections for his allies, even proposed raising the statute of limitations for corruption charges to two decades, meaning that virtually no Roman politician would be safe.

Yet the money continued to flow: Politicians able to afford the massive bribes were usually able to afford protection after the fact. Worse, with no enforceable limits on spending and a heavy premium on one-upsmanship, the price of elections skyrocketed. Five years before the republic collapsed, Cicero made an astonishing claim: The wealthy had injected so much cash into election season that the interest rate in Rome temporarily doubled. Nor could the power of money be confined to election season—its influence spread throughout the republic’s government. Rome had long sent politicians to govern a province after their year in office; ultimately, they felt entitled to fleece those provinces in order to recoup their election losses, a practice that spread deep resentment of the capital. The biographer Plutarch records bribery of civil servants, who were paid off to erase debts owed to the public purse. Jury verdicts, too, were regularly bought and paid for.

Julius Caesar, who brought the republic to an end, walked a path to power paved by charisma and military accomplishments—and his mastery of Roman campaign-finance practices. He won his first election to Rome’s highest office with the backing of a single wealthy donor (who, in exchange, planned to serve beside Caesar). And if there was a moment when civil war between Caesar and the conservative Roman Senate became inevitable, it was probably the day Caesar paid off the debts of Gaius Scribonius Curio. Curio, an up-and-coming conservative, had won election as a tribune of the people, and with it the tribune’s power to veto any law. But he had heavily indebted himself along the way. Caesar satisfied his creditors, but only on the condition that Curio switch sides. From that point, Caesar, who already had an army, owned a veto in the Roman government. Political deadlock was assured.

Caesar’s fiercest personal enemy was also Rome’s most consistent enemy of electoral corruption. Marcus Porcius Cato, a Stoic and Senate conservative, made his name denouncing the influence of money on private and public life. Yet Cato failed, just as other would-be reformers did. He obsessively cast corruption as a failing of personal morality rather than a systemic crisis, which dramatically understated both the scope of the problem and the means needed to control it. As a result, Cato’s proposed remedies were usually ad hoc, and they predictably fell short.

In one instance, he managed to persuade a group of candidates to appoint him as an informal election judge, with the power to investigate bribery and publicly expose any candidate he found guilty. Each pledged to forfeit money to Cato if he was caught breaking the deal. Days before the vote, an enthusiastic Cicero wrote, “If the election proves free, as it is thought it will, Cato alone can do more than all the laws and all the judges.” On the day of the election, Cato stood before the Roman people and duly announced that one of the candidates had cheated. The rivals huddled and came to their own decision: The guilty man should be let off with no further penalty, and he should keep his money. That’s how deep corruption ran in the culture of the Roman elite. Even as ambitus weakened the republic, each member of the governing class preferred a strategy of maximizing his own gains in a broken system. Vote-buying made sense for individual politicians at the same time as it undermined the elite as a whole. Cicero, for instance, passed a strong anti-corruption law and even named it after himself - and then secured the acquittal of the very first man charged under it, because the defendant was a political ally.

Several years later, an ex-governor was tried for extorting money from his province to finance a campaign for higher office. Six different lawyers, drawn from the cream of the Senate—the equivalent of a Roman legal Dream Team—rose to defend him. The corrupt former official was acquitted with ease.

By the end, chronic election-buying had helped grind down all faith in republican government. Why was Caesar able to dissolve the republic and set Rome on a course to one-man rule? Because, in large part, enough people believed that the republic was too rotten to be worth saving. And while most classical sources dwell on the aristocracy, there’s also strong evidence that ordinary Romans grew increasingly alienated from politics during the final years. Radical leaders like Catiline and Clodius drew massive followings with their attacks on a corrupt elite; their riots, in turn, helped convince much of that elite that Rome was in grave need of a strongman.

Unlimited money in politics certainly doesn’t guarantee riots and civil war. Nor does “the appearance of influence” always undermine republican government. But legitimacy matters, and it rests on a delicate understanding: the belief that those who govern have a right to govern. It’s devilishly hard to measure or quantify, but (to paraphrase the Supreme Court again) we know it when we see it.

How much democratic faith do Americans have today? How many liberals think George W. Bush won in 2004 because of electronic voting shenanigans in Ohio? How many conservatives think Barack Obama won in 2008 thanks to ACORN, or in 2012 because of handouts to the 47 percent? Unlimited money in politics adds one more cause for doubt, perhaps the most powerful of all, to a list that has grown in recent years. How long until we have a presidential election in which a dangerous percentage of Americans view the final result as illegitimate on account of money?

On some level, Citizens United was right. It’s not bought influence that’s deadliest to our politics - it’s the appearance of influence. We can debate dueling First Amendment readings and the real power of super PACs all we’d like; but here is a case where public opinion, on its own, should be decisive. If a loss of faith becomes wide and deep enough, the question of whether or not we are right to lose faith becomes academic. The loss is destructive either way. Just ask Rome.

 

Data vs Dogma

 

Social Welfare in the Tenderloin

The Tenderloin is widely acknowledged as the most hellish neighborhood in San Francisco. Of the city's ten most violent crime plots, the Tenderloin is home to seven. Recent stats estimate the neighborhood has an average of three major crimes per hour, including one-third of the city’s drug offenses, with a yearly mean of two crimes per resident. The population includes more than 6,000 homeless people and one-fourth of the city’s HIV-positive drug users. Filthy sidewalks and vacant buildings peppered with single-occupancy hotel rooms provide a home to all levels of drugs and prostitution.

My friend Lorian has been employed as a social worker in the Tenderloin for several years now. Her tweets about it (things like: “today: 4 dead clients, 1 murdered provider, 1 client defecated in the lobby, 1 dead dog, & 1 facebook friend posted pictures of nachos.”) got me curious as to what her job is like. She was kind enough to answer some of my questions. 

I imagine it varies greatly, but can you describe your average workday?

Lorian: The first thing is getting through the door at 9 AM. We usually have to step over clients or random strangers passed out on the benches from drinking and/or using since God knows when. The smell is the first thing that hits you—a stench of urine, feces, poor hygiene—it's really at its strongest in the morning, but you get used to it throughout the day. Then we check our voicemail. Twenty messages from the same two or three clients who either scream their financial requests over and over, simply sit there and breathe, or tell you that witches are under their beds waiting for the next blood sacrifice. Paranoid clients like to fixate on witches, Satan, etc. Anyway, we get ready to open and hand out checks to the clients who are either on daily budgets, or who make random check requests. The budgeted clients are the most low-functioning, as they can be restricted to as little as $7 per day as a form of harm reduction. They'll go and spend that $7 on whatever piece of crack they can find, and then two hours later they're back, begging for more money. Clients will find some really brilliant ways to beg. When we're not dealing with clients out in the lobby, which can involve anything from handing out checks to cleaning up blood to clearing the floor for folks having seizures, we're usually dealing with the government agency assholes over at Social Security. I personally work with around 200 clients, so the paperwork and filing can be extraordinary. My “average day” starts at 9 AM and lasts until 7 or 8 PM.

You're in the Tenderloin, right? What's the deal with that area?

Yeah, the Tenderloin is where the majority of our clients live in residential hotels (SROs). It's one of the two predominantly black neighborhoods left in SF (the other is the Western Addition), it's the center of the crack, heroin, and oxy drug culture, and it hosts the transgendered sex-worker scene. It's an incredible neighborhood. There's a preservation society that works really hard to keep the original buildings in place, so the 'Loin has an impressive architectural history, not to mention random shit like vintage fetish-magazine stores, pot dispensaries, and transgender strip clubs. It's literally located at the bottom of a giant hill (Nob Hill), where the old money sits and looks down on the poor black folk, so the geography of SF's class structure is more blatant than in other cities, I think. It's a fucked place: human shit smeared on the sidewalks, tweakers sitting on the corner dismantling doorknobs for hours, heroin users nodding out in the middle of the streets, drug dealers paying cornerstore owners $20 to sell in their stores, dudes pissing on your doorstep as you leave for work, etc. It's a weird, fascinating, and very hard place to live.

Why do you think so many of your clients are paranoid and/or disturbed?

Why are my clients so fucked up? Traumatic backgrounds, PTSD, and severe mental illness (schizophrenia, bipolar disorder, and dementia are the most frequent cases we see). And whenever you combine a drug habit with compromised mental health, you usually get a mess of a brain. Abuse, rape, murder, suicide, war—you name it, they've experienced it. Most of our clients live with on-site case management and nursing staff, so their medication is monitored, but when they stop taking the meds, that's when psychotic breaks and fixations happen. I've been the “subject” of quite a few of these fixations. And even though our agency pushes the belief that “housing is healthcare,” the shit that goes down inside these residential hotels can be hard to stomach. A lot of our clients feel safer living on the streets. 

How does being in the midst of so much mental illness affect you emotionally? 

Man, social work is so fucking weird. People think you're a saint. “It takes a certain person to do that kind of work,” is what I hear a lot. Fuck that. When you're young, you can afford to have ideals and believe in stuff, and think that what you're doing matters, but after watching grown men shit themselves and sometimes try to eat their own shit, not to mention the countless number of times I've had to pick people off the floor and put them back in their wheelchairs because they've been drinking since 6 AM and can't even sit up straight, your measly 32K salary starts to matter a helluva lot more than social justice.

I think I got into social work because I had this idea of it somehow “killing” my ego. It seems silly, but it felt very real at the time. There's a sadness to watching your idealism and convictions go to shit. Not to mention that working in such a thankless and fucked system will kill a sacred part of you. I feel tired. For the most part, people do not want help. They want money or they want drugs or they want death.

What you do seems important, though. There must be some goodness in it, too, right? I feel like you tweet sometimes about people bringing you weird things they see as gifts or saying nice, if totally bizarre, things. Are there moments that help balance the heavy?

I don't really think of what I do as “important,” because days are days and everyone is dying and who am I to think anything of anything. But yes, there are moments, there is goodness. Today a client brought me a huge drawing he made of a tree in Golden Gate Park. It must've taken him hours. He said he drew every leaf. I told him the line work was amazing, and he said, “An amazing tree for an amazing woman.” And then he asked me, “When is the Fourth of July?” Sometimes moments like that are enough.

(a forum comment):

For the most part, the homeless can be broken down into four categories:

-The financial down & outs. People who lost their job or were hit with a major medical problem and bills. They're actively trying to get back on their feet, find jobs and housing, and re-enter society.

-The addicted. Drugs or alcohol ruined their lives, so they fully embrace their addictions. Addicted women frequently end up working as prostitutes.

-The mentally-ill. These are the people who aren't able to take care of themselves or hold jobs. Often, they're deranged or schizophrenic. A small subset are former military. Addiction may also be an added problem.

-Tramps, vagabonds, and teen runaways. Mentally stable people who choose the lifestyle and don't want to fit into society. Teens may turn to addictions, and likely left home because of abuse.

I fully believe that if we made a real effort to reduce homelessness, we'd properly fund separate programs that address each of the groups above (with the exception of the tramps and vagabonds; they're free to live the way they want). Each should be regarded as a separate population, with very specific needs and issues. Frankly, the program that aims to help people find housing and jobs shouldn't be the same one that tries to counsel someone who eats his own excrement and thinks he hears Satan talking in his head, or that works with people who are tripping out on meth or acid.

 

Why Are We Deceived By Satire Sites?

It had been weeks, weeks, since a news organization was suckered by an obvious piece of satire. We were due for another face plant, and we got it: a hot piece of Breitbart.com clickbait titled “Krugman Files for Bankruptcy.” “Paul Krugman, the economic darling of the left, has filed for Chapter 13 bankruptcy protection,” wrote Larry O’Connor. “Apparently this Keynsian thing doesn’t really work on the micro level.”

It’s actually “Keynesian,” and it doesn’t, but that’s not the point. The point was that O’Connor, like so many before him, had accidentally run with fake news from the Daily Currant. In nine short months, the Web publication has fooled people into thinking Rick Santorum was on Grindr, that Michele Bachmann was going to ban falafel in public schools, and that Sarah Palin had joined Al-Jazeera. The dupe on that last story was Suzi Parker, a contributor to the Washington Post’s She the People blog. “If Parker had a shred of self-awareness, integrity, and dignity,” wrote media watcher John Nolte, “she would have changed the headline to ‘Too Good To Check,’ and under it posted an essay about how shallow, smug, bitterly angry partisanship can blind you to common sense.”

Nolte is an editor at Breitbart.com.

I asked O’Connor and Parker to comment on the most embarrassing mistakes they made all year and help me conduct some media autopsies. To my surprise, they both declined comment. Luckily, the Washington Post’s Erik Wemple tracked down the origin of the Krugman story - a sponsored item on the Boston Globe’s website published without any editor’s knowledge or consent. “Prudent Investor,” branching out from his normal work as a Pilgrim’s Progress character, cited “Austria’s Format online mag.” The bogus story, credited to the Daily Currant, was titled “Paul Krugman Is Broke.” English-speakers shrugged and hit “share.”

Why is the Internet so easily fooled by this Satire-Magazin? Daniel Barkeley, the 28-year-old who founded the site last summer, has his theories. “We write articles that seem more real than articles you might see in the Onion,” he says. “If you look at Ricky Gervais’ shows, like The Office, or Armando Iannucci’s shows, like The Thick of It, those are fly-on-the-wall documentaries. That’s the kind of comedy I like—it’s made to look real. It’s funnier that way, and we think it’s more intelligent that way. So I guess a byproduct of that is that you end up with parodies that people think are true.”

When Barkeley says “we,” he means himself and one colleague. It’s just two people who keep accidentally hoaxing the media. Barkeley went to the University of Oregon, then moved to Los Angeles, dreaming of a career as a comedy writer. “It was very spur of the moment,” he says. “I didn’t know anybody.” He switched to investment banking, went to school in France, and for his final assignment he designed a business—a satire website. One month later, the site was live. Less than a year later, Barkeley has built a “tight but livable” existence from online ads—a business that gets new attention, every month or so, when someone thinks a piece of satire is real.

New attention, but not necessarily new traffic. “I’m looking at the analytics,” says Barkeley, “and traffic isn’t much higher than it was a few days ago.” The Daily Currant’s faux scoops get shared at basically the same rate whether they spill into the mainstream or not. They’re written in the driest possible prose. A fake source in the Onion eventually starts cursing or otherwise giving up the game. A fake source in the Daily Currant sounds completely earnest. This was about as wacky as the Krugman story got:

“The majority of his debts are related to mortgage financing on an $8.7 million apartment in lower Manhattan, but the list also includes $621,537 in credit card debt and $33,642 in store financing at famed jeweler Tiffanys and Co.

“The filing says that Krugman got into credit card trouble in 2004 after racking up $84,000 in a single month on his American Express black card in pursuit of rare Portuguese wines and 19th century English cloth.”

It’s funny if you realize that Krugman would never buy anything like this. But if you’re inclined to hate the guy, you’ll read the numbers and nod your head. In the harsh words of Gawker’s Max Read, the Daily Currant’s parodies are “semi-believable political wish-fulfillment articles distinguished by a commitment to a complete absence of what most people would recognize as ‘jokes.’”

And that’s why people share them. Sometimes an article surges on social media because it’s got a scoop that changes minds. The rest of the time, the article rockets around Facebook because it confirms what the reader already thinks and what his friends believe. For one 2011 study, two academics at the University of Pennsylvania analyzed 7,000 New York Times articles that had made it to the site’s “most shared” list. “Participants were less likely to share the story if they were in the high sadness as opposed to the low sadness condition,” they wrote. “Second, the results were similar for arousal; the high sadness condition evoked less arousal than the low sadness condition. Third, as hypothesized, this decrease in arousal mediated the effect of condition on sharing.”

That’s one theory, but it could explain most of the social Web. You share the picture of Abraham Lincoln next to the inspiring quote because it makes you happy. You share a paean to free speech by a Russian composer—“The Russian state is acting like a dominant male in a group of monkeys!” - because you felt inspired. You share the story about Sarah Palin being stupid because you think Sarah Palin is really stupid and refuse to let the 2008 election end already.

“After the whole Todd Akin thing happened,” says Barkeley, “I put up a story about how he thought breast milk could cure gay people. The Guardian contacted me and wanted to know where the video was. I said ‘No, no, it was a fake.’ But at least they checked, right?”

 

Liberal vs Conservative

Many people are wondering what this means for the future of same-sex marriage in the United States. Why exactly is this such a contentious issue, and why do Americans’ opinions seem to differ so greatly? When it comes to marriage equality, why can’t we all just get along?

Where Does a Same-Sex Marriage Attitude Come From?

The reason why only nine states in the USA have legalized same-sex marriage most likely has something to do with the large number of senators (and, presumably, American citizens) who are against it. But other than the obvious factors (like religion and age), what else might make someone especially likely to reject the idea of same-sex marriage?

We might find some answers by looking at empirical research on the psychological roots of political ideology. Conservative social attitudes (which typically include an opposition to same-sex marriage) are strongly related to preferences for stability, order, and certainty. In fact, research suggests that these attitudes may be part of a compensatory mental process motivated by anxiety; people who feel particularly threatened by uncertainty cope with it by placing great importance on norms, rules, and rigidity. As a result, people who are particularly intolerant of ambiguity, live in unstable circumstances, or simply have an innate need for order, structure, and closure are more likely to hold attitudes that promote rigidity and conventional social norms – meaning that they are most likely to be against same-sex marriage.

What does it mean to be intolerant of ambiguity? Well, would you rather see the world around you as clear and straightforward, or would you rather see everything as complicated and multidimensional? People who fall into the first category are much more likely to want everything in life (including gender roles, interpersonal relationships, and conceptualizations of marriage) to be dichotomous, rigid, and clear-cut. “Ambiguity-intolerant” people are also, understandably, more likely to construe ambiguous situations as particularly threatening. After all, if you are inherently not comfortable with the idea of a complicated, shades-of-gray world, any situation that presents you with this type of uncertainty will be seen as potentially dangerous. This is likely what’s happening when a conservative sees an ambiguous situation (e.g. a same-sex couple’s potential marriage) as a source of threat (e.g. to the sanctity of marriage).

Why Is Attitude Change So Hard?

After reading the section above, it should be fairly clear that there’s a problem with how each side of the same-sex-marriage debate views the other’s point of view. The issue is not really that there’s one way to see things and the other side simply isn’t seeing it that way; it’s that the two sides are focusing on entirely different things.

Overall, liberal ideology paints society as inherently improvable, and liberals are therefore motivated by a desire for eventual societal equality; conservative ideology paints society as inherently hierarchical, and conservatives are therefore motivated by a desire to make the world as stable and safe as possible. So while the liberals are banging their heads against the wall wondering why conservatives are against human rights, the conservatives are sitting on the other side wondering why on earth the liberals would want to create chaos, disorder, and dangerous instability. It boils down to a focus on equality versus a focus on order. Without understanding that, no one’s ever going to understand what the other side wants to know and hear, and all sides’ arguments will fall on deaf ears.(1)

But there’s another mental process at play. When someone has a strong attitude about something (liberal or conservative), the mind works very hard to protect it. When faced with information about a given topic, people pay attention to (and remember) the arguments that strengthen their attitudes, and they ignore, forget, or misremember any arguments that go against them. Even if faced with evidence that proves that a given attitude is undeniably wrong, people will almost always react by simply becoming more polarized; they will leave the interaction even more sure that their attitude is correct than they were before. So even if each side understood how to frame their arguments – even if liberals pointed out the ways in which same-sex marriage rights would help stabilize the economy, or conservatives argued to liberals that they could provide equal rights through civil unions rather than through marriage – it’s still very unlikely that either side would successfully change anyone’s attitude about anything.

If Attitudes Are So Stubborn, How Have They Changed In The Past?

So how did it happen? As one specific example, how did New York end up legalizing same-sex marriage in June 2011 with a 33-29 vote?

I’d wager a guess that part of it had to do with the five other states that had legalized same-sex marriage by that point and seen their heterosexual marriages remain just as sacred as they ever were before. As same-sex marriage becomes more commonplace (and heterosexual marriages remain unaffected), it will also become less threatening; as it becomes less threatening, it will evoke less of a threat response from people who don’t deal well with ambiguity.

But I can offer another serious contender: Amendment S5857-2011.

This amendment, which states that religious institutions opposed to same-sex marriage do not have to perform them, was passed shortly before the same-sex marriage legalization bill. There’s a very powerful social norm at work in our interactions, and it shapes how we respond to people’s attempts at persuasion: When we feel like someone has conceded something to us, we feel pressured to concede something back. This is called a reciprocal concession.

Let’s say a Girl Scout comes to your door and asks if she can sell you ten boxes of cookies. You feel bad saying no, but your waistline doesn’t need the cookies and your wallet doesn’t need the expense. After you refuse the sale, she responds by asking if you’d like to purchase five boxes instead. You then change your mind and agree to buy five boxes; after all, if the Girl Scout was willing to concede those five boxes of cookies, you feel pressured to concede something in return – like some of your money. That’s the power of reciprocal concessions.

This, in my opinion, is a good contender to explain what happened in the New York State Senate back in 2011. Going into the final vote, the count stood dead even: 31 for, 31 against. When the Senate passed the Amendment, this was a concession from the pro-same-sex-marriage side, which, according to the logic of reciprocal concessions, should have encouraged no-voting senators to reciprocate by conceding their votes. For two of them, it worked.

So now, we’ve seen that personality, ideology, and attitudes can all play a role in our views on same-sex marriage, and that votes might even swing because of techniques that we could have learned from our local Girl Scouts. This means that it’s absolutely essential for everyone involved in the debate this week to understand that we won’t all respond to the same types of arguments, reasoning, or pleas. Rather, it is imperative that we consider how a focus on equality or stability might shape what information we pay attention to, and what values we deem most important.

(1) I recognize that these are generalizations, and these descriptions do not accurately represent every liberal person and every conservative person. I also recognize that individual political attitudes are more complex than this distinction may make them out to be, and that religious ideology plays a very strong role as well. However, the focus on equality vs. stability is, at its core, the fundamental difference between liberal and conservative political ideology.

 

Nate Silver (LT Interview)

 

Nate Silver is good at statistics. Last year he correctly predicted election results for all 50 US states. Now he’s America’s superstar number-cruncher. But what’s he got to say about Britain? From an eyrie high above midtown Manhattan, Nate Silver trains his beady, statistically accurate eye over the Hudson River. Clouds scuttle through the sky over New Jersey, a meteorological event predicted by local weather forecasters, and the traffic far below behaves as it should, dependably disappearing into the Lincoln Tunnel.

Silver has been up all night making longer-range forecasts, attempting to predict how Americans will feel about gay marriage in 2016 and what they will think in 2020 based on historic trends. At 6am he went to bed having concluded that voters in precisely 32 states would, on balance, probably be willing to support same-sex unions by 2016, barring a great religious awakening among the youngest generation of voters.

His own awakening from sleep came a mere four hours later. Now here he is in a small room on the eleventh floor of the New York Times building, ready to talk about his extraordinary success in predicting the outcome of the last American election in all 50 states – an achievement that made him one of the few superstars among the dull galaxy of backroom political analysts. The predictable outcome was that his book on the art of forecasting politics, economics, earthquakes, terrorism and the weather, The Signal and the Noise, rocketed to the top of bestseller lists.

Before we talk of that, I ask him to peer into the future with his tired, narrow eyes and predict things of the greatest import to the British nation. Could we expect a barbecue summer? “I don’t even know what that is,” he says. It’s like an ordinary British summer except slightly less miserable, I say. “Oh. I don’t know.” Weather forecasting, he says, is one of the few successes among mankind’s many futile attempts to predict what will happen next. Yet even the most brilliant men, women and machines at the National Weather Service in Maryland cannot see more than eight days into the climatic future. After that time frame we have to rely on past climate trends: a fact that may not be all that heartening regarding the British summer.

We move on to another important issue: will Andy Murray win Wimbledon? “If you had to pick for or against it, you would probably say no,” Silver says. “The problem is: can you count on Andy Murray to play five good sets as reliably as Federer or Djokovic? You wouldn’t want to bet on it.” He pauses and looks a little rueful. “Well, I had some money on Murray in Australia. So I’ve been burnt by that experience. Heh, heh.” He has a chirrupy laugh that is very disarming and an easy, shambling manner. His hair, for instance, does not adhere to any discernible pattern.

What about Britain’s coalition government, I say. Will it survive? He thinks for a moment. “So, I’ve never quite understood,” he says. “What are the Liberal Democrats getting out of the coalition exactly?” He really does ask the most acute questions. As I struggle to supply an answer, he says: “And does Labour have a charismatic leader?” His prediction, based only on the fact that “you have had a rough several years economically”, is that David Cameron will not be re-elected.

“He’s awfully unpopular right now,” he says. “Look, this is dangerous, because I can extrapolate from the political science literature in the US. In the US context, the economy is not everything, but it’s an awfully large factor and the personalities of the candidates don’t matter very much in nine out of ten cases. I’m not sure if that’s as true in the parliamentary system. I think the economy bit has generally been shown to be true across different parts of the world.” Cameron might as well call in the removal men.

Silver gets asked to forecast all manner of things these days. Word of his extraordinary predictive powers has spread around the world; journalists hasten to his side like Ancient Greeks to the Delphic Oracle, only without the animal sacrifice. In January he went to Australia to compete in a poker tournament and was bombarded with questions about whether the Prime Minister, Julia Gillard, would win the coming election (though none, perhaps, over whether they could expect a barbecue summer, as they always get one, the lucky devils).

People treat him like a wizard. “This is the perception I’m trying to push against,” he says. “It’s often a better story if you hail this person as a wizard or the genius instead of just saying: ‘Here’s this person with a statistical model who worked pretty hard at it and got a little fortunate in the end.’”

Silver’s famous statistical juggernaut rolled out during the 2008 elections, processing reams of polling data from all over America and weighting the numbers according to factors such as the past accuracy of each poll, sample size and how recently it was carried out. That year he correctly predicted the winner of the presidential election in all but one state and the outcome of all 35 Senate races.

Late last year, his blog, FiveThirtyEight, was brought within the sprawling online empire of The New York Times, where it generated an astounding 20 per cent of daily web traffic. There, calmly and insistently, he predicted a second victory for Barack Obama.

From two desks at the side of the newsroom, Silver and his colleague Micah Cohen aggregated polling data, weighting poll results according to the pollster’s track record for reliability and making adjustments to account for the demographics of a state. After other refinements, they simulated the election 10,000 times; from this they drew their percentage assessment.
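
The mechanics are simpler than the mystique suggests. As a rough illustration (the weighting scheme and every number below are invented for the example, not FiveThirtyEight's actual model), the core of such a forecast fits in a few lines of Python:

import random

# Hypothetical polls for one state: (candidate A's share, candidate B's share,
# sample size, days old, pollster track-record weight). All values invented.
polls = [
    (0.51, 0.47, 800, 2, 0.9),
    (0.49, 0.48, 1200, 5, 0.7),
    (0.52, 0.46, 600, 1, 0.8),
]

def weighted_average(polls):
    """Average candidate A's two-party share, weighting each poll by
    sample size, recency and the pollster's track record."""
    total, share = 0.0, 0.0
    for a, b, n, age, accuracy in polls:
        w = accuracy * n ** 0.5 / (1 + age)  # one plausible weighting scheme
        share += w * a / (a + b)
        total += w
    return share / total

def win_probability(mean, error=0.03, runs=10_000):
    """Simulate the election many times, jittering the polling average by a
    normally distributed error, and count how often candidate A wins."""
    wins = sum(random.gauss(mean, error) > 0.5 for _ in range(runs))
    return wins / runs

avg = weighted_average(polls)
print(f"weighted average {avg:.3f}, win probability {win_probability(avg):.0%}")

Repeat that for each state, add up the simulated electoral votes, and you have the kind of percentage assessment described above.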

“A lot of the conclusions I make in FiveThirtyEight are obvious,” he says. “Obama is ahead in the polls so he’s probably going to win. That’s kind of all we’re saying at the end, right? And the fact that people had such trouble grasping it was kind of remarkable. It shows how delusional people have become when it comes to politics.” Again, he breaks off into a rascally laugh.

While political pundits insisted that the race hung on a knife edge and could go either way, Silver offered ever more confident forecasts. Conservatives painted him as a liberal who was skewing his own model in favour of Obama. Joe Scarborough, the host of a popular morning show on MSNBC, declared: “Anybody that thinks that this race is anything but a toss-up is such an ideologue they should be kept away from typewriters, computers, laptops and microphones for the next ten days because they’re jokes.”

Silver suggested a $2,000 bet with Scarborough and was promptly reprimanded by the public editor of The New York Times, apparently for abusing his position. “Nobody knows anything,” wrote Peggy Noonan, an influential conservative commentator, a day before the election. “Everybody’s guessing.” But she thought that Mitt Romney would triumph, that he was “stealing in like a thief with good tools”.

Silver, on the other hand, gave Obama a 91 per cent chance of victory. One radio commentator wondered whether it was necessary to go through with all the voting given that Americans already had Nate Silver to tell them what they were about to do. In the event, FiveThirtyEight correctly forecast the results in all 50 states. By the cold, hard light of statistical probability, the emperors of American punditry appeared naked. There was an obvious temptation to take an extended victory lap after this, although Silver still insists he was lucky. “We would only have given ourselves a 20 per cent chance of getting all 50 states right,” he says. “So that was a stroke of luck. I would have had to have changed jobs, probably, if Romney had won.”
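
(A rough way to read that 20 per cent figure, on the simplifying assumption that the 50 state calls were independent: the chance of sweeping the board is the product of the individual probabilities, and 0.97^50 ≈ 0.2, so even an average per-state confidence of about 97 per cent leaves roughly a four-in-five chance of missing at least one state.)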

Some thought he had a better election than the Commander-in-Chief himself. Critics talked about Revenge of the Nerds. There is some of this in Silver’s book and in the arc of his own life so far, which is replete with examples of a dorky fellow gaming the system. By the age of eight he had concluded that he would never be a professional baseball player. “My strategy was to take a lot of walks,” he says. “I couldn’t hit for shit, but the pitchers couldn’t throw for shit, so I just wouldn’t swing at anything.”

Home was a yellow two-storey house in East Lansing, Michigan. His father was a “secular Jewish” professor of political science, his mother was “kind of a Wasp, but also not really practising at all” and a ready volunteer for local political dust-ups over school closures and the need for more traffic stop signs. “I was definitely one of those nerdy kids who was pretty advanced,” he says. He attended a fairly traditional high school: “It wasn’t this postmodern thing where all the jocks and nerds are happily interacting. It wasn’t Glee.”

He was on the school debating team; it was both a passion and “an excuse to get out of class”. They practised the arts of rhetoric for up to 50 hours a week. He believes they were the best team in Michigan, though they lost the state title on what he still regards as “a controversial decision”. This argumentative background must help now that he is in such demand at conferences, although he feels a little uncomfortable about the skill of leading an audience in either direction as required. “I’ve kind of come not to like that paradigm much,” he says. “You should make the best case you can for the truth, as complicated as it might be sometimes.”

He knew he was gay “quite early on”, he says. “If I were three years younger it might have been a lot different. But I was just on the trailing edge of that generation where the association that most people would seek to make with gay people was the Aids epidemic.” He came out to his parents at the age of 20, after a year abroad studying at the London School of Economics. That year gave him “kind of a buffer zone… a whole continent away. You kind of establish yourself there and kind of test it out. In retrospect it seems very strategic, the whole thing.”

He was introduced to his longtime boyfriend, Robert Gauldin, a graphic designer, by friends on the Chicago gay scene. “He’s kind of very right-brained and creative, but also like a little bit hot-tempered,” says Silver. “And I’m kind of aloof and cool. I think that’s what makes it work.” He thinks that the idea that opposites attract may be more important in gay relationships. It doesn’t work, he says, “if you have the same personality, especially two men, right? You can talk about equality all you want, but you definitely have different dynamics when two men are interacting.” Growing up gay in the Midwest, without religion, and later living as a professional gambler – all these things may have aided his later development as a gimlet-eyed prognosticator. “When your lifestyle choice is frowned upon by the vast majority of society you start to say, ‘Well, I’m not sure that I can trust society’s judgment,’ ” he says. “You are immediately tending to question things, right, which a lot of people do in different ways.

“But you do hear of people who grow up so comfortably that they can never empathise with any situation outside themselves. I think this was Mitt Romney’s problem – obviously a very smart guy, but he was kind of brought up in a bubble and never had to challenge himself, never really endured a lot of hardships. The fact that he couldn’t even fake empathy was interesting, right? It’s very unusual for a political candidate.”

The other lesson Silver draws from his life so far is that it pays to seek out fields of endeavour where the competition is either weak or non-existent. In his mid-twenties, during a brief, miserable career as a tax consultant, he designed a system to predict the performance of baseball players, applying algorithms and statistical analysis to a sport where they were not yet widely used. Then he applied his fluency with probability to predict card hands in the new world of online poker. “People don’t recognise how much success can be measured relative to the competition,” he says. “I know this from poker, certainly.”

Over a period of two and a half years, playing for about 20 hours a week, he made $400,000. He now wishes he had played more. “I just assumed I could play poker on the internet and these opponents [were] so bad I could make money any time I wanted,” he says. In fact, “Where my skills were basically getting a little better or staying the same, everyone else went from being really kind of shitty to quite good.”

Still, they were great days. He recalls a lady at the Bellagio in Las Vegas saying, “We’re colouring you up to a flag,” and handing him a $10,000 chip that bore the Stars and Stripes. There were lavish dinners where everyone offered to pay. “Poker players are not very materialistic people, ironically,” he says. “You won’t get very far if you’re making a $1,000 bet and you think, ‘What could I buy with this?’”

He moved into political analysis in 2006, as Congress moved against online poker. He was starting to lose money at cards anyway – the competition had tightened – whereas in politics it seemed to Silver that one “could look like a genius simply by doing some fairly basic research into what really has predictive power in a political campaign”. He analysed the predictive efforts of a leading group of American pundits, concluding that they “may as well have been flipping coins”. In his book Silver applies the same hard-eyed appraisal to all sorts of forecasters, moving among them like Simon Cowell at a karaoke bar. Economists are frequently judged useless, incapable not only of predicting recessions but of telling when one has already begun.

The field of earthquake science is also strewn with notable failures. Mathematical patterns are discernible in the occurrence of terrorist atrocities – a fact that may not be all that reassuring given where the pattern leads. He speaks to a Harvard professor who predicted, in 2004, that a nuclear attack on America was “more likely than not” within ten years and suggests that he would be “at least a little nervous” working in the New York Times building so close to the blazing target that is Times Square.

Silver credits poker with giving him the confidence to stand by his own forecasts. “If you get your chips all in and you have a 60 to 40 per cent edge, you are quite happy about that,” he says. “And so the fact that by election day in 2012, assuming that we had designed the model right, that we were a 90-10 favourite – that’s almost a better situation than you would ever get in poker.” You wonder how he will maintain his status as impartial observer now he is a fixture in the political firmament.

You wonder if people will begin to follow his star. Early last year, when the Republicans were still midway through their prolonged and often bizarre search for a leader, Silver wrote an article suggesting that if the party nominated the moderate Jon Huntsman, Barack Obama had almost no chance of being re-elected. Surely they will listen next time? “I’m not trying to give advice to either party,” he says. “The stuff we say is pretty obvious a lot of the time.”

But forecasters, like the witches of Macbeth, can sometimes influence events. “I get invitations to talk to Democratic groups or conservative groups, to talk about electoral strategy, and I really try to avoid those, beyond getting a beer with someone,” he says. “Because I don’t want to be a player in that scene, right? I feel that when you get too close to either party you can become influenced in ways that are at best a waste of time.” He doesn’t even vote. “It’s just like, well, are you analysing this event or are you participating in it?” he says. “It’s a way to separate that out.” Although Republicans portray him as a liberal, Silver defines himself as “libertarian on social issues and centrist on economic issues. In the UK that might make me a Tory. Here I’m probably more in the Democrat sphere as Republicans have become so conservative.”

So what happens next? Will we live to see the age of Hillary? “Hillary would be a formidable candidate if she runs,” he says. “People should not neglect the fact that she has one of the most persuasive people in the world as her husband. I’m sure Bill would like a third Clinton term.”

On the Republican side he predicts a “spectacularly interesting primary”, filled with Senator Marco Rubio, Congressman Paul Ryan and Governor Chris Christie. “You could also see them nominating a nutcase.”

He remains pessimistic about American politics, despite a recent compromise over the debt ceiling and signs of an emerging consensus on immigration. Does studying the world – quantifying it, predicting it – make him feel any better? “Learning more about an issue can be a coping mechanism, I guess,” he says. “I still read strange Wikipedia articles late at night.” The night before our interview, while thinking about gay marriage, he read about the Cuban missile crisis. “I don’t watch a lot of TV,” he says, as if this explains it. “It’s kind of my mode of relaxation really.” Then he adds, quickly, “Well, one of them.”

Then, after photographs, and after gamely predicting that the Duchess of Cambridge would bear a son – “Statistically, boys are 50.1 per cent likely” – he shuffles out to a dinner. Not a wizard or a genius, just this person with a statistical model who worked pretty hard at it and got a little fortunate in the end.

    

 

 

Maps

 

Iran is launching an ‘Islamic Google Earth’. But in the digital age everyone can be their own cartographer.

Iran has announced an “Islamic Google Earth”, a three-dimensional mapping service that will show the world as Tehran would like it to be seen.

On the surface this seems (like most statements emanating from Iran) almost entirely mad; but at another level it reflects (like most statements emanating from Iran) the paranoia, manipulation and authoritarianism that are the hallmarks of the Ahmadinejad regime.

For in the past ten years maps have changed in ways that strike terror into your average dictator: they have become democratic and they have become, for the first time in history, authentic depictions of messy reality rather than political tools.

Mohammad Hassan Nami, Iran’s Communications Minister, has said that the Islamic version of Google Earth, to be named Basir (meaning spectator), “would take people of the world towards reality ... Our values in Iran are the values of God and this would be the difference between Basir and the Google Earth, which belongs to the ominous triangle of the US, England and the Zionists.”

What this will probably mean in practice is that satellite images of the Earth will be overlaid with data reflecting an Islamic world view and Iranian propaganda: indicating the direction of Mecca, identifying Iran’s enemies and perhaps airbrushing out evidence of its nuclear weapons production. Mr Nami, the former deputy defence minister, has a doctorate in state management from Kim Il Sung University in Pyongyang, North Korea — which may give some indication of where he is coming from, cartographically speaking.

For about three millennia the map, while claiming authenticity and objectivity, was really a political statement, a symbolic tool and sometimes a weapon. It depicted what is mine and yours. It revealed not necessarily what was true, but what was important or sacred to the map-maker. Every map came with an agenda and a proprietorial impulse; maps showed where civilisation existed (here) and savagery began (over there).

Traditional maps came with a specific political and religious centre of gravity. The first known map, drawn by an Iraqi scribe around 500 BC, puts Babylon at the centre of the world. Medieval Christian maps revolved around Jerusalem. The 13th-century Hereford Mappa Mundi has the Garden of Eden at the edge of the world and beyond it a wilderness inhabited by cannibals and “the sons of Cain”.

Maps can shape political perceptions. In 1974, the German Arno Peters published a map emphasising the land mass of the Third World, correcting the familiar “Mercator” projection that distorted the size of countries in favour of the wealthy North.

In the age of exploration British survey teams headed into uncharted regions to impose scientific order on the world through mapping, while advancing the mercantile and strategic interests of empire. On Victorian maps imperial pink spreads outwards from Britain, at the very centre of the globe, to cover much of the planet.

In 1891 came the first attempt to create a standardised, internationally agreed world map. The German cartographer Albrecht Penck proposed an “International Map of the World”, gathering information from all national mapping agencies. The project was an abject failure. It could not even agree on the location of mountains and lakes, let alone borders. It finally died in 1940 when a Luftwaffe bomb landed on the project headquarters in Britain during Hitler’s attempt to wipe London off the map. Meanwhile, Nazi cartographers were drawing up imaginary maps of the thousand-year Reich, complete with Aryan towns, Hitler Youth Schools and Strength Through Joy theme parks.

Map-making changed utterly and permanently in 2000 when Bill Clinton descrambled the Global Positioning System, enabling a universally accessible, bird’s eye view of the world. Google has become the most powerful map-maker the world has ever seen. More than a billion people have downloaded Google Earth. But that power brings dangers. In 2010 Nicaraguan forces occupied an island belonging to neighbouring Costa Rica, justifying the incursion by reference to a Google map that had accidentally placed the border in the wrong place.

The nature of map-making in a digital age is philosophically quite different from the cartography of the past. Google deliberately eschews top-down geography, in which every item on the map must have a single official name, in favour of an agnostic approach encouraging multiple identifications and other data that let users make their own decisions about what a map means. This is a non-prescriptive map, open to different interpretations and containing “as much discoverable information as possible”.

Where there is a dispute, modern digital maps usually reflect it: Google Maps label the islands off Argentina “Falklands (Malvinas)”, a succinct description of the political situation. Such ambiguity is anathema to autocratic regimes. Iran insists, for example, that the body of water separating Iran from the Arabian Peninsula is the “Persian Gulf”, not the “Arabian Gulf” nor even the neutral “Gulf”. When Google left the name off altogether last year, it prompted fury in Tehran and a mass online protest.

Even more threatening to the likes of Mahmoud Ahmadinejad is the idea that maps can be “democratised”, reflecting not the ideas of the State, but those of the people using them. So-called “neogeography” does not distil geographical information into one definitive shape, but rather offers a profusion of data generated by users, with map space becoming “an open platform to create content that accurately reflects their views”.

A map was once an official representation of the world to be found in an atlas or on a schoolroom wall. Today it lives in your phone, constantly evolving and able to reflect not some objective reality, but a subjective world tailored by individuals to what interests them. A map was once the same for everyone who used it; now there are infinite maps, distinct and particular to each of us. Traditional maps focused on a geopolitical centre of gravity — Jerusalem for the crusader, London for Victorian Britons, Tehran and Islam if you are Iran’s Communications Minister. The map-making revolution allows each of us to navigate the world with our own Mappa Mundi.

This is what truly frightens the despots — the new centre of gravity on the digital map is you.

    

 

 

Euro Austerity

 

The big news of the past week had nothing to do with the I.R.S. or Benghazi. It was the confirmation that, while the American economy continues to recover from the disastrous financial bust of 2008 and 2009, Europe remains mired in a seemingly endless slump.

On this side of the pond, the Congressional Budget Office announced that, with the economy expanding, tax revenues rising, and federal spending being restrained, the budget deficit is set to fall to about four per cent of Gross Domestic Product this year, and to 3.4 per cent next year. The latter figure is pretty close to the average for the past thirty years. At least for now, the great U.S. fiscal scare is over—not that you’d guess that from listening to the public debate in Washington. In Europe, things are going from bad to worse. New figures show that in the seventeen-member euro zone, G.D.P. has been contracting for six quarters in a row. The unemployment rate across the zone is 12.1 per cent, and an economic disaster that was once confined to the periphery of the continent is now striking at its core. France and Italy are both mired in recession, and even the mighty German economy is faltering badly.

Why the sharp divergence between the United States and Europe? When the Great Recession struck, U.S. policymakers did what mainstream textbooks recommend: they introduced monetary and fiscal-stimulus programs, which helped offset the retrenchments and job losses in the private sector. In Europe, austerity has been the order of the day, and it still is. Nearly five years after the financial crisis, governments are still trimming spending and cutting benefits in a vain attempt to bring down their budget deficits.

The big mystery isn’t why austerity has failed to work as advertised: anybody familiar with the concept of “aggregate demand” could explain that one. It is why an area with a population of more than three hundred million has stuck with a policy prescription that was discredited in the nineteen-twenties and thirties. The stock answer, which is that austerity is necessary to preserve the euro, doesn’t hold up. At this stage, austerity is the biggest threat to the euro. If the recession lasts for very much longer, political unrest is sure to mount, and the currency zone could well break up.

So why is this woebegone approach proving so sticky? Some of the answers can be found in a timely and suitably irreverent new book by Mark Blyth, a professor of political economy at Brown: “Austerity: The History of a Dangerous Idea.” Adopting a tone that is by turns bemused and outraged, Blyth traces the intellectual and political roots of austerity back to the Enlightenment, and the works of John Locke, David Hume, and Adam Smith. But he also provides a sharp analysis of Europe’s current predicament, explaining how an unholy alliance of financiers, central bankers, and German politicians foisted a draconian and unworkable policy on an unsuspecting populace.

The central fact about Europe’s “debt crisis” is that it largely originated in the private sector rather than the public sector. In 2007, Blyth reminds us, the ratio of net public debt to G.D.P. was just twelve per cent in Ireland and twenty-six per cent in Spain. In some places, such as Greece and Italy, the ratios were considerably higher. Over all, though, the euro zone was modestly indebted. Then came the financial crisis and the fateful decision to rescue many of the continent’s creaking banks, which had lent heavily into property bubbles and other speculative schemes. In Ireland, Spain, and other countries, bad bank debts were shifted onto the public sector’s balance sheet, which suddenly looked a lot less robust. But rather than recognizing the looming sovereign-debt crisis for what it was—an artifact of the speculative boom and bust in the financial sector—policymakers and commentators put the blame on public-sector profligacy.

“The result of all this opportunistic rebranding was the greatest bait-and-switch operation in modern history,” Blyth writes. “What were essentially private-sector debt problems were rechristened as ‘the Debt’ generated by ‘out-of-control’ public spending.”

The obvious alternative to rescuing the bad banks in the periphery countries was to let them go bust, but that was a risky option. As we saw in the United States after Lehman Brothers was allowed to fail, once one domino goes down the others get very shaky. Preventing a wholesale U.S. banking collapse required the Fed to launch all sorts of emergency lending programs and Congress to approve a seven-hundred-billion-dollar bailout. In Europe, such policies weren’t available. The E.C.B.’s charter didn’t provide for it to act as a lender of last resort. And the European universal banks were simply too big to rescue. In 2008, Blyth recalls, the combined assets of the six largest U.S. banks came to sixty-one per cent of U.S. G.D.P. Compare that with Germany, where the biggest financial institutions (Deutsche Bank and Commerzbank) had assets equal to a hundred and fourteen per cent of German G.D.P., or France, where the three biggest banks (BNP Paribas, Société Générale, and Crédit Agricole) had assets equal to three hundred and sixteen per cent of France’s G.D.P.

With defaults and a wholesale bailout off the table, Europe was condemned to muddling through as best it could. Coming out of the first stage of the crisis, which lasted until the first half of 2011, it was saddled with a periphery - Greece, Ireland, Portugal - that had been bailed out but that was still sinking under enormous debts, and a financial system that was highly leveraged and loaded up with suspect government bonds. What the continent desperately needed was a return to growth-oriented policies of the sort adopted in the United States. Higher growth would have raised tax revenues, boosted job growth, and shored up the banks’ balance sheets. But largely due to the euro, Europe was stuck in an austerity vice.

Membership of the common currency prevented individual countries from printing money and devaluing their currencies, which is what the United States had done. Blyth notes:

If states cannot inflate their way out of trouble (no printing press) or devalue to do the same (non-sovereign currency), they can only default (which will blow up the banking system, so it’s not an option), which leaves internal deflation through prices and wages—austerity.

Theoretically, there is another option: fiscal stimulus in the form of tax cuts and more government spending. But that, too, is effectively ruled out. Under the terms of the euro zone’s comically misnamed Stability and Growth Pact, countries like France and Italy, which have budget deficits larger than three per cent of G.D.P., are legally obliged to cut spending, even though doing so is sure to depress the economy further, leading to lower tax revenues and bigger deficits. Meanwhile, member countries that have a budget surplus, particularly Germany, refuse to help their neighbors by introducing a stimulus.

It’s all quite mad, but that doesn’t mean it will end anytime soon. Indeed, about the only things that seem likely to change the situation are another blow-up in the bond markets or a political revolution in a member state. So far, Mario Draghi, the Italian financier who took over as president of the E.C.B. in 2011, has managed to prevent the first of these things from happening. And despite mass protests from Athens to Madrid, the pro-euro political establishment has held onto power.

Blyth rightly describes this whole sad story as an attempt to recreate a European version of the gold standard, the “barbarous relic” (Keynes) that helped bring about the Great Depression. But rather than confine himself to explaining and bemoaning the enduring appeal of austerity policies, Blyth explores their roots in the laissez-faire writings of Locke, Smith, and David Ricardo; the Treasury view of the nineteen-twenties; the Austrian business cycle theory of Friedrich Hayek and Joseph Schumpeter; the monetarism of Milton Friedman; the Washington Consensus of the I.M.F. and the World Bank; and the “expansionary austerity” school that emerged from Bocconi University, in Milan. With so much hinging on Germany, the discussion of postwar German ordoliberalism, which underpins Berlin’s hostility to expansionary policies, is particularly valuable.

As Blyth points out, German politicians influenced by ordoliberalism, such as Chancellor Angela Merkel and Wolfgang Schäuble, the finance minister, aren’t hostile to government activism in the same way conservatives in the United States and Britain are. To the contrary, they believe in a social market economy, where the state sets the rules, including the generous provision of entitlement benefits, and vigorously enforces them. But encouraged by Germany’s success in creating an export-led industrial juggernaut, they believe that everybody else, even much less efficient economies, such as Greece and Portugal, should copy them rather than rely on the crutch of easy money and deficit-financed stimulus programs.

That’s all very well if you are an official at the Bundesbank, or one of the parsimonious Swabian housewives beloved of Merkel, but it ignores a couple of things. First, it’s the very presence of weaker economies in the euro zone that keeps the value of the currency at competitive levels, greatly helping German industry. If Greece and Portugal and other periphery countries dropped out, the euro would spike up, making Volkswagens and BMWs a lot more expensive. Second, it isn’t arithmetically possible for every country to turn into Germany and run a big trade surplus. On this, Blyth quotes Martin Wolf, of the Financial Times: “Is everybody supposed to run a current account surplus? And if so, with whom—Martians? And if everybody does indeed try to run a savings surplus, what else can be the outcome but a permanent global depression?”

For many parts of Europe, the depression is already here, and its cost is mounting. In Spain and Greece, the unemployment rate among some younger demographics is close to fifty per cent. Meanwhile, the calls on the Spanish and Greek governments to downsize their spending programs continue. Blyth, who grew up poor in the Scottish town of Dundee, discusses what this means in personal terms.

Probabilistically speaking, I am as extreme an example of intergenerational social mobility as you can find anywhere. What made it possible for me to become the man I am today is the very thing now blamed for creating the crisis itself: the state, more specifically, the so-called runaway, bloated, paternalist, out-of-control welfare state ...

I was never hungry. My grandmother’s pension plus free school meals took care of that. I never lacked shelter because of social housing. The schools I attended were free and actually acted as ladders of mobility for those randomly given the skills in the genetic lottery of life to climb them.

So what bothers me on a deep personal level is that if austerity is seen as the only way forward, then not only is it unfair to the current generation of “workers bailing bankers,” but the next “me” may not happen.

Is that laying it on a bit thick? Perhaps. Blyth’s larger point, though, is valid. Ultimately, economics cannot be divorced from morality and ethics. Austerity’s failure isn’t just a matter of disappointing G.D.P. figures and missed deficit targets. It’s a human calamity, and one that could have been avoided.

    

 

 

The "War" On Terror

 

Back in January, 2002, when George W. Bush’s war on terror was getting into full swing, Terry Jones, a British comedian who was part of the Monty Python troupe, asked an awkward question: “How do you wage war on an abstract noun? It’s rather like bombing murder.” Eleven years later, nobody has come up with a convincing answer, perhaps because there isn’t one. But in the past couple of days, we’ve seen some laudable efforts to reframe the question in a manner that’s more amenable to rational discourse.

I’ll get to President Obama’s speech about resetting U.S. policy in a moment, but he wasn’t the only politician who spoke yesterday about combating terrorism. In London, David Cameron, the British Prime Minister, delivered a commendably measured response to the brutal murder of an off-duty British soldier outside an Army barracks in Woolwich, south London. After paying tribute to the victim, Lee Rigby, a twenty-six-year-old private who had served in Afghanistan, and issuing the standard declaration that Britain “will never give in to terror,” Cameron noted that the attack, carried out with kitchen knives and meat cleavers, “was also a betrayal of Islam—and of the Muslim communities who give so much to our country. There is nothing in Islam that justifies this truly dreadful act.” Cameron went on:

We will defeat violent extremism by standing together, backing our police and security services, and, above all, by challenging the poisonous narrative of extremism on which this violence feeds.

He ended the speech thus: “The police have responded with heightened security and activity—and that is right. But one of the best ways of defeating terrorism is to go about our normal lives. And that is what we shall all do.”

Quite possibly, Cameron’s tone had something to do with the British yen for adopting a pose of sang-froid toward anything short of a nuclear attack. But, like Obama, he was also trying to come to terms with reality. In Woolwich, as in Boston last month, the attacks, heinous as they were, appear to have been petty plots cooked up by disaffected local youths who had turned to radical Islam but who had little or no contact with organized terrorist groups, such as Al Qaeda. While such attacks can succeed in spreading terror, they pose no significant threat to the state. In what sense, then, can they justify putting (or keeping) the country on a permanent war footing?

This was one of the questions that Obama addressed in his speech at the National Defense University on Thursday, which has rightly received positive reviews, including one from my colleague Jane Mayer. Noting that America has so far expended more than a trillion dollars and seven thousand lives in the open-ended conflict that Bush began, he called on Congress to amend its 2001 authorization for the use of military force, which gave the President broad latitude to engage in counterterrorism operations anywhere in the world.

Commentators and human-rights groups have rightly noted that a speech is only a speech. Will the new guidelines for drone attacks make much of a difference in how they are carried out? Will Guantánamo be closed? Will the Justice Department continue to subpoena reporters’ phone records in search of national-security leaks? It will take some time for the answers to emerge. But Obama’s speech shouldn’t be judged solely on how its individual recommendations are carried out—its contribution was broader. It didn’t just question the utility of individual measures, such as holding prisoners without trial; it queried the intellectual underpinnings of the whole war-on-terror enterprise, the entire mindset that has gripped the country for the past eleven and a half years.

Such a questioning was long overdue. Civil liberties aren’t the only liberties that the war on terror has abridged. Equally pernicious has been its encroachment on intellectual liberty. Ever since 9/11, the mere mention of Al Qaeda, or the general threat of radical Islam, has often been sufficient to suspend sensible debate about all sorts of questions. In an era of fiscal retrenchment, does it make sense to spend taxpayers’ money on ever-more elaborate airport scanners, or on Alzheimer’s research? What’s the bigger threat to the United States, a splintered Al Qaeda or a revival of tensions between China and Japan? Should the rebuilt World Trade Center be converted into a semi-militarized security zone? Such questions are rarely even discussed, outside of a few rarefied think tanks and editorial boards.

Part of the problem goes back to the conceptual issue that Terry Jones identified back in 2002: If you go to war with “terrorism” or “terror,” how will you know when you’ve won?

With most wars, you can say you’ve won when the other side is either all dead or surrenders. But how is terrorism going to surrender? It’s hard for abstract nouns to surrender. In fact it’s very hard for abstract nouns to do anything at all of their own volition—even trained philologists can’t negotiate with them…

The bitter semantic truth is that you can’t win against these sort of words—unless, I suppose, you get them thrown out of the Oxford English Dictionary.

Obama didn’t endorse that particular idea. Instead, he suggested restricting the definition of terrorism to something narrower and more manageable: “Beyond Afghanistan, we must define our effort not as a boundless ‘global war on terror’ but rather as a series of persistent, targeted efforts to dismantle specific networks of violent extremists that threaten America.” That sounds sensible, as does his admission that “neither I, nor any President, can promise the total defeat of terror. We will never erase the evil that lies in the hearts of some human beings, nor stamp out every danger to our open society.”

The attacks in Boston and Woolwich illustrated that, of course, and, given the resentments (some justified, some fantastical) that exist in the Muslim world, there are likely to be more of them in the future. At least for now, though, our leaders are making some of the right noises, which is important. As Obama said, “We must define the nature and scope of this struggle, or else it will define us.”

    

 

 

Karl Marx and the Tea Party

 

Marxist theory can be summarized in two distinct ways.

The first view (held mostly by its detractors) is that Marxism is little more than the politics of resentment — a philosophical justification for the hatred of success by those who failed to achieve it. The politics of resentment offers three different methods for bringing its program of economic jealousy to fruition: Under socialism, the unsuccessful use the power of government to forcibly extract wealth and possessions from the successful, bit by bit until there is nothing left; under the more extreme communism, the very notion of wealth or success is eliminated entirely, and anyone who seeks individual achievement is punished or eliminated; and finally under anarchy, freelance predators would be allowed to steal or destroy any existing wealth or possessions with no interference from the state. Marx himself saw pure communism as the ultimate goal, with socialism as a necessary precursor, and perhaps just an occasional dash of anarchy to ignite the revolutionary fires.

But there is another, more intriguing and less noxious, view of Marxist thought that gets less attention these days because its anachronistic roots in the Industrial Revolution seemingly render it somewhat irrelevant to modern economics. Marx posited that factory workers should own the factory themselves and profit from its output, since they’re the ones actually doing the work — and the wealthy fat cat “capitalists” should be booted out of the director’s office since they don’t really do anything except profit from other people’s labor. Marx generalized this notion to “The workers should control the means of production,” and then extended it further to a national scale by declaring that the overall government itself should be “a dictatorship of the proletariat,” with “proletariat” defined in this context as “someone who actually works for a living.” The problem with this theory in the 21st century is that very few people actually work in factories anymore due to exponential improvements in automation and efficiency, and fewer still produce handicrafts, and the vast majority of American “workers” these days don’t actually create anything tangible. Even so, there is an attractive populist rationality to this aspect of Marxism that appeals to everyone’s sense of fairness — even to those who staunchly reject the rest of communist theory. Those who do the work should reap the benefits and control the system; hard to argue with that.

Although the “factory” is no longer the basic building block of the American economy, Marx’s notion that “The workers should control the means of production” can be rescued and made freshly relevant if it is re-interpreted in a contemporary American context.

Visualize the entire United States as one vast “company,” with citizens as employees and politicians and bureaucrats as managers. Everybody, in theory, works together to make the company successful. But there are two realities which shatter this idealized theory: first, only about half the employees actually ever do any work, while the rest seem to be on permanent vacation or sick leave; and second, our bureaucratic “managers” — just like the wealthy fat cats in Marx’s vision — simply benefit from the labor of others without ever producing anything of value themselves.

Now, this “company” known as the USA doesn’t operate in the way traditional companies operate. In our system, we create only a single product every year, a gigantic pile of money we call the “Federal Budget.” Each “employee” is free to engage in any profitable activity or profession of his choice, just so long as at the end of the year he (or she, obviously) adds his earnings to the collective pile, setting aside a certain amount for living expenses. The “managers” then decide how this gigantic pile of money is spent, presumably to keep the company healthy and strong.

The formula to determine how much each employee gets to keep for living expenses is called “the tax code,” and those who contribute to the national product are called “taxpayers.” The managers deciding how the pile is spent are “politicians,” who are chosen every two years in a shareholders’ meeting called an “election.”

This system worked pretty well for quite a long time — until recently. It is only within the last few years that something remarkable happened: The number of contributing “taxpayers” in the country for the first time has fallen to approximately 50% of the population. Meanwhile, the number of unemployed, retired, disabled or indigent citizens grew, as did the number of citizens who earned so little in part-time or low-paying jobs that they paid no taxes, as did the number of people laboring in the untaxed underground economy, as did the number of bureaucrats.

The end result of this epochal demographic and economic shift is that for the first time in American history, the people who actually work for a living and contribute to the common good — the “proletariat” in Marx’s version, and the “taxpayers” in ours — no longer control the company. Vote-wise, the scales have tipped in favor of the non-contributors and the bureaucrats, and suddenly they are the ones making the decisions about what to do with our collective gigantic pile of money — while those who actually created the pile through their work and tax contributions have become powerless.

It is outrage over this very power shift that spawned the Tea Party, which is essentially a movement of taxpayers angry that they no longer get to determine how their taxes are spent. Historically speaking, the Tea Party movement can be accurately defined as a workers’ revolution.

Karl Marx, were he alive today, would approve.

At least he would if he were able to follow his own theories to their logical conclusion. Unfortunately, the arc of history has exposed an untenable logical paradox at the heart of Marxist theory: What if the “workers” — the actually productive people in society who, Marx assumed, were motivated by resentment — instead were motivated by a desire for self-determination? What if the “parasitical class” was not merely (as Marx posited) the do-nothings at the top but the do-nothings at the top and the bottom?

Marxist ideologues will likely be affronted by my analysis, saying I have no right to twist Marx’s ideas to meet my modern notions. But in truth, re-interpreting Marx is not only commonplace but necessary, even to his followers, since the mid-19th century framework of his arguments was already outdated by the start of the 20th century, leading to any number of post-Marxist theorists and revolutionaries who have put their own spin and interpretation on his ideas. Without updating and re-interpretation, Marx would be irrelevant by now.

No one has a monopoly on Marxist theory. Not even Marxists.

The Tea Party is a workers’ revolution. Modern “progressivism” is a reactionary totalitarian movement. The sooner that honest Marxists grasp this, the sooner “the people” can achieve liberation.

    

 

 

Niall Ferguson on Social Inequality

 

“The United States is where great things are possible.” Those are the words of Elon Musk, whose astonishing career illustrates that the American dream can still come true.

Musk was born in South Africa but emigrated to the United States via Canada in the 1990s. After completing degrees in economics and physics at the University of Pennsylvania, he moved to Silicon Valley, intent on addressing three of the “important problems that would most affect the future of humanity”: the Internet, clean energy, and space. Having founded PayPal, Tesla Motors, and SpaceX, he has pulled off an astonishing trifecta. At the age of 42, he is worth an estimated $2.4 billion. Way to go!

But for every Musk, how many talented young people are out there who never get those crucial lucky breaks? Everyone knows that the United States has become more unequal in recent decades. Indeed, the last presidential election campaign was dominated by what turned out to be an unequal contest between “the 1 percent” and the “47 percent” whose votes Mitt Romney notoriously wrote off.

But the real problem may be more insidious than the figures about income and wealth distribution imply. Even more disturbing is the growing evidence that social mobility is also declining in America.

The distinction is an important one. For many years, surveys have revealed a fundamental difference between Americans and Europeans. Americans have a much higher toleration for inequality. But that toleration is implicitly conditional on there being more social mobility in the United States than in Europe.

But what if that tradeoff no longer exists? What if the United States now offers the worst of both worlds: high inequality with low social mobility? And what if this is one of the hidden structural obstacles to economic recovery? Indeed, what if current monetary policy is making the problem of social immobility even worse?

This ought to be grist for the mill for American conservatives. But Republicans have flunked the challenge. By failing to distinguish between inequality and mobility, they have allowed Democrats, in effect, to equate the two, leaving the GOP looking like the party of the 1 percent—hardly an election-winning strategy.

To their cost, American conservatives have forgotten Winston Churchill’s famous distinction between left and right—that the left favors the line, the right the ladder. Democrats do indeed support policies that encourage voters to line up for entitlements—policies that often have the unintended consequence of trapping recipients in dependency on the state. Republicans need to start reminding people that conservatism is about more than just cutting benefits. It’s supposed to be about getting people to climb the ladder of opportunity.

Inequality and social immobility are, of course, related. But they’re not the same, as liberals often claim.

Let’s start with inequality. It’s now well known that in the mid-2000s the share of income going to the top 1 percent of the population returned to where it was in the days of F. Scott Fitzgerald’s Great Gatsby. The average income of the 1 percent was roughly 30 times higher than the average income of everyone else. The financial crisis reduced the gap, but only slightly—and temporarily. That is because the primary (and avowed) aim of the Federal Reserve’s monetary policy since 2008 has been to push up the price of assets. Guess what? The rich own most of these. To be precise, the top 1 percent owns around 35 percent of the total net worth of the United States—and 42 percent of the financial wealth. (Note that in only one other developed economy does the 1 percent own such a large share of wealth: Switzerland.)

By restoring the stock market to where it was back before the crisis, the Fed has not achieved much of an economic recovery. But it has brilliantly succeeded in making the rich richer. And their kids.

According to Credit Suisse, around a third of the world’s thousand or so billionaires in 2012 were American. But of these, just under 30 percent were not self-made—a significantly higher proportion than for Australia and the United Kingdom. In other words, today an American billionaire is more likely to have inherited his or her wealth than a British one is.

This is just one of many indications of falling social mobility in the U.S. According to research published by the German Institute for the Study of Labor, 42 percent of American men born and raised in the bottom fifth of the income distribution end up staying there as adults, compared with just 30 percent in Britain and 28 percent in Finland. An American’s chance of getting from the bottom fifth to the top fifth is 1 in 13. For a British or Finnish boy, the odds are better: more like 1 in 8.

True, the relatively flat income distribution of Scandinavian countries makes it easier to get from the bottom to the top—there’s less financial distance to travel. But the same cannot really be said of Britain. Indeed, the amazing thing about the most recent research on social mobility is that the United Kingdom—which used to have the most rigid class structure in the developed world—now risks losing that title to the United States. No wonder Downton Abbey is so popular here.

The American Dream has become a nightmare of social stasis. According to research by Pew, just under 60 percent of Americans raised in the top fifth of incomes end up staying in the top two fifths; a fractionally higher proportion of those born in the bottom fifth—60.4 percent—end up staying in the bottom two fifths.

This is the America so vividly described by Charles Murray in his bestselling book Coming Apart. At one end of the social scale, living in places with names like “Belmont,” is Murray’s “cognitive elite” of around 1.5 million people. They and their children dominate admissions to the country’s top colleges. They marry one another and cluster together in fewer than a thousand exclusive neighborhoods—the enclaves of wealth that Murray calls the SuperZips.

At the other end, there are places like “Fishtown,” where nobody has more than a high school diploma; a rising share of children live with a single parent, often a young and poorly educated “never-married mother.” Not only has illegitimacy risen in such towns, so has the share of men saying they are unable to work because of illness or disability or who are unemployed or who work fewer than 40 hours a week. Crime is rampant; so is the rate of incarceration. In other words, problems that used to be disproportionately associated with African-American communities are now endemic in the trailer parks and subprime slums inhabited by poor whites. You get born there, you stay there—unless you get sent to jail.

What has gone wrong? American liberals argue that widening inequality inevitably causes falling social mobility. This was what Alan Krueger, chairman of the Council of Economic Advisers, had in mind back in January, when he came up with the “Great Gatsby Curve,” showing that more unequal countries have less social mobility. (Hang on, wasn’t Gatsby a self-made bootlegger?) But to European eyes, this is also a familiar story of poverty traps created by well-intentioned welfare programs. Consider the case highlighted by Gary Alexander, Pennsylvania’s former secretary of public welfare. A single mom with two young kids is better off doing a part-time job for just $29,000—on top of which she receives $28,327 in various benefits—than if she accepts a job that pays $69,000, on which she would pay $11,955 in taxes.
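Spelling out the arithmetic behind Alexander’s example makes the cliff visible. A minimal sketch, using his figures as quoted above and assuming, as his comparison apparently does, that no tax is deducted from the part-time wage:

```python
# Gary Alexander's Pennsylvania welfare-cliff example, figures as quoted above.
# Assumption: no tax is deducted from the part-time wage, as the comparison implies.

part_time_wage = 29_000   # part-time earnings
benefits = 28_327         # various benefits received on top
full_time_wage = 69_000   # the better-paying job
full_time_tax = 11_955    # taxes due on the full-time wage

net_part_time = part_time_wage + benefits       # 57,327
net_full_time = full_time_wage - full_time_tax  # 57,045

print(f"Part-time plus benefits: ${net_part_time:,}")
print(f"Full-time after tax:     ${net_full_time:,}")
print(f"Penalty for taking the $69,000 job: ${net_part_time - net_full_time:,}")  # 282
```

On these numbers she ends up $282 worse off for more than doubling her gross pay — the poverty trap the paragraph describes.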

Another good example is the growth in the number of Americans claiming Social Security disability benefits. Back in the mid-1980s, little more than 1.5 percent of the population received such benefits; today it’s nearly 3.5 percent. Nor (as used to be the case) are the recipients mainly elderly. Around 6 percent of the population aged between 45 and 54—my age group—are SSDI beneficiaries. Payments to disabled workers average $1,130 a month, which works out as $13,560 a year—roughly $1,500 less than a full-time wage at the federal minimum of $7.25 an hour.
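As a check on that wage comparison, a sketch assuming the standard annualisation of a 40-hour week over 52 weeks:

```python
# Annualising the average SSDI payment against a full-time federal minimum wage.
# Assumption: full-time means 40 hours a week for 52 weeks.

ssdi_monthly = 1_130
ssdi_annual = ssdi_monthly * 12              # 13,560

federal_min_wage = 7.25
min_wage_annual = federal_min_wage * 40 * 52  # 15,080.0

print(f"SSDI per year:         ${ssdi_annual:,}")
print(f"Minimum wage per year: ${min_wage_annual:,.0f}")
print(f"Gap:                   ${min_wage_annual - ssdi_annual:,.0f}")  # 1,520
```

On these assumptions the disability benefit comes to within about $1,500 of a full minimum-wage year.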

Maybe we really are unhealthier than we were 30 years ago, though the data on life expectancy tell a different story. Maybe work really has got more physically demanding, though the shift from manufacturing to services also suggests otherwise. The more credible possibility is that it has become easier for the mildly unwell or unfit to get classified as disabled and to opt for idle poverty over working poverty, which pays only slightly better and means working with that niggling backache or mild depression.

Significantly, after two years on disability benefit, you qualify for Medicare, swelling the ever-growing number of beneficiaries of the federal government’s most expensive welfare program. Right now, federal spending on health care, according to the Congressional Budget Office, is around 5 percent of GDP, but it is forecast to double by the 2040s. Needless to say, this reflects the great demographic shift that is inexorably driving up the share of seniors in the population. But consider how the combination of an aging population and welfare programs is working to reduce the resources available to young people.

According to the Urban Institute, the current share of federal spending on the young is around 10 percent, compared with the 41 percent that goes on the non-child portions of Social Security, Medicare, and Medicaid. Per capita government spending—including state and local budgets—is roughly twice as high for the elderly as it is for children. Perhaps not surprisingly, the child poverty rate is more than double the poverty rate for seniors. Ask yourself: how can social mobility possibly increase in a society that cares twice as much for Grandma as for Junior?

The only mystery that remains is why this generational conflict has not yet become a serious issue in American politics. Bafflingly, young voters still tend to line up with the very organizations that seem most intent on ratcheting up the future liabilities of government (not to mention the teenage unemployment rate)—notably the public-sector unions.

Writing in 1960, the economist Friedrich Hayek made a remarkable prediction about the ultimate consequences of the welfare state. “Most of those who will retire at the end of the century,” he wrote, “will be dependent on the charity of the younger generation. And ultimately not morals but the fact that the young supply the police and the army will decide the issue: concentration camps for the aged unable to maintain themselves are likely to be the fate of an old generation whose income is entirely dependent on coercing the young.”

Hayek was right that by 2000 the baby boomers would expect the young to bear the rising costs of their protracted and generously funded retirements. Almost alone among postwar economists, he saw the generational conflict implied by the welfare state. But he was wrong about how the younger generation would react. Far from rounding up the old and putting them in camps, it is the young who are the docile victims.

One possible explanation for this docility lies in the other main reason for declining social mobility: the disastrous failure of American high schools in places like Murray’s imaginary Fishtown.

Despite a tripling of per-pupil expenditure in real terms, American secondary education is failing. According to the Council on Foreign Relations, three quarters of U.S. citizens between the ages of 17 and 24 are not qualified to join the military because they are physically unfit, have criminal records, or have inadequate levels of education. A third of high school graduates fail the mandatory Armed Services Vocational Aptitude Battery. Two fifths of students at four-year colleges need to take remedial courses to relearn what they failed to master in high school.

In international comparison, the United States is now somewhere in the middle of the league table for mathematical aptitude at age 15. The Organization for Economic Cooperation and Development’s most recent Program for International Student Assessment (PISA) study was damning: in math, the gap between the teenagers in the Shanghai district of China and the United States is as large as the gap between American teenagers and Albanians.

But the real shocker is the differential between rich and poor kids. At the ages of 4 to 5, children from the poorest fifth of homes are already 21.6 months behind children from the richest homes in the U.S., compared with 10.6 months in Canada. The proportion of 15-year-olds who are functionally illiterate (below level 2 in PISA tests) is 10.3 percent in Canada. In the U.S. it is 17.6 percent. And students from the highest social-class groups are twice as likely to go to college as those from the lowest classes.

Meanwhile, there are disturbing signs that America’s elite educational institutions are reverting to their old role as finishing schools for the children of a hereditary elite—the role they played back when F. Scott Fitzgerald was partying at Princeton.

In a disturbing critique of Ivy League admissions policies, the editor of the American Conservative, Ron Unz, recently pointed out a number of puzzling anomalies. For example, since the mid-1990s Asians have consistently accounted for around 16 percent of Harvard enrollments. At Columbia, according to Unz, the Asian share has actually fallen from 23 percent in 1993 to below 16 percent in 2011. Yet, according to the U.S. census, the number of Asians aged between 18 and 21 has more than doubled in that period. Moreover, Asians now account for 28 percent of National Merit Scholarship semifinalists and 39 percent of students at CalTech, where admissions are based purely on academic merit.

Perhaps those in charge of Ivy League admissions have good reasons for their decisions. Perhaps it is right that they should do more than simply pick the most academically talented and industrious students who apply. But the possibility cannot be rejected out of hand that, whatever their intentions, the net effect of their pursuit of “diversity” is in fact to reduce yet further this country’s once unique social mobility. Nor can we dismiss the hypothesis that the “legacy” system may be the key here, as the cognitive elite discreetly rig the game in favor of their offspring with well-timed benefactions.

As a professor at Harvard, I am disquieted by such thoughts. Unlike Elon Musk, I did not come to the United States intent on making a fortune. Wealth was not my American dream. But I did come here because I believed in American meritocracy, and I was pretty sure that I would be teaching fewer beneficiaries of inherited privilege than I had encountered at Oxford.

Now I am not so sure.

    

 

 

Loss of Faith in Govt

 

Large majorities of voters think Europe’s governments tax unfairly and spend inefficiently.

During the 20th century, Europe’s progressive parties based their appeal on two propositions: that they cared for ordinary folk and knew how to harness state power to convert care into action.

No longer. A major YouGov study in Britain, France, Germany and Sweden shows that few still believe either proposition. Millions in all four countries no longer think left-of-centre parties care about them; and most reject the idea that governments are good at solving social problems.

Our figures show how nostalgia is strong but hope is weak. In three of the four countries, half of voters agree that the main centre-left party “used to care about people like me” (Germany 52 per cent, Britain 51, Sweden 49). But in all four the proportions saying their main centre-left party still “cares now about people like me” are much lower. For Britain’s Labour Party the figure is down 19 points, at 32 per cent.

To some extent, left-of-centre parties are suffering a decline in faith in politics as a whole. The figures for right-of-centre parties are also down. But in each country they are down by far less. Perhaps because of Angela Merkel’s dominance of German politics, her Christian Democrat party is now more widely thought to care than the opposition Social Democrats. Even where centre-left parties are still regarded as more caring, their lead is too narrow for comfort. Historically, Europe’s Left has relied on its commitment to fairness to trump any advantage for the Right on economic competence. Today, whenever the Left trails on competence it looks unlikely to overcome that deficit by proclaiming its compassion.

The problem goes deeper than party reputation. The cause of progressive politics in recent decades has relied on state action — to give every family access to schools and healthcare, to ward off poverty, insure against misfortune and support the elderly. In the early decades after the Second World War the popularity of this state expansion came because much of the cost was borne by those on above-average incomes. Most working-class people, who then comprised about two thirds of the electorate, paid relatively modest taxes.

Today this absorbs far more of a nation’s income than it used to. In Britain, government spending on health, welfare and education has climbed from 12 per cent of GDP 60 years ago to 28 per cent today. To meet this cost, taxes have risen. Working-class families have to pay far more for their welfare rights and public services.

Our survey shows how this is now threatening the postwar consensus. We asked people to consider the taxes they pay, and all the services and benefits they receive from the State. The responses are stark. In all four countries, at least six in ten think their tax system is unfair and at least two thirds say that state spending on services and benefits is inefficient. By margins from three-to-one (Sweden) to nine-to-one (France) people think their families pay in more in taxes than they get back in pensions, welfare benefits, education, healthcare, policing and other public services. In Britain 56 per cent think they pay in more than they get out. Only 15 per cent think they are net beneficiaries.

It is not just that the traditional working class has shrunk; even among the diminished number of manual workers the dominant view is that their tax bill is greater than the benefits and services they receive. The financial crisis has exacerbated this, but not caused it: the forces at work are far greater and go back far longer.

This does not mean that the desire for well-run public services and decent welfare has gone. In all four countries there is majority support for the principle of an active State — providing good-quality public services, reducing inequality and regulating big banks and businesses. However, millions of people across Europe despair of the State’s ability in practice to do any of these things effectively. The result is a widespread demand for smaller, smarter government.

We asked people about their ideal society. Would the government do less and tax less, do more and tax more, or keep the current balance? In the three countries with centre-right governments — Britain, Germany and Sweden — big majorities want the status quo or smaller government. In Britain and Germany, support for more tax-and-spend is remarkably low: 13 per cent and 11 respectively. No wonder Britain’s Labour Party and Germany’s Social Democrats now emphasise financial prudence more than social ambition. But it’s not just their policy they have to get right: it’s their broader reputation. As long as voters think that their heart is still set on higher taxes and greater spending, they will struggle to return to power.

Parties seldom plan more than one election ahead. Our findings suggest that centre-left parties in all four countries — and, by extension, across the developed world — need to rethink their long-term vision as well as their short-term tactics.

    

 

 

Greens Are Finished

 

The cash wasted on failed global warming policies would be better spent on tackling the problems faced by the poor.

Seven years ago, pulled along by huskies, David Cameron visited a Norwegian glacier. Vote blue, he implored, and go green. One year later Kevin Rudd became Prime Minister of Australia after identifying climate change as the “greatest moral challenge of our time”. Climate change campaigners interpreted his victory as one of seismic importance and governments across Europe rushed to pour money into the renewable energy sector. Then in 2008 along came Barack Obama. The wicked George W. Bush was replaced with a president who promised to stop global warming. Hurrah!

And, for a period, Mr Obama seemed determined to deliver. Here, after all, was the president, some would have us believe, who could walk on water. One year into his blessed reign he was awarded the Nobel Peace Prize without having secured peace in any part of the world. He was top of the pops in global opinion surveys. Just about every world leader wanted to be photographed alongside him.

Super-Obama’s great opportunity to save the planet came in 2009 at the Copenhagen climate change summit. He was at the height of his political powers. His Democratic party controlled all of Washington: the presidency, the House of Representatives and the Senate. And yet Copenhagen ended in the same way as almost every other climate change summit of recent times: in failure. Having failed to persuade members of his own party to cut greenhouse gas emissions, Mr Obama also failed to persuade the governments of New Delhi, Beijing and Brasília.

The writing may have been on the wall in 2009, but the green movement has soldiered on. Theirs, they believed, was a moral mission of such importance that nothing would or should get in their way. Whatever the economic, social or political price, they were determined to succeed. The doubts of sceptics like me could always be ignored, but when the politicians who once championed green politics are in retreat it is perhaps time for even ecological diehards to get real.

And in the past ten days one of the greenest of green politicians has to all intents, constructions and purposes given up. Last week Australia’s green movement suffered a defeat at least as big as those of the country’s cricket and rugby teams. Mr Rudd announced that he would ditch the carbon tax that had threatened to consign his Labor Party to one of the worst defeats in its history.

All over the world green politicians are presiding over similar climbdowns. From Washington to London, shale gas rather than any renewable technology is seen as the future. Even nations such as Germany and Spain, which led the march to green energy, are slashing unaffordable subsidies to the renewables industry. Lord Lawson of Blaby has claimed that the average share price of companies in the renewable sector has fallen by 80 per cent over five years. “One renewable company after another is going bankrupt,” he declared.

The heavy cost of green energy policies might have been justifiable if they had delivered results, but they haven’t. Since the Kyoto treaty on climate change, global emissions have continued to rise. Since 1990 they have increased by about 50 per cent. China’s increase in emissions has been 25 times greater than the reduction by the EU’s core nations. In so far as Europe has actually met its environmental obligations it has only done so by exporting industrial capacity (and jobs). Once the environmental impact of imported goods has been added to its carbon footprint Europe has clearly failed to keep its environmental promises.

One commentator, Bjørn Lomborg, spelt out the futility of Europe’s unilateral environmentalism. Germany’s efforts to combat climate change might, he calculated, just possibly delay a rise in global temperatures by 37 hours, but that delay will have cost German taxpayers and consumers more than $100 billion in the form of renewable subsidies and higher electricity costs. That’s about $3 billion an hour.
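Taking Lomborg’s two numbers at face value, the hourly figure is easy to reproduce. A sketch; both inputs are his estimates, and the $100 billion is a lower bound:

```python
# Lomborg's estimate: more than $100 billion spent for a 37-hour delay in warming.
cost_usd = 100e9    # lower-bound cost to German taxpayers and consumers
delay_hours = 37    # calculated delay in the rise of global temperatures

per_hour_billions = cost_usd / delay_hours / 1e9
print(f"Cost per hour of delay: ${per_hour_billions:.1f} billion")  # ~2.7
```

Since the cost is given as “more than” $100 billion, the roughly $2.7 billion an hour this yields is consistent with the article’s rounded $3 billion.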

Green enthusiasts are kidding themselves if they blame the global economic slump for the failure of climate change policies. Their policies were always an attempt to defy economic gravity. No half-decent politician in any part of the developing world was ever going to delay economic progress by embracing expensive energy sources. Any policies that prevent a clinic in India from being able to refrigerate medicines or a student in China from being able to read at night were always destined to fail.

I am not one of those people who deny that the climate might be changing. I don’t feel qualified to question the majority of scientists who insist that warming is both real and man-made. My objection to global warming policies is more practical. They aren’t succeeding in cutting emissions and they aren’t going to succeed until so-called clean energy is similar in cost to conventional energy. Until then — and we should be investing in green technologies in the meantime — the demands of millions of wealthy green campaigners will continue to be overwhelmed by the demand from billions of poor people for economic growth and the social justice that it affords them.

Two decades of green policies haven’t just failed to stop global warming. Old age pensioners in Britain and in other developed countries have been forced to bear electricity bills inflated by renewable subsidies. Blue-collar workers have lost their jobs as energy-intensive manufacturing companies have relocated overseas. Beautiful landscapes have been ruined by bird-chopping wind turbines.

There have also been huge opportunity costs. What could world leaders have achieved if they hadn’t spent the past 25 years investing so much money and summitry on global warming? In a brilliant book — How to Spend $75 Billion to Make the World a Better Place — Mr Lomborg has documented how politicians could have been tackling more pressing problems facing the world’s poorest people. Action on HIV/Aids, for example, the provision of micro nutrients to hungry children, the control of malaria, guarantees of clean water and the liberalisation of trade would all have been better uses of politicians’ time and taxpayers’ money.

Many of Britain’s politicians — notably the Chancellor, George Osborne — know all of this. But outside of last week’s welcome but overdue encouragement of fracking, Britain’s statute book is still creaking under the weight of yesteryear’s laws and their commitments to invest in expensive green energies. Until those laws are repealed British businesses and consumers will be paying a very high price for no earthly benefit.

    

 

 

Changing Your Mind

 

In all arguments, from faith to politics, sexuality to morality, a little messed-up thinking takes us a long way.

“Who am I to judge?”

Pope Francis was commenting this week on the moral status of Roman Catholic homosexuals. My first response was scornful. If the Holy Father feels unqualified to judge a straightforward question of sexual morality then what’s the point of popes? Do not priests in their confessionals daily dispense implicit judgment on adultery, abortion, contraception?

It was the inconsistency that irked me. I care about consistency. I recoil when someone tosses his own argument lightly aside, even to support a position I favour. It’s immaterial that I don’t myself believe in the Apostolic Succession; I’m still offended when believers ignore the internal logic of Catholicism. Rome accepts certain core premises about an event 2000 years ago, and from these premises flow certain conclusions. A pope who questions his own status in matters of human morality kicks at the props that support his whole authority.

I could go on like this but I won’t, because it’s wrong. My rationalism, though very dear to me, misses a wellspring of history. My rationalism has not understood the human heart and how it changes and how we guess when it is changing. My rationalism hasn’t even understood the human mind, and how it changes and what are the telltale signs that it is changing. All hung up on logic and consistency, my rationalism has failed to comprehend how little it sometimes matters what, on the face of it, people say they believe.

Hypocrisy is a healthy part of what we are, how we work and how politics works. Three cheers to the jury that acquits a defendant they know is guilty as charged, because they think the law’s unfair. It is very, very important that, while outwardly professing the centrality of logic, we do not shackle ourselves to arguments. Reasoning does matter in politics, but instinct and observation, not logic, are often what best signal to us that an argument is off course.

We need to leave ourselves plenty of slack in politics and morality, plenty of space, to follow the promptings of intuition in what we do; and when it doesn’t feel right, nobody should padlock himself to the maxim: “Practise what you preach.” A healthy dollop of hypocrisy can liberate.

Whatever Pope Francis seemed to be saying, whatever he may himself think he meant, he was really saying: “We may have got this wrong.” Maybe his views have been affected by his own observation of the extent of undeclared homosexuality among the priesthood. The logician in me says: “Hypocrites!” A wiser voice says: “Thank heavens so many have not practised what they preached; taking note of the practice, this Pope will perhaps now revisit the preaching.”

When in the 1970s it began to be fashionable to say that, while homosexuality might be a pity, tolerance demanded a kinder approach, the logician in me said: “No. With child molesters we do not ‘tolerate’. Make the positive moral case for same-sex love, don’t burble about tolerance.” Then a wiser voice said: “Thank heavens opinion is moving and people are disapproving less. Equality can come later.”

The new Archbishop of Canterbury, with his ambivalent and shifting remarks on women bishops and on homosexuality, appears to be on a similar journey. Nothing is easier to demonstrate than that Dr Justin Welby, Dr Rowan Williams and the Anglican Synod have got themselves into a total logical mess on the status of women clergy, or that the Church of England’s position on gay priests is philosophically farcical. But it doesn’t matter. These two confusions are a thoroughly good thing. If you want to get from A to B in public or religious affairs it may be necessary to spend some time in a fog patch en route.

We’re in such a fog patch on drugs, too, at present; and in a similar mess on prostitution. In both cases the law, or the prosecution guidelines, or the way the police use their discretion in practice (and often an unholy mixture of all three) effectively mean that individuals can use a little cannabis with relative impunity, but buying or selling the drug is an actively prosecuted crime. As anyone can see and as is often remarked, there’s no logic here; no more logic than the law’s permission to buy or sell sex, coupled with the law’s criminalisation of brothels, of soliciting and of pimps.

In both cases it’s not even clear that society is moving from a less to a more permissive regime; we may be stuck in the middle. Although the current situation may be hypocritical and confusing, it’s better than a rigorous and consistent criminalisation of what society still claims to believe are unmitigated evils. Never ask politicians to make up their minds until you’re confident they’ll make them up in your direction.

The direction of movement is clearer on a subject of growing importance that (embracing abortion within the title) we may loosely term embryology. At the other end of life — on assisted dying — society is moving too. We and our laws are in a philosophical mess on both, essentially because we’re unclear whether we really do believe, as we claim, that “all human life is sacred” and because we’re struggling to define what constitutes life.

The law is unclear in myriad ways and the whole field awash with hypocrisy and dishonesty, from the doctors who deny that they kill already-dying patients — which they plainly and mercifully do — to the prosecuting authorities who decline to prosecute, to the judges and juries who decline to convict, to the arcane debates on the precise point at which a foetus becomes “viable”.

Again, the whole vast philosophical mess is thoroughly welcome, and the hypocrisy greatly preferable to moral rigour. I believe I recognise two journeys that Western ethics are on here, though I shall not live to see this proved. The first is towards the realisation that the word “life” is a primitive concept without (in the light of science) any useful modern meaning. What we call life is just a bundle of qualities something may exhibit, no single one of them the “core” of the word’s meaning, any more than as we remove the skins of an onion we shall approach and reveal the essential onion.

The second journey is towards our realisation that the word “sacred” is, likewise, literally meaningless. There is no sacredness without God.

As the concept of “life” dissolves, and as we cease to believe anything “sacred”, public policy towards our treatment of the beginning and the ending of a human being will move towards resolving such questions on increasingly utilitarian grounds. That — steeped in hypocrisy, denial and logical confusion — is where we are going. It’s where we must go.

Meanwhile, confusion and hypocrisy slosh around. Bring on the confusion. Bring on the hypocrisy. They are signals of the unshackling of minds and hearts from dogma. We need the space to live by different standards to those we must profess and to act in ways that depart from our declared principles, if we are to reach for new standards and new principles. As the great 18th-century Scottish philosopher David Hume remarked: “Be a philosopher; but, amidst all your philosophy, be still a man.”

    

 

 

Opposition To Fracking

 

The opponents of shale gas are a strange alliance of eco-freaks and old-school nimbyists. Both need to get real.

Sometimes, when we think we’re thinking we aren’t. You and the spouse have a running disagreement over whether ketchup should be kept in the fridge, say. One of you says it’s best served cold. The other says, rubbish, it’s eaten with hot food, at room temperature at least, might as well chill the pepper grinder. You both think you’re making an intellectual case and you’re both wrong. It’s just what your parents did. Think about it. See?

Politics is the same. We oppose policies not because of what they are, but because it’s those bastards who did that other thing we hated who are proposing them. We support intervention in Syria because we supported intervention in Iraq, or don’t, because we didn’t. Or we look at fracking, which big oil companies like, and people with dreadlocks don’t, and we think we know which side we’re on.

I certainly did. Through age, belief and disposition, I’m the sort of guy who sees a wind turbine and feels a tingle in his spine. It’s something to do with the glorious meeting of natural beauty and human ingenuity, audibly subdued and at the scale of a cathedral. There’s a set I pass often, heading out of Edinburgh on the A68, dotted along the horizon of the Lammermuirs. They make me want to park the car, and start singing hymns. And probably because of that I get quite upset when people start arguing that they don’t work terribly well.

I can just about cope with an economic criticism; that we have the incentives wrong and wind farms are actually subsidy farms, rewarding technological box-ticking rather than the actual generation of power. But start advancing the notion that the whole concept is just not a very good idea — that, like hybrid cars or most solar panels, wind turbines are all for show — and I start feeling edgy. Never mind, for now, whether such claims are true. The important thing is how badly I want them not to be.

Fracking is much the same. It’s a new technology with all the political trappings of old technologies. Greenpeace is against it and Sarah Palin is for it. It would be easy to stop thinking there. It takes stuff out of the ground and burns it and many of us have spent the past couple of decades diligently learning that this is bad. If you’ve any sort of environmental conscience; if you fill your recycling bin and fret about flying, if you’d love an electric car and even toy with conscientious vegetarianism, fracking is something you are poised to be anti.

Start reading up, though, in a spirit of even half-hearted open inquiry, and the wheels of dissent soon come off. Most of the scare stories, such as earthquakes, poisoned water tables and flammable methane bursting from your taps, simply aren’t true, and those that are true are plainly the product of cowboy industries springing up in the American wilderness, which Britain simply wouldn’t stand for.

Plus, while burning natural gas does produce CO2, it produces quite a lot less than most real-world alternatives, to the extent that US emissions (not that Americans much give a damn about emissions) are plummeting. So all that really leaves is a rather mimsy aesthetic objection to great big robotish nodding donkey monsters scarring the lovely British countryside. Much like wind turbines do. Hmmm.

Environmentalists and their opponents both need to drag themselves out of their old, comfortable trenches. The debate has grown stale on both sides and fracking shows how poorly it maps on to the new, real world.

Through fear and hyperbole, the environmental argument has too often conflated a “we’re killing the planet” argument with a “so let’s kill ourselves instead” solution. It’s dumb, it’s anti the science it professes to cherish, and it’s simply impractical. Sure, it’s perfectly coherent to argue that our ideal future involves eating grass, wearing jumpers knitted out of weevils and travelling around in Flintstones-style cars. But it’s hardly a credible plan. Not if you’ve actually met another human being and thought about how they are, and are not, prepared to live.

Environmentalists need to grow up. Everybody knows that wind and hydro alone simply won’t save the planet. The physics might work, but the “human co-operation” angle simply isn’t achievable. In a world where nobody fancies being a caveman again combustion power is here to stay. We ought to find within our politics the ability to cheer at it getting better. This might even mean supporting new coal, if it is to be burnt in a better way than old coal. It certainly means supporting nuclear, and I’m coming around to the view that it means supporting fracking. Fracking does not mark a death knell for environmental politics. We need to support this not because we’ve given up being environmentalists, but because we’ve matured into environmentalists who want to get stuff done.

Plenty on the other side need to grow up, too. In Britain, at least, opposition to wind farms has made hypocrites and conspiracy theorists of half a generation of middle-class, middle-aged men who ought to have known better. It continues to astonish me that the most vocal disbelievers of mainstream climatology never pause to consider how terribly convenient it is that their scientific views co-operate so neatly with all their other views. Doubting the science makes longhair activists wrong and big government evil, and disguises the nimbyist desire not to have a windfarm out back as something else entirely. Fracking strips all that away. For the armchair climate sceptic with a Constable view out of the window, nimbyism is the only argument left.

Michael Fallon, the Energy Minister, joked this week that the great joy of fracking was that quite a lot of it could take place beneath the pretty, rural South-eastern homes of the cossetted commentariat. “We are going to see how thick their rectory walls are, whether they like the flaring at the end of the drive,” he said. And, while he may have been overestimating the resources we cossetted commentariat have at our disposal these days, he was also absolutely right. Blunt honesty is about to become a requirement. The only honest argument we can have about fracking is about whether or not it ought to happen next door.

These great conceptual shifts are possible; witness how nuclear power, once anathema to anybody remotely green, no longer is. It will be interesting to see if some nimbyists make common cause with those hated longhairs, spiralling into whole new realms of expedient and fantastical paranoia. The practical environmentalist ought to leave them to it. Get real. Gas is better than coal. Let’s hope the next thing will be better still.

    

 

 

American Bile

 

Not long ago I was walking toward an airport departure gate when a man approached me.

“Are you Robert Reich?” he asked.

“Yes,” I said.

“You’re a Commie dirtbag.” (He actually used a variant of that noun, one that can’t be printed here.)

“I’m sorry?” I thought I had misunderstood him.

“You’re a Commie dirtbag.”

My mind raced through several possibilities. Was I in danger? That seemed doubtful. He was well-dressed and had a briefcase in one hand. He couldn’t have gotten through the checkpoint with a knife or gun. Should I just walk away? Probably. But what if he followed me? Regardless, why should I let him get away with insulting me?

I decided to respond, as civilly as I could: “You’re wrong. Where did you get your information?”

“Fox News. Bill O’Reilly says you’re a Communist.”

A year or so ago Bill O’Reilly did say on his Fox News show that I was a Communist. I couldn’t imagine what I’d done to provoke his ire except to appear on several TV shows arguing for higher taxes on the wealthy, which hardly qualified me as a Communist. Nor am I exactly a revolutionary. I served in Bill Clinton’s cabinet. My first full-time job in Washington was in the Ford administration, working for Robert H. Bork at the Justice Department.

“Don’t believe everything you hear on Fox News,” I said. The man walked away, still irritated.

It’s rare that I’m accosted and insulted by strangers, but I do receive vitriolic e-mails and angry Facebook posts. On the Internet and on TV shows, name-calling substitutes for argument, and ad hominem attack for reason.

Scholars who track these things say the partisan divide is sharper today than it has been in almost a century. The typical Republican agrees with the typical Democrat on almost no major issue. If you haven’t noticed, Congress is in complete gridlock.

At the same time, polls show Americans to be more contemptuous and less trusting of major institutions: government, big business, unions, Wall Street, the media.

I’m 67 and have lived through some angry times: Joseph R. McCarthy’s witch hunts of the 1950s, the struggle for civil rights and the Vietnam protests in the 1960s, Watergate and its aftermath in the 1970s. But I don’t recall the degree of generalized bile that seems to have gripped the nation in recent years.

The puzzle is that many of the big issues that used to divide us, from desegregation to foreign policy, are less incendiary today. True, we disagree about guns, abortion and gay marriage, but for the most part have let the states handle these issues. So what, exactly, explains the national distemper?

For one, we increasingly live in hermetically sealed ideological zones that are almost immune to compromise or nuance. Internet algorithms and the proliferation of media have let us surround ourselves with opinions that confirm our biases. We’re also segregating geographically into red or blue territories: chances are that our neighbors share our views, and magnify them. So when we come across someone outside these zones, whose views have been summarily dismissed or vilified, our minds are closed.

Add in the fact that most Americans no longer remember the era, from the Great Depression through World War II, when we were all in it together — when hardship touched almost every family, and we were palpably dependent on one another. There were sharp disagreements, but we shared challenges that forced us to work together toward common ends. Small wonder that by the end of the war, Americans’ confidence in major institutions of our society was at its highest.

These changes help explain why Americans are so divided, but not why they’re so angry. To understand that, we need to look at the economy.

Put simply, most people are on a downward escalator. Although jobs are slowly returning, pay is not. Most jobs created since the start of the recovery, in 2009, pay less than the jobs that were lost during the Great Recession. This means many people are working harder than ever, but still getting nowhere. They’re increasingly pessimistic about their chances of ever doing better.

As their wages and benefits shrink, though, they see corporate executives and Wall Street bankers doing far better than ever before. And they are keenly aware of bailouts and special subsidies for agribusinesses, pharma, oil and gas, military contractors, finance and every other well-connected industry.

Political scientists have noted a high correlation between inequality and polarization. But economic class isn’t the only dividing line in America. Many working-class voters are heartland Republicans, while many of America’s superrich are coastal Democrats. The real division is between those who believe the game is rigged against them and those who believe they have a decent shot.

Losers of rigged games can become very angry, as history has revealed repeatedly. In America, the populist wings of both parties have become more vocal in recent years — the difference being that the populist right blames government more than it does big corporations while the populist left blames big corporations more than government.

Widening inequality thereby ignites what the historian Richard Hofstadter called the “paranoid style in American politics.” It animated the Know-Nothing and Anti-Masonic movements before the Civil War, the populist agitators of the Progressive Era and the John Birch Society — whose founder accused President Dwight D. Eisenhower of being a “dedicated, conscious agent of the Communist conspiracy” — in the 1950s.

Inequality is far wider now than it was then, and threatens social cohesion and trust. I don’t think Bill O’Reilly really believes I’m a Communist. He’s just channeling the nation’s bile.

    

 

 

The Tea Party

 

To judge from the commentary inspired by the shutdown, most progressives and centrists, and even many non-Tea Party conservatives, do not understand the radical force that has captured the Republican Party and paralyzed the federal government. Having grown up in what is rapidly becoming a Tea Party heartland, Texas, I think I do understand it. Allow me to clear away a few misconceptions about what really should be called, not the Tea Party Right, but the Newest Right.

The first misconception that is widespread in the commentariat is that the Newest Right can be thought of as being simply a group of “extremists” who happen to be further on the same political spectrum on which leftists, liberals, centrists and moderate conservatives find their places. But reducing politics to points on a single line is more confusing than enlightening. Most political movements result from the intersection of several axes—ideology, class, occupation, religion, ethnicity and region—of which abstract ideology is seldom the most important.

The second misconception is that the Newest Right or Tea Party Right is populist. The data, however, show that Tea Party activists and leaders on average are more affluent than the average American. The white working class often votes for the Newest Right, but then the white working class has voted for Republicans ever since Nixon. For all its Jacksonian populist rhetoric, the Newest Right is no more a rebellion of the white working class than was the original faux-populist Jacksonian movement, led by rich slaveowners like Andrew Jackson and agents of New York banks like Martin Van Buren.

The third misconception is that the Newest Right is irrational. The American center-left, whose white social base is among highly-educated, credentialed individuals like professors and professionals, repeatedly has committed political suicide by assuming that anyone who disagrees with its views is an ignorant “Neanderthal.” Progressive snobs to the contrary, the leaders of the Newest Right, including Harvard-educated Ted Cruz, like the leaders of any successful political movement, tend to be highly educated and well-off. The self-described members of the Tea Party tend to be more affluent and educated than the general public.

The Newest Right, then, cannot be explained in terms of abstract ideological extremism, working-class populism or ignorance and stupidity. What, then, is the Newest Right?

The Newest Right is simply the old Jeffersonian-Jacksonian right, adopting new strategies in response to changed circumstances. While it has followers nationwide, its territorial bases are the South and the West, particularly the South, whose population dwarfs that of the Mountain and Prairie West. According to one study by scholars at Sam Houston State University in Huntsville, Texas:

While less than one in five (19.4%) minority non-Southerners and about 36% of Anglo non-Southerners report supporting the movement, almost half of white Southerners (47.1%) express support.

In fact, the role that antigovernment sentiment in the South plays in Tea Party movement support is the strongest in our analysis.

The Tea Party right is not only disproportionately Southern but also disproportionately upscale. Its social base consists of what, in other countries, are called the “local notables”—provincial elites whose power and privileges are threatened from above by a stronger central government they do not control and from below by the local poor and the local working class.

Even though, like the Jacksonians and Confederates of the nineteenth century, they have allies in places like Wisconsin and Massachusetts, the dominant members of the Newest Right are white Southern local notables—the Big Mules, as the Southern populist Big Jim Folsom once described the lords of the local car dealership, country club and chamber of commerce. These are not the super-rich of Silicon Valley or Wall Street (although they have Wall Street allies). The Koch dynasty rooted in Texas notwithstanding, those who make up the backbone of the Newest Right are more likely to be millionaires than billionaires, more likely to run low-wage construction or auto supply businesses than multinational corporations. They are second-tier people on a national level but first-tier people in their states and counties and cities.

For nearly a century, from the end of Reconstruction, when white Southern terrorism drove federal troops out of the conquered South, until the Civil Rights Revolution, the South’s local notables maintained their control over a region of the U.S. larger than Western Europe by means of segregation, disenfranchisement, and bloc voting and the filibuster at the federal level. Segregation created a powerless black workforce and helped the South’s notables pit poor whites against poor blacks. The local notables also used literacy tests and other tricks to disenfranchise lower-income whites as well as blacks in the South, creating a distinctly upscale electorate. Finally, by voting as a unit in Congress and presidential elections, the “Solid South” sought to thwart any federal reforms that could undermine the power of Southern notables at the state, county and city level. When the Solid South failed, Southern senators made a specialty of the filibuster, the last defense of the embattled former Confederacy.

When the post-Civil War system broke down during the Civil Rights Revolution of the 1950s and 1960s, the South’s local notable class and its Northern and Western allies unexpectedly won a temporary three-decade reprieve, thanks to the “Reagan Democrats.” From the 1970s to the 2000s, white working-class voters alienated from the Democratic Party by civil rights and cultural liberalism made possible Republican presidential dominance from Reagan to George W. Bush and Republican dominance of Congress from 1994 to 2008. Because their politicians dominated the federal government much of the time, the conservative notables were less threatened by federal power, and some of them, like the second Bush, could even imagine a “governing conservatism” which, I have argued, sought to “Southernize” the entire U.S.

But then, by the 2000s, demography destroyed the temporary Nixon-to-Bush conservative majority (although conceivably it could enjoy an illusory Indian summer if Republicans pick up the Senate and retain the House in 2014). Absent ever-growing shares of the white vote, in the long run the Republican Party cannot win without attracting more black and Latino support.

That may well happen, in the long run. But right now most conservative white local notables in the South and elsewhere in the country don’t want black and Latino support. They would rather disenfranchise blacks and Latinos than compete for their votes. And they would rather dismantle the federal government than surrender their local power and privilege.

The political strategy of the Newest Right, then, is simply a new strategy for the very old, chiefly-Southern Jefferson-Jackson right. It is a perfectly rational strategy, given its goal: maximizing the political power and wealth of white local notables who find themselves living in states, and eventually a nation, with present or potential nonwhite majorities.

Although racial segregation can no longer be employed, the tool kit of the older Southern white right is pretty much the same as that of the Newest Right:

The Solid South. By means of partisan and racial gerrymandering—packing white liberal voters into conservative majority districts and ghettoizing black and Latino voters—Republicans in Texas and other Southern and Western states control the U.S. House of Representatives, even though in the last election more Americans voted for Democratic House candidates than for Republicans. The same undemocratic technique makes the South far more Republican in its political representation than it really is in terms of voters.

The Filibuster. By using a semi-filibuster to help shut down the government rather than implement Obamacare, Senator Ted Cruz of Texas is acting rationally on behalf of his constituency—the suburban and exurban white local notables of Texas and other states, whom the demagogic Senator seems to confuse with “the American people.” Newt Gingrich, another Southern conservative demagogue, pioneered the modern use of government shutdowns and debt-ceiling negotiations as supplements to the classic filibuster used by embattled white provincial elites who prefer to paralyze a federal government they cannot control.

Disenfranchisement. In state after state controlled by Republican governors and legislators, a fictitious epidemic of voter fraud is being used as an excuse for onerous voter registration requirements which have the effect, and the manifest purpose, of disproportionately disenfranchising poor blacks and Latinos. The upscale leaders of the Newest Right also tend to be more supportive of mass immigration than their downscale populist supporters—on the condition, however, that “guest workers” and amnestied illegal immigrants not be allowed to vote or become citizens any time soon. In the twenty-first century, as in the twentieth and nineteenth, the Southern ideal is a society in which local white elites lord it over a largely-nonwhite population of poor workers who can’t vote.

Localization and privatization of federal programs. It is perfectly rational for the white local notables of the South and their allies in other regions to oppose universal, federal social programs, if they expect to lose control of the federal government to a new, largely-nonwhite national electoral majority.

Turning over federal programs to the states allows Southern states controlled by local conservative elites to make those programs less generous—thereby attracting investment to their states by national and global corporations seeking low wages.

Privatizing other federal programs allows affluent whites in the South and elsewhere to turn the welfare state into a private country club for those who can afford to pay the fees, with underfunded public clinics and emergency rooms for the lower orders. In the words of Mitt Romney: “We pick them up in an ambulance, and take them to the hospital, and give them care. And different states have different ways of providing for that care.”

When the election of Lincoln seemed to foreshadow a future national political majority based outside of the South, the local notables of the South tried to create a smaller system they could dominate by seceding from the U.S. That effort failed, after having killed more Americans than have been killed in all our foreign wars combined. However, during Reconstruction the Southern elite snatched victory from the jaws of defeat and succeeded in turning the South into a nation-within-a-nation within U.S. borders until the 1950s and 1960s.

Today the white notables of the South increasingly live in states like Texas, which already have nonwhite majorities. They fear that Obama’s election, like Lincoln’s, foreshadows the emergence of a new national majority coalition that excludes them and will act against their interest. Having been reduced to the status of members of a minority race, they fear they will next lose their status as members of the dominant local class.

While each of the Newest Right’s proposals and policies might be defended by libertarians or conservatives on other grounds, the package as a whole—from privatizing Social Security and Medicare to disenfranchising likely Democratic voters to opposing voting rights and citizenship for illegal immigrants to chopping federal programs into 50 state programs that can be controlled by right-wing state legislatures—represents a coherent and rational strategy for maximizing the relative power of provincial white elites at a time when their numbers are in decline and history has turned against them. They are not ignoramuses, any more than Jacksonian, Confederate and Dixiecrat elites were idiots. They know what they want and they have a plan to get it—which may be more than can be said for their opponents.

    

 

 

US Tax System Captured By The Rich

 

Joshua Holland: When we got into World War II, individuals and families paid 38 percent of federal income taxes and corporations picked up the other 62 percent. Last year, individuals and families paid 82 percent of federal income taxes and corporations paid just 18 percent. How did this happen?

David Cay Johnston: All modern societies require a large public sector to provide the goods and services on which the private sector depends. So you need commonwealth services – education, basic research, statistical gathering and civil law enforcement – a whole host of activities that can only be provided through the public sector.

Now, corporations have a concentrated interest in the taxes they pay and the capacity to lobby for changes and make campaign donations to rent, or in some cases buy, politicians’ votes. Over a long period of time, they saw to it that we changed these tax laws and shifted this burden.

We also dramatically increased the size of the federal government during the period from before World War II until now, and some of the additional money you’re seeing is the result of increased Social Security and Medicare taxes — programs that benefit people generally, and that I think we should look at as efficient buying at the wholesale level rather than retail.

Holland: Unpack that for me, David.

Johnston: Sure. Corporations in the days before World War II were essentially domestic operations, with a few exceptions. And we now have a global economy. And in the global economy, corporations all around the world are going to push to get the lowest tax — or no tax at all in some places — and then use that to pressure the US government to ease their tax burdens.

They’ve also, by the way, put in place innumerable little rule changes involving accounting and depreciation — that is, writing down the value of equipment as it’s being used up — and other things, to reduce their bills.

Holland: And does this help explain why we have a very low tax burden overall, relative to other wealthy countries, but a lot of Americans feel that they’re being taxed to death?

Johnston: Well, one of the reasons some Americans feel they’re being taxed to death is that if you add up our taxes, which are low compared to other modern countries, and then you add in private expenditures for things the tax system pays for in other countries — a lot of our health care costs, higher education costs, admissions and fees and tickets and licenses for a lot of things — lo and behold, we end up being a relatively high-tax country. But it depends on how you analyze the data.

And let me give you one killer figure: We spend so much money on our health care in this country — or as I prefer to think of it, sick care in this country — that for every dollar the other 33 modern economies spend per person for universal coverage, we spend $2.64. And this is done using something called “purchasing parity dollars,” so they’re truly comparable. So we spend $2.64 for each dollar they spend per person, and still have almost 50 million people with no coverage and 30 million with limited coverage, while these other countries spend far less and cover everyone.

Here’s how much that costs: In the year 2010, if we had had the French health care system, which is one of the most expensive in the world, it would have provided universal coverage and it also would have saved us so much money that we could have eliminated the individual income tax that year, all else being equal. Our excess health care costs above those of the French were a little over 6 percent of the economy, and the income tax in 2010 brought in about 6 percent of the economy.

Holland: And Dean Baker at the Center for Economic and Policy Research points out that if we paid the same for health care per person as all of the countries with longer life expectancies, we would have a balanced budget today and be looking at surpluses in the future.

Johnston: We can continue to have this enormous military operation — one that I have been very critical of — we can continue to have that if we just fix the health care problem. So imagine what happens if we get our health care costs in line by doing what every one of our economic competitors has figured out is the cheapest thing to do: universal health care with little or no out-of-pocket expense. And if we then cut back on this enormous military, where we spend 42 percent of all the military spending in the world, we would be able to lower taxes, run surpluses, fund higher education and research that will make us wealthier in the future. It’s just two things we need to address — just two.

Holland: Let’s go back to taxes. In your book, Perfectly Legal, which everybody should read, you showed that it’s not just the top one percent that are taking in so much more income than they did a generation ago, and paying less taxes on that income, but you really have to look at the top tenth of a percent or even the top one hundredth of one percent. Tell us about that.

Johnston: Well, the plutocrat class — that’s the top 16,000 households in this country — is where all the gains have been going since the end of the recession. Thirty-seven cents out of every dollar of increased income between 2009 and 2012 went to these 16,000 households — in a country of 314 million people.

So here’s what the newest data show based on tax returns: The average income of the bottom 90 percent of us has fallen 20 percent below where it was in the year 2000 — it fell from about $36,000 to $30,000. It has fallen back to the level of 1966, when Mustangs were new, Lyndon Johnson was president and we were prosecuting a war in Vietnam. 1966.

And what happened to the 1 percent of the 1 percent? Well, their income was about $5 million a year back then on average and now it’s $23 million a year on average.

Now it’s important to add a point: This is how it’s measured by the tax system. Very, very wealthy people — Warren Buffett, hedge fund managers, Mitt Romney when he ran a private equity fund — are not required to report most of their economic gains, and legally they can literally live tax-free or nearly tax-free by borrowing against their assets. You can borrow these days, if you’re very wealthy, against your assets for less than 2 percent interest, and the lowest tax rate you could pay is 15 percent. So no wealthy person with any sense of good economics will pay taxes if they can borrow against their assets. Now you and I can’t do that because our assets aren’t worth that much, but if you’re a billionaire and you borrow, let’s say, $10 million a year to live on, you pay $200,000 interest, but your fortune through investing grows by $50 million. At the end of the year you pay no taxes, your wealth is up almost $40 million and your cost was just the interest of $200,000.
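
To make that arithmetic concrete, here is a minimal sketch in Python using only the hypothetical figures above (a $10 million loan at roughly 2 percent against a fortune growing by $50 million); it illustrates the logic of the strategy, not real tax data:

# Borrow-against-assets arithmetic, using the interview's hypothetical numbers.
investment_gain = 50_000_000  # the billionaire's fortune grows by $50 million
borrowed = 10_000_000         # cash borrowed to live on; loans are not taxable income
interest = borrowed * 0.02    # $200,000 a year at the ~2 percent rate quoted above
taxes = 0                     # no assets are sold, so no capital gains are realized

net_change = investment_gain - borrowed - interest - taxes
print(f"Net wealth change: ${net_change:,.0f}")  # $39,800,000, i.e. almost $40 million

Selling $10 million of assets instead would realize gains taxed at no less than 15 percent (roughly $1.5 million if most of the sale is gain), so the $200,000 of interest is by far the cheaper way to fund a year of living.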

Holland: Amazing. How much have changes to the tax code had to do with the sky-high level of inequality we see today?

Johnston: Oh, I believe that the Reagan-inspired changes in the tax code are absolutely fundamental to this enormous growth of inequality. When I went to The New York Times in 1995 and I started writing about inequality, there were a lot of people who thought I was some far-out radical. The editors got lots of calls and letters and complaints, and I just kept telling them, “Just watch the data — it’s going to show this will get worse.” And the reason is that if you listened to what Reagan and his supporters said in the 1980 election, it was clear that it would lead to greater inequality. Here’s the way to think about it: The bottom 90 percent of us are actually paying slightly higher federal taxes, as a share of our income, than we did back in 1961. But the top 400 taxpayers in the country are paying 60 percent less, and their incomes have grown so much that they’re making 35 times as much after-tax income because of higher incomes and lower taxes.

Now imagine you’re able to save money, and let’s say you make enough money to save $1,000 a year. If we cut your tax bill, which was $1,000, in half to $500, you can now save $1,500 a year. Pretty soon, you’ve got a snowball that’s getting bigger and bigger.
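
A quick way to see the snowball is to compound the two savings rates. This is an illustration only; the 5 percent annual return and 30-year horizon are assumptions chosen for the example, not figures from the interview:

def future_value(annual_saving, rate=0.05, years=30):
    # Value of saving `annual_saving` every year, compounded annually at `rate`.
    total = 0.0
    for _ in range(years):
        total = (total + annual_saving) * (1 + rate)
    return total

print(f"Saving $1,000 a year: ${future_value(1_000):,.0f}")  # about $70,000
print(f"Saving $1,500 a year: ${future_value(1_500):,.0f}")  # about $105,000

The halved tax bill frees up only $500 a year, but three decades of compounding turn it into roughly $35,000 of extra wealth; scale that up to a 60 percent tax cut on multimillion-dollar incomes and the snowball grows at the avalanche rates Johnston describes below.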

Well, people at the very top have had their taxes cut 60 percent and they can’t spend all the money they’re making anyway. Nobody can consume a billion dollars a year unless they gamble it away. You can’t even consume that much in drugs — it will kill you for sure. You can’t consume that much entertaining mistresses — I don’t care if you’re a 21-year-old athlete in great shape — the body won’t allow it. The only way you can consume that much money is by gambling.

And so even — if you’re a billionaire — even with a jet and mansions and artwork, you can’t consume that kind of money. So what happens is this snowball grows at avalanche rates and that’s why we’re seeing this enormous build-up at the top.

And unfortunately, we’ve created a society now where we measure people not the way Martin Luther King said we should in his 1963 speech — “I have a dream that one day my four children will be judged not by the color of their skin, but the content of their character” — we judge people now by the presumed content of their wallets. We have replaced character with commas — it takes two to be a millionaire, three to be a billionaire, and that’s the measure we’re applying to people.

And people who have three commas, well, that’s not enough, there’s never enough. Money is like — as Richard Pryor once said about cocaine — too much is never enough. And so you have this sense of entitlement at the very top that you’re entitled to all of this money even though it’s not doing anything productive — it isn’t improving the quality of your life and it is actively damaging the lives of your fellow Americans.

Holland: My grandmother is obsessed with the question of why people who have billions want more. She can’t understand it.

Johnston: It’s very easy to understand — it’s a status thing. So you’re rich enough to own a one-eighth interest in a little Honda jet, but the guy down the road, he has his own Honda jet, and the guy down the road from him, he has a Gulfstream, and the guy down the road from him, he owns a jumbo jet, and the guy down the road from him, Sheldon Adelson, the guy who kept Newt Gingrich’s campaign alive, owns two personal 747s, one of which is equipped for skateboarding in the sky by his youngest heirs. Oh, I’m sorry, you think you’re well off because you own a jet? The guy down the street’s got two 747s, and by the way, I understand that there’s a private order in place for an Airbus A380. Gee, my yacht is only 350 feet. Then you announce that you’re building the world’s biggest yacht, and somebody then says, “No, I’m building a 410-foot yacht.” This is meaningless consumption in terms of making the world economy any better, but for egos, oh it’s, “Mine’s bigger than yours.”

Holland: Speaking of Mitt Romney, he famously said that 47 percent of American families pay no income taxes. But the federal income tax is fairly progressive, and it raises roughly the same amount of revenue as the much more regressive payroll taxes, each supplying about 40 percent of the government’s income. Similarly, at the state and local level, the poor pay a much higher share of their income in taxes than the rich: the poorest 20 percent of the population pay over 11 percent of their incomes in state and local taxes, while the top 1 percent pays half that rate. Why do people focus on this one tax, the federal income tax, which generates about 40 percent of the federal government’s income?

Johnston: Well, because the anti-tax crowd for a hundred years has been trying to get rid of progressive income taxes, and they have distorted and lied, and we have what is politely called a low-information voter named Mitt Romney who made it demonstrably clear during the campaign that he had no idea what’s actually going on in the country.

By the way, do you know why 47 percent of Americans in one year — it’s no longer true, it’s going to drop back to the high 30s — but do you know why so many Americans do not pay federal income taxes?

Holland: Tell me.

Johnston: The leading reason for the increase is the child tax credit, a Republican policy put in place in the ’90s. So a married couple with two children does not pay any federal income taxes until they make at least $44,000 a year. And with a little bit of tax planning and a 401(k) plan and some other things, you could make $70,000 and pay no federal income tax. So the Republicans created this situation where middle-income families with children pay no income tax, and then they complain about it. And nobody but me has called them on this.

Holland: I have!

Johnston: Okay, well, let me rephrase that. Only a handful of us have called them on this. You certainly are not hearing it on the network news and the front pages of the major newspapers.

Holland: No, you’re not.

David, final question. According to the Tax Justice Network, “$32 trillion has been hidden in small island banking hubs which host a bevy of trust funds, shell corporations and other tax havens.” That’s not just American dollars — it’s a global figure.

How much tax revenue are we losing here and what would our budget picture look like if this weren’t the case?

Johnston: The bottom line: we are not serious about high-end tax cheating in America. That’s one of the major problems. If you’re an ordinary worker, we take your taxes out of your paycheck before you get the money, which means that Congress doesn’t trust you. But if you’re a business owner, an investor, a landlord, then Congress trusts you to report your income, subject to audit, which is highly unlikely. And if you’re smart, your books are so complicated and the audit budgets are so small that unless you were blatant and stupid, you won’t get caught.

We had the case a few years ago of a fellow who, public records at the Securities and Exchange Commission showed, had made $2-plus billion and filed no income tax return, and the IRS was unaware of him.

We could solve our budget problems with the two things that I mentioned: getting a health care system that’s modern and efficient and reduces cost by about six percentage points of the economy while covering everybody; and two, just scaling back a little bit on the military that we operate as if we were going to go to war with Russia, the old Soviet Union, which we’re not.

And if we then made it a priority to make sure that the tax laws apply equally to everybody — we could probably cut everybody’s tax rates if we did that. But there’s no stomach on Capitol Hill for going after rich tax cheats who are sophisticated and smart, and so they get away with it.

    

 

 

Climate Change Persuasion

 

WHEN scholars of the future write the history of climate change, they may look to early 2008 as a pivotal moment. Al Gore's film An Inconvenient Truth was bringing the science to the masses. The economist Nicholas Stern had made the financial case for tackling the problem sooner rather than later. And the Intergovernmental Panel on Climate Change (IPCC) had just issued its most unequivocal report yet on the link between human activity and climatic change.

The scientific and economic cases were made. Surely with all those facts on the table, soaring public interest and ambitious political action were inevitable?

The exact opposite happened. Fast-forward to today, the eve of the IPCC's latest report on the state of climate science, and it is clear that public concern and political enthusiasm have not kept up with the science. Apathy, lack of interest and even outright denial are more widespread than they were in 2008.

How did the rational arguments of science and economics fail to win the day? There are many reasons, but an important one concerns human nature.

Through a growing body of psychological research, we know that scaring or shaming people into sustainable behaviour is likely to backfire. We know that it is difficult to overcome the psychological distance between the concept of climate change – not here, not now – and people's everyday lives. We know that beliefs about the climate are influenced by extreme and even daily weather.

One of the most striking findings is that concern about climate change is not only, or even mostly, a product of how much people know about science. Increased knowledge tends to harden existing opinions (Nature Climate Change, vol 2, p 732).

These findings, and many more, are increasingly available to campaigners and science communicators, but it is not clear that lessons are being learned. In particular, there is a great deal of resistance towards the idea that communicating climate change requires more than explaining the science.

The IPCC report, due out on 27 September, will provide communicators with plenty of factual ammunition. It will inevitably be attacked by climate deniers. In response, rebuttals, debunkings and counter-arguments will pour forth, as fighting denial has become a cottage industry in itself.

None of it will make any real difference. This is for the simple reason that the argument is not really about the science; it is about politics and values.

Consider, for example, the finding that people with politically conservative beliefs are more likely to doubt the reality or seriousness of climate change. Accurate information about climate change is no less readily available to these people than anybody else. But climate policies such as the regulation of industrial emissions often seem to clash with conservative political views. And people work backwards from their values, filtering the facts according to their pre-existing beliefs.

Research has shown that people who endorse free-market economic principles become less hostile when they are presented with policy responses which do not seem to be as threatening to their world view, such as geoengineering. Climate change communicators must understand that debates about the science are often simply a proxy for these more fundamental disagreements.

Some will argue that climate change discourse has become so polluted by politics that we can't see the scientific woods for the political trees. Why should science communicators get their hands dirty with politics? But the solution is not to scream ever louder at people that the woods are there if only they would look properly. A much better, and more empirically supported, answer is to start with those trees. The way to engage the public on climate change is to find ways of making it resonate more effectively with the values that people hold.

My colleagues and I argued in a recent report for the Climate Outreach and Information Network that there is no inherent contradiction between conservative values and engaging with climate change science. But hostility has grown because climate change has become associated with left-wing ideas and language.

If communicators were to start with ideas that resonated more powerfully with the right – the beauty of the local environment, or the need to enhance energy security – the conversation about climate change would likely flow much more easily.

Similarly, a recent report from the Understanding Risk group at Cardiff University in the UK showed there are some core values that underpin views about the country's energy system. Whether wind farms or nuclear power, the public judges energy technologies by a set of underlying values – including fairness, avoiding wastefulness and affordability. If a technology is seen as embodying these, it is likely to be approved of. Again, it is human values, more than science and technology, which shape public perceptions.

Accepting this is a challenge for those seeking to communicate climate science. Too often, they assume that the facts will speak for themselves – ignoring the research that reveals how real people respond. That is a pretty unscientific way of going about science communication.

The challenge when the IPCC report appears, then, is not to simply crank up the volume on the facts. Instead, we must use the report as the beginning of a series of conversations about climate change – conversations that start from people's values and work back from there to the science.

    

 

 

The Tea Party Mind

 

If you want to understand how American politics changed for the worse, according to moral psychologist and bestselling author Jonathan Haidt, you need only compare two quotations from prominent Republicans, nearly fifty years apart.

The first is from the actor John Wayne, who on the election of John F. Kennedy in 1960 said, "I didn't vote for him, but he's my president and I hope he does a good job."

The second is from talk radio host Rush Limbaugh, who on the inauguration of Barack Obama in 2009 said, "I hope he fails."

The latter quotation, Haidt explains in the latest episode of Inquiring Minds, perfectly captures just how powerful animosity between the two parties has become—often overwhelming any capacity for stepping back and considering the national interest (as the shutdown and debt ceiling crisis so unforgettably showed). As a consequence, American politics has become increasingly tribal and even, at times, hateful.

And to understand how this occurred, you simply have to look to Haidt's field of psychology. Political polarization is, after all, an emotional phenomenon, at least to a large degree.

Jonathan Haidt thinks our political views are a by-product of emotional responses instilled by evolution.

"For the first time in our history," says Haidt, a professor at NYU's Stern School of Business, "the parties are not agglomerations of financial or material interest groups, they're agglomerations of personality styles and lifestyles. And this is really dangerous. Because if it's just that you have different interests, that doesn't mean I'm going to hate you. It just means that we've got to negotiate, I want to win, but we can negotiate. If it's now that 'You people on the other side, you're really different from me, you live in a different way, you pray in a different way, you eat different foods than I do,' it's much easier to hate those people. And that's where we are."

Haidt is best known for his "moral foundations" theory, an evolutionary account of the deep-seated emotions that guide how we feel (not think) about what is right and wrong, in life and also in politics. Haidt likens these moral foundations to "taste buds," and that's where the problem begins: While we all have the same foundations, they are experienced to different degrees on the left and the right. And because the foundations refer to visceral feelings that precede and guide our subsequent thoughts, this has a huge consequence for polarization and political dysfunction. "It's just hard for you to understand the moral motives of your enemy," Haidt says. "And it's so much easier to listen to your favorite talk radio station, which gives you all the moral ammunition you need to damn them to hell."

To unpack a bit more what this means, consider "harm." This moral foundation, which involves having compassion and feeling empathy for the suffering of others, is measured by asking people how much considerations of "whether someone cared for someone weak and vulnerable" and "whether or not someone suffered emotionally" factor into their decisions about what is right and wrong. Liberals score considerably higher on such questions. But now consider another foundation, "purity," which is measured by asking people how much their moral judgments involve "whether or not someone did something disgusting" and "whether or not someone violated standards of purity or decency." Conservatives score dramatically higher on this foundation.

How does this play into politics? Very directly: Research by one of Haidt's colleagues has shown, for instance, that Republicans whose districts were "particularly low on the Care/Harm foundation" were most likely to support shutting down the government over Obamacare. Why?

Simply put, if you feel a great deal of compassion for those who lack health care, passing and enacting a law that provides it to them will be an overriding moral concern to you. But if you don't feel this so strongly, different moral concerns can easily become paramount. "On the right, it's not that they don't have compassion," says Haidt, "but their morality is not based on compassion. Their morality is based much more on a sense of who's cheating, who's slacking.”

For Haidt, the political moment that perfectly captured this conservative (and Tea Party) morality—while simultaneously showing how absolutely incomprehensible it is to those on the left—was Wolf Blitzer's famous gotcha question to Ron Paul during a 2011 Republican presidential debate. Blitzer asked Paul a hypothetical question about a healthy, 30-year-old man who doesn't get health care because he doesn't think he needs it, but then winds up in a serious medical situation. When Blitzer asked Paul whether society should just "let him die," there were audible cheers and cries of "yeah" from the audience—behavior that was appalling to care-focused liberals, but that is eminently understandable, under Haidt's paradigm, as an emotional outburst based on a very different morality.

"My analysis is that the Tea Party really wants [the] Indian law of Karma, which says that if you do something bad, something bad will happen to you, if you do something good, something good will happen to you," says Haidt. "And if the government interferes and breaks that link, it is evil. That I think is much of the passion of the Tea Party.”

In other words, while you may think your political opponents are immoral - and while they probably think the same of you - Haidt's analysis shows that the problem instead is that they are too moral, albeit in a visceral rather than an intellectual sense.

As a self-described centrist, Haidt sometimes draws ire from the left for comments about how liberals don't understand their opponents, and about how conservatives have a broader range of moral emotions. But he certainly doesn't claim that when it comes to political animosity and the polarization that we now live under, both sides are equally to blame. "The rage on the Republican side is stronger, the Republicans have gotten much more extreme than the Democrats have," Haidt says.

The data on polarization are as clear as they are disturbing. Overall, feelings of warmth towards members of the opposite party are at terrifying lows, and Congress is perhaps more polarized than it has been in the entire period following the Civil War. But this situation isn't the result of parallel changes on both sides of the aisle. "The Democrats, the number of centrists has shrunk a bit, the number of conservative Democrats has shrunk a bit, but it's not that dramatic, and the Democratic party, certainly in Congress, is a mix of centrists, moderately liberal and very liberal people," says Haidt. "Whereas the Republicans went from being overwhelmingly centrist in the '50s and '60s, to having almost no centrists."

And of course, the extremes are the most morally driven, the most intense.

From the centrist perspective, Haidt recently tweeted that "I hope the Republican party breaks up and a new party forms based on growth, not austerity or the past."

"This populist movement on the right," he says, is "sick and tired of the allegiance with business." And more and more, business feels likewise, especially after the debt ceiling and shutdown disaster.

"I think this gigantic failure might be the kind of kick that some reformers need to change how the Republicans are doing things," says Haidt. "That's my hope, at least."

    

 

 

Free Speech

 

Restrictions on free speech nearly always spread, becoming tools of the intolerant and the illiberal.

It’s more than 30 years since the death of Robert McKenzie and you may have forgotten who he is, if you ever knew. Yet when, as a teenager, I met the presenter of the BBC election programmes he was the most famous person I’d ever encountered. He was destined to play an important role in my life.

When I first came to the LSE as an undergraduate I was surprised to find that, in such a political place, there wasn’t a debating society. I decided to set one up. I’d invite some big figures. It would be fun. I had, for instance, the idea of asking Roy Jenkins, Clive Jenkins and Patrick Jenkin to debate “Which Jenkin(s) for Britain?”.

The trouble was, I didn’t know any politicians or public figures. However, I did pass the office of Bob McKenzie, an LSE sociology professor, on my way to the cafeteria every day and I figured, well, he knew everyone. Perhaps he could help. So I went to see him. And this is when the professor changed my life. He said no. No, he couldn’t help. Or at least he could, but he wouldn’t.

Actually, it wasn’t as straightforward as that sounds. He was encouraging, but also tough. He said that the LSE student union had adopted a policy of banning speakers from campus if they were “racist or sexist”. As a result, the leading Conservative Sir Keith Joseph had been hounded out of the school. The professor wasn’t going to invite any of his friends to the LSE until he could be certain that wouldn’t happen. If I wanted his assistance, I would have to persuade the student union to drop the policy.

So began my commitment to what has turned out to be a lifelong cause. I saw the professor’s point immediately and began a campaign for free speech. In time, this took me out of the Left altogether and I never went back. Before the campaign made its final breakthrough, LSE students had banned the Home Office minister Timothy Raison (immigration control was racist), the Hot Gossip dance troupe (racist and sexist) and the Israeli Ambassador (Zionism is racism). Very sadly, when victory was already in sight, Bob McKenzie died.

Yet in the end, we won. LSE students voted down the policy. And in 1986 the Government included a section in the Education Bill that guaranteed free speech in universities, outlawing such bans in future.

Unfortunately, the issue has never quite gone away. From time to time someone in a university somewhere takes it into their head to start banning things again. And such was the case a little more than a week ago, in, wouldn’t you just know it, the LSE.

At the Freshers’ Fair, the students running the Atheist Society stall wore T-shirts displaying a well-known satirical cartoon strip called Jesus and Mo. The identity of the characters is not explicit, but the strip pokes fun at religion. The humour isn’t crass but it is pretty pointed. Someone — perhaps more than one person — complained. The Student Union called in the school authorities, the school then called in security. The T-shirts had to be covered up or removed.

Small incident though this may appear to be, I think it teaches some big lessons. It is important in itself in terms of free expression in universities, and it is important beyond universities.

To start with, it teaches something that ought to be obvious, but somehow isn’t. The people who end up in control when freedom is restricted are those people who do not like freedom.

When I was at college, the Left always argued that their bans protected freedom. They were trying to keep out fascists. This sounded well-meaning (although I don’t think it was) but whatever the intention, it was never going to have a benign effect. Bans become tools for intolerant people to prevent tolerant people from expressing themselves.

There was a sort of grim inevitability about the fact that when bans returned to the LSE, it would be fundamentalists oppressing liberals.

The second big lesson is more important still. The LSE authorities were aware of their responsibility to protect free speech under the 1986 Act. However Section 26 of the Equality Act 2010 seemed to put the school under another obligation, one that clashed with the protection of free speech.

They felt constrained under the Equality Act to prevent harassment. The legislation defines harassment as an act that might violate someone’s dignity or create a hostile environment “related to a protected characteristic”. One such protected characteristic is “religion”. The school judged that in the context of an event welcoming all students, the T-shirts breached the Act. They had to go.

I think this judgment was horribly mistaken, but it wasn’t malicious. And that is why it teaches a lesson. It shows how law works.

The balance between the 1986 and 2010 Acts wasn’t weighed by lawyers using case law and thinking about Parliament’s intentions. It was probably made by a professor acting in a hurry in response to a student complaint. He may have mixed in his political ideas, a desire to calm things down and a splash of legal awareness and reached a decision.

I don’t suppose, if they had debated the question, Parliament would ever have reached the absurd, even sinister, conclusion that the law should forbid students to wear the T-shirts. But they hadn’t debated the question, they’d just left broad gauge laws for laymen to interpret. As was bound to happen, the interpretation ended up being excessively cautious, and insufficiently sensitive to the freedom to be a nuisance.

What happened at the LSE matters because of its implications for academic freedom and free expression in the public square. It would be a disaster for free speech if other authorities, not just universities but theatres and councils and others, followed this lead. Perhaps some already do. If so then Parliament will have to revisit the 2010 Act to make sure they do not.

And it matters, too, because of the debate on press freedom. Those who want a “dab of statute”, just a tiny bit you know, nothing to worry about, think they are the knights in shining armour, the defenders of the weak. In the end, however, restrictions on freedom of speech always spread, becoming the tool of the intolerant and the enemy of liberal engagement. The idea that because the originators feel themselves well-meaning the result will be benign is awe-inspiringly naive. Nor will a press law remain strictly within its original boundaries. Laws don’t work like that. They are interpreted by laymen who want a quiet life. They are used by agitators who seize their chance. They are expanded by judges trying to solve legal puzzles and extend their own discretion. They drift and with it freedom of expression drifts away.

This sort of drift can’t be prevented entirely. Yet you can guard against it. The right thing to do is to side always with free speech against encroachments upon it, even from the well-meaning. That is the lesson I learnt from Bob McKenzie all those years ago.

    

 

 

The Tea Party and End Times

 

Want to know why the Tea Party is so eager to grievously wound the Republican Party? The answer is as simple as it is counterintuitive: its leaders view themselves as modern prophets of the apocalypse.

In the aftermath of the great government shutdown of 2013, the Tea Party continues to cause heartburn for establishment Republicans. Consider the results of last week's elections, which offer clues to the internecine GOP battles that lie ahead. Although it's much too early to draw hard conclusions, Chris Christie proved that a moderate, common-sense Republican could win in deep blue New Jersey, while in purple Virginia the wild-eyed social reactionary Ken Cuccinelli failed to gain traction outside his uber-conservative Christian-right base.

Yet the Tea Party is willing to defy overwhelming negative public opinion, wreck the government, risk plunging the world economy into chaos and invite political defeat. The driving force behind this destructive strategy is that Tea Party zealots answer to a 'higher calling.'

They believe America teeters on the brink of destruction, and hold as an article of faith that liberals, gays, Democrats, atheists and the United Nations are to blame. This 'end-times' world-view is a foundational precept of the evangelical movement, from which many of the so-called Tea Party favorites spring. Scholars call it apocalypticism.

Of course, the Tea Party is not just composed of members of the Christian right. Many are genuine libertarians. Some nurse an unreconstructed Confederate grudge, while others harbor a thinly disguised racism. However, the real energy, the animating force for the movement comes from evangelicals, of whom Ted Cruz, Michele Bachmann and Sarah Palin are the most strident. These are the modern-day 'apocalyptic prophets.'

Although the issues are secular, the prophets' anti-Obamacare rhetoric rings with religious, end-times cadences. So to understand why they invoke chaos, we need to know where their ideas about an 'apocalypse' came from.

Most theologians, including the revered Albert Schweitzer, believe John the Baptist and Jesus of Nazareth were Jewish apocalypticists. Simply put, these first-century prophets believed they were living in the 'end times' before God would send his representative, the 'Son of Man' (taken from a rather obscure passage in the Book of Daniel), to overthrow the forces of evil and establish God's justice on earth. Apocalypse literally translates as 'the revealing' of God’s will. For these early prophets the Kingdom of God was not to be a church, but a military and political kingdom on earth.

Lest this sound far-fetched to modern ears, listen to our modern Tea Party prophets in their own words:

"You know we can't keep going down this road much longer. We're nearing the edge of the cliff . . . We have only a couple of years to turn this country around or we go off the cliff to oblivion!" - Ted Cruz at the Values Voters Summit, Oct. 11

". . . I'm a believer in Jesus Christ, as I look at the End Times scripture, this says to me that the leaf is on the fig tree and we are to understand the signs of the times, which is your ministry, we are to understand where we are in God’s End Times history. Rather than seeing this as a negative . . . we need to rejoice, Maranatha Come Lord Jesus, His day is at hand. And so what we see up is down and right is called wrong, when this is happening, we were told this, that these days would be as the days of Noah. We are seeing that in our time." - Michelle Bachmann, Oct. 5, 2013

"And this administration will been [sic] complicit in helping people who wants [sic] to destroy our country." – Louie Gohmert on the floor of the U.S. House

"The biggest war being waged right now is against our religious liberties and traditional values." - Rep. Tim Huelskamp, Values Voters Summit

"The fight for religious freedom starts here at home because we are one nation under God." - House Majority Leader Eric Cantor, Values Voters Summit

For these apocalyptic prophets, the issues aren't even political anymore; they're existential, with Obamacare serving as the avatar for all evil. In this construct, any compromise whatsoever leads to damnation, and therefore the righteous ends justify any means.

Much of the prophets' message is couched in populist language. It sounds familiar to us because we've heard it all before. Historically whenever our country has experienced economic stress an angry, reactionary vein of populism surfaces. Sometimes called 'Jacksonian,' this common thread actually reaches back to the American Revolution, then to Shay's Rebellion, through Jackson's 'Augean Stables' to William Jennings Bryan's rants against science in the Scopes 'Monkey Trial.' It includes 'Know-Nothings,' Anti-Masons and Huey Long's 'Every Man a King.' George Wallace stood in the schoolhouse door and Ross Perot sabotaged George Bush the Elder's re-election. Except for Andrew Jackson, each burst of populist fervor ended badly.

Our modern prophets are fundamentally different. Their dogma springs from Pat Robertson and Jerry Falwell, through James Dobson's Family Research Council, to the eerily omnipresent Fellowship and its C Street house.

Ted Cruz's father, Raphael was seen in recently uncovered videos calling for America to be ruled by 'kings' who will take money from anyone who is not an evangelical Christian and deliver it into the hands of fundamentalist preachers and their acolytes. This is a movement is called 'Christian Dominionism,' and it has many adherents the evangelical right. It is also obviously and dangerously anti-democratic. These new apocalyptic prophets, and the demagogues who profit (pun intended) from them, see themselves locked in mortal combat against the Anti-Christ in a fight for America's soul - and wealth.

Now if you are battling the forces of evil for the very survival of the nation, there can be no retreat, no compromise, and no deals. Like the Jewish zealots at Masada, these believers would rather commit glorious suicide than make peace with the devil. There can be no truce with the Tea Party because its apocalyptic zealots can never take 'yes' for an answer.

Since the apocalyptists cannot compromise, they must be beaten. President Obama and congressional Democrats seem to have finally grasped this fact, and are learning how to deal with them. By refusing to knuckle under to extortion in the government shutdown drama, Obama exposed their reckless radicalism and won resoundingly.

But Democrats can't solve this problem alone. To bring any semblance of order back to the American political system and restore a functioning two-party system, the GOP has to find its own equilibrium. Thankfully, this process has already begun.

Establishment Republicans, corporate CEOs and Wall Street moguls stand appalled at the Tea Party monster they helped to create. Formerly cowed into silence, they are beginning to see the handwriting on the wall and speak out against the self-destructive zealots.

In conservative Virginia, Ken Cuccinelli was largely abandoned by the GOP establishment. Many Republican leaders even went so far as to endorse the Democrat, Terry McAuliffe. Unable to raise significant money from the Republican establishment, Cuccinelli was outspent more than ten-to-one. While Virginians rejected a conservative who believes government's role is to regulate morality, New Jersey voters chose a conservative Republican who believes government has a constructive, practical role to play.

The contrast is striking - and instructive. Until Republicans slug it out among themselves and decide which kind of party they want to be, we will continue to lurch from crisis to crisis.

This family fight will not be easy or bloodless. The Tea Party represents roughly one-half of the Republican base. They love Cruz, Palin and the chorus of other voices crying in the wilderness. They are unified in their hatred of Obama, and they are organized down to the precinct level. More importantly, they despise the moderate voices in their own party.

Gerrymandered congressional districts guarantee many safe Tea Party seats. Powerful think tanks and advocacy groups like The Heritage Foundation, the Chamber of Commerce, American Enterprise Institute and others, which in years past underpinned the Republican establishment, are now heavily invested in the right-wing agenda and will not be easily co-opted. Deep-pocketed militants like the Koch brothers will keep the cash flowing, and right-wing talk radio-heads will whip up the aggrieved faithful.

It's almost impossible to predict how this family fight will end, but there are at least two possible outcomes: First, the pragmatists win. The Grand Old Party could be led out of the wilderness by a charismatic figure a la Chris Christie, who is viewed as a straight-talking, practical problem-solver. Any such leader will have to arise outside Washington. The pragmatists' backers would include big business, Wall Street, the military-industrial complex, GOP lobbyists and a plethora of wealthy patrons who can't afford any more Tea Party shenanigans.

They have a strong case. Moderates have won some dramatic conservative victories over the years, delivering massive tax cuts, reforming welfare, de-regulating Wall Street, diluting Roe v. Wade, reviving federalism with block grants and reshaping today's conservative Supreme Court.

Second, the hard-liners revolt. The party splinters, and out of the wreckage a new center-right 'Whig Party' emerges. This is not so far-fetched as it may seem. A recent bipartisan poll by NBC and Esquire Magazine reveals a broad middle: 51% of Americans view themselves as centrists, not deeply invested in either party.

Not surprisingly, these moderates have both liberal and conservative views. 64% support gay marriage, 63% support abortion in the first trimester, 52% support legalizing marijuana, and they support a strong social safety net by wide margins. But 81% support offshore drilling, 90% support the death penalty and 57% are against affirmative action. So a new moderate coalition might well attract significant support from the moderate middle, establishment Republicans, Independents and centrist Democrats too.

Unfortunately for the apocalyptic prophets, only 29% of the moderate middle think churches or religious organizations should have any role at all in politics. So like the prophets of old, they seem fated to join that long sad procession of failed zealots and martyrs who were overwhelmed by hard reality and their own rigid dogma.

    

 

 

Long term Effects on Voting Preferences

 

Republicans who worry about their party’s long-term prospects tend to focus on the ways the party’s leaders have been alienating Latinos and women. But newly published research suggests the GOP may have an even bigger problem: The generation that is just emerging.

In “Growing Up in a Recession,” published in the Review of Economic Studies, economists Paola Giuliano and Antonio Spilimbergo report that people who experienced a recession “during the critical years of early adulthood” generally “support more government redistribution, and tend to vote for left-wing parties.”

After examining three different data sets, they conclude “the effect of recessions on beliefs is long-lasting.” According to their analysis, coming of age in a lousy economic environment breeds the belief that “success in life depends more on luck than effort,” which in turn leads to more support for social welfare policies.

Giuliano and Spilimbergo first analyzed data from the General Social Survey, a nationally representative sample that gathers information from about 1,500 Americans every other year. They compared answers indicating ideology, political affiliation, and beliefs regarding assistance to the poor with regional recessions that took place during the respondents’ young adult years.

They found that experiencing a recession during young adulthood “affected the probability of voting for the Democratic Party in a sizeable manner”—as much as 15 percent for people who were young during the Great Depression.

That same pattern was found in an analysis of the World Values Survey, which includes data from 37 countries. The researchers found “a positive association between experiencing a macroeconomic disaster when young and both left-wing ideology and affiliation with a left-wing political party.”

“There is no systematic effect of macroeconomic shocks on preferences for redistribution or political behavior during other periods of life,” they add.

The likely reasons for this are clear enough. If you come of age at a time when traditional values such as hard work and perseverance don’t necessarily pay off, you’re imprinted with a sense that economic success is fragile and unpredictable. And that increases the odds you’ll develop a belief system that prioritizes assistance for the needy.

That said, there are no guarantees that the next generation will vote straight Democratic. When an earlier, unpublished draft of this paper was circulating back in 2009, conservative columnist Ross Douthat replied that recessions “only benefit liberals when an activist government is perceived to have answers to the crisis.” Whether the Obama administration qualifies is TBD.

Either way, the study provides more evidence that early-life developments shape political attitudes—a phenomenon that has also been observed in relation to the Vietnam War draft. As Giuliano and Spilimbergo put it: “Shocks experienced during early adulthood have a permanent effect in the formation of beliefs.”

    

 

 

Genghis Khan and Mongols

 

A conventional historical narrative holds that the rise and expansion of the Mongol Empire -- first under Genghis Khan and, later, his progeny and successors -- were propelled by a deteriorating climate in the Mongolian steppe. Fleeing drought, the narrative runs, Genghis Khan's Golden Horde pushed west, south and east in a bid for expansion that would someday form the world's largest contiguous empire.

Indeed, climate records indicate that the arid, landlocked steppe was seized by decades of drought in the late years of the 12th century, possibly exacerbating the violent conflicts that racked the region at the time. It was into such a world that Genghis Khan was born and rose to power, crushing his rivals and uniting the fractured Mongolian tribes under his own horsehair banner.

But a new study of centuries-old tree core samples indicates an abrupt turn of the weather around the first decade of the 13th century. According to the findings of a team of scientists, published recently in the Proceedings of the National Academy of Sciences, the region appears to have entered a period of uncharacteristically strong rainfall around the year 1211, ushering in a decadelong spell of heavy rain the likes of which Mongolia has not seen since.

The vast surplus of livestock and crops brought on by such advantageous conditions may have played a critical role in supporting the Great Khan's centralized authority and military ventures, at least in the early years of global conquest, the researchers believe.

"This is a new kind of thought," said Nicola Di Cosmo, a historian at the Institute for Advanced Study in New Jersey and co-author of the report. "If we can prove that you need a certain amount of productivity of land to support an expanded political establishment, we can start to explain why it lasted so long."

The team's findings may also, in time, help scientists and historians understand how humans have responded to abrupt climate shocks in the past, and how they may do so again in the future.

History in the trees

In 2010, forest scientists Amy Hessl and Neil Pederson were driving along the Orkhon Valley, in central Mongolia, where eight centuries earlier Genghis Khan had established his capital of Karakorum. The scientists had received a National Science Foundation grant to study the possible future impacts of climate change on Mongolian wildfires, and were looking for tree core samples to read what they could of the past record.

"The kinds of trees that are good for establishing past climate tend to be in places that are extremely water-stressed," said Hessl, a tree-ring scientist at West Virginia University. Such trees tend to be porous and take up moisture readily, she said. They also grow slowly and are less susceptible to rot.

North of the ruined Karakorum, they found what they were looking for: gnarled Siberian pines jutting up from a 7,000-year-old lava field. Samples from their first foray spanned more than six centuries; later expeditions would yield tree rings more than a millennium old.

Unique among all climate records, tree rings can date events with almost calendar accuracy. A new ring grows on a tree every cycle of the seasons, meaning each ring stands for a particular year. In wet years, the rings are fat; in lean years, thin.

"No other proxy can tell you, with annual accuracy, when an event took place," said Hessl. The samples "allow us to tie in our data with human time scales, with what we believe to have been happening at that point in history."

The tree rings tell a story of severe drought in the 1180s, one of the worst seen in the central Mongolian climate record that the team would ultimately plot. That dryness persisted into the first decade of the next century. The year 1211 showed a jump in average rainfall, only to see levels drop again the next year.

Then, in 1214, rainfall appears to have risen above the mean and remained there, persistently, for the next 12 years. The effect on the landscape may well have been transformative: lusher, more abundant grasses capable of swelling the Mongols' horse and livestock populations to previously unseen numbers.

Forged in drought, fueled by rain

"It's the longest pluvial we've seen going back 800 years," said Pederson, a tree-ring scientist at Columbia University's Lamont-Doherty Earth Observatory. Still, he cautioned, with further findings, the exact shape of the phenomenon may yet change.

"The [pluvial] feature was present in the first two trees we sampled," he said. Adding in later samples, he said, "the shape and length has changed, but it remains a strong recognizable feature."

Much of what can be inferred about the relationship between climate and the Mongols' rise is still speculation, although the team is pursuing its theories with a range of ongoing research. However, many features of the two stories seem to fit well together, said Di Cosmo.

The late 12th century was a period of protracted conflict, internecine warfare and revenge killings between the large Mongolian clans, a time when previous political orders were shattered and a new militaristic, centralized order emerged under Genghis Khan.

That severe drought and resulting resource scarcity may have played a role in this upheaval is an idea historians need to take seriously, he said. "There are well-established correlations between climate degradation and conflict," he added.

The political order established by Genghis Khan was much different from the tribal equilibrium and conflict that had preceded it. Under him, former chieftains became subjects and family members courtiers. Within Karakorum, a small village of personal bodyguards, armed forces and servants encircled the Khan.

Supporting such a state apparatus would have required resources, Di Cosmo said. And with a turn of climate in 1214, those resources may have been in sudden abundance.

"If you have a valley that produces three, four, five times more nutrients for cattle, sheep and horses, this new political order has a chance to survive before collapsing, as so many nomadic states have collapsed, due to resource scarcity," he said.

When horses meant power

Agriculture, all but impossible in Mongolia in times of drought, could have resurfaced, as well, he said, lending stability to an economy previously dependent on livestock alone.

And an abundance of grass would likely have meant more horses. Horses were key to the Mongols' military tactics, so much so that each warrior was expected to have five mounts of his own.

"Energy flows from the bottom of an ecosystem, up the ladder of human society," said Pederson. "Even today, many people in Mongolia live just like their ancestors did."

The researchers aren't arguing that climate necessitated the Mongols' rise, of course. "Climate may have played a role, but it certainly wasn't the only thing shaping events," said Hessl. "Still, it's very interesting that our climate record appears to fit so well with the historical narrative."

The study also rings a more somber note. The tree rings verify what is already well-known: As in the late 12th century, Mongolia today is facing severe climatic conditions. Temperatures in parts of the country have risen by as much as 4.5 degrees Fahrenheit, far exceeding the global average. A series of droughts over the past two decades have killed millions of livestock and racked the country's still-substantial agricultural sector.

The tree ring study puts these changes in troubling perspective. Along the entire multicentury timeline they reveal, the last drought -- persisting from 2002 to 2009 -- is the hottest on record.

    

 

 

Foxes and Hedgehogs

 

An allusion to a phrase originally attributed to the Greek poet Archilochus: “The fox knows many things, but the hedgehog knows one big thing.” I would like to consider the famous phrase in its original context.

I’d like to; however, the phrase as we know it does not have an original context. The work of Archilochus of Paros, like that of his near-contemporary Sappho, has survived only in fragments, and his most famous proverb comes to us as a mere shard quoted in a collection put together by Zenobius (who believed that Archilochus had been sampling Homer). Epigrammatic partly by virtue of being enigmatic—“an iambic trimeter which is as mysterious as it is charming”—it caught on back in the day. In 1500, when Erasmus dropped his blockbuster gathering of adages, he offered a trim translation (“Many-sided the skill of the fox: the hedgehog has one great gift”) and an interpretation that was 100-percent Team Hedgehog:

[S]ome people do more with one piece of astuteness than others with their various schemes. The fox protects itself against the hunters by many and various wiles, and yet is often caught. The echinus … by its one skill alone is safe from the bites of dogs; it rolls itself up with its spines into a ball, and cannot be snapped up with a bite from any side.

In 1953, the Oxford philosopher Isaiah Berlin put his rhetorical paws on the saying at the outset of an essay on Tolstoy. “Scholars have differed about the correct interpretation of these dark words,” Berlin wrote. “But, taken figuratively, the words can be made to yield a sense in which they mark one of the deepest differences which divide writers and thinkers, and, it may be, human beings in general.” Berlin classed writers such as Dante, Ibsen, and Proust as hedgehogs (“who relate everything to a single central vision”); placed the likes of Shakespeare, Molière, and Joyce as foxes (“who pursue many ends, often unrelated and even contradictory”); and offered the hypothesis that “Tolstoy was by nature a fox, but believed in being a hedgehog”: “his gifts and achievement are one thing, and his beliefs, and consequently his interpretation of his own achievement, another.”

Before making that last point, Berlin took care to state the baseline pointlessness of his endeavor—“the dichotomy becomes, if pressed, artificial, scholastic, and ultimately absurd.” But as it was already clear that pressing that dichotomy is more fun than popping bubble wrap, writers began treating his warning light as a signal to speed ahead applying it to other fields. Economists were early adopters of the hedgefox taxonomy, but Berlin’s reformulation burrowed into brains across disciplines. For the philosopher Richard Rorty, John Dewey “was a hedgehog rather than a fox; he spent his life trying to articulate and restate a single vision.” For the jazz critic Gary Giddins, Miles Davis was “a born hedgehog who believes in being a fox.” For the historian Peter Gay, Freud was “a fox who at times affected a hedgehog’s clothing.”

This parlor game started getting out of hand in the 1990s—the decade of Judy Davis delivering a foxy, Berlin-inspired monologue as part of her sensational performance in Husbands and Wives, of dorm-room CD players inquiring “Are you a fox or a hedgehog?” on Luna’s hedgehogging Penthouse, of Michael Ignatieff’s Isaiah Berlin: A Life identifying its subject as “the type of fox who longs to be a hedgehog.” At the turn of the 21st Century, the idea exploded. I mean this to say that it fully emerged from the realm of the New York Review of Books into that of the bestseller lists, and also I mean to say that the idea ceased to have a coherent identity. As the idea drifted—into the deliberately difficult title of a Stephen Jay Gould book, into the branding of a quarterly magazine, into the jargon of marketing consultants and their jabbering ilk—its meaning shifted in several directions at once.

FiveThirtyEight arrives at its understanding of foxes and hedgehogs by way of Philip E. Tetlock, a professor of political psychology whose work reconfigures the woodland creatures as representatives of “cognitive styles.” As Silver wrote in his 2012 book, The Signal and the Noise, “Foxes, Tetlock found, are considerably better at forecasting than hedgehogs.” So hedgehogs are now popularly understood to be bad. Except when they’re good, as in one widely circulated slide presentation, which explains the “hedgehog concept” as a corporate necessity on the order of a mission statement. (For an example of a fairly traditional usage of the concept in a corporate context, see the recent Wall Street Journal item headlined “McDonald’s and Wendy’s: A Modern-Day Fox vs Hedgehog”: “Wendy’s ... went full hedgehog—but instead of curling up into a spiky protective ball, it doubled down on its core burger lineup.”)

How on earth do people who communicate in buzzwords keep straight all the pluralistic (foxy?) usages of this one big (hedgehoggish!) concept? I understand that phrases, like words, can change their meanings over time. I understand, further, why the Archilochus adage has been enjoying its extended cultural moment: Its binary elegance slices a complicated world into tidy parts; its philosophical pedigree masks its brute simplicity; its Gladwellian polarity suits the contemporary culture of thought, such as it is; it’s fun to do self-diagnostic quizzes. I totally get why someone as otherwise astute as Silver—so clearly a hedgehog in Berlin’s sense, given his systematic central vision—should prefer to be thought of as a fox: Foxes are foxy.

Further, I have an idea about how to resign myself to a future where the development of ideas will sometimes involve their flummoxing degradation. Here’s a relevant line from the relevant chapter (“The Hedgehog and the Fox”) of The Passionate Mind, a book by an anthropologist with a relevant name (Robin Fox): “The fox had more or less come to accept that this was the way of the world: information accumulated while mind decayed.”

    

 

 

The Vikings

 

Forget the funeral boats burning at sea and tales of the most bloodthirsty warriors in history. In fact, you can forget pretty much everything you think you know about the Vikings—it’s all wrong. Many of the legends associated with the Norse raiders were invented by their victims, whose written accounts dominated the narrative long after the Viking Age.

In truth, the Vikings’ most remarkable achievement was setting up an extraordinary intercontinental trade network that surpassed even the great Roman trade routes. That is not to suggest these Norse adventurers were anything less than fearsome; indeed, the first phase of globalization arrived in North America powered by the Vikings’ innovative and gruesome slave industry. Their unparalleled reach across the globe is the subject of a new exhibition at the British Museum in London, which goes beyond the legend of the Scandinavian explorers.

One brutal exhibit features a recently discovered mass grave in Dorset, southern England. Around 50 men, whose bones can be traced back to Scandinavia, were rounded up and beheaded at some point in the 11th Century. It’s hardly the kind of scene you find in the terrifying, florid descriptions of unbeatable Norse raiders written by monks and churchmen at the time.

The grave, discovered near Weymouth in 2009, is thought to contain the remains of the entire crew of a medium-sized warship, who were captured, stripped and executed. Nor was such a failed raid a one-off: according to The Anglo-Saxon Chronicle, written in the 9th century, six Viking ships were fought off by locals in 896, leaving only a few survivors who were “very much wounded.”

These battlefield setbacks were extremely common, said Gareth Williams, curator of “Vikings: Life and Legend.” “When it came to actual battle the Vikings were no more successful than their enemies,” he said. “Where they could, they tended to avoid combat. If you can get what you want without having to fight for it that enhances your profits, and enhances the chances of you living long enough to enjoy the profits.”

So where did the Vikings’ traditional rape and pillage reputation come from? Horrifying tales of their brutality have been around for centuries. The death of King Ella is a good example; he was supposedly executed in York by having his ribs cut along the spine and splayed out to resemble wings, in a technique known as the Blood Eagle.

It seems this was a rare era in which history was not written by the victors, mostly because the victors couldn’t write. It was left to monks and Christian churchmen to craft the only contemporary accounts of many of the Vikings’ raids, and Vikings did attack churches, which held no sacred mystique for them. They were simply seen as easy, wealthy targets, confounding local conventions of the time.

“These accounts are dressed up in the language of religious polemic,” Williams said. “Many [of the stories] were borrowed from earlier accounts—from classical antiquity. The violent reputation and particularly the reputation for atrocities was created then, but the Vikings were probably no worse than anyone else.”

There was a revisionist phase in the late-1970s when historians began to suggest that these accounts couldn’t be trusted at all—and that perhaps the Vikings were just successful traders. The latest research suggests a middle ground, and it's worth looking at what they were trading.

“There’s not much distinction between the Viking as the violent raider and the idea of the Vikings as the peaceful trader if you’re talking about the slave trade,” Williams told the Daily Beast. “We've got accounts in Irish, Anglo-Saxon and Frankish sources of fleets of Vikings descending on an area and carrying off hundreds of slaves at a time. It's not far removed from what was happening on the West Coast of Africa in the 18th century. There's a bit of a tendency to see that slave trade as unprecedented—it's acquired particular overtones because of the color, but in terms of what was happening, the slave trade in the Viking era was fundamentally very similar.”

The remains of the longest Viking warship ever discovered stand at the center of the British Museum’s exhibition. They are surrounded by a 120-foot recreation of the original vessel, which was found in the late 1990s. It was these powerful, flat-bottomed boats that allowed the Vikings to set up the unprecedented network through which they traded furs, falcons, walrus ivory and slaves for spices, silk, silver and jewelry.

The design of the boats revolutionized trade and raiding: Their power allowed the Vikings to cross major seas in relatively small vessels, while the flat bottoms allowed them to dock on beaches or travel inland along rivers. One famous maritime Viking legend, however, is supported by absolutely no evidence.

The only contemporary record of a funeral pyre in a boat pushed out to sea comes in a description of the burial of Baldr, a Norse god and son of Odin. “Certainly not an eyewitness account!” said Williams. For obvious reasons, there is also no archaeological evidence of sea burials.

“Boat burning on land certainly did take place though,” said Williams. “Ibn Fadlan, an Arab traveler, described a funeral on the banks of the river Volga by the Rus—the Vikings who give their name to Russia—with a sacrificed female slave who has ritualized sex with most of his followers before being killed and put into the boat with him. She is killed by a priestess they call the Angel of Death.”

    

 

 

The Georgians and Succession

 

Georges. We Brits have known a few and are getting to know another. That chubby little nugget currently charming New Zealand and Australia with his parents will eventually become the seventh George to reign over Britain (though he might just be the first to do so without Scotland).

This year marks the 300th anniversary of the succession of the House of Hanover to the British throne and the Royal Collection is mounting an exhibition, The First Georgians: Art & Monarchy 1714-1760, to give a clearer idea, through portraits, maps, documents and personal effects, of who these ancestors of the House of Windsor actually were. The word “Georgian” has come to mean many things — Bath townhouses, breeches, Hogarth and fat kings — but, at the time, for the slightly disgruntled natives “Georgian” more or less meant “German”. George I, who inherited the throne from Queen Anne, was the son of Ernest Augustus, Elector of Hanover, and his wife Sophia, daughter of the King of Bohemia. He didn’t speak a word of English and was way down the line of succession.

So how did this happen? It’s all down to the British suspicion of Catholics. Briefly, because I know you’re keen to get on: the Catholic James II was deposed in the Glorious Revolution of 1688 in favour of his Protestant daughter Mary and her husband, the Dutch William of Orange. They were invited to take the throne by Parliament in the Bill of Rights, which also barred Catholics from reigning. They had no children and were succeeded in 1702 by Mary’s sister Anne, who, despite 17 pregnancies, died in 1714 without a surviving heir, leaving Parliament in something of a spot.

“To reach the most direct line with a Protestant, you had to go back to James I,” Desmond Shawe-Taylor, the Surveyor of the Queen’s Pictures and the exhibition’s curator, explains. “His daughter Elizabeth married the Elector Palatine, who was then chosen as King of Bohemia. They ruled there for one winter before they were kicked out by the Catholics and lived in exile. They had 15 children, most of whom died or ended up Catholic, but their youngest daughter Sophia married the Elector of Hanover, and she was absolutely squeaky clean Protestant. She was chosen as the successor to Queen Anne, assuming that Anne died childless.” Inconveniently, Sophia died in the same year, so instead Parliament summoned her son George I, from Hanover.

“He’s massively distant,” Shawe-Taylor says. “If you had the list of people [in line before him], it is literally 50 people.”

It sounds bizarre, but it was a very practical way of solving what had become a major problem across Europe, he says. “The rest of Europe at this stage has wars of succession. All the big dynasties were dying out and in a proper dynastic struggle you don’t have a choice . . . We had a new way of solving the problem, which was for Parliament to say ‘These are the rules we’re making up and this is who we want.’ With the Glorious Revolution, we decided that Parliament chooses kings rather than kings choosing Parliament.”

The first Georges — and Georgians — were living in a new, enlightened time in Britain. The way that people thought about the universe itself was being fundamentally changed by Isaac Newton’s explanation of the movement of the heavenly spheres. But instead of a collision of science and religion such as had forced Galileo to recant half a century previously, the Church of England had, in Shawe-Taylor’s words, “decided ‘Oh no, that’s fine, don’t worry! There are no angels pushing planets through the ether, none of that stuff, it’s all just a clock — but my word what a wonderful clockmaker.’” At that moment, he says, “Britain was the most revolutionary, advanced — intellectually, politically, scientifically — country in the world. It was the shining light.” Though the Enlightenment was germinating across Europe, it was here that its thinkers found the warmest welcome. In exile Voltaire wrote his Letters Concerning the English Nation, hailing the political and religious tolerance of 17th and 18th-century Britain.

This interest in progress was reflected among the Hanoverians, particularly Queen Caroline, wife of George II, generally acknowledged as the most intellectual of the dynasty. In 1730 she employed the architect William Kent to build a Hermitage in the gardens of Richmond Palace, in which she displayed busts of thinkers including the scientist Robert Boyle, Newton, the philosopher John Locke and two theologians, Samuel Clarke and William Wollaston. Some of these busts are in the exhibition. The Hanoverians had long been tutored by the German philosopher (and Newton’s rival) Gottfried Wilhelm Leibniz, who enjoyed a particularly close friendship with Caroline, though her husband was less keen on “all that lettered nonsense”. Crucially, Caroline had her children inoculated, showing huge confidence in the new science. “That’s not vaccinating them,” Shawe-Taylor says, “it’s giving them smallpox . . . Obviously it’s very important that they survived.”

The one place this atmosphere of enlightenment did not reach was family relations in the House of Hanover. From father to son and through following generations, they absolutely loathed each other. When he arrived in England, George I left his wife Sophia Dorothea locked up in a castle in Saxony for the last 30 years of her life as punishment for an affair with an aristocratic soldier (whom he had bumped off). This understandably led to tensions between the king and his heir, George Augustus.

The simmering feud boiled over in 1717, when George Augustus was expelled from the family’s London seat St James’s Palace. He set up instead in Leicester House, then in Leicester Square. In so doing the Georges unwittingly set in motion another fundamental shift for the monarchy — that from the hidden and ceremonial to the visible and domestic. “Two things happened,” Shawe-Taylor says. “One, that you have two rival political factions, openly in opposition, but secondly, rather than living in a castle, the Prince of Wales now lives in a London townhouse just like any other member of the aristocracy. Suddenly, culturally, they belong with the fashionable world and that dynamic, you don’t get anywhere else on the continent.”

Far from being a brief anomaly, exactly the same thing happened during the reign of George Augustus (by then George II) with his own eldest son, Frederick, Prince of Wales, who, having remained in Germany as the House of Hanover’s representative until his father’s succession to the throne, was rather too used to the role of top dog for George II’s liking. Frederick and his household were also booted out of court following a bizarre act of spite on Frederick’s part — on discovering that his wife Augusta had gone into labour early at Hampton Court, he secretly had the poor woman removed by carriage to St James’s Palace in the middle of the night so that his parents would be denied their traditional right of witnessing the birth. Frederick was duly expelled and set up his household as his father had done, at Leicester House, as well as taking up residence in Carlton House, just around the corner, the White House at Kew and a country house in Henley called Park Place, which was sold in 2011 to a Russian buyer for a staggering £140 million.

This blurred relationship between royalty and its subjects, running from the Prince Regent (later George IV) chumming about London with the dandy Beau Brummell to Princes William and Harry charging around the nightlife of Mayfair with club owner Guy Pelly, hasn’t changed since. So, as well as crediting the Georgians as great supporters of science, of philosophy and of art, you can trace a direct line from them to Heat magazine. I fear Caroline would not be pleased.

    

 

 

Minimum Wage, Climate Change, Political Corruption

 

It seems as though profit-maximizing business people ought to be speaking up loudly and often for three changes in our culture, changes that, while making life better, also have a dramatically positive impact on their organizations.

Minimum Wage: Three things worth noting:

1. Most minimum wage jobs in the US can't easily be exported to lower wage places, because they're inherently local in nature.
2. The percentage of the final price of a good or service due to minimum wage inputs is pretty low.
3. Many businesses sell to consumers, and when consumers have more money, there's more demand for what they sell.
Given that for even the biggest organizations there are more potential customers than employees, the math of raising the minimum wage works in their favor. More confident and more stable markets mean more sales. Workers struggling to make ends meet are a tax on the economy.
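That arithmetic is easy to make concrete. Here is a minimal sketch in Python; every figure in it is hypothetical, chosen only to show why a customer base that dwarfs the payroll tilts the calculation toward a higher wage floor:

    # Toy model of the customers-versus-employees arithmetic above.
    # All figures are hypothetical, for illustration only.
    raise_per_worker = 2_000       # extra annual pay per minimum-wage worker ($)
    employees = 1_000              # the firm's own minimum-wage headcount
    low_wage_customers = 50_000    # customers of the firm who also earn the minimum wage
    spend_share_at_firm = 0.05     # fraction of their new income spent at this firm

    added_cost = employees * raise_per_worker                                    # $2,000,000
    added_demand = low_wage_customers * raise_per_worker * spend_share_at_firm   # $5,000,000

    print(f"Added payroll cost:  ${added_cost:,}")
    print(f"Added annual demand: ${added_demand:,.0f}")

On these toy numbers the demand effect outweighs the cost effect; the point is the direction of the comparison, not the particular figures.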
(Consider the brilliant strategic move Henry Ford made in doubling the pay of thousands of his workers in 1914. The assembly line was so efficient that it created profits—but only when it was running, and high turnover made that difficult. By radically raising pay, Ford put pressure on all of his competitors, and on every industry that hired the sort of men he was hiring, at the same time that he created a gateway to the middle class, a middle class that could, of course, buy his cars, whether or not they happened to work for him.)

Climate Change: The shift in our atmosphere causes countless taxes on organizations. Any business that struggled this winter due to storms understands that this is a very real cost, a tax that goes nowhere useful and one that creates countless uncertainties. As sea levels rise, entire cities will be threatened, another tax that makes it less likely that people will be able to buy from you.

The climate unpredictability tax is large, and it's going to get bigger, in erratic and unpredictable ways.

Decreasing carbon outputs and increasing energy efficiency are long-term investments in global wealth, wealth that translates into more revenue and more profit.

Anti-corruption movements: The only players who benefit from corruption in government are the actors willing to race to the bottom--the most corrupt organizations. Everyone else is forced to play along, but is unlikely to win. As a result, for most of us, efforts to create transparency and fairness in transactions are another step toward efficient and profitable engagements.

Historically, when cultures clean up their acts, get more efficient and take care of their people, businesses thrive. It's not an accident; one causes the other.

In all three cases, there's no political or left/right argument being made--instead, it's the basic economics of a stable business environment with a more secure, higher-income workforce where technological innovation leads to lower energy costs and higher efficiency.

(more good ideas from Seth Godin)

    

 

Chess and Politics

 

There is no evidence that Vladimir Putin plays chess. There’s no reason he should, even as someone brought up in the Soviet system, which regarded the world’s oldest strategy game as an ideal diversion for the workers. It is worth noting, however, if only because in America politicians and pundits have, since the Ukrainian crisis broke, been acclaiming Putin as a “chess player” whose masterful strategic skill has been making fools of the West and in particular President Obama.

Thus, the chairman of the House intelligence committee, Congressman Mike Rogers, declared: “I think Putin is playing chess, and I think we’re playing marbles ... and so they’ve been running circles around us.” The New York Post argued that “Putin is acting like a grandmaster of chess while Obama stumbles at chequers.” Exactly the same simile was used by The Washington Times.

Admittedly, none of the above could be described as well disposed towards President Obama; they will happily seize on an image that paints the president as a directionless dunderhead (and it’s true that Obama has no particular vision).

Yet his critics also betray an admiration for Putin’s alleged strategic brilliance, exemplified by the way the former Republican presidential candidate Rudy Giuliani gushed: “Putin decides what he wants to do and he does it in half a day, right? ... He makes a decision and he executes it quickly. Then everybody reacts. That’s what you call a leader.”

Well, yes. That is one advantage of being an unchallengeable dictator in all but name, totally in control of your country’s legislature. An American president in such a position could decide to invade Mexico in half a day. That might constitute leadership, but would hardly qualify as strategic brilliance.

Some of this admiration for Putin on the part of Americans comes back to a naive view of chess. There is a sort of syllogism that goes: you have to be very clever to play chess. The Russians are brilliant at chess. Ergo the Russians can outwit us at will.

The former world chess champion Garry Kasparov dealt with this point pithily in his book How Life Imitates Chess: “It’s natural to suggest that aptitude at chess signifies great intelligence, even genius. There is little to support this theory, unfortunately.” I can vouch for this: I played chess for Oxford University and only one of our team had any special academic abilities (not me).

As for its applicability to politics, Kasparov is equally dismissive: “In chess every piece of information you need is available at the board, so what is being tested is your ability to process that information. In politics things are different: we never have all the information. People often compare politics with chess, but in fact politics is more like a game of cards, poker perhaps.”

Quite. A number of US presidents have been very skilled at that game of bluff and money — none more so than Richard Nixon, who financed his first congressional campaign from winnings at the poker table.

When John F Kennedy confronted Nikita Khrushchev over the Cuban missile crisis, the US president faced similar charges from his domestic political opponents: that he was being hopelessly outwitted by a deep Russian strategy, second nature for a land of chess players. Yet it turned out that the Soviet leader, if anyone, was bluffing. It also emerged that, of the two, Khrushchev was by far the more impulsive character.

We don’t yet know what will happen in the face-off between Washington and Moscow over Ukraine, but this much is clear: Obama is an intensely deliberative politician, passionless even, while Putin is impulsive and governed by emotion — just what no serious chess player should be.

The emotion that governs Putin is resentment, chiefly over the collapse of the Soviet Union and what he perceives as the lack of international respect for Russia. Most if not all of the worst wars in modern history have been fuelled by a similar psychology of resentment: Kaiser Wilhelm’s in 1914 and Adolf Hitler’s 25 years later. In neither of those cases did Germany benefit from what at the outset was admired as decisiveness — even by many non-Germans.

But what of the power play over Ukraine in 2014? To pursue the dubious chess analogy, is Putin really the master strategist? Is the game going his way? First of all, he had desperately wanted his man, Viktor Yanukovych, to remain in power. That piece was swept from the board, chiefly because Putin had instructed Yanukovych to use lethal force against demonstrators in Kiev. It was this decision that did for the regime.

Then Putin — again impulsively, and stung by the demolition of his Kiev gambit — seized Crimea. Strategically this was meaningless. Moscow already had, by international agreement, control of Crimea for military purposes as the base for Russia’s Black Sea fleet. So now Putin is fomenting Russian separatism in east Ukraine. But the consequence of this is to make the vast majority of Ukrainians look still more longingly at their Polish neighbour to the west, which has forged a dramatically more prosperous relationship with the rest of Europe.

Meanwhile, the adverse consequences for Russia mount up on the financial board. As Anders Aslund, who served as an adviser to the governments of Russia and Ukraine, points out: “Russia is far too weak to be so aggressive ... Capital outflows amounted to $64bn [£38bn] in the first quarter, slightly more than 3% of GDP, and they are now expected to rise to some $150bn for the year as a whole ... The rouble has fallen and inflation has risen, forcing the central bank of Russia to raise interest rates by 150 basis points ... That will hurt Russia’s standard of living, which has been vital for Putin’s popularity.” All this has happened even without the western powers organising any serious economic sanctions.

In Russia’s sole significant export industry, hydrocarbons, European distributors have abruptly cut off negotiations over the 1,500-mile South Stream pipeline. This was to have been a new Russian gas-exporting channel to Germany, France and Italy. So Russia is left only with the existing pipelines that cross Ukraine — exactly the arrangement its state monopoly Gazprom had sought to supersede. Putin’s adventurism has only strengthened the hand of American shale gas producers, who have been pressing Washington to allow them to export part of their glut to Europe.

Thus in the real “great game” of energy, which has always been characterised by genuinely long-term strategy, Putin has gratuitously compromised his nation’s position on the board of play. It may be that in the short term his domestic popularity has been increased by his apparent triumph in Crimea, but even the state-run Russian Public Opinion Research Centre found in February that 73% were against intervention in “the conflict between government and opposition in Ukraine”.

Chess, however, is a game of the steady, incremental accumulation of small advantages. That American politicians and pundits believe Putin is playing like a grandmaster is a testament not only to their febrile temperaments but also to their ignorance of chess.

    

 

George Carlin

 

Because the owners of this country don't want that. I'm talking about the real owners, now. The real owners, the big wealthy business interests that control things and make all the important decisions. Forget the politicians, they're an irrelevancy. The politicians are put there to give you the idea that you have freedom of choice. You don't. You have no choice. You have owners. They own you. They own everything. They own all the important land. They own and control the corporations. They've long since bought and paid for the Senate, the Congress, the statehouses, the city halls. They've got the judges in their back pockets. And they own all the big media companies, so that they control just about all of the news and information you hear. They've got you by the balls. They spend billions of dollars every year lobbying, lobbying to get what they want. Well, we know what they want; they want more for themselves and less for everybody else.

But I'll tell you what they don't want. They don't want a population of citizens capable of critical thinking. They don't want well-informed, well-educated people capable of critical thinking. They're not interested in that. That doesn't help them. That's against their interests. They don't want people who are smart enough to sit around the kitchen table and figure out how badly they're getting fucked by a system that threw them overboard 30 fucking years ago.

    

 

Closed Minds

 

The internet was supposed to open people’s minds. Yet it’s having exactly the opposite effect.

“Get out there and let your beautiful, freaky differences shine!”

It’s commencement speech season in America, when big names are invited to address new college graduates. They loved that line at Pasadena City College, near Los Angeles, but it nearly didn’t get delivered. The speaker was Dustin Lance Black, the Academy Award-winning screenwriter and gay activist (and partner of Tom Daley) who had been invited to give the address, then disinvited, then reinvited.

As he put it to students and squirming administrators: “If you measure the weight of an honour by the amount of work it takes to actually get there, well this might damn well be the biggest honour of my life.”

Black had been invited because he is an alumnus of the college, disinvited because a stolen sex tape featuring him was on the internet, then reinvited when it emerged that the replacement speaker, Dr Eric Walsh (a public health official), believes that evolution is a religion created by Satan.

If the commencement speech snafu had been limited to a relatively obscure college in an unremarkable suburb of Los Angeles, no one would have noticed. However, according to the Foundation for Individual Rights in Education (a non-partisan campaigning group), there have been 95 protests against planned speeches on US campuses since 2009, resulting in 39 withdrawals. And the pace is growing.

This year has seen a vintage crop. The best publicised was the decision last month of Brandeis University to rethink the honorary degree awarded to Ayaan Hirsi Ali, the Somali-born women’s rights activist, after a campaign against her. The same happened with Condoleezza Rice, President Bush’s national security adviser, at Rutgers. At Azusa Pacific University, California, Charles Murray, the libertarian author of The Bell Curve, also bit the dust. Add to that Christine Lagarde, managing director of the International Monetary Fund, withdrawing from Smith College’s commencement activities, and a pattern seems to be emerging. As the conservative columnist George Will put it, students are demanding, on top of their other privileges, “a right never to be annoyed”.

Yet it’s more than that. The truth is that many thoughtful Americans worry that this closing of the mind is happening outside universities as well. And conservatives are not the only victims. Last month, the graduating class of Oklahoma police officers secured a speaker for their big day who, on the face of it, could not be bettered — the US attorney-general himself. Big political figures tend not to find time in their diary to visit Oklahoma City.

But Eric Holder is a hate figure on the right so a campaign began and, lo and behold, Mr Holder eventually found he had a prior engagement. The Oklahoma authorities behaved just like the long-haired Ivy League luvvies they so despise. Whose fault is this? Perhaps it starts in dorm rooms, studies and kitchen tables across the nation. In recent times Americans have become attuned to assimilating information via web sites. In other words they find what they want to find when they want to find it and are shocked, even repelled, when some contrary piece of knowledge is directed at them and — horror of horrors — they have to sit quietly and listen to it.

In his book The Signal and the Noise, America’s best-known statistician and election forecaster Nate Silver tells us that the information age has brought with it a potential (and paradoxical) down side: “The instinctual short cut that we take when we have too much information is to engage with it selectively, picking out the parts we like and ignoring the remainder, making allies of those who have made the same choices and enemies of the rest.”

It happened, Silver claims, when the invention of the printing press in 15th-century Europe ushered in a century of sectarian conflict and it is happening again today in information-overloaded America.

Silver does not link his idea to university speakers but the connection is pretty obvious if he is right — Americans are increasingly driven, at the level of psychology rather than politics, to clear their heads of all the noise.

And that means no to Christine Lagarde. No to Eric Holder. No to the anti-evolution replacement speaker in Pasadena. The implications for us in Britain — and for these Opinion pages — are profound. If Silver is right, people growing up in the internet age do not want conflicting views; they want uniformity. In fact they need it. It helps them to cope. Only fail to connect, as Pasadena City College literature students might put it.

    

 

Independent Scotland and Unintended Consequences

 

A Times reader in London contacts me, worried whether the Scots have considered what a “yes” vote in the referendum will mean when they visit Blackpool.

First, he says, a Scottish mobile would be on an international call tariff, a foreign credit card would incur a fee and a postcard home would require a foreign stamp.

That’s before visitors realise they booked in Scotland paying Scottish VAT, which will probably be higher than Rest-of-UK VAT. And before the coach driver does a bunk without them, because if he stayed another night in England that would put him over his 183 days abroad and he’d lose Scottish tax residency.

Totally ridiculous, Mr Rice. It could never happen.

Just as ridiculous as the suggestion that the Lottery could, in the final reckoning, be the thing responsible for breaking up Britain. Nobody saw that one coming in 1993, did they?

But think about it. Chris and Colin Weir, avid nationalists, won £161 million on EuroMillions in 2011. They have so far given £5.5 million to the SNP and the pro-independence fight, representing 80 per cent of the campaign’s total funding; largely as a result, the campaign is doing brilliantly.

So contemplate some random cause and effect. A nice roly-poly semi-retired couple from Ayrshire pluck some numbers out of the air. And Britain is no more. How’s that for the law of unintended consequences?

    

 

Revolution?

 

On the June cover of the conservative magazine American Spectator, a vision arises from the collective unconscious of the rich. Angry citizens look on as a monocled fat cat is led to a blood-soaked guillotine, calling up the memory of the Reign of Terror during the French Revolution, when tens of thousands were executed, many by what came to be known as the “National Razor.” The caption reads, “The New Class Warfare: Thomas Piketty’s intellectual cover for confiscation.” One member of the mob can be seen holding up a bloody copy of the French economist’s recent book, Capital in the Twenty-First Century.

Confiscation, of course, can only mean one thing. Off with their heads! In reality, the most “revolutionary” thing Professor Piketty calls for in his best-selling tome is a wealth tax, but our rich are very sensitive.

In his article, however, James Piereson warns that a revolution is afoot, and that the 99 percent is going to try to punish the rich. The ungrateful horde is angry, he says, when they really should be celebrating their marvelous good fortune and thanking their betters:

“From one point of view, the contemporary era has been a ‘gilded age’ of regression and reaction due to rising inequality and increasing concentrations of wealth. But from another it can be seen as a ‘golden age’ of capitalism marked by fabulous innovations, globalizing markets, the absence of major wars, rising living standards, low inflation and interest rates, and a thirty-year bull market in stocks, bonds, and real estate.”

Yes, things do indeed look very different to the haves and the have-nots. But some of the haves are willing to say what’s actually going down — and it’s a war of their own making. Warren Buffett made this very clear in his declaration: “There’s class warfare, all right, but it’s my class, the rich class, that’s making war, and we’re winning.”

Warren is quite correct: It is the rich who have made war against the 99 percent, not the other way around. They have dumped the tax burden onto the rest of us. They have shredded our social safety net and attacked our retirements. In their insatiable greed, they refuse even to consider raising the minimum wage for people who toil all day and can’t earn enough to feed their children. And they do everything in their power to block as many people from the polls as possible who might protest these conditions, while crushing the unions and any other countervailing forces that could fight to improve them.

The goal of this vicious war is to control all of the wealth and the government not just in the U.S., but the rest of the world, too, and to make sure the people are kept in a state of fear.

But the greedy rich are experts in cloaking their aggression. Like steel tycoon Andrew Carnegie, who successfully transitioned from robber baron to philanthropist, David H. Koch and his conservative colleagues put on the mask of philanthropy to hide their war dance. Or they project their aggression onto ordinary people who are simply trying to feed their families, pay the bills, and keep the roof over their heads. Many wealthy liberals play a less crass version of the game: they talk about inequality only to alleviate their consciences while secretly — or not so secretly — protecting their turf (witness: NY Governor Andrew Cuomo and his mission to reduce taxes on his wealthy benefactors).

It is rich Americans, in particular financial capitalists, who have made the war-like values of self-interest and ruthlessness their code of ethics through their championing of an unregulated market. When we hear the phrase “It’s just business,” we know what it means. Somebody has legally gouged us.

People in America are under attack daily. The greedy rich know it, because they are the ones doing the attacking. They know that they have made collateral damage out of hungry children, hard-working parents, grandmothers and grandfathers. And somewhere behind the gates of their private communities and the roped-off areas — their private schools, private hospitals, private modes of transport — they fear that the aggression may one day be turned back. They wonder how far they can erode our quality of life before something might just snap.

The growing concentration of wealth is creating an increasingly antagonistic society, which is why we have seen the buildup of the police state and the rise of unregulated markets appear in tandem. This is why the prisons are bursting at the seams with the poor.

The oligarchs hope that Americans will be so tired, so pumped full of Xanax, so terrified, that they will remain in their places. They hope that we will watch the rich cavorting on reality shows and set ourselves to climbing the economic ladder instead of seeing that the rungs have been kicked away.

Of course, there is a very easy way for the rich to remain rich and alleviate their nightmares of the guillotine. That is simply to allow their unearned wealth to be taxed at a reasonable rate. Voila! No more fear of angry mobs.

Or they can wait for some less pleasant alternative, like a revolution. This theme, which once timidly hid behind the scenes, has lately burst onto cultural center stage. The cover of the current issue of Lapham’s Quarterly, dedicated to the topic, “Revolutions,” features five crossed swords. Its contents outline various periods in history when ordinary folks had had enough, such as “The People’s Patience is Not Endless,” a pamphlet issued by the Command of Umkhonto we Sizwe, the armed wing of the African National Congress, in December 1961.

Very interesting reading for the 1 percent.

    

 

White Supremacists

 

It is known as “the murder capital of the internet” — an online crucible of hatred where an extreme fringe of white America rails against the growing diversity that is reshaping their country.

On an average day, fewer than 1,800 registered users log on to Stormfront.org, a website run by Don Black, 60, an ex-Ku Klux Klan leader who once tried to invade the Caribbean island of Dominica to oust its black-run government and turn it into a “white state”.

Members of this small cohort have committed nearly 100 murders in the past five years — or one about every three weeks. The number of deaths has risen sharply since Barack Obama became the first black president, and Stormfront has become a window into the internet-fuelled rage festering on America’s extreme right.

The build-up to a killing often follows a pattern, according to Heidi Beirich, of the Southern Poverty Law Center, which has tracked the site. The typical Stormfront murderer is a “frustrated, unemployed, white adult male living with his mother or an estranged spouse or girlfriend. She is the sole provider in the household.”

Forensic psychologists call him a “wound collector”. Instead of searching for a job, “he projects his grievances on society and searches the internet for an explanation unrelated to his behaviour or the choices he has made in life”. He begins by reading right-wing anti-government websites. He progresses to hate sites where he will read diatribes claiming that his race is under attack — a target of “white genocide”. He is likely to spend hundreds of hours online. “He gradually gains acceptance in this online den of self-described ‘lone wolves’, but he gets no relief, no practical remedies, no suggestions to improve his circumstances. He just gets angrier,” Ms Beirich said. “And then he gets a gun.” Mark Potok, a hate group expert, described Stormfront as a place where white supremacists on the edge of mental illness found affirmation. “They are told: ‘You’re not crazy — there are a thousand people who think like you do.’”

Mr Potok suspects that Elliot Rodger, who killed six people in California on Friday, might have achieved a similar kind of validation through online misogynist groups. The reach of Stormfront is international: Anders Breivik, who killed 77 people in Norway in 2011, was a member. After he uploaded one anti-Muslim post, another user told him: “Glad to have you here.”

Experts regard the site as a sick reaction to the changing profile of the US. Last year, for the first time, minorities made up about half of all Americans under the age of 5. By 2043, whites will no longer form a majority of Americans, according to census projections.

The number of right-wing anti-government “militia” groups in the US has exploded. In 2008, there were 149. Now there are more than 1,000.

    

 

FairTrade

 

A new study exposes Fairtrade for what it is – a Western vanity project that impoverishes those it’s meant to benefit. The world’s ethical shoppers are still reeling this week after a report revealed that Fairtrade programmes are of little benefit to those working on farms in the developing world.

The government-funded study published by SOAS, a part of the University of London, was conducted over a four-year period in Uganda and Ethiopia. It showed that labourers on farms that are part of Fairtrade programmes are usually paid less and are subject to worse working conditions than their peers on large commercial farms, and even other small farms that are not part of Fairtrade programmes. Professor Christopher Cramer, the study’s main author, said: ‘Fairtrade has not been an effective mechanism for improving the lives of wage workers, the poorest rural people.’

The study also found that the ‘social premium’ incorporated into the price of Fairtrade products, which is meant to be used to improve infrastructure in poor communities, is often misspent. In one instance, researchers found that modern toilets built with this premium were in fact for the use of senior farm managers only. The report also documented examples of health clinics and schools set up with social-premium funds that charged fees that were too high for the labourers they were intended to benefit.

Of course, nobody needed the clever people at SOAS to tell us all this. From its very inception, the concept of Fairtrade was rooted in maintaining low, ‘sustainable’ horizons for the poor, promoted by those who consider people in Africa and other parts of the Third World to be intrinsically different to the rest of us. The movement did not originate with the poor farmers of the developing world, but with Western NGOs and their army of gap-year do-gooders intent on imposing their reactionary ‘small is beautiful’ values on an Africa desperate for change.

According to the Fairtrade worldview, the poor farmers of the world are in fact quite happy with their lot and only desire a stable, if low, price for their produce. Once this is in place, they will be free to enjoy their simple idyllic existence. The fact that Western countries left extreme poverty behind through rapid industrialisation and urbanisation does not apply to Africa, they say. Instead, it is of paramount importance that Fairtrade ‘promotes and protects the cultural identity and traditional skills of small producers’. They should receive enough money never to be in danger of starvation, but not enough to afford a foreign holiday or to send a child to university or, indeed, do any of the things we in the West enjoy, lest it undermine their cultural identity.

The concept of Fairtrade was enthusiastically lapped up by Western companies desperate to prove their brands were ethical and right-on – now they could tell their customers that by buying their goods they were making the world a better place. It became one of the most successful marketing campaigns in history.

In reality, though, the idea of ethical shoppers transforming the world through their consumer choices was always a fairy tale. Yes, Fairtrade farmers are guaranteed a minimum price for their produce, in the event that the price of the commodity they produce crashes. However, in return they are expected to adhere to stringent regulations, many of which prevent producers from developing or expanding their farms. As a result of Fairtrade, then, many farmers are kept in poverty.

Principle 10 of the Fairtrade charter, for instance, demands that farmers have ‘respect for the environment’. In practice, this means actively discouraging the use of chemical fertilisers, pesticides and mechanisation – the three things that make modern agriculture possible. Instead, Fairtrade stresses the importance of ‘traditional skills’, which is code for backbreaking manual labour. The end result of all this is that farmers are forced to endure more toil for lower yields. It is no surprise that farmers pass the economic pinch on to their labourers in the form of lower wages.

All of this has been known for a long time. In 2005, the education charity WORLDwrite made the documentary The Bitter Aftertaste, which exposed the chasm between the desperate circumstances of those who worked on Fairtrade smallholdings and the self-righteous do-gooders back in Blighty who believed they were helping them. Almost a decade on, it seems little has changed. The Guardian, which has long been Fairtrade’s loudest media cheerleader, foolishly defended its cause in the face of the report’s scathing criticism. Commissioning a series of articles, including one by Cramer, it chose to go with the line that the damaging impact of Fairtrade on the developing world meant that Fairtrade should be reformed rather than done away with altogether.

It’s a sorry state of affairs. Advocates of Fairtrade such as the Guardian accept that all people in the developing world deserve is a life just above subsistence, and that it’s our duty in the West to provide that for them. It is time this patronising attitude gave way to a more positive outlook, one which sees economic development as the best way to improve the lives of farmers. Rather than lamenting the changes economic development will bring to the way of life of rural Africans, these changes should be celebrated. African farmers don’t need Fairtrade restrictions and regulations; they need the freedom to develop their societies into modern economies, with large efficient modern farms in which the workers might have a chance to demand a decent wage for their labour.

    

 

Colour Blind Love

 

The roots of racism go deep but with inter-ethnic relationships on the rise the new spirit of our times is: who cares? My mum still remembers the tuts as she walked down Bromley High Street in the late 1960s. The cause of people’s anger was the child she was pushing in her pram: my older brother, Andy. He had (still has) dark skin, like freshly made cappuccino, and people were outraged by it. They could tell from his appearance that my mum, a freckly, fiery-haired teenager from Wales, had had a child with a foreigner, an outsider, one of those dark-skinned types that were taking over the bloody country.

They didn’t know anything about my dad, a dashing, ferociously self-confident Indian immigrant with a work ethic that put many indigenous Brits to shame, nor the intensity of their feelings for each other (nor that their marriage, while at times turbulent, would survive for 45 years and counting), but they felt entitled to express disapproval. Actually, it was more than disapproval; it was approaching hostility. Those tuts still ring in my mother’s ears.

From today’s vantage point, in a week that a survey revealed that inter-ethnic relationships continue to rise, it is worth reflecting on the paranoia that until very recently surrounded what Americans call miscegenation. Its roots reach deep into our cultural and intellectual history. In the eighteenth century Carl Linnaeus, a Swedish botanist, created his famous taxonomy of the races. American Indians were described as “red, ill-tempered with hair black, straight, thick; nostrils wide, obstinate . . . ruled by habit”; Europeans as “hair blond, eyes blue, very smart . . . inventive . . . ruled by law”; and Africans as “relaxed . . . hair kinked . . . crafty, slow, foolish . . . ruled by caprice”.

These categories dominated thinking for the next 200 years, fostering the sense that humans were fundamentally different from each other, that discrepancies in skin colour reflected deeper schisms that it would be foolish (even sinful) to interfere with through “interbreeding”. With the publication of On the Origin of Species in 1859, these prohibitions were grafted on to the theory of evolution. Whites having babies with blacks (or the other “inferior” races) were betraying their genetic heritage, creating a mongrel race that would destroy the bloodline.

Sex was, in many ways, the ultimate barrier, the intimate taboo that reached deep into our primal fears. Studies have suggested that the sheer ferocity of the lynchings still being carried out by the Ku Klux Klan in the 1960s was animated more by hyper-anxiety about black sexuality than any concern about sustaining white advantage through economic segregation. White men were beside themselves with fear that their daughters would be seduced by the sons and grandsons of slaves. It is a remarkable fact that interracial marriage was a criminal offence in 16 US states, including Alabama, North Carolina, Texas and Virginia, as late as 1967.

And that is why this week’s report should be a cause for celebration. We often focus on ethnic strife, but almost never upon the extraordinary strides in attitudes that have taken place over the past 30 years. When I first asked a girl out in my early teens — Yvonne: pretty, brown-haired, good at art — she replied: “Pick on someone your own colour.” It was crushing but over the next few years, cruelly familiar. Every time I chatted to a girl, I was vigilant for signs of racism, those telltale remarks that betray prejudice.

Today that vigilance has disappeared. On our two-year anniversary in 2011, my wife and I re-enacted our first date: a meeting at South Kensington Tube station (my sister had set us up), a cup of tea and an eclair in a café, a detour into a local church, then dinner in an Italian. As we ate, it struck me that we had never once discussed my colour, or her attitude to it, over the entire 24 months of our relationship.

We had talked about everything else: politics, art, the quality of Marks & Spencer own-brand wine. But the difference in skin pigmentation was just not relevant. “Who cares?” she said, and in those two words evoked the integrationist spirit of our times.

The intellectual climate has changed, too. In 1972 the evolutionary biologist Richard Lewontin made the seminal discovery that beneath the surface, humans are remarkably similar. About 85 per cent of genetic variation exists between individuals within population groups, but only 7 per cent between the so-called races. As Henry Harpending, Professor of Anthropology at the University of Utah, put it: “Personal computers are divisible into major races — Compaq, Dell, Gateway, Micron — as well as many minor populations. Are there deep essential differences between clone X and clone Y? Hardly. Take the cases off and we can barely tell them apart. The important differences among PC races are the labels on the outside of the box. Human race differences are like that.”

There are many problems in the world today. But the problem of race per se, the atavistic fear once inflamed by people who happen to look a little different from ourselves, has all but disappeared. We are working together, sleeping together, having babies together. A new generation of mixed-racers includes Tiger Woods (mixed-race father and mother), Lewis Hamilton (Afro-Caribbean father and white mother) and Zadie Smith (English father and Jamaican mother).

Just before Christmas, I was chatting with my dad in a café in Richmond. Suddenly, I noted a change in his expression, a slight widening of the eyes, and a smile. I looked around and saw a black man and white woman holding hands at the next table. It is something you see every day, in every part of the country, but it didn’t stop my dad smiling and, every now and again, glancing over. We didn’t exchange a word, but I knew what he was thinking. He could see in their love — and in the glorious indifference towards it of those around them — a sign of the times. He could glimpse in their intimacy not just a story of human romance, but one of a coming together that is transforming our world.

There is a long way to go and there are fierce pockets of resistance, but the trend is unmistakable. Love, for the first time in human history, is becoming colour-blind.

    

 

Quarantining The Islamic Cauldron

 

ON MY bookshelf sits an old biography of Sir Mark Sykes, the British diplomat who in 1916 drew the map of the Middle East that is being torn up today by the insurgents in Iraq.

Inside the cover is the signature of its original owner, Anthony Eden, who inscribed his name on April 30, 1923, when he was but an aspiring Conservative parliamentary candidate.

As prime minister, Eden led Britain into a disastrous adventure in 1956 to capture the Suez Canal and overthrow the dictator of Egypt. It failed.

Every time I pick up the faded blue and gold volume it feels like a silent witness to the long 20th century of British engagement with the Arab world.

That century is gone. In Eden’s youth few doubted Britain’s imperial mission in Arabia, Persia and India. Later, few questioned engagement in all senses: military, diplomatic, commercial, cultural — and, for some, emotional.

Now such certainty is outdated, for the era of engagement is coming to an end and a period of quarantine has begun.

No politician wants to talk about this. It’s inconvenient, embarrassing and worrying because there are no pleasant solutions. But the retreat stems from reality. Foreign intervention driven by electoral cycles does not defeat forces rooted in centuries of historical change.

The violent break-up of Iraq and Syria is merely a continuation of the break-up of the Ottoman empire, which Sykes helped to bring about in the First World War.

In a flash of candour Winston Churchill wrote in a foreword to the biography that Sykes ran “all that intricate and remarkable policy which split the Arab from the Turk [and] divided the Muslim world”.

The “caliphate” proclaimed by extremists last month in an area straddling Iraq and Syria is the restoration of a realm that was governed by the Turkish sultans, then the supreme figures in Islam.

The conflicts in Afghanistan and Pakistan, another legacy, are a delayed result of the splintering of British India.

All this is unfolding against the background of two struggles that may last generations. One is between the forces of modernity and conservatism within Islam. The other pits the main Sunni sect against its Shi’ite rivals in a dispute over the succession to the prophet Muhammad dating back to the 7th century.

Against such gigantic pressures, brief and episodic military moves are pointless. Talk of universal values seems irrelevant. Only Tony Blair, a lonely apostle of intervention, dares to speak in favour of continuous engagement, as if we could influence the outcome of events by finding a liberal royal here or enlightening a military dictator there. While I have some sympathy with Blair’s willingness to say unpalatable things, his prescriptions have practically zero public support.

The British government committed its forces to three full-scale coalition campaigns: the expulsion of Iraqi troops from Kuwait in 1991, the invasion of Afghanistan in 2001 and the invasion of Iraq in 2003. Almost nobody in Whitehall or Washington wishes to do that again. Recessions, weariness and bad outcomes have ruled it out.

Look, instead, at what the big-power governments are doing. Without saying so, they are fencing off a swathe of the world, intervening around the fringes to confine the spread of chaos to a core extending from the Sahara to the Hindu Kush.

Chinese and Japanese warships, rivals in their home waters, join patrols off Somalia; the French send in troops to Mali; Britain and America help the Nigerians; and in Kenya the Israelis train the security forces.

Then there’s the human factor. In the Mediterranean the Italian navy is doing an unsung job rescuing thousands of people every month who are fleeing north Africa in rickety boats. Far off in the Indian Ocean the Australians are repelling boatloads of Afghans, Iraqis and Iranians. Where will they all go? Voters don’t seem to care, as long as it is not next door.

If you ask a western intelligence chief, he is likely to tell you that most security and surveillance operations are aimed at identifying and tracing potential terrorists from the Middle East and south Asia. It will not be long before intrusive travel controls and even tougher visa regimes on such places are routine.

Meanwhile, Christians are leaving the Levant and expatriates are dwindling in number as life behind concrete walls in countries with harsh social codes becomes a frightening bore.

Maybe that’s why the psychological shutters are coming down. In the culture wars France has banned the wearing of a burqa in public, the Swiss voted against allowing minarets, Britain has discovered “Trojan horse” Islamists in schools and respective affirmations of identity are more in fashion than multiculturalism.

One by one the old strategic arguments for engagement are falling away. America will be self-sufficient in energy soon and the main customers for Middle Eastern oil will be Chinese. Do we really care if Russian ships sail in and out of President Bashar al-Assad’s minor port of Latakia in Syria?

It seems to me there are only three big things the US would fight for. First is to stop Iran getting the nuclear bomb. Second is to uphold the commitment, made decades ago, to the free passage of shipping through the Strait of Hormuz, gateway to the oilfields — a vital American interest. Third is to save Israel in an existential crisis.

Even grand imperialists often shrank from interventions. Lord Salisbury, the “Victorian Titan” of Andrew Roberts’s biography, declared that it was “no part of England’s duty” to stop Turkish massacres of Christians.

As for Sykes, the Orientalist who had trained at Jesus College, Cambridge, he died in the worldwide flu epidemic of 1919 and saw nothing of his own handiwork. Eden, the Oxford Persian scholar, remains a symbol of decline and disillusion. Every statesman since his time lives with the spectre of Suez and none has broken with the tide of failure. That is why, like it or not, we are now on the path from engagement to quarantine.

    

 

Paul Krugman on Denial Politics

 

On Sunday The Times published an article by the political scientist Brendan Nyhan about a troubling aspect of the current American scene — the stark partisan divide over issues that should be simply factual, like whether the planet is warming or evolution happened. It’s common to attribute such divisions to ignorance, but as Mr. Nyhan points out, the divide is actually worse among those who are seemingly better informed about the issues.

The problem, in other words, isn’t ignorance; it’s wishful thinking. Confronted with a conflict between evidence and what they want to believe for political and/or religious reasons, many people reject the evidence. And knowing more about the issues widens the divide, because the well informed have a clearer view of which evidence they need to reject to sustain their belief system.

As you might guess, after reading Mr. Nyhan I found myself thinking about the similar state of affairs when it comes to economics, monetary economics in particular.

Some background: On the eve of the Great Recession, many conservative pundits and commentators — and quite a few economists — had a worldview that combined faith in free markets with disdain for government. Such people were briefly rocked back on their heels by the revelation that the “bubbleheads” who warned about housing were right, and the further revelation that unregulated financial markets are dangerously unstable. But they quickly rallied, declaring that the financial crisis was somehow the fault of liberals — and that the great danger now facing the economy came not from the crisis but from the efforts of policy makers to limit the damage.

Above all, there were many dire warnings about the evils of “printing money.” For example, in May 2009 an editorial in The Wall Street Journal warned that both interest rates and inflation were set to surge “now that Congress and the Federal Reserve have flooded the world with dollars.” In 2010 a virtual Who’s Who of conservative economists and pundits sent an open letter to Ben Bernanke warning that his policies risked “currency debasement and inflation.” Prominent politicians like Representative Paul Ryan joined the chorus.

Reality, however, declined to cooperate. Although the Fed continued on its expansionary course — its balance sheet has grown to more than $4 trillion, up fivefold since the start of the crisis — inflation stayed low. For the most part, the funds the Fed injected into the economy simply piled up either in bank reserves or in cash holdings by individuals — which was exactly what economists on the other side of the divide had predicted would happen.

Needless to say, it’s not the first time a politically appealing economic doctrine has been proved wrong by events. So those who got it wrong went back to the drawing board, right? Hahahahaha.

In fact, hardly any of the people who predicted runaway inflation have acknowledged that they were wrong, and that the error suggests something amiss with their approach. Some have offered lame excuses; some, following in the footsteps of climate-change deniers, have gone down the conspiracy-theory rabbit hole, claiming that we really do have soaring inflation, but the government is lying about the numbers (and by the way, we’re not talking about random bloggers or something; we’re talking about famous Harvard professors). Mainly, though, the currency-debasement crowd just keeps repeating the same lines, ignoring its utter failure in prognostication.

You might wonder why monetary theory gets treated like evolution or climate change. Isn’t the question of how to manage the money supply a technical issue, not a matter of theological doctrine?

 

 

US Politics and Money

 

In 2011, a significant chunk of the congressional agenda was taken up by banks and merchants battling over swipe fees -- namely, how much banks could charge Walmart and others to run a debit card. The focus on the penny brawl made no sense from a public policy perspective.

But merchants and banks rained down a staggering sum of money in their fight -- on lobbyists, consultants, campaigns, public relations firms and any other bucket that Washington put out. That flow of dollars determines what gets on Congress' agenda.

The opposite dynamic dictates what doesn't get lawmakers' attention.

Six years earlier, and every year after, the inspector general for the Department of Veterans Affairs warned of serious backlogs and unreliable recordkeeping in the VA's health care system, on which millions of veterans rely. But the congressional calendar was not stacked with meetings with VA officials or auditors. Veterans didn't have the cash to be heard.

Money molds not just the agenda but the shape of Congress itself. Think of it as a host-parasite relationship in which the host, Congress, adjusts to interact most effectively with the parasite, money.

The House Financial Services Committee is one of the most desirable panels on the Hill, coveted for its access to bank cash. As a result, it has 61 members. Energy and Commerce is another money committee, as is Ways and Means, both of which enable members to profit off the big-dollar industries whose interests are at stake. Those panels have 54 and 39 members respectively, many with decades of congressional tenure and the clout that comes with it.

There is much less jockeying for a seat on the House Veterans' Affairs Committee. It has just 25 members. When Republican David Jolly won a special election in Florida this spring, the House's most junior addition -- himself a former lobbyist -- probably knew where he was headed: Veterans' Affairs.

Democrats, meanwhile, couldn't even fill their seats on the committee at the beginning of this Congress and had to ask Rep. Tim Walz (D-Minn.) to take on an extra panel assignment in order to fill their 11 slots, sources familiar with the arrangement told HuffPost.

Because the Veterans' Affairs Committee comes with no opportunity to rake in campaign cash, senior lawmakers seek out other panels. "There are really only three money-raising committees in the House: Financial Services, Ways and Means, and Energy and Commerce," said veteran Rep. Jim Cooper (D-Tenn.), who does not sit on the Veterans' Affairs Committee. "That's why you see people exiting those [other] committees as soon as they can."

Because the committee has so few senior members, it has little access to congressional leadership, cable news, the White House or the various levers of power in Washington. It should be little surprise then that a powerless committee has found itself powerless to oversee the VA.

On Wednesday, House Speaker John Boehner (R-Ohio) and other Republican leaders sent a letter to the president about the VA scandal that was notably not signed by the VA committee chairman -- not as an intentional slight to him, but because it doesn't much matter what the panel thinks. The letter-writing leaders sought to stick up for their little brother. "[W]e request that you direct the VA to cooperate with the House and Senate as both chambers conduct the necessary oversight. As you may also know, the department has repeatedly failed to provide the House Veterans' Affairs Committee with timely information," the letter reads.

The panel members' lack of experience also makes them less effective. "You have green and inattentive members on the committee, very few who are willing to conduct systematic oversight," said Cooper. "Most couldn't name senior officials below the secretary level. Most don't know how [the VA system] works. The warnings were not heeded."

Freshman Rep. Beto O'Rourke (D-Texas) is a member of the committee. "It's not a place from which a member can successfully raise money from the special interests who have a stake in the legislative outcomes," he noted. "It's also not a high-profile committee, and the work is really tough -- these are longstanding systemic problems that predate [recently resigned VA Secretary Eric] Shinseki."

The committee roster is a who's not who of Congress. The top Democrat on the panel, Rep. Mike Michaud (Maine), is a serious lawmaker with seniority and pull on the Hill, but he is likely to depart soon to become governor of his home state. After Michaud, the bench is thin. Besides Walz, the only other Democratic panel member with experience is Rep. Corrine Brown (D-Fla.). All the other Democrats on the committee are freshmen.

The lack of interest in the VA panel is not confined to one party. Five Republicans on the panel are freshmen and, other than the chairman, none has more than four terms' experience in Congress.

After one member, Rep. Tim Huelskamp (R-Kan.), was caught in a coup attempt against Speaker Boehner, among other offenses against GOP leadership, he was booted off the Budget and Agriculture committees. He was not kicked off Veterans' Affairs. He was being punished.

The only political value to being on the committee, members and staffers said, is the opportunity to co-sponsor popular-sounding bipartisan bills that benefit vets. "As far as Veterans' Affairs goes, you're not gonna get any benefit in terms of fundraising. The appeal might be a little limited unless you're gung-ho about it," said a GOP committee aide, adding that there is a PR benefit.

"We've got all kinds of press releases with headlines touting the bipartisan nature of things. It creates the impression you're working across the aisle," he said.

Senate fundraising dynamics are different: An individual senator can raise money much more easily, and senators often sit on four or five committees, several more than House members do. That means the Senate Veterans' Affairs Committee has plenty of senior members, but the sheer number of committee assignments a senator enjoys means lower-priority ones receive little attention.

And the VA is low priority: A Congressional Research Service report found that in the most recent Congress, the Senate vets committee had the second lowest budget to spend on staff and other expenses -- just a few thousand dollars more than the Committee on Indian Affairs. It is also one of the few panels, if not the only one, without its own press operation. And it's such an afterthought that the Senate lets a socialist run it. (We kid, but it is chaired by Vermont independent Bernie Sanders.)

House fundraising totals are also revealing. The average House Republican has raised just over $1.1 million so far this Congress, according to the Center for Responsive Politics. Meanwhile, Veterans' Affairs Committee Chairman Jeff Miller (R-Fla.) has hauled in just $280,000 -- a staggeringly small sum for a panel boss.

"Jeff Miller is a good chairman and he does good work, but the dynamic you're describing is a real one," said one top House Republican aide about the committee's lack of power.

The median fundraising total for members of the committee clocks in this Congress at $638,000, according to a HuffPost analysis. The median total for all House members is $775,000. A Sunlight Foundation analysis found a similar pattern going back more than a decade.

Lawmakers must devote the bulk of their time to raising that money. At the beginning of this Congress, the Democratic campaign arm, according to a presentation obtained by HuffPost, warned its incoming freshmen that they ought to be spending roughly four hours per day on the phone calling rich people to ask for money -- a chore known as "call time."

"The number one job is now fundraising, and everything gets distorted by this need to fundraise," said Harvard professor Lawrence Lessig, who studies congressional corruption. The silence of the Veterans' Affairs Committee, he said, is a "really interesting other dimension of this. You can't even get the ordinary work done because it doesn't pay."

Last Wednesday, when Shinseki came to receive his congressional tongue lashing, the hearing was packed, O'Rourke noted. But a subcommittee hearing the next day was sparsely attended by camera crews and, not coincidentally, members of Congress. "Subcommittees are where the real work is done -- questioning officials, listening to testimony from VSOs [veterans' service officers] and veterans, hearing from the inspector general, marking up bills," said O'Rourke.

Or, in the case of Congress, not done.

    

 

Tight or Loose?

 

E pluribus unum—“Out of many, one”—was the first official motto of the United States, adopted by the founding fathers and enshrined in the nation’s Great Seal in 1782. In its statement of unity, it acknowledges the differences inherent in the United States—a fitting description for a singular nation defined by innumerable internal divergences. Yet few organizing principles exist to explain these differences, which find their expression in divergent ecologies, histories, average personality traits, and various state outcomes. Why, for instance, is the incidence of illicit substance use greater in states like Hawaii, Alaska, and New Hampshire relative to Mississippi, Ohio, and Oklahoma, but the incidence of discrimination much higher in the latter than the former? Why do states like Colorado and Connecticut exhibit traits associated with greater impulsivity and greater tolerance, while other states, such as Alabama and Kansas, exhibit the opposite patterns? What might shed light on the difference in anti-immigrant attitudes between Arizona and New York, states with similarly large populations of illegal immigrants? In all, what does this seemingly wide and diverse array of state-level differences have in common?

Although the United States is often parsed on a red-versus-blue dichotomy, our lab suggested another framework by which to understand differences amongst the states in a paper published in the Proceedings of the National Academy of Sciences: states vary in terms of their tightness or looseness, which captures whether states have strong norms and little tolerance for deviance (tight) or weak norms and greater tolerance for deviance (loose). As anyone who has traveled widely in the United States can attest, the range of behavior across states is incredibly diverse. A singing cowboy playing the guitar in his underwear, for instance, may indeed be a hard thing to find outside of New York City. In this study, we document not only how states vary in tightness-looseness, but why they vary—in large part based on the ecological and historical differences between the states.

The strategy of examining how cultures vary harkens back as far as Herodotus in his classic, Histories. More recently, Geert Hofstede greatly spurred these efforts with the publication of his book Culture’s Consequences, detailing the extent to which certain values (for instance, collectivism versus individualism) are endorsed across nations. Since then, we have broadened the toolkit even more and begun to study how cultures vary beyond values. For instance, we showed that cultures vary in the strength of social norms (i.e., tightness) across 33 nations and demonstrated that this cultural dimension is distinct from the various value dimensions proposed by Hofstede and others (e.g., the GLOBE research project). Consistent with the idea that cultural differences often arise from differences in ecological and historical conditions, we found that tight countries have experienced a wide range of ecological and historical threats whereas loose countries have experienced fewer. The strong norms that characterize tight nations help humans coordinate their social action in the face of numerous survival threats. Loose nations can “afford” more latitude and permissiveness because they face far fewer natural and human-made threats.

In this study, we wanted to see whether tightness, and its predictors and outcomes, could be applied at the state level. We reasoned that while the U.S. is generally a loose culture, we might find wide variation in tightness across the 50 states. For inspiration, we drew on Vandello and Cohen’s classic study, which created an index of state-level collectivism using archival data, and we created a new index that measures the strength of norms and punishments across the states.

Check out the map of tightness to diagnose where your state is. Tighter states—those with stronger rules and greater punishment for deviance—are located primarily in the South and the Midwest, while looser states are located in the North East, the West Coast, and some of the Mountain States. We calculated state tightness with a composite index, compiling multiple variables. This includes items that reflect the strength of punishments in states, including the legality of corporal punishment in schools, the percentage of students hit/punished in schools, the rate of executions from 1976 to 2011, and the severity of punishment for violating laws, as well as the degree of permissiveness or deviance tolerance in states, which includes the ratio of dry to total counties per state and the legality of same-sex civil unions. The index also captures the strength of institutions that constrain behavior and enforce moral order in states, including state-level religiosity and the percentage of the total state population that is foreign, an indicator of diversity and cosmopolitanism.
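
As a rough illustration of how a composite index of this kind can be computed (a sketch only; the indicator names, values, and reverse-coding choices below are illustrative assumptions, not the study's actual data or method), each indicator can be standardized across states, permissiveness items reverse-coded, and the results averaged:

    # Sketch of a composite "tightness" index: z-score each indicator
    # across states, reverse-code permissiveness items, then average.
    # Indicator names and values are illustrative, not the study's data.
    from statistics import mean, stdev

    states = {
        "StateA": {"executions_rate": 1.2, "pct_students_punished": 4.0, "pct_foreign_born": 3.0},
        "StateB": {"executions_rate": 0.0, "pct_students_punished": 0.1, "pct_foreign_born": 20.0},
        "StateC": {"executions_rate": 0.4, "pct_students_punished": 1.0, "pct_foreign_born": 10.0},
    }
    # Indicators where a higher value signals looseness get their sign flipped.
    reverse_coded = {"pct_foreign_born"}

    def tightness_index(states, reverse_coded):
        indicators = next(iter(states.values())).keys()
        z = {s: [] for s in states}
        for ind in indicators:
            values = [states[s][ind] for s in states]
            mu, sigma = mean(values), stdev(values)
            for s in states:
                score = (states[s][ind] - mu) / sigma
                z[s].append(-score if ind in reverse_coded else score)
        # Average the standardized indicators; higher means tighter.
        return {s: mean(z[s]) for s in states}

    print(tightness_index(states, reverse_coded))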

Like our international study, our research on the 50 states shows some striking similarities in why states vary in the strength of their social norms: tight states have more threatening ecological conditions, including a higher incidence of natural disasters, poorer environmental health, greater disease prevalence, and fewer natural resources. Tight states were also found to have greater perceptions of external threat, reflected in the desire for more national defense spending and greater rates of military recruitment. This may have a historical basis, as states with a large number of slave-owning families in 1860—those states that were “occupied” by the North and lost the backbone of their slave-based economy following the Civil War—are tighter. In all, we argue that ecological and historically based threats necessitate greater coordinated action to promote collective survival. One might use this construct to predict, for example, that states that increasingly face natural disasters, resource threats, or even terrorism threats might start to become tighter.

This study also helps to explain the vast differences we see in personality across the United States. Tighter states had higher average conscientiousness—a personality characteristic associated with lower impulsivity, greater self-control, orderliness, and conformity—relative to looser states. In contrast, looser states had greater average openness—a personality characteristic associated with non-traditional values and beliefs, tolerance toward difference, and cosmopolitanism. There are many other interesting differences between tight and loose states. Tight states have greater social organization (less instability and greater cohesion), better indicators of self-control (lower rates of alcohol and illicit-drug abuse), and lower rates of homelessness relative to loose states. However, they also exhibit higher incarceration rates, greater discrimination, lower creativity, and lower happiness, as compared to loose states. Tight and loose states each have their own advantages and disadvantages.

As you might expect, the map of state-level tightness-looseness approximates the electoral map of the past few decades, with states voting for conservative, Republican candidates falling on the tighter side and states voting for the more liberal, Democratic candidates falling on the looser side. Yet, conservatism and tightness are distinct, with tightness being a different and broader construct. Conservatism and liberalism are value systems that take the form of individual beliefs, while tightness and looseness describe an external social reality that exists independently of any one individual.

Our study is not without potential criticisms. First, it is important to note that our results are purely correlational, and accordingly, we can’t infer causation from the data. We have been using other methods, including laboratory experiments and computer modeling, to strengthen the causal case that tightness is an adaptation to ecological and historical threats. Second, this study was only done in the context of the U.S., which is relatively loose and, in effect, allows for a lot of state variation. It remains to be seen whether other tight countries (e.g., Japan) have as much variation. Finally, we focused on the state level, but clearly there is also some interesting within-state variation that could be examined. For example, although Louisiana is a tight state, New Orleans may be a fairly loose city with a lot of behavioral flexibility. Likewise, although California is loose, it has pockets of tight counties. In our case, we were interested in broad, state-level differences rather than specific localities. Other researchers may be interested in other levels of analysis to explore the construct.

In conclusion, E pluribus unum is an accurate descriptor of the United States. Out of many, seemingly disparate variables, it is important to seek a general, unifying principle to explain their concordance. We show that tightness may be one such candidate.

    

 

Conservative Psychology

 

Behavioral and Brain Sciences employs an unusual practice called "Open Peer Commentary": An article of major significance is published, a large number of fellow scholars comment on it, and then the original author responds to all of them. The approach has many virtues, one of which is that it lets you see where a community of scholars and thinkers stands with respect to a controversial or provocative scientific idea. And in the latest issue of the journal, this process reveals the following conclusion: A large body of political scientists and political psychologists now concur that liberals and conservatives disagree about politics in part because they are different people at the level of personality, psychology, and even traits like physiology and genetics.

That's a big deal. It challenges everything that we thought we knew about politics—upending the idea that we get our beliefs solely from our upbringing, from our friends and families, from our personal economic interests, and calling into question the notion that in politics, we can really change (most of us, anyway).

The occasion of this revelation is a paper by John Hibbing of the University of Nebraska and his colleagues, arguing that political conservatives have a "negativity bias," meaning that they are physiologically more attuned to negative (threatening, disgusting) stimuli in their environments. In the process, Hibbing et al. marshal a large body of evidence, including their own experiments using eye trackers and other devices to measure the involuntary responses of political partisans to different types of images. One finding? That conservatives respond much more rapidly to threatening and aversive stimuli (for instance, images of "a very large spider on the face of a frightened person, a dazed individual with a bloody face, and an open wound with maggots in it," as one of their papers put it).

In other words, the conservative ideology, and especially one of its major facets - centered on a strong military, tough law enforcement, resistance to immigration, widespread availability of guns - would seem well tailored for an underlying, threat-oriented biology.

The authors go on to speculate that this ultimately reflects an evolutionary imperative. "One possibility," they write, "is that a strong negativity bias was extremely useful in the Pleistocene," when it would have been super-helpful in preventing you from getting killed. (The Pleistocene epoch lasted from roughly 2.5 million years ago until 12,000 years ago.) We had John Hibbing on the Inquiring Minds podcast earlier this year.

Hibbing and his colleagues make an intriguing argument in their latest paper, but what's truly fascinating is what happened next. Twenty-six different scholars or groups of scholars then got an opportunity to tee off on the paper, firing off a variety of responses. But as Hibbing and colleagues note in their final reply, out of those responses, "22 or 23 accept the general idea" of a conservative negativity bias, and simply add commentary to aid in the process of "modifying it, expanding on it, specifying where it does and does not work," and so on. Only about three scholars or groups of scholars seem to reject the idea entirely.

That's pretty extraordinary, when you think about it. After all, one of the teams of commenters includes New York University social psychologist John Jost, who drew considerable political ire in 2003 when he and his colleagues published a synthesis of existing psychological studies on ideology, suggesting that conservatives are characterized by traits such as a need for certainty and an intolerance of ambiguity. Now, writing in Behavioral and Brain Sciences in response to Hibbing roughly a decade later, Jost and fellow scholars note that:

There is by now evidence from a variety of laboratories around the world using a variety of methodological techniques leading to the virtually inescapable conclusion that the cognitive-motivational styles of leftists and rightists are quite different. This research consistently finds that conservatism is positively associated with heightened epistemic concerns for order, structure, closure, certainty, consistency, simplicity, and familiarity, as well as existential concerns such as perceptions of danger, sensitivity to threat, and death anxiety.

Back in 2003, Jost and his team were blasted by Ann Coulter, George Will, and National Review for saying this; congressional Republicans began probing into their research grants; and they got lots of hate mail. But what's clear is that today, they've more or less triumphed. They won a field of converts to their view and sparked a wave of new research, including the work of Hibbing and his team.

Granted, there are still many issues yet to be worked out in the science of ideology. Most of the commentaries on the new Hibbing paper are focused on important but not-paradigm-shifting side issues, such as the question of how conservatives can have a higher negativity bias, and yet not have neurotic personalities. (Actually, if anything, the research suggests that liberals may be the more neurotic bunch.) Indeed, conservatives tend to have a high degree of happiness and life satisfaction. But Hibbing and colleagues find no contradiction here. Instead, they paraphrase two other scholarly commentators (Matt Motyl of the University of Virginia and Ravi Iyer of the University of Southern California), who note that "successfully monitoring and attending negative features of the environment, as conservatives tend to do, may be just the sort of tractable task…that is more likely to lead to a fulfilling and happy life than is a constant search for new experience after new experience."

All of this matters, of course, because we still operate in politics and in media as if minds can be changed by the best honed arguments, the most compelling facts. And yet if our political opponents are simply perceiving the world differently, that idea starts to crumble. Out of the rubble just might arise a better way of acting in politics that leads to less dysfunction and less gridlock…thanks to science.

    

 

What Is The Fairest Voting System?

 

(New Scientist Last Word)

Mathematically speaking, which kind of voting system produces the most democratic and fair result in a general election? There are several to choose from: first past the post, alternative vote, single transferable vote and many others, but surely mathematics can decree which is the fairest system?

• Arrow's impossibility theorem shows that any voting system in which candidates are ranked in order will not provide a fair result in all cases.

Take an example where there are two candidates with extreme views – A and C – each preferred by 35 per cent of the population, but hated by everyone else. Candidate B is moderate but scores only 30 per cent when all three candidates are standing. However, B will win (with 65 per cent) in a two-horse race against either A or C. Yet if all three candidates stand for election, then under almost all voting systems in common use B will be eliminated, leaving either A or C to win, even though B could beat either A or C in a two-way vote.
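
A minimal Python simulation makes the arithmetic concrete (a sketch; the even split of B's supporters between A and C is an assumption, since the example above leaves it unspecified):

    # Three voter blocs with full rankings; weights are percentages.
    blocs = {
        ("A", "B", "C"): 35,   # A supporters: hate C, accept B
        ("C", "B", "A"): 35,   # C supporters: hate A, accept B
        ("B", "A", "C"): 15,   # B supporters, assumed split evenly
        ("B", "C", "A"): 15,   #   between A and C as second choice
    }

    def plurality_scores(blocs):
        """First past the post: count first preferences only."""
        scores = {}
        for ranking, weight in blocs.items():
            scores[ranking[0]] = scores.get(ranking[0], 0) + weight
        return scores

    def head_to_head(blocs, x, y):
        """Share of voters who rank x above y."""
        return sum(w for r, w in blocs.items() if r.index(x) < r.index(y))

    print(plurality_scores(blocs))        # {'A': 35, 'C': 35, 'B': 30}: B last
    print(head_to_head(blocs, "B", "A"))  # 65: B beats A head to head
    print(head_to_head(blocs, "B", "C"))  # 65: B beats C head to head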

B could win if voters ranked candidates in order of preference and the candidate with the most "least preferred" votes was eliminated. But in a larger field, such methods tend to favour "do-nothing" candidates who are not disliked by anyone but would not gain first place against any other candidate either.

An alternative is range voting, a system in which every candidate is given a score: for example, from 0 to 10. This method allows voters to give some candidates the same score and to record "no opinion" for candidates they know nothing about.

Range voting is simple and easily understood, and it only produces the failures of ranking systems if voters mistakenly think it is better to vote tactically by giving their preferred candidate top score and everyone else zero. In practice, giving an honest score to everyone gives the best result.
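
A sketch of how such ballots might be tallied, assuming each candidate's scores (0 to 10) are averaged and "no opinion" entries are simply excluded; the ballot values here are illustrative:

    # Range (score) voting: average each candidate's scores, ignoring
    # "no opinion" entries (recorded as None). Ballots are illustrative.
    ballots = [
        {"A": 10, "B": 6, "C": 0},
        {"A": 0, "B": 7, "C": 10},
        {"A": None, "B": 9, "C": 2},   # this voter has no opinion on A
    ]

    def range_winner(ballots):
        totals, counts = {}, {}
        for ballot in ballots:
            for cand, score in ballot.items():
                if score is None:          # "no opinion" does not count
                    continue
                totals[cand] = totals.get(cand, 0) + score
                counts[cand] = counts.get(cand, 0) + 1
        averages = {c: totals[c] / counts[c] for c in totals}
        return max(averages, key=averages.get), averages

    print(range_winner(ballots))  # B wins on average score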

However, the method is unlikely to win favour because politicians get to choose the method of voting, and most do not want to know what score the public would actually give them.

Brian Horton, West Launceston, Tasmania, Australia

• When it comes to voting systems, it is easy to assume that "fair and democratic" means "most directly reflects the views of the voters", but this is not always the case.

For example, a voting system may produce a result where a small, extremist party holds the balance of power between two larger parties or coalitions. This small party may then influence policy in a way that is neither fair nor democratic. Also, when parties form a coalition, they might reach compromises on policies or form new policies that the electorate did not vote for.

Voting systems that most closely reflect the many different opinions of voters inevitably produce numerous small parties. There is a risk that these small parties may become entrenched in their positions and unable to move forward. If they form coalitions, they blame others for any failures that result and do not learn from their mistakes. If re-elected, they continue as before. The parties that result from such a system may be more idealistic than pragmatic because they never have to accept full responsibility for government.

It is my opinion that the most democratic system is not one that puts the "right" people into office, whoever they might be. It is a system that does not allow the elected party or group to walk away from blame when things go wrong and, most importantly, that allows us to get rid of the people we voted into office the last time, should we want.

Alan Urdaibay, Paignton, Devon, UK

• The short answer is that mathematics can prove that there is no fair voting system. There are a number of theorems to this effect, such as Arrow's impossibility theorem and the Gibbard-Satterthwaite theorem, among others.

These theorems are proved in essentially the same way. First, the idea of fairness is expressed as a number of axioms. For example, for Arrow's theorem these are: if every voter prefers alternative X over alternative Y, then the group prefers X over Y; if every voter's preference between X and Y remains unchanged, then the group's preference between X and Y will also remain unchanged (even if voters' preferences between other pairs such as X and Z, Y and Z, or Z and W change); and there is no "dictator" (no single voter possesses the power to always determine the group's preference).

This set of axioms is then mathematically proved to be inconsistent: that is, no voting system can satisfy them all.
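
For readers who prefer symbols, the axioms and the theorem can be stated compactly (a standard formulation in LaTeX notation, not quoted from the answer above):

    % Arrow's conditions, in standard notation. F maps each profile of
    % voter rankings (R_1, ..., R_n) over a set X of alternatives,
    % with |X| >= 3, to a group ranking.
    \begin{align*}
      &\textbf{Unanimity:}\quad (\forall i:\ x \succ_i y) \implies x \succ y. \\
      &\textbf{Independence (IIA):}\quad \text{the group ranking of } x, y
        \text{ depends only on each } R_i \text{ restricted to } \{x, y\}. \\
      &\textbf{Non-dictatorship:}\quad \neg\exists\, d \text{ such that }
        x \succ_d y \implies x \succ y \text{ for all } x, y \text{ and all profiles.} \\
      &\textbf{Theorem (Arrow):}\quad \text{no } F \text{ satisfies all three when } |X| \geq 3.
    \end{align*}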

This means that any voting system has to abandon at least one criterion of "fairness". Which criterion to abandon is a matter for social and political debate. Mathematics can prove that this debate is necessary but cannot, of course, dictate how it should be resolved. It all depends on what you mean by fairness.

John Dobson, Hexham, Northumberland, UK

    

 

Exploiting The Tea Party

 

First, a definition: a grifter is a con artist—someone who swindles people out of money through fraud. If there’s one type of person you don’t want to trust, it’s a grifter.

Historically, grifters have taken many shapes. They were the snake-oil salesmen who rolled into town promising a magical, cure-all elixir at a price. The grifter was long gone by the time people discovered the magical elixir was no more magical than water. They were the sideshow con men offering fantastic prizes in games that were rigged so that no one could actually win them. They were the Ponzi scheme operators who got rich promising fantastically high investment returns but returning nothing for those sorry investors at the bottom of the pyramid.

Over the last few years we have seen the rise of a new grifter—the political grifter. And the most important battle being waged today isn’t the one about which party controls the House or the Senate, it’s about who controls the Republican Party: the grifting wing or the governing wing.

Today’s political grifters are a lot like the grifters of old—lining their pockets with the hard-earned money of working men and women by promising things in return that they know they can’t deliver.

Political grifting is a lucrative business. Groups like the Club for Growth, FreedomWorks and the Tea Party Patriots are run by men and women who have made millions by playing on the fears and anger about the dysfunction in Washington. My former House colleague Chris Chocola is pocketing a half-million dollars a year heading the Club for Growth; same for Matt Kibbe heading up FreedomWorks (and I don’t think Kibbe’s salary includes the infamous craft beer bar that FreedomWorks donors ended up paying for). The Tea Party Patriots pay their head, Jenny Beth Martin, almost as much. These people have lined their pockets by promising that if you send them money, they will send men and women to Washington who can “fix it.” Of course, in the ultimate con, the always extreme and often amateurish candidates these groups back either end up losing to Democrats or they come to Washington and actually make the process even more dysfunctional.

Just look at what happened this past week, when hard-right House members with extensive ties to these outside groups, egged on by Texas Sen. Ted Cruz, snarled up a sensible effort to pass a bill that would at least begin to address the crisis of undocumented children at the U.S.-Mexico border. It was an embarrassing display of congressional dysfunction, and it showed that the grifting wing has learned nothing from last fall’s shutdown fiasco.

The grifting wing of the party promises that you can have ideological purity—that you don’t have to compromise—and, of course, all you have to do is send them money to make it happen. The governing wing of the Republican Party knows that’s a damn lie. Our Founding Fathers set up a system of government that by its very nature excludes the possibility of one party or one ideological wing of one party getting everything it wants. Ted Cruz, who quotes the founders almost every chance he gets, ought to know this.

    

 

Inequality

 

First came Occupy Wall Street, and its pitch-perfect slogan on inequality: “We are the 99 percent.” After that movement fizzled, Thomas Piketty, the handsomely ruffled French professor, released a 685-page book explaining that we really were living in a new Gilded Age in which the wealth gap was as wide as it had ever been. Finally, in June, one of the plutocrats sitting atop the piles of money he made in the digital revolution, Nick Hanauer, wrote an article in Politico magazine—it’s the most-shared story ever on Politico’s Facebook page—warning that the pitchforks were coming, and rich people like him should advocate for a healthier middle class and a higher minimum wage.

The debate over inequality is now raging, and most Americans are unhappy about the widening divide between the haves and have-nots. Hanauer has been making the same case for years, drawing heaps of both praise and scorn. Forbes magazine has alternately called Hanauer insane and ignorant. His TED University presentation calling for a $15 minimum wage was left off the organization’s website because it was deemed too “political.” That’s nothing next to Piketty’s detractors, who at their most extreme accused him of twisting his data.

Hanauer and Piketty inspire these broadsides because they are challenging, in a far more aggressive way than plutocrats and economists usually do, the conservative economic orthodoxy that has reigned since at least the 1980s. Under Ronald Reagan, we called it trickle-down economics, the idea that the men who can afford their own private jets—they’re usually men—deserve gobs of money because they provide some special entrepreneurial or innovative talent that drives the American economy.

That’s well known. Far less often discussed is the flipside of this belief: that helping the less well off will dampen the American money-generating engine—that it will hurt growth, because the only thing that inspires the “job creators” to work so hard is the promise of insanely vast financial rewards. Poverty is a necessary evil in this worldview, and helping the less well off creates a “culture of dependency,” which discourages work. “The United States thrives because of a culture of opportunity that encourages work and disdains relying on handouts,” Matthew Spalding of The Heritage Foundation wrote in 2012, neatly summing up the conservative ethos.

Conservatives have dominated discussions of poverty for a generation with arguments like this one. It’s completely wrong. It’s more than that—it’s just a lie, concocted as cover for policies that overwhelmingly favored the rich. But it took the worst economic crisis since the Great Depression for many economists, liberal or not, to finally say publicly what many had long argued: Inequality is bad for the economy.

The latest to say so is the rating agency Standard & Poor’s, not exactly a bastion of lefty propaganda. An S&P report released August 5 says that rising inequality—gaps in both income and wealth—between the very rich and the rest of us is hurting economic growth. The agency downgraded its forecast for the economy in the coming years because of the record level of inequality and the lack of policy changes to correct for it. The report’s authors argue against the notion that caring about equality necessarily involves a trade-off with “efficiency”—that is, a well-functioning economy.

To be sure, they’re not making a case for a massive government intervention to help low-income Americans. They discuss the benefits of current policy proposals—like raising the federal minimum wage to $10.10 per hour—with the caveat that such changes could have negative consequences, like dampening job growth. (Most economists agree that such a small hike wouldn’t have that impact.)

At its core, though, the S&P report does argue that pulling people out of poverty and closing the gap between the 1 percent and the 99 percent will increase economic growth. The authors argue for some redistributive policies, like increased financial aid for post-secondary education. “The challenge now is to find a path toward more sustainable growth, an essential part of which, in our view, is pulling more Americans out of poverty and bolstering the purchasing power of the middle class,” the authors write. “A rising tide lifts all boats…but a lifeboat carrying a few, surrounded by many treading water, risks capsizing.”

It’s an important moment for such a debate. The Great Recession was a great equalizer, a crisis in which many in the middle class, and even upper-middle class, fell all the way to the bottom and relied on the government safety net. They learned what anyone who cared to look at the data already knew: The vast majority of people relying on government benefits are suffering a temporary setback that they will recover from, as long as they have a helping hand. The holes in the safety net also became more apparent. Even Paul Ryan, the Republican congressman from Wisconsin who has set his blue eyes on higher office, adequately diagnosed many of the problems with anti-poverty programs when he introduced a new plan last month. (Whether he would actually want to pay for the changes he calls for is debatable.)

Closing the gap by lifting low-income families out of poverty could do more to help the economy than any number of tax credits for “job creators”—which is precisely what Hanauer argued in Politico. The S&P report lends his case new support.

On the question of what to do, there is widespread agreement on boosting educational attainment and increasing salaries at the bottom end. Policymakers have had a lot of time to think about how to help the middle class, since real wages began declining in the mid-1970s. Many of the problems of inequality have policy solutions ready to go, spelled out in a white paper stuffed in someone’s desk drawer. Why, then, has it taken so long to act? Was the political might of the right so overwhelming that no one could speak up until plutocrats like Hanauer saw, as he warned in his essay, that the pitchforks were coming for them?

    

 

Tribes

 

Intermarriage has always been a problem, all the way back to Romeo and Juliet (and West Side Story, of course). Intermarriage de-demonizes the ‘other’, and the insecure tribe member sees this as an existential threat, the beginning of the end of tribal cohesion.

Gangs in LA view high school as a threat. A kid who graduates from high school has options, can see a way up, which decreases the power of the gang and its leaders. Public school is seen as a threat by some tribes, a secular indoctrination and an exposure to other cultures and points of view that might destabilize what has been built over generations. And digital audio is a threat to those in the vinyl tribe, because at some point, some members may decide they’ve had enough of the old school.

Lately, two significant threats seen by some tribes are the scientific method and the power of a government (secular, or worse, representing a majority tribe). One fear is that once someone understands the power of inquiry, theory, testing and informed criticism, they will be unwilling to embrace traditional top-down mythology. The other is that increased government power will enforce standards and rituals that undermine the otherness that makes each tribe distinct.

If a tribe requires its members to utter loyalty oaths to be welcomed [“the president is always right, carbon pollution is a myth, no ____ allowed (take your pick)”] they will bump into reality more and more often. I had a music teacher in elementary school who forbade students to listen to pop music, using a valiant but doomed-to-fail tactic of raising classical music lovers.

Tribes started as self-defending groups of wanderers. It didn't take long, though, for them to claim a special truth, for them to insulate themselves from an ever-changing world.

In a modern, connected era, successful tribes can’t thrive for long by cutting themselves off from the engines that drive our culture and economy. What they can do is engage with and attract members who aren’t there because the tribe is right and everyone else is wrong, but instead, the modern tribe quite simply says, “you are welcome here, we like you, people like us are part of a thing like this, we'll watch your back.” It turns out that this is enough.

(more good ideas from Seth Godin)

    

 

Why The Tea Party Wants Small Govt

 

An essential dogma of the religious right is that government should provide minimal services for and impose minimal demands on the citizenry. Sound familiar? But the reason isn’t, as popular libertarian dogma would have it, because the government should keep its nose out of your business. Dating back to conservative Christian red scares, anti-union and anti-New Deal ideology, and to Christian Reconstructionist framing of the proper role of government in relation to the church, the family, and the individual, these principles emerge from the idea that the secular state is the enemy of a proper Christian ordering of markets, social norms, and family and religious life.

    

 

The Cost of the Underclass

 

The true scale of Britain’s “underclass” has been revealed by a government initiative that has uncovered 500,000 problem families, estimated to be costing the taxpayer more than £30bn a year.

The staggering number of seriously troubled families, four times the previous estimate, has emerged in a three-year operation to confront those who are blighting neighbourhoods with their dysfunctional behaviour.

The depth of the malaise has been uncovered by Louise Casey, the troubleshooter entrusted by David Cameron with turning round 120,000 problem families after the urban riots of the summer of 2011. An additional 400,000 families are now to be targeted.

In an interview with The Sunday Times, Casey, head of the government’s troubled families programme, said: “These families are off the barometer in the number of problems they have. This is the first time we have been able to evidence the extent of the problems.”

Shocking findings from the existing £448m programme include a hard core of families who triggered police call-outs to their homes up to 15 times a month. One chief constable told Casey: “I could park a police officer on the settees of some of your families 24 hours a day, they are that demanding of our services.”

Three quarters of the 120,000 families had nobody in work and almost half had children who were excluded from school. Many are affected from their mid-30s by chronic health problems including diabetes and heart conditions, which are normally associated with old age.

The government will announce that 53,000 families, more than half of those Casey is already working with, have made significant progress on measures such as attending school, finding work and stopping their criminal and antisocial behaviour.

Each of the 120,000 costs the taxpayer an average of £75,000 a year, or a total of £9bn in the drain on benefits and demand on services. The annual cost of the next tier of 400,000 will bring the combined total cost to above £30bn.

Casey warned against assuming that the next tier of families would have fewer problems. “I don’t think we should make any assumptions — particularly on that basis — yet,” she said, pointing out that she had not expected the problems of the first wave to be as bad as they were.

She highlighted some of the most extreme problems:

•One family triggered 90 police call-outs to their home over six months — an average of one every two days. Overall, the families were on average responsible for one police call-out a month. “That’s incredible,” said Casey.

•A morbidly obese mother has visited her GP on 226 occasions since the birth of her youngest child, aged 7. She used out-of-hours services 65 times, sought 18 secondary care referrals, visited the minor injuries unit 12 times, made four trips to accident and emergency and rang 999 on four occasions.

•A mother of 10 was so badly beaten by her violent partner that her children removed all the internal doors in the house and dumped them in the back garden so they would have advance warning of his attacks. Officials found out the truth only because of a complaint about the state of her garden.

The next wave of the programme, to be announced this week by Eric Pickles, the communities and local government secretary, will seek to target pre-school children and to rescue women from domestic violence. It will also tackle the families’ chronic health problems which undermine efforts to turn their lives around.

Casey said: “Physical health is so obvious when you look at the families, but they will talk about their drug, alcohol and mental health problems.

“They won’t talk about the fact that they are either skinny or they are wired or they are obese or they smoke too much because they are bloody miserable.”

Earlier this year a parliamentary watchdog warned that the troubled families programme was likely to miss its targets, but Casey insists it is now on track and will prove its worth. She said: “This is a programme of the head — these families are costing us money — and a programme of the heart — because I don’t want these children growing up to repeat the same patterns of behaviour of their parents.”

She added: “We are beginning to achieve a revolution in how you deal with the worst families in Britain.

“Worst in that they have got the worst problems, frankly they cause the most problems and frankly you wouldn’t want to live with them.

“It’s a cultural revolution in the way that we see these families. There is an acceptance that the poor will always be with us. I spend my entire life saying that’s not how it has to be.”

The Sunday Times first highlighted the plight of the underclass in 1989.

    

 

Why British Muslims Radicalise (and US ones don't)

 

Americans responded to the killing of James Foley with outrage, horror and confusion — and perhaps no one more so than American Muslims.

Numbering around 2.7 million, America’s Muslims tend to be wealthier, better educated and more integrated than their European counterparts. In recent years, surveys have found them to be more optimistic about the future of America than their fellow Americans.

Though a handful of their number are said to have travelled to Syria to join rebel groups, and young members of America’s Somali community are known to have joined al-Shabaab, they appear to be few and far between in comparison with the radicalised young Britons known to have left their country to fight.

Britain is among the leading European sources of foreign fighters for Isis, though France has a larger number overall and Belgium has the highest proportional representation.

Analysts said that economic hardship and largely self-contained conservative communities have fuelled a sense of alienation among British Muslims that has helped draw some young men towards Islamic jihad.

The vast majority of Britain’s Bangladeshi and Pakistani communities come from just two relatively small towns – Sylhet in northern Bangladesh and Mirpur in Pakistani Kashmir. Both were relatively conservative communities in rural backwaters when tens of thousands of their inhabitants migrated to the UK – uprooting and resettling entire self-contained communities in Britain.

Pakistani migration starting in the 1960s fed a demand for labour in textile mills in northern England – an industry that collapsed in the 1990s with severe economic consequences for the Muslim communities.

“Talking about Mirpur, the community was transported wholesale to Bradford and other northern towns,” said Dr Usama Hasan, a former jihadist who is now an academic working for the counter-radicalisation think tank Quilliam. “They went from rural to urban, east to west, from totally Muslim to a predominantly non-Muslim environment. Any one of those would cause serious tensions but they got the triple whammy.”

Hasan said that he believed a failure to integrate was the major reason why Europe was producing far more jihadists than the US.

John Esposito, a professor of Islamic Studies at Georgetown University in Washington DC, said the Muslims who began arriving in America in the twentieth century tended to be educated and middle class, and many were arriving in order to go to college.

“Often they were escaping political situations or looking for economic opportunities,” he said. “Then you have an influx in the 1970s of young Muslims who are being sent here for a college education.”

He said women tended to have as high a level of education as men, and “as a religious community they are second only to America’s Jews in terms of education level. Even during the downturn they were more optimistic about the next five years in America than the average American”.

Among the more notable examples of American Muslim integration is the arrival of Kareem Salama on the American country music scene. Mr Salama was born in a small town in Oklahoma where there was only one other Muslim family. “They are more redneck than I am, I mean good old boys,” he said in 2007. “They go hunting every weekend and drive big old trucks.”

Edgar Hopida, a Muslim convert from Indiana and a spokesman for the Islamic Society of North America, said Americans’ relative comfort with religion and spirituality may have helped Muslims to integrate. He also noted the example of Keith Ellison, a Muslim congressman from Minnesota, being sworn into office. “He took his oath on Thomas Jefferson’s Koran,” he said.

Back in Britain, Dr Hasan said the difference in the two communities was brought home to him at a conference for Future Leaders of Islamic communities in Europe and America, held in Copenhagen and funded by the US State Department. “We were asked to list the top five issues facing our communities. The results were very surprising. All the Europeans listed integration as an issue, usually the number 1 issue. None of the US delegates mentioned it.”

“It is in the nature of the US, a young nation, a nation of immigrants. European communities have struggled to integrate. That is why I myself went off to fight jihad. We felt excluded.”

Dr Hasan was an undergraduate at Cambridge University when he went to Afghanistan in 1990. “Socio-economic indicators are also an important factor. Especially in the Pakistani and Bangladeshi communities, things are way behind in terms of education and employment. But don’t forget many terrorists are also university educated.”

    

 

AIPAC - The Israel Lobby in US Politics

 

On July 23rd, officials of the American Israel Public Affairs Committee—the powerful lobbying group known as AIPAC—gathered in a conference room at the Capitol for a closed meeting with a dozen Democratic senators. The agenda of the meeting, which was attended by other Jewish leaders as well, was the war in the Gaza Strip. In the century-long conflict between the Israelis and the Palestinians, the previous two weeks had been particularly harrowing. In Israeli towns and cities, families heard sirens warning of incoming rockets and raced to shelters. In Gaza, there were scenes of utter devastation, with hundreds of Palestinian children dead from bombing and mortar fire. The Israeli government claimed that it had taken extraordinary measures to minimize civilian casualties, but the United Nations was launching an inquiry into possible war crimes. Even before the fighting escalated, the United States, Israel’s closest ally, had made little secret of its frustration with the government of Prime Minister Benjamin Netanyahu. “How will it have peace if it is unwilling to delineate a border, end the occupation, and allow for Palestinian sovereignty, security, and dignity?” Philip Gordon, the White House coördinator for the Middle East, said in early July. “It cannot maintain military control of another people indefinitely. Doing so is not only wrong but a recipe for resentment and recurring instability.” Although the Administration repeatedly reaffirmed its support for Israel, it was clearly uncomfortable with the scale of Israel’s aggression. AIPAC did not share this unease; it endorsed a Senate resolution in support of Israel’s “right to defend its citizens,” which had seventy-nine co-sponsors and passed without a word of dissent.

AIPAC is prideful about its influence. Its promotional literature points out that a reception during its annual policy conference, in Washington, “will be attended by more members of Congress than almost any other event, except for a joint session of Congress or a State of the Union address.” A former AIPAC executive, Steven Rosen, was fond of telling people that he could take out a napkin at any Senate hangout and get signatures of support for one issue or another from scores of senators. AIPAC has more than a hundred thousand members, a network of seventeen regional offices, and a vast pool of donors. The lobby does not raise funds directly. Its members do, and the amount of money they channel to political candidates is difficult to track. But everybody in Congress recognizes its influence in elections, and the effect is evident. In 2011, when the Palestinians announced that they would petition the U.N. for statehood, AIPAC helped persuade four hundred and forty-six members of Congress to co-sponsor resolutions opposing the idea.

During the Gaza conflict, AIPAC has made a priority of sending a message of bipartisan congressional support for all of Israel’s actions. Pro-Israel resolutions passed by unanimous consent carry weight, but not nearly so much as military funding. During the fighting, Israel has relied on the Iron Dome system, a U.S.-funded missile defense that has largely neutralized Hamas’s rockets. Although the U.S. was scheduled to deliver $351 million for the system starting in October, AIPAC wanted more money right away. On July 22nd, Defense Secretary Chuck Hagel had sent a letter to Harry Reid, the Senate Majority Leader, seeking an immediate payment of $225 million.

In the conference room, the senators sat on one side of a long table, the Jewish leaders on the other. Robert Cohen, the president of AIPAC, justified Israel’s assault, agreeing with Netanyahu that Hamas was ultimately responsible for the deaths of its own citizens. At one point, Tim Kaine, a Democrat from Virginia, asked about conservative trends in Israel, a participant recalled. “He said that he supports Israel, but he’s concerned that Israel is headed toward a one-state solution—and that would be so damaging and dangerous for everyone involved.”

Charles Schumer, the senior Democrat from New York, interrupted. Turning to address the room, he said, “It troubles me when I hear people equate Israel and Hamas. That’s wrong, that’s terrible!” Kaine protested, “That’s not what I meant!” Cohen simply repeated that Hamas was to blame for everything that was happening.

The Senate, preparing for its August recess, hastened to vote on the Iron Dome funding. At first, the appropriation was bundled into an emergency bill that also included money to address the underage refugees flooding across the Mexican border. But, with only a few days left before the break began, that bill got mired in a partisan fight. Reid tried to package Iron Dome with money for fighting wildfires, and then offered it by itself; both efforts failed, stopped largely by budget hawks. “If you can’t get it done the night before recess, you bemoan the fact that you couldn’t get it done, and everybody goes home,” a congressional staffer said. Instead, Mitch McConnell, of Kentucky, the Republican leader, decided to stay over, even if it meant missing an event at home. The next morning, with the halls of the Senate all but empty, an unusual session was convened so that McConnell and Reid could try again to pass the bill; Tim Kaine was also there, along with the Republicans John McCain and Lindsey Graham. “There were five senators present and literally no one else!” the staffer said. “They reintroduced it and passed it. This was one of the more amazing feats, for AIPAC.”

In a press conference, Graham, who has been a major recipient of campaign contributions connected to AIPAC, pointed out that the funding for Iron Dome was intended as a gesture of solidarity with Israel. “Not only are we going to give you more missiles—we’re going to be a better friend,” Graham said. “We’re going to fight for you in the international court of public opinion. We’re going to fight for you in the United Nations.”

The influence of AIPAC, like that of the lobbies for firearms, banking, defense, and energy interests, has long been a feature of politics in Washington, particularly on Capitol Hill. But that influence, like the community that AIPAC intends to represent, is not static. For decades, AIPAC has thrived on bipartisanship, exerting its influence on congressional Democrats and Republicans alike. But Israel’s government, now dominated by a coalition of right-wing parties led by Likud, has made compromise far less likely than it was a generation ago. Prime Minister Netanyahu, the leader of Likud and an unabashed partisan of the Republican view of the world, took office at about the same time as President Obama, and the two have clashed frequently over the expansion of Israeli settlements and the contours of a potential peace agreement between the Israelis and the Palestinians. Although both men repeatedly speak of the unshakable bond between the U.S. and Israel, their relationship has been fraught from the start. In 2012, Netanyahu made little secret of the fact that he hoped Mitt Romney would win the election. Time and again—over issues ranging from Iran to the Palestinians—AIPAC has sided strongly with Netanyahu against Obama.

AIPAC’s spokesman, Marshall Wittmann, said that the lobby had no loyalty to any political party, in Israel or in the U.S., and that to suggest otherwise was a “malicious mischaracterization.” Instead, he said, “we are a bipartisan organization of Americans who exercise our constitutional right to lobby the government.” For AIPAC, whose stated mission is to improve relations between the U.S. and Israel, it is crucial to appeal across the political spectrum. In recent years, though, Israel has become an increasingly divisive issue among the American public. Support for Israel among Republicans is at seventy-three per cent, and at forty-four per cent among Democrats, according to a poll conducted in July by the Pew Research Center for the People and the Press; the divide is even greater between liberal Democrats and conservative Republicans.

This difference represents a schism among American Jews—AIPAC’s vital core. For decades, the Jewish community was generally united in its support for Israel. Today, a growing number of American Jews, though still devoted to Israel, struggle with the lack of progress toward peace with the Palestinians. Many feel that AIPAC does not speak for them. The Pew Center’s survey found that only thirty-eight per cent of American Jews believe that the Israeli government is sincerely pursuing peace; forty-four per cent believe that the construction of new settlements damages Israel’s national security. In a Gallup poll in late July, only a quarter of Americans under the age of thirty thought that Israel’s actions in Gaza were justified. As Rabbi Jill Jacobs, the executive director of the left-leaning T’ruah: The Rabbinic Call for Human Rights, told me, “Many people I know in their twenties and thirties say, I have a perfectly good Jewish life here—why do I need to worry about this country in the Middle East where they’re not representing who I am as a Jew? I’m not proud of what’s happening there. I’m certainly not going to send money.”

This is precisely the kind of ambivalence that AIPAC adherents describe as destructive. And yet even Israeli politicians recognize that AIPAC faces a shifting landscape of opinion. Shimon Peres, who served as Prime Minister and, most recently, as President, says, “My impression is that AIPAC is weaker among the younger people. It has a solid majority of people of a certain age, but it’s not the same among younger people.”

For AIPAC, the tension with the Obama Administration over Gaza comes amid a long series of conflicts. Perhaps the most significant of these is over the question of Iran’s obtaining a nuclear weapon. Last October, Iran and the consortium of world powers known as P5+1—Britain, China, France, Germany, Russia, and the United States—met in Geneva to begin talks. For two decades, AIPAC has been warning that if Iran acquired nuclear arms it would pose an existential threat to Israel, which has had a nuclear capacity since the late sixties. Netanyahu has insisted that the United States—or Israel alone, if necessary—must be prepared to take military action against Iran. The Obama Administration, too, has said that a nuclear Iran is unthinkable and that “all options”—including military options—“are on the table.” But Netanyahu fears that Obama is prepared to settle for too little in the negotiations, and, when they began, he launched an uninhibited campaign of public diplomacy against them. In early November, after meeting in Jerusalem with Secretary of State John Kerry, he proclaimed a tentative proposal “a very, very bad deal. It is the deal of the century for Iran.” A photo op for the two men was abruptly cancelled, and Kerry returned to Switzerland.

Later that month, Ron Dermer, the Israeli Ambassador to the U.S., met with a bipartisan group of two dozen congressmen in the offices of John Boehner, the House Speaker. Dermer, who comes from a political family in Miami, worked in the nineties for the Republican consultant Frank Luntz as he shaped Newt Gingrich’s Contract with America campaign. A few years later, Dermer emigrated to Israel, where he worked as a political consultant and wrote columns for the Jerusalem Post, a conservative daily, in which he referred to Jews who denounced the occupation as “self-haters.” When Netanyahu took office in 2009, he brought in Dermer as a top adviser, and the two became virtually inseparable. “Whenever we met with Bibi in the last several years, Dermer was there,” a former congressional aide said. “He was like Bibi’s Mini-Me.” In Boehner’s offices, a senior Democrat recalled, “Dermer was very critical of the proposed Iran nuclear agreement. He talked about how Reagan would never have done anything like this.” Finally, one of the other politicians in the room had to advise him, “Don’t talk about what Reagan would do. He’s not very popular with Democrats.”

The great incentive that the P5+1 could offer Iran was to reduce the sanctions that have crippled its economy. As the talks proceeded, though, Israel’s supporters in Congress were talking about legislation that would instead toughen the sanctions. Dermer didn’t say specifically that he favored such a law—representatives of foreign governments customarily do not advocate for specific U.S. legislation—but it was clear that that was what he and the Israeli leadership wanted. A former congressional staff member who attended the meeting said, “The implicit critique was the naivete of the President.”

Obama’s aides were alarmed by the possibility that AIPAC might endorse new sanctions legislation. They invited Howard Kohr, the group’s chief executive officer, and officials from other prominent Jewish organizations to briefings at the White House. Members of the Administration’s negotiating team, together with State Department officials, walked them through the issues. “We said, ‘We know you guys are going to take a tough line on these negotiations, but stay inside the tent and work with us,’ ” a senior Administration official recalled. “We told them directly that a sanctions bill would blow up the negotiations—the Iranians would walk away from the table. They said, ‘This bill is to strengthen your hand in diplomacy.’ We kept saying, ‘It doesn’t strengthen our hand in diplomacy. Why do you know better than we do what strengthens our hand? Nobody involved in the diplomacy thinks that. ’ ”

In late November, the negotiators announced an interim Joint Plan of Action. For a period of six months, Iran and the six world powers would work toward a comprehensive solution; in the meantime, Iran would limit its nuclear energy program in exchange for initial relief from sanctions. Netanyahu blasted the agreement, calling it a “historic mistake,” and, within a few days, the leadership of AIPAC committed itself to fighting for new sanctions. A senior Democrat close to AIPAC described to me the intimate interplay between Netanyahu’s circle and the lobby. “There are people in AIPAC who believe that it should be an arm of the Likud, an arm of the Republican Party,” he said. Wittmann, the lobby’s spokesman, disputed this, saying, “AIPAC does not take any orders or direction from any foreign principal, in Israel or elsewhere.”

For the Israeli leadership and many of its advocates, the Iran negotiations presented an especially vexing problem of political triangulation. Mahmoud Ahmadinejad, Iran’s previous President, had been a kind of ideal adversary, attracting widespread outrage by questioning whether the Holocaust had taken place and by challenging Israel’s right to exist. Danny Ayalon, a former Israeli Ambassador to the U.S., once described Ahmadinejad’s hateful rhetoric to me as “the gift that keeps on giving.” But Iran’s new President, Hassan Rouhani, was carefully presenting himself as a relative moderate. Netanyahu would have none of it, calling Rouhani “a wolf in sheep’s clothing.”

AIPAC worked to mobilize its friends in Congress. Mark Kirk, a Republican senator from Illinois and a major beneficiary of AIPAC-related funding, began pressing to pass a new sanctions bill. “He was saying, ‘We’re in negotiations with a wolf in sheep’s clothing!’ ” a former Senate aide recalled. The bill, co-sponsored by Robert Menendez, a New Jersey Democrat, was drafted with considerable input from AIPAC. This was the first time in decades that the lobby had challenged the sitting U.S. President so overtly.

The Obama Administration was furious. “It’s one thing to disagree on some aspect of the peace process, on things that are tough for Israel to do,” the senior Administration official told me. “But this is American foreign policy that they were seeking to essentially derail. There was no other logic to it than ending the negotiations, and the gravity of that was shocking.”

AIPAC was incorporated in 1963, fifteen years after the State of Israel came into being. Its leader, Isaiah (Si) Kenen, had been a lobbyist for American Zionist organizations and an employee of Israel’s Office of Information at the United Nations. In that job, Kenen had been obligated to register under the Foreign Agents Registration Act, which had stringent disclosure requirements about financial expenditures and communications with the U.S. government. The journalist M. J. Rosenberg, who volunteered at AIPAC in 1973 and is now a critic of it, recalled Kenen’s saying that the foreign-agent model was too restrictive. AIPAC would lobby Congress for aid to Israel, but its members would be Americans, taking orders from an American board of directors. Rosenberg told me that Kenen was “an old-fashioned liberal” who liked to say, “AIPAC has no enemies, only friends and potential friends.” When asked which politicians he hoped to elect, he said, “We play with the hand that is dealt us.” Congress must lead, he said, and “our job is to help it lead.”

Kenen retired in 1974, and by the late eighties AIPAC’s board had come to be dominated by a group of wealthy Jewish businessmen known as the Gang of Four: Mayer (Bubba) Mitchell, Edward Levy, Jr., Robert Asher, and Larry Weinberg. Weinberg was a Democrat who gradually moved to the right. The others were Republicans. In 1980, AIPAC hired Thomas Dine, a former diplomat and congressional staffer, as its executive director. Dine set out to develop a nationwide network that would enable AIPAC to influence every member of Congress. This was a daunting challenge. Jews made up less than three per cent of the American population, concentrated in nine states, and they voted overwhelmingly Democratic. How could AIPAC, with such a small base, become a political force in both parties and in every state?

Dine launched a grass-roots campaign, sending young staff members around the country to search for Jews in states where there were few. In Lubbock, Texas, for instance, they found nine who were willing to meet—a tiny group who cared deeply about Israel but never thought that they could play a political role. The lobby created four hundred and thirty-five “congressional caucuses,” groups of activists who would meet with their member of Congress to talk about the pro-Israel agenda.

Dine decided that “if you wanted to have influence you had to be a fund-raiser.” Despite its name, AIPAC is not a political-action committee, and therefore cannot contribute to campaigns. But in the eighties, as campaign-finance laws changed and PACs proliferated, AIPAC helped form pro-Israel PACs. By the end of the decade, there were dozens. Most had generic-sounding names, like Heartland Political Action Committee, and they formed a loose constellation around AIPAC. Though there was no formal relationship, in many cases the leader was an AIPAC member, and as the PACs raised funds they looked to the broader organization for direction.

Members’ contributions were often bundled. “AIPAC will select some dentist in Boise, say, to be the bundler,” a former longtime AIPAC member said. “They tell people in New York and other cities to send their five-thousand-dollar checks to him. But AIPAC has to teach people discipline—because all those people who are giving five thousand dollars would ordinarily want recognition. The purpose is to make the dentist into a big shot—he’s the one who has all this money to give to the congressman’s campaign.” AIPAC representatives tried to match each member of Congress with a contact who shared the congressman’s interests. If a member of Congress rode a Harley-Davidson, AIPAC found a contact who did, too. The goal was to develop people who could get a member of Congress on the phone at a moment’s notice.

That persistence and persuasion paid off. Howard Berman, a former congressman from California, recalled that Bubba Mitchell became friends with Sonny Callahan, a fellow-resident of Mobile, Alabama, when Callahan ran for Congress in 1984. Eventually, Callahan became chairman of the House Appropriations Subcommittee on Foreign Operations. “Sonny had always been against foreign aid,” Berman said. “Then he voted for it!”

Republicans knew that they would never get more than a minority of the Jewish electorate, but AIPAC members convinced them that voting the right way would lead to campaign contributions. It was a winning argument. In 1984, Mitch McConnell narrowly beat AIPAC supporters’ preferred candidate, the incumbent Democrat Walter Huddleston. Afterward, McConnell met with two AIPAC officials and said to them, “Let me be very clear. What do I need to do to make sure that the next time around I get the community support?” AIPAC members let Republicans know that, if they supported AIPAC positions, the lobby would view them as “friendly incumbents,” and would not abandon them for a Democratic challenger. The Connecticut Republican senator Lowell Weicker voted consistently with AIPAC; in 1988, he was challenged by the Democrat Joe Lieberman, an Orthodox Jew. Lieberman won, but Weicker got the majority of funding from Jewish donors.

In the early days, Howard Berman said, “AIPAC was knocking on an unlocked door.” Most Americans have been favorably disposed toward Israel since its founding, and no other lobby spoke for them on a national scale. Unlike other lobbies—such as the N.R.A., which is opposed by various anti-gun groups—AIPAC did not face a significant and well-funded countervailing force. It also had the resources to finance an expensive and emotionally charged form of persuasion. Dine estimated that in the eighties and nineties contributions from AIPAC members often constituted roughly ten to fifteen per cent of a typical congressional campaign budget. AIPAC provided lavish trips to Israel for legislators and other opinion-makers.

Nevertheless, the lobby did not endorse or rank candidates. “We made the decision to be one step removed,” Dine said. “Orrin Hatch once said, ‘Dine, your genius is to play an invisible bass drum, and the Jews hear it when you play it.’ ” In 1982, after an Illinois congressman named Paul Findley described himself as “Yasir Arafat’s best friend in Congress,” AIPAC members encouraged Dick Durbin, a political unknown, to run against him. Robert Asher, a Chicago businessman, sent out scores of letters to his friends, along with Durbin’s position paper on Israel, asking them to send checks. Durbin won, and he is now the Senate Majority Whip. (Findley later wrote a book that made extravagant claims about the power of the Israel lobby.) In 1984, AIPAC affiliates decided that Senator Charles Percy, an Illinois Republican, was unfriendly to Israel. In the next election, Paul Simon, a liberal Democrat, won Percy’s seat. Dine said at the time, “Jews in America, from coast to coast, gathered to oust Percy. And American politicians—those who hold public positions now, and those who aspire—got the message.”

As AIPAC grew, its leaders began to conceive of their mission as something more than winning support and aid for Israel. The Gang of Four, a former AIPAC official noted, “created an interesting mantra that they honestly believed: that, if AIPAC had existed prior to the Second World War, America would have stopped Hitler. It’s a great motivator, and a great fund-raiser—but I think it’s also AIPAC’s greatest weakness. Because if you convince yourself that, if only you had been around, six million Jews would not have been killed, then you sort of lose sight of the fact that the U.S. has its own foreign policy, and, while it is extremely friendly to Israel, it will only go so far.”

In the fall of 1991, President George H. W. Bush decided to delay ten billion dollars in loan guarantees to Israel, largely because of the continuing expansion of settlements. In response, AIPAC sent activists to Capitol Hill. The lobby was confident. Its officials had told Yitzhak Shamir, the Israeli Prime Minister at the time, that Bush did not have the political desire to take on AIPAC, according to a memoir by former Secretary of State James Baker. But Bush proved willing to fight. The former AIPAC official recalled that Bubba Mitchell was summoned to the White House for a meeting: “When he came back to the AIPAC boardroom, an hour after the meeting, he was still shaking—because the President of the United States yelled at him!” Soon afterward, Bush remarked that he was “one lonely little guy” fighting “something like a thousand lobbyists.” The Senate lined up behind him, and voted to postpone consideration of the loan guarantees. For AIPAC, this marked the beginning of a difficult period. The next June, Israeli voters ousted Shamir and his Likud Party and voted in Labor, headed by Yitzhak Rabin. After a career of military campaigns and cautious politics, Rabin began a transformation, offering to scale back settlement activity. In response, Bush asked Congress to approve the loan guarantees. Afterward, Rabin admonished the leaders of AIPAC, telling them that they had done more harm than good by waging battles “that were lost in advance.” Daniel Kurtzer, then the Deputy Assistant Secretary of State for Near Eastern Affairs, told me, “Rabin was furious with AIPAC. He felt they were allied with Likud and would undermine him in what he was trying to do.”

In September, 1993, Rabin and Arafat signed the Oslo Accords, which were aimed at building a formal peace process with the Palestine Liberation Organization. AIPAC officially endorsed the agreement, and still does. But many members were uncomfortable with it, according to Keith Weissman, a former analyst for the lobby. “AIPAC couldn’t act like they were rejecting what the government of Israel did, but the outcry in the organization about Oslo was so great that they found ways to sabotage it,” he said. (In 2005, Weissman was indicted, along with Steven Rosen, for conspiring to pass national-defense information to a reporter and an Israeli government agent, and AIPAC fired them. The charges were ultimately dropped.) As part of the agreement, the U.S. was to make funds available to the Palestinians, Weissman said. “The Israelis wanted the money to go to Arafat, for what they called ‘walking-around money.’ But AIPAC supported a bill in Congress to make sure that the money was never given directly to Arafat and his people, and to monitor closely what was done with it. And, because I knew Arabic, they had me following all of Arafat’s speeches. Was he saying one thing here, and another thing there? Our department became P.L.O. compliance-watchers. The idea was to cripple Oslo.”

In 1995, AIPAC encouraged Newt Gingrich, the new Speaker of the House, to support bipartisan legislation to move the U.S. Embassy from Tel Aviv to Jerusalem. This put Rabin in a political corner. On one hand, he knew that such a move would infuriate the Arab world and endanger the Oslo process. On the other, as Yossi Beilin, then an official in the Labor government, pointed out, “You are the Prime Minister of Israel and you are telling American Jews, ‘Don’t ask for recognition of Jerusalem as our capital’? Nobody can do that!” At a dinner with AIPAC leaders, Rabin told them that he did not support the bill; they continued to promote it nonetheless. In October, the bill passed in Congress, by an overwhelming majority. President Bill Clinton invoked a national-security waiver to prevent its enactment, and so has every President since.

In 1999, Ehud Barak, also of the Labor Party, became Prime Minister, and, as Rabin had, he grew friendly with Clinton. “AIPAC flourishes when there is tension between Israel and the U.S., because then they have a role to play,” Gadi Baltiansky, who was Barak’s press spokesman, told me. “But the relations between Rabin and Clinton, and then Barak and Clinton, were so good that AIPAC was not needed. Barak gave them courtesy meetings. He just didn’t see them as real players.” Still, the lobby maintained its sway in Congress. In 2000, Barak sent Beilin, who was then the Justice Minister, to obtain money that Clinton had promised Israel but never released. Beilin went to see Sandy Berger, Clinton’s national-security adviser. “He said this money is tied to two hundred and twenty-five million dollars in assistance to Egypt,” Beilin recalled. “We cannot disburse the money to Israel unless we do to Egypt, so we need to convince Congress to support the whole package. I said, ‘I am speaking on behalf of my Prime Minister. We want Egypt to get the money.’ He said, ‘Yossi, this is really wonderful. Do you know somebody in AIPAC?’ ”

Beilin was astonished: “It was kind of Kafka—the U.S. national-security adviser is asking the Minister of Justice in Israel whether he knows somebody at AIPAC!” He went to see Howard Kohr, the AIPAC C.E.O., a onetime employee of the Republican Jewish Coalition whom a former U.S. government official described to me as “a comfortable Likudnik.” Kohr told Beilin that it was impossible to allow Egypt to get the money. “You may think it was wrong for Israel to vote for Barak as Prime Minister—fine,” Beilin recalled saying. “But do you really believe that you represent Israel more than all of us?” By the end of Barak’s term, in 2001, the money had not been released, to Israel or to Egypt. “They always want to punish the Arabs,” Beilin concluded. “They are a very rightist organization, which doesn’t represent the majority of Jews in America, who are so Democratic and liberal. They want to protect Israel from itself—especially when moderate people are Israel’s leaders.”

In the spring of 2008, AIPAC moved from cramped quarters on Capitol Hill to a gleaming new seven-story building on H Street, downtown. At the ribbon-cutting ceremony, Howard Kohr introduced Sheldon Adelson, a casino magnate who had been a generous donor to AIPAC since the nineties, and who had helped underwrite congressional trips to Israel (paying only for Republican members). On this bright spring day, according to someone who was in the audience, Adelson recalled that Kohr had telephoned him, asking him to have lunch. Adelson remembered wondering, How much is this lunch going to cost me? Well, he went on, it cost him ten million dollars: the building was the result. He later told his wife that Kohr should have asked him for fifty million.

Netanyahu became Prime Minister the following year. AIPAC officials had been close to him since the eighties, when he worked at the Israeli Embassy in Washington, and stuck with him when, in 1990, he was banned from the State Department for saying that U.S. policy was built “on a foundation of distortion and lies.” As Prime Minister, Netanyahu had a difficult relationship with Bill Clinton, largely because Clinton found him unwilling to stop the expansion of settlements and to meaningfully advance the peace process—a sharp contrast with the approach of Rabin, who was assassinated in 1995. Then as now, Netanyahu displayed a vivid sense of his own historical importance, as well as flashes of disdain for the American President. After their first meeting, Clinton sent a message to another Israeli, wryly complaining that he had emerged uncertain who, exactly, was the President of a superpower.

But, even if Netanyahu had trouble with the executive branch, AIPAC could help deliver the support of Congress, and a friendly Congress could take away the President’s strongest negotiating chit—the multibillion-dollar packages of military aid that go to Israel each year. The same dynamic was repeated during Barack Obama’s first term. Israeli conservatives were wary, sensing that Obama, in their terms, was a leftist, sympathetic to the Palestinian cause. They took note when, during the 2008 campaign, Obama said, “I think there is a strain within the pro-Israel community that says unless you adopt an unwavering pro-Likud approach to Israel that you’re opposed to Israel, that you’re anti-Israel, and that can’t be the measure of our friendship with Israel.”

At Obama’s first meeting with Netanyahu, in May, 2009, Dermer came along, and found himself unable to observe the well-established protocol that one does not interrupt the President. As Obama spoke, Dermer’s hand shot up: “Excuse me, Mr. President, I beg to differ!” Obama demanded a full settlement freeze, as a means of convincing the Palestinians that Netanyahu was not merely stalling the Americans. Netanyahu was incensed, and AIPAC rallied members of Congress to protest. At an AIPAC conference, Dermer declared that Netanyahu would chart his own course with the Palestinians: “The days of continuing down the same path of weakness and capitulation and concessions, hoping—hoping—that somehow the Palestinians would respond in kind, are over.” Applause swept the room.

In a speech at Bar-Ilan University, in June, 2009, Netanyahu seemed to endorse a two-state solution, if in rather guarded terms. Leaders of the settler movement and even many of Netanyahu’s Likud allies were furious at this seemingly historic shift for the Party, though, with time, many of them interpreted the speech as a tactical sop to the United States. No less significant, perhaps, Netanyahu introduced a condition that could make a final resolution impossible—the demand that the Palestinians recognize Israel as a Jewish state. “It was a stroke of political brilliance,” the former Senate aide, who had worked closely with Dermer, told me. “He managed to take the two-state issue off the table and put it back on the Palestinians.”

In March, 2010, while Vice-President Joe Biden was visiting Israel, the Netanyahu government announced that it was building sixteen hundred new housing units for Jews in Ramat Shlomo, a neighborhood in East Jerusalem. Biden said that the move “undermines the trust we need right now.” Secretary of State Hillary Clinton called Netanyahu to upbraid him. But, while Obama and his team viewed the move as a political insult and yet another blow to a potential two-state solution, AIPAC went into defensive mode, sending an e-mail to its members saying that the Administration’s criticisms of Israel were “a matter of serious concern.” Soon afterward, a letter circulated in the House calling on the Obama Administration to “reinforce” the relationship. Three hundred and twenty-seven House members signed it. A couple of months later, when the U.S. tried to extend a partial moratorium on construction in settlements in the West Bank, AIPAC fought against the extension. Obama eventually yielded.

In May, 2011, Obama gave a speech about the Arab Spring, and, hoping to break the stalemate in the peace talks, he said, “The borders of Israel and Palestine should be based on 1967 lines with mutually agreed swaps.” The 1967 borders, with some adjustments, had long been recognized as the foundation for a peace agreement, but Obama was the first President to utter the words so explicitly. The next day, Netanyahu arrived in Washington and rebuked him in the Oval Office, saying, “We can’t go back to those indefensible lines.”

A veteran Israeli politician was aghast at Netanyahu’s performance. “This is the President of the United States of America, and you are the head of a client state—let’s not forget that!” he said. “AIPAC should have come to Bibi and said, ‘You don’t talk to the President the way you do! This is not done, you have to stop it!’ Instead of reflecting almost automatically everything the Israeli government is doing and pushing in that direction.”

AIPAC officially supports a two-state solution, but many of its members, and many of the speakers at its conferences, loudly oppose such an agreement. Tom Dine has said that the lobby’s tacit position is “We’ll work against it until it happens.” After Obama endorsed the 1967 borders, AIPAC members called Congress to express outrage. “They wanted the President to feel the heat from Israel’s friends on the Hill,” a former Israeli official recalled. “They were saying to the Administration, ‘You must rephrase, you must correct!’ ” When Obama appeared at an AIPAC policy conference three days later, he was conciliatory: “The parties themselves—Israelis and Palestinians—will negotiate a border that is different than the one that existed on June 4, 1967. That’s what ‘mutually agreed-upon swaps’ means.” AIPAC had e-mailed videos to attendees, urging them not to boo the President; they complied, offering occasional wan applause. The next day, Netanyahu addressed a joint session of Congress and received twenty-nine standing ovations.

Fifty years ago, before Israel became an undeclared nuclear power and its existence was under threat, any differences it had with the U.S. were usually aired in private. Today, the political dynamics in both countries—and the particulars of the relationship—have evolved. A majority of Israelis still favor the idea of a two-state solution, but the political mood has shifted markedly to the right. The reasons range from the deeply felt notion that the Palestinians were “offered the world and rejected it” to the rise of Hamas in Gaza, from the aftershock of terror attacks a decade ago to the instability throughout the Middle East. Likud has rejected relative moderates like Dan Meridor and Benny Begin; Netanyahu himself is considered a “dove” by some leaders of his coalition and members of his party. The consensus deepens that Oslo was a failure, and that, as Netanyahu says, “there is no partner for peace.” The Palestinians, for their part, argue that the settlements in the West Bank and Jewish expansion into East Jerusalem have created a “one-state reality.” They point out that members of Netanyahu’s coalition reject a two-state solution—“The land is ours!”—and endorse permanent Israeli control, or outright annexation, of the West Bank.

Netanyahu prides himself on understanding the American political climate. But his deepest relationships are with older, often wealthy members of the establishments in New York and Los Angeles, and he is less conscious of the changes in American demographics and in opinion among younger American Jews. Assaf Sharon, the research director of Molad, a progressive think tank in Jerusalem, said, “When Israelis see House members jump like springs to applaud every lame comment Bibi utters, they think he is a star in Washington. Then they are told by the local pundits that everything else is just personal friction with Obama. My sense is that the people surrounding Bibi—and the Prime Minister himself—don’t appreciate the significance of the shift.”

Yet the rhetoric of Netanyahu’s circle has never been more confident. In a recent talk, Dermer argued that Israel is a regional superpower, with much to give in its relationship with the U.S. “America’s most important ally in the twentieth century was Great Britain,” he said. “Your most important ally in the twenty-first century is going to be the State of Israel.” In a meeting with young Likud supporters last spring, which one of them transcribed online, Netanyahu boasted of defying Obama’s pressure to halt settlements; 2013 was a record year for settlement construction in the West Bank. He preferred to “stand up to international pressure by maneuvering,” he said. “What matters is that we continue to head straight toward our goal, even if one time we walk right and another time walk left.” When one of the Likudniks asked about peace talks with the Palestinians, Netanyahu is said to have replied, as the audience laughed, “About the—what?”

AIPAC’s hold on Congress has become institutionalized. Each year, a month or two before the annual policy conference, AIPAC officials tell key members what measures they want, so that their activists have something to lobby for. “Every year, we create major legislation, so they can justify their existence to their members,” the former congressional aide said. (AIPAC maintains that only members of Congress initiate legislative action.) AIPAC board meetings are held in Washington each month, and directors visit members of Congress. They generally address them by their first names, even if they haven’t met before. The intimacy is presumed, but also, at times, earned; local AIPAC staffers, in the manner of basketball recruiters, befriend some members when they are still serving on the student council. “If you have a dream about running for office, AIPAC calls you,” one House member said. Certainly, it’s a rarity when someone undertakes a campaign for the House or the Senate today without hearing from AIPAC.

In 1996, Brian Baird, a psychologist from Seattle, decided to run for Congress. Local Democrats asked if he had thought about what he was going to say to AIPAC. “I had admired Israel since I was a kid,” Baird told me. “But I also was fairly sympathetic to peaceful resolution and the Palestinian side. These people said, ‘We respect that, but let’s talk about the issues and what you might say.’ The difficult reality is this: in order to get elected to Congress, if you’re not independently wealthy, you have to raise a lot of money. And you learn pretty quickly that, if AIPAC is on your side, you can do that. They come to you and say, ‘We’d be happy to host ten-thousand-dollar fund-raisers for you, and let us help write your annual letter, and please come to this multi-thousand-person dinner.’ ” Baird continued, “Any member of Congress knows that AIPAC is associated indirectly with significant amounts of campaign spending if you’re with them, and significant amounts against you if you’re not with them.” For Baird, AIPAC-connected money amounted to about two hundred thousand dollars in each of his races—“and that’s two hundred thousand going your way, versus the other way: a four-hundred-thousand-dollar swing.”

The contributions, as with many interest groups, come with a great deal of tactical input. “The AIPAC people do a very good job of ‘informing’ you about the issues,” Baird told me. “It literally gets down to ‘No, we don’t say it that way, we say it this way.’ Always phrased as a friendly suggestion—but it’s pretty clear you don’t want to say ‘occupied territories’! There’s a whole complex semantic code you learn. . . . After a while, you find yourself saying and repeating it as if it were fact.”

Soon after taking office, Baird went on a “virtually obligatory” trip to Israel: a freshman ritual in which everything—business-class flights, accommodations at the King David or the Citadel—is paid for by AIPAC’s charitable arm. The tours are carefully curated. “They do have you meet with the Palestinian leaders, in a sort of token process,” Baird said. “But then when you’re done with it they tell you everything the Palestinian leaders said that’s wrong. And, of course, the Palestinians don’t get to have dinner with you at the hotel that night.”

In early 2009, after a brief truce between Israel and Hamas collapsed in a series of mutual provocations, Israel carried out Operation Cast Lead, an incursion into Gaza in which nearly fourteen hundred Palestinians were killed, along with thirteen Israelis. Baird visited the area a few weeks later and returned several times. As he wrote in an op-ed, he saw “firsthand the devastating destruction of hospitals, schools, homes, industries, and infrastructure.” That September, the U.N. Human Rights Council issued a report, based on an inquiry led by the South African jurist Richard Goldstone, that accused Israel of a series of possible war crimes. AIPAC attacked the report, saying it was “rigged.” A month later, an AIPAC-sponsored resolution to condemn the report was introduced in the House, and three hundred and forty-four members voted in favor. “I read every single word of that report, and it comported with what I had seen and heard on the ground in Gaza,” Baird said. “When we had the vote, I said, ‘We have member after member coming to the floor to vote on a resolution they’ve never read, about a report they’ve never seen, in a place they’ve never been.’ ” Goldstone came under such pressure that threats were made to ban him from his grandson’s bar mitzvah at a Johannesburg synagogue. He eventually wrote an op-ed in which he expressed regret for his conclusions, saying, “Civilians were not intentionally targeted as a matter of policy.” Other members of the council stood by the report.

In 2010, Baird decided not to run again for the House; he is now the president of Antioch University Seattle. Few current members of Congress are as outspoken about AIPAC as Baird. Staff members fret about whether AIPAC will prevent them from getting a good consulting job when they leave government. “You just hear the name!” a Senate aide said. “You hear that they are involved and everyone’s ears perk up and their mood changes, and they start to fall in line in a certain way.”

Baird said, “When key votes are cast, the question on the House floor, troublingly, is often not ‘What is the right thing to do for the United States of America?’ but ‘How is AIPAC going to score this?’ ” He added, “There’s such a conundrum here, of believing that you’re supporting Israel, when you’re actually backing policies that are antithetical to its highest values and, ultimately, destructive for the country.” In talks with Israeli officials, he found that his inquiries were not treated with much respect. In 2003, one of his constituents, Rachel Corrie, was killed by a bulldozer driven by an Israeli soldier, as she protested the demolition of Palestinians’ homes in Gaza. At first, he said, the officials told him, “There’s a simple explanation—here are the facts.” Or, “We will look into it.” But, when he continued to press, something else would emerge. “There is a disdain for the U.S., and a dismissal of any legitimacy of our right to question—because who are we to talk about moral values?” Baird told me. “Whether it’s that we didn’t help early enough in the Holocaust, or look at what we did to our African-Americans, or our Native Americans—whatever! And they see us, members of Congress, as basically for sale. So they want us to shut up and play the game.”

In 2007, John Mearsheimer and Stephen Walt, two leading political scientists of the realist school, published a book called “The Israel Lobby and U.S. Foreign Policy.” The book, a best-seller, presented a scathing portrait of AIPAC, arguing that the lobby had a nearly singular distorting influence on American foreign policy, and even that it was a central factor in the rush to war in Iraq. While the authors’ supporters praised their daring, their critics argued that they had neglected to point out any failures of the Palestinian leadership, and painted AIPAC in conspiratorial, omnipotent tones. Even Noam Chomsky, a fierce critic of Israel from the left, wrote that the authors had exaggerated the influence of AIPAC, and that other special interests, like the energy lobby, had greater influence on Middle East policy.

A broader political challenge to AIPAC came in 2009, with the founding of J Street, a “pro-Israel, pro-peace” advocacy group. Led by Jeremy Ben-Ami, a former Clinton Administration aide whose grandparents were among the first settlers in Tel Aviv, J Street was founded to appeal to American Jews who strongly support a two-state solution and who see the occupation as a threat to democracy and to Jewish values. J Street has only a tiny fraction of AIPAC’s financial power and influence on Capitol Hill, but it has tried to provide at least some campaign funding to weaken the lobby’s grip.

AIPAC and its allies have responded aggressively. This year, the Conference of Presidents of Major American Jewish Organizations voted not to admit J Street, because, as the leader of one Orthodox alliance said to the Times, its “positions are out of the mainstream of what could be considered acceptable within the Jewish community.” Danny Ayalon, the former Israeli Ambassador, told me, “When Jewish organizations join the political campaign to delegitimize Israel, they are really undermining our security collectively. Because I do believe that, if Israel’s security is compromised, so is that of every Jew in the world.”

Many Israeli and Palestinian leaders have taken note of the rise of J Street and, without overestimating its capacities, see that it represents an increasing diversity of opinion in the American Jewish community. At the last J Street convention, in Washington, Husam Zomlot, a rising figure in Fatah, the largest faction in the P.L.O., delivered a speech about the Palestinian cause and got a standing ovation. “AIPAC is not as effective as it was,” Zomlot said. “I wouldn’t say J Street is the mainstream representative of Jewish Americans, but it is a trend that gives you some sense of where things are and what is happening. Though it has limited funding, it is the first organized Jewish group with a different agenda in Washington since Israel was established. It’s worth noticing.”

Some politicians in Washington have indeed noticed, and not always to their benefit. Soon after J Street got started, it endorsed Robert Wexler, a Democratic congressman who represented a South Florida district. “Some AIPAC people told me they would not support me anymore if I went to a J Street event or took their support,” Wexler recalled. “I called them and said, ‘You’ve supported me for twelve years. You’re not going to support me because somebody from J Street endorsed me?’ ” Wexler added, “AIPAC is still by a factor of a hundred to one the premier lobbying organization for the Jewish community. I’ll never understand why they care one iota about J Street—but they have this bizarre fixation on it.”

Jan Schakowsky, who has represented a liberal Chicago district since 1999, was another of J Street’s first endorsees. For years, she had maintained good relations with AIPAC, whose members gave money to her campaigns and praised her positions. She voted to condemn the Goldstone report and signed a 2010 letter urging the Administration to keep any differences with Israel private. But in her 2010 race, she was challenged by Joel Pollak, an Orthodox Jew, who argued that she was insufficiently supportive of Israel. “We were very much aware that AIPAC-associated people were fund-raising for Jan’s opponent,” Dylan Williams, the director of government affairs for J Street, said. A small but vocal contingent of AIPAC members were behind Pollak. But he was also backed by the Tea Party, which J Street believed might drive away other Jewish voters. The new lobby raised seventy-five thousand dollars for Schakowsky (through its PAC, whose financial contributions are publicly disclosed), and she won by a wide margin. “It was exactly the type of race we had hoped for!” Williams said. “A lot of the power of AIPAC is based on this perception, which I believe is a myth, that if you cross their line you will be targeted, and your opponent in your next race will receive all this money, and it will make a difference.” Still, Schakowsky told me, the process was painful. “Getting booed in a synagogue was not a pleasure,” she said. “This is not just my base—it’s my family!” She added, “Increasingly, Israel has become a wedge issue, something to be used against the President by the Republicans, and it can be very unhelpful.”

AIPAC is still capable of mounting a show of bipartisanship. At this year’s policy conference, Steny Hoyer, the House Democratic Whip, appeared onstage with Eric Cantor, then the Republican House Majority Leader, and together they rhapsodized about the summer trip they routinely took, leading groups of mostly freshmen on an AIPAC tour of Israel. “Few things are as meaningful as watching your colleagues discover the Jewish state for the very first time,” Cantor said.

Hoyer offered a benediction: “We Baptists would say, ‘Amen.’ ”

Cantor and Hoyer have been steadfast supporters of AIPAC, and its members have held at least a dozen fund-raisers for them each year. But last December AIPAC’s efforts to implement sanctions against Iran were so intense that even this well-tempered partnership fractured. When Congress returned from its Thanksgiving recess, legislators in the House began discussing a sanctions bill. According to the former congressional aide, Cantor told Hoyer that he wanted a bill that would kill the interim agreement with Iran. Hoyer refused, saying that he would collaborate only on a nonbinding resolution.

Cantor sent Hoyer a resolution that called for additional sanctions and sought to define in advance the contours of an agreement with Iran. “The pressure was tremendous—not just AIPAC leadership and legislative officials but various board members and other contributors, from all over the country,” the former congressional aide recalled. “What was striking was how strident the message was,” another aide said. “ ‘How could you not pass a resolution that tells the President what the outcome of the negotiations has to be?’ ” Advocates for the sanctions portrayed Obama as feckless. “They said, ‘Iranians have been doing this for millennia. They can smell weakness. Why is the President showing weakness?’ ” a Senate aide recalled.

AIPAC was betting that the Democrats, facing midterms with an unpopular President, would break ranks, and that Obama would be unable to stop them. Its confidence was not unfounded; every time Netanyahu and AIPAC had opposed Obama, he had retreated. But Obama took up the fight with unusual vigor. He has been deeply interested in nonproliferation since his college days, and he has been searching for an opening with Iran since his Presidential campaign in 2008. As the Cantor-Hoyer resolution gathered momentum, House Democrats began holding meetings at the White House to strategize about how to oppose it.

Debbie Wasserman Schultz, the head of the Democratic National Committee, attended the meetings, at some political risk. Wasserman Schultz represents a heavily Jewish district in South Florida, and has been a reliable signature on AIPAC’s letters and resolutions; she has boasted of concurring with a hundred per cent of its positions. Now the lobby e-mailed out an “AIPAC Action Alert,” including the text of a story about the meetings in the conservative Washington Free Beacon, in which she was described as “siding with the Mullahs over the American people.” The alert asked AIPAC’s executive-council members to contact her office, ask if the story was true, and challenge her opposition to Cantor-Hoyer. Stephen Fiske, the chair of the pro-Israel Florida Congressional Committee PAC, sent a similar alert to Wasserman Schultz’s constituents, setting off a cascade of calls to her office. (Fiske told the Free Beacon that the callers included a team of young students: his son’s classmates at a Jewish day school in North Miami Beach.) Wasserman Schultz was furious. Soon afterward, she flew to Israel for the funeral of former Prime Minister Ariel Sharon. On the trip, she remarked to a colleague, “They’re doing this to me?”

But as the meetings continued Democrats began to build a consensus. In December, Ester Kurz, AIPAC’s director of legislative strategy, went to see Nancy Pelosi, the Minority Leader, to urge her to pass the resolution. Pelosi resisted, pointing out that many members of Hoyer’s caucus strongly opposed it. David Price, a Democrat, and Charles Dent, a Republican, had written a letter to the President, urging him to use the diplomatic opening that followed Rouhani’s election to attempt a nuclear agreement; it garnered a hundred and thirty-one signatures. Pointing to the letter, Pelosi demanded to know why AIPAC wanted this resolution, at this time.

The members of Hoyer’s caucus pressed him, and, on December 12th, just as the language of the resolution became final, he asked to set aside the effort, saying that the time was not right. His demurral—from someone who had rarely disappointed AIPAC—was a sign that the lobby might be in uncharted terrain. Two weeks after local AIPAC activists pressured Wasserman Schultz, a national board member issued a statement that called her “a good friend of Israel and a close friend of AIPAC.”

The crucial fight, though, was in the Senate. A couple of days before the Christmas recess, Robert Menendez and Mark Kirk introduced their sanctions bill, the Nuclear Weapon Free Iran Act of 2013. At first, senators were eager to express support—previous Iran-sanctions bills had passed by votes of 99–0—and, by the second week of January, Menendez and Kirk had secured the votes of fifty-nine senators, including sixteen Democrats. One more vote would enable the bill’s supporters to overcome a filibuster. A number of senators facing reëlection were told by AIPAC contacts that fund-raisers would be cancelled if they did not sign on, according to several employees of another lobby. (AIPAC denies this.)

In January, though, AIPAC’s effort stalled. Some senators complained that the bill called for immediate sanctions. In fact, a close reading of the bill makes plain that most of the sanctions would become active ninety days after enactment. But the sanctions, ostensibly intended to put pressure on the Iranian negotiators, were designed to go into effect automatically, no matter how the nuclear talks went. The bill also dictated to negotiators the acceptable terms of an agreement, and committed the U.S. to support any defensive military action that Israel took against Iran. On the Senate floor, Dianne Feinstein gave a pointed speech, in which she warned that, if the bill passed, “diplomatic negotiations will collapse,” and said, “We cannot let Israel determine when and where the United States goes to war.” Ten Senate committee chairmen—including Feinstein, who serves on the Select Committee on Intelligence, and Carl Levin, of Michigan, the head of the Armed Services Committee—wrote to Harry Reid, noting that the intelligence community believed that new sanctions would effectively halt the negotiations.

At the same time, AIPAC was urging Reid to bring the measure to a vote—and, as the former congressional aide noted, “you don’t alienate a key fund-raising base, especially when you may be about to lose the Senate.” But the pressure from the White House was even greater. Brad Gordon, AIPAC’s longtime legislative official, said ruefully, “I have not seen the Administration act with such force and such sustained effort . . . since Obama became President.” At a meeting with several dozen Democratic senators in January, Obama spoke at length about Iran, warning of the possibility of war. Senator Tom Carper, a Delaware Democrat, said later that the President “was as good as I’ve ever heard him.” As congressional Democrats continued to meet in the White House, Obama’s press secretary, Jay Carney, referred to the proposed sanctions as part of a “march to war.” Not long afterward, Bernadette Meehan, a National Security Council spokeswoman, said, “If certain members of Congress want the United States to take military action, they should be up front with the American public and say so.” Congressional offices were inundated with calls from constituents alarmed by the prospect of war. The decisive moment came in the State of the Union speech, when Obama said plainly, “If this Congress sends me a new sanctions bill now that threatens to derail these talks, I will veto it.”

About a week later, forty-two Republican senators sent a letter to Reid, demanding that he bring Menendez-Kirk to a vote, and noting that he had already “taken unprecedented steps to take away the rights of the minority in the Senate.” Reid’s staff members urged AIPAC officials to stop pressing for the bill; their office had been open to a bipartisan process, they argued, but siding with the Republicans against Obama was hardly bipartisan. According to a former Senate aide, the lobbyists seemed to realize that if they continued to push they would have to give up any claim to bipartisanship. Two days later, AIPAC issued a statement saying that the time was not right for a vote; Menendez issued a similar statement. “That was the fundamental moment when Menendez-Kirk lost,” the aide said.

AIPAC had sustained a painful defeat—and its annual policy conference was only a few weeks away. The day before the conference, according to a senior House Democrat, “AIPAC still did not have its ‘ask’ together.” Instead of dictating the terms of legislation, the lobby struggled to negotiate letters to the President, urging him to support sanctions. In the end, Cantor and Hoyer’s resolution was reduced to a letter, circulated in the House, that was so anodyne that most Democrats in the progressive caucus signed it.

Some of the House Democrats who had fought against the resolution were enjoying a new sense of confidence. For a month, David Price and his fellow-Democrat Lloyd Doggett had been gathering support for a letter to the President, saying that Congress should “give diplomacy a chance.” They expected to get perhaps forty signatures. Instead, they got a hundred and four, including those of four Republicans. “AIPAC tried to peel some away, but what’s striking is how few we lost,” Price said. A handful of Jewish members signed, including Jan Schakowsky. Wasserman Schultz did not. “It was a difficult policy spot for all of us, as Jewish members,” Schakowsky said. But, had the Cantor-Hoyer resolution passed, she continued, “it would have created an atmosphere surrounding the bargaining table that the President could not bargain in good faith. And it would for the first time have dramatically divided the Democrats.”

John Yarmuth, of Kentucky, another Jewish member who signed the letter, said, “AIPAC clearly has a great deal of clout in the Republican conference, and many Democrats still think that they have to be responsive to it.” But he believes that the letter was an important measure of congressional restiveness. “I think there is a growing sense among members that things are done just to placate AIPAC, and that AIPAC is not really working to advance what is in the interest of the United States.” He concluded, “We all took an oath of office. And AIPAC, in many instances, is asking us to ignore it.”

A few months later, the Gaza war began, and AIPAC mobilized again. “There were conference calls, mass e-mails, talking points for the day,” a congressional aide said. “AIPAC activists would e-mail me, with fifteen other AIPAC activists cc’d, and then those people would respond, saying, ‘I agree entirely with what the first e-mail said!’ ”

It didn’t hurt AIPAC’s cause that the enemy was Hamas, whose suicide bombings a decade ago killed hundreds of Israeli civilians, and whose rocket attacks in recent years have terrorized citizens, particularly in southern Israel. As Israel pressed its offensive, and hundreds of Palestinian civilians were killed, AIPAC argued, as did Netanyahu, that the casualties came only because Hamas was using human shields. Online, AIPAC posted a short film, “Israel’s Moral Defense,” which depicted an Israeli major in a quandary. Looking at a schoolyard filled with girls in neat uniforms, he sees fighters with a rocket launcher not far behind them. Should he order his men to fire their machine guns, and risk hitting the girls, or hold back, and risk the rocket killing Israelis? “I didn’t pull the trigger,” the soldier says. “We are totally different. . . . I am very proud to be in an army that has this level of morality.” A couple of weeks after the film appeared, Israeli shells struck a United Nations school in the Jabaliya refugee camp, killing twenty-one people and injuring more than ninety; it was the sixth U.N. school that Israel had bombed. The next day, the United Nations High Commissioner for Human Rights, Navi Pillay, pointed out that, as Israeli forces attacked homes, schools, and hospitals, the U.S. was supplying them with heavy weaponry. Almost simultaneously, the House passed an AIPAC-supported resolution denouncing Hamas’s use of human shields and condemning an inquiry into Israel’s Gaza operations that Pillay was sponsoring.

According to congressional staffers, some members of Congress seemed eager to make up for their recent apostasy on the Iran negotiations. While Reid and his colleagues went to extraordinary lengths to fund the Iron Dome missile-defense system, the House leadership engaged in the same mission. The vote in the House came late on the night of Friday, August 1st—the last possible moment before the summer recess. The earlier resolutions that AIPAC had sponsored during the war had passed unanimously, with no record of individual votes, but on this vote the roll was called. (AIPAC sometimes asks congressional leaders to call the roll when a decisive victory seems likely.) “I think AIPAC thought this vote would be one hundred per cent,” Jim Moran, a Democrat from Virginia, said. It was close: out of four hundred and thirty-five members, only eight voted no. Moran, who has been in Congress since 1990, and is retiring this year, was one of four Democrats who voted against the resolution. As a longtime member of the Defense Appropriations Committee, he did not believe that there was any urgent need for the funding. “We have put about nine hundred million dollars into the Iron Dome,” he argued. “We know that there are many millions unexpended in Israel’s Iron Dome account. And Israel was to get three hundred and fifty-one million on October 1st, for Iron Dome.”

Beto O’Rourke, a freshman Democrat from El Paso, also voted against the funding. “I tried to find him on the floor, but I couldn’t,” Moran said. “I wanted him to switch his vote. Now, he might not have switched it anyway, because—as shocking as it may be—he’s in Congress solely to do what he considers to be the right thing. I’m afraid he may have a tough race in November.” The morning after the vote, O’Rourke e-mailed a local AIPAC activist, Stuart Schwartz, to explain his vote, according to a knowledgeable person. In his explanation, which he also posted on Facebook, he pointed out that he had voted for Iron Dome in the past, and had supported the funds that were scheduled to arrive in October. But, he wrote, “I could not in good conscience vote for borrowing $225 million more to send to Israel, without debate and without discussion, in the midst of a war that has cost more than a thousand civilian lives already, too many of them children.” Within hours, O’Rourke was flooded with e-mails, texts, and calls. The next day, the El Paso Times ran a front-page story with the headline “O’ROURKE VOTE DRAWS CRITICISM.” In the story, Stuart Schwartz, who is described as having donated a thousand dollars to O’Rourke’s previous campaign, commented that O’Rourke “chooses to side with the rocket launchers and terror tunnel builders.” A mass e-mail circulated, reading “The Following Is Shameful, El Paso Has an Anti-Israel Congressman. . . . Do Not Reëlect Beto O’Rourke.” At the bottom was the address of AIPAC’s Web site, and a snippet of text: “AIPAC is directly responsible for the overwhelming support this legislation received on the Hill. If you are not a member of AIPAC, I strongly recommend that you join. Every dollar helps fund this important work in Congress.”

The day that Congress passed the Iron Dome bills happened to be an especially deadly one in Gaza. In the city of Rafah, Israeli troops pursued Hamas fighters with such overwhelming force that about a hundred and fifty Palestinians were killed, many of them women and children. Israel’s critics in the region have been energized. Hanan Ashrawi, a Palestinian legislator, told me that Congress had sent a clear message by funding Iron Dome that day. “Congress was telling Israel, ‘You go ahead and kill, and we will fund it for you.’ ” She argued that Israelis had dominated American political discourse on the war, as they have for decades on the Israeli-Palestinian conflict. “They say, ‘The Palestinians are all terrorists, they are the people we don’t know, they are alien, foreign, strange—but Israelis are like us.’ Who shaped the presentation, in the U.S.? AIPAC, to a large degree.”

Yet the war has broad support in Israel. According to the Israel Democracy Institute, just six per cent of the Jewish population believes that the Israeli Army has used excessive force. Of those who expressed an opinion, almost half believe that the force has not been severe enough. The left, finding itself increasingly isolated, is deeply critical of AIPAC. Zeev Sternhell, a leading Israeli intellectual and an expert on European fascism, told me, “I consider AIPAC’s role to have been absolutely disastrous, because it prevents any possibility to move with the Palestinians. We cannot move without American intervention—but we are more or less free of American intervention. This is AIPAC’s job. So the present coalition has this sentiment of impunity.”

In the U.S., the war has created tense disagreement, dividing left and right, young and old. Congress showed no such uncertainty, which is a triumph for AIPAC. But the lobby also faces an inevitable question about the extent to which young liberals like O’Rourke represent the future. When I asked Dore Gold, an external adviser to the Netanyahu government, about AIPAC’s prospects, he spoke in determinedly upbeat tones, dismissing the Iran-sanctions episode. “A political loss does not necessarily mean that a political organization has reached its sunset years,” he said. “To the contrary, it can give added motivation for people who are concerned with the implications of Iran crossing the nuclear threshold.” Still, he said, “when issues become so partisan, it is harder for an organization like AIPAC. You have to fight that.” For decades, AIPAC has maintained a hugely successful model, creating widespread support from an unlikely base, and tapping into a seemingly endless wellspring of support from the American Jewish community. But bipartisanship is a relic now, and a generation of unquestioning adherents is aging. Like its embattled allies in Congress, AIPAC needs to reach constituents who represent the country as it will look in the coming decades.

At AIPAC’s policy conference last March, Olga Miranda, the president of S.E.I.U. Local 87, gazed out at the crowd that filled the darkened Washington Convention Center—a gathering she dubbed the “Jewish Super Bowl.” Large video screens displayed her image. A lively woman with long black hair and a commanding voice, Miranda proclaimed, “I am a union leader, I am Joaquin’s mother, I am one of nine children raised by a single mother, I am a Chicana—and I am AIPAC!” For years, she explained, her information about the Middle East had come from television, and she sympathized with the Palestinians, until one day she got a call from someone at AIPAC who asked her if she’d be interested in a trip to Israel. That trip changed her life, she said. Now she argues about Israel with her friends and colleagues. “See you on the picket lines!” she shouted.

“The face of pro-Israel activists has changed pretty dramatically,” David Victor, a former AIPAC president, told me. In the past eight years, AIPAC has reached out to Hispanics, African-Americans, and evangelical Christians, in the hope that greater diversity will translate into continued support in Congress. Victor pointed out that this year’s AIPAC conference was bigger than ever. In 2008, when he was president, eight thousand members attended; this year, there were fourteen thousand, including two hundred and sixty student-government presidents. “These are future opinion leaders,” he said.

Those opinion leaders face a difficult task when they return to campus. Many young American Jews believe that criticism is vital to Israel’s survival as a democratic state. Some are even helping to support a campaign known as B.D.S., for Boycott, Divestment, and Sanctions, which is aimed at ending the occupation and recognizing the rights of Palestinian refugees and citizens. In June, the U.S. branch of the Presbyterian church voted to divest from three companies seen as profiting from the military occupation of the West Bank. (One was Caterpillar, the construction-equipment company, which Rachel Corrie’s parents had sued, unsuccessfully.) The church took care to affirm Israel’s right to exist and to disavow an endorsement of the B.D.S. movement. J Street, likewise, has said that B.D.S. can be “a convenient mantle for thinly disguised anti-Semitism.” But the movement persists, particularly on campuses and in left-wing circles.

Ironically, there is also a threat to AIPAC from the right. Many American conservatives were enraged by the perception that AIPAC had surrendered in the fight for Iran sanctions. Shortly after Menendez set aside his efforts to pass the bill, AIPAC issued a statement vowing to try again later. “They did that because there was an eruption from the other side,” a former Senate aide said. “ ‘How could you sell out the Republican caucus, when we were advocating exactly what Bibi Netanyahu was!’ ” Republicans were frustrated by the lobby’s refusal to move forward at the expense of Democrats, the aide said: “I know AIPAC has its commitment to bipartisanship. But what good is that commitment if in the end you don’t achieve your policy objective?”

For AIPAC’s most severe conservative critics, its attempts to occupy a diminishing sliver of middle ground are unacceptable. Recently, Sheldon Adelson, who funded AIPAC’s new office building a few years ago, has been increasing his support for the right-wing Zionist Organization of America. Mort Klein, the head of the Z.O.A., told me, “Adelson is not happy with AIPAC, clearly.” Several people affiliated with the right-wing Jewish movement told me that significant donors are talking about founding a new organization.

Caught between the increasingly right-leaning Israel and the increasingly fractious United States, AIPAC has little space to maneuver. Wittmann, the spokesman, said, “Our positions in support of the Oslo process and the two-state solution have generated criticism from some on the right, just as our stand for strong prospective Iran sanctions has spurred criticism from some on the left”—a statement of bipartisan intent, but also of the difficulty of contemporary politics. Recently, the lobby has begun another outreach effort, focussed on progressive Democrats. At the conference, Olga Miranda and Ann Lewis, a senior adviser to Hillary Clinton’s 2008 Presidential campaign, spoke on a panel called “The Progressive Case for Israel.” Lewis told me that she has recently been involved in conversations with AIPAC staff and board members about finding ways to improve AIPAC’s connections with progressive Democrats. “They are exploring how to reach progressives, but they’re lost on this!” a leader in the pro-Israel community who is knowledgeable about the effort said. “They don’t know how to bridge the gap. People see AIPAC as representing issues that are anathema to them. It’s an enormous challenge.”

At the conference, the extent of the challenge was clear. Even Netanyahu seemed struck by the mood. At one point in his speech, he said, “I hope that the Palestinian leadership will stand with Israel and the United States on the right side of the moral divide, the side of peace, reconciliation, and hope.” The audience members responded with scant, listless applause. “You can clap,” the Prime Minister said.


Bill Gates Takes On The NRA

It was reported Monday that Bill Gates, Microsoft co-founder and incredibly wealthy guy, and his wife, Melinda, have given $1 million to Initiative 594 in Washington state. The ballot initiative, if passed by voters on November 4 (and it currently enjoys overwhelming support), will require universal background checks for all firearm purchases in the state.

Gates is only the latest Washington billionaire to give to the effort, with original Amazon investor Nick Hanauer providing crucial early funding, and more recently upping his overall donation to $1.4 million. Additionally, Gates’s Microsoft co-founder, Paul Allen, has provided $500,000 for the cause.

But Gates’s fame brings more attention and further legitimizes the initiative in a way that almost nobody else could. After the Gates Foundation made combating malaria around the world a priority in 2000, deaths from the mosquito-borne disease fell by 20 percent in 11 years, saving the lives of 1 million African children in the process.

Gates has the ability to grab headlines and make an issue go viral with the constant media coverage he receives, and the financial ability, if he wins, to fund similar efforts around the country. His involvement could be the answer to the public health crisis in which American children account for 93 percent of all the children murdered in the world’s 26 high-income countries.

Meanwhile, the NRA has…Chuck Norris, doing its “Trigger The Vote” Campaign. An actor, in the sense that he showed up in films, who was last seen round-housing Vietnamese extras in B-movies in the ’80s, back when he was only pushing 50. In more recent times, the more Methuselah-esque-appearing Norris has spent his time warning us of 1,000 years of darkness if President Obama is reelected. (He was. Boo!)

That, in short, is why the guy with the French-sounding name, National Rifle Association head honcho Wayne LaPierre, is probably somewhere drowning his sorrows in his Pernod. Because Gates’ involvement in this issue is just about the last thing LaPierre needs.

Already, the NRA has shown its disdain for anyone with the guts and resources to take on its political cartel of legally bribed legislators around the country. It was used to having the field to itself financially in the 2000s, until along came New York City Mayor Michael Bloomberg. After seeing his constituents and police force victimized by lax gun laws out of state, lobbied for by the NRA, he decided it was time to do something.

The now-former mayor’s activism has drawn the ire of LaPierre & Company, who’ve just released a multimillion-dollar advertising campaign blasting Bloomberg, replete with his supposed sneering at “flyover country” in between the coasts. Which LaPierre clearly doesn’t do while receiving his million-dollar-plus compensation in the wealthy Northern Virginia suburbs of Washington, D.C.

Ironically, it was in Virginia where Bloomberg’s organization, Everytown for Gun Safety, had one of its biggest victories, when it elected a governor, lieutenant governor and attorney general in 2013. None of whom thought a 12-year-old should be able to open-carry an Uzi in St. Patrick’s Cathedral, because of, you know, freedom. Suddenly those who agree with the 90 percent of the country who support universal background checks had access to similar, if not greater, financial resources than those who pledged their allegiance to an arms dealer-funded front group.

Bloomberg is worth $33 billion, but if that’s not enough, Gates is worth well over two times that amount. Who knows, with that kind of dough, maybe even measures that “only” enjoy 56 percent support like bans on assault weapons and/or high-capacity magazines could pass via direct voting by uncorrupted American citizens. Or perhaps state legislators and members of Congress who bend easily to the will of these Lords of War could be swapped out for those who live in a closer neighborhood to the best interests of the American populace.

Likely the NRA will try to do to Gates what it has attempted to do to Bloomberg for a few years now, and seek to make this fight about him and not its right-wing radicalism in the service of avarice. He’s a billionaire trying to influence our political process, after all, unlike Manhattan resident David Koch, who along with his brother Charles has polluted our political process to no end, including funding the NRA.

Sure, in an ideal world big money wouldn’t play such an outsize role in our elections, such as this hugely important ballot initiative in Washington state. But that’s not what the NRA wants. It just wants its own big money to be all that decides the outcome, and it no longer is. Which is why Wayne LaPierre’s having a bad day.


American Racialism


Many white Americans say they are fed up with the coverage of the shooting of Michael Brown in Ferguson, Mo. A plurality of whites in a recent Pew survey said that the issue of race is getting more attention than it deserves.

Bill O’Reilly of Fox News reflected that weariness, saying: “All you hear is grievance, grievance, grievance, money, money, money.”

Indeed, a 2011 study by scholars at Harvard and Tufts found that whites, on average, believed that anti-white racism was a bigger problem than anti-black racism.

Yes, you read that right!

So let me push back at what I see as smug white delusion. Here are a few reasons race relations deserve more attention, not less:

• The net worth of the average black household in the United States is $6,314, compared with $110,500 for the average white household, according to 2011 census data. The gap has worsened in the last decade, and the United States now has a greater wealth gap by race than South Africa did during apartheid. (Whites in America on average own almost 18 times as much as blacks; in South Africa in 1970, the ratio was about 15 times.)

• The black-white income gap is roughly 40 percent greater today than it was in 1967.

• A black boy born today in the United States has a life expectancy five years shorter than that of a white boy.

• Black students are significantly less likely than white students to attend schools offering advanced math and science courses. They are three times as likely to be suspended and expelled, setting them up for educational failure.

• Because of the catastrophic experiment in mass incarceration, black men in their 20s without a high school diploma are more likely to be incarcerated today than employed, according to a study from the National Bureau of Economic Research. Nearly 70 percent of middle-aged black men who never graduated from high school have been imprisoned.

All these constitute not a black problem or a white problem, but an American problem. When so much talent is underemployed and overincarcerated, the entire country suffers.

Some straight people have gradually changed their attitudes toward gays after realizing that their friends — or children — were gay. Researchers have found that male judges are more sympathetic to women’s rights when they have daughters. Yet because of the de facto segregation of America, whites are unlikely to have many black friends: A study from the Public Religion Research Institute suggests that in a network of 100 friends, a white person, on average, has one black friend.

That’s unfortunate, because friends open our eyes. I was shaken after a well-known black woman told me about looking out her front window and seeing that police officers had her teenage son down on the ground after he had stepped out of their upscale house because they thought he was a prowler. “Thank God he didn’t run,” she said.

One black friend tells me that he freaked out when his white fiancée purchased an item in a store and promptly threw the receipt away. “What are you doing?” he protested to her. He is a highly successful and well-educated professional but would never dream of tossing a receipt for fear of being accused of shoplifting.

Some readers will protest that the stereotype is rooted in reality: Young black men are disproportionately likely to be criminals.

That’s true — and complicated. “There’s nothing more painful to me,” the Rev. Jesse Jackson once said, “than to walk down the street and hear footsteps and start thinking about robbery — then look around and see somebody white and feel relieved.”

All this should be part of the national conversation on race, as well, and prompt a drive to help young black men end up in jobs and stable families rather than in crime or jail. We have policies with a robust record of creating opportunity: home visitation programs like Nurse-Family Partnership; early education initiatives like Educare and Head Start; programs for troubled adolescents like Youth Villages; anti-gang and anti-crime initiatives like Becoming a Man; efforts to prevent teen pregnancies like the Carrera curriculum; job training like Career Academies; and job incentives like the earned-income tax credit.

The best escalator to opportunity may be education, but that escalator is broken for black boys growing up in neighborhoods with broken schools. We fail those boys before they fail us.

So a starting point is for those of us in white America to wipe away any self-satisfaction about racial progress. Yes, the progress is real, but so are the challenges. The gaps demand a wrenching excavation of our national soul, and the first step is to acknowledge that the central race challenge in America today is not the suffering of whites.


Rotherham and British Social Services


The ball bounces high in the shadows off the gable end and a handful of kids chase it down the road. Under the stairway to the flats nearby, half a dozen teenage girls lie sprawled on the concrete, sheltering from the slate-grey drizzle. They watch the ball ping back up the street, strung out in the fading evening light, as the acrid smell of cannabis hangs overhead. Further down the road, a group of lads in hoodies mill around the off-licence asking passers-by if they can buy a few cans of strong lager for them.

It’s a scene you’ll find in many parts of northern England and one I’m all too familiar with. Even now, when I see the boredom and despair in kids’ eyes out on the streets, the same feeling comes back to me. Growing up in a single-parent family near Burnley, drinking at 14 and hanging around off-licences asking grown-ups to buy me a drink, just as I see in Rochdale now, I knew about the vulnerability of kids roaming the streets with nothing to do. There were dangers then, but now it’s worse. For gangs of men looking to groom kids to be violently abused, they’re easy prey.

Scenes like this are not far from Rochdale Council’s new £50 million offices. But when I spoke to child-protection bosses in the wake of Rochdale’s grooming scandal a few years ago, in which young girls had been repeatedly raped by gangs of men, I may as well have been in a foreign country. Despite talking about a reality that existed a few minutes’ drive from their offices, there was no awareness of what life was like for these kids. No connection, no empathy. The head of the Rochdale Safeguarding Children Board told me they needed to take “a deeper dive into the theory” to understand the problem. The director of children’s services implied to me that young girls who were being raped were “making lifestyle choices”. She later admitted to an incredulous home affairs select committee that she’d never met any of the victims.

The impression I got was that they viewed these girls the way an astronomer views planets through a telescope. Their lives were so far removed from the girls’ experiences that, to them, the girls might as well have been a remote dot.

The Rochdale grooming scandal would never have come to light had it not been for the fantastic health care workers who helped these young girls. They listened, they understood and they cared. They were steeped in working-class community values, not remote theory. One of them in particular tried desperately hard to get the police and social services to listen to the girls and take action, but to no avail.

The problem then, as now in Rotherham, was a middle-class management in children’s services that simply didn’t want to know and didn’t care. The author of the Rotherham report, Prof Alexis Jay, said last week that a group of senior managers held a view that couldn’t be challenged. This despotic approach is ruining social work and failing families. Where once there was a fair representation of working-class social workers who could listen and relate to all manner of challenging families, the profession is now stuffed with textbook professionals bereft of emotional intelligence and incapable of relating to troubled kids. And the good ones still left are all too often forced to adopt foolish practices that fly in the face of common sense.

Sit before children’s services managers and you’re likely to hear endless waffle about guidelines, policies, procedures, strategy and thresholds. But they won’t mention the kids. Worse still, management never refers to practitioners or seeks advice from those at the coalface. Experience has no currency. Cold, remote theory rules. In the wake of the Rochdale grooming scandal, a Serious Case Review was critical of the health workers whose outreach work had uncovered an endemic child abuse problem. Amazingly, they were criticised for having the wrong qualifications.

Once you start heading down this road, where management exists in a bubble and an organisation’s values come from textbooks rather than the people they serve, then you end up with situations like Rotherham. Where political correctness and cultural sensitivity are more important than child rape. Managers become more interested in ticking boxes in diversity training than protecting children. And social-work bosses ban families from looking after children because they’re members of Ukip and not sufficiently versed in multiculturalism.

This is dogma for breakfast, lunch and dinner. Common sense is not on the menu.

The collapse of the banks a few years back was brought about by bad management and the cult of leadership. The leaders of these banks not only followed a tick-box culture that allowed them to avoid their responsibilities, but also had no concept of the values within their organisation and didn’t even understand their own complex financial instruments.

In many areas of social work, the same tick-box culture, lack of values and a failure to have the remotest understanding of the complex lives of those being dealt with is bringing about a similar collapse. And like the banks, it will have far-reaching consequences for society.

We’re also starting to see a worrying cult of leadership. Highly paid managers are seemingly untouchable and distant from front-line workers. The rise of the unsackable, unaccountable and unapologetic public-sector manager is a trend that will only see services continue to deteriorate. And let’s be clear about what that means. It won’t be just missed targets or a poor Ofsted rating. We’re storing up huge social costs.

Think of the 1,400 children abused in Rotherham. They were beaten, had guns pointed at their heads, were routinely gang raped, had petrol poured over them and were threatened with being set alight. What do you think happens to these kids?

They don’t tend to end up in nice jobs, washing their cars in the suburban sunshine on a Sunday morning. They grow up angry, resentful and lost to society. The Prime Minister talks about sending in welfare squads to tackle “problem families”, but what about the lost generation that’s being created now by allowing children to be abused on an industrial scale?

I’ve sat before these people and listened to their stories. I remember every one. Some have managed to turn their lives around, and these stories are inspirational. But, for most, the burden of abuse is too heavy to bear. Kids who walk through the fire of extreme abuse make very different choices to the rest of us. They end up joining the Foreign Legion, committing violent crimes, taking drugs or sleeping on the streets. When I looked at the criminal record of one victim and asked why he couldn’t stay on the straight and narrow, he told me it was safer in prison.

Most struggle to form relationships, and the human cost is massive. “I don’t have friends, I prefer to be on my own because I don’t trust people,” one sexually abused man told me. He’d made a living over the past 20 years working with the travelling community on the margins of society. These were the only people who didn’t judge him, he said.

But talking to bosses in children’s services and the multitude of highly paid professionals running our protective agencies, you’ll never get any real understanding of the depth or complexity of the people they’re dealing with. It’s become a cold science where the hard work of gaining trust, taking a human approach and supporting people has been replaced with a detached, long-lens view. And it’s not just management that has this view. At one point last year, I made a complaint on behalf of one of the Rochdale grooming victims as a social worker kept appearing outside her house, peering through the window. Is that the “help” a survivor of sexual abuse needs?

This dearth of understanding does not only relate to victims. Too little is known about the perpetrators of these crimes and too frequently I get the impression that politically correct reasons prevent authorities from trying to find out more or challenge these people.

Some are poor men from rural Kashmiri communities, or second- or third-generation Kashmiris or Pakistanis who have developed or inherited an openly violent misogyny. I visited one abuser in prison – he’d attacked a female prostitute with a hammer and was clearly mentally ill. I asked the family about his wife, who’d come over from Kashmir two years before and spoke no English, only to be told that she knew he was in prison but wasn’t aware of the crimes he’d committed.

I’ve also had family members come to my surgery asking me to make representations on behalf of brothers who have been found guilty of child sex abuse. When I refuse, I frequently receive a tirade of abuse. “These girls are prostitutes,” one man shouted at me, and warned that I would pay a heavy price for not supporting him. He’d get thousands of people not to vote for me.

As a Labour politician, it can be difficult challenging some of these issues, but you can’t ignore child abuse and violent misogyny. Three years ago, former home secretary Jack Straw said some Pakistani men see white girls as “easy meat” for abuse. He was accused of perpetuating racist attitudes. Like all political parties, the Labour Party is a broad church. But I fear too many hold the view expressed by former Rotherham MP, Denis MacShane, last week. He avoided child abuse in his constituency, he told the BBC, because he was “a Guardian-reading liberal Leftie” and didn’t want to “rock the multicultural community boat”.

Last week I received a text message from a current Labour MP saying she was disappointed by my views on this issue. I was only elected in 2010 and already I’ve found that politicians are sometimes discouraged from exploring and investigating complex issues because they’re expected to stay tethered to a dominant ideology and not stray far from the stock replies to difficult questions. This does nothing to strengthen democracy. It weakens it, and creates cynicism. The public want to see matters like this discussed and they want politicians to come up with answers, not just endless hand-wringing.

I’m in no doubt that Rotherham is not an isolated case, and the same kind of abuse is happening right now in towns and cities across the country. As shocking as the Rotherham report is, the fallout from this type of abuse and the long-term social consequences are even more horrifying.

Far too many people are sliding into an underclass as a result of violent abuse that fails to register with protective agencies. Inquiries, reports and media scrutiny are only the beginning of the change we need. If we’re going to save a lost generation from having childhood innocence ripped from them, then we need to stop obsessing about multiculturalism and reform children’s services now.


White Privilege


(comment in a Fark thread after Ferguson)

"It all comes down to family, culture, personal responsibility,"

This is about the collectivization of every misdeed committed by a black person, the way all black people are implicated and have responsibilities imposed on them. When a white man beats his children or kills his wife or robs a liquor store or commits insider trading, nobody tells Bill O'Reilly that he, as a white person, needs to do something about it. And he sure as hell doesn't go on the air and say that white people need better role models. There isn't a thing called "white on white crime," but there is a thing called "black on black crime," because crimes committed by black people are black crimes, born from blackness and soiling all black people, but crimes committed by white people have nothing to do with the race of the perpetrators; they're just crimes, no modifier needed.

My guess is that if you asked Bill O'Reilly what responsibility white musicians or white politicians have for the thousands of white crimes committed every year, he would have no idea what you're talking about. It would sound like gibberish to him. As I've written before, a big part of the privilege of whiteness is that you don't have to have responsibility for anyone else. You can be just yourself. The security guard is not going to follow you around in a store because some other white person shoplifted there last week. A TV host is not going to demand that you defend something stupid another white person said, for no reason other than the fact that the two of you are white. No one is going to think that because of the music you're playing, it might be a good idea to fire ten bullets into your car.

Creating that broad black responsibility doesn't just happen; it has to be reinforced and maintained. Nobody does it with more vigor than Bill O'Reilly and the rancid cauldron of race-baiting that is the network for which he works.

    

 

The Pyramids and Jewish Slaves

 

The stories we hear in Sunday school seem to form the basis for the popular belief that Jewish slaves were forced to build the pyramids in Egypt, but they were saved when they left Egypt in a mass Exodus. That's the story I was raised to believe, and it's what's been repeated innumerable times by Hollywood. In 1956, Charlton Heston as Moses went head to head with Yul Brynner as Pharaoh Ramesses II in The Ten Commandments, having been placed into the Nile in a basket as a baby to escape death by Ramesses' edict that all newborn Hebrew sons be killed. More than 40 years later, DreamWorks told the same story in the animated The Prince of Egypt, and the babies died again.

In 1977, Israeli prime minister Menachem Begin visited Egypt's National Museum in Cairo and stated "We built the pyramids." Perhaps to the surprise of a lot of people, this sparked outrage among the Egyptian people, who are proud that they built the pyramids themselves. The belief that Jews built the pyramids may be prominent throughout Christian and Jewish populations, but it's certainly not the way anyone in Egypt remembers things.

Pop culture has a way of blurring pseudohistory and real history, and many people end up never hearing the real history at all; and are left with only the pseudohistory and no reason to doubt it. This is not only unfortunate, it's dangerous. In the words of Primo Levi inscribed front and center inside Berlin's Holocaust Museum, "It happened, therefore it can happen again." 20th century Jewish history is probably the most important, and hardest learned, lesson that humanity has ever had the misfortune to be dealt. Forgetting or distorting history is always wrong, and is never in anyone's best interest.

I've heard some Christians say the Bible is a literal historical document, thus Jewish slaves built the pyramids (the Bible actually doesn't mention pyramids at all, this came from Herodotus. See below. - BD); and I've heard some non-religious historians say there's no evidence that there were ever Jews in ancient Egypt. Both can't be true. To find the truth, we need to take a critical look at the archaeological and historical evidence for the history of Jews in Egypt. In order to do this responsibly, we first have to put aside any ideological motivations that would taint our efforts. We're not going to say such research is sacrilegious because it seeks to disprove the Bible or the Torah; we're not going to say such research is a moral imperative because religious accounts are deceptive; and we're not going to pretend that such research is racially motivated against either Jews or Egyptians. We simply want to know what really happened, because true history is vital.

One of the first things you find out is that it's important to get the definitions right. Terms like Jew and Hebrew are thrown around a lot in these histories, and they're not the same thing. A Jew is someone who practices the Jewish religion. A Hebrew is someone who speaks the Hebrew language. An Israelite is a citizen of Israel. A Semite is a member of an ethnic group characterized by any of the Semitic languages, including Arabic, Hebrew, Assyrian, and many smaller groups throughout Africa and the Middle East. You can be some or all of these things. An Israelite need not be a Jew, and a Jew need not be a Hebrew. Confusion over the use of these terms complicates research. Hebrews could be well integrated into a non-Jewish society, but modern reporting might refer to them as Jews, which can be significantly misleading.

Now, there's more than just a single question we're trying to answer here. Were the Jews slaves in ancient Egypt? Were the pyramids built by these slaves? Did the Exodus happen as is commonly believed?

The biggest and most obvious piece of evidence — the pyramids themselves — is an easy starting point. Their age is well established. The bulk of the Giza Necropolis, consisting of such famous landmarks as the Great Pyramid of Cheops and the Sphinx, includes some of Egypt's oldest large pyramids and was completed around 2540 BCE. Most of Egypt's large pyramids were built over a 900-year period, from about 2650 BCE to about 1750 BCE.

We also know quite a lot about the labor force that built the pyramids. The best estimates are that 10,000 men spent 30 years building the Great Pyramid. They lived in good housing at the foot of the pyramid, and when they died, they received honored burials in stone tombs near the pyramid in thanks for their contribution. This information is relatively new, as the first of these worker tombs was only discovered in 1990. They ate well and received the best medical care. And, also unlike slaves, they were well paid. The pyramid builders were recruited from poor communities and worked shifts of three months (including farmers who worked during the months when the Nile flooded their farms), distributing the pharaoh's wealth out to where it was needed most. Each day, 21 cattle and 23 sheep were slaughtered to feed the workers, enough for each man to eat meat at least weekly. Virtually every fact about the workers that archaeology has shown us rules out the use of slave labor on the pyramids.

It wasn't until almost 2,000 years after the Great Pyramid received its capstone that the earliest known record shows evidence of Jews in Egypt, and they were neither Hebrews nor Israelites. They were a garrison of soldiers from the Persian Empire, stationed on Elephantine, an island in the Nile, beginning in about 650 BCE. They fought alongside the Pharaoh's soldiers in the Nubian campaign, and later became the principal trade portal between Egypt and Nubia. Their history is known from the Elephantine Papyri discovered in 1903, which are in Aramaic, not Hebrew; and their religious beliefs appear to have been a mixture of Judaism and pagan polytheism. Archival records recovered include proof that they observed Shabbat and Passover, and also records of interfaith marriages. In perhaps the strangest reversal from pop pseudohistory, the papyri include evidence that at least some of the Jewish settlers at Elephantine owned Egyptian slaves.

Other documentation also identifies the Elephantine garrison as the earliest immigration of Jews into Egypt. The Letter of Aristeas, written in Greece in the second century BCE, records that Jews had been sent into Egypt to assist Pharaoh Psammetichus I in his campaign against the Nubians. Psammetichus I ruled Egypt from 664 to 610 BCE, which perfectly matches the archaeological dating of the Elephantine garrison in 650.

If Jews were not in Egypt at the time of the pyramids, what about Israelites or Hebrews? Israel itself did not exist until approximately 1100 BCE when various Semitic tribes joined in Canaan to form a single independent kingdom, at least 600 years after the completion of the last of Egypt's large pyramids. Thus it is not possible for any Israelites to have been in Egypt at the time, either slave or free; as there was not yet any such thing as an Israelite. It was about this same time in history that the earliest evidence of the Hebrew language appeared: The Gezer Calendar, inscribed in limestone, and discovered in 1908. And so the history of Israel is very closely tied to that of Hebrews, and for the past 3,000 years, they've been essentially one culture.

But if neither Jews nor Israelites nor Hebrews were in Egypt until so many centuries after the pyramids were built, how could such a gross historical error become so deeply ingrained in popular knowledge? The story of Jewish slaves building the pyramids originated with Herodotus of Greece in about 450 BCE. He's often called the "Father of History" as he was among the first historians to take the business seriously and thoroughly document his work. Herodotus reported in Book II of The Histories that the pyramids were built in 30 years by 100,000 Jewish slaves [In point of fact, Herodotus only says 100,000 workers. He does not mention either Jews or slaves. So even this popular belief seems to be in error, and the origin of the idea of Jews building the pyramids remains a mystery - BD]. Unfortunately, in his time, the line between historical fact and historical fiction was a blurry one. The value of the study of history was not so much to preserve history as it was to furnish material for great tales; as a result, Herodotus was also called the "Father of Lies", and other Greek historians of the period were also grouped under the term "liars". Many of Herodotus' writings are considered to be fanciful by modern scholars. Coincidentally, the text of the Book of Exodus was finalized at just about exactly the same time as Herodotus wrote The Histories. Obviously, the same information about what had been going on in Egypt 2,000 years before was available to both authors.

(From Brian Dunning's Skeptoid)

 

    

 

Homes For The Homeless

 

In 2005, Utah set out to fix a problem that’s often thought of as unfixable: chronic homelessness. The state had almost two thousand chronically homeless people. Most of them had mental-health or substance-abuse issues, or both. At the time, the standard approach was to try to make homeless people “housing ready”: first, you got people into shelters or halfway houses and put them into treatment; only when they made progress could they get a chance at permanent housing. Utah, though, embraced a different strategy, called Housing First: it started by just giving the homeless homes.

Handing mentally ill substance abusers the keys to a new place may sound like an example of wasteful government spending. But it turned out to be the opposite: over time, Housing First has saved the government money. Homeless people are not cheap to take care of. The cost of shelters, emergency-room visits, ambulances, police, and so on quickly piles up. Lloyd Pendleton, the director of Utah’s Homeless Task Force, told me of one individual whose care one year cost nearly a million dollars, and said that, with the traditional approach, the average chronically homeless person used to cost Salt Lake City more than twenty thousand dollars a year. Putting someone into permanent housing costs the state just eight thousand dollars, and that’s after you include the cost of the case managers who work with the formerly homeless to help them adjust. The same is true elsewhere. A Colorado study found that the average homeless person cost the state forty-three thousand dollars a year, while housing that person would cost just seventeen thousand dollars.

Housing First isn’t just cost-effective. It’s more effective, period. The old model assumed that before you could put people into permanent homes you had to deal with their underlying issues—get them to stop drinking, take their medication, and so on. Otherwise, it was thought, they’d end up back on the streets. But it’s ridiculously hard to get people to make such changes while they’re living in a shelter or on the street. “If you move people into permanent supportive housing first, and then give them help, it seems to work better,” Nan Roman, the president and C.E.O. of the National Alliance to End Homelessness, told me. “It’s intuitive, in a way. People do better when they have stability.” Utah’s first pilot program placed seventeen people in homes scattered around Salt Lake City, and after twenty-two months not one of them was back on the streets. In the years since, the number of Utah’s chronically homeless has fallen by seventy-four per cent.

Of course, the chronically homeless are only a small percentage of the total homeless population. Most homeless people are victims of economic circumstances or of a troubled family environment, and are homeless for shorter stretches of time. The challenge, particularly when it comes to families with children, is insuring that people don’t get trapped in the system. And here, too, the same principles have been used, in an approach called Rapid Rehousing: quickly put families into homes of their own, rather than keeping them in shelters or transitional housing while they get housing-ready. The economic benefits of keeping people from getting swallowed by the shelter system can be immense: a recent Georgia study found that a person who stayed in an emergency shelter or transitional housing was five times as likely as someone who received rapid rehousing to become homeless again.

It may seem surprising that a solidly conservative state like Utah has embraced an apparently bleeding-heart approach like giving homeless people homes. But in fact Housing First has become the rule in hundreds of cities around the country, in states both red and blue. And while the Obama Administration has put a lot of weight (and money) behind these efforts, the original impetus for them on a national scale came from the Bush Administration’s homelessness czar Philip Mangano. Indeed, the fight against homelessness has genuine bipartisan support. As Pendleton says, “People are willing to pay for this, because they can look at it and see that there are actually solutions. They can say, ‘Ah, it works.’ ” And it saves money.

The recognition that it makes sense to give money away today in order to save money later isn’t confined to homeless policy. It has animated successful social initiatives around the world. For more than a decade, Mexico has been paying parents to keep their children in school, and studies suggest that the program is remarkably cost-effective, once you take into account the economic benefits of creating a more educated and healthy population. Brazil’s Bolsa Familia is a similar program. The traditional justification for such initiatives has been a humanitarian or egalitarian one. But a cost-benefit analysis suggests that, in many cases, such programs are also economically rational.

Our system has a fundamental bias toward dealing with problems only after they happen, rather than spending up front to prevent their happening in the first place. We spend much more on disaster relief than on disaster preparedness. And we spend enormous sums on treating and curing disease and chronic illness, while underinvesting in primary care and prevention. This is obviously costly in human terms. But it’s expensive in dollar terms, too. The success of Housing First points to a new way of thinking about social programs: what looks like a giveaway may actually be a really wise investment.

    

 

Framing Persuasion

 

Last week’s People's Climate March drew 400,000 people onto the streets of Manhattan and a great deal of international attention to a subject of dire urgency. But some were skeptical about the event’s overall significance. “The march slogan was, ‘to change everything, we need everyone,’ which is telling, because it won’t change everything, because it didn’t include everyone,” wrote David Roberts of Grist. “Specifically, it won’t change American politics because it didn’t include conservatives.” True enough.

If there weren’t such a stark divide between American conservatives and almost everyone else on the question of the existence and importance of climate change — a divide that can approach 40 points on some polling questions — the political situation would be very different. So if any progress on climate change is going to be made through the American political system — apart from executive orders by Democratic presidents — it is going to have to somehow involve convincing a lot of conservatives that yes, climate change is a threat to civilization.

How do you do that? The answer has more to do with psychology than politics.

The practice of tailoring a political message to a particular group is commonplace, of course. But the climate activist community has broadly failed to understand just how differently conservatives and liberals see the world on certain issues, and, as a result, just how radically different messages targeting conservatives should look.

“Although climate scientists update, appropriately, their models after ten years of evidence, climate-science communicators haven’t,” said Dan Kahan, a professor of law and psychology at Yale who studies how people respond to information challenging their beliefs. Luckily, social and political psychologists are on the case. “I think there’s an emerging science of how we should talk about this if we’re going to be effective at getting any sort of movement,” said Robb Willer, a sociologist at Stanford.

It’s worth pointing out, of course, that for many conservatives (and liberals), the current debate about climate change isn’t really about competing piles of evidence or about facts at all — it’s about identity. Climate change has come to serve as shorthand for which side you’re on, and conservatives tend to be deeply averse to what climate crusaders represent (or what they think they represent). “The thing most likely to make it hard to sway somebody is that you’re trying to sway them,” said Kahan.

But in practical, apolitical contexts, many conservatives already recognize and are willing to respond to the realities of climate change. “There’s a climate change people reject,” Kahan explained. “That’s the one they use if they have to be a member of one or another of those groups. But there’s the climate change information they accept that’s just of a piece with all the information and science that gets used in their lives.” A farmer approached by a local USDA official with whom he’s worked before, for example, isn’t going to start complaining about hockey-stick graphs or biased scientists when that official tells him what he needs to do to account for climate-change-induced shifts to local weather patterns.

In a larger context, social scientists have shown in laboratory settings that there are ways to discuss climate change that nudge conservatives toward recognizing the issue. Research is proceeding along a few different tracks. One of them involves moral foundations theory, a hot idea in political psychology that basically argues that people holding different political beliefs arrive at those beliefs because they have different moral values (even if there’s plenty of overlap). Liberals tend to be more moved by the idea of innocent people being harmed than conservatives, for example, while conservatives are more likely to react to notions of disgust (some of the conservative rhetoric over immigration reflects this difference).

In a study they published in Psychological Science in 2013, Willer and a colleague, the Stanford social psychologist Matthew Feinberg, tested the effectiveness of framing environmental issues in a way that takes into account conservatives’ moral foundations. After completing a questionnaire that included items about their political beliefs, respondents were asked to read one of three excerpts. The unfortunate control group “read an apolitical message on the history of neckties.” For the other two groups, though, what followed was an op-ed-like block of text designed to stoke either “care/harm” (innocents suffering) or “purity/sanctity” (disgust) concerns — one excerpt “described the harm and destruction humans are causing to their environment and emphasized how important it is for people to care about and protect the environment,” while the other touched on “how polluted and contaminated the environment has become and how important it is for people to clean and purify the environment.”

Afterwards, respondents were gauged on their pro-environmental attitudes and belief in global warming. In the care/harm group, there was a sizable gap between liberals and conservatives on both measures. In the disgust group, however, there was no statistically significant difference in general environmental attitudes, and the gap on belief in global warming had been cut significantly.

Another promising route that researchers are exploring involves the concept of “system justification.” Put simply, system justification arises from the deep-seated psychological need for humans to feel like the broad systems they are a part of are working correctly. It doesn’t feel good to know you attend a broken school or inhabit a deeply corrupt country — or that your planet’s entire ecology may be on the brink of collapse.

People tend to deal with major threats to their systems in one of two ways: taking a threat so seriously that they seek out ways to neutralize it, or “finding ways to justify away problems in order to maintain the sense of legitimacy and well-being of the system,” explained Irina Feygina, a social psychologist at New York University. This latter route is system justification.

Conservatives don’t have a monopoly on system justification, but there’s strong evidence they do it more than liberals. “There’s a lot of research that just goes out and asks people what their opinions and preferences are, and pretty consistently — I don’t actually know of any examples to the contrary — people who tend to report being further on the conservative end of the spectrum also report having greater confidence in the system and greater motivation to justify it,” said Feygina.

She and two colleagues looked into the connection between system justification and environmental beliefs for a series of studies published in Personality and Social Psychology Bulletin in 2009. They found that, among an undergraduate sample at least, there was a strong correlation between system justification (as measured by reactions to items like “In general, the American political system operates as it should”) and denial of environmental problems.

In a follow-up study designed to test whether this relationship was causal or simply correlational, students read a rather vanilla statement about how researchers have been tracking, with interest, changes to the environment. Some of the students also read two extra sentences: “Being pro-environmental allows us to protect and preserve the American way of life. It is patriotic to conserve the country’s natural resources.” This final bit was designed specifically to “refram[e] pro-environmental change as consistent with system preservation” by emphasizing not a threat to a beleaguered system, but rather an opportunity to help protect an established, robust one.

After reading the passage, students rated their agreement with ten statements about whether and to what extent they planned on engaging in pro-environmental activities, and were asked if they would like to sign various pro-environmental petitions. In the control condition, those who felt a stronger urge to justify the system expressed weaker pro-environmental intentions and signed fewer petitions. In the experimental group, though, the researchers effectively defused the effects of system justification: there was no difference in attitudes or in the number of petitions signed between strong and weak system justifiers.

So how would this translate to a real-world message? “What you need to do is put the system first,” said Feygina. “Instead of saying, ‘Let’s deal with climate change, let’s be pro-environmental, let’s protect the oceans,’ what you need to do is come in and say, ‘If we want to preserve our system, if we want to be patriotic, if we want our children to have the life that we have, then we have to take these actions that allow us to maintain those things that we care about.’” The starting point can’t be about averting catastrophe, in other words — it has to be about pride in the current system and the need to maintain it.

She cited the film Carbon Nation as an example:

There’s strikingly little talk of disaster here. Rather, climate change is viewed as a challenge to a great country, yes, but also an opportunity to profit, to save money, to compete with China. And, crucially, the messengers aren’t environmentalists or easily identified “activists,” but instead are folks who fit into a conservative view of patriotism and hard work (“military, farmers, Midwesterners, people living in rural areas,” as Feygina put it). The environmental imagery isn’t melting ice caps or stranded polar bears — it’s snow-white clouds and sparkling, bubbling streams. And the filmmakers instantly neutralize any sense that this is about group membership by stating that the film is for both believers in and deniers of human-induced global warming. The movie’s tagline alone — “A climate change solutions movie (that doesn’t care if you believe in climate change)” — echoes many of Kahan, Willer, and Feygina’s suggestions.

Still, it’s not as though shifts in framing can undo decades of culture-war battles. Willer was realistic in describing the limitations of grafting language from moral foundations theory and system justification onto climate-change messages. “It’s unlikely that such a short, small framing intervention would have a long, sustained effect — that’s very unlikely,” said Willer. “The idea, we hope, is that application of these techniques in a longer-term, more committed campaign would be effective and would stick.”

Another challenge, though, is that many of the messages that do seem to work for liberals — at least “work” in the sense of helping to build communities, organize marches, and so on — are ones that conservatives will likely find extremely off-putting. Climate activists often stamp their feet, perplexed as to how dire talk of ecologies collapsing and cities getting flooded doesn’t reach conservatives even as it assists in fund-raising and in activating liberals. “Oftentimes people decide on how they’re going to build their [message] based on intuition — they say ‘Oh, this is how humans work,’” said Feygina.

But that intuition is often flawed. If climate activists are serious about doing anything other than preaching to the choir, they’re going to have to understand that messages that feel righteous and work on liberals may not have universal appeal. To a liberal, the system isn’t working and innocent people will suffer as a result — these are blazingly obvious points. But conservatives have blazingly obvious points of their own: The system works and we need to protect it, and it’s important not to let pure things be defiled.

Climate activists, said Feygina, are often “not able to step outside that and ask questions about how we process information, and what are the barriers at hand.” And that, she said, “completely misses the target.”

    

 

How To Win An Argument

 

You’ve probably gotten in a political argument in the recent past, whether with your nutso cousin at Thanksgiving or your militantly ignorant co-worker at a happy hour.

And you’ll probably get in another political argument sometime in the near future. Hard as it may be to believe, you can actually win these arguments. Here’s how.

1. Forget facts.

Psychologists who study political belief and persuasion think it’s adorable how obsessed argumentative people are with those cute little things called facts. When it comes to winning arguments, truthfulness and details simply don’t matter as much as we think they do.

“People think emotionally, and they very often will have these gut moral intuitions that certain things are right or wrong,” said Matthew Feinberg, a psychologist at Stanford. The process of belief formation runs in the opposite direction from the one we’d hope: people “come to the conclusion first, and then the reasons they kind of pull out just to support their beliefs.”

This runs counter to a lot of what we learn when we’re writing term papers in school or reading our favorite authors, of course — in these contexts, logical precision is key. But when you’re engaging in a live argument with someone who views the world very differently than you do, it’s important not to get too hung up on factual accuracy.

So how do you capitalize on this knowledge?

2. Let your opponent hang him or herself.

It may not seem right judging from cable news, but when people are asked to explain their beliefs about how a given thing works, they’ll actually become less confident in those beliefs.

This phenomenon is known as the “illusion of explanatory depth.” If you ask the average person to explain why they hold a given opinion, “They will come to realize the limitations of their own understanding,” said Frank C. Keil, a Yale University psychologist who studies intuitive beliefs and explanatory understanding. Keil cautions that this won’t necessarily lead to a change in point of view, but said that if you ask them gently and non-aggressively to walk you through their point of view, they’ll likely see the holes more.

3. Don’t be such a dick.

This one can be tough to remember, but even in a heated debate with a distant cousin whose political beliefs would make a Neanderthal blush, there’s a tactical upside to being nice.

“When people have their self-worth validated in some way, they tend to be more receptive to information that challenges their beliefs,” said Peter Ditto, a psychology professor at UC-Irvine who studies emotion and its connection to political and religious beliefs. This is partly because our mood determines a lot about how receptive we are to new information or ideas: If we’re happy and confident and at ease, we’re more likely to be open-minded.

The problem is that political arguments, by their nature, tend to make their participants angry and frustrated. So it’s easy to be self-defeating here: When you’re arguing with Uncle Bob, who fervently believes certain things about 9/11, you may think you’re making a cutting, incisive point that reveals his stupidity for the world to see. But if you’re antagonizing him or openly implying that he’s nuts, he’s only going to feel backed into a corner — and that, the research suggests, will harden his beliefs further. (This effect is likely only amplified in big group settings, which bring greater opportunities for “point-scoring” and embarrassment, meaning smaller gatherings are more conducive to substantive debate and persuasion.) Instead, as you’ll read below, there are ways to possibly nudge him toward reason without threatening his entire worldview.

4. Defuse disgust.

Not every hot-button political issue can be traced back to disgust, but many of them can. A long line of research has shown just how intimately connected our politics and our sense of disgust are (many researchers think this is an outgrowth of the cognitive systems that provided us with a visceral aversion to dangerous substances like human and animal waste back before the days of sewer systems and litter boxes). And if you look around, you’ll see political arguments couched in disgust everywhere, such as in the memorable recent case of the South Dakota state rep who likened gay sex to “a one-way alley meant only for the garbage truck to go down.”

So what should you do if you find yourself locked in debate with someone who is grossed out, and you suspect his disgust, rather than a more substantive argument, is fueling his belief? One paper by Feinberg and some colleagues suggests simply asking your adversary not to be disgusted could be a surprisingly successful strategy. The researchers had participants watch a video of two men kissing. Some were instructed to simply watch, while others were asked to “try to think about what you are seeing in such a way that you don’t feel anything at all.” The latter condition was designed to short-circuit feelings of disgust, and political conservatives in that group “subsequently expressed more support for same-sex marriage than conservatives in the control condition,” as the study’s abstract put it.

5. Change the frame.

Here’s where you can earn your black belt in political argument. One of the most prominent current theories through which psychologists explain differences in political beliefs is called Moral Foundations Theory, or MFT. MFT posits that there are five foundations to moral beliefs: care/harm (whether other beings are being hurt); fairness/cheating (whether people are treating others fairly); loyalty/betrayal (whether people are exhibiting loyalty to their group); authority/subversion (whether people are playing by the rules); and sanctity/degradation (whether people are sullying physical or spiritual things that are sacred). According to the theory, liberals and conservatives view these concerns differently. For liberals, care/harm and fairness/cheating are the most important of the five, while conservatives are more into loyalty/betrayal, authority/subversion, and sanctity/degradation.

This is pretty powerful knowledge, because it can help you know your opponent’s “weak points,” in a sense — which aspects of morality will resonate for them, and which won’t.

During a debate, you’re more likely to make progress “if you can appeal to the moral concerns of the people that you’re talking with,” said Jesse Graham, a USC professor who helped develop MFT. All too often, though, “there are ways in which liberals and conservatives can talk past one another in these debates.”

The idea that changing the moral framing can help convince people to rethink their views has been borne out in some as-yet-unpublished work by Feinberg and his collaborator Robb Willer, also at Stanford, in which they got conservatives to say they approved of gay marriage at a higher rate by describing gay Americans as proud, patriotic Americans with the same hopes and dreams as everyone else (invoking the loyalty/betrayal foundation), and liberals to support expanded military spending by arguing that doing so would provide valuable career opportunities to low-income young people (invoking the fairness/cheating foundation). And in another study that has been published, they “largely eliminated the difference between liberals’ and conservatives’ environmental attitudes,” as they put it in the abstract, by describing environmental degradation as a threat to the planet’s purity (invoking the sanctity/degradation foundation).

Pulling It All Together

Let’s put this into (hypothetical) practice. Say you’re arguing with an uncle who insists that the Boy Scouts should continue their policy of excluding openly gay people from being scout leaders. “For thousands of years, society has been built on one man, one woman,” he insists. “It just seems like a dangerous and unnatural social experiment to start having role models teaching kids that it’s okay to be gay.”

Here’s how to respond, and how not to:

Wrong response: No, it hasn’t! The concept of heterosexual, one-man, one-woman marriage is actually really new. Haven’t you read the Bible? Dudes had tons of wives back then! It’s like conservatives just conveniently ignore all this history when they’re trying to fight gay rights.

Why it’s wrong: Too nerdy and fact-y and confrontational. Remember that his opinion on this is probably coming from deep gut feelings rather than because he has expertly sifted the history and data.

Better: I think you’re definitely right that there’s a long-standing, wonderful tradition of one-man, one-woman relations. I totally respect how much you care about that institution — I do, too! I think my main reason for supporting allowing gay people to be scout leaders is that I have some gay friends who were Boy Scouts growing up, and who seriously treasure the lessons they learned during that time. They have the same ideals as you and I do, love our country for the same reasons, and even root for the same sports teams. They just want to give back to an organization that helped shape who they are, that taught them all sorts of invaluable life skills.

Why it’s better, and which of the points of advice you’re following: You’re not engaging with his questionable historicizing (forget facts), you’re being calm and respectful (don’t be such a dick), you’re sidestepping questions of disgust or what is or isn’t natural (defuse disgust — well, halfway, since you’re simply ignoring it), and you’re invoking the loyalty/betrayal framework by implying that Americans are all the same (unleashing some Moral Foundations Theory). That’s four of the five tools in just a few sentences — a dazzling Bruce Lee–esque combo of rhetorical mastery.

Suffice it to say that in the real world, any argument about a hot-button issue is unlikely to end with one party reaching a hand out to the other and saying, “You know what? You’re totally right. My bad!” But still: There’s a right way to argue, and a wrong way. And too many of us, having spent countless hours watching jerks on TV scream at each other, have developed bad argumentative habits.

    

 

UKip and the Tea Party

 

I happened to be in Washington when the result of the Clacton by-election was announced. This was not the most convenient place to follow the goings-on in a rundown British seaside town. The global financial elite was in the city for the annual IMF–World Bank meeting and the talk was all about what could be done to breathe life into the global economy. At dinner I had to check my phone surreptitiously for news about Clacton as the masters of the universe argued about quantitative easing and structural reform.

It was, however, arguably a good place to digest the significance of the election. The United States is in the habit of getting to the future first. And that is certainly the case with the rise of the populist right. The best way to understand both the causes and the consequences of what happened in Clacton on Thursday is to study the causes and consequences of the rise of the Tea Party. It is a worrying story and one the financial elites ought to take more seriously than they do.

Douglas Carswell would fit right into the American Tea Party (there are groups in the backwoods of Virginia who follow his doings carefully). He is animated by the two principles that animate American populists: a righteous hatred of the Establishment and a naive faith in the people.

The one time I met Carswell in his cramped office in Westminster, he was on fine form about the evils of the British ruling class and its addiction to “sofa government”. He was far less convincing when it came to the wisdom of British voters. He refused to contemplate the possibility that many of our problems were the result of giving people what they wanted — more entitlements and lower taxes — rather than the machinations of the boys and girls on the sofa.

Mr Carswell has the same half-admirable and half-worrying taste for ideas as the Tea Party. He has the same heroes — Friedrich Hayek and Milton Friedman — and the same quirky obsessions with the evils of central banks and the possibilities of e-democracy. He has the same very unconservative willingness to rip up ancient institutions (or parties) and start all over again. He also has the same inner toughness.

The Republican establishment was blindsided by the Tea Party because it underestimated the toughness of its leaders and the fury of its members. Tory high command similarly mutters about “swivel-eyed loons” on the one hand, yet insists on the other that the very same loons will come home in the general election.

The Tea Party is the home of America’s angry and left-behind: a growing constituency as the lion’s share of the benefits of economic growth go to the top 10%, and particularly the top 1%, and the incomes of ordinary people stagnate. These are people who are getting the sharp end of economic change: their jobs are being shipped off to Mexico or China or handed over to machines.

They are also people who are getting the sharp end of social change. They see the America they love being turned into a very different country from the one they grew up in (often a long time ago: Tea Party people are a fairly elderly bunch). There are 12m illegal immigrants. Gays can get married. A black man is president. The fact that the liberal elite treats their worries with uncomprehending condescension only makes them angrier.

Their complaints carry a good deal of wisdom: Washington politics is badly broken. The parties are vehicles for careerists — not just the people who win the seats but also the people who advise them on the dark arts of raising money and crafting attack advertisements. They are also deeply intertwined with bloated corporations that want to suck at the public teat. But the wisdom is intermingled with a good deal of nonsense. Not only do they want to recreate a country that can never be recreated. They are also blind to the extent to which they are part of the problem: the pensioners who turn out in such numbers to protest against big government are determined to keep their bit of big government in the form of pensions and healthcare.

You can see all this being repeated — sometimes word for word — in Ukip. Clacton is a poster town for the economically left-behind. Richer Ukippers (of whom there are many in my home county of Hampshire) feel just as left behind by gay marriage and immigration. Nigel Farage talks about “taking our country back”. Carswell talks about going to war with the Westminster machine.

Ukip is only part of a much broader populist movement: one Tea Party among many across the country and across Europe. The Scottish National party is a protest against the overcentralisation in London and the globalisation that limits people’s ability to control their lives. Populist parties, some of them — such as Hungary’s Jobbik and Greece’s Golden Dawn — distinctly sinister, are springing up across Europe, providing a voice for what Dominique Reynié, a French academic, calls the “existentially destabilised”. Marine Le Pen, leader of the French National Front, is polling at 25%. Populist parties such as the Sweden Democrats and the Danish People’s party are flourishing in Europe’s supposedly sensible north.

The parallels between Ukip and the Tea Party are not perfect: the Tea Party is a faction within the Republicans, while Ukip is a distinct party. But they are close enough to be worrying to anybody who cares about the future of British politics. The impact of the Tea Party in America has been overwhelmingly negative. For all its reasonable anger at America’s dysfunctional political system, it has made that system worse.

The Tea Party has strengthened the Democrats by dragging the Republicans to the right and leaving the middle to be occupied by their rivals. Primary contests between establishment Republicans and Tea Party figures have wasted the party’s energy and frequently lumbered the Republicans with eccentric candidates.

Mitt Romney might easily have won the 2012 presidential election for the Republicans if he had not felt the need to pander to the Tea Party purists: he may have been an awkward candidate, but Barack Obama was a weak and unpopular president.

The Tea Party has also made it almost impossible for politicians to address America’s problems, with its high-octane rhetoric about sending immigrants back south of the border and antics such as Senator Ted Cruz’s attempt to shut down the government.

Carswell’s victory in Clacton on Thursday makes it much more likely that all this will be repeated in Britain. The Conservative party has responded to the rise of Ukip by moving further to the right on the issues that worry the left-behind, such as Europe and immigration. This could easily alienate the middle ground without taming the populist furies: in Sweden a highly successful (and Cameron-friendly) conservative government recently lost an election because the Swedish equivalent of the Tea Party won 13% of the vote. The more the Tories move to the right, the less they will be able to act as a champion for reform at home and freer markets in the European Union.

The lesson of the rise of the Tea Party is clear: for all the superficial appeal of its anti-Washington rhetoric, it makes dysfunctional politics more dysfunctional and sensible reform all but impossible. The British need to study the US example before it is too late.

    

 

UK Immigration

 

Those demanding curbs on foreigners entering the UK may suffer from irrational fears they dare not admit to themselves.

Here’s a test: an imaginary exchange between a prime minister and a voter. Tell me the point (if such a moment comes) when it doesn’t ring true for you.

PM: Now, sir: what would you say is your biggest concern for our country?

VOTER: Immigration from Eastern Europe.

PM: How do you think it harms us?

VOTER: Parts of Britain are being swamped. In some places our schools and social services just can’t cope. The indigenous population are being elbowed aside for housing, hospital treatment and things like that.

PM: Yes, I know this bothers people. But we could fix it. We could earmark whatever government money was needed, so that wherever immigrants were placing a big strain on public services, funds would be allocated for the authorities to cope. If you believed we’d do this, would it solve the problem for you?

VOTER: Oh yes; that would be fine.

Most readers will find this exchange imaginable right up until the last sentence. It’s the “that would be fine” that doesn’t ring true. You just know that this voter would not be satisfied with this proposed solution. Which is queer. Because it would undeniably solve what this voter has just said is his big problem with immigration.

Somebody here is not telling the truth — and it isn’t the politician.

In our own lives, though, we have no trouble understanding such apparent illogic. When we come up with a solution to an objection raised by friends or workmates and their eyes glaze over or they just raise a different objection, we can explain it. They’re not actually interested in a solution.

We realise that something else is bothering them: something they don’t want to acknowledge, perhaps even to themselves. So we’d be wasting time trying to deal with the stated problem: the real problem may lie elsewhere, maybe a long way away from where the complainant has chosen to focus his complaint. The phenomenon is not far from the Freudian concept of “transference”.

Scratch the surface and we see that beneath apparent practical objections to immigration lie disappointments and insecurities that feed into, if not outright racism, an irrational resentment of the alien, of the Other.

Why can’t we, and why can’t the Conservative party, understand that this goes a long way to explaining opinion polls and headlines about “popular fury” over “immigration and Europe”? Why haven’t our mainstream politicians the brains or moral courage to push back against the lies and the nonsense?

And why can’t they see that wittering nervously about “game-changing” reforms they hope to conjure from God-knows-where, to “deal with” voters’ concerns about immigration and Europe, impresses nobody: it makes them look rattled and on-the-run. It validates the outriders in our politics: outriders like Ukip who will never be called upon to put their ideas into practice and so can always outbid the mainstream parties with yet more outlandish claims.

Any fool in my part of England knows that if you’re crossing a field and a herd of bullocks starts mobbing you, the worst thing you can do is run. Spin round and face them, wave a stick, clap hands and shout “boo!” They’ll back off. As with bullocks, so with populists. For the best description of Ukip’s claptrap about immigration and Europe, make a simple substitution to the first vowel in “bullocks”. Are the Tories too scared to say this word?

How many Romanians are there in Clacton? I challenge the assumption that very large numbers of people in Britain are seriously affected in their everyday lives by the presence of East European immigrants. It just isn’t true. Where have you seen European immigrants spoiling British people’s lives? The overwhelming majority are Poles. Where is this conjectured anger and resentment towards Poles? Most people consider them a decent, hard-working and useful bunch. Get people off the subject of “immigration generally” and ask them how seriously they feel threatened on a personal basis by Poles, and the anger dies.

Down below this column — if you read me online — there’s a dark and rather scary world we call Readers’ Posts. I go there often to do battle with the Ukip and ConHome astroturfers — the rabble who migrate between the online comment sections of papers like ours, the Financial Times, The Daily Telegraph and The Guardian (places you often sense are not their natural pastures) giving the impression of a huge, angry, grassroots surge of support for Ukip.

For the pleasure of imagining the expressions of complete uninterest spreading across their angry faces as they encounter real facts, I offer a few . . .

According to the Office for National Statistics, net immigration from the eight East European countries that joined the EU in 2004 peaked around 2007 and has declined since. The ONS’s net immigration estimate for 2013 is little over half what it was in 2007. Poland dominates. Romania and Bulgaria do not (so far) feature strongly. Immigration from the Indian subcontinent is as significant as that from Europe, but much of it is through marriage, and Ukip doesn’t propose to end the right of British nationals to marry foreigners. Public attitudes to immigration have been hostile since 1964: no more so now than then; in fact hostility has dropped from more than 80 per cent to about 60 per cent.

But what has certainly spiked is the importance people attach to the issue. Which brings me back to where we started. I believe there’s something evanescent about this “popular fury”. I’m uncertain of the cause. It may be “transferred” distress among some about the way their life is going generally: a feeling that everything has gone to the dogs. Or it may be economic anxiety and being (as many now are) financially pinched. Or it’s possible that it’s just a fashion. Patrick Kidd in his Times sketch of the by-election count in Clacton described the mood as “Biebermania for pensioners”. We don’t seek explanations for Justin Bieber; we just trust that in time he’ll go away.

I expect that “immigration and Europe” will go away too, in time. Or will do unless the Tories so panic that Britain gets stuck beyond recall on the path to a “no” in a European referendum. The way they’re behaving now, that could very well happen.

    

 

Impact of Cheaper Oil

 

The falling cost of energy, helped by fracking, should be celebrated. It can deliver a circle of prosperity and innovation.

So ingrained is the bad-news bias of the intelligentsia that the plummeting price of oil has mostly been discussed in terms of its negative effect on the budgets of oil producers, both countries and companies. We are allowed to rejoice only to the extent that we think it is a good thing that the Venezuelan, Russian and Iranian regimes are most at risk, which they are. Yet by far the greater benefit of the oil price fall comes from the impact on consumers. Making this essential resource cheaper allows everybody, whatever their nationality, to spend less money on dull things like heat, transport, metal and plastic, which leaves them more money for things like movies, holidays and pets, which gives other people new jobs, which raises everybody’s living standards.

The oil price peaked at almost $150 a barrel in 2008, just before the financial crisis. That is probably no coincidence. Although the crisis was fuelled by a credit bubble, rocketing oil prices helped trigger the bust. All over the world, but especially in America, people were saddling themselves with longer and longer commutes to find houses they could almost afford, a phenomenon known among American mortgage brokers as “drive till you qualify”. The doubling of fuel prices in the US between 2005 and 2008 killed that strategy and began the collapse of the housing market.

The price of Brent crude oil has fallen from about $115 a barrel in June to about $85 today. That will make a tank of petrol cheaper (though not by as much as it should, because of taxes) but it will also make everything from chairs to chips to chiropody cheaper too, because the cost of energy is incorporated into the cost of every good and service we buy. The impact of this cost deflation will dwarf any effect of, say, a fall in the price of BP shares in your pension plan. It is true that part of the reason oil prices are falling is that world economic growth is slowing. But economists reckon that every 10 dollars off the price of a barrel of crude oil transfers 0.5 per cent of world GDP from countries that export oil to countries that import it — and the latter tend to spend the money more quickly, accelerating the velocity of money and encouraging investment and innovation.

The industrial revolution itself was built around abundant cheap energy, mainly in the form of coal, which enabled mechanisation, which vastly amplified the productivity of the average worker and therefore his income. Today a typical British family of four uses as much energy as if it had 400 slaves in the back room pedalling eight-hour shifts on exercise bicycles. It would use even more if it also fed those slaves!

The falling oil price is largely the Americans’ fault. By reinventing the extraction process for first gas, then oil, with horizontal drilling and hydraulic fracturing, engineers have almost doubled the country’s output of oil in six years. That ingenuity was made possible by the high price of oil, which promised fabulous riches to those who could get oil out of shale, but it is no longer dependent on the high price of oil. It is often said that the cure for high oil prices is high oil prices and so it has proved.

The International Energy Agency (IEA) says that most shale oil production remains profitable at $80 a barrel. One North Dakota oilman tells me that his rate of return on fracked wells drops to 10 per cent only when oil prices reach $55, and that’s without taking into account the falling price of fuel for his vehicles. At the moment American oil production is rising by 100,000 barrels a day every month, and that is not going to change for a year or so whatever happens to the price of oil. The number of rigs drilling for oil will start to drop off if the oil price falls further, but only slowly. Of course, unlike the nationalised decisions of Opec, oil production in America is the sum of the investment decisions of hundreds of independent companies.

The IEA reckons that only about 3 per cent of world production needs a break-even price above $80. Much of that is in China, Indonesia, Malaysia, Nigeria and Russia, where costs are high largely because of big government taxes and royalties. It is governments, in other words, that are most likely to take the spring out of the consumer’s step, with Opec’s impending decision on whether to constrain production being the best hope of the spoilsports.

Talking of spoilsports, here in Britain the opposition has managed to deprive people of the benefits of lower energy prices. Ed Miliband’s promise that he would freeze the price of energy from 2015 came just before world energy prices began to fall smartly. The energy utilities, reluctant to have their prices frozen when they are low, have therefore done their utmost to avoid dropping their prices in case Mr Miliband becomes prime minister.

As Paul Massara, head of npower, told the regulator Ofgem in August: “We are acutely aware that if the Labour party were to implement their proposed price freeze, we will be living with the consequences of our standard rate tariff price for a very long time and beyond the level of risk that we could manage in the wholesale market.”

Not that the Liberal Democrat part of the coalition government has covered itself in glory on energy costs either. In 2010 Chris Huhne’s Department of Energy and Climate Change assumed that gas prices would double by 2020, leading renewable energy subsidies to wither away. Instead gas prices have fallen. In a normal market that would mean lower electricity prices, but in DECC’s wonderland, we have to pay the difference between gas prices and the costs of wind. In other words, this country’s official energy policy insulates consumers and manufacturers to some extent from the benefits of falling costs. That way lies uncompetitiveness.

But is cheap fossil fuel not bad news for the climate? A new paper in Nature magazine argues that when the gas boom sparked by fracking goes global, prices will fall fast, economic growth will accelerate and so we will end up using more energy and producing more emissions than before, even if we give up coal. It forgets to mention that if we get that much richer, we will also abolish much more poverty, disease and misery, and have the investment funds to invent new, cheap and low-carbon forms of energy too.

    

 

Redlining

 

We're hiring, but not people like you.

I'm looking for a doctor, but of course, not someone like you.

We're putting together a study group, but we won't be able to include people like you.

Redlining is an efficient short-term selection strategy. At least that's what we tell ourselves. So the bank won't loan to people in that neighborhood or people with this cultural background, because, hey, we can't loan to everyone and it's easier to just draw a red line around the places not worth our time...

The challenge with redlining, beyond the fact that it's morally repugnant, is that it doesn't work. There's a difference between "people like you" and "you." You, the human being, the person with a track record, a great attitude and a skill set, deserve consideration for those things, for your psychographics, not your demographics.

When there's not so much data, we often resort to crude measures of where you live or what you look like or what your name is to decide how to judge. But the same transparency that the net is giving to marketers of all sorts means that the banks and the universities and the hiring managers ought to be able to get beyond the "like you" bias and head straight for "you."

Because 'you' is undervalued and undernoticed.

When we say, "I don't work with people like you, I won't consider supporting someone like you, I can't invest in someone like you," we've just eliminated value, wasted an opportunity and stripped away not just someone else's dignity, but our own.

What have you done? What do you know? Where are you going? Those are a great place to start, to choose people because of what they've chosen, not where they started. Not because this will always tell us what someone is capable of (too many people don't have the head start they deserve) but because it is demonstrably more useful than the crude, expensive, fear-based shortcuts we're using far too often.

In a society where it's easier than ever to see "you," we can't help but benefit when we become anti-racist, pro-feminist, in favor of equal opportunity and focused (even obsessed) on maximizing the opportunity everyone gets, early and often.

    

 

Height discrimination in China

 

WHEN two security guards in Dalian in north-east China got their first month’s pay packet earlier this year, they questioned why each received different amounts for identical work. The company responded that one man was 5cm (two inches) taller than his peer. Workers over 180cm earn more, they said, because bigger guards make people feel safer.

Stature is often a desirable attribute of guards, but in China height requirements are routinely specified for jobs which seem to have no need of them. To study tourism and hotel management at Huaqiao University in Fujian province, men topping 170cm are favoured, as are women over 158cm. A post as a female cleaner in Beijing is advertised to women of at least 162cm. Many companies are less explicit about such demands than they used to be, but candidates often list height (and weight) on their curricula vitae.

The height premium is most pronounced for women, according to a study from Huazhong University of Science and Technology. It found that each centimetre above the mean adds 1.5-2.2% to a woman’s salary, particularly among middle- and high-wage earners. A group at China University of Political Science and Law is working on a draft law against employment discrimination for height and other physical characteristics.

Ever more Chinese are rising above such constraints, however. A 45-year-old man in China today is around 5cm taller than his counterpart of 30 years ago, according to the RAND Corporation, a think-tank. Soldiers are growing too tall for the diminutive tanks favoured by the People’s Liberation Army; in 2010 the government raised by 10cm the height under which children in China travel free on trains (a rare scheme that benefits the small).

Greater heights mostly reflect greater incomes. Richer people tend to eat more and live in cleaner, better homes. Meat consumption per person has increased more than fourfold since 1980. Infant mortality is less than a tenth of what it was 60 years ago. Household size has also helped. Historically people from big families have been shorter (not just in China) because food supplies must stretch further. In China the birth rate fell sharply from the 1970s nationwide.

But there are differences across the country which partly reflect the uneven benefits of the economic boom. Eighteen-year-olds from the richest cities are on average 7-8cm taller than those from the poorest ones. The height gap between prosperous and impoverished rural areas is similar. Southerners have long been shorter than northerners. Although the difference between rural and urban heights has narrowed since 1975, other discrepancies persist. The World Health Organisation says around 20% of children in poor rural areas are “stunted”, a common indicator of chronic malnutrition. This compares with 2.5% of city children. Employers’ preference for high and mighty staff exacerbates that inequality. It is time they grew up.

    

 

Solution Aversion

 

As Science of Us has previously noted, psychological researchers are very interested in the question of why some people don’t believe in human-caused climate change despite the overwhelming evidence for it. That’s partly because it’s of obvious importance on the saving-the-world front, and partly because it’s a useful issue for examining all sorts of biases and psychological tendencies that help explain broader divides when it comes to political opinions. A study on this subject by Troy Campbell and Aaron Kay of Duke University adds an intriguing new idea to the mix: “solution aversion.”

The basic idea here is that people are less likely to believe that something’s a problem if they have “an aversion to the solutions associated with the problem,” as the authors put it. Strictly speaking, this doesn’t make sense — when determining whether or not to believe in a problem, all that should matter is evidence for the existence of that problem. (Believing that it will be expensive to fix that leak in your roof shouldn’t make you any less likely to believe in the leak.)

But it fits into a broader framework of what psychologists call “motivated reasoning” — the human tendency to form beliefs not based on a strictly “objective” reading of the facts, but in a way that offers some degree of psychological protection. “We think politicians have disagreements about the solutions because they have disagreements over the facts,” said Campbell in an email, “when in actuality it’s often the other way around: Partisans have disagreements over the facts because they disagree with solutions.”

So how does this apply to conservatives and climate change? Many conservatives believe in free markets and limited government. Generally speaking, the most well-known potential solutions to the problems posed by climate change involve increased regulation. So Campbell and Kay posited a link between the two: if they could manipulate how the online skeptics in their study viewed the likely solutions to climate change, maybe those respondents would be more likely to trust the science.

Sure enough, that’s what happened: overall, self-identifying Republicans in the study were a lot less likely to say they thought humans were causing climate change, but when the problem was paired with a “free market friendly solution” rather than a “government regulation solution,” a significant gap opened up: on a scale of 0 to 8, Republicans in the free-market solution group ranked the likelihood of humans causing climate change at 5.68, while those who saw the regulation solution put the number at just 3.67 (by way of comparison, the average score for Democrats was 6.7).

The researchers saw the same effect when they asked respondents to what extent they agreed with mainstream climate-change science projections about the warming Earth: Republicans agreed with the science a lot less, but researchers significantly closed the gap by pairing the problem with a free-market solution. (Among self-identifying Democrats, there was basically no difference between the two framings — almost everyone thought climate change was caused by humans, and almost everyone agreed with the science.)

And make no mistake: This isn’t just a conservative thing. In another study, the researchers tested the respondents on an issue that tends to resonate more with conservatives than liberals: the danger of home invasion by a criminal. In this case, they hypothesized that, just as conservatives would be less likely to believe in climate change if they were presented with a solution involving government regulation, liberals would be less likely to believe that home invasions are a major problem if they were presented with a solution involving less gun control (so homeowners can protect themselves). That’s exactly what they found.

Now, there’s obviously a big difference between manipulating people’s self-reported beliefs in a lab and convincing them in the infinitely louder, noisier context of the real world, but this is still a promising new avenue for explaining why people come to such different conclusions about which risks to heed and which to ignore — and why debates over some of these issues lead to such grinding, interminable gridlock. “Politicians have already decided what they want the solutions to be,” said Campbell in an email, “and end up believing whatever facts will support those.” To a certain extent, the same can be said of voters.

    

 

Self Interest and Leadership Coups

 

Gordon Tullock was a political obsessive. He had a view on everything. He used to write letters to The New York Review of Books picking it up on minor factual errors about state balloting in elections long past. He was only ever a few seconds away from an argument.

Yet he didn’t vote. He couldn’t see the point. You see, he looked at everything in terms of incentives. The probability that his vote would change the outcome was incredibly small. The difference it would make to him personally if the outcome was changed wasn’t much larger. Multiply those things together and why leave the house? You might get run over.

Last week, at the age of 92, the grand old man of public choice economics died (ironically on midterm election day, which led his students to joke that in famously corrupt Illinois he probably voted for the first time). And as I read accounts of his work, I was struck yet again by its explanatory power.

It set me wondering. Might it be possible to use this work, Tullock’s public choice economics, to develop a general theory of political coups and use it to work out what will happen to Ed Miliband?

At its heart, public choice economics is very simple indeed. It is the assertion that personal incentives are central to the choices of public officials. Gordon Tullock and his collaborator James Buchanan were irritated by the way that politicians and government workers were thought to be trying to advance the public interest while private companies were advancing only their own interests.

So in the early 1960s they began developing the argument that public officials were no different to anyone else. Just like anyone else, they were interested in maximising their personal gain. People didn’t cease to be profit maximisers just because they were taking part in political affairs.

This didn’t mean they were only interested in money. Politicians, for instance, are vote maximisers, motivated by the gaining of votes. It does, however, mean that the claim of pressure groups to be acting for the wider community should be taken with a pinch of salt. They are looking after their own interests.

That this now seems fairly obvious doesn’t mean that it wasn’t important. Buchanan won the Nobel prize for work that challenged ideas about public spiritedness that were common in the 1950s. (That Tullock did not is put down to his famous waspishness. There are entire blogs devoted to the insults of Gordon Tullock.)

Personal incentives are too crude a way of understanding what makes people tick and too simple to entirely explain their behaviour. Yet what public choice theory lacks in sophistication, it makes up for in simplicity. There is a great deal that can be understood about political behaviour simply by understanding the personal incentives of the main actors.

Let’s apply this insight to the problems of Ed Miliband and the question of coups against political leaders. Getting rid of an unpopular or incapable leader of a political party is an act that may benefit everyone in it. It is a public service, with the gains spread across every supporter of the cause.

What about the costs? The costs are concentrated. Some people have to get the coup started. They need, for instance, to resign their seats on the front bench or anger their colleagues and the party leader by denouncing him. They have to do this without being sure they will be successful. The leader may stay in office, able to punish the rebels for their disloyalty.

Yet even if they do succeed, and the leader falls, those who initiate the coup can’t be certain they will benefit from it. They may be resented by colleagues or distrusted. Certainly they can’t be sure they will get any more benefit than anyone else.

In economics it is well known that there is a market failure, or shortage, in scientific research. The cost of the research is borne by the small number who conduct and pay for it, while the benefits go to everyone. As a result, not enough research is done.

In exactly the same way, there is a market failure in political coups. Those who pay the cost of a coup (the people who initiate it) don’t get enough of the benefit (which is spread). So coups are under-supplied.
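
The same structure can be put in expected-payoff terms. A minimal sketch, with invented numbers, of why the plotter’s sums rarely add up:

```python
# Sketch of the collective-action problem in coups described above.
# All numbers are invented for illustration.

party_size = 250            # members who share the benefit of a better leader
benefit_per_member = 1.0    # assumed gain to each member if the coup succeeds
cost_to_initiator = 10.0    # assumed personal cost of starting the coup
chance_of_success = 0.5     # assumed

collective_gain = party_size * benefit_per_member
initiator_payoff = chance_of_success * benefit_per_member - cost_to_initiator

print(f"gain to the party as a whole: {collective_gain}")   # 250.0
print(f"payoff to the plotter: {initiator_payoff}")         # -9.5, so no coup
```

Collectively the coup is worth 250 units; to the individual who must start it, it is worth minus 9.5. Hence the under-supply.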

Let’s use this model to look at some modern coups that happened or failed to happen. It was obvious that Gordon Brown was going to lead Labour to defeat in 2010. The entire party would benefit from his departure and everyone knew it. Yet the coup failed to happen. This was entirely predictable.

Mr Brown and his allies had the ability to exact revenge, not just immediately but for some time to come. So rebelling against him was potentially very costly for anyone who tried to get rid of him and failed. Anyone who moved against him and succeeded could not be certain to benefit from it personally.

As a result, the only rebels against Mr Brown were those who didn’t care what happened to them and, predictably, this wasn’t enough. A coup was needed and was in everyone’s interest collectively, but in no one’s interest individually, so it didn’t happen.

Contrast this with the coup against Tony Blair. It wasn’t in the party’s interest but it was in the interest of many individuals. The gain to Labour was non-existent but rebels were guaranteed a reward and protection by Gordon Brown. So the coup happened.

Or look at what happened to Iain Duncan Smith when he was leader of the Conservatives. People could rebel against him anonymously by writing a letter requesting a vote of no confidence. The benefit to the party turned out to be small, but the cost to individuals was even smaller.

So what does this suggest for Ed Miliband? It is obviously in the party’s collective interest to get rid of him but it is highly unlikely that it will. Almost anybody else would improve the party’s fortunes and it would be well worth a little temporary disunity to make a change. Yet the cost to any individual of making a move against Ed Miliband is, potentially, a big one.

If Miliband becomes prime minister, which he still might, a rebel shadow minister would be left out of his government. If he fails to become prime minister, a rebel might still not be forgiven by the party.

Which leaves any rebelling to be done by people who don’t care about their future. Which won’t be enough.

    

 

Govts and New Technology

 

IF THE WORLD is thought to be suffering from both too little innovation and too much at the same time, it may be reasonable to think that the future will look a lot like the past. That strategy offers some room for optimism. Thanks to technological change and the resulting economic growth, many countries are now vastly richer than they were 300 years ago, and rich in ways pre-industrial societies could not have conceived of. Fears of mass unemployment raised by earlier technological leaps never came to pass. Instead, technology allowed people to live longer, fuller lives. Quite possibly this time will be no different. Humans’ greatest advantage over machines has always been their flexibility, which should help them adapt to the new world around them. A generation from now people everywhere will almost certainly be richer and live longer, and most of those looking for work will probably still be able to find it.

Yet history also offers plenty of reason to worry. Humans may be flexible, but their governments typically are not: they act only when forced to do so. In past economic revolutions it took a shift in the balance of political power, sometimes achieved only after violent conflict, to ensure that the gains from growth were broadly shared. The necessary investment in education and infrastructure and the provision of a social safety net proceeded in fits and starts and did not always go right.

Over the past few decades technology has hollowed out workforces, leaving too many people competing for jobs that require minimal skills and offer minimal pay. Rising inequality and stagnant wages are eating away at the legitimacy of existing tax and redistribution systems. Governments’ responses so far have ranged from the uninspiring to the negligent.

Broadly speaking, there are three ways of dealing with the labour imbalance: raising the productivity of less-skilled workers; turning less-skilled workers into more-skilled workers; and providing income support for those who find it hard to earn a living in this new world.

Raising the productivity of less-skilled workers may not be as hard as it sounds, but it requires governments to get their economic policies right. Often they simply need to get out of the way. A prime example is occupational licensing. Between the 1950s and 2008 the share of employment in America covered by occupational regulation rose from roughly 5% to nearly a third. It now includes not only professions like nursing and teaching but jobs in interior design and even in nail salons. Excessive regulation reduces mobility and makes it harder for workers to change careers or earn extra income. In Europe non-transferability of professional qualifications restricts migration. In parts of the emerging world jobs involve so much red tape that many global firms would rather automate than employ more people.

Having workers in the right places is critically important to generating more and better jobs. In both the rich and the emerging world unmet demand for housing is a significant constraint on growth. In developing economies many large cities have outgrown their capacity to house their populations, resulting in sprawling slums that harbour crime and disease. India’s government, for example, tightly restricts land use, making new construction costly and modern housing extremely expensive.

In rich countries restrictions on the supply of housing can be just as pernicious. In economically dynamic places such as New York and London the shortage of housing is a serious constraint on growth in output and highly paid jobs. Inadequate investment in infrastructure exacerbates the problem. As roads and trains become more crowded, residents grow wary of agreeing to new developments, and so it goes on.

Back to school

The best hope for reducing the glut of less-skilled labour is to transform some of it into the more-skilled sort through higher spending on education. In the 19th and 20th centuries it took significant public investment to ensure that newly industrialised economies had a supply of labour with the right qualifications. Something similar is needed today. Rich countries are short of highly skilled workers, and many developing economies lack the basic educational infrastructure to produce a more effective labour force. Immigration from poor countries to rich ones might help adjust that global imbalance, but is too politically contentious to make a big difference. Across the world more effort is needed to improve primary and secondary education.

A good standard of literacy and numeracy across populations in emerging countries will be critical if large numbers of workers there are to take part in trading global services. Governments need not turn every student into a PhD candidate to boost his or her earnings prospects. Demand for skilled tradespeople such as plumbers and electricians remains high. Recent studies of the long-term effect of good teaching indicate that improving the quality of teachers just from poor to middling has a significant effect on the lifetime earnings potential of a typical school class. Long-run analyses of intensive pre-school programmes suggest that they achieve annual social returns on investment (allowing for expected cost savings from reduced crime and welfare spending) of 7-10%.


But providing better opportunities through education and deregulation may not be enough to ensure that the benefits of technology-based growth are sufficiently widely spread. As in past economic revolutions, the social safety net will also need to be strengthened. That might include measures such as introducing or extending minimum wages. This time, however, governments face a sticky problem. If such policies make workers more expensive, firms will hire fewer of them. If on the other hand wages are kept very low and benefits are reasonably generous, workers may be dissuaded from looking for jobs. And at a time when fiscal demands on taxpayers are rising, governments cannot afford to allow labour-market participation to fall and thus reduce their tax base.

Some research suggests that modest increases in minimum wages can lead to productivity improvements. That may be because they reduce worker turnover, or because they prompt firms to invest in their workers or get them to work harder. Yet although higher minimum wages can be politically appealing, their use will need to remain limited. The easier it becomes to automate basic work, the less of a nudge firms will need to swap workers for machines when wages rise.

One way of squaring that circle would be for governments to provide wage subsidies. Such payments encourage participation in the labour force by making work more worthwhile for low-paid workers without discouraging firms from recruiting. America’s earned-income tax credit and Britain’s working tax credit both use the tax system to help families with low incomes. In America the subsidy available to poor families with children is relatively generous, but the maximum paid to the childless is a miserly $496 a year.

Economists frown on the idea of sharing out work to make it go further, but as a temporary measure it has been used with some success. The best-known example is that of the Kurzarbeit programmes used in Germany during the recession following the 2008 financial crisis. Workers accepted a shorter working week in lieu of lay-offs, and the government helped make up the resulting shortfall in income.

If the dislocating effect of technology turns out to be really severe this time, governments might consider offering a universal basic income, just sufficient to live on, to which all working-age adults would be entitled. A basic income for all is an old idea receiving new attention because of the recent labour-market upsets. Switzerland is getting closest to trying this out: last year campaigners there obtained enough signatures to force a referendum, to take place in the next couple of years, on introducing a basic income of SFr30,000 ($32,000).

The idea of a guaranteed income runs smack against core beliefs regarding the meaning and importance of work. Allowing people to become full-time couch potatoes at public expense is abhorrent to those who reckon that healthy adults should contribute to society in order to benefit from its economic output, as well as to those who see work as a source of personal dignity or a means to maintain mental balance, to say nothing of the majority who would still be working for their living and generating the tax income that would fund such a scheme.

What would you do if you didn’t have to work?

Entitlement to a basic income might be linked to a requirement to seek a regular job, take part in make-work schemes or engage in volunteering. Yet economic liberals might argue that such paternalism is unlikely to make anyone better off. And freedom from want might create scope for other socially benign activities, such as work or self-employment that generates some income, just not enough to live on. Given a basic income, many more budding entrepreneurs might launch businesses doing something they feel passionate about.

Whichever way governments respond, budgets will be tested. Even modest increases in income subsidies imply both a rise in government spending as a share of GDP and a concentration of the tax burden on a smaller share of the population. A higher tax burden will encourage tax avoidance among the very rich and distort economic decisions. In America and Britain the top 1% of earners already contribute 46% and 28% respectively of all tax revenues. If they are squeezed too much, some of them might take their money and move elsewhere. Governments got much bigger after previous technological revolutions. They cannot expand much more without running into serious fiscal constraints.

Tax competition may become an increasingly divisive international issue. Some of the highly mobile rich will be attracted by countries with low-tax, low-spending regimes, whereas the relatively immobile poor will hope for generous state benefits at home. Governments may need to tighten up their residence rules to prevent the rich from pretending to live in a low-tax country to minimise their tax bill, and tax regimes may need to be co-ordinated to discourage avoidance and evasion.

Preventing fiscal disaster may also require comprehensive reforms to make tax systems more efficient, so that a given tax burden is more difficult to dodge and less disruptive to the economy. One way of doing that would be to tax immobile factors such as land more heavily. Land taxes within cities, if combined with a loosening of zoning restrictions, should encourage denser construction, which could help alleviate housing shortages in some of the most expensive places.

Taxing undesirable activities such as emitting carbon and causing pollution would also raise revenues at minimal economic cost. Shifting the brunt of taxation from income to consumption in America could help the country resolve its fiscal and inequality problems at the same time—provided the money is used to boost sagging incomes. In Europe the use of value-added taxes has allowed governments to maintain high public expenditure at relatively low economic cost. America, which currently has a progressive tax system but spends less on helping the poor, might need to review its system.

The first two industrial revolutions fundamentally changed the relationship between the individual and the state. The digital revolution now in progress will inevitably bring about yet another such change. Governments may need to develop new economic approaches, giving technology freer rein to transform production while providing workers with more of a cushion against the painful effects of that creative destruction. Some might instead tolerate the emergence of a growing underclass that is hard to escape from while continuing to search for a technological solution to underemployment. Governments themselves might be transformed by new political movements emerging in response to the dissatisfaction generated by technological change: in benign ways, through political reform and realignment, or in uglier fashion.

Technologies are tools without an agenda of their own, but their influence on society is never neutral. They blindly sweep aside the livelihoods of some people and enrich others. Politics must craft rules and institutions that harness technology to suit society’s values and vision of itself.

    

 

Walmart Buys Govt

 

When retail workers want something, they ask their employers, get denied, get bullied and sometimes fired. Sometimes, they take to the streets, as they have for the last three years on Black Friday. By contrast, when retailers want something, they scurry to the halls of Congress, where they purchase influence with their exorbitant profits.

The largest big-box retailers have spent a total of $111 million since the 2000 election cycle on lobbying and campaign contributions. During the 2014 election cycle, big-box retailers spent $30 million on federal campaign contributions and lobbying, which is almost six times what they spent in 2000 (after an inflation adjustment).

Wal-Mart in particular is known for its status as a corporate-welfare queen; one study estimates that one 300-person Supercenter costs taxpayers $904,542 to $1,744,590. Another estimates that Wal-Mart and the Walton family pull in $7.8 billion a year in tax breaks and subsidies. Meanwhile, a brand-new report from Americans for Tax Fairness finds that Wal-Mart also avoids taxes on more than $21 billion in offshore profits. In its most recent annual report, Wal-Mart openly admits that changes to government food stamp programs may hurt its financial performance. Hundreds of thousands of Wal-Mart workers make near-poverty wages.

Wal-Mart certainly benefits from other favorable government actions. Changes to labor laws allow it to abuse worker schedules, although sometimes it ignores them completely and simply refuses to pay workers at all. Since Wal-Mart is a serial polluter, lax environmental standards are beneficial. It’s also very concerned about taxes, trade and intellectual property. To advance these interests, Wal-Mart spent $2.4 million on campaign donations and $12.5 million on lobbying in 2014 alone.

Even so, these numbers vastly understate political spending. For one, there is not yet a comprehensive database on state and local political spending. Further, many of these retailers are members of 501(c)6 groups like the American Legislative Exchange Council (ALEC) and the Chamber of Commerce; money paid to these groups is not reported. Also unreported are donations to 501(c)3 groups. As Demos has noted before, there is a clear need for stronger disclosure requirements, so that the full political influence of companies can be made available to the public.

What we do know tells us that Republicans live up to their reputation as the “party of big business”, pulling in more than $2 from retail for every $1 that goes to Democrats. With the exception of Costco, every big-box retailer heavily favors Republicans.

Numerous studies find that this money buys influence. Retailers lobby on a variety of issues, including tax policy, labor issues and the terms of international trade. A vast literature shows that these efforts produce returns, often at the expense of other democratic interests. In a comprehensive study of such conflicts, researchers found that business interests prevailed in 9 out of 11 issues in which businesses and labor were opposed. In the 16 cases that pitted business groups against citizen group coalitions, businesses won nine.

Taxes were the most frequently lobbied issue by large retailers in 2014, and by a wide margin. This legislative area has proven lucrative for business in the past; a 1 percent increase in a firm’s lobbying expenditures yields an effective tax rate between 0.5 and 1.6 percent lower for the firm that lobbies. One study on the subject finds that the market value of an additional dollar spent on lobbying could be as high as $200. In 2014, the largest big-box retailers reported lobbying on a total of 37 incidences of specific taxation issues, including corporate tax reform, Internet sales tax and the extension of temporary tax breaks. The next most common issues of lobbying were health care reform, labor, antitrust and workplace regulations. As one example of the power of lobbying, the research firm Strategas maintains an index of the 50 firms that lobby most intensely. The index has outperformed the S&P 500 every year since 1998.
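
The tax-lobbying figure is easiest to grasp as back-of-the-envelope arithmetic. A minimal sketch, treating the 0.5-1.6 percent as a relative cut in the rate and assuming an invented baseline firm:

```python
# Rough arithmetic for the lobbying-and-taxes claim quoted above.
# The baseline firm is an assumption; only the 0.5-1.6% range comes from the study.

pretax_profit = 1_000_000_000   # assumed: a retailer with $1bn in pre-tax profit
effective_tax_rate = 0.30       # assumed baseline effective tax rate
lobbying_budget = 10_000_000    # assumed current annual lobbying spend

extra_lobbying = lobbying_budget * 0.01          # a 1% increase: $100,000
for rate_cut in (0.005, 0.016):                  # 0.5% to 1.6% lower rate
    tax_saved = pretax_profit * effective_tax_rate * rate_cut
    print(f"spend ${extra_lobbying:,.0f} more, save ${tax_saved:,.0f} in tax")
```

On those assumptions an extra $100,000 of lobbying saves between $1.5m and $4.8m in tax, which is how returns of the order the studies describe become plausible.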

Campaign contributions also produce benefits. There is strong evidence that the most important impact of campaign contributions is to increase access to politicians with the intent of setting the political agenda. A study of the telecommunications industry finds that regulators respond to private political spending with regulations that favor the donors. Companies that bid for federal contracts across industries are more likely to be granted those contracts if the bids are complemented by campaign contributions.

What can be done to stop the spread of big money in politics? Over the long run it will be necessary to overturn decisions like Citizens United that opened the floodgates of money into Congress. But there are also short-term solutions. The Supreme Court explicitly endorsed disclosure as the alternative to limits on campaign spending. On the heels of Citizens United, Congress came within one vote of overcoming a party-line filibuster to pass the DISCLOSE Act. In the absence of congressional action, shareholders should demand that corporations either get out of politics or disclose their donations to organizations like ALEC that shareholders might not approve of. The SEC should require this disclosure if Congress won’t.

Publicly financed elections can help candidates not in the pockets of big money get into office, and more states should consider the system. To slow the rise of lobbying, states and the federal government should regulate it more strictly. Patrick Flavin finds that states with stricter regulations on lobbying are more politically equal — that is, responsive to voters of all income groups. Voters need to tell Congress that even on Black Friday, government isn’t up for sale.

    

 

Act Local

 

A businesswoman was so fed up with the state of her village that she has employed a handywoman to clean up and patrol the community.

Ling Valentine, 41, advertised for someone to revamp Grange Villa, Co Durham, after it became starved of council funds and police resources. Clare Honey, 40, a former care worker, applied for the job, which required someone to do “absolutely anything that needs doing”. She works 40 hours a week on a salary of £16,000, and patrols and does odd jobs in the former pit village armed with a JCB truck and company-issue clothing.

Ms Honey, nicknamed “Breath of fresh Clare” by locals, collects rubbish, sweeps public areas, picks up dog mess, cuts hedges and fixes light bulbs in people’s homes. She has even put up “Welcome to Grange Villa” signs after her employer spent £1,000 of her own money. If street lights are out, the council is informed, while police get a call if anything criminal is reported.

Mrs Valentine, who runs her own vehicle leasing business and once turned down the offer of £50,000 on the TV show Dragons’ Den, said: “She is a help person. We did not want her to be a warden or official. If an elderly person needs their ivy trimmed, Clare will do it. Even if they need the newspaper picked up from the newsagents she will do it.”

    

 

Faux News

 

Tucker Carlson said on Fox that more children die of bathtub drownings than of accidental shootings. They don't.

Steve Doocy said on Fox that NASA scientists faked data to make the case for global warming. They didn't.

Rudy Giuliani said on Fox that President Barack Obama has issued propaganda asking everybody to "hate the police." He hasn't.

John Stossel said on Fox that there is "no good data" proving secondhand cigarette smoke kills nonsmokers. There is.

So maybe you can see why serious people — a category excluding those who rely upon it for news and information — do not take Fox, well … seriously, why they dub it Pox News and Fakes News, to name two of the printable variations.

Fox is, after all, the network of death panels, terrorist fist jabs, birtherism, anchor babies, victory mosques, wars on Christmas and Benghazi, Benghazi, Benghazi. It's not just that it is the chief global distributor of unfact and untruth but that it distributes unfact and untruth with a bluster, an arrogance, a gonad-grabbing swagger, that implicitly and intentionally dares you to believe fact and truth matter.

Many of us have gotten used to this. We don't even bother to protest Fox being Fox. Might as well protest a sewer for stinking.

But the French and the British, being French and British, see it differently. And that's what produced the scenario that recently floored many of us.

There was Fox, doing what Fox does, in this case hosting one Steve Emerson, a supposed expert on Islamic extremist terrorism, who spoke about so-called "no go" zones in Europe — i.e., areas of Germany, Sweden, France and Britain — where non-Muslims are banned, the government has no control and sharia law is in effect. Naturally, Fox did not question this outrageous assertion — in fact, it repeated it throughout the week — and most of us, long ago benumbed by the network's serial mendacities, did not challenge Fox.

Then, there erupted from Europe the jarring sound of a continent laughing. British Prime Minister David Cameron called Emerson an "idiot." A French program in the mold of "The Daily Show" sent correspondents — in helmets! — to interview people peaceably sipping coffee in the no-go zones. Twitter went medieval on Fox's backside. And the mayor of Paris threatened to sue.

Last week, Fox did something Fox almost never does. It apologized. Indeed, it apologized profusely, multiple times, on air.

The most important takeaway here is not the admittedly startling news that Fox, contrary to all indications, is capable of shame. Rather, it is what the European response tells us about ourselves and our waning capacity for moral indignation with this sort of garbage.

It's amazing, the things you can get used to, that can come to seem normal. In America, it has come to seem normal that a major news organization functions as the propaganda arm of an extremist political ideology, that it spews a constant stream of racism, sexism, homophobia, Islamophobia, paranoia and manufactured outrage, and that it does so with brazen disregard for what is factual, what is right, what is fair, what is balanced — virtues that are supposed to be the sine qua non of anything calling itself a newsroom.

If you live with aberrance long enough, you can forget it's aberrance. You can forget that facts matter, that logic is important, that science is critical, that he who speaks claptrap loudly still speaks claptrap — and that claptrap has no place in reasoned and informed debate. Sometimes, it takes someone from outside to hold up a mirror and allow you to see more clearly what you have grown accustomed to.

This is what the French and the British did for America last week.

For that, Fox owed them an apology. But serious people owe them thanks.

    

 

Weather Forecasting in US

 

In the 1930s the chance of an American being killed by lightning was about 1 in 400,000. Today, according to the author and statistician Nate Silver, it’s 1 in 11 million: a staggering improvement. Why? Partly because folks in the modern US take their work breaks next to water-coolers rather than under trees but mainly, Mr Silver says, because American weather forecasters have mastered their trade; people get proper warnings of storms and have time to get out of the way.

His message was summed up a few years ago in a New York Times headline: “The weatherman is not a moron.” It’s a mantra worth repeating this week, as the m-word is back in vogue after a series of events and nonevents that illustrate how weather in America is one of the things — like religion and Cheez Whiz — that really do separate them from us.

You may have read about “snowmageddon”, a storm that the mayor of New York said looked like one of the worst in history and was heading for New York and New Jersey. There would be three feet of snow, whipped up by high winds. It would be crippling. Stern-faced governors and city mayors stopped public transport and imposed curfews on drivers.

Only there was no snow. Or hardly any: the main fall was farther north in Boston and rural New England, where it is simply part of life. Over to you, National Weather Service meteorologist Gary Szatkowski: “My deepest apologies to many key decision-makers and so many members of the general public. This is a big forecast miss.”

On the face of it that is the end of the matter: the forecasters cocked it up and the politicians overreacted. The truth, however, is more complex. There is a clash in American meteorology — and life — between two distinct characteristics: first, the profound belief that human beings can and should order the world around them to maximise prosperity and happiness; a life-enhancing childlike energy that suffuses every aspect of American life and says: “Hot damn, we don’t have to put up with this. We can master the weather!”

And yet, second, set against this, is the sheer danger of that weather, which is perfectly capable of killing large numbers of people. Americans built their nation in the teeth of a storm that is by no means over. In my first year in Washington DC we had two small tornados, a hurricane that left us without power for a week, thick snow, body-sapping heat and electrical storms that required you to stop driving and take cover. Until recently (when the Australians claimed the title) the windiest spot on Earth was Mount Washington in New Hampshire, where a gust of 231mph was recorded in 1934.

Meteorologists know America is a rough neighbourhood and they know as well that their customers are much less rough (or tough) than they used to be and much less keen on surprises. Too often modern urban Americans think the weather might really have been conquered. The result is the wonderfully named “wet bias” — commercial forecasting that deliberately overestimates the probability of precipitation.

Nate Silver’s 2012 book The Signal and the Noise drew attention to this weird aspect of American behaviour. The forecasting was getting more sophisticated yet the public were apt to be angry if it rained when the probability was said to be low but much less angry if it was (falsely) said to be higher.
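
It is easy to picture the wet bias as a mapping from the model’s probability of rain to the number read out on air. A minimal sketch, with an inflation rule invented for illustration rather than taken from Silver’s data:

```python
# Illustrative "wet bias": report more rain than the model predicts.
# The inflation rule is invented; it is not any broadcaster's actual practice.

def reported_rain_chance(model_prob: float) -> float:
    """Pad low and middling probabilities; leave near-certain rain honest."""
    if model_prob < 0.05:
        return 0.10                          # never promise a dry day
    if model_prob < 0.50:
        return min(model_prob * 1.5, 0.50)   # inflate the middle of the range
    return model_prob                        # high probabilities stay as they are

for p in (0.02, 0.10, 0.30, 0.80):
    print(f"model: {p:.0%} -> on air: {reported_rain_chance(p):.0%}")
```

The asymmetry in viewer anger described above makes the padding rational for the forecaster, if not for the viewer planning a picnic.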

So weather forecasters exaggerate potential harm and so, of course, do politicians. Storms have destroyed political careers. Timely warnings have launched them.

Big risk and big aversion to risk: that’s life in modern America. It also explains the growing rift between rural and urban Americans. Want to understand why rural America is so brutal? Just look up into the big sky. Or watch the disaster film Into the Storm, released last year. The critics said the dialogue was awful but the tornados were awe-inspiring. Which, many Americans would say, just about sums up Oklahoma.

    

 

American Politicians and The Religious Right

 

GOD had a busy week. Alabama alone was a heavy lift, what with all those God invocations by state leaders trying to cast out the demon of gay marriage, then London called as well. Scott Walker was on a trip there, and he tugged God into the picture when he was asked about evolution and declined to answer, as if embracing it would be a heathen outrage.

In a subsequent tweet, Walker insisted that there wasn’t any conflict between “faith & science,” which, he wrote, “go hand in hand.”

That’s debatable. This isn’t: Faith and government shouldn’t be as cozy as they are in this country. Politicians in general, and Republicans in particular, shouldn’t genuflect as slavishly as they do, not in public. They’re vying to be senators and presidents. They’re not auditioning to be ministers and missionaries.

No one told that to Rick Perry as he ramped up for the 2012 presidential race and gave God a workout to be remembered. I’ve certainly never forgotten it. He was then the governor of Texas, and in April 2011, as wildfires ravaged the state, he signed a gubernatorial proclamation denoting one 72-hour period as the Days of Prayer for Rain in the State of Texas.

The following month, reflecting on the array of problems confronting America, he said, “I think it’s time for us to just hand it over to God, and say, ‘God: You’re going to have to fix this.’ ” And three months after that, he gathered some 30,000 people, most of them evangelical Christians, in a Houston stadium for an event called The Response: A Call to Prayer for a Nation in Crisis.

As Manny Fernandez noted in his coverage of that rally in The Times, Perry used “his office’s prestige, letterhead, Web site and other resources to promote it.” I don’t see much of a separation of church and state there.

Remarkably, none of this was a drag on his aspirations for the Oval Office, not at all. He remained a serious contender for his party’s nomination until a debate performance that was less than celestial sent him tumbling to earth.

Faith is a serious matter, and an important one, but it’s trivialized when it’s toted too readily and stridently into the political arena.

And while a creed can rightly be a personal compass, it’s wrongly deployed as marching orders or a governing strategy. Politicians’ religions — and I use the plural on purpose, because there’s no one religion that gets to trump the others — should be a source of their strength and of their empathy, not of their agendas.

But that’s not the way it works out in this country, especially not among Republicans, who can’t quit their fealty to the religious right and who, because of that, drive away many independent voters who are otherwise receptive to an ideology of limited government, personal responsibility and muscular foreign policy.

These voters just can’t stomach all the moralizing that comes with that ideology. They can’t take the placement of divinity above Darwin.

And there’s a heavy dose of divinity.

Mike Huckabee, who is an ordained minister in the Southern Baptist church, put God in the title of a new book that he wrote and just released on the cusp of what may be another presidential bid. He ran previously in 2008, when he won the Iowa caucuses.

The book is called “God, Guns, Grits, and Gravy.” These are a few of his favorite things.

During a recent appearance on a Christian TV program, he explained that he was mulling a 2016 campaign because America had lost sight of its identity as a “God-centered nation that understands that our laws do not come from man, they come from God.” The way he talks, the Constitution is a set of tablets hauled down from a mountaintop by a bearded prophet.

He later added that “the only thing worse than not being elected president would be to be elected president without God’s blessing. I can’t think of a worse place in the world to be than in the Oval Office without God’s hand upon you.”

Last week he injected religion into politics in a different way, recalling President Obama’s recent reference to the Crusades and questioning the president’s respect for Christianity. Huckabee said that Muslims are “the one group of people that can know they have his undying, unfailing support.”

That’s ugly and absurd. While I agree that Obama’s digression into history was ill-timed and unnecessary, I’m offended by Huckabee’s extrapolation.

Huckabee is an extreme case within his party, but the Republican courtship of the religious right and its fear of giving offense to Christian fundamentalists are pervasive. Republican presidential candidates, even relatively moderate ones, run from the subject of evolution as if it were a ticking bomb. And they routinely polish their religious bona fides.

But we should be wary of politicians who are too eager to talk of religion, which is an easy rallying cry and, frequently, a diversion or even a disguise. It can cover up private misdeeds.

It can put a rosy glow on political calculations. Obama, for example, framed his past opposition to gay marriage as a deeply personal matter of faith. But as David Axelrod’s new book, “Believer,” makes clear, it was a deeply expedient matter of evading some voters’ wrath. He more or less supported gay marriage, at least when he was away from the podium, all along.

We should be even warier of politicians and other leaders who wrap policy in dogma, claiming holy guidance. That’s a dangerous road to take. At the far, bitter end of it lie theocracies and brutal extremists.

We should listen hard to what’s being said in Alabama, where opponents of gay marriage aren’t merely asserting that it runs counter to what Alabamians want. They’re declaring that it perverts God’s will, which was the position that some racists took about integration.

Last week, the chairman of the Alabama Republican Party wrote that the state would “reap God’s wrath if we embrace and condone things that are abhorrent to God, such as redefining marriage.”

And in an interview with the CNN anchor Chris Cuomo, Alabama Chief Justice Roy Moore — the man who once put up a granite monument to the Ten Commandments in the rotunda of the Alabama Judicial System building — said, “Our rights, contained in the Bill of Rights, do not come from the Constitution, they come from God.”

“That’s your faith,” Cuomo replied. “But that’s not our country.”

Cuomo’s right, and God should be given a rest. Even in Genesis he got one.

    

 

American Conservatives

 

.... a deep wellspring of American political thought, one defined by the Columbia historian Richard Hofstadter five decades ago. In an article in Harper’s, Hofstadter described “the paranoid style in American politics,” which he said was characterized by “heated exaggeration, suspiciousness, and conspiratorial fantasy.” Looking back, Hofstadter pointed to the anti-Masonic movement and the nativist and anti-Catholic movement as examples, but he also ascribed the paranoid style to his own era. He wrote:

The modern right wing . . . feels dispossessed: America has been largely taken away from them and their kind, though they are determined to try to repossess it and to prevent the final destructive act of subversion. The old American virtues have already been eaten away by cosmopolitans and intellectuals; the old competitive capitalism has been gradually undermined by socialistic and communistic schemers; the old national security and independence have been destroyed by treasonous plots, having as their most powerful agents not merely outsiders and foreigners as of old but major statesmen who are at the very centers of American power. Their predecessors had discovered conspiracies; the modern radical right finds conspiracy to be betrayal from on high.

Hofstadter wrote those words in 1964.

    

 

GOP Captured By The God Squad

 

Another presidential campaign is taking shape, and potential Republican candidates are beginning to speak with extra care — and sometimes with censorious hellfire — about certain social issues. As ever, they’re bowing to a bloc of voters described as Christian conservatives.

But these voters are a minority of Christians. Nor are they representative of conservatives as a whole.

They have a disproportionate sway over the Republican Party. And because of that, they have an outsize influence on the national debate.

That’s an inescapable takeaway from new data compiled by the Public Religion Research Institute, a nonpartisan group that interviewed more than 50,000 Americans last year.

To put together what it is calling the American Values Atlas, the institute divided survey respondents into more than a dozen faith-related categories, some of which factored racial identity into the equation as well. White evangelical Protestants and black Protestants are separate groups, as are white Catholics and Hispanic Catholics.

The institute looked at three issues: gay marriage, abortion and immigration.

It gave me a sneak peek at the results, being released in full on Wednesday, and also did some special analyses.

Among religious groups with large populations, white evangelical Protestants, who represent 18 percent of all Americans but 36 percent of self-identified Republicans, according to the survey, stood out as the most conservative.

If you looked at the responses of all Republicans minus this evangelical subset, you saw a remarkably different party.

Among all Republicans, 35 percent favored the legalization of gay marriage, while 58 percent opposed it. But subtract the white evangelicals and the spread changes: 45 to 47. The party becomes almost evenly divided, in bold contrast to the decidedly negative stance that most Republican congressional leaders take.

Just 39 percent of all Republicans said that abortion should be legal in all or most cases, while 58 percent said that it shouldn’t. Subtract the white evangelicals and again the split is nearly even: 48 to 49 percent. So the party’s anti-choice ardor makes sense chiefly in terms of evangelicals.
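
The subtraction exercise is a weighted average run in reverse, so the quoted figures also imply how conservative the evangelical subset itself is. A minimal sketch using the numbers above, and assuming the groups combine as a simple weighted average (my assumption, not the institute’s published method):

```python
# Back out white evangelicals' implied support from the quoted figures,
# assuming overall = share * subgroup + (1 - share) * rest.

evangelical_share = 0.36   # white evangelicals as a share of Republicans (quoted)

def implied_subgroup(overall: float, rest: float, share: float) -> float:
    """Solve overall = share*subgroup + (1-share)*rest for the subgroup."""
    return (overall - (1 - share) * rest) / share

gay_marriage = implied_subgroup(0.35, 0.45, evangelical_share)
abortion = implied_subgroup(0.39, 0.48, evangelical_share)
print(f"implied evangelical support for gay marriage: {gay_marriage:.0%}")  # ~17%
print(f"implied evangelical support for legal abortion: {abortion:.0%}")    # ~23%
```

On those figures only about one in six white evangelical Republicans favours gay marriage, which is the gap the column is describing.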

In fact Republican voters who aren’t evangelical are probably even more socially liberal than those even-split numbers suggest, because the institute’s survey counted as Republican or Democrat only those respondents who readily identified themselves that way. It didn’t press others on whether they typically voted for one party, so the share of voters it categorized as independent was unusually large: 40 percent. In other words, its Republicans were the most committed Republicans.

“If you added in independents who lean Republican, you would expect to see the support for same-sex marriage and the support for abortion rise, precisely because they tend to be less ideologically oriented,” Robert Jones, the head of the institute, told me.

And you’d see that white evangelical Protestants “tend to be outliers today on issues like same-sex marriage,” he said. He added that in the 1980s, when Jerry Falwell and others referred to white evangelical Protestants as part of a “moral majority,” the designation was overblown but perhaps arguable.

“Today, on these key, bellwether issues like same-sex marriage, it’s no longer true,” Jones said. While 55 percent of all Americans, 52 percent of white Catholics and 42 percent of Hispanic Catholics said that abortion should be legal in all or most cases, just 32 percent of white evangelical Protestants responded that way.

In terms of allowing gays and lesbians to marry legally, 77 percent of Jews and at least 60 percent of all three Catholic subgroups — white Catholics, Hispanic Catholics and “other non-white Catholics” — said they favored it. So did more than 60 percent of white mainline Protestants.

But white evangelical Protestants?

Just 28 percent.

According to the survey, there are just seven American states — Alabama, Mississippi, Arkansas, West Virginia, South Carolina, Tennessee and Kentucky — where more than 50 percent of voters oppose gay marriage.

All of them “have white evangelical Protestant populations of one-third or more,” Jones told me. “There’s basically a linear relationship between the number of white evangelical Protestants and opposition to same-sex marriage.”

And when survey respondents were asked whether immigrants “strengthen the U.S.” or “are a burden,” the only religious group in which fewer people said “strengthen” than “burden” was white evangelical Protestants. The spread was 36 to 53 percent.

Among all Americans, the spread was the opposite, with 55 percent saying “strengthen” and 36 percent saying “burden.”

The conversation on Capitol Hill doesn’t exactly reflect that. Instead it suggests, accurately, that some voters’ voices are louder than others’, and are better heard.

    

 

How To Sneer At The Greens

 

They’re against cars, ‘cow rape’ and shiny, plastic politicians. But the Greens are getting slicker and more corporate as they gain ground. Who are they and what do they want?

There is barely a way to describe the smell. A soft, composty fug rises up from the bowels of a conference centre in Liverpool. This is the new Green smell, the quiet, suburban nidor of sensitive solicitors and hippie IT managers.

Occasionally, trekking past the neat stalls with the new green slogans, there is a savage, nose-tasering blast of the old Green smell, a pheromonal barfstorm bad enough to melt your tights. One man, visibly unclean, covered in bags and badges, emits a body odour so horrifying I feel the flesh slowly peeling off my face. But then he is gone. This is a new party, new pores.

The newest pores belong to Natalie Bennett, the Green leader, a scrubbed and shiny mini-heffalump who delivers an opening speech standing on a 10in box in the main auditorium.

“The Green faahty,” she whoops, in her penetrating Qantas honk, “has quadrupled in size since last year.” The membership of the Green faahty, she continues, is now 55,000. The Green faahty is, more importantly, bigger than the Lib Dems or Ukip — putting it in a weirdly powerful position, as one of the biggest of the small parties ahead of an election that will almost certainly result in coalition.

This is the official reason why an unexpectedly large number of excited carrotsniffers are flooding the conference, although I am sure at least half of them (including me) are here to see whether Bennett will spectacularly mess up again, after a disastrous LBC interview two weeks ago in which she managed to flatline on the subject of green housing policy (who wouldn’t?).

Bennett blamed a) a huge cold, b) brain fade, c) being human. Everyone now adopts the firm line that Natalie is “human”. She is the opposite of a “shiny, plastic” politician, says the former party chairwoman, Jenny Jones, a weaponised marsupial with dead dragon’s hair who leapt so wildly to Bennett’s defence after the interview that she publicly told her to shut up.

“It was completely wrong of me to stop her,” she says. “But I was trying to protect her, like a tiger mother.” She repeats the mantra: we are not like the other parties. We are not shiny, slick politicians.

But glancing around the vast conference hall, the only thing I can see is shiny slickness. The place hums like an expensive yoghurt advertisement. It feels absurd — more than 1,000 delegates and all this fuss, this branding, this signage, this vast expense, for a party with just one seat and hopes of a couple more at most?

Who knew the Green faahty could feel this corporate — a faahty where even the cannabis stall looks like the reception desk at the local Ibis. Tom, an IT worker in favour of legalising cannabis, looks as if he has eaten the world’s entire supply of skunk. Even his eyelids have eyelids. “This country is a shambles,” he says. And so the first real question I ask at the Green faahty conference is: are you selling cannabis? “No,” he says. But I can come to a cafe if I am around tomorrow, a “speakeasy”, which I must not tell anyone about.

I imagine this is a pretty standard exchange for people at the conference, along with discussions about incinerators and bicycles and craftivism and even, during the foie gras workshop I attended, cow rape. According to one activist, the motion to ban the import of foie gras, produced by the cruel force-feeding of French geese, was “discriminatory” against cows, animals who are themselves kept in horrific servitude in evil sex camps (farms) where they are “rape-impregnated five or six times and their babies are taken away”. “Mmmm, well, I could happily do away with the entire milk industry,” nods a vegan of 43 years. “The Green party would want to help society transition to a plant-based diet.” That is not to say the Green faahty would fascistically impose veganism — I think. They simply want to promote a greater understanding of “a protected philosophical belief” for infants as young as one week, says an activist at the vegan desk. (Does she miss bacon? “Not in the slightest. But I do miss bitter.”)

Next is the desk for Green party virgins. First timers, mostly students, journalists and escapees from Labour, are welcomed like visitors to Alton Towers, given a badge and a leaflet which explains how to behave.

The first thing the leaflet does is tell everyone — caringly — to shut up. Some groups, it coos, “can dominate the discussion”. Members must “limit their contributions”. In order to calm everyone down, each session will begin with “attunement”, a one-minute period of official shut-up where “all activity stops”.

I sit in the hall during attunement, a sad group hum that feels more like a minute of mourning for the passing of some of the party’s more bonkers policies. These include ancient pledges to ban cars, create genderless passports and extend the Human Rights Act to non-humans.

The Green faahty is now slightly embarrassed about these policies. Or at least, during a halloumi-powered press conference just before Bennett’s speech, a ginger man in a suit seeks to play down the drama over the cars, saying the Green party never really intended to ban them — it was just, ha ha, a really old policy that “cleverly” found its way to the top of the agenda.

No one buys this. No one buys the outlandish rumour that Natalie is prepared, either. “Has she written her speech yet?” screeches someone two minutes before she comes on. “Her speech has been written and it might be quite good,” says the ginger man. A blazing endorsement.

The backdrop for the speech is an eye-popping green baize reminiscent of Crufts. There are two women, “visual minuters”, in the corner drawing pictures on a whiteboard. Someone draws a large picture of Caroline Lucas, the MP for Brighton Pavilion and Bennett’s unofficial daycare nurse. The audience applauds wildly as Lucas carefully introduces Bennett, saying how the Green party does leadership “differently”.

By “differently”, of course, she means picking the most random person anyone can think of (an Australian with three inexplicable degrees). Lucas leads Bennett onstage like a kindly guide dog, whereupon Bennett nervously takes the podium. And then, oh my God, what an eardrum-shattering staccato. “We have. To be. Up to the task!” she shrieks.

The policies are: more money for poor people, more food for poor people, pray for the planet, never talk to the Tories.

The good news is that Bennett gets to the end of her speech without corpsing or weeping, before a panto-style finale with Lucas. They stand with their hands in the air, like anxious children, for what seems like minutes. An angry sign prompts the heavily attuned audience: “Applause!”

    

 

The Flawed Thinking Behind Political Correctness

 

In a liberal society we want to do the decent thing. We want people to be tolerant and considerate towards others who are different in some way. So we are right to have passed laws preventing expressions of racial or other kinds of prejudice. This liberal niceness, however, morphed in the 1980s into something very different. This was hate crime, rooted in identity politics — the agenda of groups who claimed to be the victims of discrimination based on race, ethnicity, gender, sexual proclivity or disability.

This was not a liberal concept at all. It was based instead on the Marxist concept that the whole of human society is animated by the play of power. According to this doctrine, people without power are invariably the victims of those with power.

These victims cannot be blamed for anything because they are shaped by circumstances beyond their control. It is the powerful who are therefore blamed for any bad stuff associated with the powerless. And if the powerful try to hold the powerless responsible, that is an expression of prejudice.

Thus political correctness was born, with prejudiced hate crime its signature offence. But, of course, prejudice is highly subjective; one person’s bigotry might be another’s legitimate criticism.

The key point about prejudice is surely that it is based on untruth or distortion. Statements that are true and fair cannot embody prejudice. Which is why Mr Phillips says people should be able to say, for example, that black people are more likely to be convicted of robbery — because it is true. The problem, though, is deeper still. As Mr Phillips observes: “Campaigners like me sincerely believed that if we could prevent people expressing prejudiced ideas then eventually they would stop thinking them.”

This insight gets to the core of politically correct zealotry. It is about more than stamping out the expression of hatred. It’s about stamping out the perceived hatred itself. It’s an agenda to make better human beings. And that leads to attempted thought control.

Last week I caught up with the hit stage play, The Nether. Its conceit is a dystopian world dominated by the net in which men are pursued by a cyber police force for creating a fantasy world of paedophile activity with children who exist only in the imagination. The question it asks is whether we should be called to account not just for what we do but for what we think.

I believe morality consists in deeds not thoughts. People are capable of both good and bad; we should encourage the former and discourage the latter. But thinking is the ultimate private activity. What anti-racism tried to do, as Mr Phillips now recognises, was to regulate how people think.

Bad thoughts, however, cannot be excised from the mind. The perfection of human nature is a utopian agenda. And like all such utopian projects going back to the French Revolution, its inherent impossibility leads to tyranny.

In 1951, the political thinker J L Talmon described this as “totalitarian democracy”, in which a “vanguard of the enlightened” justified coercion against those who refused to be virtuous in order to quicken man’s progress towards perfection and social harmony.

In similar vein, Paul Edward Gottfried described in his 2002 book Multiculturalism and the Politics of Guilt the “strong-armed tactics” of liberal societies which camouflage bullying as “effusive caring” or the necessary response to prejudice.

Muzzling dissent has thus become viewed as imperative to combat bigotry, inconvenient facts are suppressed or distorted as acts of “inclusiveness”, and anyone who dares challenge this world view with actual evidence is punished.

    

 

Republicans and Health Care

 

The Republicans have had six years to develop an alternative to Obamacare. Before that, they had three previous presidential administrations (Reagan and both Bushes) since the conservative movement rose to power with which to develop a response to the decades-long American health-care crisis. They are no closer today than they were when the health-care debate began in Congress.

And the reason for the absence of a specific, partywide alternative is that Republicans don’t just have different goals than Democrats, they have different kinds of goals.

Obamacare was designed to reduce the physical, mental, and financial stress that comes with lacking access to health insurance, while also slowing down the long-term growth of medical costs. Those are specific, measurable goals. Republicans have insisted the law would fail to meet its objectives — costs would run higher than expected, workers would be forced into part-time jobs, the law would even fail to reduce the number of uninsured. None of those things have happened.

All the Republican predictions have failed. Just this week, the Congressional Budget Office once again revised down its cost projections for the law, which is now projected to cost 20 percent less than originally estimated. Given conservative certainty that the opposite would occur, you might expect some revision. But conservatives have not abandoned or even reduced their fervent opposition to Obamacare. This is because the right’s specific, measurable predictions about the law are subordinate to deeper, philosophical beliefs. They oppose the law’s methods (more taxes, spending, and regulation) on principle. They believe those methods will fail to achieve their stated goals, but even if they succeed, they oppose them anyway.

Republicans cannot design a partywide health-care alternative because they cannot reconcile the specific things most Americans want from the health-care system (access to affordable insurance, protection from discrimination against preexisting conditions) with their ideological commitments.

This is not to say that the Democratic way of thinking about health care is right and the Republican way is wrong, though I personally agree with the former and not the latter. You don’t need to care that health-care reform “works” if you oppose it philosophically, and philosophical preferences are neither right nor wrong.

The point, rather, is that the two parties are not mirror images of each other. They are asymmetrical. One is organized around practical objectives, the other ideological ones. Practical objectives lend themselves more easily to compromise. They can be measured in empirical terms. Ideological objectives defy compromise and practical assessment.

The asymmetry between the two parties is a longtime fascination of mine, and in recent years has drawn the attention of political scientists like Jacob Hacker and Paul Pierson, and analysts like Norman Ornstein and Thomas Mann. Political scientists Matt Grossmann and David Hopkins have written a new paper exploring the asymmetry in beliefs between the two parties. They find a great deal of data to support the case that the Republican Party is simply more ideologically oriented than the Democrats.

1. It has been a truism of American politics for decades that American voters are symbolically conservative and operationally liberal: They favor small government in the abstract but they like most actual programs. As Grossmann and Hopkins point out, this allows both parties to see themselves as representing the majority. The rule also explains the Obamacare debate: Republicans endlessly tout polling showing the law’s unpopularity, while Democrats respond that the law’s actual provisions are popular.

2. “For more than six decades,” they report, “the American National Election Studies (ANES) have asked a sample of Americans what they like and dislike about each major party and presidential candidate in every presidential election, recording their open-ended responses.” Republicans overwhelmingly describe their preferences in ideological terms, Democrats in practical terms. This is true at the level of general voters.

3. It also holds true among activists, among whom the Republican base is far more committed to conservative principle than the Democratic base is to liberal principle.

4. Democratic voters are also more committed to specific government programs than to the general idea of government. Republicans believe in cutting government in general but don’t want to cut many actual programs.

Republican consistency on broad ideological predispositions did not extend to specific policy questions, reflecting the enduring gap between symbolic and operational conservatism. For 81 percent of non-activist Democrats, the number of issue areas on which they supported an increase in spending exceeded the number of areas on which they supported spending cuts, but only 38 percent of non-activist Republicans identified more items to cut than items for which they favored spending growth. Democrats exhibit much stronger support for particular forms of government activity than for activist government as such, while Republicans are more united around broad principles of limited government than around the need for reductions in specific programs.

This, of course, explains why Republicans struggle so consistently to translate their principles into a concrete program: Even their own voters don’t support many real program cuts. This is why conservative Republican presidents have massively increased the deficit — they set tax levels to fund the government they would like in the abstract, while funding the level of government they want in the specific.

5. Liberal opinion columnists are more likely than conservatives to write about specific policy proposals, while conservative opinion columnists are more likely to write about abstract ideology.

6. The Democratic platform is also more likely to invoke specific policies, while the Republican platform is more likely to discuss ideological principle.

7. Republican voters want their party to stick to principle rather than compromise. Democratic voters want the reverse.

For Republicans to actually unify around a health-care plan would require them to oppose every tendency that defines their party. They would have to side with the specific programmatic desires favored by the voters rather than its abstract ideological goals. They would need to compromise with the opposing party rather than stick to principle. Republicans won’t have a real health-care plan until they become a different kind of party.

    

 

Talking About Race

 

BRITAIN is in danger of silencing any debate on race issues by turning on those who dare to ask the questions, Trevor Phillips, the former head of the nation’s equality commission, warns today.

He singles out the case of Benedict Cumberbatch, the star of The Imitation Game, who in January was criticised for using the word “coloured” while demanding more roles for black actors.

Writing in News Review, Phillips, 61, says: “There is a real cost to this type of intimidation. The upshot is that the next time a white person wants to speak up for minorities, I would guess they’ll hesitate and ask themselves: ‘Will I make things worse by speaking out?’ ”

A decade ago Phillips caused controversy by expressing concern that Britain was “sleepwalking . . . to segregation” and now warns that “the perverse and unintended consequences of our drive to instil respect for diversity” are that politicians and the media “have become terrified of discussing racial or religious differences”. “The instinct to avoid offence is understandable,” he says. “But its outcomes have been shown in practice to be disastrous.”

The Sunday Times revelations that schools had been taken over by fundamentalists — the Trojan Horse scandal — “show children’s education is at risk of being sacrificed on the altar of religious orthodoxy”, he argues.

Phillips says if African Caribbeans are statistically more likely to commit some kinds of crime — such as killing each other — then it would make sense to find out why, so we can stop it happening.

He adds: “Preventing anyone from saying what’s on their minds won’t ever remove it from their hearts. People need to feel free to say what they want to without the fear of being accused of racism or bigotry. That means we’re all going to have to become more ready to offend each other.”

    

 

Hispanics In America

 

A SATIRICAL film in 2004, called “A Day Without a Mexican”, imagined Californians running scared after their cooks, nannies and gardeners had vanished. Set it in today’s America and it would be a more sobering drama. If 57m Hispanics were to disappear, public-school playgrounds would lose one child in four and employers from Alaska to Alabama would struggle to stay open. Imagine the scene by mid-century, when the Latino population is set to have doubled again.

Listen to some, and foreign scroungers threaten America, a soft-hearted country with a wide-open border. For almost two centuries after America was founded, more than 80% of its citizens were whites of European descent. Today, non-Hispanic whites have dropped below two-thirds of the population. They are on course to become a minority by 2044. At a recent gathering of Republicans with presidential ambitions, a former governor of Arkansas, Mike Huckabee, growled about “illegal people” rushing in “because they’ve heard that there is a bowl of food just across the border.”

Politicians are right that a demographic revolution is under way. But, as our special report this week shows, their panic about immigration and the national interest is misguided. America needs its Latinos. To prosper, it must not exclude them, but help them realise their potential.

A Hispanic attack

Those who whip up border fever are wrong on the facts. The southern frontier has never been harder to cross. Recent Hispanic population growth has mostly been driven by births, not fresh immigration. Even if the borders could somehow be sealed and every unauthorised migrant deported—which would be cruel and impossible—some 48m legally resident Hispanics would remain. Latino growth will not be stopped.

They are also wrong about demography. From Europe to north-east Asia, the 21st century risks being an age of old people, slow growth and sour, timid politics. Swelling armies of the elderly will fight to defend their pensions and other public services. Between now and mid-century, Germany’s median age will rise to 52. China’s population growth will flatten and then fall; its labour force is already shrinking. Not America’s. By 2050 its median age will be a sprightly 41 and its population will still be growing. Latinos will be a big part of that story.

The nativists fret that Hispanics will be a race apart, tied to homelands racked by corruption and crime. Early migrants from Europe, they note, built new lives an ocean away from their ancestral lands. Hispanics, by contrast, can maintain ties with relatives who stayed behind, thanks to cheap flights and Skype. This fear is wildly exaggerated. People can love two countries, just as loving your spouse does not mean you love your mother less. Nativists are distracting America from the real task, which is to make Hispanic integration a success.

An unprecedented test of social mobility looms. Today’s Latinos are poorer and worse-educated than the American average. As a vast (and mostly white) cohort of middle-class baby-boomers retires, America must educate the young Hispanics who will replace them, or the whole country will suffer. Some states understand what is at stake—and are passing laws to make college cheaper for children with good grades but the wrong legal status. Others are going backwards. Texas Republicans are debating whether to make college costlier for undocumented students—a baffling move in a state where, by 2050, Hispanic workers will outnumber whites three to one.

Politicians of both left and right will have to change their tune. For a start, they will have to stop treating Hispanics as almost a single-issue group—as either villains or victims of the immigration system. Almost 1m Latinos reach voting age each year. With every election, Hispanics will want to hear less about immigration and more about school reform, affordable health care and policies to help them get into the middle class.

Republicans have the most work ahead. The party has done a wretched job of making Latinos feel welcome, and suffered for it at the polls. Just 27% of Hispanics voted for Mitt Romney, the Republican presidential candidate in 2012, after he suggested that life should be made so miserable for migrants without legal papers that they “self-deport”. Yet Democrats have no reason to be smug. At present, most Latinos do not vote at all; as they grow more prosperous their votes will be up for grabs. Jeb Bush, a putative White House contender in 2016 who is married to a Latina, has wooed Latinos by saying that illegal migration is often an act of family “love”.

Since their votes cannot be taken for granted, Hispanics will become ever more influential. This is especially true of those who leave the Catholic church to become Protestants. This subset already outnumbers Jewish-Americans, and is that rare thing: a true swing electorate, backing Bill Clinton, George W. Bush and Barack Obama. America should welcome the competition: its sclerotic democracy needs swing voters.

Chilies in the mix

Anxious Americans should have more faith in their system. High-school-graduation rates are rising among Latinos; teenage pregnancy is falling. Inter-marriage between Hispanics and others is rising. The children and grandchildren of migrants are learning English—just like immigrants of the past. They are bringing something new, too. A distinctive, bilingual Hispanic American culture is blurring old distinctions between Mexican-Americans and other Latinos. That culture’s swaggering soft power can be felt across the Spanish-speaking world: just ask artists such as Romeo Santos, a bachata singer of Dominican-Puerto Rican stock, raised in the Bronx. His name is unknown to many Anglos, but he has sold out Yankee Stadium in New York (twice) and 50,000-seat stadiums from Mexico City to Buenos Aires. One of his hits, “Propuesta Indecente”, has been viewed on YouTube more than 600m times.

America has been granted an extraordinary stroke of luck: a big dose of youth and energy, just as its global competitors are greying. Making the most of this chance will take pragmatism and goodwill. Get it right, and a diverse, outward-facing America will have much to teach the world.

    

 

Jeremy Clarkson

 

The divide is between what might be called enlightened metropolitan opinion (EMO), aka the chattering classes, aka the forces of political correctness, and popular opinion (PO), aka the silent majority, aka the great unwashed.

As far as PO is concerned, Clarkson is an entertainer with an instinctive take on the issues of the day that it can relate to. Yes, he might go too far every now and again, especially when he's had a drink or six, but don't we all? Besides, it's not intended to be taken seriously.

But EMO loathes Clarkson because, in its view, he breathes fresh life into dated and deplorable attitudes. He is, in his own words, a "dinosaur", but one who has thrived rather than become extinct.

    

 

Israel and Palestine

 

This makes null and void his speech in June 2009 at Bar Ilan University, where Netanyahu had laid out a different “vision of peace,” saying: “In this small land of ours, two peoples live freely, side by side, in amity and mutual respect. Each will have its own flag, its own national anthem, its own government. Neither will threaten the security or survival of the other.” Provided the Palestinian state recognizes Israel’s Jewish character and accepts demilitarization, he added, “We will be ready in a future peace agreement to reach a solution where a demilitarized Palestinian state exists alongside the Jewish state.”

Now, if there are not going to be two states for two peoples in the area between the Jordan River and Mediterranean, then there is going to be only one state — and that one state will either be a Jewish democracy that systematically denies the voting rights of about one-third of its people or it will be a democracy that systematically erodes the Jewish character of Israel.

Just look at the numbers: In 2014, the estimated Palestinian Arab population of the West Bank was 2.72 million, with roughly 40 percent under the age of 14. There are already 1.7 million Israeli Arab citizens — who, in the latest election, assembled all their parties onto one list and came in third. Together, the West Bankers and Israeli Arabs constitute 4.4 million people. There are 6.2 million Israeli Jews. According to statistics from the Jewish Virtual Library, the Jewish population of Israel grew by 1.7 percent over the past year, and the Arab population grew by 2.2 percent.
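Those figures make the arithmetic easy to check. Below is a minimal back-of-envelope sketch in Python, assuming, purely for illustration, that the single-year growth rates quoted above hold constant; real demographic projections model fertility, mortality and migration separately, so the exact crossover year should not be taken literally.

```python
# Back-of-envelope projection of the figures quoted above.
# Assumption (illustrative only): the one-year growth rates cited
# (1.7% Jewish, 2.2% Arab) hold constant indefinitely.

jews = 6.2e6   # Israeli Jews, 2014 figure cited above
arabs = 4.4e6  # West Bank Palestinians plus Israeli Arab citizens

year = 2014
while arabs < jews:
    jews *= 1.017
    arabs *= 1.022
    year += 1

print(f"Under constant rates, the populations cross around {year}")
```

Under those frozen rates the crossover lands roughly seven decades out; the column's point is the direction of travel rather than the date, since the gap narrows every year.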

If there is only one state, Israel cannot be Jewish and permit West Bank Palestinians to exercise any voting rights alongside Israeli Arabs. But if Israel is one state and wants to be democratic, how does it continue depriving West Bankers of the vote, when you can be sure they will make it their No. 1 demand?

    

 

Singapore

 

From my office window I can see the Shard, Europe’s tallest building, sweeping up into the stratosphere. Near its foot an ugly wooden hut has appeared, shielding two escalators that usually take commuters down to the Underground but are broken. The hut is ominous. In 2013 engineers took a year to fix three sets of escalators at Oxford Circus station. Last summer you couldn’t change to the Bakerloo line at Paddington for two months while the escalators were being repaired.

Londoners are used to going the long way round. But I can’t help thinking that the Victorians would never have put up with this. How are we going to build George Osborne’s northern powerhouse if we can’t fix a moving staircase?

A friend who lives in Singapore tells me there was outrage last year when a metro escalator was out of order for “a whole five days”. Five days? Here it probably takes longer just to do the health and safety assessment.

In Singapore it’s not just the escalators that work, it’s everything. Fifty years ago the city state was a swamp with no natural resources. Its first prime minister, Lee Kuan Yew, transformed it into an economic powerhouse with a higher standard of living and better healthcare than in many western countries and school results that regularly top international rankings. In the process he upended the notion that political freedom is necessary for prosperity. Western democracies are now in a global race against autocratic capitalism.

This weekend Lee lies critically ill in hospital. Whatever you think about the autocratic regime that he ran and then passed to his son — a culture memorably described as “Disneyland with the death penalty” (the novelist William Gibson’s phrase, quoted by Adrian Wooldridge and John Micklethwait in their book The Fourth Revolution) — we could learn a lot from Singapore.

Lee was the ultimate pragmatist. “Not invented here”, which sums up a certain type of mulish British insularity, would have meant nothing to him. He was interested in what worked and he would borrow ideas from anywhere in the world.

His ambitions were staggering. To achieve them, Lee built one of the most efficient government machines in the world. It is powered by one of the smallest, most highly paid and ruthlessly meritocratic civil services. Forget promotion on seniority. Forget nepotism. The Singaporean state hires the best people, pays by performance and sacks underperformers.

It also consumes only 17% of GDP, which puts into perspective last week’s wrangling in Britain about whether the next government will spend a bit less or a bit more than 36% of GDP. In the general media furore, hardly anyone has asked whether that colossal amount of money will be spent effectively. Yet that is what matters most.

The speed of the Singaporean machine is stunning. Biosciences is now one of the world’s fastest-growing industries, but 15 years ago Singapore did no biomedical research. Today it accounts for more than 6% of the country’s GDP. Glaxo, Roche and Novartis, the pharmaceutical companies, sit in a vast hub of private and public laboratories called Biopolis. They are attracted by tax breaks and by the fact that Singapore boasts the world’s shortest approval time for starting the clinical trials that are crucial to the development of medicines.

What was Britain, with all its brilliant PhDs and Nobel prizes, doing during that time? Filling in forms. The number of clinical trials held here more than halved between 2002 and 2007. Trials had to be approved by seven different regulators. Instead of supporting scientific ideas, bureaucracy was killing them. Since 2011 the coalition has begun to turn the tide but it has been a colossal struggle waged against the machine by ministers, advisers and the chief medical officer.

Of course it is easier to take long-term decisions in an autocracy than in a democracy, where raucous groups lobby to be heard. But Victorian Britain was a democracy too. Joseph Bazalgette and Isambard Kingdom Brunel did not face the inertia of vested interests. Ministers who visit Singapore to learn about its teaching techniques, or go to India to see a world-class hospital, cannot take home what they observe: vested interests say, “Britain is different. We couldn’t do that here.” Sometimes it feels as though we are vanishing down the plughole of our own insularity.

Although Lee was clearly an obsessive — he used to turn up the air-conditioning to get officials to work harder — he was also willing to change course. In 1998 Singapore’s prisons were plagued, like those in the West, by overcrowding, high reoffending rates and a recruitment crisis. The government decided to make prisons “schools for life”. Guards became teachers; drug addicts and those with mental health problems were given treatment; prisoners were helped to get jobs after their release. Within 10 years reoffending rates fell from 44% to 27%. Staff morale boomed.

Similar proposals were made in Britain around the same time but when David Blunkett, then the home secretary, tried to license an experiment, the Treasury obstructed him. Since then security concerns have trumped every serious suggestion for rehabilitation. As a result our prisons continue to suffer from the same old problems, despite there being a perfectly good model of how it could be different on the other side of the world.

Lee was feted by small-state conservatives, notably Henry Kissinger, but Sir Michael Barber, a former adviser to Tony Blair, points out that effective government is vital whether you are a big state interventionist or a small state radical. In his new book How To Run a Government, Barber describes the great thinkers Amartya Sen and Jagdish Bhagwati passionately disagreeing on how much of India’s future can be left to the market. The point both men miss, says Barber, is that India’s lack of effective government will undermine either vision.

Singapore is at a turning point. Fifty years after independence, the educated middle class is clamouring for more freedom. The government is realising that such a tightly controlled society does not produce graduates who are creative and that creativity cannot be imposed. Creativity could turn out to be the West’s strongest card. The precious freedoms of our messy democracies translate into scientific freedoms, into art and into ideas that give us an economic edge. We won’t capitalise on those, however, if we are hamstrung by bureaucracy.

Ten years ago a few American academics were still arguing that the Asian model was too rigid to adapt well to change and that flexible, laissez-faire capitalism was more likely to win the global race. But we in the West increasingly look like the rigid ones. Singapore topped the 2014 Global Infrastructure Investment Index. Britain lagged behind in 10th place.

Lee Kuan Yew was happy to poach good ideas and we should do the same, starting with his Rolls-Royce civil service. We won’t win the global race if we can’t fix an escalator.

    

 

Fighting "Religious Freedom" Movement

 

The anti-gay backlash backlash is here.

In the wake of advances for LGBT equality, conservatives across the country have rallied to pass “religious freedom” bills that would allow people and businesses to discriminate if they have a religious justification for doing so.

The poster children of this campaign are religious wedding photographers and cake bakers. But the real impact is far more serious: huge corporations like Hobby Lobby denying benefits, services, and recognition to same-sex families; Catholic hospitals barring longtime same-sex spouses from visiting one another; huge university systems firing janitors, basketball coaches, and secretaries because they are gay.

And then there are the unintended consequences: wife- and child-abusers offering religion as a defense; Jews being turned away from hotels; and more absurd consequences like Satanists advertising their religion in state capitol buildings.

At first, these bills flew below the radar. When I first covered this issue two years ago, it was still somewhat arcane, lost in a haze of legalese. “Religious liberty” is a good thing, right? No one knew how to pronounce “RFRA.” (Riff-ra, if you please.)

But then came the Hobby Lobby decision, and the Arizona “Turn the Gays Away” fiasco, and increasing attention to “Religious Freedom Restoration Acts” (RFRAs) in Mississippi (passed), Georgia (going down to the wire), and Indiana (same).

Now, signs of a backlash against the backlash are cropping up.

The heavily funded, far-right-written-and-coordinated cascade of RFRAs—what I believe I christened RFRA Madness—has run into serious opposition.

In West Virginia, a bill identical to one that passed in Arkansas, which would forbid any municipality from passing anti-discrimination laws, died in committee.

In Michigan, a “Religious Freedom Restoration Act” failed late last year, though it may have another shot this year.

Perhaps most intriguing is Oklahoma, where yet another “Religious Freedom Act” died in the state House. It’s not known exactly why Republicans shelved the bill, but it may be due to one of the most ingenious counter-efforts in the country, led by Democrat Emily Virgin.

State Representative Virgin’s idea? Require any business that won’t provide services to LGBT people to say so publicly. In the language of the amendment, “Any person not wanting to participate in any of the activities set forth in subsection A of this section based on sexual orientation, gender identity or race of either party to the marriage shall post notice of such refusal in a manner clearly visible to the public in all places of business, including websites.”

Said Virgin on Facebook, “This would save same-sex couples the trouble and embarrassment of going into that business just to be turned away.”

This, if I may say so, is brilliant. (The idea came in consultations between Virgin, the ACLU of Oklahoma, and the state advocacy group Freedom Oklahoma.) On the surface, yes, Virgin’s rationale makes sense. But we all know what would really happen if Chick-Fil-A or Hobby Lobby posted such a sign: outrage. Virgin’s amendment would act as a kind of public shaming for businesses who want to turn gays, blacks, Jews, or anyone else away.

To be sure, many businesses might display their bigotry as a badge of honor—as Chick-Fil-A did. They might also benefit from doing so, at least in the short term.

But those Instagram pictures are going to be around for a long time. And in 20 years, they’re going to look a lot like “Whites Only” signs.

To many people, they already do. The fact is, while public opinion is still divided on same-sex marriage (about 55-45 in favor), it’s not on nondiscrimination. Over 75 percent of Americans believe that you shouldn’t be fired for being gay, and while they might sympathize with an individual baker or florist, they’re not going to be keen on national chains saying “No Gays Allowed.”

Again, it’s not known if Virgin’s proposed amendment killed the “Religious Freedom” bill in Oklahoma. But something did.

And then there are our friends at the Satanic Temple, eager to take these religious liberty provisions to their logical conclusions. In Michigan, they’ve proposed an amendment much like Virgin’s, and even printed some handy fill-in signs for bigots who want to get a head start. There and elsewhere (including, best of all, Florida), they’ve successfully installed Satanic “holiday displays,” making use of Christmas-display laws that shrink the scope of the First Amendment.

The Satanic Temple (not one organization, but many) is doing street theater, of course. Most likely, they are preaching to the already-Satanically converted. But they have also succeeded at highlighting the unintended consequences of the new “religious liberty.”

More seriously, we are now beginning to tease fact from fiction when it comes to “religious liberty.” Individual florists, bakers, photographers, innkeepers—these are sympathetic poster children.

But how about multibillion-dollar businesses, hospitals, malls, and schools? How about the doctor who refused to treat the child of same-sex parents? How about adoption agencies that receive federal money, but still won’t place children with legally married same-sex couples? Or the football coach who gets fired when someone hears him mention his husband?

These are the real faces of so-called “religious freedom.” And as their stories become better known, the scare quotes around that term get thicker and thicker.

It’s telling that Oklahoma dropped its “Religious Freedom” bill when a legislator demanded that discriminators make themselves known to the public.

Like repressed gays, prejudice has to hide in the closet.

    

 

American Inequality

 

In a candid conversation with Frank Rich last fall, Chris Rock said, "Oh, people don’t even know. If poor people knew how rich rich people are, there would be riots in the streets." The findings of three studies, published over the last several years in Perspectives on Psychological Science, suggest that Rock is right. We have no idea how unequal our society has become.

In their 2011 paper, Michael Norton and Dan Ariely analyzed beliefs about wealth inequality. They asked more than 5,000 Americans to guess the percentage of wealth (i.e., savings, property, stocks, etc., minus debts) owned by each fifth of the population. Next, they asked people to construct their ideal distributions. Imagine a pizza of all the wealth in the United States. What percentage of that pizza belongs to the top 20% of Americans? How big of a slice does the bottom 40% have? In an ideal world, how much should they have?

The average American believes that the richest fifth own 59% of the wealth and that the bottom 40% own 9%. The reality is strikingly different. The top 20% of US households own more than 84% of the wealth, and the bottom 40% combine for a paltry 0.3%. The Walton family, for example, has more wealth than 42% of American families combined.

We don’t want to live like this. In our ideal distribution, the top quintile owns 32% and the bottom two quintiles own 25%. As the journalist Chrystia Freeland put it, “Americans actually live in Russia, although they think they live in Sweden. And they would like to live on a kibbutz.” Norton and Ariely found a surprising level of consensus: everyone — even Republicans and the wealthy—wants a more equal distribution of wealth than the status quo.
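Lining up the three distributions side by side makes the size of the misperception concrete. Here is a minimal sketch using only the top-20% and bottom-40% shares reported above; nothing in it goes beyond those quoted figures.

```python
# The three wealth distributions quoted above, lined up for comparison.
# Shares are (top 20%, bottom 40%) as reported in the studies cited.

shares = {
    "perceived": (0.59, 0.09),
    "ideal":     (0.32, 0.25),
    "actual":    (0.84, 0.003),
}

for label, (top20, bottom40) in shares.items():
    print(f"{label:>9}: top fifth {top20:.1%}, bottom two-fifths {bottom40:.1%}")

# The actual top-quintile share runs 25 points above the perceived one
# and 52 points above the ideal -- the gap Chris Rock was pointing at.
```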

This all might ring a bell. An infographic video of the study went viral and has been watched more than 16 million times.

In a study published last year, Norton and Sorapop Kiatpongsan used a similar approach to assess perceptions of income inequality. They asked about 55,000 people from 40 countries to estimate how much corporate CEOs and unskilled workers earned. Then they asked people how much CEOs and workers should earn. The median American estimated that the CEO-to-worker pay ratio was 30-to-1, and that ideally, it’d be 7-to-1. The reality? 354-to-1. Fifty years ago, it was 20-to-1. Again, the patterns were the same for all subgroups, regardless of age, education, political affiliation, or opinion on inequality and pay. “In sum,” the researchers concluded, “respondents underestimate actual pay gaps, and their ideal pay gaps are even further from reality than those underestimates.”

These two studies imply that our apathy about inequality is due to rose-colored misperceptions. To be fair, though, we do know that something is up. After all, President Obama called economic inequality “the defining challenge of our time.” But while Americans acknowledge that the gap between the rich and poor has widened over the last decade, very few see it as a serious issue. Just five percent of Americans think that inequality is a major problem in need of attention. While the Occupy movement may have a tangible legacy, Americans aren’t rioting in the streets.

One likely reason for this is identified by a third study, published earlier this year by Shai Davidai and Thomas Gilovich, which suggests that our indifference lies in a distinctly American cultural optimism. At the core of the American Dream is the belief that anyone who works hard can move up economically regardless of his or her social circumstances. Davidai and Gilovich wanted to know whether people had a realistic sense of economic mobility.

The researchers found that Americans overestimate the amount of upward social mobility that exists in society. They asked some 3,000 people to guess the chance that someone born to a family in the poorest 20% ends up as an adult in the richer quintiles. Sure enough, people think that moving up is significantly more likely than it is in reality. Interestingly, poorer and politically conservative participants thought there was more mobility than richer and more liberal participants did.

According to Pew Research, most Americans believe the economic system unfairly favors the wealthy, but 60% believe that most people can make it if they’re willing to work hard. Senator Marco Rubio says that America has “never been a nation of haves and have-nots. We are a nation of haves and soon-to-haves, of people who have made it and people who will make it.” Sure, we love a good rags-to-riches story, but perhaps we tolerate such inequality because we think these stories happen more than they actually do.

We may not want to believe it, but the United States is now the most unequal of all Western nations. To make matters worse, America has considerably less social mobility than Canada and Europe.

As the sociologists Stephen McNamee and Robert Miller Jr. point out in their book, “The Meritocracy Myth,” Americans widely believe that success is due to individual talent and effort. Ironically, when the term “meritocracy” was first used by Michael Young (in his 1958 book “The Rise of the Meritocracy”) it was meant to criticize a society ruled by a talented elite. “It is good sense to appoint individual people to jobs on their merit,” wrote Young in a 2001 essay for the Guardian. “It is the opposite when those who are judged to have merit of a particular kind harden into a new social class without room in it for others.” The creator of the phrase wishes we would stop using it because it underwrites the myth that those who have money and power must deserve it (and the more sinister belief that the less fortunate don’t deserve better).

By overemphasizing individual mobility, we ignore important social determinants of success like family inheritance, social connections, and structural discrimination. The three papers in Perspectives on Psychological Science indicate not only that economic inequality is much worse than we think, but also that social mobility is less than we’d imagine. Our unique brand of optimism prevents us from making any real changes.

George Carlin joked that, “the reason they call it the American Dream is because you have to be asleep to believe it.” How do we wake up?

    

 

Rand Paul

 

Did Barry Goldwater ever make it to the White House? The conventional answer is no; Goldwater was the right-wing Republican ideologue who crashed and burned in 1964 — so spectacularly that Lyndon Johnson, his Democratic rival, won by the biggest share of the popular vote in presidential history.

The Goldwater campaign slogan — “In your heart you know he’s right” — became, courtesy of the roughhouse Johnson team, “In your guts you know he’s nuts”.

As The New York Times gloated once the votes were in: “Barry Goldwater not only lost the presidential election yesterday but the conservative cause as well.”

Not so fast. Goldwater, as every student of American politics knows, actually won. Not in 1964. Not in 1968, when his Republican rival Richard Nixon finally made it. Not with Gerald Ford, hell no! But in 1980, 16 years after Goldwater put conservatism on the agenda. That was when Ronald Reagan took Goldwater’s big ideas over the finishing line.

I wonder if a similar journey began this week in Kentucky.

Rand Paul — the Kentucky senator who announced that he was running for president — is the most interesting Republican candidate for decades. He is a serious libertarian. Where his father Ron Paul — a serial non-serious presidential candidate — holds views that are well outside polite American political discourse (he believes, for instance, that the federal government knew about the 9/11 attacks but did nothing to stop them) Mr Paul junior has managed to plough a new furrow without leaving the field.

So it’s goodbye to the earnest young men in bow-ties who used to surround Ron Paul and hello to a much broader coalition of perfectly normal people who happen to be weary of war, happy to let foreigners decide how they want to be governed, worried about personal freedoms being curtailed by the security state, happy to let folks do what they want in their own bedrooms, and worried about the cost and the good sense of imprisoning huge numbers of nonviolent criminals. This has the makings of a big tent. It is a tent that holds many young Americans, many independents, and many Democrats.

In a nation gripped by anger and outrage at the police killing of yet another unarmed black man, it is worth noting that Rand Paul, far more than any other would-be Republican candidate, has thought about concrete measures to improve the way poor black Americans are treated by the judicial system. He is clear on this: “Anyone who thinks that race does not still, even if inadvertently, skew the application of criminal justice in this country is just not paying close enough attention.”

The changes he has proposed include the ending of mandatory minimum-sentencing laws, which used to thrill the Republican law-and-order brigade but, he believes, are a disaster. His co-sponsor is Cory Booker, a Democrat and one of only two African-Americans in the Senate.

According to an opinion poll this week, Mr Paul would lose a presidential race against Hillary Clinton by a margin of 4 percentage points, which is pretty respectable given her fame. He also has a proper path to the nomination; he can win the “money primary” by tapping Silicon Valley, where ending government surveillance is a commercial aim. And he could sweep the early primaries too — particularly in New Hampshire where the state motto is “Live Free or Die”, and folks regard seat belts and income tax as creeping socialism.

And yet. Already — in a couple of prickly encounters with reporters — Mr Paul has come up against the simple fact that libertarianism can feel uncomfortably adolescent.

For instance, does Rand Paul seriously want to cut aid to all foreign nations? Even Israel? He used to think so. And does he really believe that private businesses should be allowed to do what they want? Can they refuse to serve black people? Again: he used to think so.

He is already coming under attack from his own side. Mark Salter, a big cheese in John McCain’s 2008 presidential bid, said that Mr Paul’s foreign policy was informed by “crackpot theories” and so ill-conceived that, were he the candidate in 2016, Republicans should vote instead for Hillary. So that is a no, thank you, from the party establishment.

And perhaps a big win for Hillary if the two were pitted against each other and, in a dangerous world, the risk of Rand were to be properly assessed. Agreed then: a defeat of Goldwater proportions could not be discounted.

But 16 years from now? That could be a very different story.

    

 

Govt Welfare

 

Poverty looks pretty great if you're not living in it. The government gives you free money to spend on steak and lobster, on tattoos and spa days, on — why not? — cruise vacations and psychic visits.

Enough serious-minded people seem to think this is what the poor actually buy with their meager aid that we've now seen a raft of bills and proposed state laws to nudge them away from so much excess. Missouri wants to curtail what the poor eat with their food stamps (evidence of the problem from one state legislator: "I have seen people purchasing filet mignons"). Kansas wants to block welfare recipients from spending government money at strip clubs (in legalese: any "sexually oriented business or any retail establishment which provides adult-oriented entertainment in which performers disrobe or perform in an unclothed state for entertainment").

Then there are the states that want to drug-test welfare recipients — the implication being that we worry the poor will convert their benefits directly into drugs.

No steak, no seafood, no strip clubs: There’s a logical gap in the recent laws that bash the poor who receive government welfare and food stamps.

Sometimes these laws are cast as protection for the poor, ensuring that aid is steered in ways that will help them the most. Other times they're framed as protection for the taxpayer, who shouldn't be asked to help people who will squander the money on vices anyway.

But the logic behind the proposals is problematic in at least three really big ways.

The first is economic: There's virtually no evidence that the poor actually spend their money this way. The idea that they do defies Maslow's hierarchy — the notion that we all need shelter and food before we go in search of foot massages. In fact, the poor are much more savvy about how they spend their money because they have less of it (quick quiz: do you know exactly how much you last spent on a gallon of milk? or a bag of diapers?). By definition, a much higher share of their income — often more than half of it — is eaten up by basic housing costs than is true for the better-off, leaving them less money for luxuries anyway. And contrary to the logic of drug-testing laws, the poor are no more likely to use drugs than the population at large.

The second issue with these laws is a moral one: We rarely make similar demands of other recipients of government aid. We don't drug-test farmers who receive agriculture subsidies (lest they think about plowing while high!). We don't require Pell Grant recipients to prove that they're pursuing a degree that will get them a real job one day (sorry, no poetry!). We don't require wealthy families who cash in on the home mortgage interest deduction to prove that they don't use their homes as brothels (because surely someone out there does this). The strings that we attach to government aid are attached uniquely for the poor.

That leads us to the third problem, which is a political one. Many, many Americans who do receive these other kinds of government benefits — farm subsidies, student loans, mortgage tax breaks — don't recognize that, like the poor, they get something from government, too. That's because government gives money directly to poor people, but it gives benefits to the rest of us in ways that allow us to tell ourselves that we get nothing from government at all.

Political scientist Suzanne Mettler has called this effect the "submerged state." Food stamps and welfare checks are incredibly visible government benefits. The mortgage interest deduction, Medicare benefits and tuition tax breaks are not — they're submerged. They come to us in roundabout ways, through smaller tax bills (or larger refunds), through payments we don't have to make to doctors (thanks to Medicare), or in tuition we don't have to pay to universities (because the G.I. Bill does that for us).

Mettler's research has shown that a remarkable number of people who don't think they get anything from government in fact benefit from one of these programs. This explains why we get election-season soundbites from confused voters who want policymakers to "keep your government hands off my Medicare!" This is also what enables politicians to gin up indignation among small-government supporters who don't realize they rely on government themselves.

Mettler raises a lot of concerns about what the submerged state means for how we understand the role of government. But one result of this reality is that we have even less tolerance for programs that help the poor: We begrudge them their housing vouchers, for instance, even though government spends about four times as much subsidizing housing for upper-income homeowners.

That's a long-winded way of saying that these proposed laws — which insist that government beneficiaries prove themselves worthy, that they spend government money how the government wants them to, that they waive their privacy and personal freedom to get it — are also simply a reflection of a basic double-standard.

    

 

US Politics Trainset For The Uber-Rich

 

There is nothing normal about the money side of the Clinton campaign or indeed those of her Republican challengers. Normal has been left far behind: America is on course for a new era of mega-money politics which will put all previous campaigns in the shade and, according to some Americans, undermine the process to an extent that threatens democracy.

When the Republican war chest is added in, the amount spent on this campaign could hit $5 billion. A billion more than America spends in a year on overseas humanitarian aid. And what matters is that much of this money will not be raised by the candidates — subject to rules and regulations and limits — but will flow into shadowy organisations supporting them but not owned by them, the so-called super political action committees.

Back in 2010 these bodies were given the green light by the Supreme Court to raise as much money as they liked from whoever they wanted. The result is that a big-money political process went mega-big. In the 2008 presidential election, total spending, independent of the campaigns, was $143 million. In 2012 it had risen to more than a billion. This year it will make up most of that $5 billion figure.

A taste of the madness: super-PACs backing the Republican candidate Ted Cruz, a right-wing Texas senator who wants to abolish the Internal Revenue Service and frankly has no chance of becoming president, raised $31 million for him in the first eight days after he announced that he was running.

The veteran US political commentator Juan Williams called it “the most stunning political news of the year — beyond amazing”. And the interesting question: Why? What do the wealthy people who have contributed to the Cruz super-PAC think they are getting in exchange? Williams suggests, “a college debate champ on the political stage able to draw applause for damning government spending on food stamps and social security”.

They know he won’t be president but don’t care. American politics has become a playground for the überwealthy. They can, in effect, run candidates for fun. When Sheldon Adelson, a Las Vegas casino owner, put $20 million of his money into the 2012 candidacy of ex-congressman Newt Gingrich, folks thought it strange. Soon, perhaps, it will be normal. The über-rich have a new train set.

A train set that goes in circles. When the field is whittled down to two, they will both have “insane” amounts of money to spend, or be spent for them. As the election guru Nate Silver points out, “way beyond the law of diminishing returns”.

The people upset at this state of affairs are the merely super-wealthy. They used to support the process but never actually ran entire campaigns. A woman called Terry Neese told the Washington Post recently that she was once a big noise in Republican fundraising and pulled in a million dollars for George W Bush’s campaigns. But, the paper lamented, “This year, no potential White House contender has called — not even Bush’s brother, Jeb. As of early Wednesday, the only contacts she had received were emails from staffers for two other likely candidates; both went to her spam folder.”

Oh the ignominy. Emails from staffers! Every Iowan gets them. Even I get them. Mrs Neese is lumped in with us hoi polloi because the really big money no longer comes from millionaires or even the fabled “1 per cent”. The left-wing think tank Demos points out that the real money is with a group of around 16,000 Americans who make more than $10 million a year. This group — the 0.01 per cent — are the bosses now.

The heroine of the Washington Post story, the newly spurned Mrs Neese, finished her complaint thus: “Most of the people I talk to are kind of rolling their eyes and saying, ‘You know, we just don’t count any more’.”

Perhaps she should talk to the rest of the 99.99 per cent. Or go to Iowa and try to meet Hillary in a gas station. Counting for something, in modern America, is getting mighty hard to do.

    

 

 

 

You can’t be a smart candidate in a party that wants to be stupid.

Marco Rubio actually has a couple of interesting, unorthodox plans, but Republican voters will punish him if he tries to talk about them.

So now we have us some candidates, on the Republican side. Who’s the big kahuna? Jeb Bush? He keeps getting called front-runner, and I suppose he is, even though the polls sometimes say otherwise. Scott Walker? Certainly a player. Rand Paul? Pretty bad rollout, but he has his base. The youthful, advantageously ethnicized Marco Rubio? Some as-yet-unannounced entrant who can hop in and shake things up?

Each has a claim, sort of, but the 800-pound gorilla of this primary process is none of the above. It’s the same person it was in 2008, and again in 2012, when two quite plausible mainstream-conservative candidates had to haul themselves so far to the right that they ended up being unelectable. It’s the Republican primary voter.

To be more blunt about it: the aging, white, very conservative, revanchist, fearful voter for whom the primary season is not chiefly an exercise in choosing a credible nominee who might win in November, but a Parris Island-style ideological obstacle course on which each candidate must strain to outdo his competitors—the hate-on-immigrants wall climb, the gay-bashing rope climb, the death-to-the-moocher-class monkey bridge. This voter calls the shots, and after the candidates have run his gauntlet, it’s almost impossible for them to come out looking appealing to a majority of the general electorate.

You will recall the hash this voter made of 2012. He booed the mention of a United States soldier during a debate because the soldier happened to be gay. He booed contraception—mere birth control, which the vast majority of Republican women, like all women, use. He lustily cheered the death penalty. He tossed Rick Perry out on his ear in part because the Texas governor had the audacity to utter a few relatively humane words about children of undocumented immigrants. He created an atmosphere in which the candidates on one debate stage were terrified of the idea of supporting a single dollar in tax increases even if placed against an offsetting $10 in spending cuts.

You can’t be a smart candidate in a party that wants to be stupid.

He is a demanding fellow. And he is already asserting his will this time around. Why else did Bush endorse Indiana Governor Mike Pence’s religious freedom bill in an instant, only to see Pence himself walk the bill back three days later? Bet Jeb would like to have that one back. But he can’t. The primary voter, along of course with the conservative media from Limbaugh and Fox on down, won’t permit it.

Now, as it happens, some of these candidates come to us with a few serious and unorthodox ideas. We all know about Rand Paul and his ideas about sentencing reform and racial disparities. He deserved credit for them. He was a lot quicker on the draw on Ferguson than Hillary Clinton was. But how much do we think he’s going to be talking up this issue as the Iowa voting nears? Time might prove me wrong here, but Paul has already, ah, soul-searched his way to more standard right-wing positions on Israel and war, so there’s reason to think that while he might not do the same on prison issues, he’ll just quietly drop them.

More interesting in this regard is Rubio. I read his campaign book not long ago, along with five others, for a piece I wrote for The New York Review of Books. Rubio’s book was the best of the lot by far. It was for the most part actually about policy. He put forward a few perfectly good ideas in the book. For example, he favors “income-based repayment” on student loans, which would lower many students’ monthly student-loan bills. It’s a fine idea. The Obama administration is already doing it.

Beyond the pages of the book, Rubio has in the past couple of years staked out some positions that stood out at the time as not being standard fare from the GOP menu. He’d like to expand the Earned Income Tax Credit to more childless couples. Again, there are synergies here with the current occupant of the house Rubio wants to move into—the Obama administration is taking up this idea.

Now, there is to be sure another Rubio, one who’d feel right at home on Parris Island. He is apparently now the quasi-official blessed-be-the-warmakers candidate, with his reflexive hard lines on Iran and Cuba. Along with Senate colleague Mike Lee of Utah, he also has put forth a tax plan that would deplete the treasury by some $4 trillion over 10 years—for context, consider that George W. Bush’s first tax cut cost $1.35 trillion over a decade—in order that most of those dollars be placed in the hands where the Republicans’ God says they belong, i.e., the 1 percent of the people who already hold nearly half the country’s wealth.

I think it’s a safe bet that we’ll see the neocon Rubio and the supply-side Rubio out on the stump. But the Rubio who wants to make life better for indebted students and working-poor childless couples? Either we won’t see that Rubio at all, or we will see him and he’ll finish fourth in Iowa and New Hampshire and go home. You can’t be a smart candidate in a party that wants to be stupid.

Might I be wrong about the primary voter? Sure, I might. Maybe the fear of losing to Hillary Clinton and being shut out of the White House for 12 or 16 consecutive years will tame this beast. But the early signs suggest the opposite.

After all, how did Scott Walker bolt to the front of the pack? It wasn’t by talking about how to expand health care. It was by giving one speech, at an event hosted by one of Congress’ most fanatical reactionaries (Steve King of Iowa), bragging about how he crushed Wisconsin’s municipal unions. That’s how you get ahead in this GOP. I’d imagine Rubio and Paul and the rest of them took note.

    

 

Big Business and Social Conservatives

 

Louisiana Gov. Bobby Jindal’s op-ed in the New York Times marks the whimpering end of an unholy alliance. The op-ed itself was a ham-handed attempt to capture the 2016 evangelical vote before Sen. Ted Cruz does. But the very crudity of his piece revealed that the union at the heart of Movement Conservatism is ripping apart.

In his op-ed, Jindal undertook to explain to business leaders how Movement Conservatism works. Its political strategy, he lectured, “requires populist social conservatives to ally with the business community on economic matters and corporate titans to side with social conservatives on cultural matters.” The governor is right: Since the 1980s big business interests have managed to secure policies that have concentrated wealth at the very top of the economic ladder, and they have managed their coup only with the help of the votes of social conservatives.

But Jindal’s hyperbolic posturing as he warns any corporation “bullying” social conservatives into accepting same-sex marriage to “save your breath” reveals a touchstone moment: this grand alliance is over.

Its end has been a long time coming. The toxic amalgam of economic and social reactionaries that Jindal identified began to mix after the Second World War. Americans in that era rallied behind the New Deal consensus. Reactionary businessmen loathed business regulation and taxation, but had no luck convincing voters to turn against the policies most saw as important safeguards against another Great Depression. Then, in 1951, a wealthy young writer suggested that social issues might be the way to break popular support for the New Deal. William F. Buckley, Jr. advanced the idea that unfettered capitalism and Christianity should be considered fundamental American values that could not be questioned. According to him, anyone who called for an active government or a secular society was an anti-American collectivist in league with international communism.

Few Americans paid much attention to an argument that equated even Republican President Eisenhower’s wildly successful capitalist economy with communism. But desegregation gave Buckley’s Movement Conservatism the popular social issue it needed to turn Americans against an active government. The year after the Supreme Court handed down the 1954 Brown v. Board of Education decision outlawing segregation, Buckley launched his National Review, which quickly tied business regulation to unpopular desegregation of public schools. Buckley hired Virginia newspaper editor James Kilpatrick to assure readers that an active government that protected the rights of black Americans undermined American “freedom.”

According to the National Review, Eisenhower’s 1957 use of federal troops to desegregate Little Rock Central High School illustrated the New Deal’s destruction of America itself. The troops escorting black students into white schools were paid with tax dollars. This, in the formulation of Buckley and his allies, was a redistribution of wealth. Congress levied taxes on white people, forcing them to pay their hard-earned money into the treasury, which government officials turned around and used to give unearned advantages to poor blacks. Desegregation enabled Movement Conservatives to describe American “freedom” as a marriage of social conservatism to unfettered capitalism.

This odd formulation, in which equality became inequality and fairness turned into unfairness, slid into the fringes of the Republican Party with the rise of Sen. Barry Goldwater of Arizona as a national figure. His supporters launched his 1960 candidacy for the presidential nomination with a declaration of Movement Conservative principles, ghost-written by Buckley’s brother-in-law. “The Conscience of a Conservative” maintained that government protection for African-Americans was unconstitutional. It undermined property rights by redistributing tax dollars, and thus destroyed “liberty.” Goldwater captured the Republican presidential nomination in 1964, but voters revealed the unpopularity of his ideas when he carried only his home state of Arizona and the five states of the Deep South: South Carolina, Georgia, Alabama, Mississippi and Louisiana. It seemed the union of big business and social conservatism had collapsed.

But, over the next 20 years, that alliance would strengthen until it came to dominate American politics.

One of Goldwater’s key supporters in 1964 was an actor and spokesman for General Electric: Ronald Reagan. In a folksy televised speech just before the election, Reagan called out a dangerously intrusive government run by out-of-touch elites. America faced a clear choice. On the one hand was “individual freedom;” on the other, “the ant heap of totalitarianism.” When Reagan made a strong push for the 1968 presidential nomination, and George Wallace ran on a third-party ticket, Eisenhower’s far more moderate vice president Richard Nixon had little choice but to promise Goldwater voters in the Deep South that he would bow to their racial prejudices. This was the “Southern Strategy,” as Republican political operative Lee Atwater later explained in a vicious 1981 interview in which he deployed the n-word freely. Rather than explicitly invoking racism, he said, Movement Conservatives talked about desegregation, states’ rights and cutting taxes. Economic language didn’t create the same backlash because it was far more abstract than racial slurs, but it accomplished the same thing. It equated an activist government with the redistribution of white wealth to minorities.

Nixon’s troubled presidency required more than white racism to prop it up. To rally support behind his sliding popularity, he pulled more voters into the conservative social coalition by dividing Americans into “us” and “them.” When National Guard troops killed four people at Kent State in May 1970, Nixon took refuge behind the argument that a “silent majority” of Americans opposed the “vocal minority” that was trying to impose its views on the rest of the nation by protesting in the streets. It was imperative to hold the line not only against African-Americans, but also against young radicals and feminists. In 1971, Nixon set up a straw man as a dog-whistle for social conservatives. Unspecified “voices,” “detractors of America,” were urging “disadvantaged groups” to “take the welfare road rather than the road of hard work, self-reliance, and self-respect,” he claimed. Time noted that “middle Americans” were rapidly swinging against “angry minorities” who were sucking up their hard-earned tax dollars, while they seemed to have less and less influence on American policies.

No one made better use of this growing link between economics and culture than Ronald Reagan. In 1980, when he launched his general election campaign just miles from where civil rights workers had been murdered in 1964, Reagan announced: “I believe in states’ rights.” He personified the link between race, sex, and taxes with his “Welfare Queen,” a Cadillac-driving, unemployed moocher who “has 80 names, 30 addresses, 12 Social Security cards and is collecting veteran’s benefits on four non-existing deceased husbands…. She’s got Medicaid, getting food stamps, and is collecting welfare under each of her names.” Reagan claimed to speak for what he called “the majority of Americans” — hard-working, white taxpayers — against “special interests,” those lazy Americans, people of color and women, who wanted government handouts. As Rosalynn Carter said of Reagan, “I think this president makes us comfortable with our prejudices.”

Reagan urged Grover Norquist, who had been an economist for the Chamber of Commerce, to bring together big business, evangelical Christians, and social conservatives as a voting bloc to pressure reluctant congressmen into supporting tax cuts. “Traditional Republican business groups can provide the resources,” Norquist explained, “but these groups can provide the votes.” Norquist’s Americans for Tax Reform opposed taxes in any form, since taxes funded “social welfare schemes” that redistributed wealth. In 1989, Norquist’s friend Ralph Reed cemented evangelicals behind big business by pulling them into the Christian Coalition, which set out to spread both religion and unfettered capitalism. Movement Conservatives could win control of the country, Reed explained, only by addressing “the concerns of average voters in the areas of taxes, crime, government waste, health care,” abortion, and homosexuality.

President George W. Bush made the link between business and religion in government official. During his administration, White House officials met weekly with Norquist and a hundred leaders from religious, social and economic groups that made up the Movement Conservative coalition. “There isn’t an us and them with this administration,” Norquist boasted. “They is us. We is them.” A request by the Bush administration could marshal hundreds of thousands of constituents led by the members of the White House meetings. Their voices were amplified by talk radio hosts like Rush Limbaugh, who insisted that Congress must slash taxes and “begin an emergency dismantling of the welfare system, which is shredding the social fabric” and “gutting the work ethic, education performance, and moral discipline of the poor.”

And so the marriage of big business and social conservatism to control government was consummated. But its stability depended on convincing evangelicals and social conservatives that slashing taxes and destroying business regulation served their social ends. Since 1980, those economic policies have concentrated wealth upward and left values voters with less and less. Rumblings of discontent have disturbed the coalition as real wages have stagnated and tax burdens have shifted down the ladder. Movement Conservatives continued to be able to rally support from evangelicals as they limited women’s reproductive choices and attacked minorities and immigrants as lazy criminals, but their power has slipped as the programs they slash increasingly harm values voters. Same-sex marriage marks the beginning of the divorce. Big business Movement Conservatives were happy to pay lip service to right-wing populism so long as it kept Republicans in power. But supporting it now will do the opposite, as most Americans swing behind non-discrimination.

Jindal’s op-ed offers Republicans a great opportunity. It employs the same rhetorical techniques Buckley did in 1951 — turning the popular majority in favor of equal rights into “the radical left,” for example — but now those techniques seem transparently, almost laughably, disingenuous. Seeing such a caricature of the bargain that made Movement Conservatism succeed could create the magical moment in which the party finally rejects the devil’s bargain it struck in the 1950s.

    

 

How to Combat Distrust of Science


Acceptance of science has become increasingly polarized in the United States. Indeed, a recent Pew poll shows that there is a substantial and growing amount of public disagreement about basic scientific facts, including human evolution, the safety of vaccines and whether or not human-caused climate change is real and happening. What is causing this, you might ask?

People often interpret the same information very differently. As psychologists, we are more than familiar with the finding that our brains selectively attend to, process and recall information. One consequence of this is “confirmation bias,” a strong tendency to automatically favor information that supports our prior expectations. When we consider issues that we feel strongly about (e.g., global warming), confirmation bias reaches a new height: it transitions into “motivated reasoning.” Motivated reasoning is the additional tendency to defensively reject information that contradicts deeply held worldviews and opinions. One example of this is the “motivated rejection of science”; if you are personally convinced that global warming is a hoax, you are likely to reject any scientific information to the contrary – regardless of its accuracy.

Yet, if our personal values, beliefs and worldviews really dictate our reality, then aren’t science communicators just blowing in the wind? Not necessarily so. Although some research has indeed shown that factors such as “scientific literacy” are not always associated with, say, more concern for climate change, we have investigated a different, social type of fact: “expert consensus.” Our research shows that highlighting how many experts agree on a controversial issue has a far-reaching psychological influence. In particular, it has the surprising ability to “neutralize” polarizing worldviews and can lead to greater science acceptance.

A recent study by one of us showed that perceived scientific consensus functions as an important “gateway belief.” In the experiment, we asked a national sample of the US population to participate in a public opinion poll about popular topics (participants did not know that the study was really about climate change). Participants were first asked to estimate what percentage of scientists they thought agree that human-caused climate change is happening (0 to 100 percent). We then exposed participants to a number of different experimental treatments that all contained the same basic message, that “97% of climate scientists have concluded that human-caused climate change is happening.” After a series of filler questions and distractor tasks, we finally asked participants again about their perception of the scientific consensus.

You might expect that given the contested and politicized nature of the climate change problem, such a simple message would have little effect or could even backfire. Indeed, some research has shown that disagreements between parties can become more extreme when exposed to the same evidence. Yet, contrary to the motivated-reasoning hypothesis, our results showed that on average, participants who were exposed to one of the consensus-messages increased their estimate of the consensus by about 13 percentage points (by as much as 20 points in some conditions). Moreover, we found that when respondents’ perception of the level of scientific agreement increased, this led to significant changes in other key beliefs about the issue, such as the belief that climate change is happening, human-caused and a worrisome problem. In turn, changes in these beliefs propelled an increase in support for public action. Thus, we found that people’s perception of the degree of scientific consensus seems to act as a “gateway” to other key attitudes about the issue.
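
To make the design concrete, here is a minimal sketch, in Python, of how the pre/post shift in perceived consensus would be computed in an experiment of this shape. The participant numbers are invented for illustration; they are not the study's data.

    # Minimal sketch of the pre/post "gateway belief" measurement.
    # All participant numbers below are invented, not the study's data.

    # Each participant estimates the scientific consensus (0-100) before
    # and after reading the 97% consensus message.
    participants = [
        {"pre": 60, "post": 75},
        {"pre": 70, "post": 80},
        {"pre": 55, "post": 70},
        {"pre": 85, "post": 90},
    ]

    # The reported treatment effect is the mean shift, in percentage
    # points, between the two estimates.
    shifts = [p["post"] - p["pre"] for p in participants]
    mean_shift = sum(shifts) / len(shifts)
    print(f"Mean shift in perceived consensus: {mean_shift:.1f} points")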

What’s even more interesting is that we found the same effect for two differentially motivated audiences: Democrats and Republicans. In fact, the change was significantly more pronounced among Republican respondents, who normally tend to be the most skeptical about the reality of human-caused climate change. These findings are quite remarkable, if not surprising, given that we exposed participants only once, to a single and simple message.

Nonetheless, these new results are consistent with two previous Nature studies. Some years ago, our colleagues showed that people’s perception of the level of scientific agreement was associated with belief in climate change and policy support for the issue. A subsequent experimental study by one of us revealed a causal link between highlighting expert consensus and increased science acceptance. In that study, too, information about the degree of consensus “neutralized” the effect of ideological worldviews.

Since then, numerous studies have reported similar results. One study showed that even a small amount of scientific dissent can undermine support for (environmental) policy. A new paper published just this month reported that respondents across the political spectrum responded positively to information about the scientific consensus on climate change.

Why is “consensus-information” so far-reaching, psychologically speaking?

One feature that clearly distinguishes “consensus” from other types of information is its normative nature. That is, consensus is a powerful descriptive social fact: it tells us about the number of people who agree on important issues (i.e., the norm within a community). Humans evolved living in social groups and much psychological research has shown that people are particularly receptive to social information. Indeed, consensus decision-making is widespread in human and non-human animals. Because decision-strategies that require widespread agreement lie at the very basis of the evolution of human cooperation, people may be biologically wired to pay attention to consensus-data.

In the case of experts, it describes how many scientists agree on important issues and as such, implicitly embodies an authoritatively rich amount of information. Imagine reading a road sign that informs you that 97% of engineers have concluded that the bridge in front of you is unsafe to cross. You would likely base your decision to cross or avoid that bridge on the expert consensus, irrespective of your personal convictions. Few people would get out of their car and spend the rest of the afternoon personally assessing the structural condition of the bridge (even if you were an expert). Similarly, not everyone can afford the luxury of carving out a decade or so to study geophysics and learn how to interpret complex climatological data. Thus, it makes perfect sense for people to use expert consensus as a decision-heuristic to guide their beliefs and behavior. Society has evolved to a point where we routinely defer to others for advice—from our family doctors to car mechanics; we rely on experts to keep our lives safe and productive. Most of us are constrained by limited time and resources and reliance on consensus efficiently reduces the cost of individual learning.

Back to facts. A recent study showed that people are more likely to cling to their personal ideologies in the absence of “facts.” This suggests that in order to increase acceptance of science, we need more “facts.” We agree but suggest that this is particularly true for an underleveraged but psychologically powerful type of fact — expert consensus.

The consensus on human-caused climate change is among the strongest observed in the sciences—about as strong as the consensus surrounding the link between smoking and lung cancer. Yet, as Harvard science historian Naomi Oreskes has documented, vested-interest groups have long understood the fact that people make (or fail to make) decisions based on consensus-information. Accordingly, so-called “merchants of doubt” have orchestrated influential misinformation campaigns, including denials of the links between smoking and cancer, and between CO2 emissions and climate change. If polarization on science is to be reduced, we need to harness the psychological power of consensus for the purpose for which it evolved: the public good.

    

 

Free Immigration

 

Oddly, none of the main parties is proposing a measure that might help to address one of the clearest weaknesses of the UK economy: loosening (yes, loosening) immigration controls.

Let me explain. It isn’t in dispute that Britain has a problem with productivity. After recessions in the 1970s, the 1980s and the 1990s, labour productivity — how much is produced for a given hour’s work — recovered strongly. Yet that isn’t happening now. Productivity collapsed in the recession of 2008-09, in the wake of the banking crisis. It was the biggest annual fall since the 1970s, in the days of Edward Heath’s government and the three-day week.

Even with the much-vaunted recovery in output growth, productivity hasn’t got back to pre-crisis levels. In the fourth quarter, labour productivity as measured by output per hour fell by 0.2 per cent from the previous quarter. It’s below the level of 2007. This lack of productivity growth in the past seven years has no postwar precedent. It is clear that unless productivity recovers, there won’t be sustainable growth in real wages. Over the long term, a country’s ability to sustain rising living standards depends on improvements in productivity. Within the G7, Britain ranks sixth, above only Japan, in labour productivity.

Well, here’s an idea: let’s get more people who are highly educated and who have skills. There are lots of them in other countries. If they come here, they might improve this country’s productivity by transferring knowledge and also by increasing the incentives to non-migrant workers to acquire new skills. You’ll listen in vain for mainstream politicians to argue this. Instead, the focus of the election debate on migration is how we can stop it.

The three main parties (not, of course, Ukip) accept the principle of freedom of movement within the EU. Yet it’s now argued by all of them that it was a mistake for Britain to open its labour market to newly acceding EU member states in 2004. At the Labour conference last year, Ed Balls said: “We should have had tougher rules on immigration from eastern Europe — it was a mistake not to have transitional controls in 2004.”

Well, perhaps Labour’s shadow chancellor won’t defend his own party’s record, but I will. My evidence is what happened in other EU economies. The Institute for the Study of Labour in Bonn has investigated the skills of the post-2004 migrants into western European economies and concludes that Britain got a disproportionate share of the most highly skilled workers. Germany, on the other hand, imposed quotas and work permits on eastern European migrants and attracted comparatively older and less-qualified migrants. The evidence suggests that just by being open to immigration, Britain impressed on people with skills that they would be welcome here and could thrive.

If Britain wants to curtail immigration — in particular, if it wants to meet the government’s stupendously misconceived net migration target — it will need to impose ever tighter curbs on non-EU migration. Yet these people are, in the main, students and workers with skills and education. British business needs people like that. It’s one of the paradoxes of political debate that a government averse to the notion of “picking winners” in industrial policy believes it’s capable of doing precisely that in the labour market — and, still more bizarrely, that a supposedly left-wing opposition agrees with it.

    

 

Being Black In America

 

An analysis in The Times — “1.5 Million Missing Black Men” — showed that more than one in every six black men in the 25-to-54 age group has disappeared from civic life, mainly because they died young or are locked away in prison. This means that there are only 83 black men living outside of jail for every 100 black women — in striking contrast to the white population, where men and women are about equal in numbers.

This astounding shortfall in black men translates into lower marriage rates, more out-of-wedlock births, a greater risk of poverty for families and, by extension, less stable communities. The missing men should be a source of concern to political leaders and policy makers everywhere.

While the 1.5 million number is startling, it actually understates the severity of the crisis that has befallen African-American men since the collapse of the manufacturing and industrial centers, which was quickly followed by the “war on drugs” and mass imprisonment, which drove up the national prison population more than sevenfold beginning in the 1970s.

In addition to the “missing,” millions more are shut out of society, or are functionally missing, because of the shrinking labor market for low-skilled workers, racial discrimination or sanctions that prevent millions who have criminal convictions from getting all kinds of jobs. At the same time, the surge in imprisonment has further stigmatized blackness itself, so that black men and boys who have never been near a jail now have to fight the presumption of criminality in many aspects of day-to-day life — in encounters with police, in schools, on the streets and on the job.

The data on missing African-American men is not particularly new. Every census for the last 50 years has shown the phenomenon.

In earlier decades, premature death played a larger role than it does today. But since the 1980s, the rising number of black men who were spared premature death was more than offset by the growing number shipped off to prison, many for nonviolent drug offenses. The path to that catastrophe was paved by what the sociologist William Julius Wilson described as “the disappearance of work,” which devastated formerly coherent neighborhoods.

As deindustrialization got underway, earnings declined, neighborhoods grew poorer and businesses moved to the suburbs, beyond the reach of inner city residents. As Mr. Wilson wrote in his 1996 book, “When Work Disappears,” for the first time in the 20th century, most adults in many poor inner-city neighborhoods were not working.

Joblessness became the norm, creating a “nonworking class” that lived in segregated areas where most residents could not find jobs or had given up looking. In Chicago, where Mr. Wilson carried out his research, employers wrote off the poor by not advertising in places where they could see the ads. The situation was so grave in 1996 that he recommended the resurrection of a Works Progress Administration-like strategy, under which the government would provide public employment to every American over 18 who wanted it.

The stigmatization of blackness presents an enormous obstacle, even to small boys. Last year, for example, the Department of Education reported that black children were far more likely to be suspended from school — even from preschool — than white children. Federal cases also show higher rates of public school suspensions for minority students than for white students for identical behavior, suggesting that racial discrimination against black males starts very early in life.

The sociologist Devah Pager, a Harvard professor who has meticulously researched the effect of race on hiring policies, has also shown that stereotypes have a powerful effect on job possibilities. In one widely cited study, she sent carefully selected test applicants with equivalent résumés to apply for low-level jobs with hundreds of employers. Ms. Pager found that criminal convictions for black men seeking employment were virtually impossible to overcome in many contexts, partly because convictions reinforced powerful, longstanding stereotypes.

The stigma of a criminal record was less damaging for white testers. In fact, those who said that they were just out of prison were as likely to be called back for a second interview as black men who had no criminal history at all. “Being black in America today is just about the same as having a felony conviction in terms of one’s chances of finding a job,” she wrote in her book, “Marked: Race, Crime and Finding Work in an Era of Mass Incarceration.”

In recent months, the many grievous cases of unarmed black men and boys who were shot dead by the police — now routinely captured on video — show how the presumption of criminality, poverty and social isolation threatens lives every day in all corners of this country.

    

 

What GOP Hasn't Worked Out

 

Stephen Harper in Canada. Tony Abbott in Australia. John Key in New Zealand. And now, impressively re-elected, a second-term David Cameron in the United Kingdom.

Center-right leaders are in charge of every one of America’s closest English-speaking allies. Only in the United States does the liberal left govern. With Hillary Clinton holding strong leads in the polls over all her likely opponents, this form of “American exceptionalism” looks likely to persist for some time to come. Why?

Their American detractors may grumble, but these other conservatives are indeed “real conservatives” (Harper and Abbott tend to be more popular among their U.S. counterparts than Cameron and Key). After coming to power in 2010, the Cameron government cut personal and corporate income taxes. It imposed tough new work requirements on physically capable welfare recipients. Government spending as a share of GDP will decline to pre-2008 levels next year. Thanks to Cameron’s reforming education minister, Michael Gove, more than 3,300 charter schools (“academies,” as the British call them) are raising performance standards in some of Britain’s toughest neighborhoods—a 15-fold increase since 2010. Under the prime minister’s leadership, Royal Mail was privatized.

More reforms will follow in the next government. Cameron has pledged further tax reductions, including eliminating death taxes on family homes. Restrictions on home construction will be relaxed. Government’s share of GDP will be pushed down with the goal of undoing the Blair-Brown spending spree that began in 1997.

Cameron Conservatives, like conservatives in the Anglosphere and Germany, converge with and diverge from Republicans in the United States in significant ways. And these ways are crucial to their electoral success—and, I’d argue, supremely relevant to the comparative failure of their American counterparts.

Center-right parties in Australia, Canada, New Zealand, and the United Kingdom have all made peace with government guarantees of healthcare for all. These conservatives do not abjectly defend the healthcare status quo; they attempt to open more space for competition and private initiative within the health sector. But they accept that universal health coverage in some form has joined old-age pensions and unemployment insurance in the armature of an advanced modern economy. In this, their American counterparts are the true outliers. Before 2010, the United States provided the industrial world’s most lavish single-payer health system for citizens over 65, plus a hugely expensive and hugely inefficient system of tax subsidies for private insurance for everyone younger, at a total cost per U.S. taxpayer that was more than Canada spent on healthcare per Canadian taxpayer. And that system still left tens of millions uncovered and tens of millions covered but still exposed to large healthcare costs that they could not possibly afford. The pre-Obamacare American healthcare system was indefensible, and non-American conservatives are stronger for not having to try to defend such a thing.

These parties have updated for the 21st century their core message of respect for family, work, and community. None seek to police women’s sexual behavior or to impose restrictions on women’s reproductive choices. All have accepted gay equality, with Australia on the verge of a parliamentary vote to permit same-sex marriage. They are parties comfortable with racial inclusion and competitive with ethnic-minority voters—the Canadian Conservatives particularly so; people of Chinese origin are Canada’s second-largest non-white ethnic group, and in the country’s 2011 election, Canadian Conservatives won two-thirds of the vote among Canadians who speak Cantonese at home.

The parties are all unapologetically nationalist—an especially important stance in the United Kingdom, whose sovereignty is endlessly infringed upon by the European Union. In particular, all advocate an immigration policy determined by the national interest, not the interest of would-be immigrants. The Cameron Conservatives have pledged to reduce net immigration below 100,000 people per year. Tony Abbott’s government has halted the kind of migration now convulsing Europe with a “no exceptions” policy against illegal immigration by boat. Canadian immigration policy is determined by a points system aimed at selecting migrants who will flourish economically in the country, with the result that Canadian immigrants—like U.S. immigrants before 1970 but not U.S. immigrants today—attain higher levels of education than the Canadian-born population.

The parties are tough on terrorism, extremism, and international disorder. David Cameron has defined the security threat facing the U.K. as not only “violent extremism”—the Obama formulation—but all ideological movements that reject democracy and equal rights, “whether they are violent in their means or not.” And while making clear that the West has no quarrel with Islam and its believers, Cameron, unlike Obama, has been willing to state explicitly that the extremism that threatens Western democracies is, obviously, “Islamist extremism.” Stephen Harper and Tony Abbott have been especially firm and consistent supporters of Israel. As Chris Pyne, one of the many strong friends of Israel in the Australian cabinet, said during last year’s Gaza War, “Whenever there has been a congregation of freedom-loving nations versus non-freedom-loving nations, Australia has always been prepared to be in the fight and always on the right side. And that’s how we view the State of Israel—that we are on the right side.” Harper has been a forceful advocate within NATO for the defense of Ukraine. When Harper encountered Vladimir Putin at last year’s G-20 meeting, I’m told he said, “I have only one thing to say to you: Get out of Ukraine.” Putin replied, “I’m not in Ukraine.” Harper retorted, “And it’s because you say things like that that I have nothing to say to you.”

Unlike their U.S. counterparts, these conservatives don’t fetishize the music, fashion, or religious practices of some of their voters in a way that prevents them from reaching all of their potential voters. Unlike their U.S. counterparts, they accept that healthcare security actually supports—rather than inhibits—the entrepreneurial risk-taking of a dynamic free-market economy. Unlike their U.S. counterparts, they have found ways to both enforce immigration laws and to make immigrant populations feel at home politically.

Of course, these conservatives differ among themselves in important ways. And their success is conditional; all face political challenges at home, including a tough re-election for Stephen Harper in Canada later this year. But what they all show their American counterparts is that the fear of a “tipping point” beyond which a state plunges into socialist dependency is utterly misplaced. Countries with universal health coverage, for instance, can be hospitable to conservatives—if conservatives can resist the impulse to repeal that coverage. It’s the resistance to the program, not the program itself, that sinks conservative hopes. Politics doesn’t tip. It evolves. And winning conservative parties evolve with it.

    

 

How The Tories Won

 

On the evening before the election, a statistician called Matt Singh published an eyebrow-raising article on his website. The opinion polls were wrong, he said; not by a little, either, but by 6 points or even 8.

His argument was simple: no party ever loses an election when it is ahead on choice of prime minister and economic competence at the same time. He plotted these things on graphs. The Tories, he said, were almost bound to be decisively ahead.

He was, of course, correct — and his analysis tells the story of how the Conservatives won a majority that very few were expecting. They had David Cameron as their leader and Labour had Ed Miliband. The public saw Mr Cameron as prime ministerial, and did not think the same of Mr Miliband.

At the same time, George Osborne’s economic strategy worked for just enough people for just long enough to get the Tories over the line.

When the risk of the Scottish National party in government was added to that of Mr Miliband in Downing Street, it provided the basis for a strong campaign.

Andrew Cooper, the pollster who had worked as Mr Cameron’s strategy director, had identified a group of voters I dubbed Yes Yes Nos in an article for this newspaper. These were people who said that, yes, they thought Mr Cameron was the best prime minister, and, yes, the Conservatives would be better for the economy — but then said, no, they weren’t planning to vote Tory.

Mr Cooper managed to find a group of about 3 per cent of the population who admitted that, in the end, they might well vote Conservative. A big campaign goal was to win the support of Yes Yes Nos. Mr Cooper repeatedly said that he thought this might not happen until polling day, and no one would know whether it had worked until the votes were counted. He was correct.

Because of the incredibly successful fundraising of David Cameron’s friend, Lord Feldman of Elstree, the Conservatives had vastly more money than Labour. The mystery was: where did it go? There was speculation that the cash was being wasted on technologically illiterate Facebook advertising. Far from it.

While Labour hired David Axelrod, Barack Obama’s adviser, to develop a message, the Tories hired another Obama man: Jim Messina. He turned out to be orders of magnitude more useful as he bought and deployed big data to allow him to target social media messages incredibly precisely at just the right people.

This was matched by the most professional and highly centralised field operation the Tories had ever organised. Stephen Gilbert, the prime minister’s political secretary, has been the party’s most important official for almost two decades, although he prefers his work not to be noticed. He replaced the decentralised system of agents with campaign managers trained at the centre.

And then, of course, there was Lynton Crosby. Tough but civilised, feared but liked, the Australian strategist was able to get even the biggest players to follow a simple line that emphasised economic security. He wanted Mr Cameron to stay on the tracks and just repeat his message over and over again.

Pundits sneered that it was unimaginative, but Mr Crosby was sure it would work. It did.

    

 

How The Tories Won Part 2

 

Conservative campaign managers knew three weeks before election day that they were heading for victory, Jim Messina, President Obama’s campaign guru, has revealed.

Almost every public poll suggested for weeks that the election race was deadlocked, but Mr Messina, who was hired by the Tories to help to reach persuadable voters in key marginals, said that he was seeing a totally different picture in the battleground seats.

The Conservative campaign, borrowing micro-targeting techniques from the United States, was so sophisticated that in the final week the party was having multiple contacts via Facebook, on the phone and on the doorstep with individual voters who had been identified as likely to switch from the Liberal Democrats or to choose the Tories over Labour.

Speaking from Washington, Mr Messina said: “It was amusing to see all these polls saying ‘Oh, Labour’s talking to more people’. I kept thinking to myself: ‘That’s great, because we are talking to the right people, over and over and over again.’

“We were having as many as eight to ten conversations with undecided voters in the final week, while Labour was still, in the final week, mailing every single person in the constituency. And that just didn’t make sense.”

Mr Messina, the Obama campaign manager in 2012, was brought in by the Conservatives to find the swing voters who would decide the election. Facebook was the crucial weapon; using data that the social media site sells to advertisers, he was able to target key constituencies and to reach niche groups of voters.

Some American campaign experts narrow their message, so that, for example, they can target women aged over 40 living in a particular district who are concerned about health and education.

Without revealing all of his techniques, Mr Messina said that his work in the marginals had been even more targeted than that. “And I think the proof is in the pudding that we now hold every single south-west UK Lib Dem seat,” he said. “We went in and took very deep dives in the seats to see what was doable, what was winnable . . . who were the voters, who were potential waverers, thinking about leaving the Lib Dems; who were the voters trying to decide between us and Labour; and who were the voters considering leaving us for Ukip — and we were able to have very focused messages to all of those people.”
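
As a purely illustrative sketch of the kind of segmentation being described, one could imagine tagging each voter record with the message group it falls into. Everything below, in Python, is invented for illustration; the field names, rules and data bear no relation to Messina's actual tools.

    # Hypothetical voter segmentation in the spirit described above.
    # Field names and rules are invented; this is not Messina's system.
    voters = [
        {"name": "A", "current": "LibDem",    "leaning": "Con"},
        {"name": "B", "current": "Undecided", "leaning": None},
        {"name": "C", "current": "Con",       "leaning": "Ukip"},
    ]

    def segment(voter):
        # Route each record to one of the target groups named in the quote.
        if voter["current"] == "LibDem" and voter["leaning"] == "Con":
            return "wavering Lib Dems"
        if voter["current"] == "Undecided":
            return "deciding between us and Labour"
        if voter["current"] == "Con" and voter["leaning"] == "Ukip":
            return "considering leaving for Ukip"
        return "no focused message"

    for v in voters:
        print(v["name"], "->", segment(v))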

Mr Messina said that while nightly polls by Lynton Crosby, the Tory campaign chief, had the party way ahead, no one had predicted the scale of the victory. He sent Mr Crosby a document on election morning, predicting 312 seats; he later upgraded that to 319.

“For the last three weeks, both Lynton’s nightly polls and our look inside the marginal seats had us significantly ahead, and a week out, we had us leading in 305 seats . . . did we think in the end we were going to get 331 seats? No.”

Mr Messina derided the work of the pollsters as “garbage”, insisting that the UK system was “completely broken”.

Ed Miliband lost the election because of “lazy Labour” supporters who failed to vote, according to Ipsos Mori, undermining the theory that predictions were swayed by “shy Tories”.

Ben Page, the pollster’s chief executive, was seeking to explain why his final eve-of-election poll overstated the number of Labour voters by almost three million. The number of votes for Labour implied by his final poll was 12.2 million, but only 9.3 million voted. The same poll implied 12.5 million Tory votes; in fact, 11.3 million turned out.
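
The arithmetic behind that comparison is simple enough to spell out. A minimal sketch, in Python, using only the figures quoted above:

    # Compare the final poll's implied votes with the actual result
    # (millions of votes, figures as quoted above).
    implied = {"Labour": 12.2, "Conservative": 12.5}
    actual = {"Labour": 9.3, "Conservative": 11.3}

    for party in implied:
        overstatement = implied[party] - actual[party]
        pct = 100 * overstatement / actual[party]
        print(f"{party}: poll implied {implied[party]}m, actual {actual[party]}m "
              f"-> overstated by {overstatement:.1f}m ({pct:.0f}%)")

On these numbers the poll overstated Labour by about 2.9 million votes (roughly 31 per cent) but the Tories by only 1.2 million (about 11 per cent), which is why the headline race looked deadlocked when it was not.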

    

 

Make a Conservative Friend

 

My father recently passed away, and the stories people inevitably tell of the dead brought back memories of childhood. As a small boy I remember Sunday lunches that culminated with my IRA-supporting godmother storming out after dad had said something especially offensive about Ireland. But she’d be back the following month and all was forgotten. Another of dad’s lunchtime friends, I remember, had the honour of being one of just 12 Englishmen to fight in the Spanish Civil War – for Franco. One crony, who I mainly remember smelling of whisky and offering my brother and me £50 if we learned Gaelic, had brought a Red Army Faction terrorist to visit mum in hospital after she’d given birth.

These were all people with varying, strongly-held views, and in retrospect I realise how important it is to have friends with whom you profoundly disagree – it’s discomforting and somewhat disturbing, but like a lot of unsettling things, it’s good for your character development. Otherwise you end up with the sort of political sectarianism we’ve seen on social media since Friday.

It’s hard not to feel great schadenfreude at the grieving of people so utterly convinced of their righteousness, although to be honest I haven’t really tried.

Many people are angry because they didn’t win, and shocked that, despite no one they know voting Conservative, the party still won. For all the casual assumptions about the benefits of democracy, people forget just how unnatural democracy is, and that it’s not normal for human beings to live with people with whom they profoundly disagree; we have to work at it. We naturally see people who have a very different vision of the tribe as potential enemies. To make matters worse, a population that is certain of its own moral righteousness and which is also used to getting its own way, because everything is a choice and nothing a sacrifice, is not best suited to such a restrained system of government. Politics is not your iTunes library.

The election has also thrown into relief the division not necessarily in our politics but in our personality types, split between people who accept that the world is imperfect and would rather compromise with it, and those who wish to make it perfect – the Jacobins of this world.

One characteristic of the fanatical personality is a tendency to blame others rather than oneself – the media, the voters – and although in everyday life this trait is associated with lower intelligence, fanatical personalities aren’t necessarily stupid. Indeed academia has plenty of fanatics, and one of the worst things about Twitter is that it has revealed how appalling so many talented artists are, how small-minded and ignorant; strangely enough novelists, whose mission is to uncover nuance and motive and internal conflict, are among the most Manichean in the way they see things.

At its most extreme this fanatical personality extends to a tolerance for violence or intimidation, so long as it’s in the ‘right’ cause, without appreciating that just as the rule of law has to apply to everyone or no one, so too must the rules of democratic engagement. The fanatical personality also confuses opponents and enemies; as a quick guide, the Tories are your opponents, Isis are your enemies.

In a less pathological form this fanatical personality is displayed in an inability to see that people from a different party actually share one’s aims but think there are different ways to achieve them; much Labour-leaning analysis of this election suggests that Conservatives vote out of ‘self-interest’, rather than simply believing state spending can be ineffective, or even counter-productive. As Jonathan Haidt pointed out in The Righteous Mind, self-identified liberals generally have a poor understanding of what their opponents believe.

It’s not that Left-wingers are inherently less open-minded about politics (by nature, conservatives are the less adventurous), but that in England and parts of America it’s possible to be an educated person and exist entirely in a Left-wing world throughout life; for an urban conservative it’s impossible, and like all minorities they’re forced to interact more.

If I had to hazard a guess I would say that fanatical personality characteristics exist on a reverse bell curve across the political spectrum, with the traits most common in people on the extreme Left and Right and the least at the very-centre-Right. But that’s just a guess, based on prejudice.

All character traits are partly hereditary, but this fanaticism is no doubt heavily influenced by the culture, and by the ease with which people are allowed to avoid disconfirming views. And these post-election reactions suggest a generation that is politically spoilt, able to grow up without ever having to face serious opposition to its worldview: from school to university to the office, to dinner parties where they can casually make caustic remarks about Right-wingers knowing no one will object, to the homes of friends where on the shelves they will see not a single book by a conservative. My advice to anyone shocked and stunned by this election would be, contrary to what one academic says, to go and befriend a Tory. It will do you and your political tribe a world of good.

    

 

The Iraq War: the Verdict

 

“Hello, my name is A.J. and I was an Iraq War supporter.”

It is high time my fellow conservatives joined me in a 12-step realization, and mea culpa, regarding our foreign-policy errors of the post-9/11 era. Many nowadays, thankfully, have the integrity to make the admission: Each year, the number of Republicans who openly admit that the Iraq “war” — ahem, invasion — was a mistake ticks up.

A new poll finds two-thirds of voters think a candidate’s position on the Iraq War is significant. Why? With constant talk of Iran, Syria, Libya, Yemen, Ukraine and others, voters want a president who will not embroil us in yet another catastrophic intervention thousands of miles away and of no benefit to America.

It is not a question of the past but rather, as Sen. Rand Paul notes, a “recurring question” any president will face.

GOP presidential hopefuls faced a week of tough questions on whether they thought the invasion was a mistake. Jeb Bush seemed rattled and confused, first stating he would have invaded, then ultimately saying — but only after backlash to his initial answer — that he would not have done so, while King of the Interventionists, Marco Rubio, said the war “was not a mistake” and was “the right decision based on what he knew at that time.”

Wisconsin Gov. Scott Walker conveyed a position similar to Rubio’s, while Paul indicated his opposition and Sen. Ted Cruz, along with New Jersey Gov. Chris Christie, said it was not a good idea in hindsight.

The GOP field is in utter disarray on the issue. So let’s get a few of these 12-step admissions out of the way. IDS sufferers — Iraq Derangement Syndrome — may want to hang on.

First things first: No, the Iraq War wasn’t just a good-faith mistake based on the information the Bush administration had at the time. The administration sought out intelligence that favored its plan to invade Iraq — a plan that predated even 9/11 — and ignored or silenced the intelligence that didn’t.

It’s the equivalent of deciding you want to divorce your spouse, then setting out to find any evidence of wrongdoing. The private investigator saw your wife embracing a man! Oh, it was her brother at her nephew’s soccer game? Hmm, never mind that, it’s mere semantics. You’ve got your smoking gun!

IDS sufferers’ second favorite argument is: “Well, the world is better off without a bad guy like Saddam, so it wasn’t a mistake.” OK, except this is completely inaccurate. The world is not better off without Saddam. Why? Because for all his faults, Saddam Hussein presided over a stable Iraq, served as a buffer to (a now more powerful) Iran and was no religious fanatic. When we invaded and removed him, we created a power vacuum in the country, a vacuum then filled by brutal ISIS.

Consider, for instance, the plight of Iraq’s Christians, who lived in relative peace under Saddam’s rule but, thanks to the invasion, were forced to flee in record numbers. Sure, Saddam was an awful man, but the alternative is worse. Or are IDS sufferers saying ISIS is better than Saddam? And doesn’t this simplistic “bad guy out, so war good” logic mean we should go to war against any and all countries with a bad guy in power?

Third, no, we didn’t “win” the war at any point and, no, President Obama isn’t to blame. Lately, in an attempt to strike a two-fer coupon deal (defending the war while simultaneously landing a blow on Obama), the argument goes: “We were winning the war until ‘Obummer’ decided to pull the troops out!” This is about as intellectually dishonest as it gets, since it was Dubya — not Obama — who decided on the troops’ withdrawal timetable. Again, facts are facts, and they are nothing if not inconvenient.

And no, at no point were we victorious in Iraq, unless by winning you mean a temporary win over a nebulous enemy, one that required us to maintain a combat presence in the nation indefinitely. I may not be a general, but that isn’t victory. This is, of course, not a criticism of our excellent troops — Iraq was simply not winnable.

Fourth, there is the logical elephant in the room: the assumption that, even if Saddam had WMDs, he would have used them against us, which was the entire premise upon which the mistaken war was sold to the American public. The very notion is absurd, and it is almost inexplicable now, in hindsight, how so many of us bought the idea with so little questioning.

Our invasion of Iraq led to the deaths of more than 4,000 brave Americans and half a million Iraqis. It cost us a staggering $2 trillion (or over 10 percent of our national debt). Yet those who seek to be our next president cannot bring themselves to clearly state we erred, and erred badly?

Instead, Rubio this week quoted the action movie Taken in explaining his foreign-policy views. Does this sound like (A) a man fit to be commander in chief, (B) a teenager playing video games who’s had too much sugar, or (C) a bro at the Hard Rock Vegas Sunday pool party? (Hint: It isn’t “A”).

Meanwhile, a sharp and somber Hillary Clinton, who voted in favor of the war, unequivocally and rightly stated this week that her vote was a “mistake.”

    

 

Making Voting Easier

 

Yesterday, Hillary Clinton endorsed automatic voter registration for all 18-year-olds. Expanding access to voting rights is a civil-rights issue that can be justified entirely in good-government terms. At the same time, it is also a completely partisan issue. The Republican Party is in favor of making voting more inconvenient in the correct belief that winnowing the electorate operates to its partisan advantage.

The Republican Party prefers to frame its stance on voting rights as a deep concern for preventing voter fraud. The most common response to this is to point out that voter impersonation is vanishingly rare — since 2000, 31 instances of it have been found, out of a billion ballots cast. But the true nature of its concern reveals itself most clearly when the party’s mania for suppression can be detached from its professed concern for preventing voter fraud and examined in naked isolation.

In theory, there are a number of ways one might make voting more convenient without enabling fraud. One of them is to expand early voting. A bipartisan commission, co-chaired by Democratic campaign lawyer Robert Bauer and Republican campaign lawyer Ben Ginsberg, studied voting extensively and issued a 2014 report that, among other recommendations, endorsed more early voting. The authors, who engaged in months of public hearings and consultations with officials, found that voters are willing to tolerate waits to vote if they can pick a convenient day. “What does emerge from evidence about the experience of voters is that their tolerance for wait times is considerably higher with early voting,” the report found. “Having chosen the day and time for voting that is convenient for them, early voters are described as being in a more 'celebratory' frame of mind than under the often rushed circumstances they face on Election Day when they must vote at a specific location on a specific day.” This makes perfect sense — if you can pick a day when you’re not too busy, you can stand a longer wait than if you must queue up on a day not of your choosing, when you may not have time to spare.

The official Republican response rejected the endorsement of early voting. Letting people vote early, the Republicans reply, “diminishes the importance of Election Day.” While the bipartisan commission found that Americans enjoy the freedom to pick a convenient time to vote, the Republican Party declared that this is not the case — “Most Americans continue to prefer to vote alongside their neighbors and fellow citizens at the polls on Election Day so reform needs to start there.” This is a bit like saying most Americans prefer vanilla, therefore chocolate should not be allowed. If people prefer to vote on Election Day, they can.

Now, perhaps the Republicans are simply moved by a sentimental attachment to Election Day, and they don’t want it to get less special-feeling by giving people the choice to exercise their rights at a more convenient time. In that case, there is also a solution that meets the party’s concern: make Election Day a national holiday. Clinton has endorsed that idea, too, and it’s a long-standing liberal standby. But Republicans also oppose making Election Day a national holiday. As the conservative pundit John Fund has explained, making elections a national holiday might lead to people skipping work the preceding Monday. Also, Fund argued, “There’s no doubt that many people in our increasingly mobile and hectic society want voting to be as easy and convenient as buying fast food. But too much of anything can be bad — just ask someone who has gorged on drive-thru burgers and fries.” So since convenient fast food can be bad for you, convenient voting must be bad, too.

Automatic voter registration, which anybody could choose to opt out of, is another idea that would reduce bureaucratic impediments to voting without enabling fraud. It will be fascinating to watch the party generate arguments against this. The current official party response amounts to simple ad hominem criticism of Clinton (Republican spokesman: “Her exploitation of this issue only underscores why voters find her dishonest and untrustworthy,” etc.).

In the meantime, conservatives who have bothered to go beyond knee-jerk hostility have fallen back on their actual conviction, which is that voting should be restricted to a better class of people. An additional registration requirement, writes National Review’s Daniel Foster, “improves democratic hygiene because the people who can’t be bothered to register (as opposed to those who refuse to vote as a means of protest) are, except in unusual cases, civic idiots.” In other words: people who don’t have the flexibility to take extra time away from work to jump through whatever bureaucratic hurdles the Republicans throw in their path, or the familiarity with local agencies to navigate them smoothly, are too shiftless and ignorant to be trusted with the franchise.

And so Clinton’s embrace of voting rights may not have any plausible near-term prospects for enactment. But it serves to demonstrate to the party’s core constituents something elemental, and true: At the current moment, there is only one party that respects their rights as citizens.

    

 

Two Ways of Seeing The World

 



When politicians wonder why they fail to connect with the voters, they should realise it’s because so often they talk in code.

The Labour leadership contest is being conducted in a strange language all of its own, a kind of Esperanto of the left. Words such as “aspiration”, “equality”, “responsibility”, “choice” and “unity” have come to mean much more than their definition would suggest — they now signify the ideological differences running through the party. Labour needs a political Enigma machine to decipher what is really being said if it wants to work out how to win power again.

It is bizarre that, more than 20 years after Tony Blair won the leadership of his party, the discussion about who should succeed Ed Miliband is still framed as a choice between “Blairite” and “Brownite” candidates. Political parties get stuck in the era of their greatest success, rather as people’s musical taste is set in their youth.

In Labour’s case, though, there is more to it than that. Mr Blair and Gordon Brown represent the two sides of a schism in the party about its true purpose and how its goals should be achieved. All Labour politicians would say they want a fairer society — but “Blairite” and “Brownite” are shorthand for the difference between those who think that it is fine for some people to do better than others so long as nobody falls below a certain level and those who believe the real aim should be for everybody to be the same.

In Labour jargon it’s equality of opportunity versus equality of outcome; in the Esperanto of the left it’s the difference between emphasising aspiration and equality; in clichéd metaphors it’s the ladder up versus the level playing field.

This is the real dividing line of the leadership contest. When Liz Kendall echoes the Blairite mantra that “what matters is what works” in the public services, what she means is that improving standards is more important than providing uniformity, and that the question is not whether the service is public or private but whether it gets results for consumers.

When Andy Burnham says “the NHS is what matters” and healthcare should be based on “people not profits”, he is putting himself on the other side of this divide. This position was perfectly articulated a couple of weeks ago when the shadow health secretary committed himself to a “truly comprehensive” education system: “There don’t have to be losers for there to be winners in education.”

It’s possible to see every aspect of public policy through this prism. In education, there is the contrast between the drive for excellence and academic rigour versus the “all shall have prizes” mentality that led to a dumbing down of exams so that no pupil felt a failure. Then there’s the disagreement between those who support the diversity of free schools and those who believe only the traditional comprehensive system can guarantee uniformly fair education.

In health, the divide between Mr Blair and Mr Brown over semi-independent foundation hospitals has now morphed into a disagreement about whether the public sector should have a monopoly on the delivery of care. On economic policy, the difference is between those who would tax the rich and redistribute wealth and those who see this as the politics of envy. In welfare, it’s between people who emphasise responsibilities and others who see the benefit system in terms of rights.

The same tension exists in the debate about how to build a winning coalition of voters in time for the next general election. One side is keen to focus on wooing back disaffected supporters in the northern heartlands who went over to Ukip, while the other insists the only way to a Commons majority is to attract those who supported the Conservatives last month. Comfort zone versus reaching out, continuity or change, traditionalist and moderniser — these are all code for the same basic choice. Pat McFadden, Labour’s Europe spokesman, believes that even calls for “unity” can have a hidden meaning — “unity can be code for ‘let’s not confront the difficult choices’”, he says.

In a way the traditional/Brownite/continuity position is more idealistic; the purest, most open articulation of Labour values. But there is no doubt which side most voters are on. They want their children to achieve their full potential, even if (perhaps especially if) that means they do better than somebody else’s kids. They want the NHS to do whatever it takes to save lives, regardless of whether one hospital leaves another trailing. They are not selfish but they want success for themselves and their families, not an altruistic mush of mediocrity.

When focus groups were asked in the run-up to the election what they thought of “equality” — Ed Miliband’s defining purpose — they didn’t like it. James Morris, Labour’s pollster for the past five years, who conducted the research, says: “To them it means ‘everyone level’ and they think it sounds like communism. Equality is a problem word for Labour.”

Voters approve of any Conservative promise to be fair — because it seems counter-intuitive — but when it’s Labour using the word, “people think it means ‘fair’ to other people like immigrants, or people in poverty, not them”, he says. The language of “opportunity” and “responsibility”, he argues, is the best way of articulating the popular idea that everyone should have a decent shot at success, but too often Labour feels more comfortable with more traditional vocabulary. At the election in May, “the internal politics of code words ended up trumping a message that could persuade the voters,” he says.

As the leadership contest properly begins, with nominations next week following yesterday’s hustings for MPs, there is a danger that Labour will continue talking to itself rather than using language and ideas that appeal to the country. The question is: how much does the party really want power? The purity of opposition is comforting, but winning power in order to get things done requires compromise. This means taking the voters’ definition of success and fairness rather than the party’s historic interpretation of the words.

    

 

Socialism is Dead and Buried

 

It can happen in the history of ideas that a theory becomes so widely discredited as to attract not just disagreement but stigma. The theory that the Earth is flat, for instance, attracted only scorn by the time South Africa’s President Paul Kruger was still clinging to it. A more recent South African president, Thabo Mbeki, attracted similar contempt for his theories about HIV/Aids.

Holocaust denial; climate change denial; eugenics — all have moved or are moving away from arguability towards mere notoriety. Most Times readers (though by no means all Americans) would now consider the debate about creationism closed. You and I would say that creationism has lost any claim to comparable status with natural selection. It’s simply wrong. Children should be informed of this. We wouldn’t expect our broadcasters to show impartiality between the two.

It is time socialist economics met the same fate. This century’s intellectual consensus should show Marxism the door. Whether strictly defined as public ownership of the means of production, distribution and exchange, or more loosely as state direction of the “commanding heights” of the economy, socialism must be counted as definitively discredited. Over almost a century that theory has been tested — in every case — to destruction. To bring the public understanding of science up to date with this truth is overdue. I shall in a moment explain how tough — not comforting — the consequences would be for the Conservatives. But first to the two premises you’ll see I’ve implied: that socialist economics has in fact been discredited; and that our era’s intellectual consensus has yet to come properly to terms with this.

For the first, a string of names should suffice. Albania (Hoxha), Argentina (Kirchner), Chile (Allende), pre-capitalist China, Cuba, Eritrea, Mozambique (Machel), North Korea, North Vietnam, Venezuela — and of course the entire Soviet Union. Some of these countries have gone the whole Marxist hog and persisted for decades; others have since 1917 lurched tentatively into socialist economics, and repented of the experiment. I don’t suggest, as some Tories do, that the United Kingdom or France have ever really tried socialism: the left is correct, we haven’t. Whenever we’ve teetered leftwards towards the abyss, British and French nerves have always, mercifully, failed. How many more times does the Marxist analysis of the underlying principles of economics have to be tried in the field, and fail, before we write the theory off?

But now to my second premise: that we’re reluctant to write it off. My case can be made in three words. Consider the expression “for private profit”. Can you honestly claim it’s possible in British political discussion to use that expression other than disapprovingly?

The whole idea of private profit is a) absolutely central to the theory of free-market economics; and b) assumed to be a term of abuse in modern debate. It is a huge weakness in the presentation of the centre-right’s case in western European politics that we have been bullied into sheepishness about the mainspring of the economic theory on which our politics rests. Socialist economics has lost every battle, destroyed every economy that embraced it, wasted the talents and careers of some of our best minds, and wrecked hundreds of millions of lives — yet we have allowed it to keep the moral advantage.

You cannot have a successful modern democracy without free-market economics. You cannot have free-market economics without the profit motive. You cannot have the profit motive without letting the pursuit of private profit weave itself intimately into the fabric of ordinary citizens’ lives. Until we face this, until we learn it morally as well as intellectually, we skew not only our politics but our habits of thought.

Why is Adam Smith not held out to schoolchildren as Charles Darwin is: as a scientist whose analysis is now the consensus among most thinking people? Why is elementary market economics not taught routinely to schoolchildren? Why did I have to swot up Newtonian physics for school exams but never the theory of supply, demand and price? Consider the following statement: “I don’t need two schools/ hospitals/ supermarkets/ plumbers to choose from, I just need one good one.” What law of market economics is the speaker overlooking here? Discuss with reference to the fate of the GUM department stores in the former Soviet Union. Would that really be too difficult for a 16-year-old?

I said at the outset that Conservatives who find my argument comforting should think again. The British centre-right, an unsatisfactory guardian of the theory of markets, has left a flank wide open here. Right-wingers have an ingrained habit of looking away when abuses of the free market are pointed out to them. Partly this is because we have felt forever on the back foot, hoping to smuggle in market economics without standing up for the principle of private profit.

And partly, I’m afraid, because Conservatives are too close to the monopolistic, cartel-hugging, anticompetitive elements about whom Smith repeatedly warns. It is shameful that we often leave it to the Americans to take a lead in attacking corruption, fraud, market rigging and insider trading. We’re pathetic at confronting some of these rascals. Everybody knows that banks, pension funds, supermarkets, insurers, energy companies and big currency traders — even price-comparison websites — sew things up between themselves, too often without more than a token challenge from government.

This is sometimes specifically connected with the Conservative party’s sources of income from donors but more often it’s because the party’s mental reflex is to defend capitalism generally against a popular suspicion of anyone who wants to get rich. Even in their pomp, the Tories feel defensive about market economics: protective to the point of indulgence.

Abuses are rife, widely suspected and conveniently overlooked. Here is a real vacuum that an opposition party could fill. But for as long as the impression endures that the Labour party is secretly, in its deepest marrow, hostile to the whole idea of market economics, it will never convincingly lead the charge to make the market work.

The prize is enormous. But to reach for it Labour must first tear something from its history and its heart. It would have to admit that on the biggest question of the 20th century — socialist versus market economics — its party had been wrong. If Labour could do this and be believed, it would be as if a big window were opened into a room full of stale air.

    

 

Trickle Down Doesn't Work

 

The idea that increased income inequality makes economies more dynamic has been rejected by an International Monetary Fund study, which shows the widening income gap between rich and poor is bad for growth.

A report by five IMF economists dismissed “trickle-down” economics, and said that if governments wanted to increase the pace of growth they should concentrate on helping the poorest 20% of citizens.

The study – covering advanced, emerging and developing countries – said technological progress, weaker trade unions, globalisation and tax policies that favoured the wealthy had all played their part in making widening inequality “the defining challenge of our time”.

The IMF report said the way income is distributed matters for growth. “If the income share of the top 20% increases, then GDP growth actually declines over the medium term, suggesting that the benefits do not trickle down. In contrast, an increase in the income share of the bottom 20% is associated with higher GDP growth,” said the report.

Echoing the frequent warnings about rising inequality from the IMF’s managing director, Christine Lagarde, the report says governments around the world need to tackle the problem. It said: “Raising the income share of the poor, and ensuring that there is no hollowing-out of the middle class, is actually good for growth.”

The study, however, reflects the tension between the IMF’s economic analysis and the more hardline policy advice given to individual countries such as Greece, which need financial support.

During its negotiations with Athens, the IMF has been seeking to weaken workers’ rights, but the research paper found that the easing of labour market regulations was associated with greater inequality and a boost to the incomes of the richest 10%.

“This result is consistent with forthcoming IMF work, which finds the weakening of unions is associated with a higher top 10% income share for a smaller sample of advanced economies,” said the study.

“Indeed, empirical estimations using more detailed data for Organisation for Economic Cooperation and Development countries [34 of the world’s richest nations] suggest that, in line with other forthcoming IMF work, more lax hiring and firing regulations, lower minimum wages relative to the median wage, and less prevalent collective bargaining and trade unions are associated with higher market inequality.”

The study said there was growing evidence to suggest that the rising influence of the rich and the stagnant incomes of the poor and middle classes caused financial crises, hurting both short- and long-term growth.

“In particular, studies have argued that a prolonged period of higher inequality in advanced economies was associated with the global financial crisis by intensifying leverage, overextension of credit, and a relaxation in mortgage-underwriting standards, and allowing lobbyists to push for financial deregulation,” it said.

It added that pretax incomes of middle-class households in the US, the UK, and Japan had experienced declining or stagnant growth rates in recent years. Additional pressures on the middle class reflected a declining share of labour income – the predominant source of income for the majority of households.

Inequality could lead to policies that hurt growth, the IMF study said, noting that it could cause a backlash against growth-enhancing economic liberalisation and fuel protectionist pressures against globalisation and market-oriented reforms. “At the same time, enhanced power by the elite could result in a more limited provision of public goods that boost productivity and growth, and which disproportionately benefit the poor.”

The paper called for extra investment in health and education policies that reduce poverty, and more progressive taxation. “Fiscal policy already plays a significant role in addressing income inequality in many advanced economies, but the redistributive role of fiscal policy could be reinforced by greater reliance on wealth and property taxes, more progressive income taxation, removing opportunities for tax avoidance and evasion, better targeting of social benefits while also minimising efficiency costs, in terms of incentives to work and save.”

    

 

The End of the Middle Class

 

It’s no secret that the American middle class has been on the ropes for a while now. The problem isn’t just a crippling recession and an economic “recovery” that has mostly gone to the richest one percent, but the larger shifting of wealth from the middle to the very top that’s taken place since the late ‘70s. Add in things like the dismantling of unions, which has accelerated apace since Ronald Reagan crushed the air-traffic controllers, and we’ve seen the middle class stay more solid in places like Canada, Germany, and Scandinavia – and even begin to grow in a number of nations – while it shrinks here. Economists like Thomas Piketty think the process is inevitable under global capitalism, while others – the equally wise Joseph Stiglitz, for example – think the balance can be restored if we can find the political will.

It turns out that those concerned about a tattered middle class are right about most of it, but are overlooking one thing: Boomers – or rather, a particular strain of Boomer and near-Boomer – are doing great. That is, if you were born in the ‘40s, you belong to the last American generation to enjoy a robust safety net, and your gray years will be far more comfortable than those of people a decade older or younger.

Supported by income from Social Security, pensions and investments, as well as an increasing number of paychecks from delaying retirement, older people not only weathered the economic downturn that began in 2007 but made significant gains, a New York Times analysis of government data has found.

And despite our generally ornery Xer jingoism, we’re going to concede something here. We’ve noticed that our friends who we could call “young Boomers” – born in the late ‘50s and early ’60s – are often far less privileged and spoiled than those born in the years right after World War II. This younger group grew up or came of age, after all, in the ‘70s and ‘80s, as the postwar boom was fading, colleges were becoming expensive, and the Reagan Revolution was pulling the rug out from under the middle class.

And it turns out that those young Boomers are indeed a kind of transition generation. It’s the group now retiring that will take most of the spoils of the U.S. postwar boom and leave the rest of us with scraps:

In the past, the elderly were usually poorer than other age groups. Now, they are the last generation to widely enjoy a traditional pension, and are prime beneficiaries of a government safety net targeted at older Americans. They also have profited from the long rise in real estate prices that preceded the recession. As a result, more seniors now fall into the middle class — defined in this case between the 40th and 80th income percentile — than ever before.

If you wonder why you are struggling so hard to get a job, please note that a lot of these guys are sitting on theirs or at least working part-time.

The Times piece shows how a variety of Americans in that sub-generation are faring. Some are struggling, like the rest of us. But between the fancy cruises and fat pensions and gated communities and golf courses and vintage ’57 Chevys, it’s not a world that younger Americans have any reason to expect. In fact, it sounds like something from a museum of postwar affluence.

So part of us is glad the American middle class will go out with a boom, so to speak. We don’t begrudge these people – our teachers and professors, our older friends, our parents and other relatives – comfort in their gray years. The way Americans, in the days before Social Security and other protections, lost their footing in old age was simply inhumane. But why couldn’t the prosperity be spread so that those born in the ‘50s, ‘60s, and after can enjoy the same stability and wealth?

Well, this is a complicated one, and we’ll nod to the usual suspects: Globalization, technology, and the depletion of natural resources (especially energy) meant that the postwar boom would not last forever.

But you know what else the original Boomers brought us? Despite their dabbling with progressivism and hippie utopianism, this group served as the shock troops for market-worshipping neoliberalism and the Reagan-Thatcher shift in the ‘70s and ‘80s. They gave us junk bonds and the privatization push and Gordon Gekko. Some of them went into the corporate world and started dismantling it.

Let’s hope they enjoy their retirements. But these gray Boomers and grayer Silents – not all of them, but enough to do substantial damage – put forces in motion that mean, for the rest of us, the twilight years will be significantly less cozy.

    

 

Why Dylann Roof Acted on His Racism

 

With the discovery of Dylann Roof’s manifesto this weekend, it becomes much more difficult to dismiss his attack on the AME Church as mental illness or evidence of “anti-Christian hostility.” It seems clear that he was a self-radicalized domestic terrorist. Roof may have been a “troubled loner,” but his decision to commit violence was made in the context of a larger culture of racism.

As a religion scholar, I see Roof’s fantasy of sparking a race war as an example of millennial violence; conservative declension narratives and calls to “take our country back” are religious language as much as political rhetoric.

Without giving details as to how our country was “lost” or what taking it back would actually mean, this rhetoric spins a narrative of a fallen nation as well as a prophecy of a redeemed and perfected one. It is precisely the troubled loners - those who find nothing of value in the present order - who become the most invested in these prophecies.

Scholars who study apocalyptic movements long have noted that radical political movements such as Marxism and Nazism are founded on prophecies of a secular apocalypse. Like religious prophecies of the apocalypse, these movements claim that history will inevitably lead to a new and “perfect” world order.

In some cases it’s easy to see how these political visions were adapted from ideas of Christ’s millennial kingdom; after all, the Nazi Third Reich was expected to last 1,000 years. Although Roof’s manifesto is brief and poorly written, it contains the elements of a racist millennial order, laying out his opinion of blacks, whites, Hispanics, Asians, and Jews and assessing what role each might have in his ideal world - even speculating about an alliance with “racist Asians.”

The millennial imagination is powerful because it frequently translates into action. The political philosopher Eric Voegelin coined the term “immanentize the eschaton” to describe the goal of political movements that seek to bring about the millennial kingdom through human efforts.

White supremacist ideology typically imagines apocalyptic violence and race war as the means of ushering in its millennial kingdom. In Mein Kampf, Hitler wrote, “The main difficulty is not to establish a new order of things but to clear the ground for its establishment.”

The classic text of racist apocalypticism is The Turner Diaries, published in paperback in 1978 and written by Hitler admirer and influential white supremacist William Luther Pierce (pseudonym: Andrew Macdonald). The novel describes an alternate timeline in which a secretive group of Jews has risen to power in the United States, confiscated all firearms, and legalized the rape of white women in the name of combatting racism. A white supremacist group called “The Order” resists this regime by waging a terror campaign which triggers a civil war in which more whites rally to the cause. Ultimately, The Order succeeds in taking over America after the protagonist, Earl Turner, flies a crop-duster containing a nuclear warhead into the Pentagon. The novel culminates in a nuclear genocide in which all non-white races are exterminated.

The ADL calls Pierce’s novel “The Bible of the Right Wing” and it is, in many ways, a religious text. Its conclusion describes how “the dream of a White world finally became a certainty” through Turner’s martyrdom. In Pierce’s millennial vision, a new calendar is created marking time from the beginning of his apocalyptic race war. If Roof did not read The Turner Diaries, he certainly had contact with people who had.

In the days after the attack on the A.M.E. Church, discussion focused on how Roof’s actions were related to a wider culture of racism, both in South Carolina and throughout American society. It seems that within the culture of racist apocalypticism there is a division of labor between dreamers and doers, prophets and martyrs.

Numerous hate crimes and terror plots were inspired by The Turner Diaries, including Timothy McVeigh’s attack on the Oklahoma City federal building in 1995. In the novel, Turner and his allies blow up the FBI headquarters using a fertilizer bomb packed into a truck—the same method used by McVeigh. Not surprisingly, McVeigh was an avid fan of The Turner Diaries, mailing it to friends and selling copies at gun shows. Pierce did not applaud McVeigh’s actions but rather denounced his attempt to immanentize the eschaton, even while his National Alliance continued to campaign.

Those seeking to hermetically seal Roof’s actions from a larger cultural pattern will emphasize that he only completed the ninth grade, that he was unemployed, that he likely abused drugs, and that he was arrested following a strange incident at a mall. But these details suggest exactly the sort of person who might become invested in a millennial prophecy. In the current order, Roof was a loser—perhaps more so because of his privilege as a white male and his limited education.

But in a racist apocalyptic fantasy, he could be a hero of world-changing significance. While apocalypticism focuses on the future, it is really about a desire to escape the unhappiness of the present. Hitler wrote that, “Weltanschauung [worldview] represents a declaration of war against an existing order of things, against present conditions, in short, against the established Weltanschauung.” Roof’s fantasy of sparking a race war may have also been a declaration of war against his own present conditions.

As we continue to learn more about Roof and the motivations for his actions, it is important to analyze not only Roof’s circumstances but also the forces that shaped his imagination. What did he think might happen after his attack and who were the voices that contributed to this apocalyptic vision? We need to answer these questions, because it’s a vision that other angry young white men are certainly contemplating.

    

 

The Tea Party and Southern Racism

 

Tea Partiers say you don’t understand them because you don’t understand American history. That’s probably true, but not in the way they want you to think.

Late in 2012, I came out of the Lincoln movie with two historical mysteries to solve:

How did the two parties switch places regarding the South, white supremacy, and civil rights? In Lincoln’s day, a radical Republican was an abolitionist, and when blacks did get the vote, they almost unanimously voted Republican. Today, the archetypal Republican is a Southern white, and blacks are almost all Democrats. How did American politics get from there to here?

One of the movie’s themes was how heavily the war’s continuing carnage weighed on Lincoln. (It particularly came through during Grant’s guided tour of the Richmond battlefield.) Could any cause, however lofty, justify this incredible slaughter? And yet, I realized, Lincoln was winning. What must the Confederate leaders have been thinking, as an even larger percentage of their citizens died, as their cities burned, and as the accumulated wealth of generations crumbled? Where was their urge to end this on any terms, rather than wait for complete destruction?

The first question took some work, but yielded readily to patient googling. I wrote up the answer in “A Short History of White Racism in the Two-Party System“. The second turned out to be much deeper than I expected, and set off a reading project that has eaten an enormous amount of my time over the last two years. (Chunks of that research have shown up in posts like “Slavery Lasted Until Pearl Harbor“, “Cliven Bundy and the Klan Komplex“, and my review of Ta-Nehisi Coates’ article on reparations.) Along the way, I came to see how I (along with just about everyone I know) have misunderstood large chunks of American history, and how that misunderstanding clouds our perception of what is happening today.

Who really won the Civil War? The first hint at how deep the second mystery ran came from the biography Jefferson Davis: American by William J. Cooper. In 1865, not only was Davis not agonizing over how to end the destruction, he wanted to keep it going longer. He disapproved of Lee’s surrender at Appomattox, and when U. S. troops finally captured him, he was on his way to Texas, where an intact army might continue the war.

That sounded crazy until I read about Reconstruction. In my high school history class, Reconstruction was a mysterious blank period between Lincoln’s assassination and Edison’s light bulb. Congress impeached Andrew Johnson for some reason, the transcontinental railroad got built, corruption scandals engulfed the Grant administration, and Custer lost at Little Big Horn. But none of it seemed to have much to do with present-day events.

And oh, those blacks Lincoln emancipated? Except for Booker T. Washington and George Washington Carver, they vanished like the Lost Tribes of Israel. They wouldn’t re-enter history until the 1950s, when for some reason they still weren’t free.

Here’s what my teachers should have told me: “Reconstruction was the second phase of the Civil War. It lasted until 1877, when the Confederates won.” I think that would have gotten my attention.

It wasn’t just that Confederates wanted to continue the war. They did continue it, and they ultimately prevailed. They weren’t crazy, they were just stubborn.

The Lost Cause. At about the same time my American history class was leaving a blank spot after 1865, I saw Gone With the Wind, which started filling it in like this: Sadly, the childlike blacks weren’t ready for freedom and full citizenship. Without the discipline of their white masters, many became drunks and criminals, and they raped a lot of white women. Northern carpetbaggers used them (and no-account white scalawags) as puppets to control the South, and to punish the planter aristocrats, who prior to the war had risen to the top of Southern society through their innate superiority and virtue.

But eventually the good men of the South could take it no longer, so they formed the Ku Klux Klan to protect themselves and their communities. They were never able to restore the genteel antebellum society — that Eden was gone with the wind, a noble but ultimately lost cause — but they were eventually able to regain the South’s honor and independence. Along the way, they relieved their beloved black servants of the onerous burden of political equality, until such time as they might become mature enough to bear it responsibly.

A still from The Birth of a Nation

That telling of history is now named for its primary proponent, William Dunning. It is false in almost every detail. If history is written by the winners, Dunning’s history is the clearest evidence that the Confederates won.

Margaret Mitchell’s 1936 novel had actually toned it down a little. To feel the full impact of Dunning-school history, you need to read Thomas Dixon’s 1905 best-seller, The Clansman: a historical romance of the Ku Klux Klan. Or watch the 1915 silent movie made from it, The Birth of a Nation, which was the most popular film of all time until Gone With the Wind broke its records.

The iconic hooded Klansman on his horse, the Knight of the Invisible Empire, was the Luke Skywalker of his day.

The first modern war. The Civil War was easy to misunderstand at the time, because there had never been anything like it. It was a total mobilization of society, the kind Europe wouldn’t see until World War I. The Civil War was fought not just with cannons and bayonets, but with railroads and factories and an income tax.

If the Napoleonic Wars were your model, then it was obvious that the Confederacy lost in 1865: Its capital fell, its commander surrendered, its president was jailed, and its territories were occupied by the opposing army. If that’s not defeat, what is?

But now we have a better model than Napoleon: Iraq.

After the U.S. forces won on the battlefield in 1865 and shattered the organized Confederate military, the veterans of that shattered army formed a terrorist insurgency that carried on a campaign of fire and assassination throughout the South until President Hayes agreed to withdraw the occupying U.S. troops in 1877. Before and after 1877, the insurgents used lynchings and occasional pitched battles to terrorize those portions of the electorate still loyal to the United States. In this way they took charge of the machinery of state government, and then rewrote the state constitutions to reverse the postwar changes and restore the supremacy of the class that led the Confederate states into war in the first place.

By the time it was all over, the planter aristocrats were back in control, and the three constitutional amendments that supposedly had codified the U.S.A.’s victory over the C.S.A. — the 13th, 14th, and 15th — had been effectively nullified in every Confederate state. The Civil Rights Acts had been gutted by the Supreme Court, and were all but forgotten by the time similar proposals resurfaced in the 1960s. Blacks were once again forced into hard labor for subsistence wages, denied the right to vote, and denied the equal protection of the laws. Tens of thousands of them were still physically shackled and subject to being whipped, a story historian Douglas Blackmon told in his Pulitzer-winning Slavery By Another Name.

So Lincoln and Grant may have had their mission-accomplished moment, but ultimately the Confederates won. The real Civil War — the one that stretched from 1861 to 1877 — was the first war the United States lost.

The missed opportunity. Today, historians like Eric Foner and Douglas Egerton portray Reconstruction as a missed opportunity to avoid Jim Crow and start trying to heal the wounds of slavery a century sooner. Following W.E.B. DuBois’ iconoclastic-for-1935 Black Reconstruction, they see the freedmen as actors in their own history, rather than mere pawns or victims of whites. As a majority in Mississippi and South Carolina, and a substantial voting bloc across the South, blacks briefly used the democratic system to try to better their lot. If the federal government had protected the political process from white terrorism, black (and American) history could have taken an entirely different path.

In particular, 1865 was a moment when reparations and land reform were actually feasible. Late in the war, some of Lincoln’s generals — notably Sherman — had mitigated their slave-refugee problem by letting emancipated slaves farm small plots on the plantations that had been abandoned by their Confederate owners. Sick or injured animals unable to advance with the Army were left behind for the slaves to nurse back to health and use. (Hence “forty acres and a mule”.) Sherman’s example might have become a land-reform model for the entire Confederacy, dispossessing the slave-owning aristocrats in favor of the people whose unpaid labor had created their wealth.

Instead, President Johnson (himself a former slave-owner from Tennessee) was quick to pardon the aristocrats and restore their lands. That created a dynamic that has been with us ever since: Early in Reconstruction, white and black working people sometimes made common cause against their common enemies in the aristocracy. But once it became clear that the upper classes were going to keep their ill-gotten holdings, freedmen and working-class whites were left to wrestle over the remaining slivers of the pie. Before long, whites who owned little land and had never owned slaves had become the shock troops of the planters’ bid to restore white supremacy.

Along the way, the planters created rhetoric you still hear today: The blacks were lazy and would rather wait for gifts from the government than work (in conditions very similar to slavery). In this way, the idle planters were able to paint the freedmen as parasites who wanted to live off the hard work of others.

The larger pattern. But the enduring Confederate influence on American politics goes far beyond a few rhetorical tropes. The essence of the Confederate worldview is that the democratic process cannot legitimately change the established social order, and so all forms of legal and illegal resistance are justified when it tries.

That worldview is alive and well. During last fall’s government shutdown and threatened debt-ceiling crisis, historian Garry Wills wrote about our present-day Tea Partiers: “The presiding spirit of this neo-secessionism is a resistance to majority rule.”

The Confederate sees a divinely ordained way things are supposed to be, and defends it at all costs. No process, no matter how orderly or democratic, can justify fundamental change.

When in the majority, Confederates protect the established order through democracy. If they are not in the majority, but have power, they protect it through the authority of law. If the law is against them, but they have social standing, they create shams of law, which are kept in place through the power of social disapproval. If disapproval is not enough, they keep the wrong people from claiming their legal rights by the threat of ostracism and economic retribution. If that is not intimidating enough, there are physical threats, then beatings and fires, and, if that fails, murder.

That was the victory plan of Reconstruction. Black equality under the law was guaranteed by the 14th Amendment. But in the Confederate mind, no democratic process could legitimate such a change in the social order. It simply could not be allowed to stand, and it did not stand.

In the 20th century, the Confederate pattern of resistance was repeated against the Civil Rights movement. And though we like to claim that Martin Luther King won, in many ways he did not. School desegregation, for example, was never viewed as legitimate, and was resisted at every level. And it has been overcome. By most measures, schools are as segregated as ever, and the opportunities in white schools still far exceed the opportunities in non-white schools.

Today, ObamaCare cannot be accepted. No matter that it was passed by Congress, signed by the President, found constitutional by the Supreme Court, and ratified by the people when they re-elected President Obama. It cannot be allowed to stand, and so the tactics for destroying it get ever more extreme. The point of violence has not yet been reached, but the resistance is still young.

Violence is a key component of the present-day strategy against abortion rights, as Judge Myron Thompson’s recent ruling makes clear. Legal, political, social, economic, and violent methods of resistance mesh seamlessly. The Alabama legislature cannot ban abortion clinics directly, so it creates reasonable-sounding regulations the clinics cannot satisfy, like the requirement that abortionists have admitting privileges at local hospitals. Why can’t they fulfill that requirement? Because hospitals impose the reasonable-sounding rule that their doctors live and practice nearby, while many Alabama abortionists live out of state. The clinics can’t replace them with local doctors, because protesters will harass those doctors’ non-abortion patients and drive the doctors out of any business but abortion. A doctor who chooses that path will face threats to his/her home and family. And doctors who ignore such threats have been murdered.

Legislators, of course, express horror at the murder of doctors, just as the pillars of 1960s Mississippi society expressed horror at the Mississippi Burning murders, and the planter aristocrats shook their heads sadly at the brutality of the KKK and the White Leagues. But the strategy is all of a piece and always has been. Change cannot stand, no matter what documents it is based on or who votes for them. If violence is necessary, so be it.

Unbalanced. This is not a universal, both-sides-do-it phenomenon. Compare, for example, the responses to the elections of our last two presidents. Like many liberals, I will go to my grave believing that if every person who went to the polls in 2000 had succeeded in casting the vote s/he intended, George W. Bush would never have been president. I supported Gore in taking his case to the Supreme Court. And, like Gore, once the Court ruled in Bush’s favor — incorrectly, in my opinion — I dropped the issue.

For liberals, the Supreme Court was the end of the line. Any further effort to replace Bush would have been even less legitimate than his victory. Subsequently, Democrats rallied around President Bush after 9/11, and I don’t recall anyone suggesting that military officers refuse his orders on the grounds that he was not a legitimate president.

Barack Obama, by contrast, won a huge landslide in 2008, getting more votes than any president in history. And yet, his legitimacy has been questioned ever since. The Birther movement was created out of whole cloth, there never having been any reason to doubt the circumstances of Obama’s birth. Outrageous conspiracy theories of voter fraud — millions and millions of votes worth — have been entertained on no basis whatsoever. Immediately after Obama took office, the Oath Keeper movement prepared itself to refuse his orders.

A black president calling for change, who owes most of his margin to black voters — he himself is a violation of the established order. His legitimacy cannot be conceded.

Confederates need guns. The South is a place, but the Confederacy is a worldview. To this day, that worldview is strongest in the South, but it can be found all over the country (as are other products of Southern culture, like NASCAR and country music). A state as far north as Maine has a Tea Party governor.

Gun ownership is sometimes viewed as a part of Southern culture, but more than that, it plays an irreplaceable role in the Confederate worldview. Tea Partiers will tell you that the Second Amendment is our protection against “tyranny”. But in practice tyranny simply means a change in the established social order, even if that change happens — maybe especially if it happens — through the democratic processes defined in the Constitution. If the established social order cannot be defended by votes and laws, then it will be defended by intimidation and violence. How are We the People going to shoot abortion doctors and civil rights activists if we don’t have guns?

Occasionally this point becomes explicit, as when Nevada Senate candidate Sharron Angle said this:

You know, our Founding Fathers, they put that Second Amendment in there for a good reason and that was for the people to protect themselves against a tyrannical government. And in fact Thomas Jefferson said it’s good for a country to have a revolution every 20 years. I hope that’s not where we’re going, but, you know, if this Congress keeps going the way it is, people are really looking toward those Second Amendment remedies and saying my goodness what can we do to turn this country around? I’ll tell you the first thing we need to do is take Harry Reid out.

Angle wasn’t talking about anything more “tyrannical” than our elected representatives voting for things she didn’t like (like ObamaCare or stimulus spending). If her side can’t fix that through elections, well then, the people who do win those elections will just have to be intimidated or killed. Angle doesn’t want it to come to that, but if liberals won’t yield peacefully to the conservative minority, what other choice is there?

Gun-rights activist Larry Pratt doesn’t even seem regretful:

“The Second Amendment is not for hunting, it’s not even for self-defense,” Pratt explained in his Leadership Institute talk. Rather, it is “for restraining tyrannical tendencies in government. Especially those in the liberal, tyrannical end of the spectrum. There is some restraint, and even if the voters of Brooklyn don’t hold them back, it may be there are other ways that their impulses are somewhat restrained. That’s the whole idea of the Second Amendment.”

So the Second Amendment is there not to defend democracy, but to fix what the progressive “voters of Brooklyn” get wrong.

It’s not a Tea Party. The Boston Tea Party protest was aimed at a Parliament where the colonists had no representation, and at an appointed governor who did not have to answer to the people he ruled. Today’s Tea Party faces a completely different problem: how a shrinking conservative minority can keep change at bay in spite of the democratic processes defined in the Constitution. That’s why they need guns. That’s why they need to keep the wrong people from voting in their full numbers.

These right-wing extremists have misappropriated the Boston patriots and the Philadelphia founders because their true ancestors — Jefferson Davis and the Confederates — are in poor repute.

But the veneer of Bostonian rebellion easily scrapes off; the tea bags and tricorn hats are just props. The symbol Tea Partiers actually revere is the Confederate battle flag. Let a group of right-wingers ramble for any length of time, and you will soon hear that slavery wasn’t really so bad, that Andrew Johnson was right, that Lincoln shouldn’t have fought the war, that states have the rights of nullification and secession, that the war wasn’t really about slavery anyway, and a lot of other Confederate mythology that (until recently) had left me asking, “Why are we talking about this?”

By contrast, the concerns of the Massachusetts Bay Colony and its revolutionary Sons of Liberty are never so close to the surface. So no. It’s not a Tea Party. It’s a Confederate Party.

Our modern Confederates are quick to tell the rest of us that we don’t understand them because we don’t know our American history. And they’re right. If you knew more American history, you would realize just how dangerous these people are.

    

 

The Migrant Question

 

In my holiday photos, among the temples and sunsets, are snaps of almost-strangers. Our erudite guide in Burma, a waiter in Sri Lanka who was kind to my young sons, a family of weavers in Kerala we came across in that most awkward of First World meets Third World encounters, “the village tour”. They feel good, such meetings, they assuage liberal guilt: here we are, a universe apart, sharing a moment, getting along.

And then you shake hands, trade warm words, maybe promise to send a photo (but always forget once you’re home). Yes, lovely to meet you. How quaint, you still cook on an open fire. Have my Biro for your little girl. Fascinating to hear how your education was wrecked by the military junta. Now off you walk in the blistering heat to your one-roomed corrugated-iron house with four hours of electricity a day. I’ll take my air-conditioned car to my hotel for a shower then a cocktail and tomorrow I’ll fly back to my comfortable life.

You don’t need to be a bleeding heart, just a sentient person, to wonder why they don’t hate us. Unless you believe a person’s life is decided by karma, it is hard to justify why birthplace should determine fate. And yet that is the first principle of world order: you get the nationality you are given. Unless you are rich or very tough.

But now the waiters and guides and village ladies are massing on our borders. They are drowning in the Mediterranean for our peaceful streets, stacked supermarkets, casual freedoms. They have beached on the holiday island of Kos. They are ripping open onion lorries, squeezing beneath our camper vans. After 11-hour flights from Jo’burg, frozen in the landing gear of planes, they are plunging on to shop roofs in Richmond.

And seeing the scenes in Calais, however much sympathy you have for those displaced by war, fleeing Isis, or who just want — and why the hell not? — what we have, there lurks among our humanitarian impulses an unsettling fear. Oh, the awful, cowardly contradictions of our age! It is possible to feel for the young men who have crossed continents to reach Calais, awe at their endeavour, concern for their squalid camp conditions, but also a powerful, overwhelming desire for them just to go away. Because how many more will come if we are too kind? How will our first-class cabin on Planet Earth feel if invaded by the masses from steerage?

The ironies are everywhere. I read about young Macedonian men, banned from entering the EU on trains and buses, who are cycling. Pedalling hundreds of miles to find work, they have literally got on their bikes, like Norman Tebbit’s dad! Then there are the British second-home owners outraged that migrants are impeding their journey to the sun. One man blamed freedom of movement for allowing these invaders to travel across Europe without passports being checked: the same principle that lets him glide between Kent and his place in Île de Ré.

Then comes the self-serving hypocrisy of whole nations. And there is nothing like migration for turning the EU from transnational superpower to every-government-for-itself mess. France blames Britain for the crisis in Calais yet shuts its own border with Italy at Ventimiglia. Italy refuses to process migrants because it knows it may have to keep them for good. Hungary plans a vast fence around its border. Austria says sod Schengen and has its police demanding papers on trains. The Royal Navy ferries migrants to Italy, but we won’t resettle any because David Cameron knows the first boatload will fuel the EU referendum No campaign. Meanwhile he talks tough about punishing evil people-traffickers for drowning children when really he just wants an excuse to blow up the boats.

Freedom of movement, whatever its benefits for business, has taken us to a political dark place. How much more welcome would gassed, exhausted Syrians be if Britain hadn’t lately admitted 200,000 Romanians and Bulgarians? That they are vastly different cases — war-traumatised refugees versus mainly low-paid economic migrants — isn’t widely appreciated. In the hardening public heart, they are an equal drain upon our housing stock and health service. And since the government cannot stop a single EU citizen arriving, it launches punitive measures against those it can, such as the awful, peevish new plan to kick out all non-EU workers who after five years are not earning £35,000 a year. Which is every blameless Filipino nurse who ever shored up the NHS.

Because in a single year our net population went up by 500,000 people, half of this due to migration. Is that OK? Are the freedom of movement advocates cool about that? Do they have any upper limit at all? We know there is not enough housing, that rough-sleeping rates in London have soared (and that many are arriving migrants), that slum landlords are back, shoving 26 souls in a suburban semi. I see apartment blocks shooting up everywhere in south London (most unaffordable). But where is the new infrastructure: health clinics, commuter trains, leisure centres? Let alone any plan to help disparate people cohere.

Ahead of us lies a decision. Does the EU turn into Australia: blockade Libyan waters with gunboats, outsource processing centres to Africa, move the human suffering and ghastly hypocrisies of our world from sight, build ever higher walls to protect our privileged way of life? Or do we have an honest discussion and plan for a more populous, less comfortable but fairer future? The world is on the move: we must decide.

    

 

The EU Is Doomed

 

Reader, could we strike a deal? Barring any new shock, I promise not to write another column about Britain’s renegotiation of our EU membership until the autumn — and in return will you agree to indulge me just this Saturday?

There’s something I want to get off my chest so that in future you at least know where this columnist is coming from. Because he detests the fruitcake tendency of euro-bores who think Brussels plans to eat our children, you’ve perhaps supposed he’s “pro-EU”? He’s not.

I can’t stand the European Union. It’s probably doomed. The best it can hope for is to limp. It is a bird unlikely ever to fly.

My allergy is to the EU as an institution: almost everything about it irritates, from its bossy mandarins to its greedy MEPs to its indecisive heads of state; from its corruptions to its sniggering, winking rule-benders to its self-lunching grandiosity; from its head-in-the-sand aversion to hard decisions to its itch to regulate trivialities.

Bossy, flabby, broken-backed and pompous, the European Union has a big yellow streak running right through it. Like the Holy Roman Empire, the organisation has become an overhang from the past into the future; and the future is eating away its foundations. If, or when, it collapses the sage “it was always inevitables” will ping across cyberspace like a comet shower. If I’m honest, I’m not even a eurosceptic: I’m a Europhobe. There. That’s better. Got that out of my system like a good spit.

But after expectoration should come calculation. As the semi-detached member-state we British are, as of now, and in all the circumstances, and on the balance of costs and benefits, would it be a good idea for us to walk out tomorrow? Do we want, and will it benefit us, to become (as, mark my words, Britain would become) the nation that killed the European Union?

On balance, I’m inclined to think not. I’d be inclined to let it run into the sand all on its own. To us the EU is a nuisance; it is not an existential threat. It doesn’t actually cost much. We do get benefits, tangible and intangible, from membership, and if David Cameron can tweak things a little further in the direction of common sense, then I’m not persuaded the costs of quitting would outweigh the benefits of staying.

In short, it’s probably best to string along with the EU for the time being but I wouldn’t go to the stake on it. Tens of millions of my fellow-citizens think like this. But we’re open to persuasion the other way if “Europe” gets too big for its boots. Ours is neither a noble, nor a principled, nor even a particularly clear position to take but it does summarise, if not a position, a disposition that is very common. In it there’s a message to both sides of the coming referendum campaign, and to our European partners. All three should realise that precisely because most of us don’t feel all that strongly, the mood could flip.

It’s just possible we may this very week be looking at the beginning of the end of the EU. But to suspect that a political construct must meet its nemesis does not require me or my country to be its nemesis. Rather the opposite: we can relax.

In the period of “renegotiation” ahead there is reason to be relaxed. Either Greece crashes out of the euro or, to hold on to Greece, the EU bends over backwards. And “bends over backwards” will do no justice to the indignity. If Grexit is followed by Greece leaving the EU, then “to lose one country,” Mrs Merkel could echo Lady Bracknell in wailing, “may be regarded as a misfortune. To lose both looks like carelessness.” They’ll be desperate to keep us.

If Greece stays, we must conclude that there is no contortion other member states won’t knot themselves into in order to keep even a mosquito of a country from departing. If an EU prepared to break every rule in the book for Greece is not prepared — for Britain, a powerhouse of the European economy — to nod through a little creative interpretation of the rule book, then we must see this as a constructive expulsion from the European club. And I trust Cameron is ready at some point (not yet) to say so. My guess is that he’ll win this one. I would then vote to stay on in the EU for the time being, grumbling, as is probably our destiny.

It’s not a big ask for Cameron to put to me and millions of my fellow half-hearts, and it’s not a big ask for the prime minister to put to his fellow heads of government. The chances are that he’ll get something resembling a deal, he’ll recommend it, and we’ll take it. Most of the rest will be noise.

Otherwise we’re out, and some years later the EU will disintegrate, perhaps agonisingly. It would be a pity, though — I put it no higher than that — for this to look like Britain’s fault.

    

 

How Debates Shift

 

Within just a few days of Dylann Roof’s racially motivated murder of nine African-American worshippers and clergy in Charleston’s historic Emanuel A.M.E. Church, a sea change appeared to be under way with regard to the Confederate flag — this after decades of tense and slow-moving debate about whether the symbol deserves any kind of place in modern public life.

In short order, the governors of South Carolina and Alabama asked for the flag to be taken down from their respective Capitol grounds, other southern states showed a sudden willingness to reduce the visibility of the flag, and Amazon and Walmart stopped selling it. All this occurred against the backdrop of a loud chorus of online activists arguing that it was time to take the flag down once and for all — a few days after the shooting, the #takeitdown hashtag was tweeted 12,000 times in one day. Why all the sudden movement on an issue that had been a sore culture-war sticking point for decades? Yes, Roof’s massacre was horrific, but it obviously wasn’t the first racist violence to have occurred in a state where the Confederate flag flies.

“The pace of this change has been quite staggering,” said Dr. Jonathan Knuckey, a political scientist at the University of Central Florida who studies southern politics. The why ties into some basic, vital aspects of how Americans’ political opinions are formed and expressed. Foremost among them is the idea that most Americans simply aren’t all that informed about most policy issues, and when they do form opinions, they look around for highly visible cues to guide them toward the “right” opinion. (The notion that most Americans simply aren’t savvy when it comes to politics and policy may whiff of elitism, but it’s also one of the more durable findings in political science — in 2011, for example, about a third of Americans couldn’t name the vice-president.)

Dr. Timothy Ryan, a political scientist at the University of North Carolina, explained that until recently, this was true of the Confederate flag as well. “The typical citizen, if you asked them what they thought about the Confederate-flag issue in South Carolina two or three weeks ago, they would be making up their opinion on the fly in that moment,” he said. “Whereas now people have had some time to think about it, have had a push to think about it.”

As a result of this push, these voters will use whatever available cues come to mind to generate an opinion — a news segment they saw, a recent conversation with a friend. And those who sit somewhere in the middle and who are giving serious thought to the Confederate-flag issue for the first time are awash in anti-flag sentiments, whether delivered via Twitter, on news reports of anti-flag protests, or on radio spots covering Walmart and Amazon’s decision. These days there are tons of cues to draw upon, and very few of them would nudge one to support the Confederate flag.

Perhaps the most potent of such cues is the now-infamous photograph of Roof posing in front of the Confederate flag. “It doesn’t take much to process,” said Knuckey. “It’s kind of one of those gut, visceral, I-don’t-even-have-to-think-about-this-issue [images].” This cue, and others like it, affects voters on both sides of the issue. “The other side of that coin — it becomes a lot more difficult to be for [the flag],” said Knuckey. “Just a month or so ago, someone could have made a perfectly, in their mind, rational argument. It’s the kind of issue now that’s difficult to be in favor of.”

That doesn’t mean that support for the flag is now going to drop to zero, Knuckey emphasized. Ryan agreed. “I bet you haven’t changed so many minds among the people who are really strong, meaningful supporters,” he said. But that’s not the point — the point is those folks in the middle, say, third of the Confederate-flag-opinion spectrum. Those who supported the flag, but just barely, are now seeing all sorts of highly visible cues indicating that the country is turning against them, while those who were just barely against it will have their preference intensified.

The end result? A shift in polling, perhaps (there haven’t yet been any surveys released that allow for apples-to-apples comparisons on the flag issue from before and after the church attack), but, just as important, a group of “antis” who are much more engaged and vocal than they were before the shooting — in part because they’re feeding off the sense that, nationwide, people are moving against the flag. Political scientists call this “preference intensity,” and it’s incredibly important: A minority of citizens who are stridently opposed to a new bill can, in the right setting, “beat” a majority of voters who are slightly for it but don’t care all that much.

To Knuckey, all this negative attention will likely affect not just voters being surveyed, but southern legislators themselves. Those legislators have always been aware that they represent a loud contingent of pro-flag folks, but now, in the wake of the A.M.E. shooting, they have to factor in the existence of a fully engaged, energized, activated group of voters on the other side of the issue as well. So all the negative attention the flag has gotten “makes a vote to take it down easier now than it would have been a month ago,” he said.

In the long run, of course, the AME shooting will fade from the news. And David Paleologos, director of the Suffolk University Political Research Center, which just released a poll showing the nation to be about evenly divided on the question of whether the flag is racist — it was the first time Suffolk had polled on this issue, and results therefore can’t give any sense of the trajectory of opinion on this issue — said that there’s a chance that opinion will bounce back in favor of the flag. That is, fewer cues could mean a reversion to old, less strongly held opinions.

In the meantime, though, what we’re witnessing isn’t just a shift in opinion, but policy changes — albeit minor ones, in some cases — on the part of multiple state houses and huge retailers. Even if public opinion reverts to where it was before the shooting, a new status quo is in place and it’ll be difficult, in those places that have responded to this sudden surge in anti-Confederate-flag sentiment, for the flag to once again be raised — or sold.

    

 

The GOP entertainment auditions

 

On Tuesday, if all goes according to plan, Ohio governor John Kasich is scheduled to announce his presidential candidacy. In any normal year, someone with Kasich’s résumé — ex-investment banker, former Congressman, and moderate two-term governor of the swing state that has decided every presidential election since 1964 — would surely have no trouble breaking through. But this isn’t any normal year. Kasich is set to become the 16th declared candidate to enter the Republican primary, tying the all-time size record. According to the latest Public Policy Polling survey, Kasich will debut in second-to-last place with one percent support. If these numbers hold, he’ll be barred from the opening Fox News debate scheduled for August 6 in Cleveland.

That a popular Midwest governor who was reelected with 64 percent of the vote last year finds himself at the bottom of the barrel is just the latest proof that this year's GOP primary has gone completely off the rails. The grown-ups in the party have taken to blaming Donald Trump for the chaos, but the truth is that the forces are much bigger than Trump's hair. What this year's primary shows is that — at least when it comes to presidential elections — the GOP is at risk of becoming less of a political party and more like a talent agency for the conservative media industry. Jumping into the race provides a (pseudo)candidate with a national platform to profit from becoming a political celebrity. "If you don’t run, you’re an idiot," a top GOP consultant told me.

In the old days, the path to profiting from politics led politicians into the corner offices of banks, corporations, and lobbying firms. Many still go that route. But with her 2008 breakout, Sarah Palin disrupted the GOP nominating process and made being a potential primary contender a full-time job. Her decision to cash in by quitting the Alaska governor’s office for Fox News and tea-party stardom established a new business model. As this year’s ballooning GOP field shows, there are many long-shot candidates who are seeking to follow her path. Since January 2014, Ben Carson has earned as much as $27 million from delivering 141 speeches and publishing three books, including You Have a Brain: A Teen’s Guide to T.H.I.N.K. B.I.G. Former Hewlett-Packard CEO Carly Fiorina made nearly $1 million in speeches last year and published a memoir. Mike Huckabee’s Fox News contract was worth $350,000 a year before he left to join the race, according to sources. This year he also released a book, God, Guns, Grits, and Gravy. Ted Cruz made a reported $1.5 million for his book, A Time for Truth.

These candidates have made six- and seven-figure paydays even before the first ballot is cast. With hours of free airtime on television to promote their brand, their market value is sure to increase. “Even if you lose, you exponentially increase your marketability,” the consultant told me. “Right now, let's say you’re giving speeches for $20 grand. You run and it becomes $40,000. If you do well, maybe there’s a Fox show. Then you write a book about how to save the party. Then you write another about why the next president sucks. There’s a million marketing opportunities."

After Mitt Romney’s 2012 loss, a GOP-commissioned autopsy revealed that voters saw the party as “scary,” “narrow-minded,” and “out of touch.” This year’s reality-show primary significantly complicates Republicans’ efforts to soften the brand for a broader electorate. After all, a candidate seeking to monetize a demographic niche has zero incentive to modulate their message for wide appeal. “The conversation during the primary is driven by self-serving interests and aimed at a certain constituency,” complains another top GOP strategist. “There is no need to be responsible for those particular candidates in language, issue-focus, or anything else since it's not about the overall party.”

The size of the GOP primary fields has paralleled the growth of conservative media. In 1996, the year Fox News launched, ten candidates ran. In 2000, it was 13. This year, the total is likely to reach 17 when former Virginia governor Jim Gilmore gets in next month. “There is a cottage industry that doesn't exist for Democrats,” a GOP strategist told me.

What this means is that, on the left, the political celebrity economy is divided along the same unequal lines as the real one. The Clintons, with Bill’s multi-million-dollar speeches and Hillary’s $14 million book advance, are the one percent. Beyond them, there’s no functioning market that would reward a bunch of candidates for contesting their monopoly. “The institutions don’t exist,” Bob Shrum, the veteran campaign strategist, says. “We don’t have a network dedicated to giving people a place to go.” Sure, there's MSNBC, but the channel reaches a much smaller audience than Fox. Liberal talk radio bombed with the demise of Air America. And liberal books, by and large, don't sell like conservative titles do. Right now Ted Cruz's book is on the Times best-seller list. Is anyone dying to read a new release from Martin O’Malley?

The disparity between the size of the two primary fields is driven by political and structural forces. The rise of billionaire donors and super-PACs enables more fringe GOP candidates to fund their campaigns. Conservatives’ palpable sense of cultural victimhood encourages them to embrace (and reward) their former candidates even if they lose badly. “The people on the right are heroes to their supporters, and that’s how their books sell,” Shrum says. And conservatives who promote free-market gospel on the lecture circuit can get easily booked by deep-pocketed corporations who benefit from their message. "A bank is never going to hire Bernie Sanders to speak, but it might hire Rick Perry," says one GOP adviser.

In at least one way, it's ironic that Republicans are now fretting that their media-driven primary is damaging the party's electoral prospects. They are, after all, the party of the free market. What is more free than a candidate earning millions from the primary process?

    

 

Counter Extremism

 

The status quo is not working. More than 700 British-born Muslims have travelled to Syria to help to build an empire based on murder. An emissary of that empire shot dead 30 Britons on a Tunisian beach last month. The challenge posed by Islamic State demands a no-holds-barred response on many fronts. The lesson of Islamist terrorism is that when its world view goes unchallenged and its recruiters go on recruiting, innocents die. Mr Cameron is right that this is the struggle of our generation. Whether as victims, perpetrators or numbed bystanders we are indeed in it together, but if it is to end it will have to be the Muslim mainstream that does the heavy lifting.

The government’s proposals include banning orders for extremist groups; “extremism disruption orders” for individuals radicalising young people; and a new role for Ofcom tackling foreign broadcasters who give hate preachers a platform.

They will come before parliament in an extremism bill, probably this autumn. They will intensify police scrutiny of certain mosques and of courses such as the men-only one in Rochdale, on which we report today, on “understanding the caliphate”. Inevitably they will attract the criticism that the state is limiting free speech in its assault on hate speech, but the distinction between the two is not new. It is a distinction that may have to be redefined, but that does not mean it cannot be fairly policed.

In particular the government intends to pursue non-violent extremists who until now have operated just inside the law. Islamic leaders will be expected not just to condemn Isis-style atrocities but to state clearly that their supposed theological justification is simply wrong. Crucially, the new law will tackle head-on pernicious conspiracy theories spouted by bigots and swallowed whole in isolated pockets of Muslim Britain — theories that blame Jews for 9/11 and the British government for 7/7 and define the world view of a lethal minority bent on repeating such “spectaculars”.

Yesterday’s speech was wide-ranging and ambitious but underpinned by simple logic. Its starting point was that Islamist extremism is a cult. It is a cult that invokes Islam but is defined by bombings, beheadings, torture and defenestration. Mr Cameron used the speech to address potential Isis recruits directly. “You are cannon fodder for them,” he said. “If you are a boy they will brainwash you, strap bombs to your body and blow you up. If you are a girl, they will enslave you and abuse you.”

It needed saying. Strip out jihad’s spurious claim to religion and the reluctance of some civil libertarians to see the full force of the law applied against it looks ridiculous. There is a practical risk that a crackdown on extremists will be misconstrued as a crackdown on moderates who worship the same God, and this is a risk that must be constantly reassessed. What is right in the struggle against Islamism is what works.

    

 

Lottery Reward For Voting

 

Los Angeles is tackling electoral apathy by indulging in a trade typically frowned upon in democracies — cash for votes.

When an unknown caller phoned Ivan Rojas last week and told him he had won $25,000 he assumed that he was being scammed, but in fact the 35-year-old security guard had become the first beneficiary of a scheme designed to energise some of America’s most reluctant voters. Although he didn’t know it, by casting a ballot in a local election to appoint school officials Mr Rojas had entered a competition to win a cash prize. “I was shocked. I couldn’t believe it,” he said.

The object of the scheme is to motivate voters — and particularly Latino voters, who have a historically low turnout — in a state once notorious for its fervent embrace of “direct democracy”.

Californians are regularly called upon to vote on a baffling array of issues. They can eject elected officials through recall votes, and they can reject acts passed by the state legislature through referendums. Through ballot initiatives they can even write their own laws — an option they have exercised on cannabis regulation and property taxes.

Of late, however, this bombardment of direct democracy appears to have turned the voters of Los Angeles off. Turnouts below 10 per cent have not been uncommon in local elections.

The organisers of the inaugural “voteria” competition (derived from lotería, the Spanish word for lottery) believe that it boosted turnout — albeit only to about 10 per cent — and further experiments are planned. A survey found that about 16 per cent of voters knew about the prize. Of those, a quarter said it had made them more likely to cast a ballot.

Across the US, voter engagement seems to be slipping. Last year’s national congressional election turnout was 42 per cent — the lowest since at least 1978. Latino voters, meanwhile, are particularly unlikely to vote. In the 2010 midterms, only 31.2 per cent of Hispanics voted, compared with 44 per cent of blacks and 48.6 per cent of whites.

In the 2012 presidential election, a record 11.2 million Hispanics voted, but that still came to only 48 per cent of those eligible — a dip from 49.9 per cent in 2008, and lower than the turnout of blacks (66.6 per cent) and whites (64.1 per cent). The overwhelming majority of Hispanic voters are historically Democrat supporters. Yet the prospect of voters across the nation being tempted by cash prizes in presidential elections is some way off.

Indeed, some believe that cash prizes only cheapen democracy. “The voteria underscores the cynical view that people don’t care about their local government any more and the only way to get them to vote is to bribe them,” the Los Angeles Times said.

Others claim that encouraging people to vote for money, when they do not feel engaged with the issues, can lead to entirely random results.

In the Los Angeles election in which Mr Rojas voted, analysts believe that the influx of Latino voters may have swung the ballot in favour of the victorious candidate, Ref Rodriguez. Residents who had heard of the voteria contest were more likely to cast their ballot for Mr Rodriguez by a margin of two to one, according to academics at Loyola Marymount University, quoted by the Los Angeles Times. Without the prize on offer, the vote may have gone to a recount.

    

 

Perot Didn't Cost Bush Re-election

 

As Donald Trump flirts with an independent candidacy for president, one of the most enduring political myths of our time is returning to the surface.

Surely, you’ve encountered the claim recently – that Ross Perot’s third party bid in 1992 cost George H.W. Bush a second term and allowed Bill Clinton to win with a mere plurality of the vote. Typically, it’s invoked to underscore Trump’s potential to play spoiler next year, draining critical support from the Republican nominee. Once again, we are told, a Clinton may end up securing the White House by default.

But the comparison is bogus. Yes, Perot did rack up a significant share of the vote in 1992 – 19%, the best for an independent since Teddy Roosevelt in 1912. But there’s never been a shred of evidence that his support came disproportionately from Bush’s column, and there’s considerable evidence that it didn’t.

Let’s start with the basics. Clinton was elected with 43% of the vote, to Bush’s 37.5%, a difference of nearly six million votes. To overtake Clinton in a two-way race, then, Bush would have needed to gain the lion’s share of the Perot vote, about two-thirds of it. But in the exit poll conducted on Election Day, just 38% of Perot’s backers said Bush was their second choice. Thirty-eight percent also said Clinton was. “The impact of Mr. Perot’s supporters on the campaign’s outcome,” wrote The New York Times, “appears to have been minimal.” The Washington Post’s conclusion: “Ross Perot’s presence on the 1992 presidential ballot did not change the outcome of the election.”

This is only part of the story. The Perot campaign was a soap opera-worthy saga that played out in multiple acts, and in each one there was no indication that he was disproportionately hurting Bush.

If anything, he started out as Bush’s ace in the hole. The Perot phenomenon kicked off on February 20, 1992, when the Dallas billionaire told CNN’s Larry King he would run for president if volunteers placed his name on the ballot in all 50 states. Perot had never before sought office, but he had folk hero status thanks to the daring rescue he’d engineered when some of his employees were trapped in Revolution-era Iran. That story was made into a movie, with Richard Crenna playing Perot.

A national grassroots mobilization ensued and Perot moved up in the polls – fast. By the late spring, he was running in first place. An ABC News/Washington Post poll in early June gave him 36% to Bush’s 30%, with Clinton back at 25%. Pundits teased the possibility of a deadlocked election being thrown to the U.S. House. Others wondered if Perot might just win outright. There had been serious independent candidacies in the recent past, like John Anderson in 1980 and George Wallace in 1968, but none had gained this kind of traction. It was a volatile and unprecedented situation.

But it was also clear at that moment that the main beneficiary of Perot’s rise was Bush, who was presiding over a dismal economy that only seemed to be worsening. That same ABC/Washington Post poll logged the president’s approval rating at just 35%. His rating for economic performance was even lower and unemployment was on the rise; it would spike to 7.8% by the middle of the year. In another June survey, only 33% of voters said Bush deserved a second term. Sixty-one percent said he didn’t. By every available metric, Bush was a profoundly vulnerable incumbent.

So Perot was doing him a huge favor: He was splitting the anti-Bush vote and cutting deeply into what should have been Clinton’s base. That ABC/Washington Post poll in early June found that among Democrats, Clinton was barely running ahead of Perot, 43% to 39%. Overall, 47% of Perot’s backers said Clinton was their second choice, compared to 31% for Bush. “[T]he poll suggests that Perot is now hurting Clinton much more than Bush,” the Post wrote.

This had to do with something that is often forgotten these days. The Bill Clinton of the spring of ’92 was regarded as a fatally damaged candidate who was doomed to lead his party to yet another national defeat. Against weak opposition, he’d endured a sex scandal and revelations of possible Vietnam draft-dodging to win the Democratic nomination. Already, Democrats had lost three consecutive presidential elections by landslides, and Republicans were widely thought to have a “lock” on the Electoral College. Clinton’s personal unfavorable rating was alarmingly high and the number of voters who called him dishonest was through the roof. Surely, even members of his own party had come to believe, the feared Republican attack machine would destroy him in the fall.

This was Act I of the Perot ’92 campaign: a stunning surge to the top fueled by voters who badly wanted Bush out but who also couldn’t stomach Clinton. This would also end up being Perot’s peak, because in Act II came a Clinton revival and a Perot crack-up.

You’ve probably seen the clip at some point, Bill Clinton sporting a pair of sunglasses and playing “Heartbreak Hotel” on his saxophone on Arsenio Hall’s late-night show. It was one of numerous moments that spring that prompted Americans to give the Democratic candidate a second look and to discover the warmth and charisma that are taken for granted these days. Clinton’s poll numbers improved and Democrats began returning home. At the same time, Perot was treated to intense media scrutiny that he’d never before encountered, and he didn’t hold up well. He also created controversy, like when he addressed the NAACP convention and referred to his audience as “you people.”

By the end of June, Clinton had the lead, which started to grow. A poll released on July 16 gave him 42%, with Bush at 30 and Perot far back with 20. That same day Perot, battered by negative coverage and furious with the media, called a press conference in Dallas and abruptly withdrew from the race. The Democrats were holding their convention that week and Perot said he now believed they had “revitalized” themselves under Clinton, who delivered his acceptance speech that night.

October also brought with it Act III of the Perot drama: his sudden re-entry. Initially, his support was low, owing to the bizarre and unnerving way he’d ended his campaign over the summer. Polls showed him in the single digits, drawing about evenly from the other two candidates. He was included in the debates, though, where he staged several breakthrough performances, winning back many former backers with his populist spirit and folksy one-liners. His support climbed into the teens and on Election Day nearly hit 20%.

In that last month, Bush narrowed the gap against Clinton from 16 points to the final margin of 5.5, and he did this even as Perot’s support rocketed up. But if Perot was stealing Bush’s supporters, how could this have happened? The answer, of course, is that Perot wasn’t drawing disproportionately from Bush at all. He represented a funky electoral coalition that included as many Democratic-friendly voters as Republican-friendly ones, and quite a few who’d previously been non-voters.

So where does the Perot myth come from? Some of it has to do with the very real anger that existed toward Bush on the right in ’92. Pat Buchanan, running on a platform that combined nativism and economic nationalism, embarrassed Bush with a strong showing in the New Hampshire primary and accumulated more than three million votes in the Republican primaries. These Buchanan voters are often seen as the backbone of the Perot movement, and certainly there was plenty of overlap. But Perot’s coalition also included frustrated Democrats who’d voted for Paul Tsongas in his primary campaign against Clinton. The centerpiece of Tsongas’s campaign was the exact same as Perot’s: a frantic warning about the nation’s skyrocketing debt.

It’s also true that Perot nursed an intense personal grudge against Bush. This made his campaign feel like a vendetta and may explain why when Bush was asked about Perot three years ago, he replied, “I think he cost me the election and I don’t like him.” On top of this, there’s the simple fact that attributing Clinton’s win to Perot made it easy for Republicans to dismiss his victory as a fluke. A generation later, it remains an article of faith on the right that Clinton only won because of Perot. And with so many voices repeating this claim so casually, it’s a claim that also seeps into mainstream accounts of the ’92 campaign.

This all helps explain the endurance of the Perot myth. There are also those who argue that Perot harmed Bush less directly, by ratcheting up the country’s anti-Bush mood. But this is a reach. Even before Perot stepped forward, Bush’s approval rating had fallen below 40%, and a poll taken the same week Perot went on “Larry King Live” found that a staggering 73% of Americans thought Bush wasn’t doing enough to improve the economy. All of the evidence indicates that Perot was a symptom of Bush’s weakness – not the cause of it.

Now consider Donald Trump for a minute. This week, a poll showed that in a two-way race, Hillary Clinton would defeat Jeb Bush by six points. With Trump added to the mix, though, her margin explodes to 16. This is clear and obvious evidence that Donald Trump, at least for now, would do serious harm to the GOP as an independent candidate. And now think about this: Never in the entire saga of Ross Perot’s candidacy – not in Act I, not in Act II, not in Act III – was there even a single poll that showed Perot doing the same thing.

    

 

Harnessing Anger

 

Ask Labour MPs how they are feeling at the moment, and one word comes back again and again: depressed. Nobody commits their life to politics to sit in opposition, and yet that is the only prospect they see ahead for years if not decades to come. Ask Labour members, activists and committed trade unionists how they are feeling, and you get a different answer: they are deeply angry.

That disconnect reminds me of a wealthy football club suffering a humbling defeat in the FA Cup just when the path to glory looks clear. The players and manager speak of disappointment, lessons learnt and a determination to put things right. But the fans want blood; at the very least an apology. They feel betrayed by the people who purport to represent them but do so with such little conviction, skill or success. Labour’s activists feel the same.

These are the individuals who pay their £46 membership then give up their evenings and weekends to hand out leaflets in the street, or go knocking on doors asking people how they intend to vote. They were the ones who warned that Ed Miliband was coming up as a problem on the doorstep even in solid Labour areas, and were told by his campaign chiefs to keep that feedback to themselves. For them, the election was there to be won against a manifestly unpopular coalition, and instead they watched lifelong working-class Labour voters turn away to Ukip, the SNP, the Greens, or even — horror of horrors — the Tories.

The local activists did not experience rejection at a national level like Miliband or at constituency level like Ed Balls. They experienced something far worse: their own workmates, neighbours and even their own mums, dads, siblings and spouses telling them to their face that they couldn’t bring themselves to vote Labour.

They would argue the toss with their friends and family and repeatedly get the response: “I agree with you. I think you’re right. And if you were in charge, I’d vote Labour. But that’s not what the lot running your party stand for any more. They don’t stick up for me. They don’t understand what life’s like round here.” And in the minds of those members and activists that alienation of ordinary working people who should ordinarily be voting Labour was the reason the party suffered its devastating defeat.

It is not exactly political science to work out why these individuals and their constituency parties have been flocking to support Jeremy Corbyn’s candidacy for the Labour leadership. Quite simply, he understands and articulates their anger in a way that none of his rivals has even tried to do, and — typical of the man — does so with great authenticity, intelligence and principle, and with solid and interesting policies to back it up.

He will not appreciate the comparison but, as young Labour activist Abby Tomlinson observed last week, Corbyn is having an insurgent effect similar to that of Nigel Farage during the last parliament. For a long period Farage successfully harnessed the anger and alienation of millions of people across a huge spectrum of issues. It hardly mattered whether he would ever get the chance to do anything about them; at least voters appreciated that he understood and shared their anger.

The more Ukip’s activists were dismissed by David Cameron and others as fruitcakes, loonies and clowns, the more it reinforced the impression that the modern Tory leadership simply didn’t understand the anger among much of its core support. Last week we saw Tony Blair and his acolytes making exactly the same mistake, dismissing Corbyn’s supporters with abuse, mockery and condescension, and treating the man himself — an MP of 32 years’ standing, liked by his Islington North constituents — with sneering contempt.

If the Blairite faction does not realise that dismissing and insulting the Corbyn campaign is counterproductive, its members can ask the current prime minister. He finally got the message on Ukip around the time of the 2013 local elections, when the Tories were pushed to their lowest vote share since 1982. By the time of the general election two years later, Cameron had developed a more effective argument: if you were simply angry about Europe or immigration you could vote for Farage, but if you wanted something done about it you had to vote for him. The Corbyn bubble will not burst until one of his rival candidates embraces that same strategy.

My advice to them: listen to why the activists are angry. Don’t tell them your analysis of what went wrong in May, ask for theirs. Apologise for being part of the team that let them down and explain why you’re best placed to make things different next time round. If they don’t take it, the MP for FA Cup holders Arsenal may soon be celebrating an altogether less likely triumph.

    

 

GOP and Fascism

 

In the political discussion of today, there always comes a risk of being discounted as a crackpot when using a word like “fascist” to describe a political opponent. The word, much like “socialist,” has been so abused since the fall of fascism that it lost its meaning quite some time ago. Comparisons of modern leaders to Hitler tend to be completely void of any substance, and there is even an Internet adage, “Godwin’s law,” that says, “As an online discussion grows longer, the probability of a comparison involving Nazis or Hitler approaches one.”

In a recent article by Jeffrey Tucker, however, it is argued, quite justly in my opinion, that Donald Trump, whether he knows it or not, is a fascist (or is at least acting like one). Much like Mussolini and Hitler, Trump is a demagogue dedicated to riling up the people (particularly conservatives) with race baiting, traditionalism and strongman tough talk — and, according to polls, it’s working — for now. Tucker writes:

“Trump has tapped into it, absorbing unto his own political ambitions every conceivable resentment (race, class, sex, religion, economic) and promising a new order of things under his mighty hand.”

No doubt about it, Donald Trump has decided to stir the pot, and, as Tucker says, he seems to be running for a CEO position, rather than president of a nation. Trump discusses Iran and Mexico as if they were competing corporations, and says that, as president, or CEO, he will drive them into the ground, make them file for bankruptcy — something Trump legitimately knows a thing or two about. Trump, of course, is largely taken as a joke, and most rational commentators assume he is doing this for publicity — which he is certainly getting.

The thing is, his style — full of race baiting, xenophobia and belligerent nationalism — is not unique to Trump; he is simply the most blatant and vocal about it. There’s a reason he’s leading in the GOP polls: the party’s base likes what he’s saying. The people are angry about illegal immigrants murdering white women (anyone who has followed Bill O’Reilly over the past week knows what I’m talking about), homosexuals destroying the tradition of marriage, and so on. Much like fascism reacted to modernity and social progress in the early 20th century, right-wingers are reacting angrily to social progress of the new century. (Of course, there has been no economic progress, which is why the left is also angry.)

So is the GOP becoming the new fascist party? That might be an exaggeration, but it does share many similar features, and Trump, with his demagogic style, is simply exposing how very similar the passions of the GOP base are to the passions of fascism of the early 20th century.

The modern GOP is a party of unwavering and dogmatic patriotism mixed with traditionalism and intolerance. The social progression we have been witnessing over the past decade in America, most clearly with the acceptance of the LGBT community, seems to be triggering a reactionary movement on the right. We see this most recently with the religious freedom controversies and the angry protests of the Supreme Court’s gay marriage ruling. Fascism of the early 20th century was also largely a negative reaction to modernity (in a social sense at least; fascists did tend to worship technology). Communism, which was the ultimate evil to fascists, promoted the destruction of traditional institutions such as the family, the bourgeois state and organized religion. In some ways, fascism was the conservative answer to communism — the defender of tradition.

    

 

The Politics of Anger

 

There are certainly some questions that should be giving our politicians sleepless nights, even if they make it to their summer retreats. Will the deal between Greece and its creditors hold, and if not, how will that affect the rapid timetable David Cameron and George Osborne have in mind to renegotiate Britain’s membership of the European Union? What fresh atrocity is Islamic State planning to keep themselves in the spotlight over the summer, and is it British citizens on holiday overseas who are most at risk from their attacks, or are they capable of hitting the UK mainland?

Will the Chinese government’s extraordinary intervention in the nation’s stock markets stop the recent rout in shares, or is it just increasing the height of the precipice from which the world’s second-largest economy will eventually fall?

And for Labour, can Jeremy Corbyn really win the leadership race, and if so, are the party’s MPs serious about their anti-democratic threat to oust him immediately? If there are no easy answers to those questions, it is because, as screenwriter William Goldman famously said, “Nobody knows anything.” Goldman was speaking about how to divine which Hollywood films would succeed at the box office, but the same is now true of our politics.

Imagine if just a year ago, you had predicted that, by this summer, Labour would have only one MP left in Scotland, the Liberal Democrats would have just eight seats nationwide, George Osborne would be joint-favourite to be the next prime minister, and Jeremy Corbyn would be the pick of the pollsters to be facing him across the dispatch box. People would have thought you were stark staring mad, and yet now these occurrences seem rational, or at least explicable.

The rise of the unpredictable is not restricted to the UK, or Greece or China. In Spain, as recently as November 2011, the two parties that have dominated politics since the death of Franco were collecting 75 per cent of the vote between them in opinion polls. Now, they are down to 50 per cent, and two parties that did not even exist at national level two years ago have amassed 32 per cent of the vote.

A year ago, rumours of Donald Trump standing for the US presidency were dismissed as yet another publicity stunt from a figure of fun. But just like Jeremy Corbyn, no one is laughing now, with the latest polls showing Trump surging ahead of Republican rivals Jeb Bush and Scott Walker.

What explains this surge in volatility? It is tempting to call it a reflection of our modern, fast-paced, click-button culture, where the same impatience and restlessness that makes it impossible for people to hold a five-minute conversation without checking their phone is also feeding into our attitude to politics.

In that context, worthy but dull politicians do not stand a chance; policies that cannot be distilled into a 140-character tweet are worthless; and the long-standing loyalties of individuals, families or constituencies to one political party or another mean nothing.

To have any chance of appealing to a jaded, alienated, often angry electorate in those circumstances, modern politicians must first be able to get their attention. That partly explains the attraction of entirely new parties such as Podemos in Spain. It is also fertile ground for charismatic characters such as Boris Johnson, Nigel Farage or Donald Trump, or politicians unafraid to voice strong, unconventional and radical opinions such as Jeremy Corbyn or Hillary Clinton’s socialist rival for the Democratic presidential nomination, Bernie Sanders.

However, we can only blame modern culture so far. The history of the 1930s tells us that you do not need YouTube and Twitter to experience a surge in political volatility, the success of new parties, the rise of charismatic leaders with radical opinions, or indeed the self-destruction of the Labour party.

What the 1930s and the current decade have in common is the economic backdrop: the unwinding and unresolved consequences of a massive shock, permanently undermining faith in existing political and financial institutions to look after our economies in the interests of the people.

The global financial crisis has given way to an era of seemingly permanent loose monetary policy, and if that ends badly — as history tells us it will — then the political volatility we are currently seeing will increase still further and sweep away even more of our old certainties.

Indeed what makes the current decade potentially even more volatile than the 1930s is that it is not just Europe and America facing this threat, but China, Brazil, and every other emerging market economy, any of whose financial problems could themselves trigger another global crisis.

If that crisis comes, the timing could be very interesting for our unlikely bedfellows, Messrs Corbyn and Trump. Jeremy Corbyn is the only Labour leadership candidate talking in radical terms about the unresolved problems in our economy, while Donald Trump has warned that America faces such “financial ruin” in the years ahead that it will make a mere recession seem like a “nice” outcome.

If Corbyn wins the Labour leadership and Trump wins the Republican nomination, they could find themselves uniquely well-placed to lead public anger in Britain and America over any fresh crisis, and capitalise on the demands for change that will result.

Those Tory and Democratic strategists relishing the prospect of facing Corbyn and Trump could find themselves having to eat their words. And if that seems inconceivable, remember that the only guarantee in modern politics is unpredictability.

    

 

End Times and the Religious Right

 

One of the creepiest aspects of contemporary American politics is the unholy alliance between the Christian right and Israel. It’s uncomfortable because the religious right’s affinity for Israel is tied to a rather disturbing fever dream: Israel’s destruction. Many evangelicals are utterly convinced that every addition to the sum of suffering in the Middle East is but a sign of the end times, of Christ’s return.

They’re convinced because they interpret foreign affairs through the prism of Bronze Age biblical prophecy. Without getting bogged down in the colorful details of Christian eschatology, the story runs something like this: In order for Jesus to return and establish his Kingdom, the state of Israel must first be conquered by an invading army (preferably Persian or Arab) – because God says so. The unfortunate part (if you’re Jewish, at least) is that before Christ descends from the clouds, a holocaust of sorts must occur, resulting in the deaths of two-thirds of Israel’s people. For certain Christians, then, Israel must exist as a state (which is why they defend it so passionately), but it must also suffer immensely so that Christians can escape physical death in the form of the Rapture.

This is the rather sordid truth behind the Christian right’s love affair with Israel. In a recent interview on the radio program “Understanding the Times,” Michele Bachmann articulated the raw insanity of this position with perfect clarity. Bachmann called the nuclear agreement with Iran “the most important national security event of my lifetime.” Not for geopolitical reasons, of course, but because it fulfills God’s prophecy: “All the nations of the world signed an agreement that slams the door against Israel.” Even better, she continued, the agreement prepares the way for Israel’s ruination “with the United States leading that charge.” Which apparently is also part of God’s plan, according to Bachmann’s interviewer, Jan Markell, who expanded on Bachmann’s observation with the following (per Right-Wing Watch):

“There are consequences to doing things like this against God’s covenant land, there are horrible consequences,” Markell said. “Then you throw in some other things such as the Supreme Court decision back in late June and a lot of other things. Judgment isn’t just coming; judgment is already here.”

The prophecy to which Bachmann and Markell refer, as Scott Eric Kaufman noted earlier this month, is Zechariah 12:3, which describes “all the nations of the earth” gathering against the state of Israel.

It’s probably not worth unpacking any more of this lunacy. The broader point is that people like Bachmann (and many other Republicans) really believe this stuff. Indeed, there’s a significant subset of the GOP that advocates for Israel on purely theocratic grounds: They yearn for the apocalypse. These people fancy themselves patriots, but they’re gleefully subordinating American foreign policy to religious dogma in order to hasten the End Times.

Said Bachmann during the same radio interview: “The prophets longed to live in this day that you and I are privileged to live in.” While the former Minnesota congresswoman is uncommonly honest about her beliefs, she is certainly not alone. It’s no accident that prominent Republicans are eager to champion Israel’s interests over our own: Their religious base demands it.

This madness is a political problem. At minimum, a state’s foreign policy is guided by the pursuit of self-interest. It’s true that Israel and America are allies (as they should be), but only so long as our interests align. The religious right doesn’t see the world in these terms, because they’re not interested in living peacefully on this planet. They’re drunk on otherworldly fantasies and, unfortunately, they also vote. Which is why Republicans consistently side with Netanyahu over Obama whenever there’s a legitimate conflict of interests: The base isn’t concerned with worldly things like peace and security and diplomacy – only prophecy. To the extent that people in office are animated by beliefs like this, our foreign policy will be misguided at best, suicidal at worst.

Michele Bachmann is Exhibit A in the case for purging religion from politics. Anyone using the Bible as a basis for contemporary foreign policy can’t be trusted with that kind of authority. Individuals are free to believe whatever they want, but the people with a grip on the levers of power aren’t – they have a worldly responsibility that requires a connection to terrestrial reality.

If you’re “encouraged” by the apocalypse, as Michele Bachmann is, you’re unqualified.

    

 

Rules Are Not Fixed

 

Rules are rarely universal constants or unchanging received wisdom. We're frequently told that an invented rule is permanent and that it is the way that things will always be, only to discover that the rule wasn't nearly as permanent as people expected.

We've changed the rules of football and baseball, many times. We've recognized that women ought to have the right to vote. We've become allies with countries we fought in World Wars. We've changed policies, procedures and the way we interpret documents and timeless books.

This is not weakness, nor is it flip flopping. Not all the changes are for the better, but the changes always remind us that cultural rules are fluid. We make new decisions based on new data. Culture changes. It has to, because new humans and new situations present new decisions to us on a regular basis. Technology amplifies the ever-changing nature of culture, and the only way this change can happen is when people decide that a permanent rule, something that would never, ever change, has to change. And then it does.

(more good ideas from Seth Godin)

    

 

Drone Warfare

 

One night in November 2004, outside the Iraqi city of Fallujah, a US marine patrol was warned by radio that a Predator was about to launch a strike in its area. The men had scarcely heard of drones, and one asked in bewilderment: “What’s a Predator?” They sat peering fascinated through their night vision goggles in anticipation. Sergeant Matthew Mardan wrote later: “We could hear a buzzing about a kilometre up in the air, you could hear the thing but couldn’t see it. A Hellfire missile hit the target.”

This was one among hundreds of such assaults launched against Sunni insurgent leaders, trainers and command and control facilities by the US joint special operations command headquarters. Mardan said: “Our platoon commander goes: ‘That’s the future, gents. That’s what’s probably going to replace all of us’.”

Eleven years on from that incident, David Cameron made headlines with his announcement last week that in August an RAF drone killed Reyaad Khan, a Cardiff-born jihadist who was allegedly directing from Syria a terrorist plot to kill the Queen, and a second British man, Ruhul Amin, from Aberdeen.

Unmanned aerial vehicles (UAVs) are fast becoming the weapons of choice for western nations engaged in struggles against non-state enemies. They seem to carry none of the weighty political baggage associated with committing ground troops, and have been extensively employed in Pakistan, Yemen, Somalia, Iraq, Afghanistan — and now Syria. Yet they raise important moral and legal issues that western governments have scarcely begun to acknowledge, still less to address.

They are also a conspicuous manifestation of the ascent of remotely directed machines towards what is almost certain to become a dominance of later 21st-century battlefields. More than 2,000 years ago the historian Thucydides described war as “the human thing”. General George Monck wrote in the 1660s: “The chief parts of a soldier are valour and sufferance, and there is as much honour gained by suffering in war as by fighting valiantly.”

Yet the men who pushed the buttons that killed the British jihadists in Syria were controlling Reaper drones from a cabin at the RAF base at Waddington, Lincolnshire. Although airmen emphasise the immense psychological stresses on drone pilots, however remote from the battlefield, they are exposed to no hazard or physical discomfort. After finishing their shifts they could stroll through the August sunshine for a pint in the local.

There is no glory in the new form of warfare of the kind the dambuster Guy Gibson or the men who fought at Rorke’s Drift were deemed to have won. But then, perhaps there was never as much in the old one as romantics sought to pretend.

The drone era started some 30 years ago with one remarkable man: Abraham Karem, an Israeli aeronautical engineer who moved to California and began work in his garage on a revolutionary concept. At that stage, his objective was simply to create a new vehicle for aerial surveillance. Karem secured a Pentagon contract for his research, and the first prototypes of his Amber and Gnat designs, featuring the inverted V-tail which became a signature feature, flew in 1986. With the end of the Cold War, however, the project was frozen, and in 1990 Karem’s company went bankrupt. For a time the Pakistanis, who had heard about the project, considered buying its prototypes. Then the CIA and Pentagon decided that, with or without a Soviet threat, they could use the drone concept.

They gave a research contract to a secretive company named General Atomics, owned by two rich, outlandish brothers with form in the spook world, aeronautics and nuclear physics, named Neal and Linden Blue. The Blues, who modelled their aviation business on that of Howard Hughes in the 1930s, hired Karem and his key personnel. They became creators of the first US operational drone, the Predator, and of its successor, the Reaper, which have generated vast though undisclosed profits for the brothers, now in their late seventies.

The research and development costs are secret, but the Predator construction programme was budgeted at $2bn (£1.3bn), the ongoing Reaper at $12bn. In the late 1990s unarmed Predators were field-tested over the former Yugoslavia, and their early glitches and shortcomings were ironed out.

The CIA and US military then embraced the potential of drones not merely for observing the enemy, but also for killing him. Before the end of 2001, the agency was routinely firing Predator-mounted Hellfire missiles at selected targets in Afghanistan.

Chris Woods, a historian of America’s drones, has called the new weapons system “the world’s first airborne sniper rifle”. General David Deptula, between 2006 and 2010 a key figure in the US air force, said: “We’ve spent the past 100 years trying to figure out how to hit any target anywhere on the surface of the earth, all weather, day or night, rapidly and with precision. Guess what — we can do that.”

The virtues of the drone are, first, its endurance — a capability to loiter over the battlefield at low speeds for up to 24 hours. One of the defining aspects of every modern military operations room, conspicuous in Iraq and Afghanistan, is an array of wall-mounted screens providing constant television images of chosen sectors of terrain, with power to zoom in on vehicles and people below, although current drone cameras cannot identify faces.

Presidents and prime ministers are vulnerable to the lure of “Pred-porn”, demanding to see real-time pictures of operations in progress.

Second, Hellfire missiles inflict far less “collateral damage”, to use the US military’s phrase for civilian casualties, than conventional bombing.

Between 1965 and 1973, the US air force launched half a million sorties against targets in Cambodia and Laos, dropped 4.7m tons of bombs and inflicted at least 100,000 civilian deaths, as well as an unknown number of North Vietnamese and Khmer Rouge casualties. Woods estimates that CIA drone strikes in Pakistan, Yemen and Somalia between 2002 and 2014 dropped just 250 tons of explosives that caused 500 civilian deaths, while imposing devastating attrition on the leadership and operatives of al-Qaeda — and now those of Isis in Iraq.

Today only three nations are significant users of operational drones: the US, Israel and Britain, which possesses two squadrons of Reapers, each aircraft twice the size of a Predator. The American military controls Reapers in operational areas where they are supporting US armed forces. The CIA conducts its own programme of targeted killings around the world, which is not accountable to Congress, and arouses rising concern among libertarians.

Over the past decade, the Americans have “taken out” hundreds of al-Qaeda personnel through drone strikes, in a fashion that would make the front pages if its human agents were doing the business with bullets.

Now Britain is joining in, and we should not doubt what a new departure this represents. Contrary to the James Bond myth, during the entire history of Britain’s secret services, only very rarely have their agents deliberately killed the nation’s enemies.

It is futile to lament UAVs, because it is evident that they represent the future. The USAF now has more pilots for its Reapers than for its F-16 fighters. Scepticism about the vast expenditure on the RAF’s £17.6bn Typhoon programme and the cost of buying 14 American F-35s for the Royal Navy’s carriers (£2.5bn including support costs) was prompted partly by consciousness that the manned combat aircraft era is approaching its end.

UAVs, of ever-growing sophistication and power, are the future, inflicting death and destruction on the enemy at no human cost to one’s own side, and indeed at only a fraction of the cost of planes flown by human pilots in cockpits: Reapers retail for only around £10m each, though they are too vulnerable to operate in “high-threat environments”. The unmanned F-35s currently under development will cost as much as the piloted variant.

But it seems important to pause for a moment before welcoming this prospect of war with no “butcher’s bill” among our own forces. What happens when the other side uses drones, as it assuredly will? They are already the toys of children and hobby geeks. It is only a matter of time before terrorists employ them against us. In the hands of al-Qaeda, drones carrying even crude explosive charges represent more immediate and credible threats than dirty bombs.

Moreover, when the prime minister sombrely tells the House of Commons that British Reapers were used against jihadists in Syria to protect this country against an identified threat, we might ask a further question: what would be our attitude if the Russians or Chinese began to use UAVs to kill their enemies in Ukraine or Taiwan? Such action would prompt an international crisis. This is why President Vladimir Putin denounces what he considers the double standards of the West.

There is a temptation for British people merely to mutter “good riddance” about the liquidation of jihadists, enemies of our culture and potential threats to our society. But there are rising demands on both sides of the Atlantic for the codification of rules for the employment of drones for extrajudicial, extra-territorial killings. Many people are unconvinced by an authorisation system that merely delegates issue of a “licence to kill” to some unnamed US “senior official”, or to the National Security Council in Britain, even though Britain operates its Reapers under much tighter rules of engagement than the CIA.

Cameron is the latest of many prime ministers to relish reading raw intelligence and involving himself in operational decisions that follow from it. But his judgment on foreign policy issues has been shown to be uncertain. His initiative to topple Colonel Muammar Gadaffi has led to anarchy in Libya, and in August 2013 parliament proved no more convinced than was most of Britain’s security apparatus by his enthusiasm for an intervention against President Bashar al-Assad in Syria.

British intelligence organisations today stand in the front line of national security, and many of their senior personnel are people of the highest quality. But it would be foolish to forget the ghastly 2003 misjudgements by MI6 and Sir John Scarlett, then the chairman of the Cabinet Office joint intelligence committee, which prompted Britain’s involvement in the invasion of Iraq under false pretences.

The moral is not that we should not trust our intelligence services but that their activities must be subject to independent scrutiny, such as a mere parliamentary committee cannot provide.

The practicalities of this are not simple. The Reaper culture often requires life-or-death calls within minutes, while a target is within the killing zone of a Hellfire missile. Yet it seems against the interests of justice and morality, not to mention international law, merely to shrug that circumstances oblige us to delegate authority on targeted killings to ministers, officials and spooks. In a new technological world, we must create new processes to uphold the ethics as well as the freedoms of democracy.

There is a temptation for the public to ignore what is being done in our name by drones, whose victims explode into fragments far from our line of sight. The Israelis use their Hermes and Heron UAVs ruthlessly against actual or potential enemies, with only limited regard for incidental civilian casualties.

Eitan Ben Eliyahu, a former commander of the Israeli air force, justified its policy by saying: “If you confront the terrorist while he’s executing his mission, this is too late. The idea is to prevent the terrorist operation from even starting. And these things cannot be done by jet aeroplanes or even helicopters.” But what if mistakes are made about the target’s identity, as has sometimes happened? Some of the CIA’s misjudged strikes in Pakistan and Afghanistan have resulted in scores of civilian deaths, generating dire publicity especially within the combat region, and the recruitment of many more young men to the jihadist cause.

In 2008, the US ambassador in Islamabad was moved to write: “Even politicians who have no love lost for a dead terrorist are concerned by strikes within what is considered mainland Pakistan.” He expressed the fear, which persists to this day, that intense popular hostility to America’s aerial killing programme could eventually force Pakistan’s government to turn against the US. It is notable that only among Americans and Israelis do opinion polls show majority public support for drone strikes against potential enemies.

It is probably legitimate, though this is new ground in international law, for states to take pre-emptive action against non-state enemies overtly committed to doing us harm. But it seems mistaken merely to nod through such a policy as a given, blithely citing the right to self-defence under article 51 of the UN charter.

We have entered a new world in which enemies are seeking to do us harm without benefit of flags, uniforms or declarations of war. We must employ new methods of defence, but these must be proportionate and controlled. On both sides of the Atlantic, we need a public and political debate, followed by a codification of published rules that command at least a substantial degree of international support.

The use of machines to kill on behalf of the state, a practice that will achieve ever greater centrality in future conflicts, cannot justify abrogating the moral and legal responsibilities involved in using lethal force against state enemies in war or peace.

    

 

Demographic Coping

 

MENTION “demographic crisis”, and most people think of countries where women each have six children and struggle to feed them. Much of Asia has the opposite problem: low fertility and an upside-down family structure (four grandparents, two parents, one child). Three-quarters of all the people in countries with exceptionally low fertility live in East and South-East Asia. Prosperous Japan, South Korea and Taiwan have fertility rates of 1.4 or below. The fertility rate is the number of children a woman can expect to have during her lifetime. A rate of 2.1 implies stability: the population is replacing itself, the extra fraction above two covering children who die before reaching adulthood. Demographers refer to rates of 1.4 or less as “ultra low”.

The difference between 2.1 and 1.4 may not sound like much. But consider what it has meant for Japan. In the early 1970s the country had a fertility rate of 2.1, with 2m children born every year. Four decades later the number of births has halved, with the fertility rate down to 1.4. Or take an even more dramatic example, China. In 1995 some 245m Chinese were in their 20s. By 2025, on current trends, there will be only 159m, a decline within a single generation of 86m. This will reduce by more than a third the segment of the population that is best educated, most technologically astute and most open to new ideas.

Demographic trends like this are often thought to be irreversible, implying that East Asia will be stuck in an endless cycle of decline. But history suggests that is far from certain. At the beginning of the 20th century much of Europe also had very low fertility rates. These then rose for decades, peaking in the baby-boom years of the 1950s and 1960s. Europe’s historical experience, argue two American demographers, Thomas Anderson and Hans-Peter Kohler, both of the University of Pennsylvania, helps explain East Asia’s problems now—and suggests what could be done about them.

When the first wave of industrialisation swept through northern and western Europe, women started to go to school and then to look for jobs. In France in 1900 almost half of adult women were employed. And not just as domestic servants or milkmaids on family farms, as before: they also started to work in industry. Their new jobs were typically low-status clerical occupations which did not improve their bargaining power much, or change the basic social norm which held that husbands should earn most of the money and wives look after the children. At the time, an American sociologist, William Ogburn, coined the term “cultural lag” to describe the mismatch between the material conditions of life, which change quickly, and behaviour and attitudes, which are more resistant to change.

East Asia is experiencing a cultural lag even more extreme than the one that affected Europe in 1900. Female literacy is nearly universal, and in Japan and South Korea female college graduates outnumber male ones. Female labour-force participation is also high. But women are still treated in the old ways. Until recently Japanese women were expected to give up work on having children. Working or not, Japanese and South Korean women do at least three more hours of housework a day than their men.

Such cultural lags are associated with ultra-low fertility because if you force women to choose between family and career, then many will choose their career. In Tokyo, Bangkok and other Asian cities, rates of childlessness are sky-high. Women are refusing to marry. And if they do marry, they are getting hitched later in life, in practice reducing their likelihood of ever bearing children (births out of wedlock remain taboo and rare in Asia).

In Europe the cultural lag closed eventually. Social norms began to shift in the 1960s and have changed more rapidly in the past 20 years. Child care became more widely available. Men started to help with the laundry and the school run. Women therefore found it easier to have both a career and rugrats. In places where this process has gone furthest—France, Scandinavia, Britain—fertility rates are almost back up to the replacement level. In those where traditional male breadwinner/female homemaker roles have lingered, such as Germany and Italy, fertility rates remain low. Mr Anderson and Mr Kohler call the recovery in fertility the “gender equity dividend”.

Culture v the law of supply and demand

It is common to say that Asia will not reap such a dividend because traditional norms of family and marriage are more deeply entrenched there than in Europe. It is true that industrialisation took place much faster in Asia than in the old continent, so attitudes have more catching up to do.

Yet Asia is changing faster than traditionalists think and may change faster still. The age of first marriage in Japan and Korea has risen from 24-25 in 1970 to almost 30 now—an exceptionally big shift. High rates of childlessness and delayed marriage show that Asian women are dissatisfied with the choices on offer. No less important, there is a mechanism which may increase their scope to secure more palatable outcomes.

Everywhere, men marry women younger than themselves, and Asia is no exception. But in societies such as Asian ones where fertility is falling, older cohorts of the population are by definition larger than younger ones. So there are more men of, say, 25-30 wanting to marry than there are women of 20-25. Over time, small imbalances in the marriage market build up to create enormous pressure for change. By some estimates, by 2070 in some Asian countries there will be 160 men seeking a wife for every 100 women seeking a husband. Men will have to compete much harder if they want to attract a mate, and that surely means doing more housework. (Those who insist on old-fashioned gender roles will doom themselves to bachelorhood.) With more supportive husbands, women will find it easier to combine motherhood and career, so they will have more babies. Asian culture will adapt to reality, just like any other.
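
The arithmetic behind this squeeze is easy to sketch. The short Python calculation below is purely illustrative (the 2% annual decline in births and the three-year age gap are my assumptions, not the article’s figures), but it shows how steadily shrinking birth cohorts leave surplus men:

    # Illustrative sketch of the marriage squeeze, assuming births fall
    # 2% a year and men marry women three years younger (both assumed).
    decline = 0.02   # assumed annual fall in births
    age_gap = 3      # assumed husband-wife age difference

    mens_cohort = 100.0  # arbitrary index for the men's birth cohort
    # A man born in year Y seeks a wife born in year Y + age_gap;
    # with falling births, her cohort is smaller than his.
    womens_cohort = mens_cohort * (1 - decline) ** age_gap

    print(round(100 * mens_cohort / womens_cohort))  # ~106 men per 100 women

Even that modest decline yields a 6% surplus of men; decades of ultra-low fertility compound the imbalance towards the 160-to-100 ratio projected above.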

    

 

The Coming Labour Schism?

 

The Labour leader’s rivals will never have a better moment to break free. If they stay, they’ll follow the party to oblivion.

It’s there for the taking — but have they the courage to take it? This is the moment for the sane part of the official opposition to break free from the Labour party’s long and deathly embrace. If they are to be cut loose, then cometh the hour, cometh the man. Jeremy Corbyn could be the salvation of the sensible centre left.

My Times colleague, David Aaronovitch, is precisely wrong when (in “Where does the humiliated centre left go now?” — Sept 17) he argues that “it’ll have to be formed inside the Labour party because that is the only environment in which it can flourish”. His attractive credo — internationalism, liberalism, the marriage of private and collective enterprise — serves only to remind us how resistant Labour’s rancid old class-warriors have always been to such common sense.

My Times colleague, Philip Collins, condemns Corbyn as though his fellow Labour party member did not stand firmly within the traditions of the party Philip joined. Rational men like Philip tried to change the Labour party, they failed, and now it has rejected them. Corbyn is closer to Labour’s heart than Tony Blair will ever be. He isn’t just authentic Corbyn: he’s authentic Labour.

This, I am afraid, is the explanation for the rage of the moderate left in the media and in parliament. They’ve been brought up hard against what they wanted never to have to face: that something in their party’s very soul does not love them and never will.

Corbyn is deeply connected with Labour: he reaches back into the party’s history and its soul. He may be a caricature but caricatures convey deep truths. This bearded, reality-hating ideologue is a ghastly reminder of much that has always been dear to the people’s party. Corbynism is not a revolution, it’s a counter-revolution. Labour is reverting to type in a manner so spectacular that the party may now choke itself to death in one, last, magnificent spasm of authenticity.

Labour are going nowhere. The party may (as I suggest) go out with a bang. Equally likely, some residual instinct for self-preservation will kick in, they’ll defenestrate Corbyn, and replace him with a less astringent nonentity, capable of papering over the cracks.

In which case the party will go out with a whimper, on a long, gentle amble into that good night: drifting on towards the next election — and the next, and the next — never winning, forever compromising, softly losing support in a sort of quarter-century slow puncture; all the while clutching in its deadly embrace the undogmatic liberalism of the band of dupes we call the left’s moderates. Forever hopeful that Labour can be made the 21st-century vehicle for their kind of politics, they will condemn a good product to a tainted brand.

Can’t you just see it: 2024, under another Tory government; Labour on 21 per cent? Alistair Darling (remember him) retired a little sadly from active politics. David Blunkett, voice cracking a bit now, penning a melancholy memoir. Alan Milburn old, rich and uninvolved. Alan Johnson still loyal, still trying not to sound disheartened, in the House of Lords. Yvette Cooper weary of being Keir Starmer’s shadow chancellor; Andy Burnham forgotten. And the Labour high command still struggling to frame an election manifesto that treads on the eggshells of Labour’s unreconciled internal tensions?

If not now, then when? When to walk away from this sapping battle? It isn’t just delusory for Labour moderates to hope to remake their party in their own image: it’s bad marketing even to want the brand. The name “Labour” has become indelibly associated in voters’ minds with 20th-century socialism, the working class and the trade unions. Hangovers from the last century, ideas of “industrial action”, class solidarity, trade-union bosses, nationalisation and a huge central state, cling to Labour’s clothes like the smell of stale cigarette smoke.

If we Tories think we “won” the last election then we fail to understand what a huge handicap the name “Labour party” placed upon our challengers.

I know what you’re thinking: “but look what happened to the SDP”. Well think again. The Social Democratic party almost made it. At one point they touched 50 per cent in the polls. Against all the odds imposed by our first-past-the-post electoral system they could still have broken through, but for the Falklands war. Labour’s own origins, displacing the Liberal party, should teach us that a third party can displace a second, but only if the second has lost the plot. I rest my case.

Have you ever studied the hermit crab? This small, shell-less crab walks around armoured within a discarded shell he has found on the beach and moved into. As circumstances require he may vacate one shell and find another.

Political tendencies, too, need shells. They need shelter, need organisations, need staff, need somewhere to live. Philosophies of government choose (may even create) political parties as their homes. But ideas are bigger than any party and no party has a God-given right to remain their host. Through all my political lifetime the Labour party has enjoyed a scratchy relationship with what we might call the socially concerned but essentially undoctrinaire side of British politics; people for whom neither “profit” nor “state intervention” are dirty words: the centre left.

It is time for these moderates to vacate their tenancy and cast off Labour’s shell. Fate, in the guise of Jeremy Corbyn, has given them a better excuse for making the leap than any could have dared hope. If they go they will say they have not left their party: their party has left them. They need to believe this, but it isn’t wholly true.

The Labour party is much the same old beast it was when I first entered the Commons 36 years ago. If it were capable of reimagining itself, then three successive election victories under Tony Blair would have cemented the new self-image. Labour moderates who suppose they can return to that battle and this time win, should be warned that this time it will be war not only within Westminster but with the massed ranks of the party beyond Westminster, and all its registered supporters.

Why fight to keep an address when you could take a new one? Drop all that weepy stuff about loyalty to the dear old party and ask whether there isn’t something bigger in politics to be true to: your own beliefs, your country’s interests. British politics is due a realignment. Come on out, you hermit crabs. There are other shells on the beach.

    

 

Political Activists

 

Yesterday someone said that they’d seen me in Cheltenham last weekend and wished they’d punched me while they had the chance. This was disconcerting: book festivals are not renowned for their violence and my aspiring assailant let slip that he was a poet. It would all have taken some explaining.

It wasn’t the poet’s cause that was interesting, it was his easy assumption that he was morally entitled to punch me. It didn’t occur to him that if I knew his views, then I might feel equally entitled to punch him. In his mind there was no contest of ideas — there was his correct and virtuous view and my wrong and vicious one. Righteous punching could go only one way.

Visitors to the Conservative party conference this week encountered the same failure of imagination while entering the secure zone. Leaving aside the spitting, chasing and throwing of various objects (anarchists are almost as unserious about the world as teenage jihadists) there was the more mainstream yelling of abuse and obscenities at anyone walking through the security gates. They were Tories and they were bad and so deserved abuse; the abusers were anti-Tories and they were good and therefore were entitled to deliver the abuse.

Nevertheless, even a very moral person is likely to ask themselves what impact behaviour like this is going to have on the causes they stand for and which supposedly make them so moral in the first place.

So I tried to imagine what dialogue might have taken place between those organising the various protests. Did they first discuss whether their objective was to get the government to change its mind about several things, or whether their main goal was the defeat of the government at the next general election? Let us suppose it was the first. The next step is to try to persuade key groups of government supporters of the need to change. The best way of doing that is to fill them with the fear of electoral defeat. Or if it’s the second, you need to win round voters who previously backed the Conservatives, as well as holding your own core vote. Highlighting hardships caused by government policies is one way to do that. Appeal to people’s innate sense of justice. Think inclusive.

So Sebastian and Chloe (and Reg) and the others discuss ways of achieving either end. And before they get to the inclusive thing Reg says: “I’ve got it! What we need to do is march round Manchester shouting ‘Tories Out’ and making it clear that a) we think anyone who isn’t us is a Tory and b) that Tories are scum.” And Chloe and Seb reply: “Brilliant! That should do it!” And off they go.

I am not one of those people outraged by being called names. A reasonably robust person soon learns to live with it and so long as you’re prepared to take it as well as dish it out I can cope with you. But I’m never going to support you. I am never going to do what you keep shouting at me to do. If I see you yelling away on television I slip myself an almost subliminal note to oppose you. And if you somehow imagine otherwise then you are suffering from AD — Activist’s Delusion.

Labour is now almost fully possessed by AD. Even quite sensible people have written that Jeremy Corbyn has (cliché du jour) “galvanised” the party. You may not agree with him or John McDonnell on everything but they have brought about a rejuvenation, a refreshment of Labour: 60,000 new members since JC was elected! An infusion of energetic new blood. And that, surely, has to be good. It has to help to put Labour on the path to victory.

Of course, in meetings around Manchester this week, you could find Tories afflicted by their own form of AD. There were sell-outs for impassioned discussions on Europe, of course. Standing ovation room only. The grassroots were galvanised. Brilliant, eh?

No. The main effect of an increase in the number of activists is not eventual political success, but an increase in Activist’s Delusion and therefore eventual political failure. It is a harbinger of defeat. There are two reasons for this. The first is Accrington Stanley. During the leadership election no meeting or rally held by Mr Corbyn (events commonly described as “huge” and “ebullient”) was bigger than the average home gate for the middle-of-Division-Two side Accrington Stanley. Or, to put it another way, however large the numbers that activists believe are taking part in their activities (and they routinely exaggerate, not least to themselves) the electorate is unbelievably more vast. Yet they forget it.

The second problem is that very often activists are simply the wrong people. They like doing things that other people don’t and they are interested in things other people aren’t. Worse, being human, they tend to cluster together and mutually reinforce the erroneous notion that everybody is like them. Then, when it becomes apparent that the voters have not become enthused by leaving the EU or abolishing Trident or resisting austerity, the gap between hope and reality is filled by blaming the media. They brainwashed the people — obviously.

I say this not out of snideness — I was an activist for much of my early life, and come from a family of activists. I like them. They do things. But I note that anyone who positively seeks out the opportunity to sit in a hall and be spoken to for two hours by a dozen speakers whose opinions they already know, defines pleasure in a different way from most of his or her compatriots.

Renewed energy is not in itself always a good thing. Ask the parent of any toddler who suddenly gets a new lease of life at 10pm. And the problem in political parties has always been how to maintain a solid membership to do the various things that have to be done and to help refresh the gene pool, without allowing them to actually decide anything. The nature of activism works against compromise and even against persuasion. Democratic politics relies on both. As things stand, activists should be seen but not heard.

    

 

Baby Boomers Have Pillaged The Economy

 

One moment in the third Republican presidential debate encapsulates everything terrible about baby boomers and the way they’ve pillaged the U.S. economy. It came from Sen. Marco Rubio of Florida, a Generation Xer, who offered the standard line — you can hear it from the mouth of almost any American politician today — on how to keep Medicare and Social Security solvent. Rubio defended the idea that future workers will need to retire later or receive fewer benefits from those safety-net programs than current retirees. “Everyone up here tonight that’s talking about reforms,” he stipulated, was “talking about reforms for future generations. Nothing has to change for current beneficiaries.”

That’s smart politics: The biggest generational voting bloc by far in the upcoming election will be baby boomers, a group that is just starting to draw its first Medicare and Social Security benefits — and does not want anyone messing with those benefits, thank you very much.

It’s also bad economics.

Baby boomers gobbled up the best economy in American history and left the bones for the rest of us, argues The Post’s Jim Tankersley, and it’s time they helped to fix it.

Boomers soaked up a lot of economic opportunity without bothering to preserve much for the generations to come. They burned a lot of cheap fossil fuels, filled the atmosphere with heat-trapping gases, and will probably never pay the costs of averting catastrophic climate change or helping their grandchildren adapt to a warmer world. They took control of Washington at the turn of the millennium, and they used it to rack up a lot of federal debt, even before the Great Recession hit.

If anyone deserves to pay more to shore up the federal safety net, either through higher taxes or lower benefits, it’s boomers — the generation that was born into some of the strongest job growth in the history of America, gobbled up the best parts, and left its children and grandchildren with some bones to pick through and a big bill to pay. Politicians shouldn’t be talking about holding that generation harmless. They should be asking how future workers can claw back some of the spoils that the “Me Generation” hoarded for itself.

When you look at the numbers, the advantages boomers have enjoyed are breathtaking. Start with the economy. Boomers went to work in a job market that their children rightly romanticize. It delivered living-wage work for wide swaths of Americans, even those who didn’t go to college, which by the way cost a fraction of what higher education costs today, even after you adjust for inflation. A single earner could provide for a family. Employees could reasonably expect to advance in their companies and work their way into the middle class. Incomes grew across the board.

Earlier this year, in a paper for the Brookings Institution, economist Robert Shapiro tracked the lifetime earnings paths for Americans who entered the labor market in the 1970s, 1980s, 1990s and early 2000s. He found a sharp generational divide. The typical U.S. household headed by someone who was 25 to 29 years old in 1975 saw its real income increase by 60 percent until it peaked and began to slowly decline before retirement. For a similar household in 1982, lifetime income peaked 70 percent higher than its starting point. Those are both boomer cohorts.

The groups that came after fared worse. Workers who were 25 to 29 in 1991 saw median earnings peak 50 percent above where they started. For the 2001 group, the peak was just over 20 percent higher. (Though there’s still time, theoretically, for their earnings to rise again.) For both those groups, the high point came much earlier in their working lives than it did for the boomers.

My generation, Gen X, is in far worse financial shape than our parents were at the same age. Millennials are even worse off than we are. Soon after the Great Recession ended, the Pew Research Center reported that middle-class families were 5 percent less wealthy than their parents had been at their age, even though today’s families work a lot harder outside the home — the average family’s total working hours have risen by a quarter over the past 30 years — and even though they’re much likelier to include two wage earners. The ensuing recovery has made things worse. Middle-class families owned fewer stocks, businesses and homes in 2013 than they did in 2010, according to calculations by New York University economist Edward Wolff.

Meanwhile, future generations will have to pay the costs of weaning the world from fossil fuels and/or adapting to warmer temperatures, rising seas and more extreme weather. (Estimates vary, but some projections suggest they could total trillions of dollars for America alone.) They will also have to shoulder the burden of keeping America’s retirement promises to the boomers. The Congressional Budget Office estimates that the rising costs of Social Security and government health care that will stem from an aging population will consume two more percentage points of America’s economic output by 2040. If policymakers don’t find the revenue to pay for it all, the CBO projects that the national debt will climb past 100 percent of annual gross domestic product — quadruple its post-World War II low.

And yet almost no one suggests that boomers should share the pain of shoring up those programs. Folks my father’s age like to say they’ve paid for those benefits, so they should get them in full. But they haven’t. The Urban Institute has estimated that a typical couple retiring in 2011, at the leading edge of the boomer wave, will end up drawing about $200,000 more from Medicare and Social Security than they paid in taxes to support those programs. Because Social Security benefits increase faster than inflation, boomers will enjoy bigger checks from the program, in real terms, than their parents did.

The sin here isn’t exactly intentional: It’s not boomers’ fault that there are so many more of them than their predecessors (their ranks peaked near 80 million, some 30 million more than the Silent Generation before them) or that they’re living longer (retirees today can expect to live three or four years longer than their grandparents). The sin is that boomers have done nothing to ameliorate their easily foreseen threat to the U.S. Treasury. They have had every opportunity: Congress has been controlled by a baby boom majority since the beginning of the George W. Bush administration.

Did that majority sock away money for future safety-net costs? No. Pols talked about putting budget surpluses in a “lockbox,” but not for long. Instead they cut their own taxes, they deficit-financed two wars, they approved a new Medicare prescription drug benefit that their generation will be the first to enjoy in full. Partly as a result of those policies, the federal budget deficit has averaged 4 percent of GDP in the Bush/Obama era, more than double the average rate of the 50 years before that. Boomers let federal debt, as a share of the economy, double from where it was in 1970.

Meanwhile, they stood by while the economic bargain that lifted them as young workers began to unravel for their children. They opened global trade and watched millions of U.S. manufacturing jobs vanish; research by MIT economist Daron Acemoglu and colleagues suggests that normalized trade with China, the biggest driver of those losses, has by itself cost America at least 2 million jobs.

Then, boomers didn’t invest enough in new training programs for young workers, particularly men, who once could count on factory jobs to bring them a middle-class lifestyle. They allowed college costs to more than double from 1982 to 2012. Though, point in their favor: Many of them took out loans to send their children to school.

Boomers let public investments in research and development — a critical driver of future prosperity — fall steadily as a share of the economy; they’re down from 1.2 percent of GDP in 1976 to 0.8 percent today, a decline of one-third. In the 15 years boomers have been running Congress, economic growth has slid well below the average of a generation ago — to 1.9 percent a year, down from 3.2 percent for the preceding 25 years. Some of the brightest minds of their generation built fortunes working at Wall Street investment banks, then helped drive the economy into its worst recession since the Great Depression.

It’s increasingly clear that Generation X, and possibly millennials, haven’t learned from the boomers’ mistakes. My son will rightly criticize me someday for my generation’s love of SUVs. He’ll probably wonder why he has to pay higher taxes or work several more years just to get a retirement that’s worse than my dad’s or maybe even mine.

Every generation wants to leave a better world for the ones to follow. I truly believe that boomers had no idea, for a long time, that the sum of their choices — of their quest to make life as good as it could be for themselves — might be a worse world for their children. But it’s apparent now.

It’s too late to ask boomers, as a generation, to repent of their economic sins and undertake future struggles now that the damage has been done. It is fair, though, to ask for a dose of responsibility from the boomers now running for president. They have all cast themselves as truth-tellers, problem-solvers and makers of hard choices.

Political incentives argue against that — by census counts, there are as many boomer voters today as there are 25- to 44-year-olds and senior citizens combined. And nobody likes to be told he is a parasite. After I first outlined this argument to my father in 2012, he gifted me an actual lump of coal for Christmas.

But it’s not enough to talk soberly about saving the economy and the safety net for future generations, if future generations are making all the sacrifices. The boomer candidates — actually, all the candidates — need to be honest with boomers about how good they’ve had it in America and how it’s time to give back: They should take steps, right now, to reduce carbon emissions and head off a debt crisis. They should pay higher taxes or accept slimmer retirement benefits, and they should tell lawmakers to make cleaner energy a top priority. My generation should join them. The boomers running for president should lead with a call to change the country for the better. Wasn’t that always supposed to be their thing?

    

 

How Corrupt Is The US?

 

In November 2014, Arkansas voters approved a ballot measure that, among other reforms, barred the state’s elected officials from accepting lobbyists’ gifts. But that hasn’t stopped influence peddlers from continuing to provide meals to lawmakers at the luxurious Capital Hotel or in top Little Rock eateries like the Brave New Restaurant; the prohibition does not apply to “food or drink available at a planned activity to which a specific governmental body is invited,” so lobbyists can buy meals so long as they invite an entire legislative committee.

Such loopholes are a common part of statehouse culture nationwide, according to the 2015 State Integrity Investigation, a data-driven assessment of state government by the Center for Public Integrity and Global Integrity. The comprehensive probe found that in state after state, open records laws are laced with exemptions and part-time legislators and agency officials engage in glaring conflicts of interests and cozy relationships with lobbyists. Meanwhile, feckless, understaffed watchdogs struggle to enforce laws as porous as honeycombs.

Take the Missouri lawmaker who introduced a bill this year — which passed despite a veto by the governor — to prohibit cities from banning plastic bags at grocery stores. The state representative cited concern for shoppers, but he also happens to be state director of the Missouri Grocers Association, and is just one of several lawmakers in the state who pushed bills that synced with their private interests.

Or the lobbyist who, despite a $50 cap on gifts to Idaho state lawmakers, spent $2,250 in 2013 to host a state senator and his wife at the annual Governors Cup charity golf tournament in Sun Valley; the prohibition does not apply to such lobbying largess as long as the money is not spent “in return for action” on a particular bill.

In Delaware, the Public Integrity Commission, which oversees lobbying and ethics laws for the executive branch there, has just two full-time employees. A 2013 report by a special state prosecutor found that the agency was unable “to undertake any serious inquiry or investigation into potential wrongdoing.”

And in New Mexico, lawmakers passed a resolution in 2013 declaring that their emails are exempt from public records laws — a rule change that did not require the governor’s signature. “I think it’s up to me to decide if you can have my record,” one representative said.

These are among the practices illuminated by the State Integrity Investigation, which measured hundreds of variables to compile transparency and accountability grades for all 50 states. The results are nothing short of stunning. The best grade in the nation, which went to Alaska, is just a C. Only two others earned better than a D+; 11 states received failing grades. The findings may be deflating to the two-thirds of Americans who, according to a recent poll, now look to the states for policy solutions as gridlock and partisanship have overtaken Washington D.C.

Key findings:

Alaska earns top spot in the State Integrity Investigation ranking of state transparency and accountability, with a C grade and score of 76. Only two others earn better than a D+; 11 states receive failing grades. Michigan is last.

State open records laws are riddled with loopholes; many states exempt entire branches of government and use high fees and lengthy delays to suppress controversial material.

Most ethics entities are toothless and underfunded. In two out of three states they routinely fail to initiate investigations or impose sanctions when necessary.

Many part-time state lawmakers operate with glaring conflicts of interest. In 7 of 10 states, legislators at least occasionally vote on bills that could present a conflict.

States score well in several categories, with 29 earning a B- or better for auditing practices and 16 scoring a B- or above for budget transparency.

The top of the pack includes bastions of progressive government, including California (ranked 2nd with a C-), and states notorious for corrupt pasts (Connecticut, 3rd with a C-, and Rhode Island, 5th with a D+). In those New England states, scandals led to significant reforms and relatively robust ethics laws, even if dubious dealings linger in the halls of government. The bottom includes many western states that champion limited government, like Nevada, South Dakota and Wyoming, but also others, such as Maine, Delaware and dead-last Michigan, that have not adopted the types of ethics and open records laws common in many other states.

The results are “disappointing but not surprising,” said Paula A. Franzese, an expert in state and local government ethics at Seton Hall University School of Law and former chairwoman of the New Jersey State Ethics Commission. Franzese said that, with many states still struggling financially, ethics oversight in particular is among the last issues to receive funding. “It’s not the sort of issue that commands voters,” she said.

With a few notable exceptions, there has been little progress on these issues since the State Integrity Investigation was first carried out, in 2012. In fact, most scores have dropped since then, though some of that is due to changes made to improve and update the project and its methodology.

Since State Integrity’s first go-round, at least 12 states have seen their legislative leaders or top cabinet-level officials charged, convicted or forced to resign as a result of ethics or corruption-related scandals. Five house or assembly leaders have fallen. No state has outdone New York, where 14 lawmakers have left office since the beginning of 2012 due to ethical or criminal issues, according to a count by Citizens Union, an advocacy group. That does not include the former leaders of both the Assembly and the Senate, who were charged in unrelated corruption schemes earlier this year but remain in office.

New York is not remarkable, however, in at least one regard: Only one of those 14 lawmakers has been sanctioned by the state’s ethics commission.

Grading the states

When first conducted in 2011-2012, the State Integrity Investigation was an unprecedented look at the systems that state governments use to prevent corruption and expose it when it does occur. Unlike many other examinations of the issue, the project does not attempt to measure corruption itself.

The 2015 grades are based on 245 questions that ask about key indicators of transparency and accountability, looking not only at what the laws say, but also how well they’re enforced or implemented. The “indicators” are divided into 13 categories: public access to information, political financing, electoral oversight, executive accountability, legislative accountability, judicial accountability, state budget processes, state civil service management, procurement, internal auditing, lobbying disclosure, state pension fund management and ethics enforcement agencies.

Experienced journalists in each state undertook exhaustive research and reporting to score each of the questions, which ask, for example, whether lawmakers are required to file financial interest disclosures, and also whether they are complete and detailed. The results are both intuitive — an F for New York’s “three men in a room” budget process — and surprising — Illinois earned the best grade in the nation for its procurement practices. All together, the project presents a comprehensive look at transparency, accountability and ethics in state government. It’s not a pretty picture.
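
The article reports overall results as both 0–100 scores and letter grades (Alaska’s 76, for instance, earns a C) without spelling out the conversion. As a minimal sketch, assuming a simple average over the 13 categories and the conventional US grading scale with anything below 60 counting as a fail (the cutoffs and equal weighting are my assumptions, not the project’s published methodology):

    # Hypothetical conversion from a 0-100 integrity score to a letter
    # grade, assuming conventional US cutoffs (below 60 = F). How the real
    # project weights its 13 category scores is not described here.
    def overall_score(category_scores):
        return sum(category_scores) / len(category_scores)  # assumed equal weights

    def letter_grade(score):
        if score < 60:
            return "F"
        cutoffs = [(97, "A+"), (93, "A"), (90, "A-"),
                   (87, "B+"), (83, "B"), (80, "B-"),
                   (77, "C+"), (73, "C"), (70, "C-"),
                   (67, "D+"), (63, "D"), (60, "D-")]
        for cutoff, grade in cutoffs:
            if score >= cutoff:
                return grade

    print(letter_grade(76))  # "C", matching Alaska's reported grade

On that assumed scale the figures quoted here hang together: 76 sits in the C band, the D+ band covers 67–69, and the 11 failing states all scored below 60.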

Downward trend, blips of daylight

Overall, states scored notably worse in this second round. Some of that decline is because of changes to the project, such as the addition of questions asking about “open data” policies, which call on governments to publish information online in formats that are easy to download and analyze. But the drop also reflects moves toward greater secrecy in some states.

“Across the board, accessing government has always been, but is increasingly, a barrier to people from every reform angle,” said Jenny Rose Flanagan, vice president for state operations at Common Cause, a national advocacy group with chapters in most states.

No state saw its score fall farther than New Jersey, where scandal after scandal seems to have sunk Gov. Chris Christie’s presidential aspirations deep into the muck of the state’s brawling, back-scratching political history. New Jersey earned a B+, the best score in the nation, in 2012 — shocking just about anyone familiar with the state’s politics — thanks to tough ethics and anti-corruption laws that had been passed over the previous decade in response to a series of scandals.

None of that has changed. But journalists, advocates and academics have accused the Christie administration of fighting and delaying potentially damaging public records requests and meddling in the affairs of the State Ethics Commission. That’s on top of Bridgegate, the sprawling scandal that began as a traffic jam on the George Washington Bridge but has led to the indictments so far of one of the governor’s aides and two of his appointees — one of whom pleaded guilty to conspiracy charges — and even to the resignations of top executives at United Airlines. As a result of these scandals and others, New Jersey dropped to 19th place overall with a D grade.

Admittedly, it’s not all doom and gloom. Iowa created an independent board with authority to mediate disputes when agencies reject public records requests. Gov. Terry Branstad cited the state’s previous grade from the Center when he signed the bill, and the move helped catapult Iowa to first in the nation in the category for access to information, with a C- grade (Iowa’s overall score actually dropped modestly).

In Georgia, good government groups latched on to the state’s worst-in-the-nation rank in 2012 to amplify their ongoing push for reforms. The result was a modest law the following year that created a $75 cap on the value of lobbyists’ gifts to public officials. The change helped boost the state’s score in the category of legislative accountability to a C-, sixth-best in the nation.

Perhaps the most dramatic reforms came in Virginia, where scandal engulfed the administration of outgoing Gov. Robert McDonnell in 2013 after it emerged that he and his family had accepted more than $170,000 in loans and gifts, much of it undisclosed, from a Virginia businessman. McDonnell and his wife were later convicted on federal corruption charges, but the case underscored the state’s woefully lax ethics laws and oversight regime; Virginia received an overall F grade in 2012. At the time, there was no limit on the value of gifts that public officials could accept, and they were not required to disclose gifts to their immediate family, a clause that McDonnell grasped at to argue that he had complied with state laws. (Appeals of the McDonnells’ convictions are pending.)

Over the next two years, newly-elected Gov. Terry McAuliffe and lawmakers passed a series of executive actions and laws that eventually led, in 2015, to a $100 cap on gifts to public officials from lobbyists and people seeking state business. They also created an ethics council that will advise lawmakers but will not have the power to issue sanctions. Advocates for ethics reform have said the changes, while significant, fall far short of what’s needed, particularly the creation of an ethics commission with enforcement powers. Still, they helped push the state's grade up to a D.

States also continued to score relatively well in the categories for auditing practices — 29 earned B- or better — and for budget transparency — 16 got a B- or above (the category measures whether the budget process is transparent, with sufficient checks and balances, not whether it’s well managed).

In Idaho, for example, which earned an A and the second best score for its budget process, the public is free to watch the Legislature’s joint budget committee meetings. Those not able to make it to Boise can watch by streaming video. Citizens can provide input during hearings and can view the full budget bill online.

New York earned the top score for its auditing practices — a B+ — because of its robustly-funded state comptroller’s office, which is headed by an elected official who is largely protected from interference by the governor or Legislature. The office issues an annual report, which is publicly available, and has shown little hesitation to go after state agencies, such as in a recent audit that identified $500 million in waste in the state’s Medicaid program.

Unfortunately, however, such bright spots are the exceptions.

Access denied

In 2013, George LeVines submitted a request for records to the Massachusetts State Police, asking for controlled substance seizure reports at state prisons dating back seven years. LeVines, who at the time was assistant editor at Muckrock, a news website and records-request repository, soon received a response from the agency saying he could have copies of the reports, but they would cost him $130,000. While LeVines is quick to admit that his request was extremely broad, the figure shocked him nonetheless.

“I wouldn’t have ever expected getting that just scot-free, that does cost money,” he said. But $130,000? “It’s insane.”

The cost was prohibitive, and LeVines withdrew his request. The Massachusetts State Police has become a notorious steel trap of information — it has charged tens of thousands of dollars or even, in one case, $2.7 million to produce documents — and this year received the tongue-in-cheek Golden Padlock award from a national journalism organization, which each year “honors” an agency or public official for its “abiding commitment to secrecy and impressive skill in information suppression.”

Dave Procopio, a spokesman for the State Police, said in an email that the department is committed to transparency, but that its records are laced with sensitive information that's exempt from disclosure and that reviewing the material is time consuming and expensive. "While we most certainly agree that the public has a right to information not legally exempt from disclosure,” he wrote, “we will not cut corners for the purpose of expediency or economy if doing so means that private personal, medi[c]al, or criminal history information is inappropriately released.”

It’s not just the police. Both the Legislature and the judicial branch are at least partly exempt from Massachusetts’ public records law. Governors have cited a state Supreme Court ruling to argue that they, too, are exempt, though chief executives often comply with requests anyway. A review by The Boston Globe found that the secretary of state’s office, the first line of appeal for rejected requests, had ruled in favor of those seeking records in only one in five cases. Needless to say, Massachusetts earned an F in the category for public access to information. But so did 43 other states, making this the worst-performing category in the State Integrity Investigation.

While every state in the nation has open records and meetings laws, they’re typically shot through with holes and exemptions and usually have essentially no enforcement mechanisms, beyond the court system, when agencies refuse to comply. In most states, at least one entire branch of government or agency claims exemptions from the laws. Many agencies routinely fail to explain why they’ve denied requests. Public officials charge excessive fees to discourage requestors. In the vast majority of states, citizens are unable to quickly and affordably resolve appeals when their records are denied. Only one state — Missouri — received a perfect score on a question asking whether citizens actually receive responses to their requests swiftly and at reasonable cost.

“We’re seeing increased secrecy throughout the country at the state and federal level,” said David Cuillier, director of the University of Arizona’s School of Journalism and an expert on open records laws. He said substantial research shows that the nation’s open records laws have been poked and prodded to include a sprawling list of exemptions and impediments, and that public officials increasingly use those statutes to deny access to records. “It’s getting worse every year,” he said.

After a series of shootings by police officers in New Mexico, the Santa Fe New Mexican published a report about controversial changes made to the state-run training academy. But when a reporter requested copies of the new curriculum, the program’s director refused, saying “I’ll burn them before you get them.”

In January, The Wichita Eagle reported that Kansas Gov. Sam Brownback’s budget director had used his private email address to send details of a proposed budget to the private email accounts of fellow staff members, and also to a pair of lobbyists. He later said he did so only because he and the rest of the staff were home for the holidays. But in May, Brownback acknowledged that he, too, used a private email account to communicate with staff, meaning his correspondence was not subject to the state’s public records laws. A state council is now studying how to close the loophole. A series of court cases in California is examining a similar question there.

Cuillier said in most states, courts or others have determined that discussions of public business are subject to disclosure, no matter whether the email or phone used was public or private. But the debate is indicative of a larger problem, and it reveals public records laws as the crazy old uncle of government statutes: toothless, antiquated appendages of a bygone era.

    

 

How We Respond To Tragedies

 

In the wake of last week's horrific attacks in Paris, a scolding narrative has taken hold in certain corners of the internet: The fact that the incident received blanket media coverage, but a suicide bombing in Beirut that killed 43 and occurred a day prior didn't, proves that — depending on whom you ask — Americans are racist, members of the media are racist, Americans don't care about the Middle East, or any one of a number of other theses pointing to a moral failing on the part of those who have expressed, as the critics tell it, the improper ratio of outrage and grief over Paris as compared to Beirut.

A conga line of think pieces followed as well. One of the more impressive examples of the subgenre appeared in The Independent. "Got a French flag on your Facebook profile picture?" asked the headline. "Congratulations on your corporate white supremacy." The author, Lulu Nunn, quickly explained why the flag-posters are doing the wrong thing:

So you want to show solidarity with France – specifically, with those killed in Paris this weekend. If you’re a British person who wants to do that because you feel sympathy and sadness for people who are brutally massacred, regardless of their nationality, then fine. I just hope that you also change your profile picture to a different country’s flag every time people are wrongly killed as the result of international conflicts – for example, during the attack on Beirut in Lebanon just the day before.

Articles by Fisher and Jill Filipovic have nicely added some nuance to this debate, pointing out that, given the many, many articles that were in fact written about the Beirut bombing, it isn't right to say the media ignored it. It's more accurate, they argue, to say that readers were much more engaged by, and much more likely to share, content about Paris than about Beirut.

The question of whether it's the media or its audience that is to blame for the (possibly fictitious) coverage gap is interesting for a number of reasons. But from a human-behavior standpoint, more interesting is the model of the "proper" response to death and destruction posited by Kohn, Nunn, and others.

In short, they seem to be saying that it's offensive when:

- People don't respond equally, with the same amount of shock and grief, to tragedies that are more surprising, given their location, than ones that are less surprising, given their location. (Political violence is fairly common in Lebanon and has been for decades; notwithstanding the disturbing recent trajectory, it is not common in Paris.)

- People don't respond equally, with the same amount of shock and grief, to tragedies that victimize people who are more "similar to" them than people who are "dissimilar" to them. (There are important cultural and historical connections between the U.S. and Paris that add weight to the notion that the French are more similar to us than the Lebanese.)

- People don't respond equally, with the same amount of shock and grief, to tragedies occurring in places to which they feel more connected as opposed to less connected. (Americans, for a million reasons, are more likely to be more familiar with and feel more warmly toward Paris than Beirut, and they're more likely to have traveled there.)

Is this realistic? Humans are always using shortcuts to make sense of the world around them, because without these shortcuts we'd drown in information overload; every time we heard tragic news we'd collapse in a heap, inconsolable from trying to comprehend the full weight of a single human life lost, which is an impossible thing to really do. So yes, people respond more viscerally to events that are new and unexpected, and to events that affect people "like us," whoever the "us" in question is.

The mistake is to assume that any of this is unique to "the media" or to "Americans" or to "privileged" people. Everybody does it — everybody carves up the world in similarly predictable ways, albeit with the boundaries drawn differently. It's silly to shame people for this, especially at a time when everyone's just doing the best they can to make sense of a series of awful events.

    

 

Middle East Quagmire

 

With the attacks in Paris and the downing of a Russian passenger plane, the Islamic State has declared war on the wider world, galvanizing new calls for an intensified global effort to defeat the emerging threat.

It may already be too late and too difficult, however, for any swift or easy solution to the tangled mess the Middle East has become in the four years since the Arab Spring plunged the region into turmoil.

What Jordan’s King Abdullah II referred to as a “third world war against humanity” has, more accurately, become a jumble of overlapping wars driven by conflicting agendas in which defeating the Islamic State is just one of a number of competing and often contradictory policy pursuits.

In those four years, four Arab states — Iraq, Syria, Libya and Yemen — have effectively collapsed. Civil wars are raging in all of them. World powers have lined up on different sides of those wars. And the chaos has given the heirs to the legacy of Osama bin Laden the greatest gift they could have hoped for: the gift of time and space.

Aided by the indifference of a world wearied and wary after the failings of the Iraq war, an assortment of al-Qaeda veterans, hardened Iraqi insurgents, Arab jihadist ideologues and Western volunteers have moved into the vacuum left by the collapse of governments in Syria and Iraq and built themselves a proto-state. It can hardly be said to count as a real state, but it controls territory, raises taxes and maintains an army.

Any responses now “are very late in the game,” said Shadi Hamid of the Brookings Institution in Washington. “The costs of inaction have accumulated, and we can’t undo the damage of the past four years.”

The Islamic State is finding new footholds in Egypt, Libya and Afghanistan as state control crumbles there, too, confronting the world with a vastly bigger challenge than it faced after the 9/11 attacks in the United States, said Bruce Riedel, who is also with the Brookings Institution.

“We have now been fighting al-Qaeda and al-Qaeda offshoots, which is what the Islamic State is, since 1998,” he said. “We now face an enemy that has more sanctuaries and operating space than ever before. The battlefield is now much larger than it was before.”

It is also more complicated. At no point has any world power made defeating the Islamic State a top priority, including the United States, said Peter Harling of the International Crisis Group. “Everyone’s using the Islamic State,” he said. It’s a diversion “from what’s really going on.”

For the Obama administration, avoiding entanglement in another Middle East war has been the foremost policy priority, followed closely by the pursuit of a nuclear deal with Iran. There seems to be little doubt that the United States has soft-pedaled its Syria policy, ostensibly aimed at removing President Bashar al-Assad, in order not to jeopardize the Iran deal, Hamid said.

Russian intervention in the region has been driven primarily by President Vladimir Putin’s desire to reassert Russia’s stature as a global power and shore up Assad’s regime, hence the focus on targeting U.S.-backed moderate rebels rather than the Islamic State in the earliest days of its intervention.

Saudi Arabia, America’s most powerful Arab ally, is preoccupied above all by the challenge posed by Iran and is expending its military energies on fighting the Iranian-backed Houthi militias in Yemen.

Iran has prioritized the projection of its regional influence through Syria and Iraq to the Mediterranean, funding and arming proxy militias to defend its interests in Shiite-dominated areas of Iraq and to quell the anti-Assad rebellion mostly in the areas around Damascus, the Syrian capital.

And Turkey’s attention is focused mainly on its domestic Kurdish problem and on the perceived threat posed by the emergence of an autonomous Kurdish enclave along its border in northern Syria.

It seems unlikely that the Paris attacks will generate a more coherent international response, analysts say.

France has joined the United States and Russia in conducting airstrikes in Syria, raining bombs on the Islamic State’s self-proclaimed capital of Raqqa on two occasions since the bloodshed in Paris on Friday. On Tuesday, President François Hollande dispatched an aircraft carrier to the eastern Mediterranean, where Russian warships are already deployed for a fight that has focused mostly on Syrian rebels fighting Assad — some of whom also have been supported by France.

Syrian activists with the group Raqqa Is Being Slaughtered Silently, which maintains a network of undercover reporters in Raqqa, say that the initial French strikes, at least, hit only empty buildings vacated by the militants in anticipation of retaliation.

Experience has already demonstrated that the Islamic State is unlikely to be defeated with airstrikes alone, and for now, there are few alternatives on the horizon, Riedel said.

“The Islamic State can be degraded by air power, but in the end someone has to provide the infantry that goes in and takes Mosul and Raqqa and restores governance and rule of law, and I don’t see anyone offering that.”

In America, the Paris attacks have precipitated peripheral debates about whether the United States is waging war on Islam and whether to admit Syrian refugees, rather than discussion of how to address the wider problems of the Middle East.

Although there have been calls for more robust intervention, including boots on the ground, from some members of Congress, President Obama has made it clear that he thinks the current U.S. strategy is working.

And in recent days, there has been a spurt of progress on the ground. An alliance of Kurdish and Arab fighters recaptured the eastern Syrian town of al-Hawl and dozens of surrounding villages. In northern Iraq, Kurdish fighters ejected the Islamic State from the town of Sinjar in 48 hours. The Iraqi army has made advances around Ramadi, the capital of the province of Anbar, which is now almost encircled.

The gains reflect a newly concerted effort to coordinate efforts among the diverse local forces fighting on the ground in order to pressure the Islamic State in multiple places at the same time, said Col. Steve Warren, spokesman for the U.S.-led coalition against the Islamic State.

“We’re fighting them across the entire depth of the battlefield at once. We are synchronizing and aligning our support,” he said. “Each one of these gains is not necessarily compelling, but when you look at them simultaneously it’s very compelling.”

The gains nonetheless leave unaddressed the core problem confronting the region, which is the collapse of viable state structures in the Middle East and the absence of any immediately apparent alternative to Islamic State rule in the mostly Sunni areas it controls, said Robert Ford, former U.S. ambassador to Syria and now a fellow with the Middle East Institute.

Kurdish fighters have made most of the advances in Syria so far, but they are unlikely to succeed in retaking core Sunni strongholds such as Raqqa and other cities in the Sunni Arab territories that form the heart of the Islamic State’s self-proclaimed caliphate.

“Kurds are never going to liberate Palmyra; Kurds are not going to clear the Islamic State out of Deir al-Zour,” he said. “Sunni Arabs are going to have to do that, and what Arab force is going to do that? Assad doesn’t have the forces; he can’t even take the suburbs of Damascus.”

Likewise in Iraq, Shiite militias and Iraqi Kurds have made most of the gains so far. The Iraqi army’s recent advances in Ramadi have isolated Fallujah, which has been under Islamic State control for nearly two years, leaving residents besieged and short of food.

Many residents “would like Daesh to be expelled,” said Issa al-Issawi, the mayor of Fallujah, who left the city when the Islamic State, also known as Daesh, overran it and is living in government-controlled territory. “But at the same time they have major concerns about what would happen next.”

Obama acknowledged the problem in justifying his reluctance to commit troops to the fight. “We can retake territory and as long as we leave troops there we can hold it, but that does not solve the underlying dynamic that is producing these extremist groups,” he told journalists in Turkey this week.

Progress is being made on standing up Arab forces to fight in both Iraq and Syria, said Warren, citing the creation of an Arab coalition in northern Syria and ongoing efforts to train Sunni fighters in Iraq. The three-to-five-year timeline for defeating the Islamic State offered by the administration when airstrikes were launched last year is still on track, he said. “Now it’s two to four years.”

That may be an optimistic assessment, analysts say. The networks and ideology generated by al-Qaeda survived more than a decade of American presence in Iraq and Afghanistan and are likely to endure for at least another decade, Riedel said.

In the meantime, more attacks of the kind launched in Paris are to be expected, a message the Obama administration seems to be trying to convey.

“We’ve always said there’s a threat of these kind of attacks around the world until we’ve made more progress,” Secretary of State John F. Kerry told CNN in an interview Tuesday.

    

 

You Are More Than 7 Times As Likely To Be Killed By A Right-Wing Extremist Than By Muslim Terrorists

 

Friday afternoon, one week after elected officials all over the country tried to block Syrian refugees from entering their states in an apparent effort to fight terrorism, a white man in Colorado committed what appears to be an act of terrorism in a Planned Parenthood clinic.

Though the details of Robert Lewis Dear’s motives for killing three people in the clinic and injuring nine others are still being revealed, Dear reportedly told law enforcement “no more baby parts,” an apparent reference to heavily edited videos produced by the Center for Medical Progress, which numerous politicians have cited to falsely claim that Planned Parenthood sells “aborted baby parts.” Dear’s actions, in other words, appear to be an act of politically motivated terrorism directed against an institution widely reviled by conservatives.

Though terrorism perpetrated by Muslims receives a disproportionate amount of attention from politicians and reporters, the reality is that right-wing extremists pose a much greater threat to people in the United States than terrorists connected to ISIS or similar organizations. As UNC Professor Charles Kurzman and Duke Professor David Schanzer explained last June in the New York Times, Islam-inspired terror attacks “accounted for 50 fatalities over the past 13 and a half years.” Meanwhile, “right-wing extremists averaged 337 attacks per year in the decade after 9/11, causing a total of 254 fatalities.”
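The headline's multiplier comes from annualizing those two figures. Here is a quick back-of-envelope check (the arithmetic is mine, using only the numbers quoted above):

    # Hedged sanity check of the headline ratio, using the figures
    # quoted from Kurzman and Schanzer's New York Times piece.
    jihadist_rate = 50 / 13.5    # ~3.7 deaths per year over 13.5 years
    rightwing_rate = 254 / 10    # ~25.4 deaths per year over a decade
    print(rightwing_rate / jihadist_rate)  # about 6.9, roughly sevenfold

Strictly speaking, the ratio of the annualized rates is closer to "roughly seven times" than "more than seven times," but the order of magnitude is clear.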

Kurzman and Schanzer’s methodology, moreover, may underestimate the degree to which domestic terrorists in the United States are motivated by right-wing views. As they use it in their New York Times piece, the term “right-wing extremist” primarily encompasses anti-government extremists such as members of the sovereign citizen movement, although it also includes racist right-wing groups such as neo-Nazis. Thus, it is not yet clear whether Dear, who made anti-abortion remarks but also reportedly referenced President Obama, was motivated in part by the kind of anti-government views that are the focus of Kurzman and Schanzer’s inquiry.

Kurzman and Schanzer also surveyed hundreds of law enforcement agencies regarding their assessment of various threats. Of the 382 agencies they spoke with, “74 percent reported anti-government extremism as one of the top three terrorist threats in their jurisdiction,” while only “39 percent listed extremism connected with Al Qaeda or like-minded terrorist organizations.”

Meanwhile, the percentage of refugees who are connected to terrorist plots is vanishingly small.

    

 

Stochastic Terrorism

 

After months of verbal assault against Planned Parenthood and against women more broadly, Republican Christianists have gotten what they were asking for—bloodshed.

On November 27, a mass shooting left three dead and nine wounded at a Planned Parenthood clinic just miles from the headquarters of the Religious Right flagship, Focus on the Family. Was the shooting exactly what conservative Christian presidential candidates and members of Congress wanted? Maybe, maybe not. But it is what they asked for. Republican members of the Religious Right incited violence as predictably as if they had issued a call for Christian abortion foes to take up arms. Inciting violence this way is called stochastic terrorism:

“Stochastic terrorism is the use of mass communications to incite random actors to carry out violent or terrorist acts that are statistically predictable but individually unpredictable. In short, remote-control murder by lone wolf.”
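That definition makes a statistical claim: expose enough people to the incitement and the expected number of violent actors is stable, even though no one can say which individuals will act. A toy simulation (mine, not the author's; the audience size and per-person probability are invented purely for illustration) makes the point concrete:

    import random

    # Toy model of "statistically predictable, individually unpredictable":
    # each listener independently acts with a tiny probability.
    AUDIENCE = 1_000_000   # hypothetical people exposed to the rhetoric
    P_ACT = 3e-6           # hypothetical per-person chance of acting

    for trial in range(5):
        actors = [i for i in range(AUDIENCE) if random.random() < P_ACT]
        print(f"trial {trial}: {len(actors)} actors, e.g. {actors[:3]}")

    # Expected actors per trial = AUDIENCE * P_ACT = 3. Across trials the
    # count hovers near 3 (predictable in aggregate), while the specific
    # individuals differ every time (unpredictable individually).

Run it and the totals cluster around three per trial while the identities change on every run, which is exactly the asymmetry the term is coined to describe.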

In an incident of stochastic terrorism, the person who pulls the trigger gets the blame. He—I use the male pronoun deliberately because the triggerman is almost always male—may go to jail or even be killed during his act of violence. Meanwhile, the person or persons who have triggered the triggerman, in other words, the actual stochastic terrorists, often go free, protected by plausible deniability. The formula is perversely brilliant:

A public figure with access to the airwaves or pulpit demonizes a person or group of persons. With repetition, the targeted person or group is gradually dehumanized, depicted as loathsome and dangerous—arousing a combustible combination of fear and moral disgust. Violent images and metaphors, jokes about violence, analogies to past “purges” against reviled groups, use of righteous religious language—all of these typically stop just short of an explicit call to arms.

When violence erupts, the public figures who have incited the violence condemn it—claiming no one could possibly have foreseen the “tragedy.” Stochastic terrorism is not a fringe concept. It is a terrorist modality that has been described at length by analysts. It produces terrorism patterns that should be known to any member of Congress or any presidential candidate who has ever thought deeply about national or domestic security issues, which, one might hope, is all of them.

We can be confident that from the time of the standoff, communications teams for Carly Fiorina, Marco Rubio, Jeb Bush, Ben Carson, Mike Huckabee, Ted Cruz, Rick Santorum and others were scrambling to figure out the nuances of plausible deniability—weighing how best to distance themselves from the violence that killed a police officer and two others without making their protestations of surprised dismay sound as hollow as they actually are, all without actually denouncing the disgust and dehumanization of women who have abortions and those who provide them. In fact, since the slaughter, several have doubled down on victim blaming and anti-Planned Parenthood rhetoric.

For months, Republican presidential candidates and conservative Christian members of Congress have been following this script for political gain. Elected Republicans in the states have sought to intimidate women and providers by demanding the release (and even publication) of identifying information and addresses—essentially a target list for perpetrators. They know exactly what they are doing. Since abortion was legalized in the United States, providers and clinics have been the target of 41 bombings and 173 arson attacks. Since the 1990s, eleven providers, clinic staff or defenders have been murdered, including the three in Colorado:

March 10, 1993: Dr. David Gunn of Pensacola, Florida was shot and killed after being depicted in “Wanted Posters” by Operation Rescue.

July 29, 1994: Dr. John Britton and a clinic escort, James Barrett, were both shot to death outside another Florida clinic, which has been bombed twice, including in 2012.

December 30, 1994: Two receptionists, Shannon Lowney and Lee Ann Nichols, were shot and killed in Brookline, Massachusetts by an abortion foe who had previously attempted murder in Virginia.

January 29, 1998: Robert Sanderson, a security guard at an abortion clinic in Birmingham, Alabama, died when the clinic was bombed.

October 23, 1998: Dr. Barnett Slepian was killed at his home in Amherst, New York, by a shooter with a high-powered rifle.

May 31, 2009: Dr. George Tiller, who provided late-term abortions, was shot and killed in the lobby of his church, where he was serving as an usher.

November 27, 2015: Two civilians and a police officer died during a five-hour siege in which a “lone wolf” assaulted patients and providers at a Planned Parenthood clinic in Colorado Springs.

Since David Daleiden launched his baby parts hoax aimed at triggering the yuck factor and fueling outrage among gullible abortion foes, and since Republicans in high places decided that assaulting Cecile Richards (and all of the women she represents) was good electoral fodder, Planned Parenthood clinics in Washington and California have been set on fire. Righteous Christian abortion foes have made death threats against providers and clinics across the country. By November 27, law enforcement had documented nine serious criminal incidents or attempts. Now, finally, we have a mass shooting by a deranged-sounding shooter who muttered something about “no more baby parts.”

The triggerman is in custody. But the real perpetrators likely will continue to have access to pulpits, radio stations, town halls, and television, where they will express carefully crafted dismay about the carnage, hoping we all won’t notice that the hands clutching the podium are covered in blood.

    

 

Stifling Free Speech

 

Leftwing “hate mobs” are stifling free speech on British university campuses, a leading academic has complained.

Anthony Glees, an expert in security and terrorism, described being shouted down at a university event at which he supported Prevent, the government’s counterterrorism programme.

Professor Glees, the director of the centre for security and intelligence studies at the University of Buckingham, said that a new generation of students arrived at university with “closed minds” and refused to allow people to advocate opinions they disagreed with.

He was a speaker last month at a seminar at the Institute of Advanced Legal Studies, part of the University of London, where he was heckled by students opposed to the Prevent strategy. A string of student unions at leading universities have passed motions refusing to cooperate with Prevent.

Professor Glees said that he had had similar experiences over the past three or four years, including at the universities of Cambridge and Warwick. Last month the Index on Censorship, which campaigns for free speech, pointed to a growing culture of intolerance among a new generation of British students whose instincts were to try to ban people they disagreed with.

A student union official at Cardiff University tried to ban Germaine Greer, who was a role model to liberation feminist students in the 1970s, from speaking at the university this month. Students launched a petition arguing that for the university to give a platform to the writer would endorse her views questioning whether transsexual women should be identified as women.

Professor Glees blamed the intolerance on a generation of students who grew up using the internet to reinforce their own convictions. He said that they had “a sense of their own value and worth without reference to the value and worth of other people who might be different from themselves”. He also blamed a minority of leftwing university academics whom he accused of the “intellectual grooming” of students, and said that they wanted to ban speakers they disliked yet allowed Islamist extremists to speak at campus events.

“You go to university to open your mind, not to demonstrate that you have closed it,” he said. “People who go to university and have closed minds are then cosseted by academics who support that closure — it is intellectual grooming of a most dangerous kind.”

Free speech applied only within the law, and at universities academic freedom was intended to apply to tenured academics to protect those who challenged prevailing orthodoxy, not to students or visiting speakers, he said.

    

 

Twitter and Facebook Arguments

 

The big question in political psychology is how people change their minds. At this point, researchers know more about what doesn’t work than about what does. What doesn’t work, at least not usually, is appealing to factual accuracy, or making moral arguments that don’t resonate with the other person’s belief system, or ridiculing the other person for being stupid or immoral or sheeplike in their political beliefs. To the extent that researchers know what does work, it appears that meaningful political discussion is a pretty painstaking process that involves trying to get on the same page as the person you’re trying to convince: appealing to their sense of morality (which, if you’re arguing over politics, is likely quite different from your own) and trying to find some sort of common ground. It’s the sort of thing done over a lengthy, nuanced, friendly conversation — one that might be helped along by a beer or three if you and they are the drinking type.

The reason Facebook and Twitter almost always lead to excruciatingly bad political arguments is that they militate against a two-friends-debate-politics-over-beers model and nudge people toward a screaming-match-with-an-audience model. Twitter is worse by far, but neither platform allows for the sort of intricately textured, sustained political conversation that might lead to progress; on Facebook, even if you start to get somewhere, the public nature of the conversation — the pressure to perform for likes and to show your allegiance and value to your “team” — all but ensures one or both of you will revert to point-scoring, or that someone’s dumb friend will pipe in and derail the conversation.

When I think about those times in which I’ve come away from an online political debate feeling better, rather than worse, about the person I’ve debated with, those conversations have almost always involved extended correspondence on a private channel (which, yes, could include Twitter DMs or Facebook messages). I’m not usually swayed all that much, but I do start to see the correspondent as a human being who came to their beliefs for human reasons rather than as a sloganeering, brainwashed troll. That’s progress!

The public political conversations on Facebook or Twitter, though? Almost everyone comes across as a sloganeering, brainwashed troll. The content is really awful, on both sides. So if you’re annoyed by Trump supporters — or Bernie Sanders supporters, since the underlying point applies regardless — but are genuinely interested in better understanding where they come from, it might be better to filter out examples of them at their least thoughtful and most rah-rah-look-how-right-I-am. All of that awful content leads only to further dehumanization and greater distance.

Talking politics publicly on Facebook or Twitter and talking politics productively, in other words, are two very, very different things.

    

 

GOP Extremism

 

The opinion piece headed "US democracy trumps all as a dysfunctional disgrace" by Australian academic Mark Triffitt that appeared in the Herald earlier this week resembled Sherlock Holmes' curious incident of the dog in the night-time in that there was silence when there shouldn't have been.

Just as the dog didn't bark when it should have, Triffitt's piece didn't contain a single reference to Democrats and Republicans or left and right. The clear inference, therefore, is that the depressing state of US politics is the fault of both sides and dysfunction is systemic.

Take this for example: "Nearly every leading 2016 presidential candidate is uttering outright lies, mostly false statements or half truths at least half the time they open their mouths." There's a word missing between "2016" and "presidential" and that word is "Republican." As I pointed out in this space on December 4, the Pulitzer Prize-winning fact-checking organisation PolitiFact found the lies of Republican candidates to be of a vastly different order of frequency and magnitude than those of Democrats Hillary Clinton and Bernie Sanders.

While the Democrats are still recognisably the party of Franklin Roosevelt, John Kennedy and Bill Clinton, the Republican Party has been swept off its moorings. Research shows the percentage of non-centrist Republicans in Congress has gone from 10 per cent during the Gerald Ford presidency (1974-77) to almost 90 per cent, while the Democrats' ratio has remained the same. Almost half the Republicans in the House of Representatives were found to be more extreme than the most extreme Democrat.

American democracy is dysfunctional because Republicans have largely abandoned consensus politics. Democracy is essentially the means by which society manages political division. It depends on tolerance, moderation, an acceptance that roughly half the citizenry doesn't share your perspective and priorities. Hence democracy and ideology are fundamentally incompatible.

Increasingly, Republicans pay lip service to this philosophy. Rather than seeing Democrats as rivals yet partners in the governing process, they view them as threats who must be nullified, a mindset which inevitably leads to the conclusion that the end justifies the means.

A particularly blatant example of this is the Republican-led House of Representatives select committee investigating the 2012 attacks on US diplomatic compounds in Benghazi, Libya in which two diplomats and two CIA contractors were killed. That's its ostensible purpose: its real aim, as some committee members have let slip, is to sabotage Hillary Clinton's bid for the presidency. (She was Secretary of State at the time.)

As Newsweek's Kurt Eichenwald has detailed, the scale of this investigation is unprecedented although the event under the microscope is anything but. Attacks on the US embassy in Beirut in 1983 and 1984, when Republican hero Ronald Reagan was president, claimed more than 80 lives. There have been 21 major assaults on US diplomatic facilities in the past 20 years; none of the others were the subject of a congressional investigation.

Yet the Benghazi committee has been examining an episode involving four deaths for longer than Congress looked into the attack on Pearl Harbour, the Iran-Contra scandal, 9/11 or the intelligence failures that facilitated the Iraq War.

The committee can demand any document, summon any witness, ask any question. In its latest report - there have been nine thus far - Clinton's name appears 36 times, the term "terrorist" 10. And even though it's abundantly clear the exercise is a taxpayer-funded witch-hunt, the media dutifully covers it as if it's the business of government as usual.

Triffitt's "plague on both your houses" approach mirrors that of the media, which continues to cover politics as if the centre hasn't moved, as if it's still a fixed point in the middle of the political spectrum with Democrats on one side and Republicans on the other. By failing to differentiate between traditional adversarial politics played by the rules and within the conventions of the game and the new bloody-mindedness, the media has enabled the right's extremism.

Other western democracies are post-Christian societies in which the great culture war of the late 20th century is already receding into history. In the US, however, it rages on. The religious right, which is, to a large extent, the Republican base, views the culture war in apocalyptic terms: lose and everything they hold dear will disappear. So they gravitate to candidates who embody their conviction that compromise amounts to defeat and defeat means the end of the world as they know it.

    

 

The Vital Q's To Ask In Polling

 

On the evening of Thursday, May 7, 2015, I was sitting on the set of the ITV election night programme, waiting for the exit poll. I had maintained, in defiance of the data, that Ed Miliband would not be prime minister. Just before the verdict, I turned to my colleague and said: “If David Cameron is not prime minister five minutes from now, then everything I know about politics is wrong”.

That was a controversial judgment because a credulous political class had taken the opinion polls, which showed stalemate and a probable Miliband premiership, at face value. This week the interim report of the inquiry into why the opinion polls were so inaccurate concluded that they had the wrong people in their samples. The electorate on which the opinion polls were based was younger (those over 70 tend to vote Conservative), more likely to vote and far more engaged with politics than the actual electorate, the one that had decided long before that Ed Miliband would not be prime minister.

The polls therefore confirmed the besetting bias of political pundits, which is that they are too interested in politics. This leads directly to a methodological error, which is to assume that political wisdom comes from the accumulation of information. Find every piece and, eventually, when all the pieces have been assembled, a clear picture will emerge. So many political professionals can supply you with a baroque explanation for what you know to be implausible. Making the world complex is, in fact, a low form of intelligence. True political wisdom is always distilled. It is what you need to know and nothing else. It comes in the form of an aphorism, not in the pages of an encyclopaedia.

This week the Beckett report into why Labour lost the last election was published. If you read far enough you can find this definition of political knowledge in Beckett: “I realised that my own way was impoverishment, in lack of knowledge and in taking away, subtracting rather than in adding.” That is, of course, Samuel Beckett rather than Margaret. He was describing the revelation that caused him to break from his mimicry of Joyce. Subtraction, rather than addition, was what turned Beckett into a great writer.

Margaret Beckett’s report, alas, purveys no such wisdom. It is a character study for a party that is no longer interested in being serious. The Beckett report identifies a bewildering list of culprits for Labour’s defeat but never approaches self-awareness. Beckett blames, variously, populist trends across Europe, the collapse of the Liberal Democrats, Tory tall tales about Labour profligacy, the SNP and newspapers that wouldn’t cover energy prices. Then there are the Tories, who had too much money and sent too many letters to too many people; the media, which were beastly to Ed Miliband and could not cope with Labour’s nuanced approach to immigration. It was all so very difficult but it wasn’t really our fault. Beckett once contributed an essay to a collection edited by André Breton to answer the question What Is Surrealism? Surrealism is writing an official report into your election defeat and wasting your words with all this superfluous waffle.

There are only three things to know about politics and that might be one too many. The first thing to know is that your leader needs to be better than the other leader. The second thing is that your party needs a lead on economic competence. The third thing to remember is that, no matter how many opinion polls tell you that the first two things aren’t working, there really are only two things you need to remember.

In fact, two might also be one too many. It is rare for a party to be well ahead on the economy but well behind on leadership, which suggests that the leader’s ranking may be a composite, which includes an assessment of economic competence. I keep an eye on both, just because I’m excessively interested in politics. But I promise you: watch these rankings, ignore everything else and you will get the election right. The tricky, unpredictable elections, such as 1992 and 2015, all yield to this analysis. How hard can it be, really? It’s like predicting the Boat Race when one boat has a hole in it. With this in mind, it’s worth pointing out that Jeremy Corbyn has a personal rating of -39, the worst ever recorded. Unless Labour changes its leader, the 2020 election is already over. There is nothing to see or say. Politics has gone on holiday, on a motorbike to eastern Europe with Diane Abbott.

Even reliable data is only a source of evidence rather than the first draft of the gospel. The polling industry will regain its accuracy and its work will be valuable again. Those of us who participate in and comment on politics, though, need to free ourselves from the chains of information. Any reputable statistician will stress that data always needs to be interrogated. If it contradicts the leadership numbers and it contradicts your own instinct, which accumulates over years of experience, that the man whose economic policy has made a TV audience laugh will not win, then there is something awry.

The leadership and economic numbers told the story of 2015 all along. The opinion poll inquiry has shown there was no fabled late swing. There has been a lot of chatter about whether the Tory strategist Lynton Crosby was worth the £2.4 million he appears to have been paid. Sir Lynton has a fine political intelligence and he is in fact chief among those who believe in distilled wisdom. “Scraping the barnacles off the boat” is the way he describes the importance of abandoning irrelevant arguments. But political commentators vastly overstate the importance of what they call political strategy. Does anyone suppose that if Labour had employed Crosby, Ed Miliband would now be prime minister? The Tories could put Crispin Blunt in a room full of poppers and ask him to run the 2020 campaign. They’d still win.

The 2015 election took place when Labour chose its leader. Ed Miliband was not a viable prime minister and he struck the wrong political tone. Jeremy Corbyn is an embarrassment and, as scientists say of a bad theory, not even wrong. Whatever the polls say, he is good enough to dip below 25 per cent. It is all there in Beckett whose subject is the will to survive in the face of despair. The two characters in Endgame trapped in dustbins. Vladimir and Estragon, waiting for the next Tony Blair to arrive. And the great last line of The Unnamable, speaking for vanquished moderates and voicing what I thought on the set at ITV on May 7: “I can’t go on. I’ll go on”.

    

 

The Welfare State Is Breeding Losers

 

It is a theory that has not made him many friends in liberal circles, but when you suggest that poverty is partly a result of personality, and that the welfare state is creating a “production line” of damaged children who become less employable adults, then you have to be ready to take flak.

Adam Perkins is prepared to be unpopular because he believes it to be true. His argument that the welfare state is breeding people with “employment-resistant” traits upset enough people for the London School of Economics to cancel a talk about his research, citing security concerns.

By stopping him from speaking, and exiling the academic to the growing ranks of pundits and provocateurs “no-platformed” for daring to espouse an unpopular opinion, the activists have served only to draw attention to his book, The Welfare Trait. In it Dr Perkins, a lecturer in the neurobiology of personality at King’s College London, suggests that the benefits culture unwittingly conspires to keep people poor. Cue the hullabaloo. Sitting in his tiny office in south London, Dr Perkins said it was a “gnawing of the conscience” that encouraged him to speak out and break the taboo. “People are reluctant to accept that personality plays any role at all. That’s what I call a flatearther. They are denying the evidence because they feel that it is placing blame. But I’m saying the person who gets produced as a result of this policy is not to blame. I am pointing my finger at the policymakers,” he insists.

“The welfare state has gone from a safety net to a production line of damaged personalities. I’m pretty sure that’s not what Lord Beveridge had in mind.” He pulls down his copy of the original 1942 report from a shelf and hands it over. “Well worth a read,” he says. The Welfare Trait is his first book —“and probably my last,” he adds, with a sheepish giggle.

There are two parts to his theory: first, that generous benefits “artificially inflate” the number of children born at the lowest end of the economic scale; and second, that while some children at the extreme ends of the spectrum are hardwired to fail or succeed regardless of their background, the personality of the average ones in the middle can be unfairly stunted by disadvantage. By the time they get to applying for jobs, children born to people on benefits will, on average, be less reliable, less co-operative and less conscientious than their better-off peers, he claims.

“What this personality difference means,” he says, “is the effect of cash transfers to that section of the population may not be the same as the effect of cash transfers to the average population, so you end up with the situation that extra money is being given to people who are, on average, less conscientious than the rest of the population and so the likelihood of that money being managed conscientiously to the benefit of the children of the recipients is lower.” He stresses: “Not in every case. Just on average . . . in the same way most professional basketball players are tall but you can find guys who are short and nippy.”

You start to see why he didn’t go down well in the students’ union. Lest anyone detect a whiff of disdain for the great unwashed, he says: “In that case I’m vilifying myself.” And here’s the twist: “I’m in the unconscientious category myself. I have the same employment-resistant profile.”

Dr Perkins, 43, spent most of his twenties “bumming around” between dead-end jobs and the dole. Though there is nothing too sink-estate about being the son of a civil servant and a marine, raised in a Devon village, he failed his A levels. He graduated with a 2:1 in biology from Cardiff University but admits: “I just loafed around. I was unemployable compared to my peers who had spent four summers working at Pfizer. I couldn’t be bothered.”

He moved to London and spent ten years drifting from one dead-end job to another. Warehouse work. Data entry. Casual shifts as a labourer. He estimates he has been sacked about eight times: once from a clothing warehouse in Wandsworth “for sloppy stock control”; another time from Sainsbury’s headquarters “for making too many errors in my data entry”. The problem? His personality every time. “I have a very patchy work record.”

He believes his own fecklessness gives him a certain insight unavailable to more conscientious academics. “Deep down, if I hadn’t got a place to save up money to do a masters, and was still working in the warehouse, I probably would have quit and gone on the dole. That’s shameful to admit but those labouring jobs are tough. They’re cold. They’re really, really tough. My hands used to bleed.”

The real reason his critics are so uncomfortable with his argument, he says, is because “they are wedded to the idea that we’re just like leaves being blown around by the economic winds of global business”. He adds: “That’s very comforting because there is no personal involvement. The trouble with this narrative is that it doesn’t take account of all these studies that say personality does influence chances of employment.”

He doesn’t mind if people disagree, but silencing unpopular views helps no one. “When it comes to science there’s no downside to discussion. For good scientific ideas, discussion will help them get adopted earlier. For bad scientific ideas, discussion will get them debunked earlier,” he says. “Personality is not the whole story . . . but until people start discussing this scientifically, we won’t know.”

    

 

African Strongmen

 

When Robert Mugabe celebrated his 92nd birthday last month, the focus was on his cakes. There was a six-storey cake crowned with his official portrait, with a sugar-icing crocodile snaking round its base. There was also an edible map of Africa at least a metre square. More striking, however, was the way he ate his slice. Slowly, with the bowl held close to his mouth. His eyes gazed into the food but his mind seemed miles away. It was a scene familiar to anyone who has visited an old people’s home: Africa’s inveterate strongman looks very frail indeed. Mr Mugabe is learning the curse of every dictator. He cannot live for ever (though that won’t stop him trying) and when he goes, Zimbabwe will confront a crisis that it has been putting off ever since 1980: how to choose a successor.

Strongmen follow a familiar pattern. A charismatic rebel leader swaps his bush fatigues for a smart suit and fills his cabinet with comrades. The country is gripped by euphoria at the end of years of strife. Anything is better than war or genocide, so people dare not ask too much about civil rights. Foreign donors, whether they’re governments or companies, queue up to help because they want peace to succeed. The strongman, they are pleased to say, is someone they can do business with. Such a charming man. He understands development, he’s encouraging investment (he might even have mineral resources) and he’s an Arsenal fan to boot. What do you mean he had someone bumped off? Are you sure? Well, these things are always a bit messy.

Corruption and oppression start to metastasise in the body politic, but progress needs stability, the donors insist. This is no time to cut them out. Sure, it’s far from perfect, but then it took western democracy centuries to evolve. Transparency and accountability? We can get around to those later. Only that doesn’t seem to happen. The strongman believes his own hype. He was there at his country’s rebirth. He has kept it on a steady path. It would fall apart without him.

Quietly at first, he starts to consolidate power. That rouses opposition among his closest aides, best placed to see his scheming, and eventually someone challenges him, which only proves our strongman right. If even these loyal cadres who braved the bush as brothers cannot stay united, what hope is there for the country? The strongman turns to the military and the security services to increase his grip on power. Perhaps his opponents chose exile.

If they are Rwandan that could mean a lifetime of looking over their shoulder for one of President Kagame’s assassins. Meanwhile Britain invests in his baby food factories. If the opponents are Ugandan, they will face exile, imprisonment or frequent arrest. That is OK because President Museveni is focused on stability and growth. Uganda is on course to become a middle-income nation in 36 years’ time and the Department for International Development is always there to help. The strongman is a tactician. He knows he must keep the donors on side. So of course he will hold elections. And yes, of course, he will win. Then he will win again. The constitution says that’s enough, but by now our strongman is addicted. Unfortunately, so are the donors.

They expressed concern when Rwanda, Burundi and Congo (Brazzaville) abolished their two-term limits last year so their presidents could cling to power, but no one turned off the taps. They put stability over democracy without realising it’s a false choice. In the absence of the rule of law, these countries are simply storing up trouble for the future.

President Museveni won a fifth term last month in an election that was neither free nor fair. President Deby of Chad, who has ruled since 1990, is expected to win a fifth term next month. President Kabila of the Democratic Republic of Congo is yet to set a date to elect his successor, which suggests he isn’t planning to stand down, as constitutionally required, at the end of this year.

Africa’s march towards democracy has stalled. In 1990, just three of its 48 mainland countries were democracies, according to the US-based campaign group Freedom House. By 1994, that number was 18. Now, only 19 qualify. This march was supposed to be led by a rising middle class who would demand better and more accountable government. But the middle class isn’t there. People who earn £250 a month selling mobile phones or £300 a month installing internet broadband aren’t rich enough to stop Barclays selling off their Africa business or stop Nestle retrenching. The world’s biggest food manufacturer cut half its product lines in Africa last year and laid off 15 per cent of its staff.

Strongmen usually fall out with their friends when they grow tired of donors’ nagging. Some fall back on natural resources, or find new friends such as China (who often want those resources), but neither route is particularly lucrative right now. Growth falters under the weight of corruption. What strongmen have in common is that they die. Successions do not go smoothly and the legacy of a dead strongman is rarely good.

    

 

Marxism Always Fails

 

Every Marxist government in history has been a repressive nightmare. Marxists — aside from the ones who defend the remaining Marxist regimes — consider this a strange coincidence that has no bearing on Marxist ideology. I recently pointed this out, in light of the resurgence of Marxist thought among some left-wing intellectual circles. In an essay in In These Times, Tyler Zimmer writes what he purports to be a response, but that in fact confirms my point for me.

The problem with Marxism, I argue, lies in its class-based model of political rights. Liberalism believes in political rights for everybody, regardless of the content of their ideas. Marxists believe political rights belong only to those arguing on behalf of the oppressed — i.e., people who agree with Marxists.

Zimmer begins by insisting that self-described Marxist regimes such as the Soviet Union, Maoist China, Cuba, North Korea, etc., all of whose leaders were immersed in Marxist thought, were not real Marxists at all. (Zimmer: “[T]hese authoritarian monstrosities had virtually nothing to do with [what] Marx himself said or did.”)

Zimmer proceeds to explain why the liberal idea that everybody should enjoy the same right to express their political ideas is a failure, and lays out the Marxist concept of what free speech should really mean:

Marxists value free speech because they are committed to building a society where all can decide matters of public concern democratically, as genuine equals. Thus, the Marxist has a consistent way of explaining why speech that aims to dominate or marginalize others should be challenged rather than protected: it is contrary to the very values animating our commitment to free speech in the first place.

This explains why, to quote Jelani Cobb, “the freedom to offend the powerful is not equivalent to the freedom to bully the relatively disempowered.” It also provides a principled, consistent basis for opposing and disrupting the public acts of openly racist organizations that seek to subordinate, harm, scapegoat or marginalize others. [T]he (socialist) goal of cooperating and governing public life together as full equals gives us a principled criterion for deciding which forms of expression deserve protection and which don’t.

Zimmer is articulating the standard left-wing critique of political liberalism, and all illiberal left-wing ideologies, Marxist and otherwise, follow the same basic structure. These critiques reject the liberal notion of free speech as a positive good enjoyed by all citizens. They categorize political ideas as being made on behalf of either the oppressor class or the oppressed class. (Traditional Marxism defines these classes in economic terms; more modern variants replace or add race and gender identities.) From that premise, they proceed to their conclusion that political advocacy on behalf of the oppressed enhances freedom, and political advocacy on behalf of the oppressor diminishes it.

It does not take much imagination to draw a link between this idea and the Gulag. The gap between Marxist political theory and the observed behavior of Marxist regimes is tissue-thin. Their theory of free speech gives license to any party identifying itself as the authentic representative of the oppressed to shut down all opposition (which, by definition, opposes the rights of the oppressed). When Marxists reserve for themselves the right to decide “which forms of expression deserve protection and which don’t,” the result of the deliberation is perfectly obvious.

In the contemporary United States, these ideas have limited reach: only in certain communities (like college campuses) does the illiberal left have the power to implement its vision, and even there it is constrained by the U.S. Constitution. If illiberal ideas were to gain more power, the scale of their abuses would widen accordingly.

    

The Lesson of Chernobyl

 

Exactly 30 years ago today, Pravda, the official Communist Party newspaper, published its first substantial report on the nuclear explosion at Chernobyl, ten days after the worst nuclear accident in history. It was the first formal indication that anything was seriously amiss.

In the days after the explosion at the nuclear power plant in northern Ukraine, 120,000 people were evacuated from a 30km exclusion zone, as frantic efforts were made to drain the plant of radioactive water and encase the site in concrete. More than 200 people suffered acute radiation sickness, dozens of them fatally, and thousands more would endure the debilitating and life-shortening effects of a spew of radioactivity 400 times greater than that of the Hiroshima bomb. Of this, the vast majority of Soviet citizens knew nothing.

The Chernobyl meltdown devastated thousands of lives, but the fallout from the ensuing cover-up was historically more significant: it poisoned relations between Moscow and its satellites, undermined faith in Communist rule and brought down the Soviet Union. It also demonstrated, in the most dramatic way, what happens when a state tries to contain and conceal unpleasant truth in a concrete covering of lies.

Chernobyl, Vladimir Putin said last month, should serve as a “harsh lesson to humanity”. But the main lessons — about official dishonesty, state control of the media and domination of neighbouring states — are precisely those Putin is choosing to ignore.

The news that something terrible had happened at Chernobyl came not from the USSR but from Sweden, two days after the event, when scientists there detected a surge of radiation. Moscow initially denied there was a problem, and three days passed before the first acknowledgement appeared in the Soviet media. In a terse, 15-second announcement, the 21st item on the news, an announcer declared: “The trouble has passed.”

The May Day parade went ahead in Kiev, even as officials were evacuating their own families, aware of the radioactive cloud drifting overhead. When Pravda printed its report on May 6, it claimed the situation was “under control” and focused on the courage of firefighters. Three weeks after the explosion, Mikhail Gorbachev, the general secretary of the party, finally commented on the accident, but accused the US and Nato of unleashing a “poisoned cloud of anti-Sovietism”. Gorbachev later insisted the scale of the disaster had been kept from him too.

When the truth finally emerged, relations between rulers and ruled within the Soviet Union changed radically, and permanently. Glasnost was in its first phase, but the Chernobyl catalyst set off a chain reaction that made demands for greater openness unstoppable: Soviet citizens had been lied to on a grand scale, and now they knew it.

A series of myths went up in smoke at Chernobyl. Far from protecting the people, officials were revealed as lazy, corrupt, and brutally indifferent to the health of both the population and the environment. Nuclear power had been a central symbol of Soviet scientific prowess; now that reputation for technological mastery lay in ruins. Obedient Soviet citizens who had never questioned the state came to see it as fallible, insincere, even actively dangerous. An emboldened media began investigating other areas of Soviet life, including the crimes of Stalinism and the real state of the economy. More than half of all Soviet nuclear power stations had been built outside Russia, and the radiation that rained down on Ukraine and Belarus seemed to confirm Moscow’s willingness to put non-Russians at greater peril. Independence movements surged.

Soviet planes were said to have been used to “seed” clouds east of Chernobyl, to prevent the radioactive cloud spreading to Russia. Whether that was true or not was politically irrelevant — the corrosive damage lay in the belief that it could be.

Five years after Chernobyl, the Soviet Union was dissolved, destroyed from within by political liberalisation and media freedom. The underlying economic, social and political causes of that disintegration had been brewing long before the accident, but Gorbachev has asserted that the disaster, “even more than my launch of perestroika [restructuring], was perhaps the real cause of the collapse of the Soviet Union”.

Putin was a rising KGB star at the time. Instead of seeing a warning in the political meltdown that resulted from stifling the truth, Putin’s Russia adopted many traits of the Soviet Union, including tight media control, intimidation of neighbours (notably Ukraine) and the instinct to blame western propaganda for setbacks.

In a statement to mark the 30th anniversary of Chernobyl, Putin made no mention of the cover-up and instead stressed the “heroic” response of the emergency services. “The magnitude of the tragedy could have been immeasurably larger if it were not for the incomparable bravery and self-sacrifice of firefighters, military personnel, experts and medics who honourably fulfilled a citizen’s duty.” The language is authentic Soviet boilerplate. The official news agency Tass described the suggestion that “the authorities concealed the truth from the population” as a “delusion”.

 

Today the area around Chernobyl remains a ghost world, an abandoned countryside reclaimed by wolves and bears, with weed-sprouting tarmac and towns of crumbling concrete. This is the graveyard of the Soviet Union; the epitaph to a repressive, deceitful form of government that never quite died.

 

 

The Politics of Pragmatic Compromise

 

If you only read short summaries of President Obama’s commencement address at Howard University, you probably missed the thrust of his remarks: an extended argument against the political far left. With the exception of a handful of digressions and jokes, this case formed the spine of the speech, a detailed defense of his political style combined with a rebuttal of his critics on the left.

 

1. The world has grown fairer and more prosperous over the course of his adult life, especially with respect to racial equality. “America is a better place today than it was when I graduated from college,” he began, repeating the line for emphasis. Dismissing the straw man of a “post-racial society,” an unrealistic expectation he noted he had never promised, Obama emphasized that opportunities for African-Americans have expanded across society:

In my inaugural address, I remarked that just 60 years earlier, my father might not have been served in a D.C. restaurant — at least not certain of them. There were no black CEOs of Fortune 500 companies. Very few black judges. Shoot, as Larry Wilmore pointed out last week, a lot of folks didn’t even think blacks had the tools to be a quarterback. Today, former Bull Michael Jordan isn’t just the greatest basketball player of all time — he owns the team. (Laughter.) When I was graduating, the main black hero on TV was Mr. T. (Laughter.) Rap and hip hop were counterculture, underground. Now, Shonda Rhimes owns Thursday night, and Beyoncé runs the world. (Laughter.) We’re no longer only entertainers, we’re producers, studio executives. No longer small business owners — we’re CEOs, we’re mayors, representatives, Presidents of the United States.

Obama lays out this predicate in detail because it is the most important premise of his argument. Bernie Sanders has argued that “it’s too late for Establishment politics” — that progress has been too meager to be worth continuing, and that a radical new course, a metaphorical “revolution,” is required to truly make a difference. Though he would not embrace so loaded a term, Obama is making the case that the dreaded “Establishment politics” is working.

2. Political change is necessarily incremental. Not only is incremental progress working, but there is no other alternative. Obama cited the Civil Rights Act, the Voting Rights Act, and the Emancipation Proclamation as imperfect political compromises. “They did not make up for centuries of slavery or Jim Crow or eliminate racism or provide for 40 acres and a mule,” but they made the world better. The belief that compromise is immoral leads to distrust of the political mechanisms that actually can produce positive change, making those systems less effective as people lose hope in them:

If you think that the only way forward is to be as uncompromising as possible, you will feel good about yourself, you will enjoy a certain moral purity, but you’re not going to get what you want. And if you don’t get what you want long enough, you will eventually think the whole system is rigged. And that will lead to more cynicism, and less participation, and a downward spiral of more injustice and more anger and more despair. And that’s never been the source of our progress. That’s how we cheat ourselves of progress.

3. Successful change can only be accomplished by persuading those who don’t share your beliefs. Obama invoked a police reform bill he helped pass through the Illinois state legislature, frankly confessing that it could not have passed had he not persuaded police to support it. It may have been true that police abuse was rampant, but by approaching the negotiation from a position of respect and empathy for the pressures faced by well-intended members of law enforcement, he was able to build consensus. “Change requires more than just speaking out — it requires listening as well,” he said. “In particular, it requires listening to those with whom you disagree, and being prepared to compromise.” Browbeating does not work:

The point is, you need allies in a democracy. That’s just the way it is. It can be frustrating and it can be slow. But history teaches us that the alternative to democracy is always worse. That’s not just true in this country. It’s not a black or white thing. Go to any country where the give and take of democracy has been repealed by one-party rule, and I will show you a country that does not work.

And democracy requires compromise, even when you are 100 percent right.

4. Protest is just one part of bringing change. Obama praises the role of demonstrations in bringing issues onto the political agenda, but insists that protest alone is useless unless it leads to negotiated political resolution:

You see, change requires more than righteous anger. It requires a program, and it requires organizing. … We remember Dr. King’s soaring oratory, the power of his letter from a Birmingham jail, the marches he led. But he also sat down with President Johnson in the Oval Office to try and get a Civil Rights Act and a Voting Rights Act passed. ...

Brittany Packnett, a member of the Black Lives Matter movement and Campaign Zero, one of the Ferguson protest organizers, she joined our Task Force on 21st Century Policing. Some of her fellow activists questioned whether she should participate. She rolled up her sleeves and sat at the same table with big city police chiefs and prosecutors. And because she did, she ended up shaping many of the recommendations of that task force. And those recommendations are now being adopted across the country — changes that many of the protesters called for. If young activists like Brittany had refused to participate out of some sense of ideological purity, then those great ideas would have just remained ideas. But she did participate. And that’s how change happens.

5. Democratic deliberation must be open. The hard work of persuading a majority to work with you means taking their concerns seriously. Open discourse means, rather than beginning from the assumption that your side represents tolerance and goodness and your opponents’ side bigotry, de-moralizing the debate (treating disagreement as legitimate difference rather than as proof of bad character) where it is possible to do so. The spreading impulse on the left to shut down ideas it finds offensive is counterproductive and undemocratic:

Our democracy gives us a process designed for us to settle our disputes with argument and ideas and votes instead of violence and simple majority rule.

So don’t try to shut folks out, don’t try to shut them down, no matter how much you might disagree with them. There’s been a trend around the country of trying to get colleges to disinvite speakers with a different point of view, or disrupt a politician’s rally. Don’t do that — no matter how ridiculous or offensive you might find the things that come out of their mouths.

This last point is especially interesting to me, since the growing strain of illiberalism on the left, which habitually tries to shut down opposing views on any identity-related question, is something of a hobbyhorse of mine. I’m grateful for the hate-clicks, as well as for the proliferation of rebuttals that actually substantiate my argument. At the same time, this weekend’s address is at least the fourth time Obama has denounced political correctness. He first did so in a speech in September, again in an interview with George Stephanopoulos in November (“And so when I hear, for example, you know, folks on college campuses saying, ‘We’re not going to allow somebody to speak on our campus because we disagree with their ideas or we feel threatened by their ideas —’ you know, I think that’s a recipe for dogmatism”), and again in an interview with Steve Inskeep in December.

While my criticisms of p.c. have generated many, many responses from the left, I have noticed an almost complete dearth of left-wing responses to Obama’s, which run along almost identical lines to my own. This seems odd because — and I can say this without any suspicion of false modesty — Barack Obama is far more influential than I am. Every time Obama denounces the left’s practice of suppressing opposing views, I search the sources that defend (or deny) p.c. for outraged rebuttals, and I have found none. My suspicion is that, because p.c.-niks rely so heavily on identity to discredit opposing views, it is convenient for them to identify opposition to p.c. with a white male, and highly inconvenient to identify it with a famous, liberal African-American. But I’m open to alternative, less ungenerous explanations for why Obama’s repeated attacks on p.c. have been met with such conspicuous silence.

In any case, Obama has concluded that the left, and especially the young left, has turned away in important respects from his political values. In the final year of his presidency, he has begun to defend his own ideals with increasing force and urgency.

Other Resources

Go to Index page