
David Warsh: What an economist and public servant! And his brother was a spy

Sir Alexander Cairncross

Milton Friedman was recognized with a Nobel Memorial Prize in 1976, but something more important to economics happened that year, and I don’t mean the bicentennial of the American Revolution. The Glasgow Edition of the Works of Adam Smith appeared that year as well, timed to commemorate the two hundredth anniversary of the publication of An Inquiry into the Nature and Causes of The Wealth of Nations.

“Modern economics can be said to have begun with the discovery of the market,” began Sir Alexander Cairncross, chancellor of the University of Glasgow, in his opening address to the convocation that introduced the new edition.

He continued:

“Although the term ‘market economy’ had yet to be invented, its essential features [were already in view, and economists] have debated the strength and limitations of market forces [ever since] and have rejoiced in their superior understanding of these forces. The state, by contrast, needed no such discovery.”

Cultural entrepreneurs in economics, even the most effective among them, such as John Maynard Keynes and Milton Friedman, do their work against the background of hard-earned knowledge of others, standing on the shoulders of giants and all that.

Eight beautiful volumes had rolled off the presses: two containing The Wealth of Nations; another with Smith’s first book, The Theory of Moral Sentiments; three more volumes of essays, on philosophical subjects (including the famous essay on the history of astronomy), jurisprudence, and rhetoric and belles lettres; a collection of correspondence and some odds and ends; and, in the eighth, an index to them all.

Each contains introductions by top Smith scholars, with edifying asides tucked in among the footnotes. Two companion volumes accompanied the release, published by Oxford University Press: Essays on Adam Smith, and The Market and the State: Essays in Honour of Adam Smith, by way of penance. Smith had been educated at the University of Glasgow but scorned Oxford, where he spent six post-graduate years, mostly reading. Inexpensive editions of any or all of the Glasgow edition volumes can be had from the Liberty Fund.

A feast, in other words, for those interested in thinking about such things.

One such was Cairncross, whose Wikipedia entry begins this way:

“Sir Alexander Kirkland Cairncross KCMG FRSE FBA (11 February 1911 – 21 October 1998), known as Sir Alec Cairncross, was a British economist. He was the brother of the spy John Cairncross [worth reading!] and father of journalist Frances Cairncross and public health engineer and epidemiologist Sandy Cairncross.”

More to our point, Cairncross was chancellor of Glasgow University for twenty-five years (1971-1996). It was he who commissioned the Glasgow edition of Smith. He delivered the inaugural address I quoted above.

Before that, however, Cairncross became an economist, as an undergraduate at Glasgow and then, beginning in 1932, at Trinity College, Cambridge, under John Maynard Keynes and his increasingly incensed rival, Dennis Robertson. Keynes published his General Theory of Employment, Interest, and Money to great excitement in 1936; Robertson steered Cairncross away from theory and into applied economics. After graduating with honors, he returned to Glasgow as a lecturer and wrote a textbook.

His service in government during World War II and after was extensive and exemplary:  the Ministry of Aircraft Production; Treasury representative at the Potsdam Conference; a stint at The Economist; adviser first to the Board of Trade, then to the Organization for European Economic Cooperation; 10 years as Professor of Applied Economics at Glasgow; then, for another decade, various high-ranking positions in the Treasury.  The appointment as Glasgow’s chancellor came in 1971.

If you are interested in post-war Britain, particularly the Sixties, the British Academy’s biographical memoir on Cairncross makes interesting reading. Quietly told in 1964 about his brother’s treachery as a paid agent of the KGB, he called it “perhaps the greatest shock I ever experienced.”

Cairncross was a Keynesian economist, his biographers say. He was critical of monetarism and dismissed the idea of a “natural” rate of unemployment as absurd. He considered that industrial planning, while necessary in wartime, was no model for peacetime governments. Cairncross “shows that you don’t have to be flamboyant to achieve great influence,” wrote a former boss, “and that you do not have to be malicious to be interesting.”

“By some odd quirk of memory,” his biographers write, Cairncross neglected to mention in his autobiography, Living with the Century, the Glasgow edition of the works of his fellow Scot that he had commissioned, “although he himself had given the opening paper.” Yet that comprehensive record of the circumstances from which modern economics emerged – with the founding work, An Inquiry into the Nature and Causes of the Wealth of Nations, at its center – may have been his single most durable accomplishment. Cairncross concluded his introductory address to the convocation this way:

“We are more conscious perhaps than Adam Smith of the need to see the market within a social framework and of the ways in which the state can usefully rig the market without destroying its thrust. We are certainly far more willing to concede a larger role for state activities of all kinds. But it is a nice question whether this is because we can lay claim, after two centuries, to a deeper insight into the forces determining the wealth of nations or whether more obvious forces have played the largest part: the spread of democratic ideals, increasing affluence, the growth of knowledge, and a centralizing technology that delivers us over to the bureaucrats.”

For an accounting of the lives among the bureaucrats of some distinguished present-day economists, see this column next week.

David Warsh, a veteran columnist and an economic historian, is proprietor of Somerville-based economicprincipals.com, where this column originated.

David Warsh: 'Suzerainties in economics are personal'

The Great Dome at the Massachusetts Institute of Technology, in Cambridge, Mass.

SOMERVILLE, Mass.

When I was a young journalist, just starting out, the economist whose writings introduced me to the field was Gunnar Myrdal. He hadn’t yet been recognized with a Nobel Prize – that happened in 1974, when, a socialist harnessed to an individualist, he shared the award with Friedrich Hayek. But he had written An American Dilemma: The Negro Problem and Modern Democracy (1944), about the policy of segregation that had been restored de jure after the U.S. Civil War. A subsequent project, Asian Drama: An Inquiry into the Poverty of Nations (1968), longer in preparation, was in the news.

Myrdal’s pessimistic assessment of the prospects for economic growth in India, Vietnam, and China began to fade soon after it appeared. The between-the-wars era of economics in which he was prominent had already been superseded by a new era, dominated by Paul Samuelson, whose introductory college textbook Economics (1948), supplemented by the highly technical Foundations of Economic Analysis (1947), quickly replaced Alfred Marshall’s Principles of Economics, whose first edition had appeared in 1890.

Basic textbooks dominate their fields by dint of the housekeeping that they establish.  Samuelson has ruled economics ever since through the language he promulgated; mathematical reasoning was widely adopted within a few years by newcomers to the profession.  Ruling textbooks are sovereign. Since the discovery and identification of the market system two hundred and fifty years ago, there have been only five such sovereign versions: Adam Smith, David Ricardo, John Stuart Mill, Alfred Marshall, and Samuelson (brought up to knowledge’s frontiers thirty years ago by Andreu Mas-Colell).

Sovereignty is binary; it either exists or doesn’t. A suzerainty, on the other hand, though part of the main, sets its own agenda. John Fairbank taught that Tibet was a suzerainty of China. (This Old French word signifies a medieval concept, adopted here to describe modern sciences, as in Dani Rodrik’s One Economics, Many Recipes (2007).)

Suzerainties are personal. They rule through personal example. Replacing Myrdal as suzerain in my mind, in 1974, practically overnight, was Robert Solow. Eight years his junior, Solow was Samuelson’s research partner at the Massachusetts Institute of Technology for the next thirty years. Samuelson retired in 1982 and died in 2009. Solow soldiered on.

Solow turned 99 last week, hard of hearing but sharp as ever otherwise (listen to this revealing interview if you doubt it). By now his suzerainty has passed to Professor Sir Angus Deaton, 78, of Princeton University.

What is required to become a suzerain? Presidency of the American Economic Association and a Nobel Prize are probably the basic requirements: recognition by two distinct communities, one for good citizenship within the profession, the other for scientific achievement beyond it, to the benefit of all humanity.

In Deaton’s case, as in Myrdal’s, it helps to have displayed a touch of Alexis de Tocqueville, whose two-volume classic of 1835 and 1840, Democracy in America, set the standard for criticism by a visitor from another culture and, in the process, founded the systematic study we today call political science. Deaton grew up in Scotland, earned his degrees at Cambridge University, and was professor of economics at the University of Bristol for eight years before moving to Princeton in 1983. For the first twenty years he taught and worked in relative obscurity on intricate econometric issues. In 1997, he began writing regular letters for the Royal Economic Society Newsletter, reflecting on what he had learned recently about American life, “sometimes in awe, and sometimes in shock.”

In 2015, Deaton was recognized by the Nobel Foundation for “his analysis of consumption, poverty, and welfare”; two years earlier he had published The Great Escape: Health, Wealth, and the Origins of Inequality (2013). In 2020, just as the Covid epidemic began, Deaths of Despair and the Future of Capitalism appeared, by Deaton and Anne Case, his wife and fellow Princeton economist. It became a national best-seller, focusing attention on the fact that life expectancy in the United States had recently fallen for three years in a row – “a reversal not seen since 1918 or in any other wealthy nation in modern times.”

Hundreds of thousands of Americans had already died in the opioid crisis, they wrote, tying those losses, and more to come, to “the weakening position of labor, the growing power of corporations, and, above all, to a rapacious health-care sector that redistributes working-class wages into the pockets of the wealthy.”

Now Deaton has written a coda to all that. Economics in America: An Immigrant Economist Explores the Land of Inequality (Princeton, 2023), which will appear in October, offers a backstage tour of the years in which Deaton has been near or at the pinnacle of the profession. I spent most of Friday and Saturday morning reading it, more than I ordinarily allot to a book, and found myself absorbed in its stories about particular people and controversies, on the one hand, and, on the other, increasingly apprehensive about finding something pointed to say about it.

Then it occurred to me. I have long been a fan of Ernst Berndt’s introductory text, The Practice of Econometrics: Classic and Contemporary (Addison-Wesley, 1991), mainly because it scattered one- or two-page profiles of leading econometricians throughout pages of explication of their ideas and tools. Deaton’s new book is far better than that, because no equations are to be found in it, and parts of some of those letters to British economists have been carefully worked in.

The argument about David Card and the late Alan Krueger’s celebrated paper about a natural experiment with the minimum wage along the two sides of the Delaware River, in New Jersey and Pennsylvania, is carefully rehashed (both were Deaton’s students). The goings-on at Social Security Day at the Summer Institute of the National Bureau of Economic Research are described. The “big push” debate in development economics among William Easterly, Jeffrey Sachs, Treasury Secretary Paul H. O’Neill, and Joseph Stiglitz gets a good going-over. Econometrician Steve Pischke’s three disparaging reviews of Freakonomics are mentioned. Robert Barro and Edward Prescott are raked over with dry Scottish wit; Edmond Malinvaud, Esra Bennathan, Hans Binswanger-Mkhize, and John DiNardo are celebrated. The starting salaries of the most sought-after among each year’s newly minted economics PhDs are discussed.

My taste is for theory, because developments in theory are where news is apt to be found. That’s why I liked The Great Escape and Deaths of Despair so much. Economics in America is undoubtedly the best book about applied economics I’ve ever read, in its breadth and depth. But it is a book about applied economics – the meat-and-potatoes topics that I have tended to avoid over the years. What I craved when I finished was a book about the one-time land of equality that is Britain today.

Other suzerainties exist in economics. The same credentials apply, singly or in combination: presidency of the AEA and/or realistic hopes of a Nobel Prize. They tend to be associated with particular universities: Robert Wilson, Guido Imbens, Susan Athey, Paul Milgrom and Alvin Roth at Stanford; George Akerlof (emeritus), David Card and Daniel McFadden at Berkeley; Claudia Goldin and Lawrence Katz at Harvard; William Nordhaus and Robert Shiller at Yale; James Heckman and Richard Thaler at Chicago; Daron Acemoglu and Peter Diamond at MIT; Sir Angus Deaton, Christopher Sims, and Avinash Dixit at Princeton.

Alas, the reigning head of the suzerainty in which I am most interested, macroeconomist Robert Lucas, died earlier this year, and won’t soon be replaced. He succeeded Sherwin Rosen, his best friend in the business, in the AEA presidency in 2001. Rosen died the same year, a decade or two short of what might have been his own trip to Stockholm.

David Warsh, a veteran columnist and an economic historian, is proprietor of Somerville-based economicprincipals.com, where this column originated.

David Warsh: The mid-terms, ‘national conservatism’, U.K. confusion, Anglo-Saxon invasion

Ohio Democratic Congressman and U.S. Senate candidate Tim Ryan

SOMERVILLE, Mass.

The way I see it, the United States and its NATO allies goaded Vladimir Putin into a war that Russia cannot hope to win and which Putin is determined not to lose.  What will happen next? The U.S. mid-term congressional elections, that’s what.

Interesting election campaigns are unfolding all across the nation. I take everything that Republican Party strategist Karl Rove says with a grain of salt, but suspect he is correct when he predicts the GOP will pick up around 20 seats in the House in November, enough to give Trumpist Republicans a slender majority there for the next two years.  The Democrats likely will control the Senate, setting the stage for the 2024 presidential election.

To my mind, the most interesting contest in the country is the Ohio Senate election involving 10-term Congressman Tim Ryan and Hillbilly Elegy author J. D. Vance, a lawyer and venture capitalist. That’s because, if Ryan soundly defeats Vance, he’s got a good shot at becoming the Democratic presidential nominee in 2024. Ryan and Vance have agreed to two debates, Oct. 10 and 17.

By now, it goes practically without saying that President Joe Biden will not run for re-election on the eve of turning 82. Ryan, who will be 51 in 2024, challenged House Speaker Nancy Pelosi’s leadership of the Democratic Party in 2016, and sought the party’s presidential nomination in 2020.

A demonstrated command of the battleground states of the Old Northwest – Ohio, Pennsylvania, Michigan and Wisconsin – would make Ryan a strong contender against Florida Gov. Ron DeSantis, the likely Republican nominee.

It was a turbulent week. Having proclaimed Russia’s annexation of four Ukrainian regions, Putin delivered what Robyn Dixon, Moscow bureau chief of The Washington Post, described as likely the most consequential speech of his 23 years in office. “But rather than a clarion call to restore Russian greatness as he clearly intended,” she wrote, “the address seemed the bluster and filibuster of a leader struggling to recover his grip — on his war, and his country.”

A startling miscalculation by Britain’s new Conservative Party government threatened to destabilize global financial markets. And the costs of gradual global warming continue to make themselves clear.

In the midst of all this, an old friend called my attention to two essays that seemed to take antithetical views of the prospects. The first, “How Europe Became So Rich,’’ by the distinguished economic historian Joel Mokyr, argues that political fragmentation served Europe well:

Many scholars now believe, however, that in the long run the benefits of competing states might have been larger than the costs. In particular, the existence of multiple competing states encouraged scientific and technological innovation…. [T]he ‘states system’ constrained the ability of political and religious authorities to control intellectual innovation. If conservative rulers clamped down on heretical and subversive (that is, original and creative) thought, their smartest citizens would just go elsewhere (as many of them, indeed, did).

Mokyr concluded, “Far from all the low-hanging technological fruits having been picked, the best is still to come.”

The second, “Seven Years of Trump Has the GOP Taking the Long View,’’ by long-time newspaper columnist Thomas Edsall, cites the success of Viktor Orban in governing Hungary, and then examines various signs of the vulnerability of the liberal state in America. These include the durability of the Trump base, and an incipient “National Conservatism” project, created in 2019 by the Edmund Burke Foundation, since joined by an array of scholars and writers associated with such institutions, magazines and think tanks as the Claremont Institute, Hillsdale College, the Hoover Institution, the Federalist, the journal First Things, the Manhattan Institute, the Ethics and Public Policy Center and National Review.

What characterizes national conservatism? Commitments to the infusion of religion and traditional family values into the government sphere, Edsall says, and, perhaps especially, opposition to “woke progressivism.” National Conservatism Conference chairman Christopher DeMuth puts it this way: progressives promote instability and seek “to turn the world upside down”:

[M]ayhem and misery at an open national border. Riot and murder in lawless city neighborhoods. Political indoctrination of schoolchildren. Government by executive ukase. Shortages throughout the world’s richest economy. Suppression of religion and private association. Regulation of everyday language — complete with contrived redefinitions of familiar words and ritual recantations for offenders.

All I could do in reply was to take comfort in evidence that borders have been open to chaotic traffic for a very long time, across barriers more intimidating than the Rio Grande, resulting in successful assimilation. Science magazine (subscription required) reported last month that new archaeological evidence tends to support the traditional view of “How the Anglo-Saxons Settled England.”

An 8th Century (C.E.) history written by a monk named Bede asserted that Rome’s decline in about 400 C.E. opened the way to an invasion from the east. Tribes from what is today northwestern Germany and southern Denmark “came over into the island, and [the Angles, Saxons, and Jutes] began to increase so much, that they became terrible to the natives.”

For a time, archaeologists doubted Bede’s account, preferring to think that relatively small bands of warrior elites could have successfully imposed their culture on the existing population. “Roman Britain looks very different from the Anglo-Saxon period 200 years later,” one archaeologist acknowledged to Science. But DNA samples from the graves of 494 people who died in England between 400 and 900 show they derived more than three-quarters of their ancestry from northern Europe.

The results address a long-standing debate about whether past cultural change signals new people moving in or a largely unchanged population adopting new technologies or beliefs. With the Anglo-Saxons, the data point strongly to migration, says University of Cambridge archaeologist Catherine Hills, who was not part of the research. The new data suggest “significant movement into the British Isles … taking us back to a fairly traditional picture of what’s going on.”

That doesn’t mean that the migration was especially turbulent, as when the Vikings began raiding a few centuries later, leaving relatively few genetic traces behind. Language changed relatively quickly after the Anglo-Saxons began arriving, a sign that people were talking, not fighting. Integration and intermarriage persisted for centuries. Indeed, archaeologists discovered that “one high status woman in her 20s with mixed ancestry was laid to rest near modern Cambridge under a prominent mound with silvered jewelry, amber beads, and a whole cow.” That suggests more complexity than simple conquest, one archaeologist told Science.

Same as it ever was, in other words – all but the cow.

David Warsh, a veteran columnist and an economic historian, is proprietor of Somerville-based economicprincipals.com, where this column originated.


David Warsh: Looking back at a big U.S. mistake and a great success

A protester on Wall Street, in the wake of the AIG bonus-payments controversy, is interviewed by news media during the 2008-2009 financial crisis.

SOMERVILLE, Mass.

If you strip away all that we have learned since, in order to look back at the election of 2016, the remarkable thing is that Donald Trump didn’t defeat Hillary Clinton by a wide margin then (he lost the popular vote by 3 million votes) and win re-election in a landslide. He was against America’s war in Iraq; soft on Russia; tough on China; phobic on immigration, especially from Mexico. He deplored blue-collar crime, talked up investment in infrastructure, and sought tax cuts for corporations and the well-to-do. In a nation eager for change, Trump was a television populist running against a Yale Law School elitist who possessed an indelible record in government going back twenty-five years. Only climate change was missing from his platform.

The election was close mainly because a large number of swing voters, just short of a preponderance, understood that Trump was a shady character who played well on fears. We know now that he is rotten to the core, at least most of us do, Republicans as well as Democrats. Yet the issues that Trump brought to the 2016 election remain at the center of the 2024 campaign. President Biden adopted as many of Trump’s positions as he dared, and modified others as much as he could; Trump’s all-but-forgotten core political agenda dominates debate today.

But never mind Trump, and his cunning political instincts. A better way to think about America’s future is to reflect on how the three decades unwound since the end of the Cold War, and ask, in the simplest possible terms, how events could have been otherwise. Arguments range the length of the policy spectrum – abortion, inflation, inequality, health care, mass incarceration, gun control, civil violence – the list is long. My aim here is to identify two overarching American policies with global reach that brought us to the present day. One has been a spectacular failure; the other, a brilliant, if inconspicuous, success.

The failure has been NATO enlargement, which Russian President Vladimir Putin asserts the U.S. disavowed in 1990 in return for the acquiescence of the former Soviet Union to the reunification of Germany within NATO. Expansion was barely noticed when President Bill Clinton’s administration sought, in 1994, to admit Poland, the Czech Republic and Hungary to the alliance. After Clinton added Estonia, Latvia, Lithuania, Bulgaria, Romania, Slovakia and Slovenia to the list of prospective members in 1997, Russia began to take umbrage – all the more so after Vladimir Putin took over in 2000.

George W. Bush paid little attention to Putin’s objections, and in 2008 put Ukraine and Georgia on the path to membership. Putin expressed strong opposition, starting and winning a small war in Georgia. After Hillary Clinton and John Kerry kept up the NATO pressure as secretaries of state under President Obama, Putin annexed the Crimean Peninsula and began battling for portions of eastern Ukraine.

By the time that Trump began pandering to Putin, it was too late to change course. Secretary of State Antony Blinken last November signed a “strategic partnership charter” with Ukraine; the next day, Putin commenced planning an inept invasion of his sovereign neighbor. Finland and Sweden have signed on to the alliance in response, but six months into a proxy war with Russia, NATO, if not Ukrainian President Volodymyr Zelenskyy, has lost sympathy in much of the rest of the world. Meanwhile, China’s plans to absorb Taiwan lie just over the horizon. Joe Biden has said the U.S. intends to prevent it. America’s deeply unpalatable alternative to war there is the same as it was in Ukraine: let the weak negotiate with the strong, and do what it can to prepare itself for Cold War II.

And the most successful U.S. policy since 1990?  That would be the management of the global monetary system by the Federal Reserve System, in concert with the central banks and treasuries of other leading nations, including China and Russia.   I am not thinking of the “Great Moderation” of the Nineties and the run-up in the Oughts to the crisis that began in 2007. I doubt that sequence is as yet very well understood.  But there should be no doubt about what happened in 2008. Under its primary mandate to serve as “lender of last resort” in a banking panic, the Fed led a desperate effort to fund and implement the response it had quietly organized in the course of the year before to halt the stampede that began after Lehman Brothers failed – a firebreak, not a “bailout.”

Had the panic continued, a second Great Depression  might have taken hold, perhaps even more stubborn and costly to resolve than the first. There should be no doubt about what happened in 2008, yet to this point there is mainly confusion. The best book may still be Last Resort: The Financial Crisis and the Future of Bailouts (Chicago, 2018), by Eric Posner, but it is not good enough. Understanding in some detail the Fed’s response to the panic is important for what it says about the prospects for responding effectively to climate change. (In the same way, much can be learned from the rapid success of the multinational campaign to develop a COVID vaccine.)

“NATO enlargement” and “The Panic of 2008” are on their way to becoming tropes – catch phrases, figures of speech that evoke complicated historical developments – much as “The Titanic,” “The Guns of August,” “Pearl Harbor,” “Dunkirk” and “D-Day” do today. The process of distillation takes time and much exploration by countless interpreters. This is a weekly column; I try to keep it under a thousand words, eight paragraphs or so. But I return to these topics, and a few others, again and again, adding new material and elaborating.

The mistakes involved in “NATO enlargement” have yet to be acknowledged; central bankers’ resourceful response to “The Panic of 2008” is still poorly understood.  To learn more, I can only suggest for now that you keep reading.

David Warsh, a veteran columnist and an economic historian, is proprietor of Somerville-based economicprincipals.com, where this column originated.

David Warsh: Competing with expansionist China while managing internal threats to our democracy

SOMERVILLE, Mass.

I have, at least since 1989, been a believer that competition between the West and China is likely to dominate global history for the foreseeable future.  By that I mean at least the next hundred years or so.

I am a reluctant convert to the view that the contest has arrived at a new and more dangerous phase. The increasing belligerence of Chinese foreign policy in the last few years has overcome my doubts.

It was a quarter century ago that I read World Economic Primacy: 1500-1990, by Charles P. Kindleberger. I held no economic historian in higher regard than CPK, but I raised an eyebrow at his penultimate chapter, “Japan in the Queue?” His last chapter, “The National Life Cycle,” made more sense to me, but even then I wasn’t convinced that he had got the units of account or the time-scales right.

The Damascene moment in my case came last week after I subscribed to Foreign Affairs, an influential six-times-a-year journal of opinion published by the Council on Foreign Relations. Out of the corner of my eye, I had been following a behind-the-scenes controversy, engendered by an article in the magazine about what successive Republican and Democratic administrations thought they were doing as they engaged with China, starting with the surprise “opening” engineered by Richard Nixon and Henry Kissinger in 1971. I subscribed to Foreign Affairs to see what I had been missing.

In 2018, in “The China Reckoning,’’ the piece that started the row, foreign-policy specialists Kurt Campbell and Ely Ratner had asserted that, for nearly half a century, Washington had “put too much faith in its power to shape China’s trajectory.” The stance had previously been identified mainly with then-President Donald Trump. Both Campbell and Ratner wound up in senior positions in the Biden administration, at the White House and the Pentagon.

In fact the proximate cause of my subscription was the most recent installment in this fracas. To read “The Inevitable Rivalry,’’ by John Mearsheimer, of the University of Chicago, an article in the November/December issue of the magazine, I had to pay the entry rate. His essay turned out to be a dud.

Had U.S. policymakers during the unipolar moment thought in terms of balance-of-power politics, they would have tried to slow Chinese growth and maximize the power gap between Beijing and Washington. But once China grew wealthy, a U.S.-Chinese cold war was inevitable. Engagement may have been the worst strategic blunder any country has made in recent history: there is no comparable example of a great power actively fostering the rise of a peer competitor. And it is now too late to do much about it.

Mearsheimer’s article completely failed to persuade me. Devotion to the religion he calls “realism” leads him to ignore two hundred years of Chinese history and the great foreign-policy lesson of the 20th Century: the disastrous realism of the 1919 Versailles Treaty, which ended World War I and led to World War II, vs. the pragmatism of the Marshall Plan of 1947, which helped prevent World War III. There is no room for moral conduct in his version of realism. It is hardball all the way.

My new subscription led me to the archives, and soon to “Short of War,’’ by Kevin Rudd, which convinced me that China’s designs on Taiwan were likely to escalate, given President Xi Jinping’s intention to remain in power indefinitely. (Term limits were abolished on his behalf in 2018.)  By 2035 he will be 82, the age at which Mao Zedong died.  Mao had once mused that repossession of the breakaway island nation of Taiwan might take as long as a hundred years.

Beijing now intends to complete its military modernization program by 2027 (seven years ahead of the previous schedule), with the main goal of giving China a decisive edge in all conceivable scenarios for a conflict with the United States over Taiwan. A victory in such a conflict would allow President Xi to carry out a forced reunification with Taiwan before leaving power—an achievement that would put him on the same level within the CCP pantheon as Mao Zedong.

That led me in turn to “The World China Wants,’’ by Rana Mitter, a professor of Chinese politics and history at Oxford University. He notes that, at least since the global financial crisis of 2008, China’s leaders have increasingly presented their authoritarian style of governance as an end in and of itself, not a steppingstone to a liberal democratic system. That could change in time, he says.

To legitimize its approach, China often turns to history, invoking its premodern past, for example, or reinterpreting the events of World War II. China’s increasingly authoritarian direction under Xi offers only one possible future for the country. To understand where China could be headed, observers must pay attention to the major elements of Chinese power and the frameworks through which that power is both expressed and imagined.

The ultimate prize of my Foreign Affairs reading day was “The New Cold War,’’ a long and intricately reasoned article in the latest issue by Hal Brands, of Johns Hopkins University, and John Lewis Gaddis, of Yale University, about the lessons they had drawn from a hundred and fifty years of competition among great powers. I especially agreed with their conclusion:

As [George] Kennan pointed out in the most quoted article ever published in these pages, “Exhibitions of indecision, disunity and internal disintegration within this country” can “have an exhilarating effect” on external enemies. To defend its external interests, then, “the United States need only measure up to its own best traditions and prove itself worthy of preservation as a great nation.”

Easily said, not easily done, and therein lies the ultimate test for the United States in its contest with China: the patient management of internal threats to our democracy, as well as tolerance of the moral and geopolitical contradictions through which global diversity can most feasibly be defended. The study of history is the best compass we have in navigating this future—even if it turns out to be not what we’d expected and not in most respects what we’ve experienced before.

That sounded right to me. Worries exist in a hierarchy: leadership of the Federal Reserve Board; the U.S. presidential election in 2024; the stability of the international monetary system; arms races of various sorts; climate change. Subordinating all these to the China problem will take time.

David Warsh, a veteran columnist and an economic historian, is proprietor of Somerville-based economicprincipals.com, where this essay originated.

David Warsh: Pinning things down using history

SOMERVILLE, Mass.

In Natural Experiments of History, a collection of essays published a decade ago, editors Jared Diamond and James Robinson wrote, “The controlled and replicated  laboratory experiment, in which the experimenter directly manipulates variables,  is often considered the hallmark of the scientific method” – virtually the only approach employed in physics, chemistry, molecular biology.

Yet in fields considered scientific that are concerned with the past – evolutionary biology, paleontology, historical geology, epidemiology, astrophysics – manipulative experiments are not possible. Other paths to knowledge are therefore required, they explained, methods of “observing, describing, and explaining the real world, and of setting the individual explanations within a larger framework “– of “doing science,” in other words.

Studying “natural experiments” is one useful alternative, they continued – finding systems that are similar in many ways but which differ significantly with respect to factors whose influence can be compared quantitatively, aided by statistical analysis.

Thus this year’s Nobel Prize in Economic Sciences recognizes Joshua Angrist, 61, of the Massachusetts Institute of Technology; David Card, 64, of the University of California, Berkeley; and Guido Imbens, 58, of Stanford University, “for having shown that natural experiments can answer central questions for society.”

Angrist burst on the scene in 1990, when “Lifetime Earnings and the Vietnam Era Draft Lottery: Evidence from Social Security Administrative Records” appeared in the American Economic Review. The luck of the draw had, for a time, determined who would be drafted during America’s Vietnam War, but in the early 1980s, long after their wartime service had ended, the earnings of white veterans were about 15 percent less than the earnings of comparable nonveterans, Angrist showed.
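
For readers who want the mechanics, the design boils down to what econometricians call the Wald estimator: because lottery numbers were random, the earnings gap between draft-eligible and draft-ineligible men, divided by the gap in their rates of military service, estimates the effect of service itself. Here is a minimal sketch in Python; the numbers are invented for illustration and are not Angrist’s results.

```python
# Wald / instrumental-variables estimate of the effect of military service
# on earnings, using draft-eligibility (the lottery) as the instrument.
# All figures below are hypothetical, for illustration only --
# they are NOT Angrist's (1990) estimates.

mean_earnings_eligible = 11_500    # avg earnings, draft-eligible men ($)
mean_earnings_ineligible = 12_000  # avg earnings, draft-ineligible men ($)
share_served_eligible = 0.35       # fraction of eligible men who served
share_served_ineligible = 0.19     # fraction of ineligible men who served

# Reduced form: the earnings gap induced by the lottery itself.
reduced_form = mean_earnings_eligible - mean_earnings_ineligible

# First stage: how much the lottery shifted the probability of serving.
first_stage = share_served_eligible - share_served_ineligible

# Wald estimate: the effect of service for men whose service was
# determined by the lottery (the "compliers").
wald_estimate = reduced_form / first_stage
print(f"Estimated earnings effect of service: ${wald_estimate:,.0f}")
# -> about -$3,125 with these invented inputs
```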

About the same time, Card had a similar idea, studying the impact on the Miami labor market of the massive Mariel boatlift out of Cuba, but his paper appeared in the less prestigious Industrial and Labor Relations Review. Card then partnered with his colleague Alan Krueger to search for more natural experiments in labor markets. Their most important contribution, a careful study of differential responses in nearby eastern Pennsylvania to a minimum-wage increase in New Jersey, appeared as Myth and Measurement: The New Economics of the Minimum Wage (Princeton, 1994). Angrist and Imbens, meanwhile, mainly explored methodological questions.
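
The Card-Krueger design, in turn, is a difference-in-differences comparison: the employment change at New Jersey restaurants after the wage increase, minus the change at Pennsylvania restaurants across the river, nets out shocks common to both states. A hedged sketch, again with invented numbers (see Myth and Measurement for the actual survey data):

```python
# Difference-in-differences, the logic of the Card-Krueger design.
# Employment figures are invented for illustration; they are NOT the
# numbers reported in Myth and Measurement.

nj_before, nj_after = 20.0, 21.0   # avg full-time-equivalent staff, NJ stores
pa_before, pa_after = 23.0, 21.5   # avg FTE staff, PA stores (no wage change)

nj_change = nj_after - nj_before   # minimum-wage effect + common shocks
pa_change = pa_after - pa_before   # common shocks only

# The difference of the differences isolates the policy effect,
# assuming the two states would otherwise have moved in parallel.
did_estimate = nj_change - pa_change
print(f"Difference-in-differences estimate: {did_estimate:+.1f} FTE per store")
# -> +2.5 with these invented inputs
```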

Given the rule that no more than three persons can share a given Nobel Prize, and the lesser likelihood that separate prizes might be given in two different years, Krueger’s tragic suicide, in 2019, made it possible to cite, in a single award, Card for empirical work, and Angrist and Imbens for methodological contributions.

Princeton economist Orley Ashenfelter, who, with his mentor Richard Quandt, also of Princeton, more or less started it all, told National Public Radio’s Planet Money: “It’s a nice thing because the Nobel committee has been fixated on economic theory for so long, and now this is the second prize awarded for how economic analysis is now primarily done. Most economic analysis nowadays is applied and empirical.” [Work on randomized controlled trials was recognized in 2019.]

In 2010 Angrist and Jörn-Steffen Pischke described the movement as “the credibility revolution.” And in The Age of the Applied Economist: The Transformation of Economics since the 1970s (Duke, 2017), Matthew Panhans and John Singleton wrote that “[T]he missionary’s Bible today is less Mas-Colell et al. and more Mostly Harmless Econometrics: An Empiricist’s Companion” (Angrist and Pischke, Princeton, 2009).

Maybe so.  Still, many of those “larger frameworks” must lie somewhere ahead.

“History,’’ by Frederick Dielman (1896)


That Dale Jorgenson, of Harvard University, would be recognized with a Nobel Prize was an all-but-foregone conclusion as recently as twenty years ago. Harvard University had hired him away from the University of California at Berkeley in 1969, along with Zvi Griliches, from the University of Chicago, and Kenneth Arrow, from Stanford University (the year before). Arrow had received the Clark Medal in 1957, Griliches in 1965; Jorgenson was named in 1971. “[H]e is preeminently a master of the territory between economics and statistics, where both have to be applied in the study of concrete problems,” said the citation. With John Hicks, Arrow received the Nobel Prize the next year.

For the next thirty years, all three men brought imagination to bear on one problem after another. Griliches was named a Distinguished Fellow of the American Economic Association in 1994; he died in 1999. Jorgenson, named a Distinguished Fellow in 2001, began an ambitious new project  in 2010 to continuously update measures of output and inputs of capital, labor, energy, materials and services for individual industries. Arrow returned to Stanford in 1979 and died in 2017.

Call Jorgenson’s contributions to growth accounting “normal science” if you like – mopping up, making sure, improving the measures introduced by Simon Kuznets, Richard Stone, and Angus Deaton. It didn’t seem so at the time. The moving finger writes, and having writ, moves on.

                                                                xxx

Where are the women in economics? asked Tim Harford, economics columnist of the Financial Times, the other day. They are everywhere, still small in numbers, especially at the senior level, but their participation is steadily growing. AEA presidents include Alice Rivlin (1986), Anne Krueger (1996), Claudia Goldin (2013), Janet Yellen (2020), Christina Romer (2022), and Susan Athey, president-elect (2023). Clark Medals have been awarded to Athey (2007), Esther Duflo (2010), Amy Finkelstein (2012), Emi Nakamura (2019), and Melissa Dell (2020).

Not to mention that Yellen, having chaired the Federal Reserve Board for four years, today is secretary of the Treasury; that Fed governor Lael Brainard is widely considered an eventual chair; that Cecilia Elena Rouse chairs the Council of Economic Advisers; that Christine Lagarde is president of the European Central Bank; and that Kristalina Georgieva is managing director of the International Monetary Fund, for a while longer, at least.

The latest woman to enter these upper ranks is Eva Mörk, a professor of economics at Uppsala University, apparently the first woman to join the committee of the Royal Swedish Academy of Sciences that recommends the Economic Sciences Prize, the last barrier to fall in an otherwise egalitarian institution. She stepped out from behind the table in Stockholm last week to deliver a strong TED-style talk (at minutes 5:30-18:30 in the recording) about the whys and wherefores of the award, and gave an interesting interview afterwards.

David Warsh, a veteran columnist and an economic historian, is proprietor of Somerville-based economicprincipals.com, where this column originated.




David Warsh: Goldin's marriage manual for the next generation


SOMERVILLE, Mass.

For many people, the COVID-19 pandemic has been an eighteen-month interruption. Survive it, and get back to work. For those born after 1979, it may prove to have been a new beginning. Women and men born in the 21st Century may have found themselves beginning their lives together in the midst of yet another historic turning point.

That’s the argument that Claudia Goldin advances in Career and Family: Women’s Century-Long Journey toward Equity (Princeton, 2021). As a reader who has been engaged as a practitioner in both career and family for many years, I aver that this is no ordinary book. What does greedy work have to do with it? And why is the work “greedy,” instead of “demanding” or “important”? Good questions, but that is getting ahead of the story.

Goldin, a distinguished historian of the role of women in the American economy, begins her account in 1963, when Betty Friedan wrote a book about college-educated women who were frustrated as stay-at-home moms.  Their problem, Friedan wrote, “has no name.” The Feminine Mystique caught the beginnings of a second wave of feminism that continues with puissant force today.  Meanwhile, Goldin continues, a new “problem with no name” has arisen:

 Now, more than ever, couples of all stripes are struggling to balance employment and family, their work lives and home lives.  As a nation, we are collectively waking up to the importance of caregiving, to its value, for the present and future generations. We are starting to fully realize its cost in terms of lost income,  flattened careers, and trade-offs between couples (heterosexual and same sex), as well as the particularly strenuous demands on single mothers and fathers.  These realizations predated the pandemic but have been brought into sharp focus by it.

A University of Chicago-trained economist; the first woman tenured by Harvard’s economics department; author of five important books, including, with her partner, Harvard labor economist Lawrence Katz, The Race between Education and Technology (Harvard Belknap, 2010); recipient of an impressive garland of honors, among them the Nemmers Prize in economics; a former president of the American Economic Association: Goldin has written a chatty, readable sequel to Friedan, destined itself to become a paperback best-seller – all the more persuasive because it is rooted in the work of hundreds of other labor economists and economic historians over the years. Granted, Goldin is expert in the history of gender only in the United States; other nations will compile stories of their own.

To begin with, Goldin distinguishes among the experiences of five roughly-defined generations of college-educated American women since the beginning of the twentieth century.  Each cohort merits a chapter. The experiences of gay women were especially hard to pin down over the years, given changing norms.

In “Passing the Baton,” Goldin characterizes the first group, women born between 1878 and 1897, as having had to choose between raising families and pursuing careers. Even the briefest biographies of the lives culled from Notable American Women make interesting reading: Jeannette Rankin, Helen Keller, Margaret Sanger, Katharine McCormick, Pearl Buck, Katharine White, Sadie Alexander, Frances Perkins. But most of that first generation of college women never became more prominent than as presidents of the League of Women Voters or the Garden Club. They were mothers and grandmothers the rest of their lives.

In “A Fork in the Road,” her account of the generation born between 1898 and 1923, Goldin dwells on 75-year-old Margaret Reid, whom she frequently passed at the library as a graduate student at Chicago, where Reid had earned a Ph.D. in economics in 1934. (They never spoke; Goldin, a student of Robert Fogel, was working on slavery then.) Otherwise, this second generation was dominated by a pattern of jobs, then family. The notables of this generation tend to be actresses – Katharine Hepburn, Bette Davis, Rosalind Russell, Barbara Stanwyck – sometimes playing roles modeled on real-world careers, as when Hepburn played a world-roving journalist resembling Dorothy Thompson in Woman of the Year.

In “The Bridge Group,” Goldin discusses the generation born between 1924 and 1943, who raised families first and then found jobs – or didn’t find jobs. She begins by describing what it was like to read Mary McCarthy’s novel The Group (in a paper-bag cover) as a 17-year-old commuting from home in East Queens to a summer job in Greenwich Village. It was a glimpse of her parents’ lives – the dark cloud of the Great Depression that hung over the US in the Thirties, the hiring bars and marriage bars that turned college-educated women out of the work force at the first hint of a second income.

“The Crossroads with Betty Friedan” is about the Fifties and the television shows, such as I Love Lucy, The Honeymooners, Leave It to Beaver and Father Knows Best, that, amid other provocations, led Betty Friedan to famously ask, “Is that all there is?” Between the college graduation class of 1957 and the class of 1961, Goldin finds, in an enormous survey by the Women’s Bureau of the U.S. Labor Department, an inflection point. The winds shift, the mood changes. Women in small numbers begin to return to careers after their children are grown: Jeane Kirkpatrick, Erma Bombeck, Phyllis Schlafly, Janet Napolitano and Goldin’s own mother, who became a successful elementary school principal. Friedan had been right, looking backwards, Goldin concludes, but wrong about what was about to happen.

In “The Quiet Revolution,” members of the generation born between 1944 and 1957 set out to pursue careers and then, perhaps, form families. The going is hard, but they keep at it. The scene is set with a gag from the Mary Tyler Moore Show in 1972. Mary is leaving her childhood home with her father, on her way to her job as a television news reporter. Her mother calls out, “Remember to take your pill, dear.” Father and daughter both reply, “I will.” Father scowls an embarrassed double-take. The show’s theme song concludes, “You’re going to make it after all.” The far-reaching consequences of the advent of dependable birth control for women’s new choices are thoroughly explored. This is, after all, Goldin’s own generation.

“Assisting the Revolution,” about the generation born between 1958 and 1978, is introduced by a recitation of the various roles played by Saturday Night Live star Tina Fey – comedian, actor, writer. Group Five had an easier time of it. They were admitted in increasing numbers to professional and graduate schools. They achieved parity with men in colleges and surpassed them in numbers. They threw themselves into careers. “But they had learned from their Group Four older sisters that the path to career must leave room for family, as deferral could lead to no children,” Goldin writes. So they married more carefully and earlier, chose softer career paths, or froze their eggs. Life had become more complicated.

In her final chapters – “Mind the Gap,” “The Lawyer and the Pharmacist” and “On Call” – Goldin tackles the knotty problem. The gender earnings gap has persisted over fifty years, despite the enormous changes that have taken place. She explores the many different possible explanations before concluding that the difference stems from the need in two-career families for flexibility – and the decision, most often by women, to be on call, ready to leave the office for home. Children get sick, pipes break, schools close for vacation, the baby-sitter leaves town.

The good news is that the terms of relationships are negotiable, not only between equity-seeking partners but with their employers as well. The offer of parental leave for fathers is only the most obvious example. Professional firms in many industries are addicted to the charrette – a furious round of last-minute collaborative work or competition to meet a deadline. Such customs can be given a name and reduced. Firms need to make a profit, it is true, but the name of the beast, the eighty-hour week, is “greedy work.”

It is up to the members of the sixth group, their spouses and employers, to further work out the terms of this deal. The most intimate discussions on the way ahead will occur within and among families. Then come board rooms, labor negotiations, mass media, social media, and politics. Even in its hardcover edition, Career and Family is a bargain. I am going home to start to assemble another photograph album – grandparents, parents, sibs, girlfriends, wife, children, and grandchildren – this one to be an annotated family album.

David Warsh, a veteran columnist and an economic historian, is proprietor of Somerville-based economicprincipals.com, where this essay first ran.

A "Wife Wanted" ad in an 1801 newspaper "N.B." means "note well".

A "Wife Wanted" ad in an 1801 newspaper
"
N.B." means "note well".


David Warsh: Prepare for multigenerational contest between China and the West

China’s national emblem

SOMERVILLE, Mass.

China is building missile silos in the Gobi Desert. The U.S. has agreed to provide nuclear-submarine technology to Australia, enraging the French, who are building a dozen diesel subs that they had expected to sell to the Aussies. Xi Jinping last week rejected Joe Biden’s suggestion that the two arrange a face-to-face meeting to discuss their differences.  Clearly, the U.S. “pivot” to the Pacific is well underway.  Taiwan is the new hotspot, not to mention the Philippines and Japan.

The competition between China and the West is a contest, not a cold war.  Financial Times columnist Philip Stephens was the first in the circle of those whom I read to make this point.  “The Soviet Union presented at once a systemic and an existential threat to the West,” he wrote. “China undoubtedly wants to establish itself as the world’s pre-eminent power, but it is not trying to convert democracies to communism….”  The U.S. is not trying to “contain” China so much as to constrain its actions.  He continued,

Beijing and Moscow want a return to a nineteenth century global order where great powers rule over their own distinct spheres of influence.  If the habits and the institutions created since 1945 mean anything, it has been the replacement of that arrangement with the international rule of law.

I’m not quite sure what Stephens means by “the international rule of law.”  The constantly changing Western traditions of freedom of action and thought? Is it true, as George Kennan told Congress in 1972, that the Chinese language contains no word for freedom? Is it possible that Chinese painters produced no nudes before the 20th Century?

The co-evolution of cultures between China and the West has been underway for 4,000 years, proceeding at a lethargic pace for most of that time. While the process has recently assumed a breakneck pace, it can be expected to continue for many, many generations before the first hints of consensus develop about a direction of change.

A hundred years? Three hundred? Who knows? Already there is conflict. There may eventually be blood, at least in some corners of the Earth. But the world has changed so much since 1945 that “cold war” is no longer a useful appellation. The existential threat today is climate change.

China’s cultural heritage is not going to fade away, as did Marxism-Leninism. The script of that drama, written in Europe in the 19th Century, has lost much of its punch. Vladimir Putin has embraced the Russian Orthodox Church as a source of moral authority. Xi Jinping has evoked the egalitarian idealism of Mao Zedong in cracking down on China’s high-tech groups and rock stars, and strictly limiting the time its children are allowed to play video games.

But what is the Western tradition of “rule of law” that presumes to become truly international, eventually? Expect an answer some other day. Meanwhile, I’m cooking pancakes for my Somerville grandchildren.

David Warsh, a veteran columnist and an economic historian, is proprietor of Somerville-based economicprincipals.com, where this essay first appeared.




David Warsh: America’s fracturing ‘politics of purity’ since the ‘70s

Ralph Nader in 1975, in his heyday

SOMERVILLE, Mass.

Like a lot of people, I am interested in what has been happening in the world, the U.S. in particular, since the end of World War II. I am especially intrigued by goings-on in university economics, but I take a broad view of the subject. I grew up in the Fifties, and the single most persuasive account I’ve found of the underlying nature of changing times since 1945 has been a series of five books by historian Daniel Rodgers, of Princeton University. In Age of Fracture (Belknap, Harvard, 2011), Rodgers described very well my experience of the increasingly thin life of things.

Across the multiple fronts of ideational battle, from the speeches of presidents to books of social and cultural theory, conceptions of human nature that in the post-World War II era had been thick with context, social circumstance, institutions, and history gave way to conceptions of human nature that stressed choice, agency, performance and desire. Strong metaphors of society were supplanted by weaker ones. Imagined collectivities shrank; notions of structure and power thinned out. Viewed by its acts of mind, the last quarter of the century was an era of disaggregation, a great age of fracture.

But I’m always interested in a new narrative. One such is Public Citizens: The Attack on Big Government and the Remaking of American Liberalism (Norton, 2021), by historian Paul Sabin, of Yale University. Sabin employs the career of Ralph Nader, the arc of which extends from Harvard Law School and auto-safety crusading in the Sixties to his Green Party candidacy in the U.S. presidential election of 2000, as a metaphor for a variety of other liberal activists who mounted assaults of their own on centers of government power in the second half of the 20th Century.

The harmonious post-war partnership of business, labor and government proclaimed in the Fifties by economist John Kenneth Galbraith and New Dealer James Landis, symbolized by the success of the Tennessee Valley Authority’s government-sponsored electrification of the rural South, was not built to last.  But how did government go from being the solution to America’s problems to being the cause of them? It was more complicated than Milton Friedman and Ronald Reagan, Sabin shows.

Jane Jacobs (The Death and Life of Great American Cities, 1961), Rachel Carson (Silent Spring, 1962) and Nader (Unsafe at Any Speed, 1965) were exemplars of a new breed of critics of industrial manipulation and capture of government functions, Sabin writes. Jacobs attacked large-scale city planning and urban renewal. Carson exposed widespread abuses by the commercial pesticide industry. Nader criticized automotive design. These were only the first and most visible cracks in the old alliance of industries, labor unions and federal administrative agencies. Public-interest law firms began springing up, loosely modeled on civil-rights organizations. The Natural Resources Defense Council, the Conservation Law Foundation, the Center for Law and Social Policy and many other start-ups soon found their way into federal courts. Nader tackled the leadership of the United Mine Workers union, leading then-UMW President Tony Boyle to order the murder of reform candidate Joseph “Jock” Yablonski, his wife, and daughter, on New Year’s Eve, 1969.

In Age of Fracture, Rodgers wrote that “The first break in the formula that joined freedom and obligation all but inseparably together began with Jimmy Carter.” Carter’s outside-Washington experience as a peanut farmer and Georgia governor, as well as his immersion in low-church Protestant evangelical culture, led him to shun presidential authority. “Government cannot solve our problems, it can’t set our goals, it cannot define our vision,” he said in 1978.

Sabin takes a similar view but offers a different reason for the rupture. Caught between the idealistic aspirations of outside critics inspired by Nader and the practical demands of governing by consensus, Carter struggled to maintain the traditional balance but failed to placate his critics. “Disillusionment came easily and quickly to Ralph Nader,” Sabin writes.  “I expect to be consulted, and I was told that I would be,” Nader complained almost immediately. Reform-minded critics attacked Carter from nearly every direction. A fierce primary challenge by Sen. Edward M. Kennedy (D.-Mass.) failed in 1980. The stage was set for Ronald Reagan.

Sabin recalls the battles of the 1970s with grim determination to show the folly of the politics of purity.  Nader made his first run for the presidency as the Green Party’s candidate in 1996, challenging Bill Clinton and Bob Dole. He was in his sixties; his efforts were half-hearted.  In his second campaign, in 2000, he campaigned vigorously enough to tip the election to George W. Bush. Even then it wasn’t Nader’s last hurrah. He ran again, in 2004, as candidate of the Reform Party; and a fourth time, as an independent, in 2008. At 87, he is today conspicuously absent from the scene.

The public-interest movement initiated by urbanist Jane Jacobs, scientist Rachel Carson and Ralph Nader was effective in its early stages, Sabin concludes. The nation’s air and water are cleaner; its highways and workplaces safer; its cities more open to possibility. But Sabin is surely right that all too often, go-for-broke activism served mainly to undermine confidence in the efficacy of administrative government action among significant segments of the public.

The critique of federal regulation was clearly not the whole story, any more than was The Great Persuasion, undertaken in 1948 by the Mont Pelerin Society, pitched unsuccessfully in 1964 by presidential candidate Barry Goldwater, and translated into slogans in 1980 by Milton and Rose Friedman. Nor was the thoroughly disappointing 20-year aftermath to 9/11, another day when the world seemed to many to “break apart,” as historian Dan Rodgers put it in an epilogue to Age of Fracture.

What might put it back together? Accelerating climate change, perhaps.  But that’s another story.

David Warsh, a veteran columnist and an economic historian, is proprietor of Somerville-based economicprincipals.com, where this column first appeared.

David Warsh: Time to read ‘Three Days at Camp David’

The main lodge at Camp David


SOMERVILLE, Mass.

The heart-wrenching pandemonium in Kabul coincided with the seeming tranquility of the annual central banking symposium at Jackson Hole, Wyoming. For the second year in a row, central bankers stayed home, amid fears of the resurgent coronavirus. Newspapers ran file photos of Fed officials strolling beside Jackson Lake.

Market participants are preoccupied with the timing of the taper, the Fed’s plan to reduce its current high level of asset purchases. That is not beside the point, but neither is it the most important decision facing the Biden administration with respect to the conduct of economic policy. Whom to nominate to head the Federal Reserve Board for the next four years? For a glimpse of the background to that question, a good place to start is a paper from a Bank of England workshop earlier in the summer.

Central Bank Digital Currency in Historical Perspective: Another Crossroads in Monetary History, by Michael Bordo, of Rutgers University and the Hoover Institution, brings to mind the timeless advice of Yogi Berra:  when you come to a crossroad, take it.

Bordo briefly surveys the history of money and banking. Gold, silver and copper coinage (and paper money in China) can be traced back over two millennia, he notes, but three key transformations can be identified in the five hundred years since Neapolitan banks began experimenting with paper money.

First, fiduciary money took hold in the 18th Century, paper notes issued by banks and ostensibly convertible into precious metal (specie) held in reserve by the banks. Fractional-reserve banking emerged, meaning that banks kept only as much specie in the till as they considered necessary to meet the ordinary ebb and flow of demands for redemption, leaving them vulnerable to periodic panics or “runs.”  Occasional experiments with fiat money, issued by governments to pay for wars but irredeemable for specie, generally proved spectacularly unsuccessful, Bordo says (American Continentals, French assignats).

Second, the checkered history of competing banks and their volatile currencies led, over the course of a century, to bank supervision and monopolies on national currencies, overseen by central banks and national treasuries.

Third, over the course of the 17th to the 20th centuries, central banks evolved to take on additional responsibilities:  marketing government debt; establishing payment systems; pursuing financial stability (and serving as lenders of last resort when necessary to halt panics); and maintaining a stable value of money. For a time, the gold standard made this last task relatively simple: to preserve the purchasing power of money, maintain a fixed price of gold. But as gold convertibility became ever harder to manage, nations retreated from their fiduciary monetary systems in fits and starts. In 1971, they abandoned them altogether in favor of fiat money. It took about 20 years to devise central banking techniques with which to seek to maintain stable purchasing power.

As it happens, the decision-making at the last fork in the road of the international currency and monetary system is laid out with great clarity and charm in a new book by Jeffrey Garten, Three Days at Camp David: How a Secret Meeting in 1971 Transformed the Global Economy (2021, Harper). Garten spent a decade in government before moving to Wall Street.  In 2006 he returned to strategic consulting in Washington, after about 20 years at Yale’s School of Management, ten of them as dean.

The special advantage of his book is how Garten brings to life the six major players at the Camp David meeting, Aug. 13-15, 1971 – Richard Nixon, John Connally, Paul Volcker, Arthur Burns, George Shultz, Peter Peterson – and two supporting actors, Paul McCracken and Henry Kissinger – and explores their stated aims and private motives.  The decision they took was momentous:  to unilaterally quit the Bretton Woods system, to go off the gold standard, once and for all. It was a transition the United States had to make, Garten argues, and in this sense bears a resemblance to Afghanistan and the present day:

A bridge from the first quarter-century following [World War II] – where the focus was on rebuilding national economies that had been destroyed and on re-establishing a functional world economic system – to a new environment where power and responsibility among the allies had to be readjusted, with the burden on the United States being more equitably shared and with the need for multilateral cooperation to replace Washington’s unilateral dictates.

What about Nixon’s re-election campaign in 1972?  Of course that had something to do with it; politics always has something to do with policy, Garten says. But one way or another, something had to be done to relieve pressure on the dollar. “The gold just wasn’t there” to continue, writes Garten.

The trouble is, as with all history, that was fifty years ago.  What’s going on now?

Read, if you dare, the second half of Michael Bordo’s paper, for a concise summary of the money and banking issues we face. Their unfamiliarity is forbidding; their intricacy is great.  The advantages of a digital system may be manifest. “Just as the history of multiple competing currencies led to central bank provision of currency,” Bordo writes, “the present day rise of cryptocurrencies and stable coins suggests the outcome may also be a process of consolidation towards a central bank digital currency.”

But the choices that central bankers and their constituencies must make are thorny.  Wholesale or retail? Tokens or distributed ledger accounts? Lodged in central banks or private institutions? Considerable work is underway, Bordo says, at the Bank of England, Sweden’s Riksbank, the Bank of Canada, the Bank for International Settlements, and the International Monetary Fund, but whatever research the Fed has undertaken, “not much of it has seen the light of day.”

Who best to help shepherd this new world into existence?  The choice for the U.S. seems to be between reappointing Fed Chairman Jerome Powell, 68, to a second term, beginning in February, and nominating Fed Governor Lael Brainard, 59, to replace him.  President Biden is reeling at the moment. I am no expert, but my hunch is that preferring Brainard to Powell is the better option overall, for both practical and political ends. After all, what infrastructure is more fundamental to a nation’s well-being than its place in the global system of money and banking?

David Warsh, a veteran columnist and an economic historian, is proprietor of Somerville-based economicprincipals.com, where this column first ran.

                                                     

David Warsh: Of The Globe, John Kerry, Vietnam and my column

An advertisement for The Boston Globe from 1896.


SOMERVILLE, Mass.

It has taken six months, but with this edition, Economic Principals finally makes good on its previously announced intention to move to Substack publishing.  What took so long? I can’t blame the pandemic. Better to say it’s complicated. (Substack is an online platform that provides publishing, payment, analytics and design infrastructure to support subscription newsletters.)

EP originated in 1983 as columns in the business section of The Boston Sunday Globe. It appeared there for 18 years, winning a Loeb award in the process. (I had won another Loeb a few years before, at Forbes.) The logic of EP was simple: It zeroed in on economics because Boston was the world capital of the discipline; it emphasized personalities because otherwise the subject was intrinsically dry (hence the punning name).  A Tuesday column was soon added, dwelling more on politics, because economics and politics were essentially inseparable in my view.

The New York Times Co. bought The Globe in 1993, for $1.13 billion, took control of it in 1999 after a standstill agreement expired, and, in July 2001, installed a new editor, Martin Baron.  On his second morning on the job, Baron instructed the business editor, Peter Mancusi, that EP was no longer permitted to write about politics. I didn’t understand, but tried to comply. I failed to meet expectations, and in January, Baron killed the column. It was clearly within his rights. Metro columnist Mike Barnicle had been cancelled, publisher Benjamin Taylor had been replaced, and editor Matthew Storin, privately maligned for having knuckled under too often to the Boston archdiocese of the Roman Catholic Church, retired. I was small potatoes, but there was something about The Globe’s culture that the NYT Co. didn’t like.  I quit the paper and six weeks later moved the column online.

After experimenting with various approaches for a couple of years, I settled on a business model that resembled public radio in the United States – a relative handful of civic-minded subscribers supporting a service otherwise available for free to anyone interested.  An annual $50 subscription brought an early (bulldog) edition of the weekly via email on Saturday night.  Late Sunday afternoon, the column went up on the Web, where it (and its archive) have been ever since, available to all comers for free.

Only slowly did it occur to me that perhaps I had been obtuse about those “no politics” instructions.  In October 1996, five years before they were given, I had raised caustic questions about the encounter for which then-U.S. Sen. John Kerry (D.-Mass.) had received a Silver Star in Vietnam 25 years before. Kerry was then running for re-election. I began to suspect that this history had something to do with Baron ordering me to steer clear of politics in 2001.

• • •

John Kerry had become well known in the early ‘70s as a decorated Navy war hero who had turned against the Vietnam War. I’d covered the war for two years, 1968-70, traveling widely, first as an enlisted correspondent for Pacific Stars and Stripes, then as a Saigon bureau stringer for Newsweek. I was critical of the premises the war was based on, but not as disparaging of its conduct as was Kerry. I first heard him talk in the autumn of 1970, a few months after he had unsuccessfully challenged the anti-war candidate Rev. Robert Drinan, then the dean of Boston College Law School, for the right to run against the hawkish Philip Philbin in the Democratic primary. Drinan won the nomination and the November election. He was re-elected four times.

As a Navy veteran, I was put off by what I took to be the vainglorious aspects of Kerry’s successive public statements and candidacies, especially in the spring of 1971, when, in testimony before the Senate Foreign Relations Committee, he repeated accusations he had made on Meet the Press that thousands of atrocities amounting to war crimes had been committed by U.S. forces in Vietnam. The next day he joined other members of the Vietnam Veterans Against the War in throwing medals (but not his own) over a fence at the Capitol.

In 1972, he tested the waters in three different congressional districts in Massachusetts before deciding to run in one, an election that he lost. He later gained electoral successes in the Bay State, winning the lieutenant governorship on the Michael Dukakis ticket in 1982, and a U.S. Senate seat in 1984, succeeding Paul Tsongas, who had resigned for health reasons. Kerry remained in the Senate until 2013, when he resigned to become secretary of state.  [Correction added]

Twenty-five years after his Senate testimony, as a columnist I more than once expressed enthusiasm for the possibility that a liberal Republican – venture capitalist Mitt Romney or Gov. Bill Weld – might defeat Kerry in the 1996 Senate election. (Weld had been a college classmate, though I had not known him.) This was hardly disinterested newspapering, but as a columnist, part of my job was to express opinions.

In the autumn of 1996, the recently re-elected Weld had challenged Kerry’s bid for a third term in the Senate. The campaign brought old memories to life. On Sunday, Oct. 6, The Globe published long side-by-side profiles of the candidates, extensively reported by Charles Sennott.

The Kerry story began with an elaborate account of his experiences in Vietnam – the candidate’s first attempt, I believe, since 1971 to tell the story of his war. After Kerry boasted of his service during a debate 10 days later, I became curious about the relatively short time he had spent in Vietnam – four months. I began to research a column. Kerry’s campaign staff put me in touch with Tom Belodeau, a bow gunner on the patrol boat that Kerry had beached after a rocket was fired at it to begin the encounter for which he was recognized with a Silver Star.

Our conversation lasted half an hour. At one point, Belodeau confided, “You know, I shot that guy.” That evening I noticed that the bow gunner played no part in Kerry’s account of the encounter in a New Yorker article by James Carroll in October 1996 – an account that seemed to contradict the medal citation itself. That led me to notice the citation’s unusual language: “[A]n enemy soldier sprang from his position not 10 feet [from the boat] and fled. Without hesitation, Lieutenant (Junior Grade) Kerry leaped ashore, pursued the man behind a hootch and killed him, capturing a B-40 rocket launcher with a round in the chamber.” There are now multiple accounts of what happened that day. Only one of them, the citation, is official, and even it seems to exist in several versions. What is striking is that with the reference to the hootch, the anonymous author uncharacteristically seems to take pains to imply that nobody saw what happened.

The first column (“The War Hero”) ran Tues., Oct. 24. Around that time, a fellow former Swift Boat commander, Edward (Tedd) Ladd, phoned The Globe’s Sennott to offer further details and was immediately passed on to me. Belodeau, a Massachusetts native who was living in Michigan, wanted to avoid further inquiries, I was told. I asked the campaign for an interview with Kerry. His staff promised one, but day after day, failed to deliver. Friday evening arrived and I was left with the draft of a column for Sunday, Oct. 27, about the citation’s unusual phrase (“Behind the Hootch”). It included a question that eventually came to be seen among friends as an inside joke aimed at other Vietnam vets (including a dear friend who sat five feet away in the newsroom): Had Kerry himself committed a war crime, at least under the terms of his own sweeping indictments of 1971, by dispatching a wounded man behind a structure where what happened couldn’t be seen?

The joke fell flat. War crime? A bad choice of words! The headline?  Even worse. Due to the lack of the campaign’s promised response, the column was woolly and wholly devoid of significant new information. It certainly wasn’t the serious accusation that Kerry indignantly denied. Well before the Sunday paper appeared, Kerry’s staff apparently knew what it would say. They organized a Sunday press conference at the Boston Navy Yard, which was attended by various former crew members and the admiral who had presented his medal. There the candidate vigorously defended his conduct and attacked my coverage, especially the implicit wisecrack the second column contained.  I didn’t learn about the rally until late that afternoon, when a Globe reporter called me for comment.

I was widely condemned. Fair enough: this was politics, after all, not beanbag. (Caught in the middle, Globe editor Storin played fair throughout with both the campaign and me.) The election, less than three weeks away, had been refocused. Kerry won by a wider margin than he might have otherwise. (Kerry’s own version of the events of that week can be found on pp. 223-225 of his autobiography.)

• • •

Without knowing it, I had become, in effect, a charter member of the Swift Boat Veterans for Truth. That was the name of a political organization that surfaced in May 2004 to criticize Kerry, in television advertisements, on the Web, and in a book, Unfit for Command.  What I had discovered in 1996 was little more than what everyone learned in 2004 – that some of his fellow sailors disliked Kerry intensely. In conversations with many Swift Boat vets over the year or two after the columns, I learned that many bones of contention existed. But the book about the recent history of economics I was finishing and the online edition of EP that kept me in business were far more important. I was no longer a card-carrying member of a major news organization, so after leaving The Globe I gave the slowly developing Swift Boat story a good leaving alone. I spent the first half of 2004 at the American Academy in Berlin.

Whatever his venial sins, Kerry redeemed himself thoroughly, it seems to me, by declining to contest the result of the 2004 election, after the vote went against him by a narrow margin of 118,601 votes in Ohio. He served as secretary of state for four years in the Obama administration and was named special presidential envoy for climate change, a Cabinet-level position, by President Biden.

Baron organized The Globe’s Pulitzer Prize-winning Spotlight coverage of Catholic Church secrecy about sexual abuse by priests, and it turned into a world story and a Hollywood film. In 2013 he became editor of The Washington Post and steered a steady course as Amazon founder Jeff Bezos acquired the paper from the Graham family and Donald Trump won the presidency and then lost it. Baron retired in February. He is writing a book about those years.

But in 2003, John F. Kerry: The Complete Biography by the Boston Globe Reporters Who Know Him Best was published by PublicAffairs Books, a well-respected publishing house whose founder, Peter Osnos, had himself been a Vietnam correspondent for The Washington Post. Baron, The Globe’s editor, wrote in a preface, “We determined… that The Boston Globe should be the point of reference for anyone seeking to know John Kerry. No one should discover material about him that we hadn’t identified and vetted first.”

All three authors – Michael Kranish, Brian Mooney, Nina Easton – were skilled newspaper reporters. Their propensity for careful work appears on (nearly) every page. Mooney and Kranish I thought I knew well.  But the latter, who was assigned to cover Kerry’s early years, his upbringing, and his combat in Vietnam, never spoke to me in the course of his reporting.  The 1996 campaign episode in which I was involved is described in three paragraphs on page 322. The New Yorker profile by James Carroll that prompted my second column isn’t mentioned anywhere in the book; and where the Silver Star citation is quoted (page 104), the phrase that attracted my attention, “behind the hootch,” is replaced by an ellipsis. (An after-action report containing the phrase is quoted on page 102.)

Nor did Baron and I ever speak of the matter. What might he have known about it? He had been appointed night editor of The Times in 1997, last-minute assessor of news not yet fit to print; I don’t know whether he was already serving in that capacity in October 1996, when my Globe columns became part of the Senate election story. I do know he commissioned the project that became the Globe biography in December 2001, a few weeks before terminating EP.

Kranish today is a national political investigative reporter for The Washington Post. Should I have asked him about his Globe reporting, which seems to me lacking in context? I think not. (I let him know this piece was coming; I hope that eventually we’ll talk privately.) But my subject here is how The Globe’s culture changed after the NYT Co. acquired the paper, so I believe his incuriosity and that of his editor are facts that speak for themselves.

Baron’s claims of authority in his preface to The Complete Biography by the Boston Globe Reporters Who Know Him Best strike me as having been deliberately dishonest, a calculated attempt to forestall further scrutiny of Kerry’s time in Vietnam. In this, the book failed. It is a far more careful and even-handed account than Tour of Duty: John Kerry and the Vietnam War (Morrow, 2004), historian Douglas Brinkley’s campaign biography. Mooney’s sections on Kerry’s years in Massachusetts politics are especially good. But as the sudden re-appearance of the Vietnam controversy in 2004 demonstrated, The Globe’s account left much on the table.

• • •

I mention these events now for two reasons.  The first is that the Substack publishing platform has created a path that did not exist before to an audience – in this case several audiences – concerned with issues about which I have considerable expertise. The first EP readers were drawn from those who had followed the column in The Globe.  Some have fallen away; others have joined. A reliable 300 or so annual Bulldog subscriptions have kept EP afloat.

Today, with a thousand online columns and two books behind me – Knowledge and the Wealth of Nations: A Story of Economic Discovery (Norton, 2006) and Because They Could: The Harvard Russia Scandal (and NATO Expansion) after Twenty-Five Years (CreateSpace, 2018) – and a third book on the way, my reputation as an economic journalist is better established.

The issues I discuss here today have to do with aspirations to disinterested reporting and open-mindedness in the newspapers I read, and, in some cases, the failure to achieve those lofty goals. I have felt deeply for 25 years about the particular matters described here; I was occasionally tempted to pipe up about them. Until now, the reward of regaining my former life as a newsman by re-entering the discussion never seemed worth the price I expected to pay.

But the success of Substack says to writers like me, “Put up or shut up.” After the challenge it posed dawned on me in December, I perked up, then hesitated for several months before deciding to leave my comfortable backwater for a lively and growing ecosystem. Newsletter publishing now has certain features in common with the market for national magazines that emerged in the U.S. in the second half of the 19th Century – a mezzanine tier of journalism in which authors compete for readers’ attention. In this case, subscribers participate directly in deciding what will become news.

The other reason has to do with arguments recently spelled out with clarity and subtlety by Jonathan Rauch in The Constitution of Knowledge: A Defense of Truth (Brookings, 2021). Rauch gets the Swift Boat controversy mostly wrong, mixing up his own understanding of it with its interpretation by Donald Trump, but he is absolutely correct about the responsibility of the truth disciplines – science, law, history and journalism – to carefully sort out even the most complicated claims and counter-claims that endlessly strike sparks in the digital media.

Without the places where professionals like experts and editors and peer reviewers organize conversations and compare propositions and assess competence and provide accountability – everywhere from scientific journals to Wikipedia pages – there is no marketplace of ideas; there are only cults warring and splintering and individuals running around making noise.

EP exists mainly to cover economics. This edition has been an uncharacteristically long (re)introduction. My interest in these long-ago matters is strongly felt, but it is a distinctly secondary concern. I expect to return to these topics occasionally, on the order of once a month, until whatever I have left to say has been said: a matter of ten or twelve columns, I imagine, such as I might have written for the Taylor family’s Globe.

As a Stripes correspondent, I knew something about the American war in Vietnam in the late Sixties. As an experienced newspaperman who had been sidelined, I was alert to issues that developed as Kerry mounted his presidential campaign. And as an economic journalist, I became interested in policy-making during the first decade of the 21st Century, especially decisions leading up to the global financial crisis of 2008 and its aftermath. Comments on the weekly bulldogs are disabled.  Threads on the Substack site associated with each new column are for bulldog subscribers only. As best I can tell, that page has not begun working yet. I will pay close attention and play comments there by ear.

David Warsh, a veteran columnist and an economic historian, is proprietor of Somerville-based economicprincipals.com, where this essay originated.

           


David Warsh: Whatever happened to Decoration Day?

“The March of Time” (oil on canvas), Decoration Day {now called Memorial Day} in Boston, by Henry Sandham (1842-1910), a Canadian painter.


SOMERVILLE, Mass.

Decoration Day began on May 1, 1865, in Charleston, S.C., when an estimated 10,000 people, most of them former slaves, paraded to place flowers on the newly dug graves of 257 Union soldiers who had been buried without coffins behind the grandstand of a race course. They had been held in the infield without tents, as prisoners of war, while Union batteries pounded the city’s downtown during the closing days of the Civil War.

The evolution of Decoration Day over the next fifty years was one of the questions that led historian David W. Blight to write Race and Reunion: The Civil War in American Memory (Harvard, 2001). After Blight’s book appeared, it was quickly overshadowed by the events of 9/11.  Eric Foner conveyed its message most clearly in The New York Times Book Review – but only on page 28.  Today Race and Reunion is more relevant than ever. For a better idea of what the book is about than I can give you, read Foner’s review.

When I was a kid, Decoration Day, May 30, was still ostensibly about remembering the Civil War, but the events of that May day in Charleston were no part of the story (though the POW camp at Andersonville, Ga., certainly had become part of the lore). The names of veterans of various wars were read on the village green.  A bugler played taps. Decoration Day had been proclaimed a day of commemoration in 1868, when the commander of the Grand Army of the Republic ordered soldiers to visit their comrades’ graves. In 1890 it was declared a state holiday in New York.

And by the time that Woodrow Wilson, the first Southerner to be elected president since the Civil War, spoke at Gettysburg, on July 4, 1913, fifty years after the battle itself, the holiday had become national – but the experiences of black Americans had all but dropped out of the narrative. The hoopla was about the experiences of the Blue and the Gray, never mind that many blacks had served in the Union army.

Soon after the war had ended, another war had begun, a contest of ideas about how the meaning of the war was to be understood: the emancipation of the slaves vs. the reconciliation of the contending armies. The politics of Reconstruction – the attempted elevation of Blacks to full citizenship and constitutional equality – ended in defeat. In his book, Blight wrote, “The forces of reconciliation overwhelmed the emancipation vision in the national culture.” Decoration Day gradually became Memorial Day, just as Armistice Day in November became Veterans Day. Americans got what the novelist William Dean Howells said they inevitably wanted:  tragedies with happy endings.

The age of segregation didn’t end until the Sixties. Black leaders such as Frederick Douglass and W.E.B. Du Bois had burnished the vision of emancipation. Educators, writers, and agitators articulated it and put it into practice. A second Reconstruction began in the years after World War II. In the 1960s the Civil Rights Movement reached a political peak.  A new equilibrium was achieved and lasted for a time.

So don’t fret about “Critical Race Theory.”  A broad-based Third Reconstruction has begun.  Blight’s book was an early text, as was Derrick Bell’s Faces at the Bottom of the Well: The Permanence of Racism, which appeared in 1992. The tumult will continue for some time. Rising generations will take account of it. A new equilibrium will be attained. It will last for a time, before a Fourth Reconstruction begins.

In the meantime, the new holiday of Juneteenth is an appropriate successor to the original Decoration Day – a civic holiday of importance second only to the Fourth of July.

David Warsh, a veteran columnist and an economic historian, is proprietor of Somerville-based economicprincipals.com, where this column originated.


David Warsh: Through the decades with my brilliant friend and occasional rescuer The Copy Editor


SOMERVILLE, Mass.

Economic Principals has been preparing for months to move to Substack’s publishing platform next week, figuring out what to bring and what to leave behind. A critical feature also will make the move: EP’s conscience, teacher and fellow-traveler, The Copy Editor.

We met mornings some forty years ago, walking to the paper {The Boston Globe} from the train. He was working in the library at the time, having turned down an offer from The Atlantic Monthly. I was a newly hired economics reporter. We’d attended the same college, Harvard, 15 years apart, and read some of the same books. We had undertaken similar undergraduate theses: his on historian Henry Adams and critic Edmund Wilson, in History and Literature; mine on Henry Adams and newspaper columnist Joseph Alsop, a China hand, in Social Studies. The Copy Editor finished his thesis with great distinction and graduated with high honors; I abandoned mine and graduated with no distinction.

Clear from the beginning was that he was unusually acute – more acute than I about many things; faster, too. One day without thinking I exaggerated the barriers I had run into as a quarter-miler in high school, claiming 51-second laps, perhaps, instead of 53. He had been an alternate in the mile relay for a team that had won the state championship. He forgave and remembered.

Before long he had moved to the book department. The Globe had some 550 editorial employees in those days.  I keep a photo on the office wall of a house ad, “Every One’s a Critic”: fifteen lively souls arrayed on stools, The Copy Editor among them. By then it was clear that he was a prodigy; what was unusual was that he served as cook and bottle-washer as well. He displayed deeply ingrained habits: helping others, performing introductions, giving parties, constructing networks. We lived near one another, knew each other’s families and friends.

Certain things stand out, none more than “Wing Tips on the Beach,” a Sunday feature story that became one of three finalists in its category for a Pulitzer Prize in 1994. To this day, I have never read a more revealing interpretation of Richard Nixon than that meditation on a famous photograph of the former president. It did not win, but the author persevered, devising a more capacious framework for his story. When Nixon at the Movies: A Book about Belief appeared in 2004, it quickly gained a place on the relatively short shelf of indispensable second-generation receptions of the Nixon story. And when the Pulitzer finally came, in 2008, it was for criticism, specifically “For his penetrating and versatile command of the visual arts, from film and photography to painting.”

At some point The Copy Editor had begun to read what I wrote before I turned it over to the editor’s desk, to “save you from yourself,” he regularly explained. The tumult of the sale of the paper to The New York Times cost us much. He remained at The Globe and became ever more one of its foremost citizens, knowledgeably recalling in print the long-ago saga of the Bulger clan one week; visiting a museum or reviewing the latest installment of the Fast and Furious movies the next; educating the stream of talented newcomers to the paper all the while. He stayed with me, too, after I left the paper, in 2002.

Almost certainly I would not have kept at EP if he had not. We have had our occasional, sometimes major, differences of opinion. His relatively humble title is designed to emphasize that he is not responsible for opinions published here. But never have I had a friend as loyal, generous and shrewd as The Copy Editor.

David Warsh, a veteran columnist and an economic historian, is proprietor of Somerville-based economicprincipals.com, where this column originated.         

David Warsh: Spencer Weart sums up the global-warming crisis that's upon us

Average surface air temperatures from 2011 to 2020 compared to a baseline average from 1951 to 1980 (Source: NASA)


SOMERVILLE, Mass.

Every April since I first read it, in 2004, I take down and re-read some portions of my copy of The Discovery of Global Warming (Harvard), by Spencer Weart. (The author revised and expanded his book in 2008.) I never fail to be moved by the details of the story: not so much his identification of various major players among the scientists – Arrhenius, Milankovitch, Keeling, Bryson, Bolin – but by the account of the countless ways in which the hypothesis that greenhouse-gas emissions might lead to climate change was broached, investigated, turned back on itself (more than once), debated and, eventually, confirmed.

In the Sixties, Weart trained as an astrophysicist. After teaching for three years at Caltech, he re-tooled as a historian of science at the University of California at Berkeley. Retired since 2009, he was for 35 years director of the Center for the History of Physics of the American Institute of Physics, in College Park, Md.

This year, too, I looked at the hypertext site with which Weart supports his much shorter book, updating it annually in February, incorporating all manner of new material. It includes recent scientific findings, policy developments, material from other histories that are beginning to appear. The enormous amount of material is daunting. Several dozen new references were added this year, ranging from 1956 to 2021, bringing the total to more than 3,000 references in all. Then again, all that is also reassuring, exemplifying in one place the warp and woof of discussion taking place among scientists, of all sorts, that produces the current consensus on all manner of questions, whatever it happens to be. Check out the essay on rapid climate change, for example.

Mainly I was struck by the entirely rewritten Conclusions-Personal Note, reflecting what he describes as “the widely shared understanding that we have reached the crisis years.”

Global warming is upon us. It is too late to avoid damage — the annual cost is already many billions of dollars and countless human lives, with worse to come. It is not too late to avoid catastrophe. But we have delayed so long that it will take a great effort, comparable to the effort of fighting a world war — only without the cost in lives and treasure. On the contrary, reducing greenhouse gas pollution will bring gains in prosperity and health. At present the world actually subsidizes fossil fuel and other emissions, costing taxpayers some half a trillion dollars a year in direct payments and perhaps five trillion in indirect expenses. Ending these payments would more than cover the cost of protecting our civilization.

Plenty else is going on in climate policy. President Biden is hosting a virtual Leaders Summit on Climate on Thursday, April 22 (Earth Day) and Friday, April 23. Nobel laureate William Nordhaus publishes next month The Spirit of Green: The Economics of Collisions and Contagions in a Crowded World (Princeton), reinforcing Weart’s conviction that it actually costs GDP not to impose a carbon tax on polluters.  Public Broadcasting will roll out later this month a three-part series in which the BBC follows around climate activist Greta Thunberg in A Year to Change the World. And Stewart Brand, who in 1968 published the first Whole Earth Catalog, with its cover photo of Earth seen from space, is the subject of a new documentary, We Are as Gods, about to enter distribution. There is other turmoil as well. But if you are looking for a way to observe Earth Day, reading Spencer Weart’s summing-up is an economical solution.

David Warsh, an economic historian and a veteran columnist, is proprietor of Somerville-based economicprincipals.com, where this column first appeared.

© 2021 DAVID WARSH, PROPRIETOR

—Graphic by Adam Peterson


     

David Warsh: Economic models and engineering


SOMERVILLE, Mass.

The best book about developments in the culture of professional economics to appear in the last quarter century is, in my opinion, The World in the Model: How Economists Work and Think (Cambridge, 2012), by Mary S. Morgan, of the London School of Economics and the University of Amsterdam. The best book of the quarter century before that is, again, according to me, An Engine, Not a Camera: How Models Shape Financial Markets (MIT, 2006), by Donald MacKenzie, of the University of Edinburgh.

Both books describe the introduction of mathematical models in the years beginning before World War II. Both consider how the subsequent use of those techniques has changed how economics is done by economists. Morgan’s book is about the kinds of models that economists devise experimentally, not those that interest MacKenzie most, models designed to be tested against the real world.  A deft cover illustrates Morgan’s preoccupation, showing the interior of a closed room with only a high window. On the floor of the room is drawn a graphic diagram of supply and demand. The window opens only to the sky outside, above the world itself, a world the model-builder cannot see. The introduction of statistical inference to economics she dealt with in The History of Econometric Ideas (Cambridge, 1990).

I remember the surprise I felt when I first read Morgan’s entry “Economics” in The Cambridge History of Science Volume 7: The Modern Social Sciences (Cambridge, 2003). She described two familiar wings of economics, often characterized in the 19th Century as “the science of political economy” and “the art of economic governance.” Gradually in that century they were relabeled “positive” economics (the way it is, given human nature) and “normative” economics (the way it ought to be).

Having practiced economics in strictly literary fashion during the modern subject’s first century, Morgan continued, economists in the second half of the 19th Century began adopting differential calculus as a language to describe their reasoning. In the 20th Century, particularly its second half, the two wings have been firmly “joined together” by their shared use of “a set of technologies,” consisting mainly of mathematics, statistics and models.  Western technocratic economics, she wrote, had thereby become “an engineering science.”

I doubted at the time that it was especially helpful to think of economics that way.

Having read Economics and Engineering: Institutions, Practices, and Cultures (2021, Duke), I still doubt it. That annual conference volume of the journal History of Political Economy appeared earlier this year, containing 10 essays by historians of thought, with a foreword by engineering professor David Blockley, of the University of Bristol, and an afterword by Morgan herself. Three developments – the objectification of the economy as a system; the emergence of tools, technologies and expertise; and a sense of the profession’s public responsibility – had created something that might be understood as “an engineering approach” to the economy and in economics, writes Morgan. She goes on to distinguish between two modes of economic engineering, start-fresh design and fix-it-up problem-solving, noting that enthusiasm for the design or redesign of whole economies and/or vast sectors of them had diminished in the past thirty years.

It’s not that the 10 essays don’t make a strong case for Morgan’s insights about various borrowings from engineering that have occurred over the years: in particular, Judy Klein, of Mary Baldwin University, on control theory and engineering; Aurélien Saïdi, of the University of Paris Nanterre, and Beatrice Cherrier, of the University of Cergy Pontoise and the Ecole Polytechnique, on the tendencies of Stanford University to produce engineers; and William Thomas, of the American Institute of Physics, on the genesis at RAND Corp. of Kenneth Arrow’s views of the economic significance of information.

My doubts have to do with whether the “science” of economics and the practice of its application to social policy have in fact been “firmly joined” together by the fact that both wings now share a common language. I wonder whether more than a relatively small portion of what we consider to be the domain of economic science is sufficiently well understood and agreed upon by economists themselves as to permit “engineering” applications.

Take physics. In the four hundred years since Newton many departments of engineering have been spawned: mechanical, civil, electrical, aeronautical, nuclear, geo-thermal. But has physics thereby become an engineering science?  Did the emergence of chemical engineering in the 1920s change our sense of what constitutes chemistry? Is biology less a science for the explosion of biotech applications that has taken place since the structure of the DNA molecule was identified in 1953? Probably not.

Some provinces of economics can be considered to have reached the degree of durable consensus that permits experts to undertake engineering applications.  I count a dozen laureates as having shared Nobel prizes for work that can legitimately be described as economic engineering: Harry Markowitz, Merton Miller and William Sharpe, in 1990, for “pioneering work in financial economics”; Robert Merton and Myron Scholes, in 1997, “for a new method to determine the value of derivatives”; Lloyd Shapley and Alvin Roth, in 2012, “for the theory of stable allocations and the practice of market design”; Abhijit Banerjee, Esther Duflo and Michael Kremer, in 2019, for “their experimental approach to alleviating global poverty”; and Paul Milgrom and Robert Wilson, in 2020, for “improvements to auction theory and inventions of new auction formats.”

This is where sociologist Donald MacKenzie comes in. In An Engine, Not a Camera, he describes the steps by which, in the course of embracing the techniques of mathematical modeling, finance theory became “an active force transforming its environment, not a camera passively recording it” – an engine remaking its world. When market traders themselves adopted models from the literature, the new theories brought into existence the very transactions of which abstract theory had spoken – and then elaborated them. Markets for derivatives grew exponentially.  Such was the “performativity” of the new understanding of finance. After all, writes Morgan in her afterword, hasn’t remaking the world been the goal of economic-engineering interventions all along?

Natural language has a knack for finding its way in these matters. We speak easily of “financial engineering” and “genetic engineering.”  But “fine-tuning,” the ambition of macro-economists in the 1960s, is a dimly remembered joke. The 1942 photograph on the cover of Economics and Engineering – graduate students watching while a professor manipulates a powerful instrument laden with gauges and controls – seems like a nightmare version of the film The Wizard of Oz.

John Maynard Keynes memorably longed for the day when economists would manage to get themselves thought of as “humble, competent people on a level with dentists.”  Nobel laureate Duflo a few years ago compared economic fieldwork to the plumbers’ trade.  “The scientist provides the general framework that guides the design…. The engineer takes these general principles into account, but applies them to a specific situation…. The plumber goes one step further than the engineer: she installs the machine in the real world, carefully watches what happens, and then tinkers as needed.”

The $1.9 trillion American Rescue Plan Act that became law last week, with its myriad social programs, is not founded on what “the science” says. It is an intuition, an act of faith. Better to continue to refer to most economic programs as “strategies” and “policies” instead of “engineering,” and consider effective implementations to be artful work.

David Warsh, an economic historian and a veteran columnist, is proprietor of Somerville-based economicprincipals.com, where this column first appeared.

      Copyright 2021 by David Warsh, proprietor    

David Warsh: Of Biden in '72, Trump now and economic engineering

Reading The Wealth of Nations was a revelation for David Warsh


Calibrate, v. trans.  To determine the caliber of; spec. to try the bore of a thermometer, or similar instrument, so as to allow in graduating it for any irregularities; to graduate a gauge of any kind with allowance for its irregularities.

Nearly fifty years ago, I showed up as a new employee at the Wilmington (Del.) News-Journal on election night, 1972, the evening that New Castle County Councilor Joe Biden was elected to the U.S. Senate. Biden was 29, I was 28. The second-floor newsroom bubbled with intoxicating excitement and indignation. (President Richard Nixon had carried 49 states). But when Biden was elected president last autumn, I felt as though I had somehow turned the page.

I don’t know whether it was four years of Donald Trump, twelve months of COVID-19, the passage of nearly twenty years since I last worked for a newspaper, or faint memories of New Castle County politics, but a week ago I found the column I was working on, about economics and engineering, more interesting than the prospects of Biden’s presidency. I recognized late in the day that I hadn’t found a way to make it interesting to readers of Economic Principals, my Web site. So I put the topic aside and took a bye.  Walking home, I remembered the title of Albert Hirschman’s little book Shifting Involvements: Private Interests and Public Action.

On the other hand, as a copyboy at Chicago’s City News Bureau in 1963, my second assignment had been to cover a Walter Heller press conference at the Palmer House hotel. Heller was then chairman of the Council of Economic Advisers. My attendance at the session was perfunctory; the news desk didn’t want a story. But from the Palmer House on, I was looking someplace other than City Hall in which to invest my interests.

That was all the more so after I returned to college to read history of social thought. Just five authors were on the calendar of the sophomore tutorial that year: Alexis de Tocqueville, Karl Marx, Emile Durkheim, Max Weber and Sigmund Freud. That was the year they omitted Adam Smith altogether and his Inquiry into the Nature and Causes of the Wealth of Nations, perhaps because of the temper of the times (this was 1970-71). When a couple of years after that, in the course of another assignment, I went back to read Smith for myself, it was a revelation.  I have been fundamentally interested ever since in the stories we tell (and don’t tell) about economics.

Take that column about economic engineering. I found myself thinking over the last few days not so much about the conclusion of an independent market monitor that the operator of Texas’s power grid had overcharged residents as much as $16 billion during a cold snap last month as about Gov. Greg Abbott’s decision to end the state’s mask mandate and permit businesses to reopen at 100 percent capacity.

The decision to permit the highest legal rates to apply for as much as 32 hours longer than warranted was reviewed Friday by the Electric Reliability Council of Texas, which declined to reverse the charges, even though the principles would seem to be well-established among specialists. “It’s just nearly impossible to unscramble this sort of egg,” the new chair of the Public Utility Commission said during a commission meeting.

But principles for telling people when and where to wear masks are anything but well-understood and universally agreed-upon. They have something to do with what we call culture. As economist David Kreps said of his 2018 book, The Motivation Toolkit: How to Align Your Employees’ Interests with Your Own: “My colleagues here [at Stanford University and its Graduate School of Business] don’t think this is economics, but it is.” I cannot seem to put that book away.

Economic Principals says in its flag: “Economic Principals: a weekly column about economics and politics, formerly of The Boston Globe, independent since 2002.” It has written frequently about politics for the last five years.  Going forward, I hope to put the subject mostly on autopilot.

The Trump presidency was accidental. It happened only because so many voters deemed inappropriate a Hillary Rodham Clinton presidency. Trump’s tenure proved to be a turning point, a climax, a crisis that slowly will resolve.

How close he came to re-election! How desperately he fought to hold on to office, even after he lost!  The sharpest student of Trump’s career I know, who follows the literature much more widely than I do, believes it was because the now-former president understands that he is facing ruin. Suspicion of money laundering for Russian purchasers of apartments is at the heart of the case.

What I expect to happen is this: the Republican Party will gradually rebuild itself, election by election, as new generations – Gen X (b. 1961-1981) and the Millennials (b. 1982-2004) – take over from the Boomers (b. 1943-1960). The Democratic Party will take advantage of a strong economy and threats posed by the rising great-power competitor that is China to deliver the country into a new era. Congressional elections will be closely fought every two years, but I am guessing that the Democrats may remain in the White House until 2032, though only the Republicans have won the presidency three straight times since Harry Truman.

Meanwhile, I’ll return to engineering and economics next week, and try again to find something interesting to say about blueprints, instruments, and toolkits. Plenty of other columns are already in line. Re-calibration complete!

David Warsh, a veteran columnist and an economic historian, is proprietor of Somerville, Mass.-based economicprincipals.com, where this column first ran.

David Warsh: An old man against the world; don't eat WSJ baloney on Texas crisis

Rupert Murdoch


SOMERVILLE, Mass.

Financial Times columnist Simon Kuper chose a good week in which to write about a leading skeptic of climate change. “For all the anxiety about fake news on social media,” Kuper wrote last weekend, “disinformation on climate seems to stem disproportionately from one old man using old media.”

He meant Rupert Murdoch, 89, whose company, News Corp., owns The Wall Street Journal, the New York Post, The Times of London, and half the newspaper industry in Australia. Honored with a lifetime achievement award in January by the Australia Day Foundation, a British organization, Murdoch posted a video:

“For those of us in media, there’s a real challenge to confront: a wave of censorship that seeks to silence conversation, to stifle debate, to ultimately stop individuals and societies from realizing their potential. This rigidly enforced conformity, aided and abetted by so-called social media, is a straitjacket on sensibility. Too many people have fought too hard in too many places for freedom of speech to be suppressed by this awful ‘woke’ orthodoxy.”

There is some truth in that, of course – but not enough to justify the misleading baloney on the cause of the crisis in Texas that the editorial pages of the WSJ published last week.

Murdoch is a canny newspaperman, and since acquiring The WSJ in 2007 he has had the good sense not to tamper overmuch with its long tradition of sophistication and sobriety in its news pages. He even replaced the man he had put in charge of the paper, Gerard Baker, after staffers’ complaints that Baker’s thumb was found too frequently on the scale of its coverage of Donald Trump.

Then again, neither has he tinkered with the more controversial traditions of the newspaper’s editorial opinion pages, to which Baker has since been reassigned as a columnist. The story has often been told about how managing editor Barney Kilgore transformed a small-circulation financial newspaper competing mainly with The Journal of Commerce in the years before World War II into a nationwide competitor to The New York Times, and worth $5 billion to Murdoch. A major contributor to it was Vermont Royster, editor of the editorial pages from 1958 to 1971 and an occasional columnist for the paper for another two decades after that. He was succeeded by Robert Bartley. Royster died in 1996.

In 1982, Royster characterized the beginnings of the change this way: “When I was writing editorials, I was always a little bit conscious of the possibility that I might be wrong. Bartley doesn’t tend to do that so much. He is not conscious of the possibility that he is wrong.”

Royster hadn’t seen anything yet. With every passing year, Bartley became firmer in his opinions. In the 1990s, his editorial pages played a leading role in bringing about the impeachment of President Clinton.  Bartley died in 2003, and was succeeded by Paul Gigot, who has presided over a continuation of the tradition of hyper-confidence. The editorial page enthusiastically supported Donald Trump until the Jan. 6 assault on the Capitol.

Last week the WSJ published four editorials on the situation in Texas.

Tuesday, A Deep Green Freeze: “[A]n Arctic blast has frozen wind turbines. Herein is the paradox of the left’s climate agenda: the less we use fossil fuels, the more we need them.”

Wednesday, Political Making of a Power Outage: “The Problem is Texas’s overreliance on wind power that has left the grid more vulnerable to bad weather than before.”

Thursday, Texas Spins in the Wind: “While millions of Texans remain without power for the third day, the wind industry and its advocates are spinning a fable that gas, coal, and nuclear plants – not their frozen turbines – are to blame.”

Saturday, Biden Rescues Texas with… Oil: “The Left’s denialism that the failure of wind power played a starring role in Texas’s catastrophic power outage has been remarkable.”

Then on Saturday the news pages weighed in, flatly contradicting the ongoing editorial-page version of events with a thoroughly reported account of its own, The Texas Freeze: Why State’s Power Grid Failed: “The core problem: Power providers can reap rewards by supplying electricity to Texas customers, but they aren’t required to do it and face no penalties for failing to deliver during a lengthy emergency.

“That led to the fiasco that left millions of people in the nation’s second-most populous state without power for days. A severe storm paralyzed almost every energy source, from power plants to wind turbines, because their owners hadn’t made the investments needed to produce electricity in subfreezing temperatures.”

All three major American newspapers face big decisions in the coming year: Amazon’s Jeff Bezos must replace retiring executive editor Martin Baron at The Washington Post; New York Times publisher A.G. Sulzberger presumably will name a successor to executive editor Dean Baquet, who turns 65 in September (66 is the retirement age there); and Murdoch presumably will replace Gigot, who will be 66 in May.

The WSJ editorial page could play an important role in American politics going forward by sobering up. But only if Murdoch – or, more likely, his eldest son, Lachlan, who turns 50 this autumn – selects an editor who writes sensibly, conservatively, about dealing with climate change.   

David Warsh, an economic historian and a veteran columnist, is proprietor of Somerville-based economicprincipals.com, where this column first ran.

Editor’s note: Both Mr. Warsh and New England Diary’s editor, Robert Whitcomb, are Wall Street Journal alumni.


Of Harvard, Summers, Russia and the future

The Kremlin

— Photo by A.Savin

SOMERVILLE, Mass.

Some years ago, I set out to write a little book about Harvard University’s USAID project to teach market manners to Boris Yeltsin’s Russian government in the 1990s. The project collapsed after leaders of the Harvard mission were caught seeking to line their own pockets by gaining control of an American firm they had brought in to advise the Russians. Project director Andrei Shleifer was a Harvard professor. His best friend, Lawrence Summers, was U.S. assistant Treasury secretary at the time.

There was justice to be served. The USAID officer who blew the whistle, Janet Ballantyne, was a Foreign Service hero. The victim of the squeeze, John Keffer, of Portland, Maine, was an exemplary American businessman, high-minded and resourceful.

But I had something besides history in mind. By adding a chapter to David McClintick’s classic story of the scandal, “How Harvard Lost Russia” (Institutional Investor magazine, 2006), I aimed to make it more complicated for former Treasury Secretary Summers, of Harvard University, to return to a policy job in a Hillary Rodham Clinton administration.

It turned out there was no third Clinton administration. My account, “Because They Could,” appeared in 2018. So I was gratified last August when, with the presidential election underway, Summers told an interviewer at the Aspen Security Forum that “My time in government is behind me and my time as a free speaker is ahead of me.” Plenty of progressive Democrats had objected to Summers as well.

Writing about Russia in the 1990s meant delving deeper into the history of U.S.-Russia relations than I had before. I developed the conviction that, during the quarter century after the end of the Cold War, U.S. policy toward Russia had been imperious and cavalier.

By 1999, Yeltsin was already deeply upset by NATO expansion. The man he chose to succeed him was Vladimir Putin. It wasn’t difficult to follow the story through Putin’s eyes. He was realistic to begin with, and, after 9/11, hopeful (Putin was among the first foreign leaders to offer assistance to President George W. Bush).

But NATO’s 2002 invitation to the Baltic states — Lithuania, Latvia and Estonia, all former Soviet republics — the U.S. invasion of Iraq, and the Bush administration’s supposed failure to share intelligence about the siege of a school in Beslan, Russia, led to Putin’s 2007 Munich speech, in which he complained of America’s “almost uncontained hyper use of force in international relations.”

Then came the Arab Spring. NATO’s intervention in Libya, ending in the death of Muammar Gaddafi in 2011, was followed by Putin’s decision to reassume the Russian presidency in 2012, displacing his hand-picked successor, Dmitry Medvedev. Putin blamed Hillary Clinton for disparaging his campaign.

And in March 2014, Putin’s plans to further a Eurasian Union via closer economic ties with Ukraine having fallen through, Ukrainian President Viktor Yanukovych fled to Moscow in the face of massive pro-European Union demonstrations in Kyiv’s Maidan Square. Russia seized and annexed Ukraine’s Crimean Peninsula soon after that.

The Trump administration brought a Charlie Chaplin interlude to Russian-American relations. Putin saw no problem: He offered to begin negotiating an anti-hacking treaty right away.  Neither did Trump:  Remember Russian Foreign Minister Sergey Lavrov’s Oval Office drop-by, the day after the president fired FBI Director James Comey?

Only the editorial board of The Wall Street Journal, among the writers I read, seemed to think there was nothing to worry about in Trump’s ties to Russia. Meanwhile, Putin rewrote the Russian Constitution once again, giving himself the opportunity to serve until 2036, when he will be 84.

But Russia’s internal history has taken a darker turn with the return of Alexei Navalny to Moscow. The Kremlin critic maintains that Putin sought his murder in August, using a Soviet-era chemical nerve agent. Navalny survived, and spent five months under medical care in Germany before returning.

Official Russian media describe Navalny as a “blogger,” when he is in fact Russia’s opposition leader. He has been sentenced to at least two and a half years in prison on a flimsy charge, and faces other indictments. But his arrest sparked the largest demonstrations across Russia since the final demise of the Soviet Union. More than 10,000 persons have been detained, in a hundred cities across Russia, according to Robyn Dixon, of The Washington Post. Putin’s approval ratings stand at 29 percent.

What can President Biden do? Very little. However much Americans may wish that Russian leaders shared their view of human rights, it should be clear by now that there is no alternative but to deplore, to recognize Russian sovereignty, to encourage Russia’s legitimate business interests, to discourage its trickery, and otherwise to hope for the best. There are plenty of problems to work on at home.

David Warsh, an economic historian and veteran columnist, is proprietor of economicprincipals.com, where this column first appeared.

           

David Warsh: Toward a third Reconstruction

The March on Washington, D.C., on Aug. 28, 1963, during the Civil Rights Movement — what might be called the “Second Reconstruction.”

SOMERVILLE, Mass.

Until recently, Reconstruction was a topic in American history of interest chiefly to high school juniors preparing to take the college Advanced Placement exam. During the 13 years after the Civil War, the United States reintegrated the states that had seceded from the Union and struggled to define the legal status of African-Americans in them under the 13th, 14th and 15th Amendments to the Constitution. By 1913, when President Woodrow Wilson reintroduced segregation to the federal workforce, the hard-fought gains of the episode had faded from living memory.

Then again, every American born before, say, 1960, has first-hand experience of the Civil Rights Movement. It is often dated from President Harry Truman’s 1948 decision to integrate the U.S. armed forces, after the contradictions of segregation re-emerged and became untenable during World War II. There was Jackie Robinson and the integration of Major League Baseball, and then the marches, with their dramatic confrontations. Martin Luther King Jr. was assassinated in 1968; George Wallace was roundly defeated as a third-party presidential candidate (though he did better than any third-party candidate between Theodore Roosevelt and H. Ross Perot). After 1970, most people turned their attention to other concerns. Ill-feelings were cosmetically treated away on television: Archie Bunker and The Cosby Show.

Events of the last several years, often summed up by the assertion that Black lives matter, have been portrayed as the beginnings of a Third Reconstruction. The implication is that the Civil Rights Movement was the second; historian C. Vann Woodward said as much. The Rev. William Barber II, pastor of Greenleaf Christian Church, in Goldsboro, N.C., anticipated a third in 2016 with The Third Reconstruction: Moral Mondays, Fusion Politics, and the Rise of a New Justice Movement. There may one day be a fourth.

The view that the history of the United States is essentially inseparable from the history of slavery was forcefully voiced by The New York Times, in 2019, in its 1619 Project. “Our democracy’s founding ideals were false when they were written,” asserted Nikole Hannah-Jones in an opening essay. “Black Americans have fought to make them true.” Not everyone was convinced. But the Trump administration’s rejoinder, the “1776 Report” of its 1776 Commission, released last week, has been quickly dismissed. The inauguration ceremonies brought all this to mind. Three books, more than any others in the last 30 years, have opened my eyes:

The Promised Land: The Great Black Migration and How It Changed America (1991), by Nicholas Lemann, then of The Atlantic Monthly, decisively put on the map the enormous changes wrought after 1944 by the introduction of the mechanical cotton picker, which replaced workers whose numbers had soared after 1794, when the invention of the cotton gin made the crop much more profitable. The subsequent migration of unemployed farm workers from the rural South to the metropolitan North brought a cascade of changes in the lives of the migrants and of the cities in which they sought homes and jobs. Ghettoes, unemployment, single-parent families, drugs and crime were among the unintended effects. So were newfound political power and, for many, greater affluence.

Race and Reunion: The Civil War in American Memory (2002), by David Blight, of Yale University, demonstrated that, of the three quite different stories that emerged from the Civil War, it was the vision of reconciliation between the mostly white armies of the North and the South that came to dominate, permitting a white-supremacist vision of continuing racial segregation and reasserted white privilege to eclipse an emancipationist vision of constitutional equality for African-American citizens. Blight followed up with a Pulitzer-Prize-winning biography, Frederick Douglass: Prophet of Freedom (2018).

The Color of Law: A Forgotten History of How Our Government Segregated America (2017), by Richard Rothstein, of the Economic Policy Institute, recovered a lost history of how bankers and real-estate agents successfully enlisted federal, state and local policies to create and maintain racially homogenous neighborhoods in cities and suburbs nationwide. The patterns of segregation that resulted violate constitutional rights, he argued, and now require remediation. His memorable account of how such policies loomed in the background of the 2014 killing of Michael Brown, in Ferguson, Mo., is here.

Last week I read Revisiting Time on the Cross after 45 Years: The Slavery Debate and the New Economic History, by Eric Hilt, of Wellesley College and the National Bureau of Economic Research. Hilt noted the wave of celebrated books that have appeared in recent years evaluating the role of slavery in the development of the American economy, among them Walter Johnson’s River of Dark Dreams; Edward Baptist’s The Half Has Never Been Told; Sven Beckert’s Empire of Cotton; and Beckert and Seth Rockman’s Slavery’s Capitalism.

Yet some of the arguments resemble those that appeared first in Time on the Cross: The Economics of American Negro Slavery, by Robert Fogel and Stanley Engerman, published in 1974. Time on the Cross was a work of enormous novelty, undertaken in the service of the much-ballyhooed variety of economic history calling itself “cliometrics” (or “econometric history”). Its fundamental assertions were, as Hilt puts them, that slavery was “profitable, productive and humane.” A storm of controversy followed.

The debate over measurement issues has moved on since then, Hilt notes; the technical literature has become hard for layfolk to follow.   Time on the Cross’s assertions of the fundamental benevolence of slaveholders have been thoroughly disproved. Yet Fogel and Engerman’s purely economic conclusions about the profitability and productivity of slavery stand up pretty well. Fogel later shared a Nobel Prize with economic historian Douglass North.

The slave economies of the South were thriving before the Civil War. Secessionist politicians and their business backers knew it. The North undertook the Civil War for the best of reasons.  Its leaders knew that slavery was wrong. A hundred and fifty years later, Americans of all sorts are still working to mitigate its ill-effects.

David Warsh, a veteran columnist and an economic historian, is proprietor of Somerville-based economicprincipals.com, where this essay first appeared.

    

David Warsh: Aaron Burr, the Gunpowder Plot and our current scoundrel-in-chief

Vice President Aaron Burr in 1802

SOMERVILLE, Mass.

Donald Trump has begun appearing in the rear-view mirror.  I have compared Trump to Aaron Burr, the scoundrel who sought to overturn the 1800 presidential election in the early days of the Republic. Hit this link for an explanation of what he did.

Later, starting in 1804, Burr sought to invent around U.S. elections altogether, hoping to foment a breakaway nation in the Spanish Southwest that he could govern.

No other episode in American history comes close. But after following the news about the Jan. 6 assault on the Capitol, I’m inclined to believe that the history of England offers a more illuminating comparison. I’ve been thinking about Guy Fawkes and the 17th-century Gunpowder Plot.

The background was the Protestant Reformation, which had begun in Germany, in 1517, with Martin Luther. Starting in 1533, King Henry VIII withdrew England from its allegiance to the Roman Catholic Church. Catholics, especially Jesuit priests, had a hard time of it under Henry’s daughter Queen Elizabeth I. Elizabeth’s Catholic cousin, Mary, Queen of Scots, was executed for treason in 1587.

Elizabeth died childless, in 1603, without having named an heir. Queen Mary’s son, Elizabeth’s cousin, peacefully acceded to the throne as James I of England and James VI of Scotland. But repression of Catholics continued, and in 1605 a group of English Catholics plotted to blow up the House of Lords during the opening of Parliament, on Nov. 5. Betrayed by a letter of seemingly mysterious provenance, one of the plotters, Guy Fawkes, was discovered the night before, guarding 36 barrels of gunpowder in the basement, quite enough to demolish the building. The plotters were captured and, one way or another, put to death, including some Catholic clergy linked to the plot.


The world was smaller then. Passions ran deeper. Globalization had only just begun. But it is hard to think of any other episode in Anglo-American history whose aim more closely resembled that of the mob that descended on the U.S. Capitol on Jan. 6, having been dispatched on its mission by President Trump.

One group of skirmishers came within seconds of sighting Vice President Pence as he, along with his wife and daughter, was spirited out of the Senate chamber and into a hideaway, according to a Jan. 16 story in The Washington Post. The gunpowder plotters intended to kill King James and install his nine-year-old daughter on the throne as a Catholic monarch. The Capitol Hill mob vigorously denounced Pence as a traitor for his failure to overturn the presidential-election results. A mock gallows was installed on the western approach to the Capitol.

Persecution of Catholics continued for the remainder of the reign of James I – he died in 1625 – though it was mild relative to the century before. Anti-Catholic sentiment persisted in England for two centuries, through a not-unrelated civil war and a very-related “Glorious Revolution.”

No one knows what the long-term effects of the assault on the Capitol might be. My hunch is that it will be remembered as thoroughly repugnant, and condemned until it is forgotten. For a higher-resolution characterological account of the Trump story, see “What TV Can Tell Us about How the Trump Show Ends,” by Joanna Weiss. The Trump presidency resembled an antihero drama more closely than reality TV, Weiss says, the saga of Tony Soprano being her chief case in point.

With this emergency behind us, I am returning this weekly to my main concern, economics, meanwhile finishing a book on the last hundred years of its textbook versions. As for my plans to shift to the Substack publishing platform, they are postponed; Economic Principals will move at the end of the year, by which time the book will have begun wending its way to the press. Luck be a lady this year!

David Warsh, a veteran columnist and an economic historian, is proprietor of Somerville, Mass.-based economicprincipals.com, where this column originated.