Whatever the Baby Boomers' irrational reverence for everything 1960s (incense, peppermints and the like), give them their due for recalling the golden anniversary of a truly momentous year: 1965.
Camelot faded and innocence gave way to armaments when the first combat troops (3,500 Marines) were dispatched to Vietnam in March; by November the Pentagon had informed President Lyndon Johnson that it needed 400,000 personnel to vanquish the Viet Cong. Thus the stain of Vietnam became the defining event for a generation of Americans.
But the '60s were more than the turbulence of war. In fact, 1965 would have been memorable simply for the panorama it cast: the Social Security Amendments (Medicare and Medicaid); the Voting Rights Act; the Immigration and Nationality Act; the first crewed flights of the Gemini space program; the closing of the Second Vatican Council (from which emerged three future popes); Casey Stengel's retirement from baseball after 56 years; and television's debut of the jazz-tinged A Charlie Brown Christmas.
Hurricane Betsy, with winds of 145 mph, roared by New Orleans, killing 76, and became the first hurricane to cause a billion dollars in damage. The Gateway Arch was completed in St. Louis. Bob Dylan went electric at Newport and the Beatles went to Shea Stadium in New York. Rebellion occurred in Watts and demonstrations in Selma.
However, three unrelated but monumental developments, all within six weeks of each other, meant that 1965 would be the most consequential year of 20th-century American history as a predictor of the cultural, political and technological condition of early 21st-century America: the Moynihan Report, passage of the Elementary and Secondary Education Act, and publication of Moore's Law.
Known formally as “The Negro Family: The Case for National Action,” it was authored by Daniel Patrick Moynihan, then assistant secretary of labor, and, later, one of the Senate’s greatest thinkers. Originally labeled “For Office Use Only,” but released in March, it focused on the roots of black poverty in America.
Describing a “tangle of pathology,” he wrote that “expansion of welfare programs... can be taken as a measure of the steady disintegration of the Negro family structure over the past generation.” Absence of a “nuclear family” would hinder progress towards economic and political equality.
With striking prescience, Moynihan warned that such disintegration would beget social and cultural regression. In 1965, an estimated 23.6 percent of black children and just 3.07 percent of white children were born to single mothers. Today, those rates have been far exceeded (72 percent of black children; 29 percent of white children).
In 2012, ominously, 1,609,619 children were born to unmarried women, ushering in a massive new generation reliant on civic altruism and government support. The long-term ramifications are unknown, but such instability is unprecedented and may help explain widening gaps in upward mobility.
The Elementary and Secondary Education Act was signed into law by President Johnson on Palm Sunday, April 11, a mere three months after being proposed, and is today in its ninth iteration (No Child Left Behind). At the time, it was the most expansive federal education law ever enacted, intruding into an arena once the exclusive province of state and local educators.
Some have suggested it marked the end of any presumption that education, or much else, lay beyond federal reach; anything could now be cast as a constitutional imperative. It spawned the Department of Education and, more recently, the Common Core State Standards Initiative. Fundamentally, it legitimized, if not anticipated, the largesse of Obamacare.
Today, the federal government allocates about $141 billion for education; since 1965, over $267 billion has been spent to assist states in educating disadvantaged children. Despite requiring a "culture of accountability," achievement is stagnant. According to the National Assessment of Educational Progress, reading proficiency of 17-year-olds has remained flat since the early 1970s.
On April 19, the trade journal Electronics published a seminal essay, "Cramming More Components onto Integrated Circuits." Dr. Gordon Moore, schooled in physical sciences rather than electronics, unwittingly changed the course of computing. He had noticed that the number of transistors, the electronic engines of computers, that could be crammed onto an integrated circuit was effectively doubling at a regular interval, and he reasoned that the trend would continue through 1975 (he later settled on a doubling every two years). Remarkably, in 2015, his "lucky extrapolation," what became known as "Moore's Law," is still relatively intact and nearly a self-fulfilling prophecy.
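The power of that observation is easiest to feel as arithmetic. A minimal sketch (an illustration of the compounding, not a calculation from Moore's paper): a doubling every two years, sustained over the fifty years this article spans, is twenty-five doublings.

```python
# Illustration of Moore's Law compounding (assumes a clean doubling
# every two years, which is an idealization of the actual trend).
years = 2015 - 1965           # the fifty-year span considered here
doublings = years // 2        # one doubling per two years -> 25 doublings
growth_factor = 2 ** doublings
print(growth_factor)          # 33554432 -- a roughly 33-million-fold increase
```

Twenty-five doublings multiply the starting count by more than 33 million, which is how a single hard-won transistor in 1965 becomes billions on a chip today.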
Americans today can trace the seemingly urgent, relentlessly constant pace of technological change to Moore. Silicon Valley considers it a social contract, a driver of improvement. Before Moore's observations, it was challenging to fabricate a single silicon transistor. Now, state-of-the-art processes put 1.5 billion transistors on a single chip. Engineering scientists are researching self-assembling polymers and extreme ultraviolet lithography in order to extend the law.
Michael S. Malone, without a hint of hyperbole, concluded in his book, The Intel Trinity: “It has been said that if in 1965 you had looked into the future using any traditional predictive tool — per capita income, life expectancy, demographics, geopolitical forces, et cetera — none would have been as effective a prognosticator, none a more accurate lens into the future than Moore’s Law.”
In his State of the Union address on Jan. 4 of that year, Johnson envisioned a "Great Society," one that "will not flower spontaneously from swelling riches and surging power." Fifty years later, with 1965 as a catalyst, that society is largely realized.
James P. Freeman is a Cape Cod-based writer.