A shadow has fallen across the future of autonomous transportation, one of the key aspects of the city of the future and of the widespread use of artificial intelligence. It comes from Boeing in the form of the computer problem that has grounded the world’s fleet of 737 Max 8 aircraft.
No definitive cause of the crashes of Lion Air Flight 610, in the Java Sea, which killed 189 people, and Ethiopian Airlines Flight 302, en route from Addis Ababa to Nairobi, which killed 157 people, has been established yet. But everything points to the computerized stall-avoidance system.
In terms of computing in aircraft, this is no more than an embarrassment. In terms of loss of life, it is ghastly. In terms of public confidence in the growing role of computing in everything, it is grave.
These crashes have stimulated public fear, and public fear hangs around. So does institutional fear -- even when the problem has been identified and remediated.
Consider these events, which have left a long-lasting residue of fear:
Thalidomide was a drug developed in Germany and first marketed there as a sedative, widely prescribed to pregnant women for morning sickness. Use spread around the globe and the results were devastating: More than 10,000 babies were born with missing or severely malformed limbs.
I am told that, although it is seldom mentioned, thalidomide haunts the drug industry. It has shaped both the development of new drugs and their regulation to this day. The long delays and exhausting trials new drugs go through are partly due to something that happened in the late 1950s.
The Three Mile Island nuclear-power plant accident, in Pennsylvania in 1979, has affected nuclear design and the regulation of nuclear plants ever since, although no life was lost. There was a partial meltdown of the core, and the result fed the anti-nuclear movement, which, ironically, pushed utilities back to coal -- now under attack because of its environmental impact.
The Max 8 problem, in terms of computing in aircraft, is no more than a glitch, possibly the result of a rush to market. But the loss of life is terrible and the loss of confidence immeasurable.
A whole array of high-tech companies is hoping to bring autonomous transportation to the streets within a decade or not much longer. These include Uber, Lyft and Google. Tesla would like to see autonomous electric trucks handling intercity deliveries.
This push toward driverless vehicles has huge energy and resources behind it. It is part of what has come to be known as the smart city revolution. It is also part of what has been described as the Fourth Industrial Revolution.
Early autonomous cars have depended on sensors to guide them: The car in front slows and the car behind picks this up from its sensors. When autonomous vehicles are fully developed, these cars and all the others on the road will be in constant communication with each other. Car A will tell Car B, “I am braking,” and so on down through a line of traffic. It is coming.
The message from Boeing’s catastrophe is: Get it right or you will scare the public off, as happened with Three Mile Island. Some willing propagandists scared the public off nuclear -- our best way of making a lot of electricity without carbon.
The technology in aircraft is very sophisticated. Most passenger airliners have long been able to land themselves once they intercept a radio signal, called the glide slope, at a suitably equipped airport. They are packed with computers operating all sorts of wondrous systems.
If all the computers on the fatal Max 8s had been talking to each other, as traffic will have to in the coming era of autonomous vehicles, they might well have shut down the stall-avoidance system that was mis-sensing an imminent stall.
The neo-Luddites will try to exploit the Boeing catastrophe to slow smart city development. The challenge for autonomous technology is to get it right, not rush to market.
Llewellyn King is executive producer and host of White House Chronicle, on PBS. His email is email@example.com and he’s based in Rhode Island and Washington, D.C.