Les Adler was a faculty member in the Hutchins School from 1970 to 2012, also serving as Provost from 1977 to 1979 and from 1987 to 1997, and as Dean of Extended Education. Additionally, he spent half a year teaching in England in 1983 for the American Institute for Foreign Study, and a year in Southeast Asia as Fulbright Professor of American history and foreign policy at the National University of Singapore in 1991-1992. He earned his BA in Russian and European history from the University of New Mexico in 1963 and his MA (1965) and Ph.D. (1970) in American history from the University of California at Berkeley.
His original research interests dealt with the cultural origins of the Cold War, culminating in several articles and a book, The Red Image: American Attitudes Toward Communism in the Cold War Era, published in 1991. Most recently, his research has focused on the multi-faceted life and career of the political activist, writer, historian, philosopher of science and innovative interdisciplinary thinker, Arthur Koestler.
A book reviewer on subjects in East European and American history, culture and foreign affairs for the San Francisco Chronicle since 1987, he has also contributed articles and essays on issues related to the Persian Gulf War, Sikh religious traditions and nuclear accidents during the Cold War era to several national newspapers and to National Public Radio.
In addition to teaching both lower and upper division interdisciplinary classes in Hutchins, Dr. Adler also served as Director of the Hutchins Center for Interdisciplinary Learning which manages a variety of projects designed to share the innovative teaching and learning experience of the Hutchins School with the larger regional, statewide and national communities.
Slightly more than a decade later, in 1914, two shots fired by a Serbian terrorist in little-known Sarajevo, Bosnia, set off World War I, an earthshaking global conflagration that no one expected or wanted. The war caused nearly 40 million military and civilian deaths, with consequences that continue to reverberate throughout Europe, Asia and the Middle East down to the present day.
In 1950, at a peak in Cold War tensions, political and strategic overreach by General Douglas MacArthur, commanding United Nations forces fighting North Korean aggression against the South, triggered an unanticipated and overwhelming Chinese response. The result was three years of additional bloody warfare in Korea, which, in addition to causing hundreds of thousands of deaths on all sides, brought the world to the brink of nuclear war.
Once again, in 1962, a dangerous miscalculation by Soviet Premier Nikita Khrushchev, who attempted to place nuclear missiles in Cuba to balance the presence of American missiles around his country, brought the world closer than ever to the edge of a nuclear war with unimaginable global consequences. How close we came to disaster, we later learned, rested less on diplomacy than on the heroic decision by a Soviet submarine officer NOT to fire a nuclear torpedo at an American destroyer despite faulty indications that an attack was underway.
We know now that on at least three other occasions, false-positive readings on radar screens in both Soviet and American nuclear-defense systems nearly led to the launching of retaliatory responses that could have brought catastrophic results for both human civilization and the global environment. In each case, only the actions of individual humans under intense pressure, choosing to interpret the reports as electronic glitches rather than incoming missile tracks, prevented disaster.
Exactly what chain of events might be set off by provocative statements, military posturing, accidents, missile tests, war games or even deliberate actions by players in the current dramatic standoff between North Korean dictator Kim Jong-un and President Donald Trump is not yet known. What we do know, however, is that massive historical conflicts and global disasters can be triggered by rogue individuals or unanticipated events at multiple levels in complex systems, often in ways that are unimaginable and, in fact, entirely unpredictable.
The more heated the crisis atmosphere, the more likely it is that preexisting ideological predispositions or perceptual biases, rather than objective facts, will determine the decision-making process. Were American destroyers really under attack in the Gulf of Tonkin in the summer of 1964, as early reports reaching Washington indicated? Or to what degree were those reports interpreted, or shaped, to produce the desired political results?
Likewise, how was the intelligence perceived or ‘fixed’ in the run-up to the 2003 invasion of Iraq to support the incorrect preconception that Saddam Hussein was on the cusp of developing nuclear weapons? And what cycle of seemingly endless tragedy in the region has ensued because of that decision?
Recent exposure to what have been called "Black Swans" (unpredictable or unforeseen events with extreme consequences) like the sudden collapse of the Soviet bloc in 1989, 9/11, the almost-complete global financial meltdown in 2008, or even the election of Donald Trump, should certainly give us pause, as should our growing understanding of the sensitivity of interconnected planetary systems to human activity.
What chaos theorists describe as the butterfly effect might, at least metaphorically, help us recognize the snowballing impact that small changes in one part of a complex system can initiate in the system as a whole.
What this means in the current nuclear standoff with North Korea is that there is no room for even the slightest miscalculation, error or lack of caution. Threats and overheated rhetoric can only set the stage for a cascade of disastrous consequences of the kind that sheer good fortune has prevented multiple times during our dangerous three-quarter-century experiment of dancing on the edge with the bomb. We can no longer rely on blind luck to save us from ourselves.