Where we left off in the previous post, “Little Dancer Coincidences,” was with the notion that “discontinuous discoveries” can shift our core beliefs. The notion comes from the book Seeing What Others Don’t: The Remarkable Ways We Gain Insights by Gary Klein who, as mentioned previously, is a research psychologist specializing in adaptive decision-making. Klein studied 120 cases, drawn from the media, books, and interviews, involving stories of how people “unexpectedly made radical shifts in their stories and beliefs about how things work.” From these cases, Klein organized his research into five strategies for how people gain insights:

- Connections
- Coincidences
- Curiosities
- Contradictions, and
- Creative Desperation

According to Klein, each of the 120 cases he examined fit at least one of these strategies, and most relied on more than one.
Klein begins with the strategy of connections and, before proceeding with several fascinating examples, recalls the story told earlier in the book of Martin Chalfie, a biologist at Columbia University who, by virtue of attending a seminar on a topic unrelated to his work, got the idea for a natural flashlight that would let researchers look inside living organisms to watch their biological processes in action. At the time he attended the seminar, Chalfie was studying the nervous system of worms. According to Klein, the seminar initially covered topics that didn’t interest Chalfie; then the speaker described how jellyfish are bioluminescent, producing their own visible light. That observation sparked the insight Chalfie applied to his own field, and his insight led to an invention “akin to the invention of the microscope,” writes Klein, because it enabled researchers to see what had previously been invisible. For this work, Chalfie received a Nobel Prize in 2008.
Like Chalfie, certain people make connections between unrelated matters that their close colleagues miss. Klein also tells the story of how the Japanese Admiral Isoroku Yamamoto (April 4, 1884 – April 18, 1943) saw the implications of the British attack, early in World War II and before the United States had entered the conflict, on the First Squadron of the Italian Navy, then sheltered in the Bay of Taranto. Since the bay was only 40 feet deep, the Italians believed their fleet was safe from airborne torpedoes. The British, however, had modified their torpedoes, adding wooden fins so that they wouldn’t dive as deeply once they entered the water. For Yamamoto, the successful British attack at Taranto produced the “insight that the American naval fleet ‘safely’ anchored at Pearl Harbor might also be a sitting duck,” writes Klein. Yamamoto refined his ideas until “they became the blueprint for the Japanese attack on Pearl Harbor on December 7, 1941,” although he himself was opposed to Japan’s decision to go to war with the U.S.; ironically, his other insight was that Japan would lose that war. Yamamoto had studied in the U.S. and served two postings in Washington, D.C. as naval attaché; he had insights about the United States that his colleagues did not, and his more militaristic colleagues resented him for his views.
Organizations generally block the pathways of connection (and the other strategies) needed for such insights to occur, according to Klein, because organizations are primarily concerned with avoiding errors. Ironically, this risk-aversion makes people inside organizations reluctant to speak up about their concerns, leading organizations to “miss early warning signals and a chance to head off problems.” Such problems are common in many fields, including science, according to Klein. Countervailing risk-aversion sometimes requires designating “insight advocates,” writes Klein, though he admits he is dubious that any organization would sustain them or “any other attempt” to strengthen the forces for insight creation. Another method he suggests is an alternative reporting channel, letting people publish work that doesn’t go “through routine editing” and thus would “escape the filters.” But he thinks this method “may work better in theory than in practice.”
For many organizations, the key problem is not having or noticing insights but “acting on them.” Organizations that are less innovative because they stifle insights, he says, “should be less successful” than they could be. The deleterious effect of the defect-exposing Six Sigma program on U.S. corporations is an example of how an all-out focus on eliminating errors gets in the way of innovation, says Klein. Clearly it is no simple matter to balance the needs for efficiency and innovation within the same organization, particularly a “mature” one. Klein concludes that the examples he gives are, for him, a “collective celebration of our capacity for gaining insights; a corrective to the gloomy picture offered by the heuristics-and-biases community.” He continues: “Insights help us escape the confinements of perfection, which traps us in a compulsion to avoid errors and in a fixation on the original plan or vision.”
Klein ends by recommending “habits of mind that lead to insights” and help us spot connections and coincidences, curiosities and inconsistencies. The more successful we perceive ourselves to be because of our beliefs, “the harder it is to give them [our beliefs] up.” The habits of mind Klein covers in his book may “combat mental rigidity,” he writes. “They are forces for making discoveries that take us beyond our comfortable beliefs. They disrupt our thinking.” There is a “magic” that occurs when we have an insight, Klein concludes, and it “stems from the force for noticing connections, coincidences, and curiosities; the force for detecting contradictions; and the force of creativity unleashed by desperation.” So, while Klein’s book offers no blueprint for insight creation, the many examples he cites are compelling reminders of the crucial role insights play in stimulating new directions in any endeavor.
It seems, then, that insights can be both a source of surprises and a spur to readiness for them. They can be the needed “black swans” for dealing with inevitable “black swan events.” A take-away from this book: there may be no ten-step list for creating insights, but understanding how to create favorable conditions that disrupt our thinking, so as to stimulate new connections and ideas, seems like useful knowledge in a world of inevitable surprises. Ostriches with their heads in the sand may not do as well as those who see what others don’t.