Earlier this week, at the National Gallery of Art in Washington, D.C., I had a chance to see the glass-walled exhibit containing the wax figurine that Edgar Degas sculpted in the late 19th century out of bric-a-brac, old paintbrushes, and wire lying around his studio. I learned that this sculpture caused quite an uproar in the art world at the time. Depicting a real-life 14-year-old aspiring ballerina of limited means, Degas captured the tensions between the haves and have-nots of his era. (These young ballerinas were often called “opera rats,” and were regularly exploited by unscrupulous individuals, whom Degas also frequently painted, who hung around the theater scene of the time.)
Breaking with the conventions of the age, which did not allow for making sculptures out of found objects or dressing a figurine in real fabric, Degas did something all successful artists do: he forced people to change their perspectives on issues they would rather ignore or take for granted.
By coincidence, later this week I was told in a class that Degas did not use any measurement techniques in his figure drawings. We were then told to draw something without looking at the paper on which we were drawing. This was new to me, but the teacher explained to the class: “Your intellect gets in the way of your ability to see” if you study what you are doing. This was fascinating; I had just read Seeing What Others Don’t: The Remarkable Ways We Gain Insight, by Dr. Gary Klein, the psychologist who developed “naturalistic decision-making.” Too much focus on eliminating errors, he writes, prevents us from having insights.
Most places we work focus on preventing mistakes rather than on fostering insights. Klein explains that mistakes embarrass organizations, and that it is easier to measure a reduction in mistakes than an increase in insights. (The enthusiasm over Six Sigma’s statistical approach to eliminating errors, he says, has just about killed off any potential for insight in the organizations that rely on it.) But how natural is it to avoid mistakes entirely, and what are the downsides of trying?
According to Klein, a risk-averse environment leads to a checklist mentality. He notes: “A checklist mentality is contrary to a playful, exploratory, curiosity-driven mentality.” Of course, we want people with our lives in their hands–pilots, surgeons, and others–to use a checklist if it assures they won’t forget to close the doors before take-off, or that they will remember to remove a surgical tool from our brain. And organizations everywhere play it safe by tabulating how many of their employees have had the required training in this or that–a form of accountability and insurance, if not a guarantee that errors won’t be made.
Apparently controversies, such as the ones that swirled around Degas’s “Little Dancer,” are necessary for helping us, eventually, to reframe our perspectives. And this reframing does not involve minor adjustments or “adding more details,” according to Klein; the changes involved are not incremental. Instead, shifts occur that change our core beliefs. Such shifts are “discontinuous discoveries,” he writes, offering many examples accrued over his years of study in his quest to learn where insights come from.
These shifts transform us in several ways, changing how we “understand, act, see, feel, and desire.” They transform our thinking, and give us a different viewpoint, thus changing how we act and even “our notions of what we can do.”
It seems possible that in an age of digital hyperconnectivity and empowered individuals, an improved understanding of how to develop and convey appealing narratives, and the insights behind them, has already become something that separates winners from losers in the battles for attention, “hearts and minds,” and the other contests of our age. Perhaps this was always true, but it is amplified by today’s unprecedentedly interdependent world.