The Smarter Screen: Surprising Ways to Influence and Improve Online Behavior
October 27, 2015
Behavioral economist Shlomo Benartzi tackles how we interact with the internet and the screens that now pervade everyday life, and how those screens can be designed to encourage better decisions.
On October 1, 2013, the United States government launched a new Web site, www.healthcare.gov, that was designed to help people choose health insurance. In essence, the site was a shopping portal, allowing consumers to compare prices and features on all of the insurance plans available in their local area. Because the government hoped to sign up millions of uninsured Americans, they decided to rely on the scale of the Web.
While most of the media coverage of the Web site centered on its glaring technical glitches, very little attention was paid to a potentially far more important issue: Did the Web site actually help consumers find the best insurance plans? Given the reach of Obamacare, even seemingly minor design details could have a huge impact, influencing a key financial decision in the lives of millions of Americans.
Unfortunately, research suggests that most people probably made poor insurance choices on the Web site. A study conducted by Saurabh Bhargava, George Loewenstein, and me demonstrated that the typical subject using a simulated version of healthcare.gov chose a plan that was $888 more expensive than it needed to be. This was equivalent to roughly 3 percent of their income. Meanwhile, an earlier study, led by Eric Johnson at Columbia University, found that giving consumers more health care options on sites like healthcare.gov dramatically decreased their ability to find the best plan. In fact, even offering people a modest degree of choice meant that nearly 80 percent of them picked suboptimally.
Can this problem be fixed? The online world offers us more alternatives than ever before: the average visitor to healthcare.gov was offered forty-seven different insurance plans, while Zappos.com features more than twenty-five thousand women's shoes. But how should Web sites help us choose better?
On the morning of February 21, 2010, an American Predator drone began tracking a pickup truck and two SUVs traveling on a road near the village of Shahidi Hassas in southern Afghanistan. As the drone followed the vehicles, it beamed a live video feed to a crew of analysts based at Creech Air Force Base outside Las Vegas.
Such intelligence is now a staple of modern warfare. The CIA used drones to gather intel on Osama bin Laden's hideout; the Israeli Defense Forces flew dozens of unmanned aircraft over Gaza during the recent conflict; the United States Air Force accumulated more than five hundred hours of aerial video footage every single day in Afghanistan and Iraq.
This flood of information creates an obvious problem: someone has to process it. Unfortunately, the evidence suggests that drone crews are often overwhelmed by the visual data. One study, led by Ryan McKendrick at George Mason University, showed that people simulating the multitasking environment of drone operators performed worse on an air defense task; another experiment, which looked at gunners in armored vehicles, found that the soldiers failed to perform their primary task effectively—noticing the bad guys—when a second task was added to the list. In experiment after experiment, the surplus of digital information creates blind spots on the screen.
That's what happened to the analysts tracking those vehicles in southern Afghanistan. According to an internal military investigation, the cubicle warriors in Nevada couldn't handle all of the available information as they toggled back and forth among the video feed, radio chatter, and numerous instant messages. As a result, they failed to notice that the truck and SUVs were actually filled with civilians. And so the drone operators gave the order to fire, unleashing a barrage of Hellfire missiles and rockets. Twenty-three innocent people were killed in the attack.
How can we make such tragedies less likely to happen? What should the Air Force and CIA do to minimize the risk of blind spots on screens? And how can other organizations, from financial institutions to hospitals, deal with the same problem of digital information overload?
On December 14, 2013, Jessica Seinfeld used the Uber app to drop her children off across town at a bar mitzvah and a sleepover.
Unfortunately, the ride took place in the midst of a New York City snowstorm, which meant that Uber had put surge pricing into effect. (When demand for drivers is high—say, during a blizzard, or on New Year's Eve—Uber systematically raises its rates to entice more drivers to enter the marketplace.) During this storm, demand for drivers was so high that some Manhattan customers were charged 8.25 times the normal fare. Although Uber warned its customers about the surcharge before they ordered a ride, the warning clearly wasn't effective, as social media soon lit up with complaints of price gouging. Jessica Seinfeld, for instance, posted a picture of her $415 Uber bill on Instagram, while many others lamented their crosstown rides that cost more than $150. Uber had provided a valuable service—helping people get home in a bad storm—but had also angered a lot of customers. It's never a good sign when your company is the reason people are tweeting the hashtag #neveragain.
The surge pricing problem is indicative of a more common digital hazard, which is that people often think very fast on screens. Uber customers, of course, benefit from this quick pace, as the streamlined app makes it easy for people to book rides with a few taps of the thumb. However, when surge pricing is in effect, that same effortless ease can backfire, since consumers book rides on their phones without realizing how much the rides are going to cost.
How should Uber fix its app? Is there any way to help consumers avoid online decisions they'll soon regret?
These three stories illustrate a few of the many ways in which the digital revolution is changing the way we live, from the analysis of military intelligence to the booking of taxi rides. They reveal an age in which we have more information and choices than ever before, and are able to act on them with breathtaking speed. But these stories are also a reminder of the profound challenges that remain. We have more choices, but we choose the wrong thing. We have more information, but we somehow miss the most relevant details. We can act quickly, but that often means we act without thinking.
It's a cliché to complain about these trends. It's easy to lament all the ways the online world leaves us confused and distracted, forgetful and frazzled.
This book is not about those complaints. It is not about how smartphones make us stupid. It is not a requiem for some predigital paradise.
Instead, this book is about how screens can be designed to make us smarter. It's a book of behavioral solutions and practical tools that can improve our digital lives. It's about how the same technological trends that lead people to buy the wrong insurance plan and book a $415 taxi ride can be turned into powerful digital opportunities, rooted in the latest research about how we think and choose on smartphones, tablets, and computers.
Here are three examples of potential solutions. If you want to encourage people to select the best health care plan, or choose the right product on your Web site, then you might want to consider a choice tournament modeled on Wimbledon and March Madness. (Instead of giving people all the options at once, you divide the best options into different rounds—work led by Tibor Besedes shows this dramatically improves decision making.)
And if you want to help intelligence analysts avoid blind spots, it's often helpful to zoom out and provide fewer details about the scene. (In a real-world study conducted in Israel, providing less detailed feedback led to big improvements in decision making among investors. I bet it would also help drone operators.) This fix is not just about giving people less information—it's also about using new information compression technologies to help us cope with our limited attention.
Finally, companies like Uber can do a better job of educating their customers—and thus avoiding a mob of angry ones—by carefully deploying ugly fonts on their Web sites and apps. (This runs counter to the common belief that information should always be as easy to process as possible.) The same approach can also be used to close the digital reading gap, as many studies suggest that we read significantly worse on screens than we do on paper.
These are just a few suggestions for how businesses and governments can use the tools and tactics of behavioral science to improve our online behavior. The Smarter Screen is filled with many more examples, as I believe we are on the cusp of a huge opportunity: By taking advantage of this practical research, we can dramatically boost the quality of our digital decisions. We can see better, learn more, and regret less.
Excerpted from The Smarter Screen: Surprising Ways to Influence and Improve Online Behavior.
Published by Portfolio/Penguin, an imprint of Penguin Random House LLC.
Copyright 2015 by Shlomo Benartzi.
All Rights Reserved.
ABOUT THE AUTHORS
SHLOMO BENARTZI is a professor and co-chair of the Behavioral Decision-Making Group at UCLA's Anderson School of Management. He is the author of Save More Tomorrow and Thinking Smarter. He has extensive experience applying behavioral economic insights in the real world: his work with Richard Thaler on Save More Tomorrow has increased the saving rates of millions of Americans, and he has advised many government agencies and businesses.
JONAH LEHRER is a science writer living in Los Angeles.