The consequences of bias in design and innovation—even unintended bias—can be devastating.
When carmakers decades ago added shoulder restraints to lap seat belts, a disproportionate number of women, including pregnant women, died in auto accidents in which they were wearing belts.
Investigators found that the vast majority of research on the restraints involved men, and the crash-test dummies they used were in the likeness of men. It never occurred to researchers and developers to consider the body shapes of women—much less pregnant women.
The first cardiac pacemakers didn’t fit well or function properly in women’s chest cavities, likely because the developers were men and didn’t consider the differences in body types.
More recently, early facial-recognition programs had difficulty recognizing the faces of people of color, often spitting out disturbing conclusions, even determining they weren’t human.
And the early voice-activated navigation and emergency-response systems in cars didn’t recognize or respond to women’s voices, because the data input was based on male voices. Women were literally rendered voiceless.
Not every example of bias is a matter of life and death. But experts say bias has profound effects across multiple layers of business, including hiring, promotions, office protocol, the composition of teams, the customers companies seek and the projects they pursue.
Studies show those problems can have a double-digit-percentage impact on profit.
And for companies that pursue technological advances and innovative solutions, bias can have an enduring impact, making it easy for the cycle to be perpetuated.
But because much of the bias in business is implicit and unconscious, it’s as difficult to root out as a stealthy mole in a vegetable garden.
IBJ talked to nine local business executives for this story. None said it is easy to eliminate bias from technology and innovation, and some said it’s nearly impossible.
That’s in part because the problem is widespread. Results from Harvard University’s Implicit Association Test suggest that more than 85% of people display implicit bias. Many executives IBJ talked to think that number is low.
That’s because, in some ways, humans are wired to be biased.
“Our brains are designed to take in a couple of data points and jump to a conclusion,” said Santiago Jaramillo, co-founder and CEO of Fishers-based human-resources software company Emplify. “That’s why, when we see a stick on the ground, we jump and scream, ‘Snake!’”
Bias and even stereotypes do have some practical functions. They allow us to go through life without having to redefine every single situation we run across on a daily basis.
Take, for instance, a gas station. The vast majority of customers have a preconceived notion of how it works. You pull up to a small vertical structure, insert a rectangular piece of plastic to activate it, and pump fuel into a hole in your vehicle that allows it to run. Think how time-consuming it would be if you had to relearn everything you’ve ever known about a gas station every time your car’s fuel gauge hit ‘E.’
But when stereotypes creep into research, innovation and development, the result is often erroneous—and sometimes unfair—assumptions that can lead to bad and embarrassing business decisions.
Add in technology, and the bias is perpetuated—even amplified.
After all, individuals’ biases are deeply rooted in history, culture and learned behavior. They can be recognized and overcome—or balanced by diversity in the decision-making process.
But when individuals bake those biases into something like artificial intelligence—which learns from its inputs—the technology can drive the bias forward at breakneck speed.
“This is a really big problem,” said Mary Murphy, an Indiana University professor of psychology and brain sciences. “Bias is one of the most replicable problems in all of psychology.”
While technological advancement often emphasizes ever-increasing speed, the key to combating bias may be tapping the brakes.
“The best way to fight implicit bias is to slow down decision making,” Murphy said. “When we have to make split-second decisions and use categorizations, implicit bias creeps in.
“People are cognitive misers,” she said. “We rely on easy shortcuts, often with bias built in, which can make it easier to make those split-second decisions.”
Murphy said it’s important to focus on facts—not feelings—to root out bias.
“Anytime you hear something like, ‘It feels right,’ ‘It’s a good fit,’ or, ‘He feels like one of us,’ that’s a good sign implicit bias is happening,” Murphy said.
Kelli Jones has heard those phrases all too often.
“I am not a fan of culture fit,” said Jones, co-founder of Black Hatch Fund, a venture capital fund and accelerator supporting black tech entrepreneurs. “It’s not a valid metric for anything.”
Kate Maxwell, a technical director at Raytheon Technologies, travels globally for her job. She’s been a leader on workplace and diversity issues within Raytheon and beyond. And Maxwell—an engineer who often finds herself the only woman in the room—doesn’t wear rose-colored glasses.
“If we’re being honest, we’re not going to totally eliminate implicit bias,” she said. “We deal with so much data every day, your mind just takes mental shortcuts.
“There are steps we can take to mitigate it, and the first step is to be introspective and admit it’s there.”
Maxwell said companies must build diversity into everything they do “to challenge ourselves and the assumptions we make as humans and to combat the bias that exists in all of us.”
But Jaramillo said diversity isn’t enough.
“Diversity is like being invited to the party,” he said. “Inclusiveness is like being asked to dance.”
That inclusiveness—or engagement—in business has tangible benefits.
“Employee engagement is antecedent to innovation,” Jaramillo said. “When an employee is disengaged, they’re not going to innovate. You’re not looking for solutions when you’re not a part of the problem-solving process.”
And that is expensive. Gallup Inc., a Washington, D.C.-based research firm, estimates that disengagement costs U.S. companies $450 billion to $550 billion per year.
Jaramillo isn’t too concerned whether bias is intentional or unintentional. The result, he said, is the same.
“Whether it’s implicit or not doesn’t matter to the person impacted,” Jaramillo said. “All they know is, they’re not included.”
But even those like Jaramillo who try hard to be inclusive don’t always succeed.
He recently created a heat map—a graphic that relies on intensity of color to convey a message. One of the people testing the product said he couldn’t make heads or tails of it. It turned out he was color-blind, something Jaramillo hadn’t thought to account for.
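Blind spots like this one have a well-known engineering guard: prefer color scales whose lightness changes steadily, because a lightness ramp survives color-blindness (and grayscale printing) while a pure hue contrast does not. A minimal sketch of that check, using matplotlib’s built-in “RdYlGn” and “viridis” colormaps and the standard ITU-R BT.709 luminance weights—the specific maps and the tolerance are illustrative choices, not anything from Emplify’s product:

```python
import numpy as np
from matplotlib import colormaps

def luma(rgb):
    # Approximate perceived lightness (ITU-R BT.709 weights).
    return 0.2126 * rgb[..., 0] + 0.7152 * rgb[..., 1] + 0.0722 * rgb[..., 2]

samples = np.linspace(0, 1, 64)
red_green = luma(colormaps["RdYlGn"](samples))  # hue-coded: red -> yellow -> green
viridis = luma(colormaps["viridis"](samples))   # lightness-coded: dark -> bright

def monotonic(vals, tol=5e-3):
    # True if lightness only rises along the scale (tiny wiggles from the
    # colormap's lookup table are tolerated).
    return bool(np.all(np.diff(vals) > -tol))

# RdYlGn's lightness rises then falls, so a color-blind viewer can see the
# two extremes as similar. Viridis stays readable even without hue.
print("RdYlGn readable without hue:", monotonic(red_green))
print("viridis readable without hue:", monotonic(viridis))
```

A check like this can run automatically on any chart a product generates, catching the problem before a color-blind tester has to.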
To uncover our bias, he said, companies need to include more people in the process.
“I don’t discover my blind spots on my own,” Jaramillo said. “I discover it when someone else illuminates it.”
To mitigate bias, he said, companies have to broaden the group included and engaged in research and development. “You don’t want group-think, but a group of thinkers is needed to illuminate blind spots and implicit bias.”
Bias can come from anywhere, but it might be most dangerous when it comes from the top down.
“If you have a lot of leaders that look the same and act the same, you start to get comfortable with the status quo, and you’re not innovating,” Maxwell said. “Bias on an organizational level is detrimental. You can’t push ideas down on people. It can’t be one way or the highway. There’s such a danger in that.”
Maxwell said that only by allowing “lateral thinking”—a way of approaching a problem that eschews traditional deductive or step-by-step logic in favor of a more imaginative approach—can a company truly innovate.
“At the heart of innovation is creativity,” she said. “To do this, you have to think divergently.”
Affinity bias, favoring those who look and act like us, can lead to a whole slew of problems that impede innovation.
Black Hatch’s Jones said affinity bias has been a brick wall for minority-owned startups.
That’s why she started Black Hatch. “All the venture capitalists are white men,” she said. “So the vast majority of capital for startups goes to white men.”
Maxwell agreed. “Venture capitalists and research funds have for a long time favored people who look like them,” she said. “It’s what they’ve been comfortable with. That has to change.”
In recent years, Jones has seen some women breaking through, but said that’s not enough.
“We have to be real about what diversity and inclusion is,” she said. “Being inclusive can’t just mean 50% women. A lot of those women are white. That doesn’t solve the problem.
“Technology adoption, creativity and innovation comes out of being diverse a lot more than people like to admit.”
Maxwell and Jones said one way for women and minorities to break through the wall of bias is to seek champions who are part of the majority.
It might seem unfair that a member of the minority would need a member of the majority to back them before they are heard, but some local executives said that’s the reality.
“Having male mentors and champions has been so important for me. It’s helped change minds,” Maxwell said.
Jones’ champions and allies “have put people on notice [that] you can’t just ignore the woman or black person in the room,” she said. “It totally changes the tone. It’s super important to have people support you that don’t look like you in a corporate environment. We need more of that.”
With artificial intelligence and robots on the horizon, the debate about bias in business is likely to intensify.
Computers are devoid of the emotions that often feed bias in humans, and they have been, in some cases, used to curtail bias.
But humans can feed biased thinking into machines, and computers can learn from humans’ biased behavior and replicate it on an even bigger scale.
“Artificial intelligence is the next frontier. We’re in the middle of a revolution, and we’re trying to figure out how to take bias out,” IU’s Murphy said. “It’s a Herculean task.”
But local AI expert Christopher Day said computers—and computer programs—can be rid of bias.
“You have to make sure you don’t insert data into algorithms that will skew the outcome,” said Day, founder of DemandJump, an Indianapolis company that makes software that allows customers to map buyers, potential buyers and competitors, and target buyers where and when they are making buying decisions.
“You have to be very intentional with what you feed those algorithms,” Day said. “Clean data sources are the key.”
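One hedged sketch of what being “intentional” with training data can mean in practice is a pre-training audit that flags skewed outcome rates before any model sees them. The records and tolerance below are invented for illustration, and passing this crude disparate-impact check is a signal to proceed, not proof of fairness:

```python
from collections import defaultdict

# Hypothetical labeled records: (group, outcome). In a real pipeline these
# would be the historical data about to be fed to the algorithm.
records = [("A", 1)] * 80 + [("A", 0)] * 20 + [("B", 1)] * 40 + [("B", 0)] * 60

def audit(records, tolerance=0.2):
    # Flag the data if positive-outcome rates differ across groups by more
    # than `tolerance`, so a human investigates before training starts.
    totals = defaultdict(int)
    positives = defaultdict(int)
    for group, outcome in records:
        totals[group] += 1
        positives[group] += outcome
    rates = {g: positives[g] / totals[g] for g in sorted(totals)}
    spread = max(rates.values()) - min(rates.values())
    return rates, spread <= tolerance

rates, balanced = audit(records)
print(rates, "ok" if balanced else "needs review")
```

Here the audit surfaces the 40-percentage-point gap between groups, prompting the review Day describes before the skew can harden into an algorithm.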
One example of AI gone wrong is Amazon’s failed attempt to use the technology to screen job candidates. The company scrapped that initiative in 2016—after nearly three years of development—when it discovered the AI-driven screener was discriminating against women.
“I haven’t seen much evidence that bias can be taken out of algorithms,” Murphy said. “The problem is, all algorithms need data … and they learn associations and make suggestions. All of these data sets at scale have our biases built in.”•