Photo: Kirk Hanson, Ann Tenbrunsel, Hersh Shefrin, Michael Engh, S.J.
Most people want to be ethical — and consider themselves to be. But incidents ranging from stolen library books to rogue trading illustrate that many people do not act as ethically as they want to, or as they think they do.
“With all the evidence to support rational, good choices in the workplace or the marketplace, why don’t we all behave that way?” said Ann Skeet, director of leadership ethics at the Markkula Center for Applied Ethics at Santa Clara University. Skeet gave the introduction to a May 11 forum called “The Behavioral Movement: What Business Professionals Should Know About Human Nature,” sponsored by the Business Ethics Partnership of the Markkula Center.
Two speakers addressed what we know about why people behave unethically – and how the conditions that contribute to this behavior may be particularly acute in high-pressure environments like Silicon Valley.
“The culture of Silicon Valley is different than in most other places,” said Hersh Shefrin, the Mario L. Belotti Professor of Finance at Santa Clara University’s Leavey School of Business and a pioneer in the field of behavioral finance. “This is a risk-taking culture and a culture where goals are set very high.”
This can make Silicon Valley workers especially vulnerable to the pressures that can lead to unethical decisions. For example, the increasing use of global teams, which can require phone calls early in the morning and late at night as well as regular hours in the office, may contribute to fatigue – a risk factor for poor decision-making.
Still, Shefrin said, “we’re not as unique as we think we are – just more so.” Workers in Silicon Valley are subject to the same psychological issues as workers anywhere else.
For example, all workers have blind spots, said Ann E. Tenbrunsel, professor in the College of Business Administration at the University of Notre Dame and the Rex and Alice A. Martin Research Director of the Institute for Ethical Business Worldwide. She addressed the psychology of ethical decision making, or “why people behave unethically despite the best intentions.”
There have been significant efforts to improve ethics: at the regulatory level; at the organizational level, with millions spent on training; and at the educational level, with ethics being infused into the curriculum at many universities, Tenbrunsel said. Still, the headlines announcing bad behavior keep coming.
“We haven’t taken the psychology of the decision maker into account,” Tenbrunsel said. She listed four ethical blind spots that contribute to poor decision making — ethical illusions, ethical fading, dangerous reward systems and motivated blindness — and elaborated on the first two.
Ethical illusions are based on “illusions of our own ethicality,” Tenbrunsel said. She cited studies showing that library books on ethics – presumably checked out by people who think about ethics – are stolen more often than non-ethics books. And when people are asked to rate their own honesty, a large majority rate themselves above average, far more than is statistically plausible.
“We really seem to engage in hyperinflation about things related to morality and ethicality,” Tenbrunsel said. “If everyone thinks their companies are ethical, we don’t do a good job of really trying to find the problems.”
It helps to think of three stages of the decision-making process, Tenbrunsel said: prediction, action and recollection. Before making a decision, people generally predict that they will act in accordance with their values. When it comes to taking action, that is not always what happens. But after the fact, “we remember that we did better than we did,” Tenbrunsel said.
Why don’t people behave as they predict they will? One reason, said Tenbrunsel, is that prediction involves high-level ideals, whereas the action phase is more about the details and what is feasible at that particular moment.
In the moment, forces such as hunger, fatigue and fear come into play and may overwhelm idealistic plans. “The body and mind’s goal is to mitigate it,” Tenbrunsel said.
Ethical fading, the second blind spot Tenbrunsel discussed, happens when the person making a decision doesn’t view it as one that involves ethics. People use financial criteria to make financial decisions and legal criteria to make legal decisions, for example. So if a decision can be categorized as something other than an ethical one, it becomes easy not to consider ethics at all.
Language plays a role in this area, as well: For example, a decision about “runoff” may be viewed differently than one about “pollution.”
Shefrin continued the conversation by examining rogue trading, an example of how “finance and psychology and ethics all interconnect.”
Because trading involves taking risks, it is useful to understand the psychology behind risk-taking. For example, most people will choose a sure gain over a gamble that offers a chance at a larger amount. But they will accept the risk of a larger loss rather than lock in a sure loss.
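As a rough numerical sketch of that asymmetry (the dollar amounts below are illustrative assumptions, not figures from Shefrin’s talk), a prospect-theory-style value function – concave for gains, convex for losses, and steeper for losses – reproduces both choices even when the expected dollar values of the options are identical:

```python
# A minimal sketch of the gain/loss asymmetry described above, using a
# prospect-theory-style value function. The amounts are hypothetical; the
# parameters are the commonly cited Tversky & Kahneman (1992) estimates.

ALPHA = 0.88   # diminishing sensitivity: each extra dollar matters a bit less
LAMBDA = 2.25  # loss aversion: losses loom larger than equivalent gains

def value(x):
    """Subjective value of gaining (x >= 0) or losing (x < 0) x dollars."""
    return x ** ALPHA if x >= 0 else -LAMBDA * ((-x) ** ALPHA)

def prospect(outcomes):
    """Subjective value of a gamble given (probability, amount) pairs."""
    return sum(p * value(x) for p, x in outcomes)

# Gains: a sure $500 vs. a 50% chance at $1,000 (same expected dollar value).
sure_gain = prospect([(1.0, 500)])
risky_gain = prospect([(0.5, 1000), (0.5, 0)])
print(f"sure gain {sure_gain:.1f} vs. risky gain {risky_gain:.1f}")

# Losses: a sure -$500 vs. a 50% chance of losing $1,000.
sure_loss = prospect([(1.0, -500)])
risky_loss = prospect([(0.5, -1000), (0.5, 0)])
print(f"sure loss {sure_loss:.1f} vs. risky loss {risky_loss:.1f}")
```

Under those assumed parameters, the sure $500 gain is valued above the 50/50 shot at $1,000, while the 50/50 chance of losing $1,000 is preferred to a certain $500 loss – the pattern Shefrin described.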
“Three of the most important emotions associated with what happens when you face a risk are fear, hope and aspiration,” Shefrin said. “People who are excessively fearful tend not to take risks that are worth taking in an actuarial sense, and people who are excessively hopeful tend to shoot for the stars when it’s not appropriate. In a situation like the rogue trading cases, traders find themselves in a situation where the pressures to succeed are so great that they take imprudent risks.”
In addition to the psychology of the individuals involved, the strength of corporate processes and the way corporate culture encourages or discourages risk-taking play a role.
“Strong corporate cultures that include an ethical dimension can help deal with the vulnerabilities,” Shefrin said. “The tone always starts at the top.”