How moral disengagement – and workplace culture – quietly undermine ethics
“We are not the rational animals that we would like to think we are. The human brain appears to be prewired for self-justification and for reducing dissonance.”
So says Lorne Michael Hartman of York University’s Schulich School of Business, in his recent study in the International Journal of Ethics and Systems.
Most professionals see themselves as fair, rational and ethical, yet research in behavioural ethics shows people are far less objective than they assume.
Hartman describes this gap between self‑image and reality as the “ethicality gap.”
“The idea is that there’s a gap or a disconnect between how ethical we think we are and how ethical we actually are.”
Referencing corporate scandals such as the Volkswagen emissions scandal, the pharma opioid crisis or the Boeing whistleblower controversy, Hartman cites the challenges of encouraging ethical behaviour at work.
How is it that people do things that are inconsistent with their own moral standards and values?
“We find ways to justify it, rationalize it, minimize it, frame it in a way that we can still feel OK about ourselves,” he says in an interview with Canadian HR Reporter.
So, how can HR better combat this costly behaviour? Hartman’s study, “The rationalizing animal: moral disengagement and ethical decision making,” concludes that culture is key:
“Cultivating an ethical workplace culture requires a holistic approach with ongoing effort, combining leadership commitment, employee engagement, policy changes and continuous reinforcement of desired behaviours.”
3 studies on ethical behaviour
Hartman’s paper reports on three linked projects. In Study 1, individuals were trained to recognize openings for moral disengagement in everyday ethical scenarios. However, the training had no measurable effect on moral disengagement, Hartman found.
Study 2 tried a different tack, with the logic that an individualist framing – emphasizing personal accountability – might reduce diffusion of responsibility (“Someone else will handle it”), while a collectivist framing might reinforce loyalty to the group, for better or worse.
Again, the impact was limited. Hartman found the individualist framing produced a greater reduction in moral disengagement than the collectivist one (particularly among female participants), but the overall change was “insignificant.”
In Study 3, Hartman zoomed in on two sets of drivers: environmental factors such as organizational design and cultural norms, and individual differences such as people’s baseline level of moral disengagement.
The results were clear: “The effect of the environment on individuals is far greater than an individual’s effect on the environment.”
Participants who perceived their environment as aggressively results-driven — focused on “success, getting a good job, making lots of money,” as Hartman puts it in the interview — reported higher willingness to make poor ethical choices and higher moral disengagement. Those who experienced cultures that emphasized modesty, consideration and respect for others showed the opposite pattern.
Culture key to ethical behaviour
Hartman sees culture as the crucial amplifier. Reflecting on his research, he says it’s difficult to get people to change their beliefs about themselves “and almost impossible to get them to change it for the worse — but the context matters.”
“If you put an ethical, moral person into an unethical, dysfunctional culture and environment... the culture will win out almost every time.”
And even those who resist or report, such as whistleblowers, may suffer consequences, says Hartman, citing cases such as Boeing, where those who spoke up faced firing and informal blacklisting.
Academic Lance Ferris echoes this emphasis on environment and situation: “Highly competitive cultures that emphasize winning at all costs can unintentionally encourage people to cut corners or break rules.”
And leaders should look at what is rewarded in practice, says the professor of organizational behaviour/human resources at the Telfer School of Management at the University of Ottawa.
“If an unethical salesperson wins sales awards by tricking customers, other employees will quickly learn what the organization really values,” he says, regardless of what is written in the company’s values statement.
Mix of 3 factors related to ethical behaviour
Ferris similarly cautions against assuming that only “bad people” behave badly. He tells Canadian HR Reporter that unethical behaviour at work “usually comes from a mix of three things: the person, the organization, and the situation.”
Some employees may indeed be more self‑interested or manipulative, but Ferris stresses that organizations matter too.
“And the situation itself makes a difference,” he adds, particularly when harm feels distant or abstract rather than personal.
Ferris cites the work of Canadian psychologist Albert Bandura and his theory of moral disengagement, which outlines eight mechanisms people use to neutralize discomfort and protect their sense of being good.
“For example, people can reframe the behaviour so it seems less unethical. They’ll say, ‘The ends justify the means’ or argue what they did pales in comparison to what other employees have done. They can also try to shift responsibility, saying they were just following orders or that everyone in the team knew what they did but no one stopped them.”
And they may downplay the harm, he says, such as claiming that stealing from the company is OK because it is making huge profits.
Loyalty, bystanders and organizational goals
In high‑stakes corporate scandals, bad behaviour often isn’t done for the benefit of the individual, says Hartman: “It’s usually done to the benefit of the organization, to help the company to achieve its goals, to save team members’ jobs.”
That mindset spreads responsibility across the many people who perpetuate the misbehaviour or are aware of it, he says.
Hartman links this directly to “the bystander effect” and describes “diffusion of responsibility” as a major moral disengagement mechanism in organizational life.
Ferris also highlights how context shapes our willingness to cross lines. He notes that people are much less likely to behave unethically when the harm feels personal.
“People are more willing to take $100 from an anonymous customer than from a colleague they know — even though the behaviour is objectively the same.”
Limits of ethics programs to impact behaviour
As a result, Hartman argues that formal ethics and compliance programs “fail to target these environmental factors and the role of moral disengagement mechanisms in facilitating unethical behaviour.”
Codes of conduct, online modules and legal briefings may check a governance box, but they leave intact the deeper patterns that make self-justification easy.
“Those traditional surveillance, sanctioning, training, ethics and compliance programs that almost all organizations in the private and public sector have, they don’t work,” he writes, because they address only the visible tip of the “ethical iceberg” rather than the structural and cultural base.
Why compliance training doesn’t fix ‘ethicality gap’
Ferris, who is cautious about overstating his expertise in training, reaches a parallel conclusion about many corporate programs.
“Ethics training can struggle because it assumes the problem is the employee,” he says. “If you want to reduce unethical behaviour, you need to look not only at the employees, but also the system they work in.”
Traditional compliance approaches that focus solely on individual awareness, he suggests, are likely to fall short.
Ferris’s diagnosis is consistent: organizations cannot treat ethics as a problem to be solved solely with policies, legal barriers and tick‑box e‑learning.
That means paying close attention to culture, incentives and day‑to‑day pressures – the forces that make it easy or hard for people to tell themselves comforting stories about what they are doing.
How to resist rationalization at work
If individual-level ethics training has limited impact, Hartman believes the real leverage lies in designing organizations that make reflection easier and self‑justification harder. His work outlines several interventions “to mitigate self-serving biases,” many of which target culture and systems rather than personal virtue.
Hartman emphasizes the need to reward the process, not only the result. Under intense performance pressure, it’s easy for employees to focus solely on hitting numbers, he says.
In that environment, he argues, leaders must actively value how outcomes are achieved and avoid messages like “do whatever it takes to achieve that end goal.”
Another practical suggestion is to build “ethical speed bumps” into decision processes, since people are more likely to rationalize decisions made under pressure.
“Providing pauses where people can reflect a little bit, apply some screening, if you will, to their actions and decisions, can sometimes give them an opportunity, a kind of a window, to revisit their intentions and actions,” says Hartman, citing checklists or second reviews as examples.
Hartman’s research also highlights the value of fostering a culture and HR practices that resist ethical misbehaviour by encouraging “modesty, consideration of others and supportive relationships in the workplace,” along with collaboration and conflict resolution.
For Hartman, it also starts at the top of the house: “The senior team of the organization needs to be fully committed and involved and aligned” along with modelling “moral humility” when mistakes are made.
That can include having individual executives receive 360-degree feedback and coaching focused on “modesty, showing consideration, collaboration and conflict resolution.” It can also mean training employees in communication skills and emotional intelligence, and “focusing performance measures and rewards on personal accountability, giving and receiving feedback, cooperation and conflict resolution.”
Hartman further argues for “promoting ‘psychological safety’ by encouraging diverse perspectives so everyone feels safe to voice opinions without fear of reprisal and cultivating a supportive feedback culture focused on constructive feedback rather than criticism.”
He also stresses alignment, by “ensuring company policies and procedures (e.g. hiring and promotion) are aligned with the desired ethical culture.”
Tools to pause, reflect
Ferris’s practical advice aligns with this systems lens. He suggests that organizations look closely for signs that “the culture encourages unethical behavior,” such as when someone who tricks customers becomes a top performer.
At the same time, he emphasizes that most employees want to do the right thing. In his view, helping them recognize “the warning signs that they are morally disengaging from a situation,” and giving them concrete tools to pause and reflect, can make a real difference when combined with supportive culture and incentives.
Ferris adds that lasting change comes from addressing the person, the organization and the situation together. Rather than trying to turn people into saints, the research suggests, organizations should focus on building cultures and systems that make it harder to rationalize – and easier to act in line with the values most employees already believe they hold.
On the question of ethics training, Ferris ends up in a similar place to Hartman’s data, stressing that unethical behaviour “can also come from the culture employees work in or the situations they face.” That diagnosis aligns closely with Hartman’s conclusion that there is “too much emphasis on training” when programs target only codes and compliance.
Ferris is modest about his expertise in designing courses – “I’m not an expert in training,” he notes – but he does outline what a more psychologically informed approach might look like. “I would try and help people recognize the warning signs that they are morally disengaging from a situation. They are easy to recognize, and once that happens, they make unethical behaviour more likely,” he says. To support that, he recommends “a practical tool… like a decision-making checklist for major decisions: have we thought through the consequences? Are we complying with relevant laws and regulations?”
Crucially, Ferris underscores the importance of culture in reinforcing – or undermining – those tools, pointing again to warning signs such as rule-breakers being celebrated as top performers. That insight echoes Hartman’s emphasis on aligning incentives and recognition with stated values, so that the people who model ethical behaviour are not sidelined while high-performing rule-breakers are rewarded.
At the same time, he is careful to point out that we may overestimate how widespread misconduct really is.
“It’s important to remember that the vast majority of employees do behave morally. Those that don’t behave morally – particularly those who get caught – are the ones who get newspaper articles written about them, so we might think unethical behaviour is rampant. But while there are bad actors and bad organizations out there, they are the minority.”