McIntire Professor David Lehman Explains Why Some Rules Are More Likely to Be Broken in HBR

October 8, 2019

What poor decisions caused the BP Deepwater Horizon oil spill, the Wells Fargo account scandal, or Chipotle's foodborne illness outbreaks? While many studies have aimed to determine which organizations are more susceptible to a lax attitude that lends itself to rule breaking, an upcoming paper by McIntire Management Professor David Lehman and his colleagues focuses on the rules themselves, finding that the design of a rule can itself be a root cause of noncompliance in organizations.

Lehman, along with Professors Bruce Cooil and Rangaraj Ramanujam of Vanderbilt University’s Owen Graduate School of Management, recently explained the findings of their Journal of Management paper in an Oct. 7, 2019, Harvard Business Review article titled "Why Some Rules Are More Likely to Be Broken."

Focusing on 289 restaurants in Santa Monica, CA, from 2007 to 2010, Lehman and his co-authors compiled a dataset of more than 80,000 rule observations from more than 1,000 hygiene inspections. The results? Complexity is often to blame for a rule being ignored: The more difficult a rule was to follow, the more likely it was to be violated. Moreover, violations of bigger rules were more likely to be corrected by the time of a subsequent inspection, and a restaurant was more than twice as likely to break a rule if it had broken that rule before.

Lehman and his colleagues close the article with suggestions for how managers might better cope with, and enforce, regulations.

Read the full article at Harvard Business Review.