I did something dumb the other day.  After the inside of our fireplace was re-mortared so we wouldn’t burn down our new house in Connecticut, I decided to paint it so the fresh mortar wouldn’t show.  The dumb part of my plan was rejecting my wife’s sensible suggestion to cover the granite hearth I sat on while I worked.  I did have my reasons.

Back in high school, I painted houses with friends in the summer, and I’ve painted lots of places since then.  Evidently, experience taught me the wrong lesson.

Rather than acknowledging that paint is inherently messy, I figured that I knew what I was doing.  After all, I was painting the inside of our fireplace, so I didn’t see how any drips could get on the hearth on the outside.

I did not consider my feet, which got drips and splatters on them, which then streaked paint along the hearth.  Granite is porous.  My very patient wife was disappointed.

Lucky for me, the day was saved by the arrival of another tradesman to fix something else.  My wife, Sally, asked if he had anything that could fix the damage I had just caused.  A few swipes of acetone later, you couldn’t see anything, and I was on my way out of the doghouse.

“To err is human,” as the saying goes. I think I’m pretty smart, I deal with risk for a living, and I have worked around paint a long time.  Yet, look what I did even after my wife asked me to be careful in our new house. 

Once I made amends, I had two options: I could forget all about this, chalking it up to ‘no harm, no foul.’  Or I could write about my experience since the lesson is no less valuable for having avoided a bad result.  After all, it was still a dumb thing to do.

By now, you should recognize the concept of “near-miss reporting.”  The more I thought about my paint episode, the more I wondered why we don’t publicly proclaim our near misses.  After all, they are the most pain-free lessons we can learn.  If you can laugh at yourself, you can discuss your near misses, which may help someone else avoid making the same mistake. 

To my understanding, near-miss reporting is relatively uncommon in the United States because of fear of litigation.  I recently came across an event producer whose policy was to retain no records of incidents that were unaccompanied by a medical treatment report.  That’s a lot of missed learning opportunities.

Because of our inherent fallibility as humans, as well as the many amazing things we do as event professionals, I am less distressed when something new doesn’t quite work than when we repeatedly do something dumb because it hasn’t resulted in disaster – yet. 

There is a name for this.  The overconfidence that we will keep avoiding the foreseeable consequences of flawed work is called the “normalization of deviance.”  The term was coined by sociologist Diane Vaughan in her book, The Challenger Launch Decision.  The gist is that even NASA, which has countless redundant safety measures because the risk of any failure is so high, became institutionally desensitized to an overstressed O-ring because nothing bad ever happened.  Until it did.

Lots of smart, conscientious people were aware of the O-ring situation and knew it was wrong.  But the absence of negative outcomes caused them to accept it as good enough. 

Our work as event professionals is not as obviously life-threatening as launching a rocket into space.  But event sites are full of dangers when people are uninformed, inattentive, fatigued, or suffering from lots of other human frailties.  This is why it is so important not to cut corners on safety – my cases are full of people who were convinced that their risky behavior was harmless until the moment someone or something got broken. 

The Event Safety Alliance is big on checklists and trigger charts and written guidance because we all need the reminders.  ESA is also glad to continue our educational mission so we can support each other to make better choices, avoid normalizing deviance, and make new mistakes rather than repeating old ones.

Earlier this month, about 200 people gathered in person and remotely for ESA’s weather event, rebranded as the Weather Planning for Mass Gatherings Workshop.  As before, we did not teach anyone to be their own meteorologist.  Instead, we taught about resisting the siren song of weather apps when people’s lives are in your care, and how to build a weather action plan based on modern meteorological science.

We were addressing our own normalization of deviance problem.  Weather apps are good enough if you’re deciding whether to carry an umbrella.  They are neither accurate nor current enough when you need every second to move an exposed crowd to shelter.  (Lawyer’s summary of the problem: time and tilt.  If you want details, talk to a meteorologist.  We know some good ones.) 

I have ranted in other contexts about the flawed logic of “situational awareness.”  (People focused on their work rarely see something or say something beyond their immediate attention.)  This is a different intellectual challenge.  It’s hard to convince people that unsafe practices really are dangerous when disaster is usually avoided anyway.

I’m okay with the paradox.  Even if we need to reinforce our convictions by talking with smart friends periodically, that’s preferable to a clearer causal connection yielding more disasters, don’t you think?

From now on, let’s all use a drop cloth when we paint.