It seems as though the opposite of “careless” ought to be “careful.” That the best way to avoid avoidable errors is to try harder, to put more care into the work.
This means that if surgeons were more careful, there would be fewer errors, and that so many of the mistakes that mess things up would go away if people just tried harder.
And this is true. For a while. But past a certain point, trying harder stops paying off, and it's systems, not effort, that matter.
Years ago, I created a trivia game for Prodigy. The first batch of 1,000 questions was 97% perfect. Which sounds fine, until you realize that it means 30 of those questions had an error. And every error ruined the experience for the user.
For the second batch, we tried extra hard. Really hard. Our backs were against the wall and we couldn't afford any errors. Our effort paid off in a 50% decrease in errors: we were down to 1.5%. Alas, that's still 15 game-breakers.
Then I got smart and changed the system. Instead of having trivia writers work really hard to avoid mistakes, we divided our team in half. One half used the encyclopedia (yes, it was a long time ago) to write the questions, made a photocopy of the source, attached it to each question, and put both in a notebook.
The other half of the team got the notebook and was charged with answering each question based on the source alone. They got a bonus of $20 for every question where their answer was more correct than the original.
The result of the new system? Zero errors in the next 5,000 questions.
We need to put care into our systems. We need to build checklists, peer review, and resilience into the way we express our carefulness. It seems ridiculous that a surgeon needs to write her name (with a Sharpie) on the limb she's about to operate on, but this simple system change means that wrong-limb errors drop to zero.
In school, we harangue kids to be more careful, and spend approximately zero time teaching them to build better systems instead. We ignore checklists and processes because we’ve been taught that they’re beneath us.
Instead of reacting to an error with, “I need to be more careful,” we can respond with, “I can build a better system.”
If it matters enough to be careful, it matters enough to build a system around it.