
To Err is Human. To Forgive… Well, in Aviation We Shoot Them

“Have all the dissenters in the factory shot. I don’t want our employees to be dissatisfied,” said Charlie Chaplin in “The Great Dictator.” Don’t shoot – I’m not a dissenter. I’m just presenting information from the world of science.
We have all heard of the learning curve. As we learn and gain experience, we become more efficient and our tendency to commit errors decreases.
The error rate falls dramatically as we learn. Notice, however, that the learning curve never reaches zero errors. The reduction slows noticeably and may even level off as we gain experience. This is because we will never know everything, we will never get to zero, and we will never be perfect. When we are new to the business, we absorb everything in big chunks; we learn the minutiae as we gain proficiency in the industry. Although that later learning improves us as individuals, its effect on error reduction is not as impressive. As long as we continue to learn, we will continually be pointed toward the unachievable goal of zero. We are headed in the right direction, but there is another force at work that thwarts that progress. That force is overconfidence.
Now let’s look at the confidence curve (better termed the over-confidence curve) in relation to errors.
Oddly but truly, we tend to commit more mistakes as we gain confidence in our jobs. Be it complacency, norms or professional arrogance, the fact remains that this condition exists. Analysis of errors reveals that they clump around those new to the industry (not a surprise) and those who are very experienced (not expected). Although the climb in errors is not as steep as the newbie’s, it is still an alarming phenomenon.
Putting it together
Putting the two error curves together forms a total picture of error analysis.
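As an illustrative sketch only (the curve shapes and constants below are assumptions, not data from any accident analysis), the two curves can be modeled as a falling exponential for inexperience plus a rising one for overconfidence. Their sum is a U-shaped “bathtub” curve, with errors clustered at both ends of the experience scale:

```python
import math

# Illustrative model only: the functional forms and constants are
# assumptions chosen to reproduce the qualitative behavior in the text.

def learning_errors(years, start=1.0, rate=0.5):
    """Errors of ignorance: fall steeply with experience, never reaching zero."""
    return start * math.exp(-rate * years)

def overconfidence_errors(years, ceiling=0.3, rate=0.15):
    """Errors of ineptitude: climb gradually as confidence outgrows ability."""
    return ceiling * (1 - math.exp(-rate * years))

def total_errors(years):
    """Sum of both curves: high for newcomers, lowest mid-career, rising again."""
    return learning_errors(years) + overconfidence_errors(years)

# Over a 30-year career the combined curve is U-shaped: errors cluster
# at both ends of the experience scale, with a minimum in the middle.
low_point = min(range(31), key=total_errors)
```

Plotting `total_errors` over a 30-year career gives the combined picture: a steep drop through the learning phase, a minimum around mid-career, then a slow climb as overconfidence takes over.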
I guess the trick is not to learn too much, get out of the business at mid-career and find another profession, since it is just going to get worse, right? No, wrong answer. This is not entirely an individual-fault problem. As professionals, our knowledge and training, and the expected recall of that information, are put to the test daily. We are also expected to know certain information without consulting a reference, even though a reference is required when it comes to applying that knowledge.
Imagine seeing a doctor who said, “I’ll have to look that up.” Would your level of confidence drop? You expect your doctor to have an answer without using a reference. They memorized a stack of books and should be able to recite them when required. You have placed that expectation upon them and they, in turn, respond as you expect. This is customer satisfaction theory: find out what the customer wants and meet the expectation. The customer is always right. In this case, the customer is not right. The customer has been acclimated over time to expect that kind of response, and the trend continues. The doctors, in kind, respond to that “norm” and cram as much as possible into their brains. Yet even an actor who has memorized his lines to a play will not recite them verbatim every time. It has been statistically shown that diagnosis from memory is only about 60 percent accurate. Doctors do the same things that we are accused of as aircraft technicians: make educated assumptions and shotgun a problem.
Our errors as aircraft technicians contribute to less than two percent of aircraft fatalities. Not bad, when you consider the number of people flying. The rate is about the same for medical errors in American hospitals. The importance is driven home when you put numbers to that statistic: there are 100,000 deaths per year due to errors in American hospitals. There is enough information to show that if a reference or checklist had been used, those deaths could have been reduced drastically.
Types of errors
In the book “The Checklist Manifesto,” author Atul Gawande points out how routine surgical tasks are so complicated that errors are inevitable. He divides errors into two categories. The first is errors of ignorance (mistakes we make because we don’t know enough). This is the learning curve, where errors decrease over time. The second is errors of ineptitude (mistakes we make because we don’t make proper use of what we know). This is the arrogance, or over-confidence, factor, where errors increase over time. He goes on to state that in the modern world we are more apt to err through ineptitude.
We are used to checklists in aviation; pilots and maintenance staff use them. When it was suggested that checklists be used in the medical profession, the idea met stiff resistance, even though it was shown that checklists would save lives. Medical doctors thought it was unprofessional to use one. Do you sense professional arrogance there? It’s been said that pilots are buried with their mistakes while doctors bury their mistakes. That explains why pilots will use a checklist as a normal course of business but medical doctors won’t.
Despite this early resistance, there has been a behavioral shift in the medical ranks, and checklists are now in use. You will see nurses carrying clipboards and asking your name and date of birth to verify their records; they don’t assume anymore. It has become standard practice at most hospitals to map out an incision or amputation on the patient’s body and have it checked by another doctor, just like our inspector buy-off. Failure to follow prescribed procedures has been our nemesis in aviation, and it is the same in medicine and other industries. We are not alone, so it should be no surprise that aviation maintenance technicians fail to follow instructions; it is a global problem.
If you wanted to graph this phenomenon, it would run from professional confidence at one end to professional arrogance at the other. Yes, we get comfortable in our positions over time, and we tend to do from memory things we have done a hundred times before. We become confident in our abilities. The problem is that we don’t know our abilities accurately. We have a clear idea of the boundaries of what we can and cannot do, but it is the large mix of gray before that “cannot do” point that gets us in trouble. As aircraft mechanics we have a “let me give it a try” attitude. We are not daredevils by any means, but we have a drive to get the job done. It could be a tendency to please the customer or the boss, a personal challenge or just inner satisfaction, but we overestimate our abilities and wind up swimming in those tumultuous gray waters.
Over confidence
Our confidence in ourselves is admirable and a true mark of the professionals that we are, but it can also get us into trouble when our ability and our confidence don’t match. Over-confidence is bolstered by successful outcomes. If we skip step five, we can shave 30 minutes off the job. If we are successful once, it gives us the confidence to do it twice, then three times, and so on. Each successful outcome increases our confidence to keep working around the instructions until the day comes when it doesn’t work.
Over-confidence is extending ourselves beyond our capability, but with the intent of a successful outcome. Its primary focus is success; it is a benevolent action. We can be driven into that gray area, skirting the rules and regulations because of a lack of resources, and still feel somewhat secure that we can accomplish the task. Half of the Dirty Dozen are a lack of something. To compensate, we have to either slow production or deviate from standard practice.
Arrogance, on the other hand, is extending ourselves beyond our capability with disregard for the potential outcome. It comes down to intent. With professional arrogance, a successful outcome is secondary, because a success reflects well on us while a failure can be pawned off on something or someone else. The primary focus is posturing; it is a narcissistic action.
We are not protected from error because of our experience, our training, or our tenure on the job. We must maintain our vigilance and beware of our susceptibility.
Patrick Kinane is an FAA-certificated A&P with inspection authorization (IA) and a commercial pilot with instrument rating. He has 50 years of experience in aviation maintenance. He is an ASQ senior member with quality auditor and quality systems/organizational excellence manager certifications. He is an RABQSA-certified AS9100 and AS9110 aerospace industry experienced auditor and ISO9001 business improvement/quality management systems auditor. He earned a bachelor of science degree in aviation maintenance management, a master of science degree in education, and a Ph.D. in organizational psychology. Kinane is presently a senior quality management systems auditor for AAR CORP and a professor of organizational behavior at DeVry University.