
Is “Deskilling” a Threat to Safety in your Workplace?

Maybe you’ve heard the term deskilling. It refers to the loss of knowledge or skills that occurs when technology does more of a person’s work for them. In work environments with increased automation, workers find themselves with less of what used to be “the work” to do because robotics or computer-assisted technology now does it. The skills once required to do that work fade from lack of practice, and few people close to the work understand the complex programming and systems that drive the automation. Under certain conditions, this can pose a serious safety risk.

There is a paradox in how technology affects safety at present. Overall, safety has been greatly improved by increasing automation and assistive technologies in many industries: aircraft have better computer systems, automobile braking has become more automated, and in manufacturing everything from assistive technology to robotics has reduced human exposure to hazards. But what happens when those technologies fail, as they inevitably will? There is evidence that human operators may not be prepared to handle these situations manually. In those circumstances, the technology is making things less safe.

Take an example from the aviation industry. In 2009, in one of the most disturbing air crashes ever, Air France 447 fell out of the sky from 35,000 feet into the Atlantic Ocean, killing all onboard. The Airbus A330 suffered an airspeed sensor failure while flying through icing conditions, and the pilots reacted in ways that stalled the aircraft. Despite having three pilots in the cockpit working on a very solvable problem, the crew never recovered the plane before its fatal impact. Standard nose-down stall recovery maneuvers, applied after the onset of the stall, could have saved the aircraft. The mystery was why the pilots did not recognize the stall and apply the proper maneuvers to recover from it. With increasing automation on fourth-generation airliners, pilots rarely fly the aircraft manually, and notable crashes like Air France 447 have prompted much discussion of pilot deskilling. For example, the FAA published a report in 2013 on aviation automation and found that pilots were relying too heavily on automation to do the flying. Its very first recommendation was that pilots need more practice manually flying the aircraft (http://www.faa.gov/aircraft/air_cert/design_approvals/human_factors/media/oufpms_report.pdf).

To take a closer look, it’s important to understand that automation today requires operators to monitor systems, but monitoring can drift into a quite passive activity, especially where the systems correct their own deviations. Even in work environments where the systems do not correct themselves, monitoring behavior is tricky and often weak. Witness the recent news that TSA screeners at airports missed 95% of the threats in repeated controlled testing by their own agency. With self-correcting technology such as modern flight path management systems (autopilot, autothrottle, automated course control, etc.), operators have less and less to do, making it easier to lose focus on the task, rely on the technology, and sit by as their operational skills fade over time. It requires no big leap to predict that the same thing will likely happen to the rest of us with the advent of self-driving automobiles expected in the near future.

If your workplace is experiencing greater automation, and operators’ roles are shifting from what they used to do (the work) to what they currently do (monitoring the technology and systems that now do the bulk of the work), it is critical that you assess the potential for a safety disaster and act accordingly. Be certain that operators understand the systems well enough to handle a system failure. If they don’t, implement targeted training to build their knowledge. Go even further: build in ongoing practice for operators to work in manual mode (where that is possible), as well as practice dealing with system failures.

It is our experience in safety that most companies do not provide sufficient practice opportunities during training for operators to become truly fluent in their normal work tasks before they begin work. Notable exceptions occur in some aspects of military training, such as weapons operation, but such practice to mastery or fluency is rare for operators in civilian jobs. Practice opportunities for handling systems or technology failures may be even less common. Live or simulation-based practice for these situations needs to be much more frequent to counter the deadly slide toward deskilling and overconfidence in automation wherever human operators work jointly with fallible technology or automated work systems. Perhaps next-generation technology could even pose problems to its monitoring human operators to keep their skills at the necessary level. Getting the technology to reinforce desired human behaviors may be an intriguing future direction, but for now, what are you doing to prevent deskilling? Get practical tips to increase your effectiveness and keep workers safe.



Posted by Cloyd Hyten, Ph.D.

Cloyd is a senior consultant and thought leader in the field of performance improvement. Dedicating more than 20 years to this work, Cloyd has also served on the Editorial Board of the Journal of Organizational Behavior Management, and served as President of the OBM Network. Outside of work, Cloyd enjoys history, food and football.



This is a very interesting topic and issue. Google found, during the second stage of its self-driving car program, when it let employees use the cars, that the “drivers” immediately drifted to texting, game playing, and even sleeping… scary, since the system is designed to shut down immediately and hand full control back to the driver if it encounters a problem it cannot solve. Google called this “the hand-off problem” and has not developed a way to solve it. Needless to say, the second stage was quickly aborted and they are reconsidering how to use the technology; currently they are moving to small, slow, driverless cars that might be useful in a city or on a campus. For anyone interested in hearing more, follow this link: http://www.npr.org/sections/alltechconsidered/2015/08/20/433000643/how-close-are-we-really-to-a-robot-run-society Nice article Cloyd!

