Deckplate Compliance or Systemic Failure?

BY CAPT (SEL) J. LEE BENNETT, NAVSAFECEN

Modern warships are a network of technologically advanced and complex systems, with each nested subsystem having its own techniques and procedures for maintaining the supporting equipment and responding to malfunctions and failures. However, the procedures currently used by the fleet (i.e., planned maintenance and casualty control) are typically written to address a single task or issue, and they assume the Sailor’s actions will affect only that particular piece of equipment. As such, these procedures fail to account for other issues within that subsystem or within the networked shipboard environment as a whole. The Sailor is therefore left to think critically and sort out the second- and third-order effects of any intended action.

The difficulties of mentally navigating multiple layers of complexity are compounded when systems are not fully operational. In those cases, the Sailor must rely on combinations of multiple procedures to control the situation and prevent a cascading casualty. Furthermore, Sailors must contend with documentation that is often out of date, conflicting, or missing altogether. Therefore, we need to answer two questions: How much information is one Sailor expected to absorb and instantly recall in a constantly changing and highly complex environment? And at what point does too much information become cognitively and physically degenerative?

While theoretical studies of how much information a human can absorb before decision-making becomes impaired can be traced back to the 1960s, groundbreaking empirical studies began appearing around the 1980s. Two organizational theorists later consolidated both approaches in their 2004 article, “The Concept of Information Overload.” Their paper shows a central agreement among researchers that “the performance (i.e., the quality of decisions or reasoning in general) of an individual correlates positively with the amount of information he or she receives – up to a certain point. If further information is provided beyond this point, the performance of the individual will rapidly decline.” Once information-processing capacity is surpassed, “additional information becomes noise and results in a decrease in information processing and decision quality.”

Other studies stress time as the most important factor in the information overload problem. Since time is “an intrinsic factor due to its direct effect on information overload,” responding to an emergency reduces the opportunity for proper research before taking initial actions. A person must then rely on training and experience, hoping not to have unintentionally set off a negative chain reaction. Essentially, if the quantity and complexity of the information required to complete a task exceeds the individual’s ability to integrate it all into proper actions within the time available, information overload has been reached and the person’s decision-making performance will decline.

Factors that signal the onset of information overload include feelings of stress, confusion, pressure, anxiety, and low motivation. In addition to the short-term cognitive restrictions on immediate performance, the long-term effects of information overload closely resemble stress-related mental and physical illnesses. One study found that 25 percent of workers and 36 percent of managers reported health issues as a direct result of the excessive information required to do their jobs. Mental health practitioners refer to this condition as information fatigue syndrome, and its presence is clearly evident in today’s fleet.

Mishaps in the past few years have resulted in billions of dollars in damages within the fleet. Subsequent investigations routinely point toward similar human factors (HFACs) as root causes: lack of training, supervision, and communication, or simply not following procedures. When reviewed individually, these HFACs seem adequate and appropriate. However, taking a step back and reviewing them collectively reveals a much larger systemic problem.

Sailors risk saturation from the sheer volume of information (some of it conflicting or outdated) regarding the ship’s materiel condition and the work-arounds for those deficiencies. All this additional information can be difficult to absorb on a routine basis and borders on impossible to absorb during a casualty. Dr. Steven J. Spear states in his book, The High-Velocity Edge, that high-velocity organizations “understand and solve problems, not put up with them,” whereas the fleet has not only put up with its persistent problems of information overload, it has institutionalized them.

A 2016 review of the U.S. Navy’s Arleigh Burke-class destroyers revealed some distressing statistics. According to a data call conducted in conjunction with this review, those ships averaged 25 active casualty reports, 21 active temporary departures from specifications, 16 active temporary standing orders, and a backlog of 1,930 job sequence numbers per vessel. The compounding effect of all these work-arounds on Sailors is clear: they produce an environment in which complexity clouds the mind and inhibits or delays proper action. Instructions require Sailors to fully understand the condition of all equipment prior to taking their watch. If each destroyer has, on average, over 2,000 deficiencies with work-arounds, plus an unknown number of informal “Sailor Alts” (alternatives), can Sailors truly understand the condition of their equipment? Such was the case in a 2016 ship fire caused by a can taped in place inside an electrical control box to keep a ventilation switch in the “on” position.

In today’s fleet, fewer people, inconsistent maintenance funds, and higher operational tempos combine to create a climate of “just make it work” and “do whatever it takes to pass the inspection.” Dr. Spear described this mentality more succinctly when he wrote, “If you define a problem only in terms of whether you have adhered to the standard, you set a low bar for a pass, but if you define a problem by the much more rigorous criteria of whether work is being done without delay, without waste, and without strain of any kind, you set a much higher bar and create more reason to try to improve on what you are doing.” Working around problems does not fix them – they continue to exist in ever larger numbers and complicate the Sailor’s ability to take proper actions. In this environment, information overload will remain at least a contributing factor, and quite possibly the real root cause, of mishaps until it is corrected.