Navy Interoperability: Making Weapons Work as One
The crew inside the Navy Aegis destroyer is tracking an incoming jet, as are radar operators high above in the Air Force AWACS command-and-control plane. The pilot of the intercepting F-18 fighter sees the approaching plane on the screen too. But there’s a problem. Though the three weapons systems are coordinating tracking data at the speed of light, each depicts the jet’s location, trajectory or identity differently. On one scope, it even appears as two aircraft. And worse — none of the officers who must react to the potential danger is aware of the discrepancy.
The scenario may be hypothetical, but it is also incredibly common — and serious. Over the past three decades, friendly aircraft have frequently been misidentified as hostile, and vice versa, because weapons systems failed to work together seamlessly. Some of these incidents occurred in combat, risking the lives of U.S. troops. “Interoperability problems are legion,” says Paul Symborski, who has devoted most of his 37-year career at CNA to identifying the root causes of such breakdowns in communications. He is one of dozens of CNA analysts who — along with government employees and contractors — have taken a lead role over the past few decades in helping the military address interoperability issues. In fact, CNA led the development of a process to measure and quantify interoperability, creating a whole new way to look at the interconnected world of weapons systems.
Quantifying data link errors for military interoperability
Symborski had just joined the research organization in 1984 when a CNA team testing the Aegis Weapon System determined that its advanced sensors and integrators were excellent at tracking aircraft, but much less effective at sharing that information over data links. The research scientist recalls his manager saying, “Paul, go take a look at the link.” Symborski has been looking at data links ever since.
Interoperability problems begin when one weapons system misinterprets or does not receive messages from another. Even though all messages are supposed to conform to certain data-link standards, each combat system has its own way of processing those messages. Among the most common symptoms of this ailment is that information displayed on the scopes of two or more weapons systems will disagree about the identity, velocity or location of a plane. One aircraft might appear as two tracks on a scope. “Bad things can happen with extra tracks,” warns Symborski, speaking from Naval Station Norfolk, where he has been posted since 2005 as part of CNA’s Field Program. “You don’t want extra tracks.”
The young analyst set about finding ways to quantify the problem, the beginning of a decades-long quest. Analyzing a series of field exercises as well as combat and non-combat operations, he compared the information depicted on scopes at different sites. “We reconstruct what the user had available to him on each platform: on the ship, at the missile battery, on the AWACS,” explains Symborski. “Then we compare that to say: ‘Are they the same? Are they different? How are they different?’”
He and his colleagues developed a methodology to quantify the accuracy, clarity, completeness and consistency of the information available to the operators of different weapons systems. The methodology “catches things that wouldn’t necessarily have been plain to the observer,” he says. “You can drill down to the root cause, and then you can say, ’this piece of equipment needs to be fixed, because look what it’s doing.’”
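The kind of cross-platform comparison the article describes can be illustrated with a toy sketch. Everything here — the `Track` fields, the tolerance, the check itself — is hypothetical, not CNA's actual methodology: given the track that each platform held for the same aircraft, test whether the platforms agree on identity and on position within a tolerance.

```python
from dataclasses import dataclass

@dataclass
class Track:
    platform: str   # e.g., "Aegis ship", "AWACS", "F-18"
    identity: str   # e.g., "friendly", "hostile", "unknown"
    lat: float      # degrees
    lon: float      # degrees

def consistent(tracks, max_offset_deg=0.05):
    """Return (identity_agrees, positions_agree) for one aircraft's
    tracks as held on several platforms. A real analysis would use
    time-aligned data and geodesic distances; this is a toy check."""
    id_ok = len({t.identity for t in tracks}) == 1
    lats = [t.lat for t in tracks]
    lons = [t.lon for t in tracks]
    pos_ok = (max(lats) - min(lats) <= max_offset_deg and
              max(lons) - min(lons) <= max_offset_deg)
    return id_ok, pos_ok

# Two platforms hold nearly the same position but disagree on identity
# -- the kind of discrepancy the methodology is meant to surface.
tracks = [Track("Aegis ship", "friendly", 36.95, -76.30),
          Track("AWACS", "unknown", 36.96, -76.31)]
print(consistent(tracks))  # → (False, True)
```

A real assessment would score these disagreements over thousands of track reports per exercise, which is what makes the root cause traceable to a specific piece of equipment.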
Interoperability data from naval exercises
To collect exercise data, Symborski goes to sea.
“You really do need to be there. You can look at data all day long, but you don’t necessarily know what’s going on in the mind of the user. Ordinarily I’m at the shoulder of the warfare commander on the ship, at the front table in combat, listening to the decisions being made and seeing how they’re taking the data, how they’re talking about it. I sometimes have a headset on so I can hear the communication they’re having both off-ship and internally, so I can really understand what they think they’re seeing and how they’re reacting to it.”
Symborski is not alone in this process. He assembles as many as a dozen analysts from CNA and other organizations for an exercise, posting a researcher on each weapons system to collect data and observations. “It’s not just a matter of: ‘Hey, give me the hard drive when you’re done,’” he says. “There are a lot of subtleties and difficulties involved in capturing the data correctly.” Given the high level of cooperation needed on board, “I meet every ship’s responsible officers personally,” Symborski says, “even if I have to fly halfway around the world.”
Back on shore, Symborski and his colleagues compare recordings of the data available to scope operators to the actual GPS data gathered from the planes themselves. Wherever they spot inconsistencies, they can access a complete record of the data-link messages to track down the source. Though many problems persist, the operations researcher says: “I’ve had situations where we found an issue with the system, passed it to the system program office, and they were tremendously embarrassed and said, ‘My goodness, we’d better fix this right away.’ And literally within months there was a fix out in the fleet.”
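The truth comparison might be sketched like this (hypothetical function and figures; the actual reconstruction involves time alignment, coordinate conversion, and full data-link message logs): measure how far the position shown on a scope sits from where GPS says the aircraft actually was.

```python
import math

def position_error_m(scope, truth):
    """Rough flat-earth distance in meters between a scope-reported
    position and the GPS truth position (each a (lat, lon) in degrees).
    Adequate for small offsets; a real analysis would use geodesics."""
    lat_m = (scope[0] - truth[0]) * 111_320  # ~meters per degree latitude
    lon_m = (scope[1] - truth[1]) * 111_320 * math.cos(math.radians(truth[0]))
    return math.hypot(lat_m, lon_m)

# The scope shows the jet displaced from where GPS recorded it.
err = position_error_m(scope=(36.960, -76.300), truth=(36.955, -76.305))
print(f"{err:.0f} m off truth")
```

An error that large, consistently on one platform, would point the analysts back into the message record to find which system introduced it.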
For all of the problems solved, new ones appear each time defense contractors and the different armed services make advances in technology. “As systems become more and more networked, the problem gets bigger and bigger,” says Carla Barrett, a research scientist with CNA’s Surface Warfare Program who has specialized in interoperability for more than two decades. “Everybody knows their little pookah, but what they don’t understand is how it all works together to create something the warfighter can use,” she adds. “We’re often the only ones who can span that gap.”
So CNA has also used its deep data analyses as a lever for more systematic changes in interoperability. In the 1990s, the attention this research brought to the problem helped lead to the development of a new track-correlation algorithm that is now nearly universal in weapons systems in the U.S. and allied militaries. A decade later, as new problems proliferated with interactions among multiple data links and varying interpretations of this correlation standard, Barrett helped call the attention of the Chief of Naval Operations to the problem. She was asked to participate in a team to improve the rule sets for sharing and processing air-picture data among ships and aircraft. These changes were installed widely in the fleet.
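Track correlation of the sort that algorithm performs can be shown in miniature. This is a greedy distance-gate sketch under invented parameters, not the fielded algorithm: reports from different sources that fall within a correlation gate are merged into one system track, so a single aircraft stops appearing twice.

```python
import math

GATE_M = 1000.0  # hypothetical correlation gate, in meters

def correlate(reports):
    """Greedy correlation: each report (x, y) in meters joins the first
    existing system track whose centroid lies within GATE_M, else it
    starts a new track. Real correlators also weigh kinematics, IFF,
    and track quality before merging."""
    system_tracks = []  # each entry is a list of correlated reports
    for rpt in reports:
        for trk in system_tracks:
            cx = sum(p[0] for p in trk) / len(trk)
            cy = sum(p[1] for p in trk) / len(trk)
            if math.hypot(rpt[0] - cx, rpt[1] - cy) <= GATE_M:
                trk.append(rpt)
                break
        else:
            system_tracks.append([rpt])
    return system_tracks

# Two sources report the same aircraft ~200 m apart, plus one distant contact.
reports = [(0.0, 0.0), (150.0, 120.0), (9000.0, 9000.0)]
print(len(correlate(reports)))  # → 2 system tracks instead of 3
```

The hard part in practice is tuning when to merge: a gate too tight leaves dual tracks on the scope, while one too loose fuses two genuinely different aircraft into one.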
The analysts who get close to this subject soon get caught up in the reality that what may seem like dry data links and algorithms are actually matters of life or death. Symborski even gets choked up when recalling a moment during a 2004 exercise at a U.S. Marine Corps Tactical Air Operations Center, when he first witnessed what had been multiple tracks correlating into one — the result of years of effort on an algorithm he helped develop. “The tracks you present to warfighters, they’re going to make decisions about whether to engage or not based on that data,” says Barrett. “That’s why most people who work on this stuff don’t sleep much.”