
Free the Data: Vice Chiefs Launch an Acquisition Crusade

NELLIS AFB, Nevada — On this shadeless corner of the city of Las Vegas, the U.S. military does some of its most important combat pilot training in high-tech simulators — one building apiece for F-16s, F-35s, and F-22s. But there’s a problem: actual air combat doesn’t occur in neat, vendor-specific environments. So the Air Force is constructing a new building on a different corner of the base to better integrate data and create more credible simulations.

“It’s easy to go out and collect data for one specific scenario, but applying that to a broad scenario, it’s a very, very difficult challenge,” said Lt. Col. Chris “Slam” Duncan, commander of the 31st Combat Training Squadron. 

Duncan said that challenge is exacerbated by the way aircraft makers collect and silo their data.

“Not only do you have the complications of what God has created on this world, with physics, environmentals, electromagnetic operating environments, the variables that are just happenchance,” he said. “Additionally to that, the different companies have taken different approaches to solving that. Sometimes that language isn’t the same, so the language needs to be interpreted across the environments.” 

That task of interpreting falls to the government, he said: “That is the problem. Getting companies on the same page is sometimes difficult.”

All this makes it harder for the Air Force to create the sort of training scenarios that actually represent a future conflict with a highly capable adversary.

Duncan’s story isn’t unique. America’s sophisticated jets, drones, combat vehicles, satellites, and other gear produce data that the Defense Department can’t access, use, or share in the way that it wants to. 

Gen. John Hyten, the vice chairman of the Joint Chiefs of Staff, and the services’ vice chiefs are looking to change that. In May, they intend to release a set of mandates that they hope will reshape how defense contractors produce things for the U.S. military.

The week before last, Hyten and members of the Joint Requirements Oversight Council, or JROC, journeyed west to speak to the heads of some of America’s most innovative companies, large and small. They met with senior leaders at leading companies like Qualcomm and Microsoft as well as Rebellion Defense, a startup focusing on software and data-analysis products for defense applications. They had dinner with Elon Musk at SpaceX. The objective wasn’t to discover new products but to understand how digitally driven industries produce things in the year 2021.

These discussions helped the JROC come to consensus on at least one principle for the new mandates, Hyten said on a plane during the trip.

“Data can’t be stovepiped,” he said. “It must be shared and made available to the community. There’s still [intellectual property] in the platform owned by the contractors, but in order to maintain the system, we need the data. To operate the system we need the data. It can not be stovepiped. Period.”

More broadly, Hyten said, the discussions helped the service leaders understand just how much more slowly the Pentagon is moving than its adversaries.

Normally, the acquisition process works like this: services like the Army, Navy, Air Force, etc., take their program idea to the JROC, whose members evaluate whether the proposed weapon meets interoperability standards and whether it fills a capability gap. But they aren’t the only group that looks at the proposal. 

“As we go through that process, every functional area of the Pentagon takes a look at it. The logistics guys take a look at it,” Hyten said. DoD’s chief information officer “takes a look at it. The testers take a look at it to make sure all their stuff is in these requirements. It takes a year or two to get through the process and that’s required before you even start a program or go into production.”

That Industrial Age process takes far too long, Hyten says. A central point of the trip was to help the other service leaders better understand how much more quickly production happens in digitally led businesses: software companies like Microsoft, but also manufacturers like SpaceX. 

“In building whatever the next thing is…you need to have this iterative process with software,” he said. “Every company we looked at updates software every day. You can’t update software and then put it through a six-month approval process to operate. That’s six months off of the current pace of software development.”

With their new directives, Hyten and the JROC will try to push the Pentagon away from so many hard requirements and toward something more of a framework. They want the services to have more leeway to build things and test as they go, as software companies do, rather than setting hard goals that cannot be changed if they become obsolete. By mandating a framework, a few core things, rather than a lot of little requirements, Hyten hopes to shave months or years off the process of getting new programs approved.

“A lot of the stuff we add in as we go through are the joint interfaces that the services have to meet. Oh, and by the way, we always get them wrong because we don’t know ten years in advance what those are really going to be,” Hyten said. 

A more flexible framework would free the services from the need to get JROC’s approval for every little change. 

“Then they can just go as fast as they can go,” he said.

But there’s one thing that has to happen for the move away from hard requirements to succeed, Hyten said: the data that emerges from the various platforms must be sharable and usable across the Defense Department. 

“You have to make all of your data available. Not everyone is going to be able to access your data. There’s going to be credentials. Classification. Various impediments,” he said. “But you can’t hoard your data. It can’t be proprietary. You have to be able to make the data available to the broad world.”

“Open and share your data” is a bigger change than it may sound. Increasingly complex platforms like fighter jets throw off a lot of data that manufacturers can use to bid on secondary contracts related to maintenance and sustainment.

“If a company owns the technical data package, then you basically own the sustainment piece of the program,” said Tara Murphy Dougherty, CEO of data analytics company Govini. “You don’t have to reengineer or figure out what the configuration changes are that have to be made. You can be the most competitive bidder for the sustainment contracts.”

It’s a great deal and business model for the companies that win the original design-and-build contracts — and a terrible one for the Pentagon, which winds up paying far more money than it might, just to maintain and improve its ships, planes, and more.

In 2014, Dan Widdis, now chief data scientist at Data Valuation Ventures, was working with the Army on its helicopter fleet. His team was trying to predict maintenance needs from vibration data. But when he asked the Army for reliability data sorted by the source of each part, the service refused to give him the names of the manufacturers. “I’m not sure of the underlying reason for that but I’m guessing it has to do with support contracts,” Widdis said. “That’s kind of where we are. I think it’s going to be a challenge.”

Widdis said the services were somewhat hostage to their contracts. 

“It’s not a clearance issue. It’s all [Non-disclosure agreements] bidding and the whole procurement process,” he said. 

Widdis said that it was the way things had been going for decades, but it can vary somewhat depending on the service and the contractor. 

“I do think it’s mostly in the context of anything that would impact procurement, life-cycle cost, maintenance stuff, future maintenance. That’s where there’s sensitivity.”

A former senior military officer said that the Defense Department has only recently begun to realize that it needs to negotiate for data rights.

“Early on, I don’t think we were smart enough” to demand it, said the former official, who asked to remain anonymous. They recalled contractors who appeared to gloat when they managed to retain data rights, knowing that they could extract high prices when defense officials returned for it.

Some of these defense contractors have tried to recast themselves more “as trusted systems and data integrators, capable of fusing the data from their own platforms, as well as those from other manufacturers,” said the former official. “After contributing to the technical debt that the government was crippled with, it’s like they thought, ‘Let’s go ahead and be the fusion guys,’ after they themselves contributed to the siloes. This was not particularly helpful, as they were less than impressive when they expanded without serious investment in the skills required to attack the problem.”

The challenge of building an internet-of-things, with ubiquitous, shareable data across lots of devices, is hard enough for the IT companies that are already leading the world. For the Defense Department, “It was made far more difficult by companies who sought to follow the work into this space without the proper technical skills,” the former official said. 

“It’s been a long, costly process for the government as these providers failed year after year to demonstrate mastery over multiple technologies and siloed data,” they said. “They either have realized this and adapted, or divested, and returned the work to those who are qualified to do it.” 

The JROC’s new rules alone won’t solve the problem of proprietary data. Pentagon policies controlled by civilian leadership — currently, the Biden administration — have to help, Hyten said.  

But, he says, the Defense Department as a whole is beginning to understand the scope of the problem and how it’s putting the United States behind competing nations like China and Russia, where the government doesn’t have to fight with companies over the data on the weapons those companies or manufacturing organizations produce.

Complete data access will also be essential for developing effective, trustworthy artificial intelligence tools for military use, Hyten said. 

“When people wanted to talk about AI, they only wanted to talk about algorithms. If we don’t have ubiquitous large data to work with, AI is not going to work. You need access to all data,” he said.

Bias in data is a real threat for future Defense Department use of AI. When private companies like Google reveal new AI products or capabilities built on biased data, the results can be very embarrassing. But for the Defense Department, the consequences are much higher: in some instances, literally war and peace. “The only way to avoid bias is to have access to all data that are pertinent to the problem, even if it’s bad. What many people fail to realize is that with machine learning, the machine will teach itself to understand what it’s looking at, if it has all the data,” he says. “We have to get out of our own way. You have to make all the data that’s relevant to a problem relevant to the machine. Right now we have a biased answer.”

New data-access demands may force traditional defense companies to adopt new business models. But Hyten hopes a new attitude toward openness will enable an entirely new generation of weapons, vehicles, and other systems whose very functions can be changed and updated far more rapidly through software, much as software companies like Microsoft, and even a carmaker like Tesla, maintain and improve products through remote software fixes. The alternative is to keep struggling to catch up to a state of the art that’s changing faster than today’s processes can adapt. 

“I think you see from the other vice chiefs, we’re all in on how to make this happen. We’re excited about it,” he said. 

source: NextGov