In my previous Nextgov column, I interviewed the general manager of a company that is using artificial intelligence to help the military plan its maintenance schedules. That program plays to AI's strengths, namely its ability to weigh thousands of data points, far more than a human ever could, to produce a maintenance plan that maximizes both efficiency and safety. When it comes time to actually perform the maintenance, however, those tasks must be handed back to a human. But what happens if those physical tasks are also extremely complicated?
The CV-22 Osprey is a perfect example of a military aircraft that is both revolutionary and complicated to maintain. First put into service in 2007, it can act as both a helicopter and an airplane. In addition to being remarkable to watch, its versatility in nearly any environment has put it into heavy service with the Air Force and Marines. Ospreys have flown combat missions in Iraq, Libya, Kuwait and Afghanistan. But the aircraft has also had a fair number of problems because of the difficulty of maintaining it. According to a report in Seapower Magazine, in 2019 four out of every 10 CV-22 Ospreys in active service were not available for combat. A variety of reasons were cited, but maintenance issues were a primary cause.
A company called GridRaster is trying to improve the situation with augmented reality (AR) and virtual reality (VR) programs designed for the crews charged with maintaining the aircraft. The company is working with the Air Force Special Operations Command on this effort.
I was able to interview GridRaster Co-Founder and COO Dijam Panigrahi about bringing technologies like AR and VR, which are most often used in video games like “Pokemon” or “The Witcher” these days, into the practical world of aircraft maintenance.
Nextgov: Most VR platforms require a headset and lots of other special gear, while AR is able to run on a smartphone or a tablet. But you have found a way to support both technologies through one platform?
Panigrahi: Yes, our platform can support both AR and VR applications. We achieve this by providing a unified API. For VR, we are able to stream experiences over the network at ultra-low latency and in very high fidelity, ensuring realism. For AR, we also utilize the camera feeds and provide high-precision alignment of virtual objects in the physical world.
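GridRaster's actual API is not public, but the idea of one interface serving two very different modes can be sketched in a few lines. In this hypothetical Python example (all names are illustrative assumptions, not GridRaster's code), a VR session streams a fully rendered virtual scene, while an AR session composites aligned virtual objects over the device's camera feed:

```python
# Hypothetical sketch of a unified AR/VR session interface.
# "UnifiedXRSession" and its methods are invented for illustration.

class UnifiedXRSession:
    def __init__(self, mode):
        # One API, two rendering modes.
        assert mode in ("vr", "ar"), "mode must be 'vr' or 'ar'"
        self.mode = mode

    def next_frame(self, camera_feed=None):
        if self.mode == "vr":
            # VR: the entire scene is virtual, rendered server-side
            # and streamed to the headset at low latency.
            return {"layers": ["virtual_scene"]}
        # AR: virtual objects are aligned against the live camera image,
        # so the device's camera feed is a required input.
        assert camera_feed is not None, "AR needs the device camera feed"
        return {"layers": [camera_feed, "aligned_virtual_objects"]}

vr_frame = UnifiedXRSession("vr").next_frame()
ar_frame = UnifiedXRSession("ar").next_frame(camera_feed="rgb_frame")
print(vr_frame, ar_frame)
```

The design point the sketch captures is that the caller sees one interface; only the frame-composition path differs between the two modes.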
Some examples of suitable applications are pilot training in an ultra-realistic virtual reality environment, and aircraft maintenance using augmented reality.
Nextgov: What is the secret to making it work with both technologies, given that they are quite different? In VR, users experience a fully virtual world where you could almost forget it's a rendered environment. Augmented reality, by contrast, is rooted in the real world, which users see through their device's camera. The rendering happens on top of that, whether it's monsters in a game or the technical specifications of a CV-22 Osprey part for maintenance crews.
Panigrahi: GridRaster has done that by building a strong portfolio of patents around its core technical breakthroughs, which are embodied in three key components of the platform.
The first is 3D spatial mapping. By streaming raw camera data, including RGB color and, where available, depth data, from the device to the GridRaster server, we achieve high-fidelity 3D scene reconstruction, scene segmentation and 3D object recognition using 3D vision and deep learning-based AI.
The second is 3D AI computer vision. This is where semantic segmentation and object identification are performed on the reconstructed 3D world, and objects of interest are registered against a database of 3D models and digital twins.
Finally, we use low-latency rendering. GridRaster's rendering component renders the 3D object or scene at a high frame rate and low motion-to-photon latency, using on-device predictive rendering and reconstruction. The real-world spatial map is continually updated using 3D computer vision, and the model is rendered precisely over the corresponding object as required.
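The three components form a pipeline: reconstruct the scene from streamed camera frames, register recognized objects against a library of digital twins, then anchor the rendered model at the real object's pose. A minimal Python sketch of that flow follows; every name and data structure here is an assumption made for illustration, with the real reconstruction and vision stages reduced to trivial stand-ins:

```python
# Hypothetical three-stage pipeline: spatial mapping -> twin
# registration -> render-pose computation. Not GridRaster's code.

from dataclasses import dataclass
from typing import Optional

@dataclass
class Frame:
    rgb: bytes                    # raw color pixels streamed from the device
    depth: Optional[list] = None  # depth samples, when the sensor provides them

@dataclass
class SceneObject:
    label: str
    centroid: tuple  # (x, y, z) position in the reconstructed scene

def reconstruct_scene(frames):
    """Stage 1 stand-in: real systems run 3D reconstruction and
    segmentation here; we pretend each frame yields one object."""
    return [SceneObject(label=f"part-{i}", centroid=(float(i), 0.0, 0.0))
            for i, _ in enumerate(frames)]

def register_against_twins(objects, twin_library):
    """Stage 2: match recognized objects to known digital-twin models."""
    return {o.label: twin_library[o.label]
            for o in objects if o.label in twin_library}

def render_pose(scene_object):
    """Stage 3: the overlay is anchored at the object's real-world pose,
    within a tight motion-to-photon latency budget."""
    return {"anchor": scene_object.centroid, "latency_budget_ms": 20}

twins = {"part-0": "wiring-harness-model-v1"}  # hypothetical twin library
frames = [Frame(rgb=b"...")]                   # one streamed camera frame
objects = reconstruct_scene(frames)
matches = register_against_twins(objects, twins)
pose = render_pose(objects[0])
print(matches, pose)
```

The sketch's point is the data flow: camera frames go up to the server, recognized geometry comes back matched to a model, and the render stage only needs a pose and a latency budget.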
Nextgov: Why is the Air Force looking to help its technicians working in Osprey maintenance programs?
Panigrahi: The complexity of aircraft wiring for the CV-22 Osprey continues to increase with the fielding and deployment of advanced communications, integrated aircraft survivability systems and aircraft data collection systems. Maintenance professionals are required to know all variants of the systems. Technicians must maintain extraordinarily complex wiring installations, routing, clamp and abatement placements. Even the slightest mistake comes with extreme safety risk and the potential loss of the aircraft.
The CV-22 nacelle wiring has been a major readiness degrader, with 90% of maintenance work being performed in relation to the nacelle. Air Force Special Operations Command has been looking for solutions to provide improved accuracy and speed of field installations and a simplified approach to sustained maintenance for wiring harnesses.
Augmented reality technologies are a game changer in achieving mission readiness goals.
Nextgov: The military version of the maintenance program is still being tested and evaluated, but you have a civilian version being used right now. In that system, maintenance crews are able to see proper wiring configurations rendered on their devices and then line them up over the real thing using onboard cameras to check their work and get assistance when needed. Has this improved the accuracy of technicians?
Panigrahi: Absolutely. The medium is such a powerful tool that it can help a novice perform at a more expert level, because all the instructions are overlaid right in front of them along with all of the contextual information. In similar scenarios at a large aerospace and defense company, we have seen technicians with barely a year of experience outperform technicians with four or five years of experience. This is also an amazing tool for bridging the skills gap that has been building up in most industrial sectors, such as manufacturing.
Nextgov: Being on the cutting edge of this effort, do you have any predictions about where AR and VR technology will go in terms of military applications in the future?
Panigrahi: AR and VR have already been established as a great medium for modeling, simulation and training. And AR will be used in an even bigger way to improve the sustainment of aircraft. As the technology evolves further, we are going to see many other use cases.
I’m particularly excited about mixed-reality holodecks, where senior officers and decision-makers back at central command centers can teleport themselves to remote outposts, with access to all the contextual information there, and make collaborative decisions on missions. We are not too far from it. I think it will be possible over the next three to four years.
John Breeden II is an award-winning journalist and reviewer with over 20 years of experience covering technology. He is the CEO of the Tech Writers Bureau, a group that creates technological thought leadership content for organizations of all sizes. Twitter: @LabGuys