Dr Jeremy Walker explains how digital mapping and novel visualisation tools can augment the CBRN mission.

Traditional CBRN reconnaissance and hazard mitigation missions entail warfighters in burdensome personal protective equipment (PPE) bringing cumbersome equipment downrange into the hot zone to search for contamination. This mission requires painstaking effort to detect and collect samples, to mark where they were found, and in some cases to bring confirmatory measurement analytics downrange to identify threats.

While significant improvements are being made in sensing to facilitate these missions, they still require the warfighter to spend considerable time close to contamination without a clear picture of where it is and how extensively it may have spread.

All this is about to change. Teledyne FLIR is helping to modernise the CBRN mission by developing new integrated capabilities that will protect warfighters from harm and improve their ability to complete the mission by taking advantage of unmanned aerial systems (UAS) and unmanned ground systems (UGS) equipped with CBRN sensors.

These capabilities will enable the warfighter to have improved situational awareness of CBRN threats while remaining at a safe distance from exposure.

DTRA contract
Teledyne FLIR was recently awarded a contract from the US Defense Threat Reduction Agency Joint Science & Technology Office (DTRA JSTO). Teledyne FLIR will develop tools for digitally mapping CBRN sensor data and fusing it with real-world, geo-registered positional data or LIDAR-generated 3D maps.

This will also enable visualisation of these threats via insertion of a novel Augmented Reality (AR) Plug-In within the Army’s Android Team Awareness Kit (ATAK), a situational awareness tool that enables operators to access various types of integrated mission data, such as drone video feeds, 2D terrestrial maps, and other capabilities.

One goal of the effort is to develop an AR Plug-In that will enable an operator to visualise any integrated CBRN sensor data in a see-through perspective on their phone or tablet device. Threat icons, along with information about threat class/type and range, will be visualised.
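
As a toy illustration of the range information such an overlay might display, the sketch below computes great-circle range and initial bearing from an operator's position to a geo-registered threat. The coordinates are invented for illustration, and the calculation is generic; it does not reflect any TAK or Teledyne FLIR API.

```python
import math

def range_and_bearing(lat1, lon1, lat2, lon2):
    """Great-circle range (metres) and initial bearing (degrees) from operator to threat."""
    R = 6371000.0  # mean Earth radius, metres
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dlat = math.radians(lat2 - lat1)
    dlon = math.radians(lon2 - lon1)
    # Haversine formula for the range
    a = math.sin(dlat / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlon / 2) ** 2
    rng = 2 * R * math.asin(math.sqrt(a))
    # Initial bearing, normalised to 0-360 degrees
    y = math.sin(dlon) * math.cos(p2)
    x = math.cos(p1) * math.sin(p2) - math.sin(p1) * math.cos(p2) * math.cos(dlon)
    brg = (math.degrees(math.atan2(y, x)) + 360) % 360
    return rng, brg

# Hypothetical operator position and a reported detection roughly 1 km due north
rng, brg = range_and_bearing(38.8895, -77.0352, 38.8985, -77.0352)
```

An AR layer would pair results like these with a threat icon and class/type label at the corresponding screen position.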

The Teledyne FLIR team will additionally enable the same information to be utilised on a Heads-Up Display (HUD), the goal being to implement the capability on the Army’s Integrated Visual Augmentation System (IVAS) HUD for future CBRN missions.

NBCRV SSU
An example of the use of UAS and UGS for CBRN is the new Nuclear Biological Chemical Reconnaissance Vehicle Sensor Suite Upgrade (NBCRV SSU) being modernised by Teledyne FLIR as the prime contractor. The platform represents a first-ever integration of CBRN sensors nested with unmanned platforms compatible with a command and control (C2) system.

The Stryker sensor suite combines standoff and point CBRN sensor information on the platform or the autonomous systems and transmits digital warning messages through the C2 system to warn follow-on forces of the presence and approximate location of battlefield contamination.

One can readily imagine the potential to further enhance the mission by adding AR capabilities to enable the mounted reconnaissance team to more intuitively pilot the unmanned systems, or to get digital AR overlays of where contamination is to improve precision.

This digital data package can be further leveraged by the mobile force to enable contamination avoidance, or by dismounted reconnaissance or hazard mitigation teams that may need to further exploit or decontaminate the threat.

The ability to view precisely where the contamination is will accelerate the execution of sensitive site assessment (SSA) and sensitive site exploitation (SSE) missions. This is especially useful in the case where a robotic vehicle (such as those being fielded as part of the MTRS Increment 2 Program of Record) could enter a building remotely carrying CBRN sensor payloads and an area mapping tool, such as a LIDAR puck, to find and tag all contamination in a building before an operator even enters.

The remotely piloted UGS could transmit this ‘digital threat map’ back to operators, who can then have a precise understanding of exactly where the threats of concern are before they go downrange.
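
One plausible way to carry such a digital threat map is as a series of geo-tagged messages. TAK systems commonly exchange Cursor-on-Target (CoT) XML events; the sketch below builds a minimal, hypothetical hazard event. The type code, detail schema, and field values are assumptions for illustration, not the programme's actual message definitions.

```python
import xml.etree.ElementTree as ET
from datetime import datetime, timedelta, timezone

def cot_hazard_event(uid, lat, lon, agent, concentration):
    """Build a minimal Cursor-on-Target (CoT) XML event for one geo-tagged detection.

    The event type code and the <hazard> detail element are illustrative
    placeholders; a fielded system would follow its C2 integration's schema.
    """
    now = datetime.now(timezone.utc)
    stale = now + timedelta(minutes=10)          # how long the report stays valid
    fmt = "%Y-%m-%dT%H:%M:%SZ"
    event = ET.Element("event", version="2.0", uid=uid,
                       type="a-h-G",             # hypothetical type code
                       time=now.strftime(fmt), start=now.strftime(fmt),
                       stale=stale.strftime(fmt), how="m-s")
    ET.SubElement(event, "point", lat=f"{lat:.6f}", lon=f"{lon:.6f}",
                  hae="0.0", ce="10.0", le="10.0")
    detail = ET.SubElement(event, "detail")
    ET.SubElement(detail, "hazard", agent=agent, concentration=str(concentration))
    return ET.tostring(event, encoding="unicode")

# One hypothetical nerve-agent detection reported by the UGS
xml_msg = cot_hazard_event("threat-001", 38.889500, -77.035200, "GB", 0.002)
```

A stream of such events, one per tagged detection, would let every TAK client on the network render the same threat picture.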

Visual data, shared through the TAK suite, can be visualised by the operators on devices or on a HUD as they execute their mission, thereby improving their ability to locate, sample and collect threats while minimising exposure.

CIDAS
Another key deliverable of the DTRA-funded development effort is a capability to digitally map chemical contamination for the decontamination mission. The Contamination Indicator and Decontamination Assurance System (CIDAS) is a technology developed by Teledyne FLIR that uses biotechnology-driven chemistries to detect chemical agents such as nerve agents and sulphur mustard (blister agent) on surfaces via a localised colour change.

The chemistry is applied to a vehicle via spraying, and turns colour within a few seconds or minutes to reveal the location of contamination. The chemistry is several orders of magnitude more sensitive than any electronic technology, enabling it to ‘see’ contamination that exceeds threshold decontamination requirements.

This spray significantly improves the speed and efficiency of the decon process, but still requires an operator in the loop to spray CIDAS or to apply the decontaminant.

Applying algorithms
To augment the warfighter’s capability to rapidly perform decontamination operations, Teledyne FLIR is leveraging deep learning development to build algorithms that will enable automated detection of the CIDAS positive indications. This AI-trained capability will be fused with 3D mapping to render a high-fidelity digital contamination map of a vehicle.
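
The production capability described above relies on trained deep-learning models; as a much simplified stand-in, the sketch below flags pixels whose colour falls in a fixed, assumed "positive indication" band and reports the flagged fraction of a surface. The threshold values and the synthetic image are invented for illustration.

```python
import numpy as np

def positive_indication_mask(rgb, r_min=150, g_max=110, b_max=110):
    """Flag pixels whose colour falls in an assumed 'positive indication' band.

    A fielded system would use a trained detection network; this fixed RGB
    threshold only illustrates the pixel-classification step.
    """
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    return (r >= r_min) & (g <= g_max) & (b <= b_max)

# Synthetic 4x4 'surface' image: mostly grey, with one patch of indicator colour
img = np.full((4, 4, 3), 120, dtype=np.uint8)
img[1:3, 1:3] = [200, 40, 40]          # 2x2 contaminated patch
mask = positive_indication_mask(img)
coverage = mask.mean()                  # fraction of the surface flagged
```

Fusing a per-pixel mask like this with the 3D map of the vehicle is what yields the digital contamination map.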

This capability can be incorporated into a suite of robotic platforms capable of autonomously performing the decontamination mission using simple optical payloads, combined with articulating robotic arms that work in concert to apply CIDAS, apply decontaminants to mitigate the threats, and rinse the equipment off with water.

The entire system can be remotely monitored by warfighters to ensure success while removing the immediate risk of contamination via contact with the surfaces. This full automation of the process will augment the chemical company’s capability to return vehicles to the mission far sooner while keeping chemical soldiers safe.

“Our Warfighters need an advantage when operating in a CBRN environment. Fighting an unseen threat is tough. CBRN threat mapping and AR/VR visualisation will give our team an advantage and negate the effects of contamination. We can keep Warfighters out of the hazard.”
COL (RET) JAY RECKARD, TELEDYNE FLIR CBRN BUSINESS DEVELOPMENT

A critical element to long-term success is the development of good virtual reality (VR) and mixed reality (MR)-based training tools.

These will increase the warfighter’s level of comfort with the technology by enabling low-burden training by individuals and groups of warfighters on tasks at all levels, including equipment operation, individual tactics, techniques and procedures (TTPs) and team mission execution. These technologies can also serve as valuable tools for after action reviews (AARs) and other mission scenario planning tasks.

Dr. Jeremy Walker is the Director of Science & Technology for Teledyne FLIR Unmanned & Integrated Systems. He has a PhD in Chemistry and has been with Teledyne FLIR since 2006 developing innovative new technologies, including the CIDAS chemical agent disclosure spray. Dr. Walker currently spearheads collaborative R&D work with many DoD agencies including DTRA, DARPA, and the Army Research Office. He currently leads the research and innovation pipeline for Teledyne FLIR.

Image:
For a sensitive site assessment mission, a radiological sensor could be carried downrange by an operator or on an unmanned ground robot to map the radiological measurements and generate a localised heatmap. The augmented reality radiation heatmap shown here is created from readings of a Teledyne FLIR R425 paired with an application running on the Microsoft HoloLens 2 for visualisation and interaction.
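
The mapping step described in the caption can be sketched as interpolating scattered dose-rate readings onto a grid. The inverse-distance weighting below, and all positions and readings, are invented for illustration; the fielded application may use a different spatial model.

```python
import numpy as np

def idw_heatmap(xs, ys, readings, grid_n=50, extent=10.0, power=2.0):
    """Interpolate scattered dose-rate readings onto a grid with inverse-distance weighting."""
    gx, gy = np.meshgrid(np.linspace(0, extent, grid_n),
                         np.linspace(0, extent, grid_n))
    # Distance from every grid cell to every measurement point
    d = np.sqrt((gx[..., None] - xs) ** 2 + (gy[..., None] - ys) ** 2)
    w = 1.0 / np.maximum(d, 1e-6) ** power   # clamp to avoid division by zero at sample points
    return (w * readings).sum(axis=-1) / w.sum(axis=-1)

# Three hypothetical readings (uSv/h) at known positions in a 10 m x 10 m space
xs = np.array([2.0, 5.0, 8.0])
ys = np.array([2.0, 8.0, 3.0])
readings = np.array([0.5, 12.0, 1.5])
heat = idw_heatmap(xs, ys, readings)
```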

©Teledyne FLIR