Extended reality (XR) is a broad term that encompasses virtual reality, augmented reality, and mixed/merged reality. Virtual reality is a completely immersive, entirely computer-generated experience for the user. Augmented reality adds information to the real environment, such as a drawing that appears in the user’s real-world field of vision. Mixed/merged reality allows virtual and real worlds to interact: virtual characters can respond to gravity, and real-world objects grasped by the user can become part of the virtual scene. While low-cost XR initially focused on gaming, today’s XR is also used in art, healthcare, manufacturing, and education. In a naval context, XR is often used in training.
The Naval Education and Training Command (NETC) is the largest shore command in the Navy, responsible for overseeing
the Navy’s force development. This includes recruiting sailors and conducting their initial and technical training. NETC
continuously evaluates the effectiveness of naval training. These evaluations have traditionally used the results of students’
quizzes, tests, and fleet feedback surveys, often supplemented with instructors’ appraisals of how well students work with
hands-on procedures using real equipment.
Watching students use real equipment is often a more desirable assessment than written tests, but physical equipment has many limitations. Equipment is sometimes unavailable or breaks if used too often, stockpiling real equipment and supplies takes up classroom space, and assessing students individually is time-consuming: a class of 30 students requires instructors to observe 30 separate hands-on demonstrations.
XR training devices, like written tests, allow the assessment of multiple students simultaneously, after or while they are receiving instruction. XR training provides similar economies of scale to written testing, but places more emphasis on application of knowledge, particularly for operating and maintaining equipment.
Despite the high potential of XR for assessing performance, three questions remain unanswered:
- Can current XR systems provide reliable, useful performance metrics that students and instructors accept?
- If current XR systems do not yet provide acceptable metrics, what obstacles are hindering the development of this capability?
- How can the Navy foster the development of acceptable XR performance metrics?
NETC asked CNA to address these questions and develop a “roadmap” of recommendations for the future. To do so, CNA examined a high-quality prototype XR trainer that instructs underwater divers on emergency procedures (EPs) and collects data on their EP performance. We observed divers using the XR trainer, interviewed instructors, performed a literature review, held discussions with XR subject matter experts across the Department of Defense, and analyzed the data collected by the XR device. Classes of 2 to 20 students were each given a two-hour session with the prototype.
The prototype, called UBASIM, provides more immersive, realistic indicators of emergency situations than can be simulated in a pool. Using replicas of real equipment with embedded sensors, UBASIM allows divers to develop “muscle memory” of appropriate EP responses. Divers can physically turn dials and valves, getting realistic feedback from their actions. They can learn to recognize emergency situations by consulting simulated sensor readings, and then see the results of their actions in real time.
For example, the diver can look at the sensor-laden secondary display through the goggles to help diagnose the situation. If the diver responds correctly, sensor readings will indicate that their actions are improving the situation. Conversely, inappropriate actions can cause worsening sensor readings. Wrong actions can even lead to simulated symptoms of oxygen toxicity, such as tunnel vision, ringing in the ears, or unconsciousness — represented by images and sounds on the screen and headphones. (See photos.) In other words, the XR prototype provides a more dynamic training and assessment environment. Previously, students memorized a sequence of EP steps, were quizzed on them, and then were asked to demonstrate the procedures in a pool when shown a particular emergency situation on a card.
UBASIM has three modes:
Crawl mode allows divers to choose which EP they want to learn. It provides step-by-step instruction, and no performance data are collected.
Walk mode is more difficult: the student can still choose the EP, but must first demonstrate, through their responses, that they recognize and can diagnose which emergency has occurred, given visual inputs from the goggles and audio cues from the headphones. Hints in Walk mode are less specific than in Crawl mode, and Walk mode collects real-time metrics on diver performance.
Run mode is the most difficult: an intelligent tutoring system selects which EP the student is presented, and there are no hints or instruction. Students must diagnose the emergency on their own. Performance data are collected, and students receive an after-action report that provides remediation.
UBASIM trains and collects performance data on 21 separate diving emergency procedures. UBASIM’s developers
recommend that students go through all the EPs in Crawl mode, then all of the EPs in Walk mode, before attempting Run
mode. Performance data collected include:
- whether steps were performed
- when they were performed
- whether the diver made life-threatening safety errors
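To make the data concrete, the per-step measures listed above could be represented by a record like the following. This is a minimal illustrative sketch, not UBASIM's actual schema: the field names, the `StepResult` type, and the pass rule in `session_passed` are all hypothetical.

```python
from dataclasses import dataclass
from typing import List, Optional


@dataclass
class StepResult:
    """One EP step as a Walk or Run session might log it (hypothetical schema)."""
    step_name: str
    performed: bool                # whether the step was performed
    timestamp_s: Optional[float]   # when it was performed (seconds into the session)
    safety_error: bool = False     # whether a life-threatening safety error occurred


def session_passed(steps: List[StepResult]) -> bool:
    """Illustrative pass rule: every step performed, with no safety errors."""
    return all(s.performed for s in steps) and not any(s.safety_error for s in steps)
```

Under this sketch, a session with an unperformed step or a flagged safety error would be scored as a failure, mirroring the pass/fail outcomes the prototype reported for Walk and Run mode sessions.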
Unfortunately, the XR prototype recorded very low student performance, with divers passing fewer than 20 percent of Walk and Run mode sessions. The prototype probably was not always recording actual performance reliably and accurately, for several reasons: there were sensor difficulties, the instructor-taught order of steps sometimes differed from the order assumed by the prototype, and the system had trouble with steps performed simultaneously. The developer is working to address these issues.
Cleared for public release.
- Document Number: DMM-2023-U-034596-Final
- Publication Date: 1/17/2023