The Problem With Police De-Escalation Training
Law enforcement relies on role-playing for de-escalation training: departments gather their entire force for a day, hire actors, rent out a warehouse, and re-enact a series of scenarios to practice de-escalation skills. A single session can cost up to $300k and be very disruptive to a department.
For this reason, officers experience scenario-based training too infrequently, so the skills they acquire are not reinforced through regular practice. In addition, the progression of each scenario depends on the choices of an actor or a trainer controlling the simulation. Human factors like these affect the quality of the experience and leave it vulnerable to the biases and quirks of each trainer.
By combining Virtual Reality with Artificial Intelligence, we can create more affordable, unbiased, and effective de-escalation training to reduce unnecessary use of force by law enforcement. State-of-the-art VR hardware can be purchased for under $5k, and rather than needing a large facility for live training, departments only need a single room to house a virtual training environment. In addition, an AI-controlled system ensures a consistent, repeatable training experience without the biases and judgments of a human actor or operator.
In early 2018, there were two distinct efforts to explore virtual reality and de-escalation training led by the Research and Engineering teams respectively.
The Research team had contracted L2D to prototype several experiences exploring ambulation, arousal, and tool interactions. These were built as individual projects designed to learn about VR. Simultaneously, the Engineering team had contracted another digital agency, Digital Domain, to build three scenarios (a house, a store, and a bus stop) and three characters to develop de-escalation AI models.
This approach created a rift between teams and deprived them of a shared sense of momentum. In addition, the product lacked continuity, which confused users and decreased their trust in our ability to deliver a cohesive experience. To address this, we worked with both teams and both digital agencies to design a single framework to host the experiments and the simulation scenarios.
We call it Trainer, a VR de-escalation tool composed of three key parts:
- Onboarding: Teach officers how to interact with digital objects and navigate a virtual environment.
- Scenarios: Practice conversational de-escalation by naturally speaking with AI characters in real-life simulations.
- Dashboard: Real-time and longitudinal analytics for officers and their instructors to understand and improve their behavior over time.
Deep Dive: Onboarding
Police officers are not used to interacting with cutting-edge technology in their workplace. Many will experience VR for the first time through Trainer, and although they are initially very excited about the idea of using VR for training, they're unfamiliar with the basic concepts of interacting with immersive media.
We designed an experience to gradually introduce officers to VR. It is presented as an environment familiar to officers: a police department with three floors. Each floor builds on the last, introducing the concepts officers need to understand VR.
In order to achieve near transfer of skills, we decided to build our tracking systems to map an officer's real-world movements to in-experience movement, reducing the need for controller-based locomotion. This ensured the highest level of immersion in the experience, which is essential to produce natural reactions.
The biggest benefit of the natural movement paradigm is that we didn’t have to teach officers how to move in the virtual environment. They simply walked in the physical world and we matched their movement in the virtual environment. However, this required police departments to dedicate a room to training. After conducting user research, we learned that the average police department could dedicate an 11-by-14-foot room.
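That room-size constraint became a hard design input for every scenario. A minimal sketch of the check this implies, assuming a simple rectangular walkable area and allowing either orientation of the room (the constants and function name are illustrative, not from the actual Trainer codebase):

```python
# Validate that a scenario's walkable area fits the average room a police
# department can dedicate (11 x 14 ft, per our user research).
FT_TO_M = 0.3048
ROOM_W_M = 11 * FT_TO_M  # ~3.35 m
ROOM_D_M = 14 * FT_TO_M  # ~4.27 m

def fits_in_room(walkable_w_m: float, walkable_d_m: float) -> bool:
    """True if the walkable rectangle fits the room in either orientation."""
    return ((walkable_w_m <= ROOM_W_M and walkable_d_m <= ROOM_D_M) or
            (walkable_w_m <= ROOM_D_M and walkable_d_m <= ROOM_W_M))
```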
Once we understood the size of the room, we encountered another challenge: the walkable area in the virtual environment was limited to the size of the physical room. To address this, we worked with environment designers to creatively block out walkable areas with objects in the virtual environment in order to prevent officers from walking into walls. Another challenge introduced by natural movement was aligning the virtual and physical environments with each other. This meant we needed a way to calibrate the physical environment against the virtual one.
To achieve calibration, we used a location in the physical space as the Start Area for every virtual environment. This enabled the system to have a consistent reference point to understand where the user is located in relation to the virtual and physical objects.
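In essence, the Start Area gives every scenario a shared origin. A minimal sketch of that idea, assuming a flat 2D floor plane and translation-only alignment (the names `calibrate_offset` and `to_virtual` are illustrative, not from the actual Trainer codebase):

```python
from dataclasses import dataclass

@dataclass
class Vec2:
    x: float
    z: float  # floor plane: x = left/right, z = forward/back

def calibrate_offset(physical_start: Vec2, virtual_start: Vec2) -> Vec2:
    """Offset that maps tracked physical positions into the virtual scene.

    Standing in the physical Start Area should place the user exactly at the
    scene's virtual Start Area, giving a consistent reference point."""
    return Vec2(virtual_start.x - physical_start.x,
                virtual_start.z - physical_start.z)

def to_virtual(tracked: Vec2, offset: Vec2) -> Vec2:
    """Apply the calibration offset to a tracked headset position."""
    return Vec2(tracked.x + offset.x, tracked.z + offset.z)

# Example: physical Start Area at (1.0, 2.0) m in tracking space,
# virtual Start Area at (5.0, 5.0) m in the scene.
offset = calibrate_offset(Vec2(1.0, 2.0), Vec2(5.0, 5.0))
print(to_virtual(Vec2(1.0, 2.0), offset))  # the user standing on the Start Area
```

A real implementation would also align rotation (which way the user faces on the Start Area), but the reference-point principle is the same.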
In order to keep the walkable area within the size of the average available space in a police department, we decided to divide Onboarding into a few discrete scenarios. Each scenario introduces a new concept needed to complete Onboarding.
To navigate between scenarios, we designed a virtual elevator that appeared around the user as they stepped into the start location. Vertical movement allowed users to traverse between virtual scenarios while preserving the same physical position in the real world. The elevator metaphor also enabled us to treat each ‘floor’ as a standalone environment and the start location as the hub where users would enter and exit each scenario. Once inside the start location, a console appeared for choosing a scenario.
Designing an interface to select a scenario proved to be very challenging. Limited depth perception in VR makes interacting with 2D UI very difficult. Users often don’t know if they’ve reached out far enough to interact with a screen in virtual reality. For this reason, interacting with 3D objects in VR is preferred over 2D interfaces.
With that in mind, the first approach we tried was arranging each floor as a physical button in the console. The buttons reacted to touch and clearly conveyed to the user that they’d been pressed, addressing the depth perception limitation. However, as we added floors to Onboarding, we quickly ended up with a wall of buttons that felt clunky and unsophisticated.
We didn’t want the design of the virtual console to limit the number of floors we could add to the experience. So we asked ourselves: could we design an interface that provides tactile feedback while also supporting a potentially infinite number of floors?
After a few rounds of exploration, we landed on a console that had two very tactile methods of interaction: a physical dial and a button. Turning the dial controlled a digital screen, which displayed the scenario name and floor number. Once a scenario was selected, users pressed the button to travel to it.
This approach allowed us to create an interface that felt tactile and reliable in VR while also being flexible and extensible enough to support a large number of scenarios in the future.
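The interaction logic behind the console is simple: the dial scrolls through an unbounded list of floors, the screen reflects the highlighted entry, and the button confirms travel. A minimal sketch of that behavior (class and method names are assumptions, not from the actual Trainer codebase):

```python
class Console:
    """Dial-and-button scenario selector: tactile, yet extensible."""

    def __init__(self, floors):
        self.floors = list(floors)  # any number of scenarios can be added
        self.index = 0

    def turn_dial(self, steps: int) -> str:
        """Rotate the dial; return what the digital screen would display."""
        self.index = (self.index + steps) % len(self.floors)
        return f"{self.index + 1}: {self.floors[self.index]}"

    def press_button(self) -> str:
        """Confirm the selection and travel to the highlighted floor."""
        return self.floors[self.index]

console = Console(["Introduction", "Learning Interactions",
                   "Competitive Expectations"])
console.turn_dial(2)
print(console.press_button())  # -> Competitive Expectations
```

Because selection state lives in a list rather than in a fixed grid of buttons, adding a floor is one line of data, not a redesign of the console.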
Floor 1: Introduction
While in VR, any interaction a user has with the real world is a significant break in presence. So, we wanted to ensure officers could go through Onboarding without any help from instructors in the physical space. To achieve this, we created a virtual moderator to guide the user along the experience.
We experimented with humanoid and non-humanoid characters. Humanoid characters quickly fell into the uncanny valley, while non-humanoid characters introduced an unnecessary element of science fiction that pulled people away from the realism of the experience.
Making the environment feel as real as possible was key to our success, and a virtual character that was anything short of perfectly real would have broken the immersion. For this reason, we decided to make the moderator audio-only, avoiding unnecessary distractions in the experience.
In order to bring the moderator to life, we spent a lot of time writing a script and recording the voice-over that would convey the appropriate tone and guide officers along each Onboarding task.
To address the novelty effect common in first-time VR experiences, we designed an initial environment that encouraged unbounded exploration. We scattered objects around the space and instructed users to interact with those objects at their own pace.
We placed iPad-like screens around the space with instructions for every interaction. We noticed that officers really enjoyed playing with the environment as they learned how to engage with objects in Virtual Reality.
Floor 2: Learning Interactions
Interacting with virtual objects is necessary for officers to feel a sense of agency in the virtual environment. In order to do this, officers need to learn how to use VR controllers, which may be foreign to them, as this is likely their first experience in VR.
For simple tasks like grabbing and poking, we had the moderator guide the user in Floor 1. For more complex tasks that required interacting with the controller’s buttons, we designed a second floor where they were gradually introduced to their tools: radio, flashlight, baton, and firearm. The moderator instructs the officer to pick up, use, and holster each tool.
With every tool they used, officers learned a different way of using the physical controllers. For example, the flashlight taught them how to use the trackpad, while the firearm taught them how to use the trigger. After using each tool, officers were instructed to holster it in their virtual tool belt, which introduced them to the concept of virtual embodiment, further increasing their feeling of immersion. By the end of Floor 2, officers clearly understood how to use the controllers to interact with virtual objects and had a sense of embodiment that enabled them to react naturally in VR.
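The floor's curriculum boils down to a mapping from each tool to the controller input it exercises. A sketch of that mapping, where only the flashlight and firearm bindings come from the text above; the radio and baton bindings are assumptions for illustration:

```python
# Which physical controller input each virtual tool teaches in Floor 2.
TOOL_INPUT = {
    "radio": "grip",           # assumed binding, not stated in the text
    "flashlight": "trackpad",  # from the text: trackpad controls the beam
    "baton": "grip",           # assumed binding, not stated in the text
    "firearm": "trigger",      # from the text: trigger fires
}

def input_for(tool: str) -> str:
    """Controller input an officer practices when using a given tool."""
    return TOOL_INPUT[tool]
```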
Floor 3: Competitive Expectations
When officers are introduced to VR, they expect a video game in which they shoot virtual characters. Trainer, however, is a game-like experience that hands users a virtual firearm yet discourages them from using it to succeed.
To address this dichotomy, we designed a third floor that gave officers an opportunity to satisfy their expectations by allowing them to practice what they’ve learned (natural movement, voice commands, and interacting with virtual tools) to accomplish a series of tasks. They’re then given a score for the tasks they’ve completed successfully. This matched their expectations of VR and appealed to officers’ ingrained sense of competitiveness.
Each test required the user to identify a target and complete a task, like saying a command or drawing and using a tool. The environment was designed with several points of interest, prompting the user to stay on their feet and look around. Several obstructions kept officers engaged: first finding the target, then quickly interacting with the objects in their tool belt.
Additionally, we wanted officers’ state of mind during the simulation to match what it would be in the real world. We conducted several user research studies to understand arousal in VR and learned that sound and lighting play a pivotal role in achieving the intended effect. So, we designed the third level as an outdoor night scene and created a soundtrack to play during the timed tests to raise the level of intensity.
By the end of Floor 3, most officers were engaged and immersed in the virtual environment. They reacted naturally and attentively to the tasks on screen and had fun doing so.
By going through Trainer's Onboarding, officers successfully learn the basics of VR: they understand how to navigate a virtual environment, interact with digital tools, and react naturally to life-like virtual scenarios. When testing with officers and external partners, Onboarding has repeatedly stood out as an aspirational experience that sets the right tone for the rest of the product. As a result, it enabled the team to secure key partnerships with law enforcement agencies.
In addition to securing external partnerships, Onboarding was presented to Alphabet stakeholders and played a pivotal role in securing funding for the next phase of development by showing progress and a high bar for execution.