If you ask a member of the public about motion capture technology, they will probably tell you about people in body suits moving around in the dark with bright lights or reflective markers attached. However, the launch of Microsoft’s Kinect in 2010 has started to change perceptions of what motion capture can actually accomplish. Kinect has revolutionized home gaming, exercise and entertainment by replacing joysticks, mechanical motion sensors and similar devices with natural body movements that the controller interprets directly.

While this is the public face of motion capture, the technology also has a number of well-known commercial applications, such as animating characters for video games and the clinical diagnosis and treatment of conditions that affect movement. What is perhaps less well known is the growing role that motion capture is playing in military applications, ranging from training to control systems.

To take one example, Organic Motion, one of the world’s leading vendors of mocap technology, has recently been working with Lockheed Martin’s Global Training and Logistics group. Using markerless capture technology, they are able to integrate live instructors into real-time training simulations designed to improve soldiers’ ability to deal with a wide range of complex situations.

The instructors’ avatars can play the part of both insurgents and civilians, and, because they are directly controlled by experienced subject matter experts, they can respond in a lifelike manner to a soldier’s actions. Not only does this improve the simulations, it also makes them more widely available: soldiers and trainers can be in different locations anywhere in the world, reducing costs and allowing trainers to make more effective use of their time.

The drive is also on to increase the realism of military simulations by integrating other components alongside motion capture. One of the biggest challenges soldiers face, given the changing nature of modern warfare, is how to respond and keep their cool when dealing with unfamiliar cultures, languages and customs. In places like Iraq and Afghanistan, for example, they are increasingly called upon to tell combatants apart from innocent civilians on a daily basis. The Pentagon is therefore starting to look for new technologies that allow simulations to respond to physiological cues, such as a soldier’s brain waves, voice stress levels, blood pressure, heart rate and respiration rate.

By using these cues, the Pentagon hopes to make simulations respond to the emotional state of the soldier, so that trainees can practice nonlethal situations, such as resolving a family dispute among villagers, while maintaining an appropriate level of emotional control. This should also lead to a better ability to discriminate between agitated civilians and those who are intent on causing harm.
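
To make the idea more concrete, a system along these lines might collapse several physiological streams into a single stress estimate and feed that back into the scenario engine. The Python sketch below is purely illustrative: the signal names, baselines, weights and thresholds are assumptions made for this example, not details of any actual Pentagon program.

```python
# Illustrative sketch only: how a training simulation might adapt to
# physiological cues. All signal names, baselines and thresholds here
# are hypothetical, not taken from any real military system.

from dataclasses import dataclass


@dataclass
class Vitals:
    heart_rate: float        # beats per minute
    respiration_rate: float  # breaths per minute
    voice_stress: float      # 0.0 (calm) to 1.0 (highly stressed)


def stress_index(v: Vitals, resting_hr: float = 70.0, resting_rr: float = 14.0) -> float:
    """Collapse several cues into a single 0..1 stress estimate."""
    hr_term = max(0.0, (v.heart_rate - resting_hr) / 60.0)        # elevated pulse
    rr_term = max(0.0, (v.respiration_rate - resting_rr) / 20.0)  # rapid breathing
    score = 0.4 * hr_term + 0.3 * rr_term + 0.3 * v.voice_stress
    return min(1.0, score)


def adapt_scenario(stress: float) -> str:
    """Choose how agitated the simulated villagers behave, based on trainee stress."""
    if stress < 0.3:
        return "villagers remain cooperative"
    elif stress < 0.7:
        return "villagers grow impatient and raise their voices"
    return "crowd becomes openly hostile; tribal leader breaks off talks"


if __name__ == "__main__":
    reading = Vitals(heart_rate=105, respiration_rate=22, voice_stress=0.5)
    print(adapt_scenario(stress_index(reading)))
```

The point of the feedback loop is simply that the scenario becomes harder or easier depending on how well the trainee keeps their composure, which mirrors the adaptive behaviour described in the Pentagon’s own objective above.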

According to the Pentagon, the objective of this initiative is that “Trainees will be able to speak to and interact at any level with indigenous non-player characters (NPC), complete with voice recognition, speech, and facial gestures. The characters will react according to how the trainee interacts with them. Further the game will track how the local population reacts to these interactions. The game will adapt to changes in local population response. For example, if a player comes in and insults the local tribal leader the game scenario will change and the trainee will find that future interactions with the local population are more difficult and more hostile.”


Military uses of motion capture technology may not be limited to enhancing training exercises. For instance, students at the University of Michigan 3D Lab have recently been developing enhanced control systems that can respond autonomously to motion in the environment. They used a motion capture system to track the precise movements of miniature helicopters in flight as they responded to manual controls. Armed with this data, they then developed a set of algorithms to control the throttle and steering of the helicopters without any need for further human intervention.
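
The team’s actual algorithms are not described here, but the general pattern is familiar from feedback control: the mocap system supplies the helicopter’s current pose, and a controller converts the error against a target pose into throttle and steering commands. The sketch below shows a deliberately simple proportional controller; the gains, names and control law are assumptions for illustration, not the Michigan group’s implementation.

```python
# Minimal sketch of the general idea: using motion-capture pose data as
# feedback to hold a small helicopter at a target altitude and heading.
# Gains, names and the simple proportional control law are illustrative
# assumptions, not the University of Michigan team's actual algorithm.

import math


class HoverController:
    def __init__(self, kp_throttle: float = 0.8, kp_yaw: float = 1.2):
        self.kp_throttle = kp_throttle  # proportional gain on altitude error
        self.kp_yaw = kp_yaw            # proportional gain on heading error

    def update(self, mocap_z: float, mocap_yaw: float,
               target_z: float, target_yaw: float) -> tuple:
        """Return (throttle_command, yaw_command) for one mocap frame."""
        throttle = self.kp_throttle * (target_z - mocap_z)
        # Wrap the heading error into [-pi, pi] so the craft turns the short way.
        yaw_error = math.atan2(math.sin(target_yaw - mocap_yaw),
                               math.cos(target_yaw - mocap_yaw))
        yaw = self.kp_yaw * yaw_error
        return throttle, yaw


# Example: one control step from a single (hypothetical) mocap measurement.
ctrl = HoverController()
print(ctrl.update(mocap_z=0.9, mocap_yaw=0.1, target_z=1.2, target_yaw=0.0))
```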

In one demonstration of the power of this technology, they instructed two of the helicopters to protect one of the researchers. Once programmed, the two helicopters were able to follow the researcher at a constant distance as he walked around the room. While this sort of research is clearly in its infancy, the broader implications for autonomous battlefield operations are obvious, although such technology raises significant ethical and operational questions that may delay its introduction or limit the scope of its use.
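
As a rough illustration only, the “follow at a constant distance” behaviour can be sketched as a simple standoff controller driven by mocap positions. The 2D geometry, gain and 1.5 m standoff distance below are assumptions made for the example, not the researchers’ code.

```python
# Sketch of a "follow at a constant distance" behaviour driven by mocap
# positions. The 2D geometry, gain and 1.5 m standoff are illustrative
# assumptions, not the researchers' implementation.

import math


def follow_target(heli_xy, person_xy, standoff=1.5, kp=0.6):
    """Return a (vx, vy) velocity command that keeps the helicopter
    roughly `standoff` metres away from the tracked person."""
    dx = person_xy[0] - heli_xy[0]
    dy = person_xy[1] - heli_xy[1]
    dist = math.hypot(dx, dy)
    if dist < 1e-6:
        return 0.0, 0.0            # avoid dividing by zero when co-located
    error = dist - standoff        # positive: too far away, close the gap
    return kp * error * dx / dist, kp * error * dy / dist


# The command always points along the line between helicopter and person,
# scaled by how far the craft is from the desired standoff distance.
print(follow_target(heli_xy=(0.0, 0.0), person_xy=(3.0, 4.0)))
```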