Ford Motor Company is the only automaker to use the same type of motion-capture software in its virtual engineering labs as Hollywood employs to create such motion pictures as “Avatar,” “Lord of the Rings,” “Cloudy with a Chance of Meatballs” and “The Polar Express.”
— Ford uses motion-capture software for a variety of automotive applications, including the study of human movement for better ergonomics, immersive virtual driving experiences, and virtual manufacturing.
— The technology is the latest innovation at Ford to help engineers optimize vehicle design, comfort and ergonomics. The work from this technology is being applied to future products.
Leaping into the world of virtual engineering, Ford Motor Company now employs the same type of motion-capture technology used to create films like “Avatar,” “Lord of the Rings” and “Shrek” to design vehicles that are more comfortable and enjoyable to drive.
Motion-capture, a technology that digitally captures movement, is used by Hollywood computer animators and video game designers to make nonhuman characters appear more lifelike. Ford uses the technology to create realistic digital humans that engineers use to test vehicles in the virtual world. Ford is the only automaker to use motion-capture software in this way for vehicle design.
“Just like in the movies, we hook people up with sensors to understand exactly how they move when they are interacting with their vehicles,” said Gary Strumolo, manager, Ford research and engineering. “Once we have all that motion captured, we create virtual humans that we can use to run thousands of tests that help us understand how people of all sizes and shapes interact with all kinds of vehicle designs. It’s an incredibly efficient way of engineering tomorrow’s vehicles.”
Capturing motion in movies and cars
One of the newest ways Ford is using motion-capture technology is through a system called Human Occupant Package Simulator (HOPS), which combines motion-capture software with a special test vehicle to measure and evaluate body motion.
A human test subject is outfitted with up to 50 motion-capture sensors. The test subject performs a series of movements, such as swinging a leg outside of the vehicle or reaching for the seat belt. The sensors record the trajectories of the test subject’s movements. The recordings are then loaded into a database to create digital human models.
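The capture pipeline described above — sensor trajectories recorded per movement, then stored for model building — can be sketched roughly as follows. All names here (`MarkerFrame`, `CaptureSession`, the sensor and movement labels) are hypothetical illustrations, not Ford's actual HOPS software:

```python
# Minimal sketch of a motion-capture recording session: per-sensor
# trajectories for one subject performing one movement.
from dataclasses import dataclass, field

@dataclass
class MarkerFrame:
    """Position of one motion-capture sensor at one instant (metres)."""
    t: float
    x: float
    y: float
    z: float

@dataclass
class CaptureSession:
    """Trajectories for the (up to 50) sensors worn by one test subject."""
    subject_id: str
    movement: str                                # e.g. "reach_for_seat_belt"
    tracks: dict = field(default_factory=dict)   # sensor name -> [MarkerFrame]

    def record(self, sensor: str, frame: MarkerFrame) -> None:
        self.tracks.setdefault(sensor, []).append(frame)

    def trajectory_length(self, sensor: str) -> float:
        """Total path length of one sensor: a simple objective measure
        that a database of such sessions could aggregate into models."""
        pts = self.tracks[sensor]
        return sum(
            ((b.x - a.x) ** 2 + (b.y - a.y) ** 2 + (b.z - a.z) ** 2) ** 0.5
            for a, b in zip(pts, pts[1:])
        )

# Usage: record a wrist sensor during a seat-belt reach, then query it.
session = CaptureSession("subject_01", "reach_for_seat_belt")
for i in range(3):
    session.record("right_wrist", MarkerFrame(t=i * 0.1, x=0.1 * i, y=0.0, z=0.0))
print(round(session.trajectory_length("right_wrist"), 3))  # 0.2
```

A real system would persist many such sessions across subjects of different sizes and derive statistical digital human models from them; the sketch only shows the recording side of that pipeline.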
Ford engineers use the digital human models to evaluate movement across virtual vehicle design proposals ranging from a small car to a full-size pickup truck. The system also can be reconfigured to represent the driver and the front, second or even third-row passenger compartments.
The HOPS motion-capture technology allows engineers to apply a more scientific approach to understanding how people interact with vehicles.
“Comfort or discomfort is inherently a subjective measure,” said Nanxin Wang, Ford technical leader. “For a given vehicle, some people will say it’s comfortable to get in, while others may say just the opposite. The challenge is to find out why people feel that way and how we can change the design to improve the perception.
“Before HOPS, the only way to evaluate a given design was to have people get into a vehicle and tell us how they liked it,” Wang continued. “This took lots of time and guesswork. Now we can couple this subjective appraisal with objective measurements of their arms, legs and head movements, along with muscular efforts to quantify movement mathematically. Our design teams use the data as a guide for developing a variety of vehicle platforms that provide optimal comfort, regardless of a person’s size or shape.”
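One common objective measure of the kind Wang describes is a joint angle computed from three marker positions, such as shoulder–elbow–wrist for elbow flexion during an ingress or reach. A minimal sketch, with the marker layout and function name as illustrative assumptions rather than Ford's actual method:

```python
import math

def joint_angle(a, b, c):
    """Angle at joint b (in degrees) formed by markers a-b-c,
    e.g. shoulder-elbow-wrist positions for elbow flexion."""
    ab = [a[i] - b[i] for i in range(3)]   # vector from joint to marker a
    cb = [c[i] - b[i] for i in range(3)]   # vector from joint to marker c
    dot = sum(ab[i] * cb[i] for i in range(3))
    na = math.sqrt(sum(v * v for v in ab))
    nc = math.sqrt(sum(v * v for v in cb))
    return math.degrees(math.acos(dot / (na * nc)))

# A fully extended arm reads near 180 degrees; a right-angle bend near 90.
print(joint_angle((0, 0, 0), (1, 0, 0), (2, 0, 0)))
print(joint_angle((0, 0, 0), (1, 0, 0), (1, 1, 0)))
```

Tracking such angles over a movement turns "this reach feels awkward" into a number (e.g. peak flexion) that can be compared across design proposals.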
Creating a holistic experience
Ford also is applying motion-capture animation software to improve real-life driving situations before the first prototype is even built. In Ford’s Immersive Virtual Evaluation (iVE) lab, engineers create virtual vehicles complete with exterior surroundings featuring buildings, intersections and pedestrians.
“This technology enables us to evaluate many vehicle exterior and interior alternatives in a virtual environment from any location – in the driver’s seat or hundreds of feet away from the vehicle – with animated characters and vehicles,” said Elizabeth Baron, a technical specialist in Virtual Reality and Advanced Visualization at Ford.
Two specialized tools used in the iVE lab are the Cave Automatic Virtual Environment (CAVE) and the Programmable Vehicle Model (PVM).
“The CAVE is a room where images are projected in stereo onto three walls and the ceiling to generate real-time, virtual vehicle interiors and exteriors at actual scale,” explained Baron. “When you look around, you can see virtually everything inside and outside of a vehicle that is still only a design in a computer.”
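Head-tracked, real-scale projection of the kind Baron describes is typically rendered with an off-axis (asymmetric) viewing frustum per wall, so the image stays geometrically correct as the viewer moves. A minimal sketch, assuming a flat wall lying in the plane z = 0 with the eye position given in wall coordinates; the function and its parameters are illustrative, not the iVE lab's software:

```python
def offaxis_frustum(eye, wall_left, wall_right, wall_bottom, wall_top, near):
    """Return glFrustum-style (left, right, bottom, top) extents at the
    near plane for an eye at (x, y, z) viewing a wall spanning the given
    extents, z being the eye's distance from the wall plane."""
    ex, ey, ez = eye
    scale = near / ez                    # project wall corners to near plane
    return (
        (wall_left - ex) * scale,
        (wall_right - ex) * scale,
        (wall_bottom - ey) * scale,
        (wall_top - ey) * scale,
    )

# A viewer centred 2 m from a 4 m-wide, 3 m-tall wall gets a symmetric
# frustum; moving off-centre would skew it, keeping the imagery at scale.
print(offaxis_frustum((0, 0, 2), -2, 2, -1.5, 1.5, 0.1))
```

Stereo comes from running this twice, once per eye, with the eye positions offset by half the interpupillary distance.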
The PVM, an adjustable physical device that can be scaled to the actual dimensions of a car or truck, provides an even more realistic experience by adding the element of touch.
“We set up key dimensions – steering wheel, gas, brake, center stack, etc. – and then we put the virtual world around that physical model,” said Baron. “Instead of being in a room, you’re actually sitting in a representation of the vehicle. You can touch and feel most everything, but what you’re looking at is digital.”
Both virtual design tools help Ford improve the design aesthetics, engineering and ergonomics of its cars and trucks. They also enable the company to bring products to market faster and more cost-effectively.