Researchers at the University of Michigan are teaching self-driving cars to recognize and predict pedestrian movements by focusing on humans' gait, body symmetry and foot placement.

Data collected by vehicles through cameras, lidar and GPS allow the researchers to capture video of humans in motion and then recreate them in 3D computer simulation. With that, they've created a "biomechanically inspired recurrent neural network" that catalogs human movements.

The researchers said they can predict poses and future locations for one or several pedestrians up to about 50 yards from the vehicle. That's roughly the scale of a city intersection.

3D look

Prior work has typically only looked at still images and wasn't as concerned with how people move in three dimensions, said Ram Vasudevan, assistant professor of mechanical engineering.

Equipping vehicles with the necessary predictive power requires the network to capture the details of human movement: the pace of a human's gait (its periodicity), the mirror symmetry of the limbs, and the way foot placement affects stability during walking.
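These gait cues can be illustrated on synthetic data. The sketch below is a toy, not the study's implementation: periodicity is estimated from the dominant frequency of an ankle trajectory, and limb mirror symmetry is scored as the correlation between the left ankle and the reflected right ankle. All signal names and numbers are invented for the example.

```python
import numpy as np

# Toy walking signal: two ankles oscillating out of phase, period 20 frames.
t = np.arange(200)
left_ankle = np.sin(2 * np.pi * t / 20)
right_ankle = -left_ankle                    # perfect mirror of the left

def dominant_period(signal):
    """Period (in frames) of the strongest frequency component, skipping DC."""
    spectrum = np.abs(np.fft.rfft(signal))
    peak_bin = 1 + np.argmax(spectrum[1:])   # ignore the zero-frequency bin
    return len(signal) / peak_bin

print(dominant_period(left_ankle))                        # → 20.0
sym = float(np.corrcoef(left_ankle, -right_ankle)[0, 1])  # mirror-symmetry score
print(round(sym, 6))                                      # → 1.0
```

A real system would extract such signals from estimated 3D joint positions rather than a clean sine wave, but the same two quantities (step period and left/right symmetry) fall out the same way.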

Using video clips that run for several seconds, the University of Michigan system studies the first half of a clip to make its predictions, then verifies their accuracy against the second half.
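That split-and-verify protocol can be sketched as follows. The constant-velocity predictor here is a deliberately simple stand-in for the recurrent network, and the clip data are synthetic; only the evaluation pattern (observe the first half, score against the held-out second half) mirrors the description above.

```python
import numpy as np

def constant_velocity_predict(observed, n_future):
    """Toy stand-in for the network: extrapolate the last observed step."""
    step = observed[-1] - observed[-2]                # velocity from last two frames
    return observed[-1] + step * np.arange(1, n_future + 1)[:, None]

# Fake clip: 2D positions (x, y) over 12 frames of a steadily walking pedestrian.
clip = np.cumsum(np.tile([0.4, 0.1], (12, 1)), axis=0)

observed, held_out = clip[:6], clip[6:]               # first half vs. second half
predicted = constant_velocity_predict(observed, len(held_out))

# Verify against the second half: per-frame Euclidean error in meters.
errors = np.linalg.norm(predicted - held_out, axis=1)
print(np.allclose(errors, 0.0))                       # constant gait → exact here
```

On a constant-velocity clip the toy predictor is exact; the point of the learned model is to stay accurate when the gait turns, slows, or stops.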

The idea is to train the system to recognize motion and make predictions of where that pedestrian's body will be as he or she moves along. The work was supported by a grant from Ford Motor Company.

Vasudevan said that if a pedestrian is playing with a phone, the pose indicates that the person is likely distracted; it also indicates what the person is capable of doing next. The results have shown that the new system improves a driverless vehicle's capacity to recognize what is most likely to happen next.

Researchers reported that the median translation error of their prediction was around 10 cm after one second and less than 80 cm after six seconds. By comparison, other methods were off by up to 7 meters.
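A reasonable reading of the reported metric is the median, over pedestrians, of the Euclidean distance between each predicted and true position at a given prediction horizon. The sketch below shows that computation on invented numbers, which are illustrative only and not the study's data.

```python
import numpy as np

def median_translation_error(pred, truth):
    """Median Euclidean error in meters; pred, truth: (n_pedestrians, 2) arrays."""
    return float(np.median(np.linalg.norm(pred - truth, axis=1)))

# Hypothetical ground-truth positions and 1-second predictions, ~10 cm off.
truth   = np.array([[1.0, 0.0], [2.0, 1.0], [0.0, 3.0]])
pred_1s = truth + [[0.10, 0.0], [0.0, 0.08], [0.06, 0.08]]

print(median_translation_error(pred_1s, truth))   # → 0.1
```

The median (rather than the mean) keeps one badly tracked pedestrian from dominating the score, which matters when a scene contains several people.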

Fatal accident

In June 2018, the U.S. National Transportation Safety Board (NTSB) released a preliminary report for its probe of a fatal crash involving a pedestrian and an Uber Technologies, Inc., test vehicle in Tempe, Arizona.

The modified 2017 Volvo XC90, occupied by one vehicle operator and operating with a self-driving system in computer control mode, struck a pedestrian on March 18, 2018. The pedestrian died and the vehicle operator was not injured.

The preliminary report said that the pedestrian was dressed in dark clothing, did not look in the vehicle's direction until just before impact and crossed the road in a place that was not directly lit. The pedestrian also was pushing a bicycle that did not have side reflectors. Its front and rear reflectors, along with its forward-facing headlamp, were perpendicular to the path of the oncoming vehicle. The pedestrian entered the roadway from a brick median, where signs facing toward the roadway warn pedestrians to use a crosswalk some 360 ft away.

In its report, the NTSB said Uber equipped the test vehicle with a developmental self-driving system, consisting of forward- and side-facing cameras, radars, lidars, navigation sensors and a computing and data storage unit integrated into the vehicle.

The vehicle was factory equipped with several driver assistance functions by the original manufacturer Volvo Cars, including a collision avoidance function with automatic emergency braking, as well as functions for detecting driver alertness and road sign information. The NTSB said the Volvo functions are disabled only when the test vehicle is operated in computer control mode.

The report said that data obtained from the self-driving system shows the system first registered radar and lidar observations of the pedestrian about six seconds before impact. At the time, the vehicle was traveling at 43 mph. As the vehicle and pedestrian paths converged, the self-driving system software classified the pedestrian first as an unknown object, then as a vehicle and then as a bicycle with varying expectations of future travel path.

At 1.3 seconds before impact, the self-driving system determined that emergency braking was needed to mitigate a collision. The NTSB said that, according to Uber, emergency braking maneuvers are not enabled while the vehicle is under computer control, in order to reduce the potential for erratic vehicle behavior; the vehicle operator is relied on to intervene and take action.