Editor’s note: Part 1 of our two-part series considers what needs to happen in order to realize connected automation in vehicles. Part 2 addresses some of the physical infrastructure challenges likely to confront autonomous vehicles.

With each software tweak, sensor improvement and test drive, autonomous vehicles inch closer to reality. And the roadway is largely ready: nothing about the existing infrastructure must change fundamentally for autonomous vehicles to function.

“Robotic drivers will encounter the same problems humans do on the road,” says Jeremy Carlson, senior analyst, autonomous driving, at IHS. Although infrastructure owners can do a few things to make an autonomous car’s job easier, the onus is on self-driving vehicles to adapt to existing infrastructure. That means learning to integrate with human drivers and navigate diverse road conditions. Through test facilities and small-scale deployments, researchers and manufacturers are addressing the limitations of autonomous systems in real-world environments.

Act Like a Human

The goal for autonomous vehicles is to one day replace their human-operated counterparts, but initial deployment likely will require the two groups to share the road. Self-driving cars have many barriers to overcome first.

Sensor inadequacies present a primary challenge. “We don’t have the raw data coming in necessary to discriminate all the different things that a human can,” says Edwin Olson, a computer science professor at the University of Michigan. For instance, sensors can’t tell whether a pedestrian 50 meters down the road is looking at their phone or at traffic.

Ford will replace roof-mounted LIDAR sensors with two hockey puck-sized sensors from Velodyne. Researchers and automakers also are investigating where to place sensors — and how many are necessary. For the third generation of its Fusion Hybrid autonomous vehicle test fleet, Ford will replace the cars’ four roof-mounted LIDAR sensors with two hockey puck-sized sensors from Velodyne. The sensors will mount near where side-view mirrors are usually located.

The positioning optimizes the number of beams on the horizon, where traffic exists, and provides redundant coverage across the front 180°, says Jim McBride, Ford’s technical leader for autonomous vehicles. “You not only double up on the density of beams, but you would still have a field of view should one of them be obscured,” McBride says. The new sensors also extend the range of the previous LIDAR sensors to 200 meters.

Redundancy is a critical consideration for all systems in an autonomous vehicle. If power steering fails in a typical car, the driver can compensate by taking over the wheel. But who or what compensates for a similar failure in a driverless car? “We need to do more work to make the vehicle robust against failures that would ordinarily be handled by having a human driver,” Olson says.

Another impediment to integrating self-driving vehicles is other drivers’ conduct behind the wheel. “Human drivers don’t always follow the rules of the road,” IHS’s Carlson says. He cites the example of early programming in Google’s autonomous car to stop completely at a four-way stop. “The car was never able to proceed because the other three vehicles rolled through, rather than fully stopping,” Carlson says. The vehicle also risked being rear-ended by human drivers who weren’t expecting its lack of movement. Google has since programmed its cars to inch out at a four-way stop to signal their intent.
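The inch-out behavior Carlson describes can be sketched as a simple state machine. This is a hypothetical illustration of the idea only, not Google’s actual control logic; the states and policy here are invented for the example:

```python
from enum import Enum, auto

class StopState(Enum):
    APPROACHING = auto()
    STOPPED = auto()
    CREEPING = auto()    # inching forward to signal intent
    PROCEEDING = auto()

def next_state(state, fully_stopped, intersection_clear):
    """One step of a toy four-way-stop policy: come to a full stop,
    then creep forward to claim the right of way rather than waiting
    for other cars to stop completely, then proceed once clear."""
    if state is StopState.APPROACHING:
        return StopState.STOPPED if fully_stopped else StopState.APPROACHING
    if state is StopState.STOPPED:
        # Inch out even while cross traffic is still rolling through,
        # so human drivers can read the car's intent.
        return StopState.CREEPING
    if state is StopState.CREEPING:
        return StopState.PROCEEDING if intersection_clear else StopState.CREEPING
    return state
```

The key design point mirrors the anecdote: the car never waits for every other vehicle to stop fully, a condition that may never arrive with human drivers; instead, creeping is itself the signal.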

Autonomy Meets Adversity

Just as autonomous vehicles have to account for human behavior, they also need to handle surprises that the infrastructure or weather may throw their way. Among these are potholes, heavy precipitation, inconsistent lane markings and glare from the sun.

To determine its position on the road, an autonomous vehicle typically uses LIDAR to build an image of the road surface. The trouble is that self-driving cars “aren’t reliable enough at decoding a street environment in real time,” Olson says. “They have a hard time figuring out which traffic light corresponds to which lane, or they might miss a street sign and not know the speed limit.”

To attain accurate data, the localized image needs to be compared against a previously constructed map built with centimeter precision. The sophisticated, highly detailed maps “capture almost everything there is to know about the space you’re driving through,” Olson says.
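The idea of comparing live sensing against a prior map can be illustrated with a toy occupancy-grid matcher. This is a deliberately simplified sketch: production systems match rich 3D LIDAR returns against centimeter-precision maps using probabilistic methods, and all names here are invented for illustration:

```python
import numpy as np

def localize(prior_map, scan, search=3):
    """Toy scan-to-map matching: slide a small occupancy 'scan' over a
    prebuilt occupancy grid and return the (row, col) shift, relative
    to the nominal pose at offset (search, search), that maximizes
    overlap with the prior map."""
    h, w = scan.shape
    best_score, best_shift = -1.0, (0, 0)
    for dr in range(-search, search + 1):
        for dc in range(-search, search + 1):
            r0, c0 = search + dr, search + dc
            window = prior_map[r0:r0 + h, c0:c0 + w]
            score = float(np.sum(window * scan))  # overlap of occupied cells
            if score > best_score:
                best_score, best_shift = score, (dr, dc)
    return best_shift
```

The same comparison is what keeps the vehicle localized when snow hides lane markings: the prior map supplies the structure that the live scan can no longer see directly.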

The prior map also enables the vehicle to know its surroundings in snow and other adverse conditions that may prevent LIDAR from sensing the road. Olson and his colleague, Ryan Eustice, collaborated with Ford to put this application to work.

The researchers developed a high-resolution 3D map of Mcity, the University of Michigan’s 32-acre simulated urban environment for evaluating connected and automated vehicles. The map includes building façades, traffic signals and street lighting to serve as navigational landmarks. Equipped with the map, Ford’s Fusion Hybrid test cars drove through Mcity during a January snowfall. The system performed well in the wintry conditions “and validated our technical approach,” Olson says.

Mcity simulates a variety of urban attributes, including movable “pedestrians.” Image source: University of Michigan. Mcity provides a realistic environment for automakers and other industry partners to prove their autonomous technology. “We tried to capture all the different features of road infrastructure to represent a diverse driving experience,” says Huei Peng, a mechanical engineering professor and director of the University of Michigan’s Mobility Transformation Center (MTC).

The site features multilane roads, a roundabout, sharp turns, merge ramps, and simulated tunnels equipped with varying surfaces such as concrete, asphalt and dirt. Test vehicles also encounter crosswalks, sidewalks, bicycle lanes, building façades, construction barriers, fire hydrants and parking meters, along with various types of curbing and signage. Even the traffic signals are attached to poles in different ways.

Meanwhile, in Concord, Calif., the Contra Costa Transportation Authority has converted the decommissioned Concord Naval Weapons Station into a testing site called GoMentum Station. The facility covers 5,000 acres and features a built-in city with 20 miles of roadway, bridges and underpasses, buildings, intersections, railroad crossings, tunnels and parking lots.

Infrastructure Consistency

Because autonomous vehicles are self-contained systems, they don’t require special infrastructure. Even though infrastructure owners aren’t directly involved in vehicle autonomy, “we want to facilitate it as it comes along,” says Blaine Leonard, ITS program manager at the Utah Department of Transportation.

The automotive industry has suggested using highly visible lane striping to keep autonomous vehicles in their proper lanes. Some have suggested embedding into the road paint a machine-readable component like an RFID chip, Leonard says. Consistent roadway signage also is on the industry’s wish list, but that could be a big task considering the variations between high-speed divided highways and local roads, not to mention differences from town to town.

Tesla’s Autopilot software allows drivers to see what the car sees. Image source: Tesla. Under certain road conditions, however, vehicle-to-vehicle and vehicle-to-infrastructure communications can augment autonomous technology. If there is sun glare or an obstruction around a traffic light, for instance, an autonomous vehicle “won’t see whether the light is red, yellow or green,” Leonard says. But if the car is connected to the infrastructure, “we can send it a signal phase and timing message so it knows the status of the light and when it is going to change.”
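In spirit, a signal phase and timing message lets the car compute when the light will change without seeing the lamp at all. The following is a hypothetical, simplified decoder; the field names and cycle model are invented for illustration and do not follow the actual SAE J2735 SPaT wire format:

```python
def seconds_until_green(state, ends_in_s, durations):
    """Toy use of a signal phase and timing (SPaT) style message:
    given the light's current state, how long until that state ends,
    and the length of each intervening phase, return the wait until
    green. Assumes a fixed red -> green -> yellow -> red cycle."""
    order = ["red", "green", "yellow"]
    if state == "green":
        return 0.0
    wait = ends_in_s                      # ride out the current phase
    i = (order.index(state) + 1) % len(order)
    while order[i] != "green":
        wait += durations[order[i]]       # add each full phase in between
        i = (i + 1) % len(order)
    return wait
```

With such a message, the glare scenario Leonard describes reduces to arithmetic: the vehicle knows the light is red and exactly when it turns green, whether or not its cameras can resolve the signal head.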

Early Deployments

Given the limitations that autonomous vehicles have yet to overcome, many industry watchers forecast a long transition onto the roadway. Increasing levels of automation in vehicles, however, will help ease the changeover for drivers, says Stan Caldwell, executive director of the T-SET National University Transportation Center at Carnegie Mellon University.

For example, the latest update for Tesla’s Autopilot software, released in October 2015, allows the Model S to automatically change lanes, adjust speed in response to traffic, scan for parking spots and parallel park. The instrument cluster panel shows the driver what the car is seeing. But following a series of YouTube videos that showed Autopilot allegedly misbehaving, Tesla has restricted the auto-steer function in residential areas and on streets without physical dividers.

Meanwhile, General Motors will release its semi-autonomous technology called Super Cruise on the Cadillac CT6 in 2017. Mercedes-Benz, BMW and Infiniti also have equipped models with hands-free driving capabilities.

Even as researchers and automakers continue to refine autonomy, Caldwell expects driverless commercial shuttles to be deployed before fully autonomous private vehicles. Even so, he says the shuttles most likely “will travel about 20 miles an hour in a limited physical area that can be easily operated.”

EasyMile, which provides shared driverless vehicles, will launch its first fleet in the United States in partnership with the Contra Costa Transportation Authority. The low-speed, short-distance shuttles will transport travelers to transit hubs and business parks. The demonstration project is expected to deploy in summer 2016 at Bishop Ranch, a 585-acre business community near San Francisco.

With continued research, development and testing, autonomous vehicles are adapting to challenges posed by the infrastructure and human drivers. Test beds that are designed like functioning cities provide a realistic platform on which to build self-driving cars that play well with others.

To contact the author of this article, email GlobalSpeceditors@globalspec.com