As first responders increasingly employ drones in search-and-rescue missions, researchers from Simon Fraser University’s Autonomy Lab are working to simplify how humans communicate with drones, particularly in emergency situations.

To that end, the team is using artificial intelligence (AI) to build drone technologies that ease robot-human communication and render the standard drone controller unnecessary.

"Most commercial drones today come with controllers which work really well, but sometimes you may find yourself in a situation where your hands are busy," notes computing science professor Richard Vaughan, who led the research.

"Or maybe you weren't expecting to interact with a drone today so you don't have special equipment with you. We'd like to be able to command drones in these situations and make the interaction natural and intuitive," Vaughan added.

With that goal in mind, the team is building two drones intended to improve human-drone interaction. The first can be directed by arm gestures, moving in whatever direction the human operator indicates.
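The article does not describe the lab's gesture-recognition pipeline, but as a rough sketch of the general idea, a pose estimator could supply the operator's shoulder and wrist positions in the camera image, and the wrist's offset from the shoulder could be mapped to a velocity command. Everything in the snippet below (the Keypoint type, gesture_to_velocity, its thresholds) is a hypothetical illustration, not SFU's implementation.

```python
# Illustrative sketch only: maps a detected arm gesture to a drone velocity
# command. Assumes an upstream pose estimator supplies 2D keypoints; no real
# drone SDK is used here.

from dataclasses import dataclass

@dataclass
class Keypoint:
    x: float  # normalized image coordinate, 0.0 (left) to 1.0 (right)
    y: float  # normalized image coordinate, 0.0 (top) to 1.0 (bottom)

def gesture_to_velocity(shoulder: Keypoint, wrist: Keypoint,
                        max_speed: float = 1.0,
                        dead_zone: float = 0.05) -> tuple[float, float]:
    """Convert the wrist's offset from the shoulder into (vx, vy) in m/s.

    Pointing left or right maps to lateral motion; raising or lowering the
    arm maps to forward or backward motion. Small offsets inside the dead
    zone are ignored so the drone hovers when the arm is at rest.
    """
    dx = wrist.x - shoulder.x   # positive when the arm points to the right
    dy = shoulder.y - wrist.y   # positive when the arm is raised
    vx = max_speed * dx if abs(dx) > dead_zone else 0.0
    vy = max_speed * dy if abs(dy) > dead_zone else 0.0
    return vx, vy

# Example: an arm raised and extended to the operator's right commands a
# right-and-forward velocity.
print(gesture_to_velocity(Keypoint(0.5, 0.5), Keypoint(0.8, 0.3)))
```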

The second drone in development responds to a user’s facial expressions: a “trigger face” prompts it to carry out actions such as taking pictures or video of an area that is difficult for humans to access.
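The report likewise does not detail how the “trigger face” is recognized. As an assumption-laden sketch of the concept, an off-the-shelf face detector could fire a capture action once a face has been held in view for several consecutive frames; the example below uses OpenCV's bundled Haar-cascade detector purely as a stand-in for whatever expression recognizer the researchers actually built.

```python
# Illustrative sketch only: fires a "capture" action once a face has been
# detected for several consecutive webcam frames. Uses OpenCV's bundled
# Haar-cascade face detector as a placeholder for a real expression model.

import cv2

CONSECUTIVE_FRAMES_NEEDED = 15  # roughly half a second at 30 fps

detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
camera = cv2.VideoCapture(0)

streak = 0
try:
    while True:
        ok, frame = camera.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        faces = detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
        streak = streak + 1 if len(faces) > 0 else 0
        if streak >= CONSECUTIVE_FRAMES_NEEDED:
            cv2.imwrite("triggered_photo.jpg", frame)  # placeholder action
            print("Trigger face held, photo captured")
            break
finally:
    camera.release()
```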

"We would like to get to the point where interacting with a robot is as easy as working with a co-worker or a trained animal," said Vaughan.

Watch the accompanying video to see the drones in action.

To contact the author of this article, email mdonlon@globalspec.com