Researchers at the University of Naples Federico II in Italy have developed a new interactive robotic bartender.

Dubbed BRILLO, the robotic system features a humanoid bust, two robotic arms for making drinks and a monitor-based face capable of producing different facial expressions in response to its interactions with human customers.

Source: John, Rossi and Rossi

According to its developers, BRILLO also features a microphone, a speaker and a camera that captures images of customers, allowing the system to examine customer body language and process what customers say and how they respond. The system stores this data so that it can be recalled when the same customer returns in the future.

To mimic the interaction between a customer and a human bartender, the robot creates profiles of its customers based on their preferences as determined by both verbal and non-verbal cues. Using this data, the robot can make future drink recommendations or guide future conversations based on previously discussed topics.

The researchers achieved this personalization by combining a system of record, which analyzes and stores previous interactions (including past purchases, topics discussed and the type of interaction), with a system of engagement, which improves the experience through a run-time evaluation of the customer's current level of engagement.
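As a rough illustration of how such a pairing could work (this is not the researchers' actual implementation, and all class, field and function names below are hypothetical), a per-customer profile might combine stored interaction history with a run-time engagement estimate along these lines:

```python
from dataclasses import dataclass, field
from datetime import datetime

# Hypothetical sketch only: one way a "system of record" (stored history) and a
# "system of engagement" (run-time engagement estimate) could be combined into
# a per-customer profile. Names and fields are illustrative, not BRILLO's code.

@dataclass
class Interaction:
    timestamp: datetime
    drinks_ordered: list[str]
    topics_discussed: list[str]
    interaction_type: str          # e.g. "chat", "order-only"

@dataclass
class CustomerProfile:
    customer_id: str
    history: list[Interaction] = field(default_factory=list)   # system of record

    def preferred_drinks(self, top_n: int = 3) -> list[str]:
        """Rank drinks by how often they appear in past interactions."""
        counts: dict[str, int] = {}
        for interaction in self.history:
            for drink in interaction.drinks_ordered:
                counts[drink] = counts.get(drink, 0) + 1
        return sorted(counts, key=counts.get, reverse=True)[:top_n]

def recommend(profile: CustomerProfile, engagement_level: float) -> str:
    """Combine stored preferences with a run-time engagement estimate.

    engagement_level in [0, 1] stands in for the system of engagement's output,
    e.g. derived from speech and body-language cues at run time.
    """
    favorites = profile.preferred_drinks()
    if not favorites:
        return "Would you like to see the menu?"
    if engagement_level > 0.5:
        # Engaged customer: reference shared history to keep the conversation going.
        return f"Another {favorites[0]}, or shall we pick up where we left off?"
    # Less engaged customer: keep it short and just suggest the usual.
    return f"Your usual {favorites[0]}?"
```

In a real deployment, the engagement estimate would presumably be updated continuously from the camera and microphone streams rather than supplied as a single number, but the basic split between stored history and run-time evaluation would be the same.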

Based on this, the system enables the robotic bartender to process both the verbal and non-verbal cues of a customer, which reveal their mood, level of attentiveness and preferred drinks. The robot then stores this information for later use.

The research is detailed in the article "Personalized Human-Robot Interaction with a Robot Bartender," which appears in the Adjunct Proceedings of the 30th ACM Conference on User Modeling, Adaptation and Personalization.

For more information on BRILLO, watch the accompanying video that appears courtesy of the University of Naples Federico II.

To contact the author of this article, email mdonlon@globalspec.com