MultiModal Mall Entertainment Robot (MuMMER)


Introduction
Developing an artificial agent that can coexist and interact independently, naturally, and safely with humans in an unconstrained real-world setting has been a dream of robot developers since the earliest days of the field. In popular culture and science fiction, the prototypical image of a “robot” is precisely this: an artificial human able to engage fully in all aspects of face-to-face conversation. However, while the hardware capabilities of robots are advancing rapidly, the software has not kept pace: even with the most recent technological developments, the most advanced robots have generally supported only limited, scripted interactions, often relying on a human operator to help with input processing and/or behaviour selection.

In MuMMER, the researchers aim to develop a robot that can interact with humans in a natural, face-to-face setting. Such a robot has a wide range of potential applications, particularly in contexts where users cannot be expected to be trained or skilled, such as receptionist robots, tutoring robots, assistive home robots, and companion robots.

What is MuMMER? 
MuMMER (MultiModal Mall Entertainment Robot) is a four-year, EU-funded project with the overall goal of developing a humanoid robot (based on SoftBank’s Pepper platform) that can interact autonomously and naturally in the dynamic environment of a public shopping mall, providing an engaging and entertaining experience to the general public. Using co-design methods, the researchers will work together with stakeholders including customers, retailers, and business managers to develop truly engaging robot behaviours. Crucially, the robot will exhibit behaviour that is socially appropriate, combining speech-based interaction with non-verbal communication and human-aware navigation. To support this behaviour, the team will develop and integrate new methods from audiovisual scene processing, social-signal processing, high-level action selection, and human-aware robot navigation. Throughout the project, the robot will be deployed in Ideapark, a large public shopping mall in Finland.
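As a concrete illustration of the kind of multimodal output the Pepper platform supports, the sketch below uses SoftBank’s NAOqi Python SDK (the qi library) to combine speech with a co-speech gesture via the ALAnimatedSpeech service. This is only a minimal, hypothetical example of scripted behaviour on Pepper; the robot IP address and animation path are placeholders, and it does not represent the MuMMER system’s architecture, which integrates perception, action selection, and navigation components developed in the project.

```python
import sys
import qi

ROBOT_URL = "tcp://192.168.1.10:9559"  # placeholder: replace with your Pepper's IP and port

# Connect a NAOqi session to the robot
session = qi.Session()
try:
    session.connect(ROBOT_URL)
except RuntimeError:
    print("Could not connect to the robot at " + ROBOT_URL)
    sys.exit(1)

# ALAnimatedSpeech synchronises spoken output with body animations,
# so the robot gestures while it talks rather than standing still.
animated_speech = session.service("ALAnimatedSpeech")
animated_speech.say(
    "Hello! ^start(animations/Stand/Gestures/Hey_1) "
    "Welcome to the mall. "
    "^wait(animations/Stand/Gestures/Hey_1)"
)
```

The `^start(...)` and `^wait(...)` annotations inside the utterance tell ALAnimatedSpeech which built-in animation to play alongside the speech; richer behaviours would additionally draw on perception and navigation services rather than a fixed script.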

Collaboration with External Partners
SoftBank Robotics Europe, the manufacturer of the Pepper robot, is a member of the MuMMER project consortium and is providing both technical expertise and feedback from other currently active Pepper deployments. The Ideapark shopping mall is also a member of the MuMMER consortium, and the team are working together with mall management, retailers, and customers to develop and evaluate the robot behaviour.

Beyond the MuMMER project consortium, the researchers have also had contact with other companies and institutions that are eager to have the Pepper robot and the MuMMER system active on their premises.

Academic Partners
Academic partners include: Heriot-Watt University (Edinburgh, UK); Idiap Research Institute (Martigny, Switzerland); LAAS-CNRS (Toulouse, France); and VTT Technical Research Centre of Finland (Tampere, Finland).

Researcher background
Dr Mary Ellen Foster is a Lecturer in the School of Computing Science at the University of Glasgow. Her primary research interests are human-robot interaction, social robotics, and embodied conversational agents. She is a member of the Glasgow Social Robotics group and the Glasgow Interactive Systems Group (GIST), and an Associate Academic of the Institute of Neuroscience and Psychology. She is the coordinator of the MuMMER project, a European Horizon 2020 project in the area of socially aware human-robot interaction. She obtained her PhD from the University of Edinburgh in 2007, and has previously worked at the Technical University of Munich and Heriot-Watt University.

Contact Details
Dr Mary Ellen Foster
Email: MaryEllen.Foster@glasgow.ac.uk
Website: http://mummer-project.eu/