Recent years have seen considerable progress in the deployment of 'intelligent' communicative agents such as Apple's Siri and Amazon's Alexa. However, effective speech-based human-robot dialogue is less well developed; not only do the fields of robotics and spoken language technology present their own special problems, but their combination raises an additional set of issues. In particular, there appears to be a large gap between the formulaic behaviour that typifies contemporary spoken language dialogue systems and the rich and flexible nature of human-human conversation. As a consequence, we still seem to be some distance away from creating Autonomous Social Agents such as robots that are truly capable of conversing effectively with their human counterparts in real world situations. This talk will address these issues and will argue that we need to go far beyond our current capabilities and understanding if we are to move from developing robots that simply talk and listen to evolving intelligent communicative machines that are capable of entering into effective cooperative relationships with human beings.
Prof. Moore (http://staffwww.dcs.shef.ac.uk/people/R.K.Moore/) has over 40 years' experience in Speech Technology R&D and, although an engineer by training, much of his research has been based on insights from human speech perception and production. As Head of the UK Government's Speech Research Unit from 1985 to 1999, he was responsible for the development of the Aurix range of speech technology products and the subsequent formation of 20/20 Speech Ltd. Since 2004 he has been Professor of Spoken Language Processing at the University of Sheffield, and he also holds Visiting Chairs at Bristol Robotics Laboratory and University College London Psychology & Language Sciences. He was President of the European/International Speech Communication Association from 1997 to 2001, General Chair for INTERSPEECH-2009 and ISCA Distinguished Lecturer during 2014-15. In 2017 he organised the first international workshop on 'Vocal Interactivity in-and-between Humans, Animals and Robots (VIHAR)'. Prof. Moore is the current Editor-in-Chief of Computer Speech & Language, and in 2016 he was awarded the LREC Antonio Zampolli Prize for "Outstanding Contributions to the Advancement of Language Resources & Language Technology Evaluation within Human Language Technologies".
Over the past few years, robots have become increasingly sophisticated in terms of their hardware and software, paving the way for their more frequent use in a myriad of scenarios: homes, healthcare, schools, entertainment, and many other settings. At the same time, new situations are emerging where robots not only have to interact with humans but are also required to collaborate with them, creating hybrid groups of humans and robots. With this vision in mind, it is important to reflect on the impact that robots have on humans' well-being and to consider the effects they may have in supporting collaboration and prosocial behaviour in these new hybrid groups of humans and machines. Prosocial behaviour occurs when people and agents perform costly actions that benefit others. Acts such as helping others voluntarily, donating to charity, and providing information or sharing resources are all forms of prosocial behaviour. In this talk I will explore the role of robotics in fostering prosociality: Prosocial Robotics.
Several questions will be discussed: What are the conditions that encourage humans to be more prosocial when interacting with robots? What features of robots are relevant to promoting prosociality? Does embodiment matter? Do humans respond empathically and prosocially to robots' verbal and non-verbal behaviours? If robots act repeatedly in a prosocial manner, do humans respond in kind, or will they exploit the robots' apparent weakness? Lastly, how can we engineer “prosocial robots” in general, leading to more altruistic and cooperative behaviours in a hybrid group? To examine these questions I will describe some preliminary work done in different scenarios. I will start with a home scenario and then explore collaborative games in which hybrid groups play social dilemmas, including a public goods game and a collective risk dilemma (CRD). The results obtained so far suggest that social robots can play a role in prosociality, but that their efficacy depends largely on a variety of features of the robots.
Ana Paiva is a Full Professor in the Department of Computer Engineering at Instituto Superior Técnico (IST), University of Lisbon, and is also the Coordinator of GAIPS - "Group on AI for People and Society" at INESC-ID (see http://gaips.inesc-id.pt/). Her group investigates the creation of complex systems using an agent-based approach, with a special focus on social agents. Prof. Paiva’s main research focuses on the problems and techniques for creating social agents that can simulate human-like behaviours, be transparent and natural, and, eventually, give the illusion of life. Over the years she has addressed this problem by engineering agents that exhibit specific social capabilities, including emotions, personality, culture, non-verbal behaviour, empathy, and collaboration, among others. She has published extensively in the area of social agents and has received best paper awards at many conferences; in particular, she won first prize in the Blue Sky Awards at AAAI 2018. She has further advanced the area of artificial intelligence and social agents worldwide, having served on the Global Agenda Council on Artificial Intelligence and Robotics of the World Economic Forum and as a member of the Scientific Advisory Board of Science Europe. She is a EurAI Fellow.
In recent years, many service robots have been introduced into society in professional applications such as agriculture, surgery, logistics, and public relations. However, it is still rare to see service robots properly interacting with people and taking on essential roles in the city. Service encounters in particular are communicative operations in commerce where many interactive robots have been tried and tested, yet there are few examples of their effectiveness. From a business perspective, service robots are just one way to solve business problems, and non-robotic or non-agent solutions, such as tablet ordering apps, self-checkout systems, and digital signage with human-tracking cameras, are considered at the same time. Why, then, are we keen to choose service robots as a commercial solution? What is the value of verbal/non-verbal interaction? Our research group has explored the value of interaction through a number of practical experiments in various fields. This talk will introduce our observations from these experiments and discuss this inevitable question for the spread of service robots in society.
Jun Baba is a chief research scientist at CyberAgent AI Lab. CyberAgent, Inc. is one of the leading agencies in the Japanese digital advertising market and focuses on conversational agents, such as robots and virtual agents, as next technologies for retail marketing. CyberAgent and Osaka University set up a joint research group and have been running research projects with Prof. Hiroshi Ishiguro since 2017. Baba is a coordinator of the joint research group and a visiting researcher at Osaka University. He received his ME degree in informatics from Kyoto University, Kyoto, Japan in 2014 and was a data scientist at CyberAgent from 2014 to 2017. His research interests include human-computer interaction in service encounters and machine learning. He focuses on influential conversational agents that can change people’s behaviors and actions through verbal/non-verbal interaction. In order to research and develop such influential agents, he has conducted many field studies in various retail settings, e.g. business hotels, online shopping sites, and shopping malls, and has investigated which interactions are essentially important based on people's real reactions and behaviors in the field. Not only has he published his research findings at academic conferences in the areas of HRI and HAI, but he is also working to develop businesses based on those findings.