Unit information: Human-Robot Interaction (UWE, UFMFHP-15-M) in 2018/19

Unit name Human-Robot Interaction (UWE, UFMFHP-15-M)
Unit code EMATM0043
Credit points 15
Level of study M/7
Teaching block(s) Teaching Block 2 (weeks 13 - 24)
Unit director Professor Giuliani
Open unit status Not open

School/department Department of Engineering Mathematics
Faculty Faculty of Engineering


This module will provide an overview of human-robot interaction (HRI) as a research field. It will cover the different contexts in which humans interact with robots now and in the future, and how these contexts shape the physical and social constraints of the interaction. For example, we will look at the assisted-living context, in which robots support humans in their homes and must therefore display socially appropriate behaviours. In contrast, we will look at collaborative robots in industrial settings, where knowledge about task planning and part assembly is more important. The module also introduces the technologies needed in an HRI system, for example vision processing, speech recognition and natural language understanding, reasoning, output generation, and cognitive robot architectures. We will introduce the human factors that are relevant for successful HRI (e.g., acceptance, trust, cognitive load) and how to measure these factors. Finally, the module describes how to set up, execute, and analyse HRI user studies.

Intended learning outcomes

On successful completion of the module, students will be able to:

  • identify and describe the interplay of the parts of an HRI system architecture, including input, reasoning, and output components (assessed in component A)
  • demonstrate understanding of the challenges that arise when building a system for multimodal interaction, such as an HRI system (assessed in component A)
  • analyse a given context for an HRI system and adapt the system design to that context (assessed in components A and B)
  • design and construct an HRI system with rudimentary input processing, reasoning, and output processing (assessed in component B)
  • design and execute an HRI user study (assessed in component B)
  • analyse, critically discuss, and scientifically report the results of an HRI user study (assessed in component B)

Teaching details

Sessions will include lectures leading to group work in practical sessions. During the module, students will prepare and execute a small HRI user study with a real robot (Nao or Pepper). The lectures are designed to cover the major areas of HRI and serve as a starting point for further reading and study, and for the practical sessions. In the practical sessions, students will use HRI software tools to learn how to perceive and react to humans interacting with the robot, and statistical software to analyse the resulting datasets.

Assessment Details

The module will be assessed in two components. Component A is an exam (40%) in which students are required to demonstrate detailed technical understanding of the design and properties of HRI systems. Component B is an individual report (60%) in the format of a scientific research paper. For the assignment task, students work as members of a team on a research project and then submit an individual report based on that group research activity.

Component A

  1. A two-hour examination consisting of short descriptive questions as well as problems, calculations, and data-interpretation questions, in which students show that they have a technical understanding of the design and operation of HRI systems in different usage contexts.

Component B

  1. An individual report of not more than 3000 words based on the practical work and the user study carried out during the group research project. The report will be structured as a scientific research paper.

Reading and References

Bartneck, C., Kulić, D., Croft, E. and Zoghbi, S. (2009) Measurement Instruments for the Anthropomorphism, Animacy, Likeability, Perceived Intelligence, and Perceived Safety of Robots. International Journal of Social Robotics. 1 (1), pp. 71–81. doi:10.1007/s12369-008-0001-3.

Breazeal, C. (2003) Toward sociable robots. Robotics and Autonomous Systems. 42 (3-4), pp. 167–175. doi:10.1016/S0921-8890(02)00373-1.

Goodrich, M.A. and Schultz, A.C. (2007) Human-Robot Interaction: A Survey. Foundations and Trends® in Human-Computer Interaction. 1 (3), pp. 203–275. doi:10.1561/1100000005.

Vinciarelli, A., Pantic, M., Heylen, D., Pelachaud, C., Poggi, I., D'Errico, F. and Schroeder, M. (2012) Bridging the Gap between Social Animal and Unsocial Machine: A Survey of Social Signal Processing. IEEE Transactions on Affective Computing. 3 (1), pp. 69–87. doi:10.1109/T-AFFC.2011.27.

Young, J.E., Sung, J., Voida, A., Sharlin, E., Igarashi, T., Christensen, H.I. and Grinter, R.E. (2011) Evaluating Human-Robot Interaction. International Journal of Social Robotics. 3 (1), pp. 53–67. doi:10.1007/s12369-010-0081-8.