Social Signals in Interaction Between Humans and Artificial Agents

IUALL (Interaction for Universal Access)

Focus

This work package focuses on the interpersonal and contextual (i.e. situation- and task-related) aspects of turn-taking and floor management in conversational encounters. We build models of turn-taking and floor management for artificial conversational agents, so that these agents display appropriate behaviors when they are involved in a conversation. We develop models that help:

  1. to understand communication between agents that participate in a joint conversational activity: how participants coordinate this activity through their individual contributions and expressions, what signals they send and receive, and how (temporal patterns in) these signals relate to perceived qualities of the interaction (friendliness, formality, conflict, stressfulness), both as reported by the participants themselves and as judged by outside observers;
  2. to design artificial agent behaviors that are perceived as coordinating the interaction when the agent interacts with other artificial agents or with humans, and that fit the type of conversation, the specific joint activity, and the roles that the agents are involved in (a minimal illustrative sketch of such a model follows below).
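
To make the notion of a floor-management model concrete, the following is a minimal sketch in Python under simplifying assumptions: a two-party conversation, floor states reduced to four values, and transitions driven only by who is currently speaking. The names FloorState and next_state are hypothetical and serve only as illustration; they do not describe the models actually developed in this work package.

```python
# Minimal illustrative sketch of a floor-management model for a two-party
# conversation. The states and transition rules below are simplifying
# assumptions for illustration, not the project's actual model.

from enum import Enum, auto


class FloorState(Enum):
    AGENT_HAS_FLOOR = auto()
    USER_HAS_FLOOR = auto()
    FLOOR_FREE = auto()   # neither party is speaking
    OVERLAP = auto()      # both parties are speaking


def next_state(state: FloorState, agent_speaking: bool, user_speaking: bool) -> FloorState:
    """Update the floor state from the current speaking activity of both parties."""
    if agent_speaking and user_speaking:
        return FloorState.OVERLAP
    if agent_speaking:
        return FloorState.AGENT_HAS_FLOOR
    if user_speaking:
        return FloorState.USER_HAS_FLOOR
    return FloorState.FLOOR_FREE


# Example: the user stops speaking and the agent starts, i.e. a smooth turn transition.
state = FloorState.USER_HAS_FLOOR
state = next_state(state, agent_speaking=False, user_speaking=False)  # FLOOR_FREE
state = next_state(state, agent_speaking=True, user_speaking=False)   # AGENT_HAS_FLOOR
```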

The work

The work that is carried out involves scenario specification, data collection and analysis, perception studies, interaction modeling, behavior generation for conversing artificial characters, and evaluation.

When humans are co-present and involved in a conversation while performing a joint task, they coordinate their contributions to the conversation by exchanging all sorts of signals. Such coordination takes place either deliberately and intentionally, or unintentionally. By means of gestures, head movements and eye gaze, they show what their focus of attention is, whether they want to take the turn, whether they expect the other to say something, whom they address when they say something, and what their stance is towards the others. The ability to display appropriate expressions and behaviors when taking the turn, yielding the turn, or interrupting the speaker is often considered an important requirement for artificial agents if they are to be accepted as social and believable characters that can take part in a smooth interaction.
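
As an illustration of how such multimodal cues could be mapped to turn-taking intentions, the sketch below uses a crude rule-based heuristic. The cue set (speech activity, gaze, pause length, gesturing), the 500 ms pause threshold, and the names Cues and turn_yield_likely are assumptions made for this example; in the project such mappings would be informed by the collected data rather than by hand-written rules.

```python
# Illustrative sketch only: a simple rule-based interpretation of multimodal cues
# as turn-taking intentions. The cue set, threshold, and rule are assumptions made
# for this example; real models would be derived from annotated corpus data.

from dataclasses import dataclass


@dataclass
class Cues:
    """Observed cues of one participant at a given moment (hypothetical schema)."""
    is_speaking: bool
    gazes_at_listener: bool   # speaker looks at the addressee
    pause_ms: float           # silence since the speaker's last speech
    gesturing: bool           # hand gesture in progress


def turn_yield_likely(speaker: Cues) -> bool:
    """Crude heuristic: a pause, gaze towards the listener, and no ongoing gesture
    are commonly cited cues that the speaker is offering the floor."""
    return (not speaker.is_speaking
            and speaker.pause_ms > 500
            and speaker.gazes_at_listener
            and not speaker.gesturing)


print(turn_yield_likely(Cues(is_speaking=False, gazes_at_listener=True,
                             pause_ms=700.0, gesturing=False)))  # True
```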

What is still acceptable and what is not, however, remains unclear. In particular, it is unclear which features of embodied interface agents have, in a given situation, the greatest impact on how human subjects perceive the artificial characters and the interpersonal qualities of the interaction. Theoretical studies and controlled perception studies in this project will help to clarify this. The target use cases we focus on are serious games and interactive storytelling, in which artificial human-like characters play a role in realistic scenarios and human subjects either act as outside observers of simulated social behavior presented by characters in a virtual environment or themselves participate in the interaction with the artificial characters.

It is important to know how human subjects perceive such behaviors and what factors (properties of the characters, graphics style, conversational content, timing, tasks, scenario, role, the human subject's personality) influence how the artificial characters are perceived. Turn-taking patterns are an important fingerprint of a social encounter: the way participants organize their contributions, how and when they direct themselves to others, and the flow of the conversation are strongly related to the type of encounter, and they often reveal the interpersonal relationships between the agents as well as what is going on between the participants.
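
One simple way to make such a "fingerprint" measurable is to compute timing statistics over annotated turns, for example floor transfer offsets (the gap or overlap between consecutive turns of different speakers). The sketch below is illustrative only; the Turn representation and the chosen statistics are assumptions, not the project's actual analysis pipeline.

```python
# Minimal sketch of quantifying turn-taking patterns from annotated turns.
# The data format and statistics are illustrative assumptions.

from dataclasses import dataclass
from typing import List


@dataclass
class Turn:
    speaker: str
    start: float   # seconds
    end: float     # seconds


def floor_transfer_offsets(turns: List[Turn]) -> List[float]:
    """Offset between consecutive turns of different speakers:
    positive = gap (silence), negative = overlap (e.g. an interruption)."""
    ordered = sorted(turns, key=lambda t: t.start)
    return [nxt.start - cur.end
            for cur, nxt in zip(ordered, ordered[1:])
            if cur.speaker != nxt.speaker]


turns = [Turn("A", 0.0, 2.1), Turn("B", 2.4, 5.0), Turn("A", 4.8, 6.0)]
offsets = floor_transfer_offsets(turns)
print(offsets)                                        # ~[0.3, -0.2]: one gap, one overlap
print(sum(1 for o in offsets if o < 0), "overlaps")   # 1 overlaps
```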

Various theoretical perspectives on turn-taking in face-to-face conversations exist, originating from different disciplines. How do these perspectives and the corresponding models contribute to building models for the design and realization of artificial, socially interacting conversational agents? Conversations are often embedded in a joint activity. The (shared) knowledge of the task, and the roles that agents play in it, largely determine the flow of joint activities in the conversation. How important are the observable non-verbal signals produced by the artificial agents for the engagement and fluency of the interaction and for the performance of the joint task?

Social signals, including those that regulate the flow of the conversation, are best studied in the concrete, practical context in which they naturally occur. Hence, we focus on the interactions between the behaviors of the participating subjects, not on isolated behaviors of individual subjects. We collect data in well-defined scenarios, including descriptions of specific tasks and roles, so that social signals can be interpreted within their scenario.
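
Purely as an illustration of what such scenario-grounded data could look like, the sketch below stores a recording together with its scenario description, participant roles, and time-stamped cue events. All names (Session, CueEvent, the example scenario and labels) are hypothetical and do not describe the project's actual annotation scheme.

```python
# Illustrative sketch of a scenario-grounded recording of social signals.
# Field names and example values are hypothetical.

from dataclasses import dataclass, field
from typing import Dict, List


@dataclass
class CueEvent:
    time: float       # seconds from the start of the recording
    participant: str  # e.g. "P1"
    modality: str     # e.g. "gaze", "head", "speech"
    label: str        # e.g. "gaze_at:P2", "nod", "turn_start"


@dataclass
class Session:
    scenario: str                 # e.g. "route-planning game"
    roles: Dict[str, str]         # participant id -> role in the task
    events: List[CueEvent] = field(default_factory=list)


session = Session(scenario="route-planning game",
                  roles={"P1": "instructor", "P2": "follower"})
session.events.append(CueEvent(1.2, "P1", "gaze", "gaze_at:P2"))
session.events.append(CueEvent(1.4, "P2", "speech", "turn_start"))
```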

We will restrict our studies to these specific scenarios and focus on a selection of interesting social behaviors that occur in them, and on those qualities of social encounters that prove to be important in the selected scenarios. We will perform perception studies of conversational interactions in which subjects are either outside observers or participants and report how they perceive and assess the interaction and the agents' behaviors. We will then generate virtual human behavior and validate the usefulness of the underlying models by means of similar perception experiments in which users interact with, or observe, virtual humans showing the simulated behavior in the same scenario.

WP Leader: