Human-agent collaborations: trust in negotiating control

Daronnat, Sylvain and Azzopardi, Leif and Halvey, Martin and Dubiel, Mateusz (2019) Human-agent collaborations: trust in negotiating control. In: CHI 2019, 2019-05-05 - 2019-05-05, Glasgow SEC.



For human-agent collaborations to prosper, end-users need to trust the agent(s) they interact with. This is especially important in scenarios where users and agents negotiate control in real time in order to achieve objectives (e.g. helping surgeons with precision tasks, parking a semi-autonomous car, or completing objectives in a video game). Too much trust, and the user may over-rely on the agent; too little, and the user may under-utilise it. Moreover, measuring trust and trustworthiness is difficult and presents a number of challenges. In this paper, we discuss current approaches to measuring trust and explain why they can be inadequate in real-time settings, where it is critical to know the extent to which the user currently trusts the agent. We then describe our attempts to quantify the relationship between trust, performance and control.