Children’s Overtrust: Intentional Use of Robot Errors to Decrease Trust

Robots are being developed to assist with young children in educational and other settings. Research suggests that children may overtrust robots, which can have negative consequences. We propose the use of intentional, egregious robot errors as one technique to mitigate such overtrust. Additionally, how robots attempt to recover from intentional and unintentional errors could also help reduce children’s trust in them. In this paper, we present our reasoning behind the purposeful use of errors, as well as suggestions for how various types of errors could be used to decrease trust in robots.

D. Y. Geiskkovitch and J. E. Young, "Children’s overtrust: Intentional use of robot errors to decrease trust," in Proceedings of the 29th International Conference on Robot & Human Interactive Communication (SCRITA Workshop), 2020.

Authors

James E. Young

Professor