TY - GEN
T1 - Reason explanation for encouraging behaviour change intention
AU - Abdulrahman, Amal
AU - Richards, Deborah
AU - Bilgin, Ayse Aysin
PY - 2021
Y1 - 2021
N2 - The demand for intelligent virtual advisors in our rapidly advancing world is rising and, consequently, the need to understand the reasoning process behind why a particular piece of advice is provided to the user is also increasing. Personalized explanation is regarded as a reliable way to improve the user's understanding of, and trust in, the virtual advisor. To date, cognitive explainable agents have used reason explanation by referring to their own mental state (beliefs and goals) to explain their own behaviour. However, when the explainable agent plays the role of a virtual advisor and recommends a behaviour for the human to perform, it is better to refer to the user's mental state, rather than the agent's, to form a reason explanation. In this paper, we develop an explainable virtual advisor (XVA) that communicates with the user to elicit the user's beliefs and goals and then tailors its advice, and the explanation of that advice, to the user's mental state. We tested the proposed XVA with university students, with the XVA providing tips to reduce the students' study stress. We measured the impact of receiving three different patterns of tailored explanation (belief-based, goal-based, and belief-and-goal-based) on the students' intentions to change their behaviours. The results showed that the intention to change is related not only to the explanation pattern but also to the user's context, the relationship built with the agent, the type of behaviour recommended, and the user's current intention to perform the behaviour.
KW - Explainable agents
KW - Personal virtual advisor
KW - Reason explanation
KW - Behaviour change intention
KW - Working alliance
KW - Trust
UR - http://www.scopus.com/inward/record.url?scp=85112171963&partnerID=8YFLogxK
M3 - Conference proceeding contribution
AN - SCOPUS:85112171963
SN - 9781713832621
T3 - Proceedings of the International Joint Conference on Autonomous Agents and Multiagent Systems, AAMAS
SP - 68
EP - 77
BT - 20th International Conference on Autonomous Agents and Multiagent Systems, AAMAS 2021
PB - Association for Computing Machinery, Inc
CY - New York, NY
T2 - 20th International Conference on Autonomous Agents and Multiagent Systems, AAMAS 2021
Y2 - 3 May 2021 through 7 May 2021
ER -