Explainable virtual agents provide insight into an agent's decision-making process, aiming to improve the user's acceptance of the agent's actions or recommendations. However, explainable agents commonly draw on their own knowledge and goals when providing explanations, rather than on the beliefs, plans, or goals of the user. Little is known about how users perceive such tailored explanations or about their impact on behaviour change. In this paper, we explore the role of belief-based explanation by proposing a user-aware explainable agent that extends a cognitive agent architecture with a user model and an explanation engine to provide tailored explanations. To draw a clear conclusion about the role of explanation in behaviour change intentions, we investigated whether the level of behaviour change intention is driven by agent-user rapport built through empathic language, or by trust in the agent's understanding established through explanation. Accordingly, we designed two versions of a virtual advisor agent, empathic and neutral, to help reduce study stress among university students, and measured students' rapport levels and intentions to change their behaviour. Our results showed that, with the help of explanations, the agent could build a trusted relationship with the user regardless of the level of rapport. The results further showed that nearly all of the agent's recommendations significantly increased users' intentions to change their behaviour.