TY - JOUR
T1 - Trust management and trust theory revision
AU - Ma, Ji
AU - Orgun, Mehmet A.
N1 - Copyright 2006 IEEE. Reprinted from IEEE transactions on systems, man, and cybernetics. Part A : Systems and humans. This material is posted here with permission of the IEEE. Such permission of the IEEE does not in any way imply IEEE endorsement of any of Macquarie University’s products or services. Internal or personal use of this material is permitted. However, permission to reprint/republish this material for advertising or promotional purposes or for creating new collective works for resale or redistribution must be obtained from the IEEE by writing to [email protected]. By choosing to view this document, you agree to all provisions of the copyright laws protecting it.
PY - 2006/5
Y1 - 2006/5
N2 - A theory of trust for a given system consists of a set of rules that describe trust of agents in the system. In a certain logical framework, the theory is generally established based on the initial trust of agents in the security mechanisms of the system. Such a theory provides a foundation for reasoning about agent beliefs as well as security properties that the system may satisfy. However, trust changes dynamically. When agents lose their trust or gain new trust in a dynamic environment, the theory established based on the initial trust of agents in the system must be revised, otherwise it can no longer be used for any security purpose. This paper investigates the factors influencing trust of agents and discusses how to revise theories of trust in dynamic environments. A methodology for revising and managing theories of trust for multiagent systems is proposed. This methodology includes a method for modeling trust changes, a method for expressing theory changes, and a technique for obtaining a new theory based on a given trust change. The proposed approach is very general and can be applied to obtain an evolving theory of trust for agent-based systems.
UR - http://www.scopus.com/inward/record.url?scp=33646890642&partnerID=8YFLogxK
U2 - 10.1109/TSMCA.2006.871628
DO - 10.1109/TSMCA.2006.871628
M3 - Article
AN - SCOPUS:33646890642
SN - 1083-4427
VL - 36
SP - 451
EP - 460
JO - IEEE Transactions on Systems, Man, and Cybernetics Part A: Systems and Humans
JF - IEEE Transactions on Systems, Man, and Cybernetics Part A: Systems and Humans
IS - 3
ER -