Abstract
|
Reputation systems help users distinguish between trustworthy and malicious or unreliable services. They collect and evaluate available user opinions about services and about other users in order to estimate the trustworthiness of a specified service. The usefulness of a reputation system highly depends on its underlying trust model, i.e., the representation of trust values and the methods to calculate with these trust values. Several proposed trust models that allow representing degrees of trust, ignorance and distrust show undesired properties when conflicting opinions are combined. The proposed consensus operators usually eliminate the incurred degree of conflict and perform a re-normalization. We argue that this elimination causes counterintuitive effects and should thus be avoided. Therefore, we propose a new representation of trust values that also reflects the degree of conflict, and we develop a calculus and operators to compute reputation values. Our approach requires no re-normalizations and thereby avoids the undesired effects they cause.
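The general idea of keeping conflict explicit rather than normalizing it away can be sketched as follows. This is not the paper's own calculus, but a minimal illustration using unnormalized Dempster-Shafer combination on a binary frame, where an opinion is a hypothetical triple (belief, disbelief, uncertainty); the function name and representation are assumptions for illustration only.

```python
import math

def combine(op1, op2):
    """Combine two opinions (belief, disbelief, uncertainty) without
    re-normalization: mass assigned to contradictory pairings is kept
    as an explicit conflict component instead of being redistributed.
    (Illustrative sketch, not the operator defined in the paper.)"""
    b1, d1, u1 = op1
    b2, d2, u2 = op2
    b = b1 * b2 + b1 * u2 + u1 * b2  # both believe, or one believes while the other is ignorant
    d = d1 * d2 + d1 * u2 + u1 * d2  # analogous for disbelief
    u = u1 * u2                      # both ignorant
    c = b1 * d2 + d1 * b2            # contradictory pairings: kept, not normalized away
    return (b, d, u, c)              # the four components still sum to 1

# Two strongly conflicting opinions about the same service:
b, d, u, c = combine((0.8, 0.1, 0.1), (0.2, 0.7, 0.1))
print(b, d, u, c)  # the conflict component c is large and remains visible
```

Dempster's rule would instead discard `c` and divide the remaining components by `(1 - c)`; keeping `c` is what lets a reputation system report how contradictory the collected opinions are.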
|
Reference entry
|
Gutscher, A.
Reasoning with Uncertain and Conflicting Opinions in Open Reputation Systems
Proceedings of the 4th International Workshop on Security and Trust Management 2008 (STM 2008), Trondheim, June 2008
|