eprintid: 260200
rev_number: 192
eprint_status: archive
userid: 53633
source: http://eprints.ecs.soton.ac.uk/id/eprint/10200
dir: disk0/00/26/02/00
datestamp: 2005-03-02
lastmod: 2024-03-15 03:22:27
status_changed: 2011-03-01 11:05:12
type: thesis
metadata_visibility: show
item_issues_count: 0
ispublished: pub
full_text_status: public
keywords: Negotiation, Trust, Persuasion, Argumentation-based Negotiation, Multi-Agent Systems
date: 2004
institution: University of Southampton
department: ECS
divisions: fbc6e442-3f49-48b5-a45a-74aadb6728c7
qualification_name: Doctor of Philosophy
creators_name: Ramchurn, Sarvapali
creators_id: 1d62ae2a-a498-444e-912d-a6082d3aaea3
creators_orcid: 0000-0001-9686-4302
creators_hidden: FALSE
contributors_type: http://www.loc.gov/loc.terms/relators/AUT
contributors_name: Ramchurn, Sarvapali
contributors_id: 1d62ae2a-a498-444e-912d-a6082d3aaea3
contributors_orcid: 0000-0001-9686-4302
contributors_hidden: FALSE
title: Multi-Agent Negotiation using Trust and Persuasion
abstract: In this thesis, we propose a panoply of tools and techniques to manage inter-agent dependencies in open, distributed multi-agent systems that have significant degrees of uncertainty. In particular, we focus on situations in which agents are involved in repeated interactions where they need to negotiate to resolve conflicts that may arise between them. To this end, we endow agents with decision making models that exploit the notion of trust and use persuasive techniques during the negotiation process to reduce the level of uncertainty and achieve better deals in the long run.

Firstly, we develop and evaluate a new trust model (called CREDIT) that allows agents to measure the degree of trust they should place in their opponents. This model reduces the uncertainty that agents have about their opponents' reliability. Thus, over repeated interactions, CREDIT enables agents to model their opponents' reliability using probabilistic techniques and a fuzzy reasoning mechanism that allows the combination of measures based on reputation (indirect interactions) and confidence (direct interactions). In so doing, CREDIT takes a wider range of behaviour-influencing factors into account than existing models, including the norms of the agents and the institution within which transactions occur. We then explore a novel application of trust models by showing how the measures developed in CREDIT can be applied to negotiations in multiple encounters. Specifically, we show that agents that use CREDIT are able to avoid unreliable agents, both during the selection of interaction partners and during the negotiation process itself, by using trust to adjust their negotiation stance. Also, we empirically show that agents are able to reach good deals with agents that are unreliable to some degree (rather than completely unreliable) and with those that try to strategically exploit their opponents.

Secondly, having applied CREDIT to negotiations, we further extend the application of trust to reduce uncertainty about the reliability of agents in mechanism design (where the honesty of agents is elicited by the protocol). Thus, we develop trust-based mechanism design (TBMD), which allows agents using a trust model (such as CREDIT) to reach efficient agreements that choose the most reliable agents in the long run. In particular, we show that our mechanism enforces truth-telling from the agents (i.e. it is incentive compatible), both about how reliable they perceive their opponents to be and about their valuations for the goods to be traded.
In proving the latter properties, our trust-based mechanism is shown to be the first reputation mechanism that implements individual rationality, incentive compatibility, and efficiency. Our trust-based mechanism is also empirically evaluated and shown to be better than other comparable models at reaching the outcome that maximises all the negotiating agents' utilities and at choosing the most reliable agents in the long run.

Thirdly, having explored ways to reduce uncertainties about reliability and honesty, we use persuasive negotiation techniques to tackle issues associated with the uncertainties that agents have about their opponents' preferences and the space of possible agreements. To this end, we propose a novel protocol and reasoning mechanism that agents can use to generate and evaluate persuasive elements, such as promises of future rewards, to support the offers they make during negotiation. These persuasive elements aim to make offers more attractive over multiple encounters given the absence of information about an opponent's discount factors or exact payoffs. Specifically, we empirically demonstrate that agents are able to achieve a larger number of agreements and a higher expected utility over repeated encounters when they are given the capability to give or ask for rewards. Moreover, we develop a novel strategy using this protocol and show that it outperforms existing state-of-the-art heuristic negotiation models.

Finally, the applicability of persuasive negotiation and CREDIT is exemplified through a practical implementation in a pervasive computing environment. In this context, the negotiation mechanism is implemented in an instant messaging platform (JABBER) and used to resolve conflicts between group and individual preferences that arise in a meeting room scenario. In particular, we show how persuasive negotiation and trust permit flexible management of interruptions by allowing intrusions to happen at appropriate times during the meeting while still satisfying the preferences of all parties present.
date_type: published
thesis_type: phd
related_urls_url: http://www.ecs.soton.ac.uk/~sdr01r/mypubs/accepted/thesis.pdf
languages_3char: eng
organisations: University of Southampton
organisations: Agents, Interactions & Complexity
pure_uuid: 991d5b77-d42b-4e78-9acf-20a3cedfabb7
fp7_type: info:eu-repo/semantics/other
dates_date: 2004
dates_date_type: published
hoa_date_pub: 2005-03-02
citation: Ramchurn, Sarvapali (2004) Multi-Agent Negotiation using Trust and Persuasion. University of Southampton, ECS, Doctoral Thesis.
document_url: https://eprints.soton.ac.uk/260200/1/thesis.pdf
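
To make the decision-making idea in the abstract concrete, the following is a minimal, hypothetical Python sketch of an agent combining confidence (from direct interactions) with reputation (from third-party reports) into a trust value, and using that value to adjust its negotiation stance. This is an illustration only: the weighting scheme, saturation threshold, and reservation-price rule are assumptions of this sketch, not the CREDIT model or the mechanisms defined in the thesis.

# Illustrative sketch only: a toy trust measure in the spirit of the abstract.
# All weights, thresholds, and rules below are assumptions for illustration,
# not the CREDIT model described in the thesis.

from dataclasses import dataclass, field
from statistics import mean


@dataclass
class TrustRecord:
    # 1.0 = contract honoured, 0.0 = defaulted (direct interactions)
    direct_outcomes: list[float] = field(default_factory=list)
    # third-party ratings in [0, 1] (indirect interactions)
    reputation_reports: list[float] = field(default_factory=list)

    def confidence(self) -> float:
        """Confidence from direct interactions; neutral 0.5 prior when none exist."""
        return mean(self.direct_outcomes) if self.direct_outcomes else 0.5

    def reputation(self) -> float:
        """Reputation from indirect reports; neutral 0.5 prior when none exist."""
        return mean(self.reputation_reports) if self.reputation_reports else 0.5

    def trust(self) -> float:
        """Weight direct experience more heavily as it accumulates (assumed scheme)."""
        w = min(1.0, len(self.direct_outcomes) / 10)  # assumed saturation after 10 encounters
        return w * self.confidence() + (1 - w) * self.reputation()


def reservation_price(base_price: float, trust: float, risk_premium: float = 0.2) -> float:
    """Toughen the negotiation stance against less trusted opponents by demanding
    a discount proportional to the perceived risk (assumed rule)."""
    return base_price * (1 - risk_premium * (1 - trust))


if __name__ == "__main__":
    record = TrustRecord(direct_outcomes=[1, 1, 0, 1], reputation_reports=[0.9, 0.6])
    t = record.trust()
    print(f"trust = {t:.2f}, reservation price = {reservation_price(100.0, t):.2f}")

In this toy version, an agent that has mostly honoured its contracts and carries decent reports earns a trust value near 1, so the negotiating agent concedes close to its normal reservation price; a poorly trusted agent is offered tougher terms, echoing the abstract's point about using trust to adjust the negotiation stance.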