RMIT University

Performance assessment of a system for reasoning under uncertainty

journal contribution
posted on 2024-11-02, 16:28 authored by Branko Ristic, Christopher Gilliam, Marion Byrne
© 2021 Elsevier B.V. From the earliest developments of machines for reasoning and decision making in higher-level information fusion, there has been a need for systematic and reliable evaluation of their performance. Performance evaluation is important for comparing and assessing alternative solutions to real-world problems. In this paper we focus on one aspect of performance assessment for reasoning under uncertainty: the accuracy of the resulting belief (prediction or estimate). We propose an assessment framework based on the assumption that the system under investigation is uncertain only due to stochastic variability (randomness), which is partially known. In this context we formulate a distance measure between the “ground truth” and the output of an automated reasoning system expressed in one of the non-additive uncertainty formalisms (such as imprecise probability theory, belief function theory or possibility theory). The proposed assessment framework is demonstrated with a simple numerical example.
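The abstract does not reproduce the paper's distance measure. Purely as an illustrative sketch, and not the authors' method, the snippet below shows one common way to score a belief-function output against a known ground truth: convert the Dempster–Shafer mass function to a pignistic probability distribution and read off the probability assigned to the true outcome. The frame, mass values, and scoring rule are all assumptions for illustration.

```python
# Illustrative sketch only: not the distance measure proposed in the paper.
# Scores a belief-function (Dempster-Shafer) output against a ground truth
# via the pignistic transform.

def pignistic(mass):
    """Map a mass function {frozenset_of_outcomes: mass} to a pignistic
    probability distribution over singleton outcomes."""
    betp = {}
    for focal, m in mass.items():
        share = m / len(focal)          # split mass equally among elements
        for x in focal:
            betp[x] = betp.get(x, 0.0) + share
    return betp

def accuracy_score(mass, truth):
    """Score in [0, 1]: pignistic probability assigned to the true outcome."""
    return pignistic(mass).get(truth, 0.0)

# Hypothetical example: frame {a, b, c}, belief output of a reasoning system
mass = {
    frozenset({"a"}): 0.6,
    frozenset({"a", "b"}): 0.2,
    frozenset({"a", "b", "c"}): 0.2,
}
print(round(accuracy_score(mass, "a"), 4))  # ≈ 0.7667
```

A score near 1 means the system's belief concentrates on the true outcome; other scoring rules (e.g. distances between the pignistic distribution and a point mass on the truth) follow the same pattern.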

History

Related Materials

  1. DOI - Is published in 10.1016/j.inffus.2021.01.006
  2. ISSN - Is published in 1566-2535

Journal

Information Fusion

Volume

71

Start page

11

End page

16

Total pages

6

Publisher

Elsevier

Place published

United Kingdom

Language

English

Former Identifier

2006104863

Esploro creation date

2021-04-21