Performance assessment of a system for reasoning under uncertainty
Journal contribution
Posted on 2024-11-02, 16:28, authored by Branko Ristic, Christopher Gilliam, Marion Byrne

© 2021 Elsevier B.V.

From the early developments of machines for reasoning and decision making in higher-level information fusion, there has been a need for systematic and reliable evaluation of their performance. Performance evaluation is important for the comparison and assessment of alternative solutions to real-world problems. In this paper we focus on one aspect of performance assessment for reasoning under uncertainty: the accuracy of the resulting belief (prediction or estimate). We propose an assessment framework based on the assumption that the system under investigation is uncertain only due to stochastic variability (randomness), which is partially known. In this context we formulate a distance measure between the “ground truth” and the output of an automated reasoning system expressed in one of the non-additive uncertainty formalisms (such as imprecise probability theory, belief function theory or possibility theory). The proposed assessment framework is demonstrated with a simple numerical example.
Journal: Information Fusion
Volume: 71
Start page: 11
End page: 16
Total pages: 6
Publisher: Elsevier
Place published: United Kingdom
Language: English
Former identifier: 2006104863
Esploro creation date: 2021-04-21