RMIT University
Distributionally Robust Bayesian Quadrature Optimization

conference contribution
posted on 2024-11-03, 14:36, authored by Thanh Nguyen, Sunil Gupta, Huong Ha, Santu Rana, Svetha Venkatesh
Bayesian quadrature optimization (BQO) maximizes the expectation of an expensive black-box integrand taken over a known probability distribution. In this work, we study BQO under distributional uncertainty, in which the underlying probability distribution is unknown except for a limited set of its i.i.d. samples. A standard BQO approach maximizes the Monte Carlo estimate of the true expected objective given the fixed sample set. Though the Monte Carlo estimate is unbiased, it has high variance given a small set of samples and can therefore result in a spurious objective function. We take a distributionally robust optimization perspective on this problem by maximizing the expected objective under the most adversarial distribution. In particular, we propose a novel posterior sampling based algorithm, namely distributionally robust BQO (DRBQO), for this purpose. We demonstrate the empirical effectiveness of our proposed framework on synthetic and real-world problems, and characterize its theoretical convergence via Bayesian regret.
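
To make the contrast in the abstract concrete, the sketch below compares a plain Monte Carlo estimate of the expected objective with a pessimistic, worst-case estimate taken over reweightings of the same samples inside a chi-square-style ball. The choice of uncertainty set, its radius `rho`, and the closed-form worst case are illustrative assumptions made here for exposition only; this is not the paper's DRBQO algorithm, which is a posterior sampling procedure.

```python
# Illustrative sketch (assumed setup, not the paper's DRBQO algorithm):
# compare the standard Monte Carlo estimate of E[f(x, W)] from n i.i.d.
# samples with a worst-case estimate over reweightings of those samples
# inside a chi-square-style ball of radius rho.
import numpy as np


def monte_carlo_estimate(f_values: np.ndarray) -> float:
    """Standard Monte Carlo estimate: the empirical mean of f over the samples."""
    return float(np.mean(f_values))


def robust_estimate(f_values: np.ndarray, rho: float) -> float:
    """Worst-case expectation over sample weights w satisfying
    n * sum_i (w_i - 1/n)^2 <= rho and sum_i w_i = 1.

    Ignoring the nonnegativity of w (valid for small rho), the minimizer has
    the closed form mean(f) - sqrt(rho * var(f)).
    """
    mean = f_values.mean()
    var = f_values.var()  # population variance of the sampled objective values
    return float(mean - np.sqrt(rho * var))


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Hypothetical expensive integrand evaluated at one candidate x over
    # 20 i.i.d. samples of the environment variable W.
    f_values = np.sin(rng.normal(size=20)) + 1.0
    print("Monte Carlo estimate     :", monte_carlo_estimate(f_values))
    print("Robust estimate (rho=0.1):", robust_estimate(f_values, rho=0.1))
```

With few samples the robust estimate is strictly more pessimistic than the Monte Carlo mean, which is the kind of hedging against sampling variance that the distributionally robust formulation described in the abstract is intended to provide.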

History

Start page: 1921
End page: 1931
Total pages: 11
Outlet: Proceedings of the 23rd International Conference on Artificial Intelligence and Statistics (AISTATS 2020)
Name of conference: AISTATS 2020
Publisher: ML Research Press
Place published: United States
Start date: 2020-06-03
End date: 2020-06-05
Language: English
Copyright: © 2020 by the author(s).
Former Identifier: 2006107682
Esploro creation date: 2021-08-11
