RMIT University

Sub-linear Regret Bounds for Bayesian Optimisation in Unknown Search Spaces

conference contribution
posted on 2024-11-03, 13:55 authored by Hung Tran-The, Sunil Gupta, Santu Rana, Huong Ha, Svetha Venkatesh
Bayesian optimisation (BO) is a popular method for efficient optimisation of expensive black-box functions. Traditionally, BO assumes that the search space is known. However, in many problems this assumption does not hold. To this end, we propose a novel BO algorithm which expands (and shifts) the search space over iterations by controlling the expansion rate through a hyperharmonic series. Further, we propose another variant of our algorithm that scales to high dimensions. We show theoretically that for both our algorithms the cumulative regret grows at sub-linear rates. Our experiments with synthetic and real-world optimisation tasks demonstrate the superiority of our algorithms over the current state-of-the-art methods for Bayesian optimisation in unknown search spaces.
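As a rough illustration of the idea described in the abstract (not the authors' implementation), the following minimal Python sketch expands a box-shaped search space at each iteration by an amount proportional to 1/t^p with p > 1, so the per-iteration expansions form a convergent hyperharmonic series and the total growth stays bounded. The scikit-learn GP surrogate, the GP-UCB acquisition maximised by random search, and the quadratic objective are all illustrative assumptions.

# Illustrative sketch only: BO with a search space that expands each iteration
# at a rate controlled by a hyperharmonic series (sum of 1/t**p with p > 1).
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

def objective(x):
    # Hypothetical black-box function; its maximiser lies outside the initial box.
    return -np.sum((x - 2.0) ** 2)

rng = np.random.default_rng(0)
dim, p, gamma = 2, 1.5, 1.0                   # p > 1 => convergent expansion series
lower, upper = -np.ones(dim), np.ones(dim)    # initial (possibly misspecified) box

X = rng.uniform(lower, upper, size=(3, dim))  # a few initial observations
y = np.array([objective(x) for x in X])
gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True)

for t in range(1, 31):
    # Expand the box on every side by gamma / t**p at iteration t.
    step = gamma / t ** p
    lower, upper = lower - step, upper + step

    gp.fit(X, y)
    # GP-UCB acquisition, maximised by random sampling inside the current box.
    cand = rng.uniform(lower, upper, size=(512, dim))
    mu, sd = gp.predict(cand, return_std=True)
    x_next = cand[np.argmax(mu + 2.0 * sd)]

    X = np.vstack([X, x_next])
    y = np.append(y, objective(x_next))

print("best value found:", y.max(), "at", X[np.argmax(y)])

Because the expansion series converges, the search space cannot grow without bound, which is the mechanism the paper relies on to keep the cumulative regret sub-linear.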

Funding

Pattern analysis for accelerating scientific innovation

Australian Research Council


History

Start page: 1
End page: 11
Total pages: 11
Outlet: Proceedings of the 34th Conference on Neural Information Processing Systems (NeurIPS 2020)
Editors: H. Larochelle, M. Ranzato, R. Hadsell, M.F. Balcan and H. Lin
Name of conference: NeurIPS 2020
Publisher: arXiv
Place published: New York, United States
Start date: 2020-12-06
End date: 2020-12-12
Language: English
Former Identifier: 2006107681
Esploro creation date: 2021-08-11
