1. A framework to build bespoke auto-tuners with structured Bayesian optimisation
- Author
Valentin Dalibard
- Abstract
Due to their complexity, modern computer systems expose many configuration parameters which users must manually tune to maximise the performance of their applications. To relieve users of this burden, auto-tuning has emerged as an alternative, in which a black-box optimiser iteratively evaluates configurations to find efficient ones. A popular auto-tuning technique is Bayesian optimisation, which uses the results of past evaluations to incrementally build a probabilistic model of the impact of the parameters on performance, allowing the optimisation to quickly focus on efficient regions of the configuration space (a simplified sketch of this loop is given below, after this entry). Unfortunately, for many computer systems, either the configuration space is too large to develop a good model, or evaluating performance takes too long for many evaluations to be feasible.

In this dissertation, I argue that by extracting a small amount of domain-specific knowledge about a system, it is possible to build a bespoke auto-tuner with significantly better performance than its off-the-shelf counterparts. This could be done, for example, by a system engineer who has a good understanding of the underlying system behaviour and wants to provide performance portability. This dissertation presents BOAT, a framework to build BespOke Auto-Tuners. BOAT offers a novel set of abstractions designed to make the exposition of domain knowledge easy and intuitive.

First, I introduce Structured Bayesian Optimisation (SBO), an extension of the Bayesian optimisation algorithm. SBO can leverage a bespoke probabilistic model of the system’s behaviour, provided by the system engineer, to rapidly converge to high-performance configurations. The model can benefit from observing many runtime measurements per evaluation of the system, akin to the use of profilers.

Second, I present Probabilistic-C++, a lightweight, high-performance probabilistic programming library. It allows users to declare a probabilistic model of their system’s behaviour and expose it to an SBO. Probabilistic programming is a recent tool from the machine learning community that makes the declaration of structured probabilistic models intuitive.

Third, I present a new optimisation scheduling abstraction which offers a structured way to express optimisations which themselves execute other optimisations. For example, this is useful to express Bayesian optimisations, which at each iteration execute a numerical optimisation. Furthermore, it allows users to easily implement decompositions that exploit the loose coupling of subsets of the configuration parameters to optimise them almost independently.
- Published
2017
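As a point of reference, the following is a minimal sketch of the iterative evaluate-and-model loop that Bayesian-optimisation-based auto-tuners build on. It is an illustration only, not BOAT's or Probabilistic-C++'s API: the names (`Configuration`, `ToyModel`, `measure_runtime`) are hypothetical, the surrogate is a crude distance-weighted average rather than a genuine probabilistic model, and there is no acquisition function. The point is the loop structure: propose a configuration, measure the real system, feed the result back into the model, and repeat.

```cpp
// Generic sketch of a Bayesian-optimisation-style auto-tuning loop.
// All names here are illustrative placeholders, not the BOAT/Probabilistic-C++ API.
#include <cstdio>
#include <limits>
#include <random>
#include <utility>
#include <vector>

// A configuration is a point in the parameter space being tuned.
struct Configuration {
    int threads;     // e.g. number of worker threads
    int batch_size;  // e.g. size of a processing batch
};

// Stand-in for the system under tuning: returns a runtime in seconds.
// In a real auto-tuner this would launch the system and measure it.
double measure_runtime(const Configuration& c) {
    return 100.0 / c.threads + 0.01 * c.batch_size + 0.5 * c.threads;
}

// Toy surrogate "model": remembers observations and scores candidates by
// distance-weighted interpolation. A real structured model (as in SBO) would
// instead encode the engineer's knowledge of the system's behaviour.
struct ToyModel {
    std::vector<std::pair<Configuration, double>> observations;

    void observe(const Configuration& c, double runtime) {
        observations.push_back({c, runtime});
    }

    double predict(const Configuration& c) const {
        if (observations.empty()) return 0.0;
        double num = 0.0, den = 0.0;
        for (const auto& [obs, runtime] : observations) {
            double d2 = double(obs.threads - c.threads) * (obs.threads - c.threads) +
                        double(obs.batch_size - c.batch_size) * (obs.batch_size - c.batch_size);
            double w = 1.0 / (1.0 + d2);
            num += w * runtime;
            den += w;
        }
        return num / den;
    }
};

int main() {
    std::mt19937 rng(42);
    std::uniform_int_distribution<int> threads_dist(1, 32);
    std::uniform_int_distribution<int> batch_dist(1, 512);

    ToyModel model;
    Configuration best{};
    double best_runtime = std::numeric_limits<double>::infinity();

    // The auto-tuning loop: propose a configuration, evaluate it on the real
    // system, feed the measurement back into the model, keep the best so far.
    for (int iteration = 0; iteration < 20; ++iteration) {
        // Candidate selection: score random candidates with the model and pick
        // the most promising one (a stand-in for the inner numerical optimisation
        // that each Bayesian optimisation iteration performs).
        Configuration candidate{threads_dist(rng), batch_dist(rng)};
        double candidate_score = model.predict(candidate);
        for (int i = 0; i < 100; ++i) {
            Configuration c{threads_dist(rng), batch_dist(rng)};
            double score = model.predict(c);
            if (score < candidate_score) {
                candidate = c;
                candidate_score = score;
            }
        }

        double runtime = measure_runtime(candidate);
        model.observe(candidate, runtime);

        if (runtime < best_runtime) {
            best_runtime = runtime;
            best = candidate;
        }
        std::printf("iter %2d: threads=%2d batch=%3d runtime=%.2fs (best %.2fs)\n",
                    iteration, candidate.threads, candidate.batch_size,
                    runtime, best_runtime);
    }
    return 0;
}
```

In the dissertation's setting, the hand-rolled ToyModel would correspond to a structured probabilistic model of the system's behaviour written by the system engineer in Probabilistic-C++, and the inner candidate search corresponds to the numerical optimisation that each Bayesian optimisation iteration executes, which is where the optimisation scheduling abstraction comes in.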