{ "cells": [ { "cell_type": "markdown", "id": "b802985c", "metadata": { "origin_pos": 0 }, "source": [ "# Hyperparameter Optimization\n", ":label:`chap_hyperopt`\n", "\n", "**Aaron Klein** (*Amazon*), **Matthias Seeger** (*Amazon*), and **Cedric Archambeau** (*Amazon*)\n", "\n", "The performance of every machine learning model depends on its hyperparameters.\n", "They control the learning algorithm or the structure of the underlying\n", "statistical model. However, there is no general way to choose hyperparameters\n", "in practice. Instead, hyperparameters are often set in a trial-and-error manner\n", "or sometimes left to their default values by practitioners, leading to\n", "suboptimal generalization.\n", "\n", "Hyperparameter optimization provides a systematic approach to this problem, by\n", "casting it as an optimization problem: a good set of hyperparameters should (at\n", "least) minimize a validation error. Compared to most other optimization problems\n", "arising in machine learning, hyperparameter optimization is a nested one, where\n", "each iteration requires training and validating a machine learning model.\n", "\n", "In this chapter, we will first introduce the basics of hyperparameter\n", "optimization. We will also present some recent advancements that improve the\n", "overall efficiency of hyperparameter optimization by exploiting cheap-to-evaluate\n", "proxies of the original objective function. At the end of this chapter, you\n", "should be able to apply state-of-the-art hyperparameter optimization techniques\n", "to optimize the hyperparameter of your own machine learning algorithm.\n", "\n", ":begin_tab:toc\n", " - [hyperopt-intro](hyperopt-intro.ipynb)\n", " - [hyperopt-api](hyperopt-api.ipynb)\n", " - [rs-async.md](rs-async.md.ipynb)\n", " - [sh-intro](sh-intro.ipynb)\n", " - [sh-async](sh-async.ipynb)\n", ":end_tab:\n" ] } ], "metadata": { "language_info": { "name": "python" }, "required_libs": [] }, "nbformat": 4, "nbformat_minor": 5 }