{ "cells": [ { "cell_type": "markdown", "id": "34b05043", "metadata": { "origin_pos": 0 }, "source": [ "# Multilayer Perceptrons\n", ":label:`chap_perceptrons`\n", "\n", "In this chapter, we will introduce your first truly *deep* network.\n", "The simplest deep networks are called *multilayer perceptrons*,\n", "and they consist of multiple layers of neurons\n", "each fully connected to those in the layer below\n", "(from which they receive input)\n", "and those above (which they, in turn, influence).\n", "Although automatic differentiation\n", "significantly simplifies the implementation of deep learning algorithms,\n", "we will dive deep into how these gradients\n", "are calculated in deep networks.\n", "Then we will\n", "be ready to\n", "discuss issues relating to numerical stability and parameter initialization\n", "that are key to successfully training deep networks.\n", "When we train such high-capacity models we run the risk of overfitting. Thus, we will\n", "revisit regularization and generalization\n", "for deep networks.\n", "Throughout, we aim\n", "to give you a firm grasp not just of the concepts but also of the practice of using deep networks.\n", "At the end of this chapter, we apply what we have introduced so far to a real case: house price\n", "prediction. We punt matters relating to the computational performance, scalability, and efficiency\n", "of our models to subsequent chapters.\n", "\n", ":begin_tab:toc\n", " - [mlp](mlp.ipynb)\n", " - [mlp-implementation](mlp-implementation.ipynb)\n", " - [backprop](backprop.ipynb)\n", " - [numerical-stability-and-init](numerical-stability-and-init.ipynb)\n", " - [generalization-deep](generalization-deep.ipynb)\n", " - [dropout](dropout.ipynb)\n", " - [kaggle-house-price](kaggle-house-price.ipynb)\n", ":end_tab:\n" ] } ], "metadata": { "language_info": { "name": "python" }, "required_libs": [] }, "nbformat": 4, "nbformat_minor": 5 }