
If you've been curious about machine learning but didn't know where to start, this is the book you've been waiting for. Focusing on the subfield of machine learning known as deep learning, it explains core concepts and gives you the foundation you need to start building your own models. Rather than simply outlining recipes for using existing toolkits, Practical Deep Learning teaches you the why of deep learning and will inspire you to explore further.

All you need is basic familiarity with computer programming and high school math; the book will cover the rest. After an introduction to Python, you'll move through key topics like how to build a good training dataset, work with the scikit-learn and Keras libraries, and evaluate your models' performance.

You'll also learn:

• How to use classic machine learning models like k-Nearest Neighbors, Random Forests, and Support Vector Machines

• How neural networks work and how they're trained

• How to use convolutional neural networks

• How to develop a successful deep learning model from scratch

You'll conduct experiments along the way, building to a final case study that incorporates everything you've learned. All of the code you'll use is available at the linked examples repo.

The perfect introduction to this dynamic, ever-expanding field, Practical Deep Learning will give you the skills and confidence to dive into your own machine learning projects.

Table of Contents

  1. Cover Page
  2. Title Page
  3. Copyright Page
  4. Dedication
  5. About the Author
  6. About the Technical Reviewer
  7. BRIEF CONTENTS
  8. CONTENTS IN DETAIL
  9. FOREWORD
  10. ACKNOWLEDGMENTS
  11. INTRODUCTION
    1. Who Is This Book For?
    2. What Can You Expect to Learn?
    3. About This Book
  12. 1 GETTING STARTED
    1. The Operating Environment
    2. Installing the Toolkits
    3. Basic Linear Algebra
    4. Statistics and Probability
    5. Graphics Processing Units
    6. Summary
  13. 2 USING PYTHON
    1. The Python Interpreter
    2. Statements and Whitespace
    3. Variables and Basic Data Structures
    4. Control Structures
    5. Functions
    6. Modules
    7. Summary
  14. 3 USING NUMPY
    1. Why NumPy?
    2. Basic Arrays
    3. Accessing Elements in an Array
    4. Operators and Broadcasting
    5. Array Input and Output
    6. Random Numbers
    7. NumPy and Images
    8. Summary
  15. 4 WORKING WITH DATA
    1. Classes and Labels
    2. Features and Feature Vectors
    3. Features of a Good Dataset
    4. Data Preparation
    5. Training, Validation, and Test Data
    6. Look at Your Data
    7. Summary
  16. 5 BUILDING DATASETS
    1. Irises
    2. Breast Cancer
    3. MNIST Digits
    4. CIFAR-10
    5. Data Augmentation
    6. Summary
  17. 6 CLASSICAL MACHINE LEARNING
    1. Nearest Centroid
    2. k-Nearest Neighbors
    3. Naïve Bayes
    4. Decision Trees and Random Forests
    5. Support Vector Machines
    6. Summary
  18. 7 EXPERIMENTS WITH CLASSICAL MODELS
    1. Experiments with the Iris Dataset
    2. Experiments with the Breast Cancer Dataset
    3. Experiments with the MNIST Dataset
    4. Classical Model Summary
    5. When to Use Classical Models
    6. Summary
  19. 8 INTRODUCTION TO NEURAL NETWORKS
    1. Anatomy of a Neural Network
    2. Implementing a Simple Neural Network
    3. Summary
  20. 9 TRAINING A NEURAL NETWORK
    1. A High-Level Overview
    2. Gradient Descent
    3. Stochastic Gradient Descent
    4. Backpropagation
    5. Loss Functions
    6. Weight Initialization
    7. Overfitting and Regularization
    8. Summary
  21. 10 EXPERIMENTS WITH NEURAL NETWORKS
    1. Our Dataset
    2. The MLPClassifier Class
    3. Architecture and Activation Functions
    4. Batch Size
    5. Base Learning Rate
    6. Training Set Size
    7. L2 Regularization
    8. Momentum
    9. Weight Initialization
    10. Feature Ordering
    11. Summary
  22. 11 EVALUATING MODELS
    1. Definitions and Assumptions
    2. Why Accuracy Is Not Enough
    3. The 2 × 2 Confusion Matrix
    4. Metrics Derived from the 2 × 2 Confusion Matrix
    5. More Advanced Metrics
    6. The Receiver Operating Characteristics Curve
    7. Handling Multiple Classes
    8. Summary
  23. 12 INTRODUCTION TO CONVOLUTIONAL NEURAL NETWORKS
    1. Why Convolutional Neural Networks?
    2. Convolution
    3. Anatomy of a Convolutional Neural Network
    4. Convolutional Layers
    5. Pooling Layers
    6. Fully Connected Layers
    7. Fully Convolutional Layers
    8. Step by Step
    9. Summary
  24. 13 EXPERIMENTS WITH KERAS AND MNIST
    1. Building CNNs in Keras
    2. Basic Experiments
    3. Fully Convolutional Networks
    4. Scrambled MNIST Digits
    5. Summary
  25. 14 EXPERIMENTS WITH CIFAR-10
    1. A CIFAR-10 Refresher
    2. Working with the Full CIFAR-10 Dataset
    3. Animal or Vehicle?
    4. Binary or Multiclass?
    5. Transfer Learning
    6. Fine-Tuning a Model
    7. Summary
  26. 15 A CASE STUDY: CLASSIFYING AUDIO SAMPLES
    1. Building the Dataset
    2. Classifying the Audio Features
    3. Spectrograms
    4. Classifying Spectrograms
    5. Ensembles
    6. Summary
  27. 16 GOING FURTHER
    1. Going Further with CNNs
    2. Reinforcement Learning and Unsupervised Learning
    3. Generative Adversarial Networks
    4. Recurrent Neural Networks
    5. Online Resources
    6. Conferences
    7. The Book
    8. So Long and Thanks for All the Fish
  28. INDEX