contents

front matter

preface

acknowledgments

about this book

about the author

about the cover illustration

Part 1 Foundations of TensorFlow 2 and deep learning

1 The amazing world of TensorFlow

1.1 What is TensorFlow?

An overview of popular components of TensorFlow

Building and deploying a machine learning model

1.2 GPU vs. CPU

1.3 When and when not to use TensorFlow

When to use TensorFlow

When not to use TensorFlow

1.4 What will this book teach you?

TensorFlow fundamentals

Deep learning algorithms

Monitoring and optimization

1.5 Who is this book for?

1.6 Should we really care about Python and TensorFlow 2?

2 TensorFlow 2

2.1 First steps with TensorFlow 2

How does TensorFlow operate under the hood?

2.2 TensorFlow building blocks

Understanding tf.Variable

Understanding tf.Tensor

Understanding tf.Operation

2.3 Neural network–related computations in TensorFlow

Matrix multiplication

Convolution operation

Pooling operation

3 Keras and data retrieval in TensorFlow 2

3.1 Keras model-building APIs

Introducing the data set

The Sequential API

The functional API

The subclassing API

3.2 Retrieving data for TensorFlow/Keras models

tf.data API

Keras DataGenerators

tensorflow-datasets package

4 Dipping toes in deep learning

4.1 Fully connected networks

Understanding the data

Autoencoder model

4.2 Convolutional neural networks

Understanding the data

Implementing the network

4.3 One step at a time: Recurrent neural networks (RNNs)

Understanding the data

Implementing the model

Predicting future CO2 values with the trained model

5 State-of-the-art in deep learning: Transformers

5.1 Representing text as numbers

5.2 Understanding the Transformer model

The encoder-decoder view of the Transformer

Diving deeper

Self-attention layer

Understanding self-attention using scalars

Self-attention as a cooking competition

Masked self-attention layers

Multi-head attention

Fully connected layer

Putting everything together

Part 2 Look ma, no hands! Deep networks in the real world

6 Teaching machines to see: Image classification with CNNs

6.1 Putting the data under the microscope: Exploratory data analysis

The folder/file structure

Understanding the classes in the data set

Computing simple statistics on the data set

6.2 Creating data pipelines using the Keras ImageDataGenerator

6.3 Inception net: Implementing a state-of-the-art image classifier

Recap on CNNs

Inception net v1

Putting everything together

Other Inception models

6.4 Training the model and evaluating performance

7 Teaching machines to see better: Improving CNNs and making them confess

7.1 Techniques for reducing overfitting

Image data augmentation with Keras

Dropout: Randomly switching off parts of your network to improve generalizability

Early stopping: Halting the training process if the network starts to underperform

7.2 Toward minimalism: Minception instead of Inception

Implementing the stem

Implementing the Inception-ResNet type A block

Implementing the Inception-ResNet type B block

Implementing the reduction block

Putting everything together

Training Minception

7.3 If you can't beat them, join 'em: Using pretrained networks for enhancing performance

Transfer learning: Reusing existing knowledge in deep neural networks

7.4 Grad-CAM: Making CNNs confess

8 Telling things apart: Image segmentation

8.1 Understanding the data

8.2 Getting serious: Defining a TensorFlow data pipeline

Optimizing tf.data pipelines

The final tf.data pipeline

8.3 DeepLabv3: Using pretrained networks to segment images

A quick overview of the ResNet-50 model

Atrous convolution: Increasing the receptive field of convolution layers with holes

Implementing DeepLabv3 using the Keras functional API

Implementing the atrous spatial pyramid pooling module

Putting everything together

8.4 Compiling the model: Loss functions and evaluation metrics in image segmentation

Loss functions

Evaluation metrics

8.5 Training the model

8.6 Evaluating the model

9 Natural language processing with TensorFlow: Sentiment analysis

9.1 What the text? Exploring and processing text

9.2 Getting text ready for the model

Splitting training/validation and testing data

Analyzing the vocabulary

Analyzing the sequence length

Text to words and then to numbers with Keras

9.3 Defining an end-to-end NLP pipeline with TensorFlow

9.4 Happy reviews mean happy customers: Sentiment analysis

LSTM networks

Defining the final model

9.5 Training and evaluating the model

9.6 Injecting semantics with word vectors

Word embeddings

Defining the final model with word embeddings

Training and evaluating the model

10 Natural language processing with TensorFlow: Language modeling

10.1 Processing the data

What is language modeling?

Downloading and playing with data

Vocabulary too large? N-grams to the rescue

Tokenizing text

Defining a tf.data pipeline

10.2 GRUs in Wonderland: Generating text with deep learning

10.3 Measuring the quality of the generated text

10.4 Training and evaluating the language model

10.5 Generating new text from the language model: Greedy decoding

10.6 Beam search: Enhancing the predictive power of sequential models

Part 3 Advanced deep networks for complex problems

11 Sequence-to-sequence learning: Part 1

11.1 Understanding the machine translation data

11.2 Writing an English-German seq2seq machine translator

The TextVectorization layer

Defining the TextVectorization layers for the seq2seq model

Defining the encoder

Defining the decoder and the final model

Compiling the model

11.3 Training and evaluating the model

11.4 From training to inference: Defining the inference model

12 Sequence-to-sequence learning: Part 2

12.1 Eyeballing the past: Improving our model with attention

Implementing Bahdanau attention in TensorFlow

Defining the final model

Training the model

12.2 Visualizing the attention

13 Transformers

13.1 Transformers in more detail

Revisiting the basic components of the Transformer

Embeddings in the Transformer

Residuals and normalization

13.2 Using pretrained BERT for spam classification

Understanding BERT

Classifying spam with BERT in TensorFlow

13.3 Question answering with Hugging Face’s Transformers

Understanding the data

Processing data

Defining the DistilBERT model

Training the model

Asking BERT a question

14 TensorBoard: Big brother of TensorFlow

14.1 Visualizing data with TensorBoard

14.2 Tracking and monitoring models with TensorBoard

14.3 Using tf.summary to write custom metrics during model training

14.4 Profiling models to detect performance bottlenecks

Optimizing the input pipeline

Mixed precision training

14.5 Visualizing word vectors with TensorBoard

15 TFX: MLOps and deploying models with TensorFlow

15.1 Writing a data pipeline with TFX

Loading data from CSV files

Generating basic statistics from the data

Inferring the schema from data

Converting data to features

15.2 Training a simple regression neural network: TFX Trainer API

Defining a Keras model

Defining the model training

SignatureDefs: Defining how models are used outside TensorFlow

Training the Keras model with TFX Trainer

15.3 Setting up Docker to serve a trained model

15.4 Deploying the model and serving it through an API

Validating the infrastructure

Resolving the correct model

Evaluating the model

Pushing the final model

Predicting with the TensorFlow Serving API

appendix A Setting up the environment

appendix B Computer vision

appendix C Natural language processing

index
