Part 1 Foundations of TensorFlow 2 and deep learning
1 The amazing world of TensorFlow
An overview of popular components of TensorFlow
Building and deploying a machine learning model
1.3 When and when not to use TensorFlow
1.4 What will this book teach you?
1.6 Should we really care about Python and TensorFlow 2?
2.1 First steps with TensorFlow 2
How does TensorFlow operate under the hood?
2.2 TensorFlow building blocks
2.3 Neural network–related computations in TensorFlow
3 Keras and data retrieval in TensorFlow 2
3.2 Retrieving data for TensorFlow/Keras models
4 Dipping toes in deep learning
4.2 Convolutional neural networks
4.3 One step at a time: Recurrent neural networks (RNNs)
Predicting future CO2 values with the trained model
5 State-of-the-art in deep learning: Transformers
5.1 Representing text as numbers
5.2 Understanding the Transformer model
The encoder-decoder view of the Transformer
Understanding self-attention using scalars
Self-attention as a cooking competition
Part 2 Look ma, no hands! Deep networks in the real world
6 Teaching machines to see: Image classification with CNNs
6.1 Putting the data under the microscope: Exploratory data analysis
Understanding the classes in the data set
Computing simple statistics on the data set
6.2 Creating data pipelines using the Keras ImageDataGenerator
6.3 Inception net: Implementing a state-of-the-art image classifier
6.4 Training the model and evaluating performance
7 Teaching machines to see better: Improving CNNs and making them confess
7.1 Techniques for reducing overfitting
Image data augmentation with Keras
Dropout: Randomly switching off parts of your network to improve generalizability
Early stopping: Halting the training process if the network starts to underperform
7.2 Toward minimalism: Minception instead of Inception
Implementing the Inception-ResNet type A block
Implementing the Inception-ResNet type B block
Implementing the reduction block
7.3 If you can't beat them, join 'em: Using pretrained networks for enhancing performance
Transfer learning: Reusing existing knowledge in deep neural networks
7.4 Grad-CAM: Making CNNs confess
8 Telling things apart: Image segmentation
8.2 Getting serious: Defining a TensorFlow data pipeline
8.3 DeepLab v3: Using pretrained networks to segment images
A quick overview of the ResNet-50 model
Atrous convolution: Increasing the receptive field of convolution layers with holes
Implementing DeepLab v3 using the Keras functional API
Implementing the atrous spatial pyramid pooling module
8.4 Compiling the model: Loss functions and evaluation metrics in image segmentation
9 Natural language processing with TensorFlow: Sentiment analysis
9.1 What the text? Exploring and processing text
9.2 Getting text ready for the model
Splitting training/validation and testing data
Text to words and then to numbers with Keras
9.3 Defining an end-to-end NLP pipeline with TensorFlow
9.4 Happy reviews mean happy customers: Sentiment analysis
9.5 Training and evaluating the model
9.6 Injecting semantics with word vectors
Defining the final model with word embeddings
Training and evaluating the model
10 Natural language processing with TensorFlow: Language modeling
Downloading and playing with data
Too large vocabulary? N-grams to the rescue
10.2 GRUs in Wonderland: Generating text with deep learning
10.3 Measuring the quality of the generated text
10.4 Training and evaluating the language model
10.5 Generating new text from the language model: Greedy decoding
10.6 Beam search: Enhancing the predictive power of sequential models
Part 3 Advanced deep networks for complex problems
11 Sequence-to-sequence learning: Part 1
11.1 Understanding the machine translation data
11.2 Writing an English-German seq2seq machine translator
Defining the TextVectorization layers for the seq2seq model
Defining the decoder and the final model
11.3 Training and evaluating the model
11.4 From training to inference: Defining the inference model
12 Sequence-to-sequence learning: Part 2
12.1 Eyeballing the past: Improving our model with attention
Implementing Bahdanau attention in TensorFlow
12.2 Visualizing the attention
13.1 Transformers in more detail
Revisiting the basic components of the Transformer
13.2 Using pretrained BERT for spam classification
Classifying spam with BERT in TensorFlow
13.3 Question answering with Hugging Face’s Transformers
14 TensorBoard: Big brother of TensorFlow
14.1 Visualizing data with TensorBoard
14.2 Tracking and monitoring models with TensorBoard
14.3 Using tf.summary to write custom metrics during model training
14.4 Profiling models to detect performance bottlenecks
14.5 Visualizing word vectors with TensorBoard
15 TFX: MLOps and deploying models with TensorFlow
15.1 Writing a data pipeline with TFX
Generating basic statistics from the data
Inferring the schema from data
15.2 Training a simple regression neural network: TFX Trainer API
SignatureDefs: Defining how models are used outside TensorFlow
Training the Keras model with TFX Trainer
15.3 Setting up Docker to serve a trained model
15.4 Deploying the model and serving it through an API
Predicting with the TensorFlow Serving API
appendix A Setting up the environment
appendix C Natural language processing