Deep Learning with Sequence Data and Text

In the last chapter, we covered how to handle spatial data using Convolutional Neural Networks (CNNs) and also built image classifiers. In this chapter, we will cover the following topics:

  • Different representations of text data that are useful for building deep learning models
  • Understanding recurrent neural networks (RNNs) and different implementations of RNNs, such as Long Short-Term Memory (LSTM) and Gated Recurrent Unit (GRU), which power most of the deep learning models for text and sequential data
  • Using one-dimensional convolutions for sequential data
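Before any of these models can be applied, text has to be converted into numbers. As a minimal sketch (the function names and the toy sentence here are illustrative, not from the book), one common starting point is to tokenize a sentence, build a vocabulary mapping each unique token to an index, and one-hot encode the tokens:

```python
import numpy as np

def build_vocab(tokens):
    """Map each unique token to an integer index."""
    return {tok: i for i, tok in enumerate(sorted(set(tokens)))}

def one_hot(tokens, vocab):
    """Return a (len(tokens), len(vocab)) matrix with a single 1 per row."""
    mat = np.zeros((len(tokens), len(vocab)))
    for row, tok in enumerate(tokens):
        mat[row, vocab[tok]] = 1.0
    return mat

tokens = "the cat sat on the mat".split()
vocab = build_vocab(tokens)       # 5 unique tokens
encoded = one_hot(tokens, vocab)  # shape (6, 5): 6 tokens, 5-word vocabulary
```

One-hot vectors grow with the vocabulary and carry no notion of similarity between words, which is why dense learned embeddings are usually preferred in practice; both representations are covered later in the chapter.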

Some of the applications that can be built using RNNs are:

  • Document classifiers: Identifying the sentiment of a tweet or review, classifying news articles
  • Sequence-to-sequence learning: For tasks such as language translation, for example converting English to French
  • Time-series forecasting: Predicting the sales of a store given data about previous days
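All of these applications rest on the same core idea: an RNN processes a sequence one step at a time, carrying a hidden state forward. A minimal sketch of the vanilla recurrence that LSTM and GRU refine (the sizes and random inputs below are hypothetical, chosen only to show the shapes):

```python
import numpy as np

rng = np.random.default_rng(0)
input_size, hidden_size, seq_len = 4, 3, 5

# Weights for the recurrence h_t = tanh(W_xh @ x_t + W_hh @ h_{t-1} + b)
W_xh = rng.normal(size=(hidden_size, input_size)) * 0.1
W_hh = rng.normal(size=(hidden_size, hidden_size)) * 0.1
b = np.zeros(hidden_size)

def rnn_forward(xs):
    """Run the recurrence over a sequence; return the final hidden state."""
    h = np.zeros(hidden_size)
    for x in xs:  # one update per timestep; h summarizes everything seen so far
        h = np.tanh(W_xh @ x + W_hh @ h + b)
    return h

xs = rng.normal(size=(seq_len, input_size))  # a sequence of 5 input vectors
h_final = rnn_forward(xs)                    # shape (hidden_size,) == (3,)
```

The final hidden state can then feed a classifier head (for sentiment), a decoder (for translation), or a regression layer (for forecasting); LSTM and GRU replace the single `tanh` update with gated updates that cope better with long sequences.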