Feature Learning

In this final chapter on feature engineering techniques, we will look at what is likely the most powerful tool at our disposal. Feature learning algorithms take in cleaned data (yes, you still need to do some work) and create brand-new features by exploiting latent structures within the data. If this sounds familiar, it is because it is the same description we used for feature transformations in the previous chapter. The difference between these two families of algorithms lies in the parametric assumptions they make when creating new features.

We will be covering the following topics:

  • Parametric assumptions of data
  • Restricted Boltzmann Machines
  • The BernoulliRBM
  • Extracting RBM components from MNIST
  • Using RBMs in a machine learning pipeline
  • Learning text features—word vectorization
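To give a feel for where we are headed, here is a minimal sketch, assuming scikit-learn is available, of the kind of workflow this chapter builds toward: fitting a BernoulliRBM to scaled pixel data so that it learns new latent features. The digits dataset is used here only as a small stand-in for MNIST, and the parameter values are illustrative choices rather than recommendations from the text.

from sklearn.datasets import load_digits
from sklearn.neural_network import BernoulliRBM
from sklearn.preprocessing import MinMaxScaler

# load a small built-in digits dataset as a stand-in for MNIST
X, _ = load_digits(return_X_y=True)

# RBMs expect inputs scaled to the [0, 1] range
X = MinMaxScaler().fit_transform(X)

# learn 32 new latent features from the 64 raw pixel features
rbm = BernoulliRBM(n_components=32, learning_rate=0.01, n_iter=20, random_state=0)
X_new = rbm.fit_transform(X)

print(X.shape, "->", X_new.shape)  # (1797, 64) -> (1797, 32)

The columns of X_new are no longer pixels at all; they are learned components, which is exactly the sense in which feature learning creates brand-new features rather than merely transforming existing ones.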