The Deep Learning with Keras Workshop
by Matthew Moocarme, Mahla Abdolahnejad, and Ritesh Bhagwat
Table of Contents
Preface
About the Book
Audience
About the Chapters
Conventions
Code Presentation
Setting Up Your Environment
Installing Anaconda
Installing Libraries
Running Jupyter Notebook
Accessing the Code Files
1. Introduction to Machine Learning with Keras
Introduction
Data Representation
Tables of Data
Loading Data
Exercise 1.01: Loading a Dataset from the UCI Machine Learning Repository
Data Preprocessing
Exercise 1.02: Cleaning the Data
Appropriate Representation of the Data
Exercise 1.03: Appropriate Representation of the Data
Life Cycle of Model Creation
Machine Learning Libraries
scikit-learn
Keras
Advantages of Keras
Disadvantages of Keras
More Than Building Models
Model Training
Classifiers and Regression Models
Classification Tasks
Regression Tasks
Training Datasets and Test Datasets
Model Evaluation Metrics
Exercise 1.04: Creating a Simple Model
Model Tuning
Baseline Models
Exercise 1.05: Determining a Baseline Model
Regularization
Cross-Validation
Activity 1.01: Adding Regularization to the Model
Summary
2. Machine Learning versus Deep Learning
Introduction
Advantages of ANNs over Traditional Machine Learning Algorithms
Advantages of Traditional Machine Learning Algorithms over ANNs
Hierarchical Data Representation
Linear Transformations
Scalars, Vectors, Matrices, and Tensors
Tensor Addition
Exercise 2.01: Performing Various Operations with Vectors, Matrices, and Tensors
Reshaping
Matrix Transposition
Exercise 2.02: Matrix Reshaping and Transposition
Matrix Multiplication
Exercise 2.03: Matrix Multiplication
Exercise 2.04: Tensor Multiplication
Introduction to Keras
Layer Types
Activation Functions
Model Fitting
Activity 2.01: Creating a Logistic Regression Model Using Keras
Summary
3. Deep Learning with Keras
Introduction
Building Your First Neural Network
Logistic Regression to a Deep Neural Network
Activation Functions
Forward Propagation for Making Predictions
Loss Function
Backpropagation for Computing Derivatives of Loss Function
Gradient Descent for Learning Parameters
Exercise 3.01: Neural Network Implementation with Keras
Activity 3.01: Building a Single-Layer Neural Network for Performing Binary Classification
Model Evaluation
Evaluating a Trained Model with Keras
Splitting Data into Training and Test Sets
Underfitting and Overfitting
Early Stopping
Activity 3.02: Advanced Fibrosis Diagnosis with Neural Networks
Summary
4. Evaluating Your Model with Cross-Validation Using Keras Wrappers
Introduction
Cross-Validation
Drawbacks of Splitting a Dataset Only Once
K-Fold Cross-Validation
Leave-One-Out Cross-Validation
Comparing the K-Fold and LOO Methods
Cross-Validation for Deep Learning Models
Keras Wrapper with scikit-learn
Exercise 4.01: Building the Keras Wrapper with scikit-learn for a Regression Problem
Cross-Validation with scikit-learn
Cross-Validation Iterators in scikit-learn
Exercise 4.02: Evaluating Deep Neural Networks with Cross-Validation
Activity 4.01: Model Evaluation Using Cross-Validation for an Advanced Fibrosis Diagnosis Classifier
Model Selection with Cross-Validation
Cross-Validation for Model Evaluation versus Model Selection
Exercise 4.03: Writing User-Defined Functions to Implement Deep Learning Models with Cross-Validation
Activity 4.02: Model Selection Using Cross-Validation for the Advanced Fibrosis Diagnosis Classifier
Activity 4.03: Model Selection Using Cross-Validation on a Traffic Volume Dataset
Summary
5. Improving Model Accuracy
Introduction
Regularization
The Need for Regularization
Reducing Overfitting with Regularization
L1 and L2 Regularization
L1 and L2 Regularization Formulation
L1 and L2 Regularization Implementation in Keras
Activity 5.01: Weight Regularization on an Avila Pattern Classifier
Dropout Regularization
Principles of Dropout Regularization
Reducing Overfitting with Dropout
Exercise 5.01: Dropout Implementation in Keras
Activity 5.02: Dropout Regularization on the Traffic Volume Dataset
Other Regularization Methods
Early Stopping
Exercise 5.02: Implementing Early Stopping in Keras
Data Augmentation
Adding Noise
Hyperparameter Tuning with scikit-learn
Grid Search with scikit-learn
Randomized Search with scikit-learn
Activity 5.03: Hyperparameter Tuning on the Avila Pattern Classifier
Summary
6. Model Evaluation
Introduction
Accuracy
Exercise 6.01: Calculating Null Accuracy on a Pacific Hurricanes Dataset
Advantages and Limitations of Accuracy
Imbalanced Datasets
Working with Imbalanced Datasets
Confusion Matrix
Metrics Computed from a Confusion Matrix
Exercise 6.02: Computing Accuracy and Null Accuracy with APS Failure for Scania Trucks Data
Activity 6.01: Computing the Accuracy and Null Accuracy of a Neural Network When We Change the Train/Test Split
Exercise 6.03: Deriving and Computing Metrics Based on a Confusion Matrix
Activity 6.02: Calculating the ROC Curve and AUC Score
Summary
7. Computer Vision with Convolutional Neural Networks
Introduction
Computer Vision
Convolutional Neural Networks
The Architecture of a CNN
Input Image
Convolution Layer
The Pooling Layer
Flattening
Image Augmentation
Advantages of Image Augmentation
Exercise 7.01: Building a CNN and Identifying Images of Cars and Flowers
Activity 7.01: Amending Our Model with Multiple Layers and the Use of softmax
Exercise 7.02: Amending Our Model by Reverting to the Sigmoid Activation Function
Exercise 7.03: Changing the Optimizer from Adam to SGD
Exercise 7.04: Classifying a New Image
Activity 7.02: Classifying a New Image
Summary
8. Transfer Learning and Pre-Trained Models
Introduction
Pre-Trained Sets and Transfer Learning
Feature Extraction
Fine-Tuning a Pre-Trained Network
The ImageNet Dataset
Some Pre-Trained Networks in Keras
Exercise 8.01: Identifying an Image Using the VGG16 Network
Activity 8.01: Using the VGG16 Network to Train a Deep Learning Network to Identify Images
Exercise 8.02: Classifying Images That Are Not Present in the ImageNet Database
Exercise 8.03: Fine-Tuning the VGG16 Model
Exercise 8.04: Image Classification with ResNet
Activity 8.02: Image Classification with ResNet
Summary
9. Sequential Modeling with Recurrent Neural Networks
Introduction
Sequential Memory and Sequential Modeling
Recurrent Neural Networks (RNNs)
The Vanishing Gradient Problem
A Brief Explanation of the Exploding Gradient Problem
Long Short-Term Memory (LSTM)
Exercise 9.01: Predicting the Trend of Alphabet's Stock Price Using an LSTM with 50 Units (Neurons)
Activity 9.01: Predicting the Trend of Amazon's Stock Price Using an LSTM with 50 Units (Neurons)
Exercise 9.02: Predicting the Trend of Alphabet's Stock Price Using an LSTM with 100 Units
Activity 9.02: Predicting Amazon's Stock Price with Added Regularization
Activity 9.03: Predicting the Trend of Amazon's Stock Price Using an LSTM with an Increasing Number of LSTM Neurons (100 Units)
Summary
Appendix
1. Introduction to Machine Learning with Keras
Activity 1.01: Adding Regularization to the Model
2. Machine Learning versus Deep Learning
Activity 2.01: Creating a Logistic Regression Model Using Keras
3. Deep Learning with Keras
Activity 3.01: Building a Single-Layer Neural Network for Performing Binary Classification
Activity 3.02: Advanced Fibrosis Diagnosis with Neural Networks
4. Evaluating Your Model with Cross-Validation Using Keras Wrappers
Activity 4.01: Model Evaluation Using Cross-Validation for an Advanced Fibrosis Diagnosis Classifier
Activity 4.02: Model Selection Using Cross-Validation for the Advanced Fibrosis Diagnosis Classifier
Activity 4.03: Model Selection Using Cross-Validation on a Traffic Volume Dataset
5. Improving Model Accuracy
Activity 5.01: Weight Regularization on an Avila Pattern Classifier
Activity 5.02: Dropout Regularization on the Traffic Volume Dataset
Activity 5.03: Hyperparameter Tuning on the Avila Pattern Classifier
6. Model Evaluation
Activity 6.01: Computing the Accuracy and Null Accuracy of a Neural Network When We Change the Train/Test Split
Activity 6.02: Calculating the ROC Curve and AUC Score
7. Computer Vision with Convolutional Neural Networks
Activity 7.01: Amending Our Model with Multiple Layers and the Use of softmax
Activity 7.02: Classifying a New Image
8. Transfer Learning and Pre-Trained Models
Activity 8.01: Using the VGG16 Network to Train a Deep Learning Network to Identify Images
Activity 8.02: Image Classification with ResNet
9. Sequential Modeling with Recurrent Neural Networks
Activity 9.01: Predicting the Trend of Amazon's Stock Price Using an LSTM with 50 Units (Neurons)
Activity 9.02: Predicting Amazon's Stock Price with Added Regularization
Activity 9.03: Predicting the Trend of Amazon's Stock Price Using an LSTM with an Increasing Number of LSTM Neurons (100 Units)