Index
A
- academic baseline datasets, Deep Learning Is Not Just for Image Classification
- accountability for ethics violations, Recourse and Accountability, Fairness, Accountability, and Transparency
- accuracy metric
- classification models, Computing Metrics Using Broadcasting
- deeper models, Going Deeper
- improving while validation loss worsens, Discriminative Learning Rates
- Mixup augmentation improving, Mixup
- more parameters, more accuracy, Deeper Architectures
- multi-label classifier, Binary Cross Entropy
- top 5 accuracy, A State-of-the-Art ResNet
- validation set, How Our Image Recognizer Works, How Our Image Recognizer Works
- validation set size, Validation Sets and Test Sets
- actionable outcomes via Drivetrain Approach, The Drivetrain Approach
- activation function
- activation regularization, Activation Regularization and Temporal Activation Regularization
- activations
- ActivationStats, A Simple Baseline
- Adam, Adam
- AdaptiveAvgPool2d, Going Back to Imagenette
- aggregation bias, Aggregation bias
- algorithm buggy, ethics of, Bugs and Recourse: Buggy Algorithm Used for Healthcare Benefits, Recourse and Accountability, Addressing different types of bias
- Ali, Muhammad, Historical bias
- Amazon
- Ameisen, Emmanuel, How to Avoid Disaster
- Angwin, Julia, Conclusion
- annealing learning rate, 1cycle Training
- Apple APIs for apps under iOS, Deploying Your App
- applications, Going Deeper into fastai’s Layered API
- (see also web applications)
- architecture of model
- AWD-LSTM architecture, Fine-Tuning the Language Model
- AWD-LSTM for NLP RNNs, Regularizing an LSTM
- computer vision, Computer Vision
- deeper architectures, Deeper Architectures
- definition, A Bit of Deep Learning Jargon, Jargon Recap
- exporting models, Using the Model for Inference
- long short-term memory, Pixels: The Foundations of Computer Vision
- natural language processing, Natural Language Processing
- picking not so important, How Our Image Recognizer Works
- ResNet, How Our Image Recognizer Works, Deeper Architectures, ResNets
- (see also ResNet architecture)
- tabular models, Tabular
- argument binding with partial function, Binary Cross Entropy
- Arkansas healthcare buggy algorithm (ethics), Bugs and Recourse: Buggy Algorithm Used for Healthcare Benefits, Recourse and Accountability, Addressing different types of bias
- arrays
- arrest record Google bias, Bias: Professor Latanya Sweeney “Arrested”
- artificial intelligence (see machine learning)
- Artificial Intelligence: A Frontier of Automation article, What Is Machine Learning?
- autocompletion in notebooks, Gathering Data
- autogenerated text (see text generation)
- autonomous vehicles, Deep Learning Is Not Just for Image Classification, The Drivetrain Approach
- AWD-LSTM architecture
- axis of tensor or matrix, Jargon Recap
- Azure Cognitive Services (Microsoft), Gathering Data
B
- backpropagation
- backpropagation through time (BPTT), Maintaining the State of an RNN
- backward hook, CAM and Hooks
- backward pass, Calculating Gradients, Jargon Recap, The Forward and Backward Passes
- bagging, Random Forests-Ensembling
- Barocas, Solon, Fairness, Accountability, and Transparency
- batch normalization, Batch Normalization
- batch operations
- batch size, SGD and Mini-Batches
- data augmentation, Data Augmentation
- GPU serving production model, Deploying Your App
- mini-batch, From Data to DataLoaders, Jargon Recap
- PyTorch single item or batch same code, Binary Cross Entropy
- resizing images, How Our Image Recognizer Works
- SGD and mini-batches, SGD and Mini-Batches
- show_batch method, Checking and Debugging a DataBlock
- texts into batches for language model, Putting Our Texts into Batches for a Language Model-Putting Our Texts into Batches for a Language Model
- BCELoss, Binary Cross Entropy
- BCEWithLogitsLoss, Binary Cross Entropy, Conclusion
- bear classifier (see image classifier models)
- beginning
- actionable outcomes via Drivetrain Approach, The Drivetrain Approach
- begin in known areas, Starting Your Project
- begin with simple baseline model, First Try: Pixel Similarity, Checking and Debugging a DataBlock
- book website, Deep Learning in Practice: That’s a Wrap!
- deep learning applicability to problem, The State of Deep Learning
- experiments lead to projects, Starting Your Project
- first model, Your First Model-Running Your First Notebook
- first notebook, Running Your First Notebook-Running Your First Notebook
- GPU servers, Getting a GPU Deep Learning Server
- (see also GPU deep learning servers)
- Jupyter Notebook, Getting a GPU Deep Learning Server
- (see also Jupyter Notebook)
- pretrained model accuracy, How Our Image Recognizer Works
- process (see process end-to-end)
- steps toward starting, Deep Learning in Practice: That’s a Wrap!
- Bengio, Yoshua, Pixels: The Foundations of Computer Vision, Defining and Initializing a Layer
- Berkhahn, Felix, Categorical Embeddings, Combining Embeddings with Other Methods
- bias
- about, Bias, Historical bias, Addressing different types of bias
- aggregation bias, Aggregation bias
- Bing Image Search example, Gathering Data
- facial recognition, Integrating Machine Learning with Product Design, Historical bias
- feedback loops, Limitations Inherent to Machine Learning, Unforeseen Consequences and Feedback Loops
- gender and, The Power of Diversity
- Google advertising, Bias: Professor Latanya Sweeney “Arrested”
- historical bias, Historical bias-Historical bias
- measurement bias, Measurement bias
- mitigation, Addressing different types of bias
- racial bias, Historical bias
- representation bias, Representation bias, Bootstrapping a Collaborative Filtering Model
- socioeconomic bias, Addressing different types of bias
- binary cross entropy loss function, Constructing a DataBlock-Binary Cross Entropy
- binary database format as data type, From Dogs and Cats to Pet Breeds
- Binder free app hosting, Deploying Your App
- Bing Image Search for gathering data, Gathering Data
- Bittman, Ladislav, Disinformation
- Black, Edwin, Why Does This Matter?
- blogging about deep learning journey
- body of a model, cnn_learner
- The Book of Why (Pearl and Mackenzie), Partial Dependence
- book updates on website, Deep Learning in Practice: That’s a Wrap!
- boosting, Boosting
- bootstrapping problem of new users, Bootstrapping a Collaborative Filtering Model
- BPTT (see backpropagation through time)
- Breiman, Leo, Random Forests
- broadcasting, Computing Metrics Using Broadcasting, Computing Metrics Using Broadcasting, Broadcasting
- Brostow, Gabriel J., Deep Learning Is Not Just for Image Classification
- buggy algorithm ethics, Bugs and Recourse: Buggy Algorithm Used for Healthcare Benefits, Recourse and Accountability, Addressing different types of bias
- Buolamwini, Joy, Historical bias
- button click event handler, Creating a Notebook App from the Model
C
- C programming language, NumPy Arrays and PyTorch Tensors
- calculus and SymPy, Gradients and the Backward Pass
- California gang database (ethics), Recourse and Accountability
- callbacks
- annealing, Scheduling the Learning Rate
- building Learner class from scratch, Callbacks
- creating, Creating a Callback
- exceptions, Callback Ordering and Exceptions
- HookCallback, CAM and Hooks
- language model, Maintaining the State of an RNN
- Learner, Mixup, A Simple Baseline
- mid-level API, Going Deeper into fastai’s Layered API
- training process, Callbacks
- Callison-Burch, Chris, Deep Learning Is Not Just for Image Classification
- CAM (see class activation map)
- capacity of a model, Deeper Architectures
- car safety for ethics inspiration, Cars: A Historical Precedent
- cardinality
- casting in PyTorch, First Try: Pixel Similarity
- categorical outcome cross-entropy loss, Checking and Debugging a DataBlock
- categorical variables
- CategoryBlock
- cats and dogs first model, Your First Model-Running Your First Notebook
- Ceglowski, Maciej, The Effectiveness of Regulation
- cells in notebooks
- census data weaponization, Analyze a Project You Are Working On
- center of person’s face in image (see key point model)
- Chaslot, Guillaume, Feedback Loops
- Chomsky, Noam, From Dogs and Cats to Pet Breeds
- Chou, Tracy, The Power of Diversity
- CIFAR10 dataset, Imagenette
- Cipolla, Roberto, Deep Learning Is Not Just for Image Classification
- class activation map (CAM)
- class methods, Language Model Using DataBlock
- classes in object-oriented programming, Collaborative Filtering from Scratch
- classification models definition, How Our Image Recognizer Works
- click event handler, Creating a Notebook App from the Model
- CNN (see convolutional neural network)
- cnn_learner
- collaborative filtering
- about, Collaborative Filtering Deep Dive
- bootstrapping problem, Bootstrapping a Collaborative Filtering Model
- building from scratch, Collaborative Filtering from Scratch-Weight Decay
- collab_learner, Using fastai.collab
- DataLoaders, Creating the DataLoaders
- dataset, A First Look at the Data
- deep learning model, Deep Learning for Collaborative Filtering
- embedding, Creating the DataLoaders-Weight Decay
- fitting model, Collaborative Filtering from Scratch
- interpreting embeddings and biases, Interpreting Embeddings and Biases
- items rather than products, Collaborative Filtering Deep Dive
- latent factors, Collaborative Filtering Deep Dive
- layers via printing model, Using fastai.collab
- learning latent factors, Learning the Latent Factors
- probabilistic matrix factorization, Bootstrapping a Collaborative Filtering Model
- skew from small number of users, Bootstrapping a Collaborative Filtering Model
- structuring model, A First Look at the Data
- tables as matrices, Creating the DataLoaders
- collab_learner, Using fastai.collab
- color image as rank-3 tensor, Color Images
- color_dim, 1cycle Training
- community support, Concluding Thoughts
- COMPAS algorithm, Historical bias, Conclusion
- computer vision models
- architecture, Computer Vision
- autonomous vehicles localizing objects, Deep Learning Is Not Just for Image Classification
- convolutional neural networks for, How Our Image Recognizer Works
- current state of, Computer vision
- dataset image representation rule, Image Recognizers Can Tackle Non-Image Tasks
- datasets for, Imagenette
- examples of, Deep Learning Is for Everyone
- (see also image classifier models)
- fastai vision library in first model, How Our Image Recognizer Works
- finding edges via convolution, The Magic of Convolutions
- image basics, Pixels: The Foundations of Computer Vision-Pixels: The Foundations of Computer Vision
- image classifier (see image classifier models)
- labels in datasets, How Our Image Recognizer Works
- non-image tasks, Image Recognizers Can Tackle Non-Image Tasks-Image Recognizers Can Tackle Non-Image Tasks, Deep Learning Is Not Just for Image Classification
- object detection, Computer vision, Historical bias
- pixels as foundation, Pixels: The Foundations of Computer Vision-Pixels: The Foundations of Computer Vision
- pretrained model weight values, How Our Image Recognizer Works
- Python Imaging Library, Pixels: The Foundations of Computer Vision
- ResNets for, Building a Modern CNN: ResNet-Skip Connections
- self-supervised learning for, NLP Deep Dive: RNNs
- concatenating categorical and continuous variables, Categorical Embeddings
- confusion matrix with image classifiers, Training Your Model, and Using It to Clean Your Data, Model Interpretation
- conspiracy theory feedback loop, Feedback Loops: YouTube’s Recommendation System, Feedback Loops, Feedback Loops
- context manager, CAM and Hooks
- continuous variables
- convolutional neural network (CNN)
- 1cycle training, 1cycle Training
- about, Unfreezing and Transfer Learning, Our First Convolutional Neural Network
- batch size increased, Increase Batch Size
- building a CNN, Creating the CNN-Batch Normalization
- building Learner class from scratch, Simple CNN
- building ResNet CNN, Building a Modern CNN: ResNet-Skip Connections
- computer vision models, How Our Image Recognizer Works
- convolution as matrix multiplication, Understanding the Convolution Equations
- convolution described, The Magic of Convolutions, Mapping a Convolutional Kernel
- definition, Jargon Recap
- equations, Understanding the Convolution Equations
- first model, How Our Image Recognizer Works
- fully convolutional networks, Going Back to Imagenette, Going Back to Imagenette
- head, cnn_learner
- kernel, The Magic of Convolutions-The Magic of Convolutions
- learning rate for, Going Back to Imagenette
- nested list comprehensions, Mapping a Convolutional Kernel
- padding, Strides and Padding
- pretrained parameter, How Our Image Recognizer Works
- PyTorch convolutions, Convolutions in PyTorch
- refactoring, Creating the CNN
- stem, A State-of-the-Art ResNet
- top 5 accuracy, A State-of-the-Art ResNet
- training, Creating the CNN
- visualizing learning, What Our Image Recognizer Learned
- Yann LeCun’s work, Pixels: The Foundations of Computer Vision
- cosine annealing, 1cycle Training
- CPU servers, Deploying Your App, Deploying Your App
- crash test dummies and gender, Cars: A Historical Precedent
- credit report system errors (ethics), Recourse and Accountability
- cross-entropy loss
- CSV data for models
- CT scan stroke analysis, Combining text and images
- cutting network, cnn_learner
- cyclical momentum, 1cycle Training
D
- data augmentation
- data leakage of illegitimate information, Data Leakage
- data project checklist
- database data for models, Deep Learning Is Not Just for Image Classification
- DataBlock
- checking, Checking and Debugging a DataBlock-Checking and Debugging a DataBlock
- DataFrame to DataLoaders, Constructing a DataBlock-Constructing a DataBlock
- debugging, Checking and Debugging a DataBlock, Checking and Debugging a DataBlock, Constructing a DataBlock, Constructing a DataBlock, Language Model Using DataBlock
- image classifier model, From Data to DataLoaders
- image regression example, Regression-Training a Model
- language model using, Language Model Using DataBlock
- mid-level API foundation, Data Munging with fastai’s Mid-Level API
- movie review classifier, Creating the Classifier DataLoaders
- presizing, From Dogs and Cats to Pet Breeds
- DataFrame
- DataLoader iterator, Constructing a DataBlock
- building Learner class from scratch, Dataset
- DataLoaders
- Dataset collection
- datasets
- academic baselines, Deep Learning Is Not Just for Image Classification
- best models for majority of, Beyond Deep Learning
- bias (see bias)
- Bing Image Search for gathering data, Gathering Data
- Blue Book for Bulldozers Kaggle competition, The Dataset
- bootstrapping problem of new users, Bootstrapping a Collaborative Filtering Model
- CIFAR10 dataset, Imagenette
- cleaning
- computer vision, Imagenette
- cut-down versions of popular, Deep Learning Is Not Just for Image Classification
- data augmentation definition, Computer vision, Data Augmentation
- data availability, Starting Your Project
- data leakage of illegitimate information, Data Leakage
- data product design integrated with ML, Integrating Machine Learning with Product Design
- date handling, Handling Dates
- definition, Running Your First Notebook
- demographics, Deep Learning Is Not Just for Image Classification
- dependent variable definition, A Bit of Deep Learning Jargon
- domain shift, How to Avoid Disaster
- download first model, Running Your First Notebook, How Our Image Recognizer Works
- ethics, Data Ethics
- examining data importance, Look at the Data, Our Language Model in PyTorch, Improving Training Stability
- facial recognition across races, Historical bias
- feature engineering, The Magic of Convolutions
- filename extraction, From Dogs and Cats to Pet Breeds-From Dogs and Cats to Pet Breeds
- freely available, Running Your First Notebook
- gathering data, Gathering Data-Gathering Data
- handwritten digits, Pixels: The Foundations of Computer Vision, Imagenette, Improving Training Stability, Going Back to Imagenette
- Human Numbers, The Data
- image representation rule of thumb, Image Recognizers Can Tackle Non-Image Tasks
- ImageNet dataset, Imagenette, Going Back to Imagenette
- IMDb Large Movie Review, Deep Learning Is Not Just for Image Classification
- independent variable definition, A Bit of Deep Learning Jargon
- Kaggle as source, Kaggle Competitions
- Kinect Head Pose, Assembling the Data
- label importance, Limitations Inherent to Machine Learning
- missing values as data leakage, Data Leakage
- MNIST handwritten digits dataset, Pixels: The Foundations of Computer Vision, Imagenette, Improving Training Stability, Going Back to Imagenette
- MovieLens, Deep Learning Is Not Just for Image Classification, A First Look at the Data
- normalization of data, Normalization
- other data types, Other data types
- out-of-domain data, Computer vision, How to Avoid Disaster, The Extrapolation Problem
- PASCAL multi-label dataset, The Data
- path to dataset, How Our Image Recognizer Works
- pet images, Running Your First Notebook, How Our Image Recognizer Works, From Dogs and Cats to Pet Breeds, Applying the Mid-Level Data API: SiamesePair
- pretrained model weight values, How Our Image Recognizer Works
- racial balance of, Historical bias
- save method, Using TabularPandas and TabularProc
- structure of, How Our Image Recognizer Works
- tabular data for models
- test set, Validation Sets and Test Sets
- training set, How Our Image Recognizer Works, How Our Image Recognizer Works, Jargon Recap
- training, validation, test, Use Judgment in Defining Test Sets-Use Judgment in Defining Test Sets
- types of data, From Dogs and Cats to Pet Breeds
- validation set, How Our Image Recognizer Works, How Our Image Recognizer Works, Jargon Recap, Validation Sets and Test Sets
- validation set defined, From Data to DataLoaders
- Datasets iterator, Constructing a DataBlock
- date handling in tabular data, Handling Dates
- De-Arteaga, Maria, Representation bias
- debugging
- decision trees
- about, Beyond Deep Learning, Decision Trees
- displaying tree, Creating the Decision Tree-Creating the Decision Tree
- libraries for, Beyond Deep Learning
- overfitting, Creating the Decision Tree
- random forests, Random Forests-Ensembling
- creating a random forest, Creating a Random Forest
- data leakage, Data Leakage
- ensembling, Ensembling
- ensembling, boosting, Boosting
- extrapolation problem, Extrapolation and Neural Networks
- feature importances, Feature Importance
- hyperparameter insensitivity, Creating a Random Forest
- model interpretation, Model Interpretation-Tree Interpreter
- out-of-bag error, Creating a Random Forest
- out-of-domain data, The Extrapolation Problem
- partial dependence, Partial Dependence
- removing low-importance variables, Removing Low-Importance Variables
- removing redundant features, Removing Redundant Features
- tree interpreter, Tree Interpreter
- tree variance for prediction confidence, Tree Variance for Prediction Confidence
- training, Decision Trees-Creating the Decision Tree
- decode method, Transforms
- deep learning
- about, Foreword, Deep Learning Is for Everyone-Deep Learning Is for Everyone
- about the importance of parameters, How Our Image Recognizer Works
- architecture not so important, How Our Image Recognizer Works
- beyond deep learning, Beyond Deep Learning
- blogging about journey (see blogging)
- capabilities and constraints, The Practice of Deep Learning
- community support, A Note About Twitter, Concluding Thoughts
- current state of, The State of Deep Learning
- dataset image representation rule, Image Recognizers Can Tackle Non-Image Tasks
- history, Pixels: The Foundations of Computer Vision
- how to learn, How to Learn Deep Learning-Your Projects and Your Mindset
- image recognition (see image classifier models)
- as machine learning, What Is Machine Learning?, Jargon Recap
- machine learning visualized, What Our Image Recognizer Learned
- manual process in parallel, How to Avoid Disaster
- model and human interaction, Combining text and images, How to Avoid Disaster
- neural networks beyond understanding, How to Avoid Disaster
- neural networks used, Deep Learning Is for Everyone, What Is Machine Learning?
- (see also neural networks)
- non-image tasks, Image Recognizers Can Tackle Non-Image Tasks-Image Recognizers Can Tackle Non-Image Tasks, Deep Learning Is Not Just for Image Classification
- overview, Jargon Recap
- predicting sales from stores, Categorical Embeddings
- process of creating application (see process end-to-end)
- risk mitigation, How to Avoid Disaster
- scikit-learn library instead, Beyond Deep Learning
- server requirements, Getting a GPU Deep Learning Server
- tabular data needing more, Categorical Embeddings
- terminology, A Bit of Deep Learning Jargon, Jargon Recap, Jargon Recap
- Twitter for help, A Note About Twitter
- deeper models having more layers, Going Deeper
- delegates, Deep Learning for Collaborative Filtering
- demographics dataset, Deep Learning Is Not Just for Image Classification
- dependent variable
- deployment
- app from notebook, Turning Your Notebook into a Real App
- Binder free app hosting, Deploying Your App
- CPU-based server, Deploying Your App
- exporting model, Using the Model for Inference
- mobile devices, Deploying Your App
- prediction inference, Using the Model for Inference
- Raspberry Pi, Deploying Your App
- risk mitigation, How to Avoid Disaster
- unforeseen challenges, Unforeseen Consequences and Feedback Loops
- web application, Turning Your Model into an Online Application-Deploying Your App
- web resource discussing, How to Avoid Disaster
- derivative of a function, Calculating Gradients
- DeVries, Terrance, Historical bias
- diabetes data aggregation bias, Aggregation bias
- digital signature, Disinformation
- dimension multiple meanings, First Try: Pixel Similarity
- DiResta, Renee, Feedback Loops
- disaster avoidance with web applications, How to Avoid Disaster
- discriminative learning rates, Unfreezing and Transfer Learning
- disinformation, Text (natural language processing), Disinformation, Disinformation and Language Models
- diversity against ethical risks, The Power of Diversity
- doc for method documentation, Deep Learning Is Not Just for Image Classification, Gathering Data
- dogs and cats first model, Your First Model-Running Your First Notebook
- domain shift, How to Avoid Disaster
- dot product of vectors, A First Look at the Data, Categorical Embeddings
- download_images, Gathering Data
- Drivetrain Approach for actionable outcomes, The Drivetrain Approach
- dropout, Dropout
- Dumoulin, Vincent, Mapping a Convolutional Kernel
- dunder init, Collaborative Filtering from Scratch
- Durbin, Meredith, Fairness, Accountability, and Transparency
E
- early stopping, Selecting the Number of Epochs
- Einstein summation, Einstein Summation
- electronic health record measurement bias, Measurement bias
- elementwise arithmetic, Elementwise Arithmetic
- embedding, Creating the DataLoaders-Weight Decay
- built from scratch, Creating Our Own Embedding Module-Deep Learning for Collaborative Filtering
- categorical variables transformed into continuous, Categorical Embeddings
- combining with other methods, Combining Embeddings with Other Methods
- continuous values, continuous input, Categorical Embeddings
- delegates, Deep Learning for Collaborative Filtering
- embedding distance, Embedding Distance, Categorical Embeddings
- embedding layer, Categorical Embeddings
- embedding matrix, Creating the DataLoaders
- entity embedding, Categorical Embeddings
- kwargs, Deep Learning for Collaborative Filtering
- tabular data with categorical columns, Using a Neural Network
- encoder, Saving and Loading Models
- end-to-end process (see process end-to-end)
- Enlitic company malignant tumor identification, Deep Learning Is for Everyone, Who We Are
- ensembles of decision trees (see decision trees)
- ensembling random forests, Ensembling
- entity embedding, Categorical Embeddings
- epochs
- error debugging, Gathering Data
- error rate, Running Your First Notebook, How Our Image Recognizer Works
- errors in data (ethics), Recourse and Accountability
- escape key for command/edit mode, Running Your First Notebook
- Estola, Evan, Feedback Loops
- ethics
- accountability, Recourse and Accountability
- addressing ethical issues, Identifying and Addressing Ethical Issues
- early stages of, Conclusion
- ethical lenses, Ethical lenses
- fairness, accountability, transparency, Fairness, Accountability, and Transparency
- policy’s role, Role of Policy-Cars: A Historical Precedent
- power of diversity, The Power of Diversity
- processes to implement, Processes to Implement
- bias
- about, Bias, Addressing different types of bias
- aggregation bias, Aggregation bias
- facial recognition, Integrating Machine Learning with Product Design, Historical bias
- geo-diversity, Historical bias
- historical bias, Historical bias-Historical bias
- measurement bias, Measurement bias
- mitigation, Addressing different types of bias
- natural language processing, Historical bias
- racial bias (see racial bias)
- representation bias, Representation bias
- socioeconomic bias, Addressing different types of bias
- buggy algorithm, Bugs and Recourse: Buggy Algorithm Used for Healthcare Benefits, Recourse and Accountability, Addressing different types of bias
- car safety inspiration, Cars: A Historical Precedent
- consideration of project as whole, Why Does This Matter?
- data ethics, Data Ethics
- description of, Data Ethics
- disinformation, Text (natural language processing), Disinformation, Disinformation and Language Models
- healthcare benefits buggy algorithm, Bugs and Recourse: Buggy Algorithm Used for Healthcare Benefits, Recourse and Accountability
- IBM and Nazi Germany, Why Does This Matter?, Analyze a Project You Are Working On
- identifying ethical issues, Identifying and Addressing Ethical Issues
- importance of, Why Does This Matter?
- medicine and text generation, Text (natural language processing)
- product design integrated with ML, Integrating Machine Learning with Product Design
- recourse, Recourse and Accountability
- Volkswagen emission test cheating, Why Does This Matter?
- YouTube recommendation feedback loops, Feedback Loops: YouTube’s Recommendation System, Feedback Loops
- Etzioni, Oren, Disinformation
- evaluating models (see testing models)
- exponential function (exp), Softmax
- export method, Using the Model for Inference
F
- F (torch.nn.functional), First Try: Pixel Similarity, Convolutions in PyTorch
- face center in image (see key point model)
- Facebook
- facial recognition bias, Integrating Machine Learning with Product Design, Historical bias
- factory methods versus customization, From Data to DataLoaders
- Fairness and Machine Learning online book (Barocas, Hardt, and Narayanan), Fairness, Accountability, and Transparency
- fairness, accountability, and transparency, Fairness, Accountability, and Transparency
- fast.ai ML courses
- fastai software library
- about, The Software: PyTorch, fastai, and Jupyter (And Why It Doesn’t Matter)
- accuracy with validation set, How Our Image Recognizer Works
- data augmentation, Assembling the Data
- data cleaning GUI, Training Your Model, and Using It to Clean Your Data
- documentation for methods, Deep Learning Is Not Just for Image Classification
- forums for community support, Concluding Thoughts
- import efficiency in notebook, How Our Image Recognizer Works
- L class returning collections, From Dogs and Cats to Pet Breeds
- labeling methods, How Our Image Recognizer Works
- layered API, Going Deeper into fastai’s Layered API
- loss function selected by, Checking and Debugging a DataBlock, Binary Cross Entropy, Training a Model, Conclusion
- metrics, How Our Image Recognizer Works
- Tabular classes, Using a Neural Network
- Transforms, How Our Image Recognizer Works
- validation set, How Our Image Recognizer Works, How Our Image Recognizer Works
- version 2 in book, The Software: PyTorch, fastai, and Jupyter (And Why It Doesn’t Matter)
- Fauqueur, Julien, Deep Learning Is Not Just for Image Classification
- feature engineering, The Magic of Convolutions
- feedback loops
- arrest rates on racial grounds, Unforeseen Consequences and Feedback Loops
- conspiracy theories fed by, Feedback Loops: YouTube’s Recommendation System, Feedback Loops, Feedback Loops
- description, Limitations Inherent to Machine Learning, Unforeseen Consequences and Feedback Loops
- Facebook and conspiracy theories, Feedback Loops
- metrics driving algorithms, Feedback Loops
- recommendation system ethics, Feedback Loops: YouTube’s Recommendation System, Feedback Loops
- skew from small number of users, Bootstrapping a Collaborative Filtering Model
- Fergus, Rob, What Our Image Recognizer Learned, Unfreezing and Transfer Learning
- Feynman, Richard, Your Projects and Your Mindset
- file upload to web widget, Creating a Notebook App from the Model
- files as data type, From Dogs and Cats to Pet Breeds
- fine-tuning models
- definition, How Our Image Recognizer Works, Jargon Recap
- fine-tune method, How Our Image Recognizer Works
- first model, Running Your First Notebook, How Our Image Recognizer Works
- image classifier model, Training Your Model, and Using It to Clean Your Data
- natural language models, NLP Deep Dive: RNNs
- non-pretrained, Deep Learning Is Not Just for Image Classification
- pretrained models, How Our Image Recognizer Works, Unfreezing and Transfer Learning
- first model, Your First Model-Running Your First Notebook
- code for, How Our Image Recognizer Works-How Our Image Recognizer Works
- convolutional neural network, How Our Image Recognizer Works
- as deep learning, What Is Machine Learning?
- error rate, Running Your First Notebook
- fine-tuning, Running Your First Notebook, How Our Image Recognizer Works
- GPU servers, Getting a GPU Deep Learning Server
- machine learning visualized, What Our Image Recognizer Learned
- as neural net, What Is a Neural Network?
- process of creating application (see process end-to-end)
- tested, Running Your First Notebook
- Transforms, How Our Image Recognizer Works
- fisheries monitoring model competition, Use Judgment in Defining Test Sets
- fitting models
- fix_html, Word Tokenization with fastai
- floating point numbers
- forgery via AI, Disinformation
- forward hook, CAM and Hooks
- forward method, Collaborative Filtering from Scratch
- forward pass, Calculating Gradients, Jargon Recap, The Forward and Backward Passes
- fraud detection, Image Recognizers Can Tackle Non-Image Tasks
- freezing pretrained models, Unfreezing and Transfer Learning
- fully convolutional networks, Going Back to Imagenette, Going Back to Imagenette
G
- Gebru, Timnit, Cars: A Historical Precedent
- gender
- generalization by models, Jargon Recap, Extrapolation and Neural Networks
- genocide and Facebook, The Effectiveness of Regulation
- geo-diversity of datasets, Historical bias
- Géron, Aurélien, Feedback Loops
- get_dummies for categorical variables, Categorical Variables
- get_preds function, Viewing Activations and Labels
- Giomo, Stefano, 1cycle Training
- GitHub Pages hosting blog, Blogging with GitHub Pages
- Glorot, Xavier, Defining and Initializing a Layer
- Google
- GPU deep learning servers
- gradient boosted decision trees (GBDTs), Boosting
- gradient boosting machines (GBMs), Boosting
- gradient descent, Summarizing Gradient Descent, Jargon Recap
- gradients
- Gramian Angular Difference Field (GADF), Image Recognizers Can Tackle Non-Image Tasks
- graphics processing unit (GPU), Getting a GPU Deep Learning Server
- Greek letters, Mixup
- Guo, Cheng, Categorical Embeddings, Combining Embeddings with Other Methods
- Guttag, John, Bias
H
- H for help, Running Your First Notebook
- half-precision floating point (fp16), Deeper Architectures
- handwritten digits dataset, Pixels: The Foundations of Computer Vision, Imagenette, Improving Training Stability, Going Back to Imagenette
- handwritten text read by models, Pixels: The Foundations of Computer Vision
- (see also numerical digit classifier)
- Hardt, Moritz, Fairness, Accountability, and Transparency
- He, Kaiming, ResNets, Defining and Initializing a Layer
- He, Tong, A State-of-the-Art ResNet
- head of model
- head pose dataset, Assembling the Data
- healthcare benefits buggy algorithm (ethics), Bugs and Recourse: Buggy Algorithm Used for Healthcare Benefits, Recourse and Accountability, Addressing different types of bias
- help by pressing H, Running Your First Notebook
- hidden state, Our First Recurrent Neural Network
- Hinton, Geoffrey, Pixels: The Foundations of Computer Vision, Dropout, RMSProp
- historical bias, Historical bias-Historical bias
- history
- Hitler, Adolf, Why Does This Matter?, Analyze a Project You Are Working On
- Hochreiter, Sepp, Pixels: The Foundations of Computer Vision
- HookCallback, CAM and Hooks
- hooks in PyTorch, CAM and Hooks-CAM and Hooks
- horizontal scaling, Deploying Your App
- Human Numbers dataset, The Data
- Hutson, Jevan, Fairness, Accountability, and Transparency
- hyperparameters, Validation Sets and Test Sets
I
- IBM and Nazi Germany, Why Does This Matter?, Analyze a Project You Are Working On
- IBM and the Holocaust book (Black), Why Does This Matter?
- identity function, Skip Connections
- identity generation by ML, Disinformation and Language Models
- identity mapping, Skip Connections
- Image class, Pixels: The Foundations of Computer Vision
- image classifier model training
- activations, Viewing Activations and Labels
- activations into predictions, Viewing Activations and Labels
- baseline simple model, Checking and Debugging a DataBlock
- baseline training run, Imagenette
- cross-entropy loss, Checking and Debugging a DataBlock-Taking the log
- discriminative learning rates, Unfreezing and Transfer Learning
- epochs, number of, Selecting the Number of Epochs
- freezing pretrained layers, Unfreezing and Transfer Learning
- Imagenette dataset, Training a State-of-the-Art Model
- images sized progressively, Progressive Resizing
- improving, Improving Our Model
- label smoothing, Label Smoothing
- learning rate finder, The Learning Rate Finder
- logarithms for loss, Taking the log
- metrics and validation loss, Selecting the Number of Epochs
- Mixup, Mixup
- script for training with and without, Mixup
- normalization of data, Normalization
- predictions, Viewing Activations and Labels
- process end-to-end, Training Your Model, and Using It to Clean Your Data
- softmax activation function, Viewing Activations and Labels
- test time augmentation, Test Time Augmentation
- testing with confusion matrix, Model Interpretation
- image classifier models
- accuracy as metric, Computing Metrics Using Broadcasting
- architecture, Deeper Architectures
- autonomous vehicles localizing objects, Deep Learning Is Not Just for Image Classification
- capabilities and constraints, The Practice of Deep Learning
- convolutional neural networks for, How Our Image Recognizer Works
- CT scan stroke analysis, Combining text and images
- current state of, Computer vision
- data augmentation, Data Augmentation
- data availability, Starting Your Project
- data gathering, Gathering Data-Gathering Data
- DataLoaders, From Data to DataLoaders-From Data to DataLoaders
- dataset, From Dogs and Cats to Pet Breeds, Imagenette
- checking, Checking and Debugging a DataBlock-Checking and Debugging a DataBlock
- debugging, Checking and Debugging a DataBlock
- examination of, From Dogs and Cats to Pet Breeds-From Dogs and Cats to Pet Breeds
- filename extraction, From Dogs and Cats to Pet Breeds-From Dogs and Cats to Pet Breeds
- image representation rule, Image Recognizers Can Tackle Non-Image Tasks
- labeling, From Dogs and Cats to Pet Breeds
- labels, How Our Image Recognizer Works
- presizing, From Dogs and Cats to Pet Breeds
- regular expressions, From Dogs and Cats to Pet Breeds
- types of data, From Dogs and Cats to Pet Breeds
- distracted driver model, Use Judgment in Defining Test Sets
- download_images, Gathering Data
- facial recognition bias, Integrating Machine Learning with Product Design, Historical bias
- first model, Your First Model-Running Your First Notebook
- Google Photos label racial bias, Historical bias
- image basics, Pixels: The Foundations of Computer Vision-Pixels: The Foundations of Computer Vision
- image size, How Our Image Recognizer Works, From Data to DataLoaders, From Dogs and Cats to Pet Breeds
- labels in datasets, How Our Image Recognizer Works
- machine learning explained, What Is Machine Learning?
- manual process in parallel, How to Avoid Disaster
- multi-label classification, Multi-Label Classification
- non-image tasks, Image Recognizers Can Tackle Non-Image Tasks-Image Recognizers Can Tackle Non-Image Tasks, Deep Learning Is Not Just for Image Classification
- numerical digit (see numerical digit classifier)
- performance of model via loss, Training Your Model, and Using It to Clean Your Data
- prediction inference, Using the Model for Inference
- presizing, From Dogs and Cats to Pet Breeds
- pretrained model weight values, How Our Image Recognizer Works
- production complexity, How to Avoid Disaster
- Python Imaging Library, Pixels: The Foundations of Computer Vision
- Siamese model image comparison, Applying the Mid-Level Data API: SiamesePair-Applying the Mid-Level Data API: SiamesePair
- softmax, Softmax
- testing
- training deep dive (see image classifier model training)
- verify_images, Gathering Data
- web application from model, Turning Your Model into an Online Application-Deploying Your App
- image regression
- ImageBlock
- ImageClassifierCleaner, Training Your Model, and Using It to Clean Your Data
- ImageNet dataset, Imagenette, Going Back to Imagenette
- images combined with text, Combining text and images
- IMDb Large Movie Review dataset, Deep Learning Is Not Just for Image Classification
- independent variable
- inference
- inheritance in object-oriented programming, Collaborative Filtering from Scratch
- init (dunder init), Collaborative Filtering from Scratch
- inputs
- interpretation via class activation map, CAM and Hooks-CAM and Hooks
- Ioffe, Sergey, Batch Normalization
- IPython widgets, Creating a Notebook App from the Model
- Isaac, William, Unforeseen Consequences and Feedback Loops
- item transforms, From Data to DataLoaders
- iterate development end to end, Starting Your Project
K
- Kaggle machine learning community
- Kalash, Mahmoud, Image Recognizers Can Tackle Non-Image Tasks
- Kao, Jeff, Disinformation and Language Models
- kernel in notebooks
- kernel of convolution, The Magic of Convolutions-The Magic of Convolutions
- Keskar, Nitish Shirish, Regularizing an LSTM
- key point model of image regression
- Keyes, Os, Fairness, Accountability, and Transparency
- The KGB and Soviet Disinformation book (Bittman), Disinformation
- Khan Academy math tutorials online, What You Need to Know, First Try: Pixel Similarity
- Kinect Head Pose dataset, Assembling the Data
- Kohavi, Ron, Deep Learning Is Not Just for Image Classification
- König, Inke, Categorical Variables
- kwargs, Deep Learning for Collaborative Filtering
L
- L class returning collections, From Dogs and Cats to Pet Breeds
- L1 norm (mean absolute difference), First Try: Pixel Similarity
- L2 norm (root mean squared error), First Try: Pixel Similarity
- L2 regularization, Weight Decay
- label smoothing, Label Smoothing
- labels
- bias in, Historical bias
- challenge of object detection, Computer vision
- checking, Checking and Debugging a DataBlock-Checking and Debugging a DataBlock
- definition, Jargon Recap
- dependent variable definition, A Bit of Deep Learning Jargon
- extraction from dataset
- incorrect affecting loss, Training Your Model, and Using It to Clean Your Data
- independent variable definition, A Bit of Deep Learning Jargon
- multi-label classification, Multi-Label Classification-Binary Cross Entropy
- need for, Limitations Inherent to Machine Learning
- lambda functions, Constructing a DataBlock
- language model
- building from scratch
- building model, Our First Language Model from Scratch-Our First Recurrent Neural Network
- building model in PyTorch, Our Language Model in PyTorch
- callback, Maintaining the State of an RNN
- data, The Data-The Data
- hidden state activations, Our First Recurrent Neural Network
- LSTM model, LSTM-Training a Language Model Using LSTMs
- LSTM model, regularizing, Regularizing an LSTM-Training a Weight-Tied Regularized LSTM
- LSTM training, Training a Weight-Tied Regularized LSTM-Training a Weight-Tied Regularized LSTM
- metric, Our Language Model in PyTorch
- multilayer RNNs, Multilayer RNNs-Exploding or Disappearing Activations
- recurrent neural network, first, Our First Recurrent Neural Network
- recurrent neural network, improved, Improving the RNN-Creating More Signal
- training, Our Language Model in PyTorch
- weight tying, Training a Weight-Tied Regularized LSTM
- DataBlock, Language Model Using DataBlock
- definition, NLP Deep Dive: RNNs
- NLP (see natural language processing)
- language translation (see translation of languages)
- latent factors, Collaborative Filtering Deep Dive, Bootstrapping a Collaborative Filtering Model
- law enforcement
- layered API, Going Deeper into fastai’s Layered API
- layers
- backpropagation for derivative, Calculating Gradients
- deeper models having more layers, Going Deeper, Deeper Architectures
- encoding of, Unfreezing and Transfer Learning, Discriminative Learning Rates
- final layer matrix, Unfreezing and Transfer Learning
- forward pass for activations, Calculating Gradients
- last layer and pretrained models, How Our Image Recognizer Works
- more linear layers, more computations, Adding a Nonlinearity, Going Deeper
- nonlinear function between linears, Adding a Nonlinearity, Going Deeper, Unfreezing and Transfer Learning
- optimization and, Going Deeper
- out-of-memory error, Deeper Architectures
- prediction viewing, Viewing Activations and Labels
- printing model to see, Using fastai.collab
- ResNet architecture, How Our Image Recognizer Works
- training, overfitting, and, How Our Image Recognizer Works
- visualizing convolutional networks, What Our Image Recognizer Learned
- Learner
- about, Creating an Optimizer, Binary Cross Entropy
- building Learner class from scratch
- callbacks, Callbacks
- DataLoader, Dataset
- Dataset, Dataset
- images, Data
- Learner class, Learner
- learning rate scheduling, Scheduling the Learning Rate
- loss function, Loss
- Module, Module and Parameter-Module and Parameter
- Parameter, Module and Parameter-Module and Parameter
- simple CNN, Simple CNN
- stochastic gradient descent, Learner
- untar_data, Data
- callbacks for custom behavior, Mixup, A Simple Baseline
- cnn_learner
- collaborative filtering system, Collaborative Filtering from Scratch
- collab_learner, Using fastai.collab
- fully convolutional network, Going Back to Imagenette
- lambda functions and exporting, Constructing a DataBlock
- learn.load, Saving and Loading Models
- learn.model, Deep Learning for Collaborative Filtering
- learn.recorder, 1cycle Training
- learn.save, Saving and Loading Models
- numerical digit classifier, Creating an Optimizer
- show_results, Training a Model
- learning rate (LR)
- LeCun, Yann, Pixels: The Foundations of Computer Vision
- Li, Hao, Skip Connections
- Liang, James, Why Does This Matter?
- linear and nonlinear layers, Adding a Nonlinearity, Jargon Recap, Unfreezing and Transfer Learning
- LinkedIn ML-generated profile, Disinformation and Language Models
- list comprehensions, First Try: Pixel Similarity
- load method, Saving and Loading Models
- load_learner, Using the Model for Inference
- Lockhart, Paul, How to Learn Deep Learning
- logarithmic scale
- long short-term memory (LSTM), Pixels: The Foundations of Computer Vision
- look-up index as one-hot-encoded vector, Creating the DataLoaders
- loss
- BCELoss, Binary Cross Entropy
- BCEWithLogitsLoss, Binary Cross Entropy, Conclusion
- bear image classifier, Training Your Model, and Using It to Clean Your Data
- binary cross entropy, Constructing a DataBlock-Binary Cross Entropy
- building Learner class from scratch, Loss
- categorical outcome cross-entropy loss, Checking and Debugging a DataBlock
- class versus plain functional form, Taking the log
- cross-entropy, Checking and Debugging a DataBlock
- (see also cross-entropy loss)
- definition, A Bit of Deep Learning Jargon, Jargon Recap
- fastai selecting function, Checking and Debugging a DataBlock, Binary Cross Entropy, Training a Model, Conclusion
- label incorrect, not model, Training Your Model, and Using It to Clean Your Data
- logarithms for, Taking the log
- metrics versus, How Our Image Recognizer Works
- MNIST loss function, The MNIST Loss Function-Sigmoid, Log Likelihood-Log Likelihood
- model defined by, Regression
- MSELoss, Training a Model
- multi-label classifier loss function, Binary Cross Entropy-Binary Cross Entropy
- numerical digit image classifier, The MNIST Loss Function-Sigmoid
- passing to learner, Training a Model
- pet breeds image classifier, Checking and Debugging a DataBlock-Taking the log
- probability as confidence level, Training Your Model, and Using It to Clean Your Data
- PyTorch functions for comparisons, First Try: Pixel Similarity
- reinforcement learning, Feedback Loops
- selecting loss function for problem, Conclusion
- validation loss improvement slowing, Discriminative Learning Rates
- lowercase rule, Word Tokenization with fastai
- ls method in Path class, Using the Model for Inference, Pixels: The Foundations of Computer Vision
- LSTM language model
- Lum, Kristian, Unforeseen Consequences and Feedback Loops
M
- Maas, Andrew, Deep Learning Is Not Just for Image Classification
- machine learning (ML)
- bagging, Random Forests-Ensembling
- bias, Bias
- capabilities and constraints, The Practice of Deep Learning
- classification model definition, How Our Image Recognizer Works
- concepts of, What Is Machine Learning?-What Is Machine Learning?
- current state of, The State of Deep Learning
- defined, What Is Machine Learning?
- explained, What Is Machine Learning?-What Is Machine Learning?, Jargon Recap
- fairness and, Fairness, Accountability, and Transparency
- feature engineering, The Magic of Convolutions
- first model as neural net, What Is a Neural Network?
- history of development, What Is Machine Learning?, Pixels: The Foundations of Computer Vision
- key techniques, Beyond Deep Learning
- key to ML via derivatives, Calculating Gradients
- limitations inherent to, Limitations Inherent to Machine Learning-Limitations Inherent to Machine Learning
- manual process in parallel, How to Avoid Disaster
- mobile landscape, Deploying Your App
- neural networks beyond understanding, How to Avoid Disaster
- product design integrated with, Integrating Machine Learning with Product Design
- regression model definition, How Our Image Recognizer Works
- risk mitigation, How to Avoid Disaster
- scikit-learn library, Beyond Deep Learning
- Twitter for help, A Note About Twitter
- visualizing, What Our Image Recognizer Learned
- weights, What Is Machine Learning?-What Is Machine Learning?
- Mackenzie, Dana, Partial Dependence
- Making Learning Whole book (Perkins), How to Learn Deep Learning
- malware classification, Image Recognizers Can Tackle Non-Image Tasks
- manual process in parallel, How to Avoid Disaster
- Mark I Perceptron, Neural Networks: A Brief History
- Markdown in notebook cells, Running Your First Notebook
- math tutorials online, What You Need to Know, First Try: Pixel Similarity
- matrix multiplication, The MNIST Loss Function
- McClelland, James, Neural Networks: A Brief History
- McCulloch, Warren, Neural Networks: A Brief History
- McKinney, Wes, The Data, Beyond Deep Learning
- mean absolute difference (L1 norm), First Try: Pixel Similarity
- mean average percent error metric, Combining Embeddings with Other Methods
- mean squared error (MSE), First Try: Pixel Similarity
- measurement bias, Measurement bias
- medicine
- Meetup recommendation algorithm, Feedback Loops
- memory usage
- Merity, Stephen, Regularizing an LSTM
- methods
- metrics
- about, Look at the Data
- definition, How Our Image Recognizer Works, Jargon Recap, Computing Metrics Using Broadcasting
- fastai library, How Our Image Recognizer Works
- feedback loops driven by, Feedback Loops
- first model declaration, How Our Image Recognizer Works
- loss versus, How Our Image Recognizer Works
- mean average percent error, Combining Embeddings with Other Methods
- numerical digit classifier, Computing Metrics Using Broadcasting-Computing Metrics Using Broadcasting
- pet breeds image classifier, Selecting the Number of Epochs
- root mean squared log error, Look at the Data, Creating the Decision Tree, Creating a Random Forest
- top 5 accuracy, A State-of-the-Art ResNet
- Microsoft
- mid-level API
- mini-batch, From Data to DataLoaders, Jargon Recap
- Minsky, Marvin, Neural Networks: A Brief History
- missing values as data leakage, Data Leakage
- mixed-precision training, Deeper Architectures
- Mixup augmentation technique, Mixup
- ML (see machine learning)
- MNIST handwritten digits dataset, Pixels: The Foundations of Computer Vision, Imagenette, Improving Training Stability, Going Back to Imagenette
- mobile device deployment of apps, Deploying Your App
- models
- accuracy (see accuracy)
- actionable outcomes via Drivetrain Approach, The Drivetrain Approach
- autonomous vehicles localizing objects, Deep Learning Is Not Just for Image Classification
- begin simply, First Try: Pixel Similarity, Checking and Debugging a DataBlock
- best methods for majority of datasets, Beyond Deep Learning
- capacity, Deeper Architectures
- classification model definition, How Our Image Recognizer Works
- data seen changing over time, How to Avoid Disaster, Feedback Loops: YouTube’s Recommendation System
- defined by variables and loss function, Regression
- definition, Jargon Recap
- encoder, Saving and Loading Models
- exporting, Using the Model for Inference
- first model (see first model)
- GPUs and production models, Deploying Your App
- head and pretrained models, How Our Image Recognizer Works
- load method, Saving and Loading Models
- model and human interaction, Combining text and images, How to Avoid Disaster
- modeling competitions, Use Judgment in Defining Test Sets
- more parameters, more accuracy, Deeper Architectures
- overfitting importance, Conclusion
- parameter importance, How Our Image Recognizer Works
- printing to see layers, Using fastai.collab
- process of creating application (see process end-to-end)
- programs contrasted, What Is Machine Learning?
- regression model definition, How Our Image Recognizer Works
- results versus performance, What Is Machine Learning?
- save method, Saving and Loading Models
- system behavior changed by, Unforeseen Consequences and Feedback Loops
- tabular data for, Deep Learning Is Not Just for Image Classification
- (see also tabular data)
- advice for modeling, Conclusion
- training, What Is Machine Learning?-What Is Machine Learning?
- web application from, Turning Your Model into an Online Application-Deploying Your App
- Module class
- modules, Creating an Optimizer
- momentum in SGD, Momentum-Momentum
- Monroe, Fred, Conclusion
- mouse movements for fraud detection, Image Recognizers Can Tackle Non-Image Tasks
- movie recommendation system
- collaborative filtering
- about, Collaborative Filtering Deep Dive
- biases, Collaborative Filtering from Scratch
- bootstrapping problem, Bootstrapping a Collaborative Filtering Model
- collab_learner, Using fastai.collab
- DataLoaders, Creating the DataLoaders
- dataset, A First Look at the Data
- deep learning model, Deep Learning for Collaborative Filtering
- embedding, Creating the DataLoaders-Weight Decay
- embedding distance, Embedding Distance
- embedding from scratch, Creating Our Own Embedding Module-Deep Learning for Collaborative Filtering
- fitting model, Collaborative Filtering from Scratch
- interpreting embeddings and biases, Interpreting Embeddings and Biases
- items rather than products, Collaborative Filtering Deep Dive
- latent factors, Collaborative Filtering Deep Dive
- layers via printing model, Using fastai.collab
- Learner from scratch, Collaborative Filtering from Scratch
- learning latent factors, Learning the Latent Factors
- look-up index as one-hot-encoded vector, Creating the DataLoaders
- overfitting, Collaborative Filtering from Scratch
- probabilistic matrix factorization, Bootstrapping a Collaborative Filtering Model
- structuring model, A First Look at the Data
- tables as matrices, Creating the DataLoaders
- weight decay, Weight Decay
- MovieLens sample model, Deep Learning Is Not Just for Image Classification
- skew from small number of users, Bootstrapping a Collaborative Filtering Model
- movie review sentiment model, Deep Learning Is Not Just for Image Classification, NLP Deep Dive: RNNs
- (see also natural language processing)
- MovieLens dataset, Deep Learning Is Not Just for Image Classification, A First Look at the Data
- MSELoss, Training a Model, Conclusion
- Mueller report, Feedback Loops, Disinformation
- Mullainathan, Sendhil, Measurement bias
- multi-label classification, Multi-Label Classification-Binary Cross Entropy
- MultiCategoryBlock, Constructing a DataBlock
- multilayer RNNs, Multilayer RNNs-Exploding or Disappearing Activations
- multilayered neural networks learned with SGD, Beyond Deep Learning
N
- Nader, Ralph, Cars: A Historical Precedent
- Narayanan, Arvind, Fairness, Accountability, and Transparency
- National Institute of Standards and Technology, Pixels: The Foundations of Computer Vision
- natural language processing (NLP)
- architecture, Natural Language Processing
- backpropagation through time for, Natural Language Processing
- bias in data, Historical bias
- Chomsky’s syntax book, From Dogs and Cats to Pet Breeds
- correct response not ensured, Text (natural language processing)
- current state of, Text (natural language processing)
- data augmentation of text data, Regularizing an LSTM
- disinformation, Text (natural language processing), Disinformation, Disinformation and Language Models
- fine-tuning
- language model from scratch
- building model, Our First Language Model from Scratch-Our First Recurrent Neural Network
- building model in PyTorch, Our Language Model in PyTorch
- callback, Maintaining the State of an RNN
- data, The Data-The Data
- hidden state activations, Our First Recurrent Neural Network
- LSTM model, LSTM-Training a Language Model Using LSTMs
- LSTM model, regularizing, Regularizing an LSTM-Training a Weight-Tied Regularized LSTM
- LSTM training, Training a Weight-Tied Regularized LSTM-Training a Weight-Tied Regularized LSTM
- metric, Our Language Model in PyTorch
- multilayer RNNs, Multilayer RNNs-Exploding or Disappearing Activations
- recurrent neural network, first, Our First Recurrent Neural Network
- recurrent neural network, improved, Improving the RNN-Creating More Signal
- training, Our Language Model in PyTorch
- weight tying, Training a Weight-Tied Regularized LSTM
- Mixup data augmentation, Mixup
- pretrained English language model, NLP Deep Dive: RNNs
- protein chains as, Other data types
- recurrent neural network, Text Preprocessing, Fine-Tuning the Language Model
- (see also recurrent neural networks)
- about process, Text Preprocessing
- accuracy, Fine-Tuning the Classifier
- classifier DataLoaders, Creating the Classifier DataLoaders
- fine-tuning classifier, Fine-Tuning the Classifier
- fine-tuning pretrained language model, Fine-Tuning the Language Model-Saving and Loading Models
- language model using DataBlock, Language Model Using DataBlock
- numericalization, Text Preprocessing, Numericalization with fastai
- pretraining, NLP Deep Dive: RNNs
- text generation, Text Generation
- texts into batches for language model, Putting Our Texts into Batches for a Language Model-Putting Our Texts into Batches for a Language Model
- training text classifier, Training a Text Classifier
- unfreezing classifier, Fine-Tuning the Classifier
- sentiment of movie review, Deep Learning Is Not Just for Image Classification, NLP Deep Dive: RNNs
- style of target corpus, NLP Deep Dive: RNNs
- text generation, Text (natural language processing)
- tokenization
- unfreezing classifiers, Fine-Tuning the Classifier
- Wikipedia for pretraining, NLP Deep Dive: RNNs
- Nazi Germany and IBM, Why Does This Matter?, Analyze a Project You Are Working On
- negative log likelihood loss (nll_loss), Taking the log
- nested list comprehensions, Mapping a Convolutional Kernel
- net neutrality disinformation, Disinformation and Language Models
- neural networks
- beyond understanding, How to Avoid Disaster
- building layer from scratch, Building a Neural Net Layer from Scratch-Einstein Summation
- backward pass, The Forward and Backward Passes
- broadcasting, Broadcasting
- broadcasting rules, Broadcasting rules
- broadcasting vector to matrix, Broadcasting a vector to a matrix-Broadcasting a vector to a matrix
- broadcasting with a scalar, Broadcasting with a scalar
- defining and initializing a layer, Defining and Initializing a Layer-Defining and Initializing a Layer
- Einstein summation, Einstein Summation
- elementwise arithmetic, Elementwise Arithmetic
- forward pass, The Forward and Backward Passes
- gradients and backward pass, Gradients and the Backward Pass-Gradients and the Backward Pass
- matrix multiplication, Matrix Multiplication from Scratch
- modeling a neuron, Modeling a Neuron
- PyTorch, Going to PyTorch-Going to PyTorch
- refactoring the model, Refactoring the Model
- Coursera class, RMSProp
- deep learning using, Deep Learning Is for Everyone, What Is Machine Learning?, Jargon Recap
- explained, What Is a Neural Network?-What Is a Neural Network?, Adding a Nonlinearity
- first model as, What Is a Neural Network?
- fundamental weights and bias equation, The MNIST Loss Function
- GPU running, Getting a GPU Deep Learning Server
- history, Neural Networks: A Brief History-Neural Networks: A Brief History, The Learning Rate Finder
- layers (see layers)
- multilayered neural networks learned with SGD, Beyond Deep Learning
- natural language processing, Text Preprocessing
- (see also natural language processing)
- refactoring, Creating the CNN
- risk mitigation, How to Avoid Disaster
- RNN definition, Our First Recurrent Neural Network
- (see also recurrent neural networks)
- tabular data, Using a Neural Network
- testing, complexity of, How to Avoid Disaster
- training via backpropagation, Pixels: The Foundations of Computer Vision
- training with large learning rates, 1cycle Training
- visualizing learning, What Our Image Recognizer Learned
- new user bootstrapping problem, Bootstrapping a Collaborative Filtering Model
- NLP (see natural language processing)
- nonlinear and linear layers, Adding a Nonlinearity, Jargon Recap, Unfreezing and Transfer Learning
- normalization of data, Normalization
- notebooks
- about, The Software: PyTorch, fastai, and Jupyter (And Why It Doesn’t Matter), Getting a GPU Deep Learning Server
- app from notebook, Turning Your Notebook into a Real App
- Binder free app hosting, Deploying Your App
- blogging with, Jupyter for Blogging
- book written in, Running Your First Notebook
- cell execution order, Deep Learning Is Not Just for Image Classification
- cells, Running Your First Notebook
- code from book, What You Need to Know, Running Your First Notebook, Deep Learning Is Not Just for Image Classification
- command mode, Running Your First Notebook
- edit mode, Running Your First Notebook
- escape key for command/edit mode, Running Your First Notebook
- features for efficiency, Gathering Data
- first cell CLICK ME, Running Your First Notebook, Deep Learning Is Not Just for Image Classification
- first notebook, Running Your First Notebook-Running Your First Notebook
- full versus stripped, Running Your First Notebook
- GPU server setup, Getting a GPU Deep Learning Server
- H for help, Running Your First Notebook
- kernel
- library efficiency, How Our Image Recognizer Works
- Markdown formatting, Running Your First Notebook
- opening, Running Your First Notebook
- out-of-memory error, Deeper Architectures
- process of creating application (see process end-to-end)
- showing source code, Word Tokenization with fastai
- utils class, Gathering Data
- web application deployment, Turning Your Model into an Online Application-Deploying Your App
- number precision and training, Deeper Architectures
- number-related datasets
- numerical digit classifier
- accuracy metric, Computing Metrics Using Broadcasting-Computing Metrics Using Broadcasting
- activations, Softmax
- color-code array or tensor, Pixels: The Foundations of Computer Vision
- comparing with ideal digit, First Try: Pixel Similarity
- convolutional neural network
- 1cycle training, 1cycle Training
- batch normalization, Batch Normalization
- batch size increased, Increase Batch Size
- building a CNN, Creating the CNN-Batch Normalization
- color images, Color Images
- convolution arithmetic, Understanding Convolution Arithmetic
- convolution described, The Magic of Convolutions
- dataset, Improving Training Stability
- equations, Understanding the Convolution Equations
- kernel, The Magic of Convolutions-The Magic of Convolutions
- kernel mapping, Mapping a Convolutional Kernel-Mapping a Convolutional Kernel
- nested list comprehensions, Mapping a Convolutional Kernel
- padding, Strides and Padding
- PyTorch convolutions, Convolutions in PyTorch
- receptive fields, Receptive Fields
- training, Creating the CNN
- training more stable, A Simple Baseline-Batch Normalization
- training on all digits, A Simple Baseline-Batch Normalization
- dataset download, Pixels: The Foundations of Computer Vision
- feature engineering, The Magic of Convolutions
- fully convolutional networks and, Going Back to Imagenette
- ideal digit creation, First Try: Pixel Similarity-First Try: Pixel Similarity
- image as array or tensor, Pixels: The Foundations of Computer Vision
- Learner creation, Creating an Optimizer
- MNIST loss function, The MNIST Loss Function-Sigmoid, Log Likelihood-Log Likelihood
- optimization step, SGD and Mini-Batches-Going Deeper
- pixel similarity, First Try: Pixel Similarity-First Try: Pixel Similarity
- stochastic gradient descent, Computing Metrics Using Broadcasting-Summarizing Gradient Descent
- terminology, Jargon Recap
- validation set, Computing Metrics Using Broadcasting
- viewing dataset images, Pixels: The Foundations of Computer Vision
- numericalization
- NumPy
- NVIDIA GPU deep learning server, Getting a GPU Deep Learning Server
O
- Obermeyer, Ziad, Measurement bias
- object detection
- object recognition
- object-oriented programming, Collaborative Filtering from Scratch
- objectives via Drivetrain Approach, The Drivetrain Approach
- occupations and gender, Historical bias, Representation bias
- OCR (see numerical digit classifier)
- one-hot encoding
- online advertisement bias, Historical bias
- online applications (see web applications)
- online resources (see web resources)
- optical character recognition (see numerical digit classifier)
- optimization
- Adam as default, Adam
- creating an optimizer, Creating an Optimizer-Going Deeper
- generic optimizer, A Generic Optimizer
- gradient descent, Summarizing Gradient Descent, Jargon Recap
- layers and, Going Deeper
- module parameters, Creating Our Own Embedding Module
- nonlinearity added, Adding a Nonlinearity
- numerical digit classifier, SGD and Mini-Batches-Going Deeper
- pet breeds image classifier, Checking and Debugging a DataBlock-Taking the log
- stochastic gradient descent, SGD and Mini-Batches-Going Deeper
- ordinal columns in tabular data, Look at the Data
- out-of-domain data, Computer vision
- out-of-memory error, Deeper Architectures
- outputs
- overfitting
- avoid only when occurring, How Our Image Recognizer Works
- definition, Jargon Recap
- importance of, How Our Image Recognizer Works, Conclusion
- layers and, How Our Image Recognizer Works
- learning rate finder, The Learning Rate Finder
- model memorizing training set, How Our Image Recognizer Works
- reducing, Conclusion
- regularizing RNNs against, Regularizing an LSTM
- retrain from scratch, Selecting the Number of Epochs
- training versus validation loss, Discriminative Learning Rates
- validation set, Validation Sets and Test Sets
- weight decay against, Weight Decay
- O’Neil, Cathy, Addressing different types of bias
P
- padding a convolution, Strides and Padding
- Pandas library
- papers (see research papers)
- Papert, Seymour, Neural Networks: A Brief History
- Parallel Distributed Processing (PDP) book (Rumelhart, McClelland, and PDP Research Group), Neural Networks: A Brief History
- parameters
- architecture requiring many, How Our Image Recognizer Works
- calling module calls forward method, Collaborative Filtering from Scratch
- deeper models and, Going Deeper
- definition, Jargon Recap
- derivative of a function, Calculating Gradients
- exporting models, Using the Model for Inference
- hyperparameters, Validation Sets and Test Sets
- importance of, How Our Image Recognizer Works
- loss function selected by fastai, Checking and Debugging a DataBlock
- machine learning concepts, What Is Machine Learning?, A Bit of Deep Learning Jargon
- more accuracy from more parameters, Deeper Architectures
- neural networks beyond understanding, How to Avoid Disaster
- Parameter class, Creating Our Own Embedding Module
- Parr, Terence, Creating the Decision Tree
- partial function to bind arguments, Binary Cross Entropy
- PASCAL multi-label dataset, The Data
- path to dataset
- PDP Research Group, Neural Networks: A Brief History
- Pearl, Judea, Partial Dependence
- pedophiles and YouTube, Feedback Loops
- Perceptrons book (Minsky and Papert), Neural Networks: A Brief History
- performance of model as loss, A Bit of Deep Learning Jargon, Jargon Recap
- Perkins, David, How to Learn Deep Learning
- person’s face center in image (see key point model)
- pet breeds image classifier (see image classifier models)
- pet images dataset, Running Your First Notebook, How Our Image Recognizer Works, From Dogs and Cats to Pet Breeds, Applying the Mid-Level Data API: SiamesePair
- pickle system for save method, Using TabularPandas and TabularProc
- PIL images, Pixels: The Foundations of Computer Vision
- Pipeline class, Pipeline
- Pitts, Walter, Neural Networks: A Brief History
- pixels
- plain text data approach, Beyond Deep Learning
- PointBlock, Assembling the Data
- policy’s role in ethics, Role of Policy-Cars: A Historical Precedent
- positive feedback loop, Limitations Inherent to Machine Learning
- precision of numbers and training, Deeper Architectures
- predictions
- activations transformed into, Viewing Activations and Labels
- bagging, Random Forests-Ensembling
- button for web application, Creating a Notebook App from the Model
- definition, A Bit of Deep Learning Jargon
- dependent variable for, Look at the Data
- hypothetical world of, Partial Dependence
- independent variable, A Bit of Deep Learning Jargon, From Data to DataLoaders
- inference instead of training, Using the Model for Inference
- inference with image classifier, Using the Model for Inference
- as machine learning limitation, Limitations Inherent to Machine Learning
- metric measuring quality, How Our Image Recognizer Works
- model changing system behavior, Unforeseen Consequences and Feedback Loops
- model overconfidence, Discriminative Learning Rates
- movie recommendation system, Deep Learning Is Not Just for Image Classification
- overfitting and, How Our Image Recognizer Works
- predictive modeling competitions, Use Judgment in Defining Test Sets
- predictive policing algorithm, Unforeseen Consequences and Feedback Loops
- random forest confidence, Tree Variance for Prediction Confidence
- sales from stores, Categorical Embeddings
- softmax sum of 1 requirement, Binary Cross Entropy
- stroke prediction, Combining text and images, Measurement bias
- viewing, Viewing Activations and Labels
- prerequisite for book, What You Need to Know
- presizing, From Dogs and Cats to Pet Breeds
- pretrained models
- accuracy from, How Our Image Recognizer Works
- convolutional neural network parameter, How Our Image Recognizer Works
- definition, How Our Image Recognizer Works, Jargon Recap
- discriminative learning rates, Unfreezing and Transfer Learning
- fine-tuning first model, Running Your First Notebook
- first model, Running Your First Notebook
- freezing, Unfreezing and Transfer Learning
- last layer and, How Our Image Recognizer Works, Unfreezing and Transfer Learning
- NLP English language, NLP Deep Dive: RNNs
- normalization of data, Normalization
- pixel count required, How Our Image Recognizer Works
- recommendation system rarity, Deep Learning Is Not Just for Image Classification
- self-supervised learning for, NLP Deep Dive: RNNs
- tabular model rarity, Deep Learning Is Not Just for Image Classification
- transfer learning, How Our Image Recognizer Works, Summarizing Gradient Descent
- Wikipedia for pretraining NLP, NLP Deep Dive: RNNs
- privacy
- probabilistic matrix factorization, Bootstrapping a Collaborative Filtering Model
- process end-to-end
- actionable outcomes via Drivetrain Approach, The Drivetrain Approach
- applicability of deep learning to problem, The State of Deep Learning
- begin in known areas, Starting Your Project, Deep Learning in Practice: That’s a Wrap!
- capabilities and constraints of deep learning, The Practice of Deep Learning
- data availability, Starting Your Project
- data biases, Gathering Data
- data cleaning, Training Your Model, and Using It to Clean Your Data
- data gathering, Gathering Data-Gathering Data
- DataLoaders, From Data to DataLoaders-From Data to DataLoaders
- deployment
- app from notebook, Turning Your Notebook into a Real App
- Binder free app hosting, Deploying Your App
- deployment file, Using the Model for Inference
- exporting model, Using the Model for Inference
- mobile devices, Deploying Your App
- prediction inference, Using the Model for Inference
- risk mitigation, How to Avoid Disaster
- unforeseen challenges, Unforeseen Consequences and Feedback Loops
- web application, Turning Your Model into an Online Application-Deploying Your App
- web application deployment, Deploying Your App-Deploying Your App
- web application disaster avoidance, How to Avoid Disaster
- web resource discussing issues, How to Avoid Disaster
- experiments lead to projects, Starting Your Project
- image size, From Data to DataLoaders, From Dogs and Cats to Pet Breeds
- iterate end to end, Starting Your Project
- model and human interaction, Combining text and images, How to Avoid Disaster
- performance of model via loss, Training Your Model, and Using It to Clean Your Data
- prototyping, Starting Your Project
- risk mitigation, How to Avoid Disaster
- testing with confusion matrix, Training Your Model, and Using It to Clean Your Data
- training the model, Training Your Model, and Using It to Clean Your Data
- web application disaster avoidance, How to Avoid Disaster
- web application from model, Turning Your Model into an Online Application-Deploying Your App
- production
- CPU servers cheaper than GPU, Deploying Your App
- data seen changing over time, How to Avoid Disaster, Feedback Loops: YouTube’s Recommendation System
- GPU for model in production, Deploying Your App
- manual process in parallel, How to Avoid Disaster
- out-of-domain data, Computer vision, How to Avoid Disaster
- product design integrated with ML, Integrating Machine Learning with Product Design
- testing, complexity of, How to Avoid Disaster
- web application from model, Turning Your Model into an Online Application-Deploying Your App
- profile identity generated by ML, Disinformation and Language Models
- programs versus models, What Is Machine Learning?
- progressive resizing, Progressive Resizing
- protein chains as natural language, Other data types
- prototyping
- publishing app on Binder, Deploying Your App
- Python
- array APIs, NumPy Arrays and PyTorch Tensors
- class methods, Language Model Using DataBlock
- context manager, CAM and Hooks
- error debugging, Gathering Data
- fastai library efficiency, How Our Image Recognizer Works
- IPython widgets, Creating a Notebook App from the Model
- Jupyter for, The Software: PyTorch, fastai, and Jupyter (And Why It Doesn’t Matter)
- lambda functions, Constructing a DataBlock
- list comprehensions, First Try: Pixel Similarity
- list type as fastai L class, From Dogs and Cats to Pet Breeds
- loop inefficiency, NumPy Arrays and PyTorch Tensors, The MNIST Loss Function
- method double underscores, Collaborative Filtering from Scratch
- nested list comprehensions, Mapping a Convolutional Kernel
- Pandas library, The Data
- partial function to bind arguments, Binary Cross Entropy
- Path class, How Our Image Recognizer Works, Using the Model for Inference
- tensor APIs, NumPy Arrays and PyTorch Tensors
- web browser functionality, Creating a Notebook App from the Model
- Python for Data Analysis book (McKinney), The Data, Beyond Deep Learning
- Python Imaging Library (PIL), Pixels: The Foundations of Computer Vision
- PyTorch
- about, Foreword, The Software: PyTorch, fastai, and Jupyter (And Why It Doesn’t Matter)
- about fastai software library, The Software: PyTorch, fastai, and Jupyter (And Why It Doesn’t Matter)
- building NLP model, Our Language Model in PyTorch
- casting, First Try: Pixel Similarity
- convolutions, Convolutions in PyTorch
- decision trees don’t use, Beyond Deep Learning
- fastai torch.nn.functional import, First Try: Pixel Similarity, Convolutions in PyTorch
- hooks, CAM and Hooks-CAM and Hooks
- loss functions for comparisons, First Try: Pixel Similarity
- most important technique, Computing Metrics Using Broadcasting
- names ending in underscore, Putting It All Together
- object-oriented programming, Collaborative Filtering from Scratch
- optimizer creation, Creating an Optimizer-Going Deeper
- SGD class, The Training Process-A Generic Optimizer
- single item or batch same code, Binary Cross Entropy
- tensors
R
- racial bias
- arrest rates, Unforeseen Consequences and Feedback Loops
- datasets for training models, Historical bias
- Facebook advertising, Historical bias
- facial recognition, Integrating Machine Learning with Product Design, Historical bias
- Google advertising, Bias: Professor Latanya Sweeney “Arrested”
- Google Photos label, Historical bias
- historical, Historical bias
- power of diversity, The Power of Diversity
- sentencing and bail algorithm, Historical bias
- radiologist-model interaction, Combining text and images
- Raji, Deb, Gathering Data
- random forests, Random Forests-Ensembling
- random seed for validation set selection, How Our Image Recognizer Works, From Data to DataLoaders
- RandomResizedCrop
- rank correlation, Removing Redundant Features
- rank of tensor
- recommendation systems
- about, Limitations Inherent to Machine Learning
- actionable outcomes via Drivetrain Approach, The Drivetrain Approach
- Amazon, Recommendation systems
- collaborative filtering (see collaborative filtering)
- conspiracy theory feedback loops, Feedback Loops: YouTube’s Recommendation System, Feedback Loops
- current state of, Recommendation systems
- feedback loop ethics, Feedback Loops: YouTube’s Recommendation System, Feedback Loops
- Google Play concatenation approach, Categorical Embeddings
- Meetup and gender, Feedback Loops
- movies based on viewing habits, Deep Learning Is Not Just for Image Classification
- pretrained model rarity, Deep Learning Is Not Just for Image Classification
- skew from small number of users, Bootstrapping a Collaborative Filtering Model
- as tabular data, Recommendation systems
- YouTube feedback loop ethics, Feedback Loops: YouTube’s Recommendation System, Feedback Loops
- recourse for ethics violations, Recourse and Accountability
- rectified linear unit (ReLU), Adding a Nonlinearity, Jargon Recap
- recurrent neural networks (RNNs)
- refactoring parts of neural networks, Creating the CNN, Refactoring the Model
- regression models definition, How Our Image Recognizer Works
- regular expressions (regex), From Dogs and Cats to Pet Breeds
- regulating ethics, The Effectiveness of Regulation
- reinforcement learning, Feedback Loops
- replace_all_caps, Word Tokenization with fastai
- replace_maj, Word Tokenization with fastai
- replace_rep, Word Tokenization with fastai
- replace_wrep, Word Tokenization with fastai
- representation bias, Representation bias, Bootstrapping a Collaborative Filtering Model
- research papers
- about, Mixup
- advertising bias, Bias: Professor Latanya Sweeney “Arrested”
- bagging predictors, Random Forests
- batch normalization, Batch Normalization
- bias in machine learning, Bias
- class activation map, CAM and Hooks
- convolution arithmetic, Mapping a Convolutional Kernel
- cyclical momentum, 1cycle Training
- data leakage, Data Leakage
- deep residual learning, ResNets
- demographics dataset, Deep Learning Is Not Just for Image Classification
- ethical lens versus ethical intuitions, Fairness, Accountability, and Transparency
- geo-diversity of datasets, Historical bias
- gradient class activation map, Gradient CAM
- label smoothing, Label Smoothing
- malware classification, Image Recognizers Can Tackle Non-Image Tasks
- measurement bias, Measurement bias
- Mixup, Mixup
- model bias, Gathering Data
- object recognition, Historical bias
- predicting sales from stores, Categorical Embeddings
- predictive policing, Unforeseen Consequences and Feedback Loops
- rectifier deep dive, Defining and Initializing a Layer
- regularizing LSTM language models, Regularizing an LSTM
- representation bias, Representation bias
- ResNet improved, A State-of-the-Art ResNet
- sentiment analysis, Deep Learning Is Not Just for Image Classification
- skip connections smoothing loss, Skip Connections
- training a segmentation model, Deep Learning Is Not Just for Image Classification
- training deep feedforward neural networks, Defining and Initializing a Layer
- training with large learning rates, 1cycle Training
- visualizing neural network weights, What Our Image Recognizer Learned, Unfreezing and Transfer Learning
- Resize, From Data to DataLoaders
- ResNet architecture
- about, ResNets, Skip Connections
- building ResNet CNN, Building a Modern CNN: ResNet-Skip Connections
- building state-of-the-art ResNet, A State-of-the-Art ResNet-Bottleneck Layers
- ease of learning, Skip Connections
- first model, How Our Image Recognizer Works
- fully convolutional networks, Going Back to Imagenette
- image classifier, Deeper Architectures
- Imagenette dataset, Going Back to Imagenette
- layer quantity variants, Deeper Architectures
- ResNet-18, -34, -50 versions, Deeper Architectures, A State-of-the-Art ResNet
- skip connections, Skip Connections-Skip Connections
- results (see predictions)
- rights and policy, Rights and Policy
- RMSProp, RMSProp
- rm_useless_spaces, Word Tokenization with fastai
- RNN (see recurrent neural networks)
- root mean squared error (RMSE or L2 norm), First Try: Pixel Similarity
- root mean squared log error as metric, Look at the Data, Creating the Decision Tree, Creating a Random Forest
- Rosenblatt, Frank, Neural Networks: A Brief History
- Rumelhart, David, Neural Networks: A Brief History
- Russia and 2016 election, Disinformation
- Russia Today and Mueller report, Feedback Loops
S
- Samuel, Arthur, What Is Machine Learning?
- save method, Using TabularPandas and TabularProc, Saving and Loading Models
- Schmidhuber, Jürgen, Pixels: The Foundations of Computer Vision
- scikit-learn library, Beyond Deep Learning
- search_images_bing, Gathering Data
- seed for validation set selection, From Data to DataLoaders
- segmentation, Computer vision
- self-driving cars, Deep Learning Is Not Just for Image Classification, The Drivetrain Approach
- self-supervised learning
- Sequential class, Adding a Nonlinearity, Simple CNN
- server for running code, Getting a GPU Deep Learning Server
- setup
- SGD (see stochastic gradient descent)
- SGD class, Creating an Optimizer, The Training Process-A Generic Optimizer
- (see also stochastic gradient descent)
- Shankar, Shreya, Historical bias
- show_batch method, Checking and Debugging a DataBlock
- show_image function, First Try: Pixel Similarity
- Siamese model image comparison, Applying the Mid-Level Data API: SiamesePair-Applying the Mid-Level Data API: SiamesePair
- sigmoid function
- sigmoid_range, Training a Model, Collaborative Filtering from Scratch
- signature of function
- skip connections, Skip Connections-Skip Connections
- sklearn
- Smith, Leslie, The Learning Rate Finder, 1cycle Training
- Socher, Richard, Regularizing an LSTM
- socioeconomic bias, Addressing different types of bias
- softmax activation function, Viewing Activations and Labels, Unfreezing and Transfer Learning
- sound analyzed as spectrogram, Image Recognizers Can Tackle Non-Image Tasks, Computer vision, Other data types
- source code of function displayed, Word Tokenization with fastai
- special tokens, Word Tokenization with fastai
- spec_add_spaces, Word Tokenization with fastai
- Splunk.com fraud detection, Image Recognizers Can Tackle Non-Image Tasks
- spreadsheet data for models, Deep Learning Is Not Just for Image Classification
- starting (see beginning)
- stem in convolutional neural network, A State-of-the-Art ResNet, cnn_learner
- stochastic gradient descent (SGD)
- about, What Is a Neural Network?, Computing Metrics Using Broadcasting-Stochastic Gradient Descent, The Training Process
- backward, Calculating Gradients
- building Learner class from scratch, Learner
- calculating gradients, Calculating Gradients-Calculating Gradients
- cyclical momentum, 1cycle Training
- example end-to-end, An End-to-End SGD Example-Step 7: Stop
- mini-batches, SGD and Mini-Batches
- momentum, Momentum-Momentum
- multilayered neural networks learned with, Beyond Deep Learning
- optimization of numerical digit classifier, SGD and Mini-Batches-Going Deeper
- SGD class, Creating an Optimizer, The Training Process-A Generic Optimizer
- stepping with learning rate, Stepping with a Learning Rate-Stepping with a Learning Rate
- summarizing, Summarizing Gradient Descent
- store sales predictions
- stride-1 convolutions, Strides and Padding
- stride-2 convolutions, Strides and Padding
- stroke prediction, Combining text and images, Measurement bias
- subword tokenization, Subword Tokenization
- summary method
- Suresh, Harini, Bias
- Sweeney, Latanya, Bias: Professor Latanya Sweeney “Arrested”
- symbolic computation library, Gradients and the Backward Pass
- SymPy library and calculus, Gradients and the Backward Pass
- Syntactic Structures book (Chomsky), From Dogs and Cats to Pet Breeds
- Szegedy, Christian, Label Smoothing, Batch Normalization
T
- Tabular classes, Using a Neural Network
- tabular data for models
- about, Deep Learning Is Not Just for Image Classification, Tabular Modeling Deep Dive
- advice for modeling, Conclusion
- architecture, Tabular
- categorical embeddings, Categorical Embeddings
- current state of, Tabular data
- as data type, From Dogs and Cats to Pet Breeds
- dataset for deep dive, The Dataset
- decision trees as first approach, Beyond Deep Learning
- deep learning not best starting point, Categorical Embeddings
- entity embedding, Categorical Embeddings
- model interpretation, Model Interpretation
- multi-label classification, The Data-The Data
- neural network model, Using a Neural Network
- ordinal columns, Look at the Data
- predicting sales from stores, Categorical Embeddings
- pretrained model rarity, Deep Learning Is Not Just for Image Classification
- recommendation systems as, Recommendation systems
- TabularPandas class, Using TabularPandas and TabularProc
- TabularProc, Using TabularPandas and TabularProc
- tech industry and gender, The Power of Diversity
- temporal activation regularization, Activation Regularization and Temporal Activation Regularization
- tensor core support by GPUs, Deeper Architectures
- tensors
- about, NumPy Arrays and PyTorch Tensors
- all images in directory, First Try: Pixel Similarity
- APIs, NumPy Arrays and PyTorch Tensors
- broadcasting, Computing Metrics Using Broadcasting
- color image as rank-3 tensor, Color Images
- column selected, NumPy Arrays and PyTorch Tensors
- creating a tensor, NumPy Arrays and PyTorch Tensors
- definition, Jargon Recap
- displaying as images, First Try: Pixel Similarity
- elementwise arithmetic, Elementwise Arithmetic
- image section, Pixels: The Foundations of Computer Vision
- image sizes same, From Data to DataLoaders, From Dogs and Cats to Pet Breeds
- matrix multiplication, The MNIST Loss Function
- operators, NumPy Arrays and PyTorch Tensors
- rank, First Try: Pixel Similarity
- row selected, NumPy Arrays and PyTorch Tensors
- shape, First Try: Pixel Similarity
- slicing row or column, NumPy Arrays and PyTorch Tensors
- type, NumPy Arrays and PyTorch Tensors
- terminology for deep learning, A Bit of Deep Learning Jargon, Jargon Recap
- test time augmentation (TTA), Test Time Augmentation
- testing models
- text combined with images, Combining text and images
- text data approach, Beyond Deep Learning
- (see also natural language processing)
- text generation
- TextBlock, Language Model Using DataBlock
- TextDataLoaders.from_folder, Going Deeper into fastai’s Layered API
- TfmdLists, TfmdLists and Datasets: Transformed Collections-TfmdLists
- Thomas, Rachel, Get Writing!, Analyze a Project You Are Working On
- time series analysis
- tokenization
- approaches to, Tokenization
- definition, Text Preprocessing
- fastai interface, Word Tokenization with fastai
- most common token prediction, Our Language Model in PyTorch
- numericalization, Numericalization with fastai
- showing rules used, Word Tokenization with fastai
- special tokens, Word Tokenization with fastai
- subword tokenization, Subword Tokenization
- texts into batches for language model, Putting Our Texts into Batches for a Language Model-Putting Our Texts into Batches for a Language Model
- token definition, Tokenization
- Transform class, Transforms
- unknown word token, Numericalization with fastai
- word tokenization, Word Tokenization with fastai, Subword Tokenization
- top 5 accuracy, A State-of-the-Art ResNet
- torch.nn.functional, First Try: Pixel Similarity, Convolutions in PyTorch
- training
- 1cycle training, 1cycle Training
- backpropagation for neural networks, Pixels: The Foundations of Computer Vision
- bagging, Random Forests-Ensembling
- baseline, First Try: Pixel Similarity, Checking and Debugging a DataBlock, Establishing a Baseline-Establishing a Baseline
- biases, Gathering Data
- black-and-white or hand-drawn images, Computer vision
- cyclical momentum, 1cycle Training
- data cleanup before versus after, Training Your Model, and Using It to Clean Your Data
- decision trees, Decision Trees-Creating the Decision Tree
- deeper models, Going Deeper, Deeper Architectures
- definition, Jargon Recap
- early stopping, Selecting the Number of Epochs
- epochs, number of, How Our Image Recognizer Works
- ethics importance, Why Does This Matter?
- experiments lead to projects, Starting Your Project
- fine-tuning definition, How Our Image Recognizer Works
- first model, Running Your First Notebook
- head of model, How Our Image Recognizer Works
- image classifier models (see image classifier model training)
- image differences during, From Data to DataLoaders
- labels for examples, Limitations Inherent to Machine Learning
- layers and, How Our Image Recognizer Works, Unfreezing and Transfer Learning
- learning rate, Stepping with a Learning Rate-Stepping with a Learning Rate
- machine learning concepts, What Is Machine Learning?-What Is Machine Learning?
- mixed-precision training, Deeper Architectures
- model memorizing data, How Our Image Recognizer Works, Validation Sets and Test Sets
- neural networks and learning rate, 1cycle Training
- numerical digit classifier (see numerical digit classifier)
- out-of-domain data, Computer vision
- overfitting, How Our Image Recognizer Works
- prediction model inference, Using the Model for Inference
- pretrained models (see pretrained models)
- process
- about, The Training Process
- Adam, Adam
- baseline established, Establishing a Baseline-Establishing a Baseline
- callbacks, Callbacks
- callbacks, creating, Creating a Callback
- callbacks, exceptions, Callback Ordering and Exceptions
- decoupled weight decay, Decoupled Weight Decay
- momentum, Momentum-Momentum
- optimizer generic, A Generic Optimizer
- RMSProp, RMSProp
- SGD class, The Training Process-A Generic Optimizer
- random variations, Running Your First Notebook
- recurrent neural networks, Regularizing an LSTM
- self-supervised learning, NLP Deep Dive: RNNs
- (see also self-supervised learning)
- stochastic gradient descent, Computing Metrics Using Broadcasting-Stochastic Gradient Descent
- tensor core support for speed, Deeper Architectures
- text classifier, Training a Text Classifier
- time spent, Running Your First Notebook
- trained model is program, What Is Machine Learning?
- training set, How Our Image Recognizer Works, Jargon Recap
- building, Use Judgment in Defining Test Sets-Use Judgment in Defining Test Sets
- classes for representing, accessing, Constructing a DataBlock
- cleaning GUI, Training Your Model, and Using It to Clean Your Data
- DataLoaders, From Data to DataLoaders-From Data to DataLoaders
- DataLoaders customization, From Data to DataLoaders
- presizing, From Dogs and Cats to Pet Breeds
- production complexity and, How to Avoid Disaster
- racial balance of, Historical bias
- time series, Using TabularPandas and TabularProc
- transfer learning
- about, Unfreezing and Transfer Learning
- cutting network, cnn_learner
- definition, How Our Image Recognizer Works
- final layer, Unfreezing and Transfer Learning
- fine-tuning as, How Our Image Recognizer Works, Unfreezing and Transfer Learning
- image classifier, Unfreezing and Transfer Learning
- natural language processing, NLP Deep Dive: RNNs
- progressive resizing hurting performance, Progressive Resizing
- self-supervised learning, NLP Deep Dive: RNNs
- weights, Summarizing Gradient Descent
- Transforms
- collections, TfmdLists and Datasets: Transformed Collections
- Datasets, Datasets
- definition, How Our Image Recognizer Works
- image cropping, From Data to DataLoaders
- image size, How Our Image Recognizer Works, From Data to DataLoaders, From Dogs and Cats to Pet Breeds
- item transforms, From Data to DataLoaders
- Pipeline class, Pipeline
- presizing, From Dogs and Cats to Pet Breeds
- Siamese model image comparison, Applying the Mid-Level Data API: SiamesePair-Applying the Mid-Level Data API: SiamesePair
- TabularProc, Using TabularPandas and TabularProc
- TfmdLists, TfmdLists and Datasets: Transformed Collections-TfmdLists
- Transform class, Transforms
- writing your own, Writing Your Own Transform, TfmdLists
- translation of languages
- tumor identification, Deep Learning Is for Everyone, Who We Are
- Turing Award, Pixels: The Foundations of Computer Vision
- tutorials
- Twitter for deep learning help, A Note About Twitter
V
- validation set
- building, Use Judgment in Defining Test Sets-Use Judgment in Defining Test Sets
- classes for representing and accessing, Constructing a DataBlock
- cleaning GUI, Training Your Model, and Using It to Clean Your Data
- DataLoaders, From Data to DataLoaders-From Data to DataLoaders
- definition, Jargon Recap, Validation Sets and Test Sets
- error rate, How Our Image Recognizer Works
- export method, Using the Model for Inference
- first model, How Our Image Recognizer Works
- hyperparameter picked by, Binary Cross Entropy
- NLP most common token, Our Language Model in PyTorch
- numeric digit classifier, Computing Metrics Using Broadcasting
- out-of-domain data, The Extrapolation Problem
- overfitting, Validation Sets and Test Sets
- random seed, How Our Image Recognizer Works
- size of, Validation Sets and Test Sets
- splitting from training set, From Data to DataLoaders
- test time augmentation, Test Time Augmentation
- testing with confusion matrix, Training Your Model, and Using It to Clean Your Data
- time series, Using TabularPandas and TabularProc
- variables
- vector dot product, A First Look at the Data, Categorical Embeddings
- verify_images, Gathering Data
- Visin, Francesco, Mapping a Convolutional Kernel
- vocabulary (see terminology)
- Voilà, Creating a Notebook App from the Model
- Volkswagen emission test cheating (ethics), Why Does This Matter?
W
- warmup learning rate, 1cycle Training
- Watson, Thomas, Why Does This Matter?
- Weapons of Math Destruction book (O’Neil), Addressing different types of bias
- web applications
- web resources
- actionable outcomes via Drivetrain Approach, The Drivetrain Approach
- bias in machine learning, Bias
- Binder free app hosting, Deploying Your App
- blogging article, Get Writing!
- book updates, Deep Learning in Practice: That’s a Wrap!
- code from book, What You Need to Know, Running Your First Notebook, Deep Learning Is Not Just for Image Classification
- datasets and other Kaggle resources, Kaggle Competitions
- decision tree viewer, Creating the Decision Tree
- deployment issue discussion, How to Avoid Disaster
- documentation for methods, Deep Learning Is Not Just for Image Classification
- ethics description, Data Ethics
- ethics toolkits, Processes to Implement
- Fairness and Machine Learning book, Fairness, Accountability, and Transparency
- fast.ai free online course, Concluding Thoughts
- fast.ai website, What You Need to Know
- fastai forums, Concluding Thoughts
- fraud detection at Splunk.com, Image Recognizers Can Tackle Non-Image Tasks
- GitHub Pages hosting blog, Blogging with GitHub Pages
- GPU servers, Getting a GPU Deep Learning Server
- Jupyter, The Software: PyTorch, fastai, and Jupyter (And Why It Doesn’t Matter)
- Kaggle machine learning community, Who We Are
- malware classification, Image Recognizers Can Tackle Non-Image Tasks
- math tutorials, What You Need to Know, First Try: Pixel Similarity
- mathematical symbols, Mixup
- predicting sales from stores paper, Categorical Embeddings
- predictive policing paper, Unforeseen Consequences and Feedback Loops
- Python debugger, Gathering Data
- recommended web app hosts, Deploying Your App
- regular expression tutorials, From Dogs and Cats to Pet Breeds
- segmentation training, Deep Learning Is Not Just for Image Classification
- sklearn docs, Creating a Random Forest
- sound analyzed as spectrogram, Image Recognizers Can Tackle Non-Image Tasks
- SymPy library, Gradients and the Backward Pass
- tutorials for each book chapter, How Our Image Recognizer Works
- visualizing convolutional networks, What Our Image Recognizer Learned
- weights
- machine learning, What Is Machine Learning?-What Is Machine Learning?
- neural networks, What Is a Neural Network?
- as parameters, What Is Machine Learning?, A Bit of Deep Learning Jargon
- pretrained parameter, How Our Image Recognizer Works
- random in training from scratch, Summarizing Gradient Descent
- stochastic gradient descent, Computing Metrics Using Broadcasting-Stochastic Gradient Descent
- transfer learning
- visualizing learning, What Our Image Recognizer Learned
- weight decay, Weight Decay
- weight tying, Training a Weight-Tied Regularized LSTM
- Werbos, Paul, Pixels: The Foundations of Computer Vision
- Wikipedia for pretraining NLP, NLP Deep Dive: RNNs
- word tokenization, Word Tokenization with fastai, Subword Tokenization
- Wright, Marvin, Categorical Variables