Book Description

Natural Language Processing in Action is your guide to building machines that can read and interpret human language. In it, you’ll use readily available Python packages to capture the meaning in text and react accordingly. The book expands traditional NLP approaches to include neural networks, modern deep learning algorithms, and generative techniques as you tackle real-world problems like extracting dates and names, composing text, and answering free-form questions.

Table of Contents

  1. Copyright
  2. Brief Table of Contents
  3. Table of Contents
  4. Foreword
  5. Preface
  6. Acknowledgments
  7. About this Book
  8. About the Authors
  9. About the Cover Illustration
  10. Part 1. Wordy machines
    1. Chapter 1. Packets of thought (NLP overview)
      1. 1.1. Natural language vs. programming language
      2. 1.2. The magic
      3. 1.3. Practical applications
      4. 1.4. Language through a computer’s “eyes”
      5. 1.5. A brief overflight of hyperspace
      6. 1.6. Word order and grammar
      7. 1.7. A chatbot natural language pipeline
      8. 1.8. Processing in depth
      9. 1.9. Natural language IQ
      10. Summary
    2. Chapter 2. Build your vocabulary (word tokenization)
      1. 2.1. Challenges (a preview of stemming)
      2. 2.2. Building your vocabulary with a tokenizer
      3. 2.3. Sentiment
      4. Summary
    3. Chapter 3. Math with words (TF-IDF vectors)
      1. 3.1. Bag of words
      2. 3.2. Vectorizing
      3. 3.3. Zipf’s Law
      4. 3.4. Topic modeling
      5. Summary
    4. Chapter 4. Finding meaning in word counts (semantic analysis)
      1. 4.1. From word counts to topic scores
      2. 4.2. Latent semantic analysis
      3. 4.3. Singular value decomposition
      4. 4.4. Principal component analysis
      5. 4.5. Latent Dirichlet allocation (LDiA)
      6. 4.6. Distance and similarity
      7. 4.7. Steering with feedback
      8. 4.8. Topic vector power
      9. Summary
  11. Part 2. Deeper learning (neural networks)
    1. Chapter 5. Baby steps with neural networks (perceptrons and backpropagation)
      1. 5.1. Neural networks, the ingredient list
      2. Summary
    2. Chapter 6. Reasoning with word vectors (Word2vec)
      1. 6.1. Semantic queries and analogies
      2. 6.2. Word vectors
      3. Summary
    3. Chapter 7. Getting words in order with convolutional neural networks (CNNs)
      1. 7.1. Learning meaning
      2. 7.2. Toolkit
      3. 7.3. Convolutional neural nets
      4. 7.4. Narrow windows indeed
      5. Summary
    4. Chapter 8. Loopy (recurrent) neural networks (RNNs)
      1. 8.1. Remembering with recurrent networks
      2. 8.2. Putting things together
      3. 8.3. Let’s get to learning our past selves
      4. 8.4. Hyperparameters
      5. 8.5. Predicting
      6. Summary
    5. Chapter 9. Improving retention with long short-term memory networks
      1. 9.1. LSTM
      2. Summary
    6. Chapter 10. Sequence-to-sequence models and attention
      1. 10.1. Encoder-decoder architecture
      2. 10.2. Assembling a sequence-to-sequence pipeline
      3. 10.3. Training the sequence-to-sequence network
      4. 10.4. Building a chatbot using sequence-to-sequence networks
      5. 10.5. Enhancements
      6. 10.6. In the real world
      7. Summary
  12. Part 3. Getting real (real-world NLP challenges)
    1. Chapter 11. Information extraction (named entity extraction and question answering)
      1. 11.1. Named entities and relations
      2. 11.2. Regular patterns
      3. 11.3. Information worth extracting
      4. 11.4. Extracting relationships (relations)
      5. 11.5. In the real world
      6. Summary
    2. Chapter 12. Getting chatty (dialog engines)
      1. 12.1. Language skill
      2. 12.2. Pattern-matching approach
      3. 12.3. Grounding
      4. 12.4. Retrieval (search)
      5. 12.5. Generative models
      6. 12.6. Four-wheel drive
      7. 12.7. Design process
      8. 12.8. Trickery
      9. 12.9. In the real world
      10. Summary
    3. Chapter 13. Scaling up (optimization, parallelization, and batch processing)
      1. 13.1. Too much of a good thing (data)
      2. 13.2. Optimizing NLP algorithms
      3. 13.3. Constant RAM algorithms
      4. 13.4. Parallelizing your NLP computations
      5. 13.5. Reducing the memory footprint during model training
      6. 13.6. Gaining model insights with TensorBoard
      7. Summary
  13. Appendix A. Your NLP tools
    1. A.1. Anaconda3
    2. A.2. Install NLPIA
    3. A.3. IDE
    4. A.4. Ubuntu package manager
    5. A.5. Mac
    6. A.6. Windows
    7. A.7. NLPIA automagic
  14. Appendix B. Playful Python and regular expressions
    1. B.1. Working with strings
    2. B.2. Mapping in Python (dict and OrderedDict)
    3. B.3. Regular expressions
    4. B.4. Style
    5. B.5. Mastery
  15. Appendix C. Vectors and matrices (linear algebra fundamentals)
    1. C.1. Vectors
  16. Appendix D. Machine learning tools and techniques
    1. D.1. Data selection and avoiding bias
    2. D.2. How fit is fit?
    3. D.3. Knowing is half the battle
    4. D.4. Cross-fit training
    5. D.5. Holding your model back
    6. D.6. Imbalanced training sets
    7. D.7. Performance metrics
    8. D.8. Pro tips
  17. Appendix E. Setting up your AWS GPU
    1. E.1. Steps to create your AWS GPU instance
  18. Appendix F. Locality sensitive hashing
    1. F.1. High-dimensional vectors are different
    2. F.2. High-dimensional indexing
    3. F.3. “Like” prediction
  19. Resources
    1. Applications and project ideas
    2. Courses and tutorials
    3. Tools and packages
    4. Research papers and talks
    5. Competitions and awards
    6. Datasets
    7. Search engines
  20. Glossary
    1. Acronyms
    2. Terms
  21. Chatbot Recirculating (Recurrent) Pipeline
  22. Index
  23. List of Figures
  24. List of Tables
  25. List of Listings