Convolutional and Recurrent Networks

The human brain is often the main inspiration and comparison we make when building AI, and it is something deep learning researchers often look to for inspiration or reassurance. By studying the brain and its parts in more detail, we often discover neural sub-processes. An example of a neural sub-process is our visual cortex, the area or region of our brain responsible for vision. We now understand that this area of our brain is wired differently and responds differently to input, which happens to be analogous to what we have found in our previous attempts at using neural networks to classify images. The human brain has many sub-processes, each with a specific mapped area in the brain (sight, hearing, smell, speech, taste, touch, and memory/temporal), but in this chapter, we will look at how we model just sight and memory by using advanced forms of deep learning called convolutional and recurrent networks. These two core sub-processes of sight and memory are used extensively by us for many tasks, including gaming, and form the focus of research for many deep learning practitioners.

Researchers often look to the brain for inspiration, but the computer models they build often don't entirely resemble their biological counterparts. However, researchers have begun to identify almost perfect analogs to neural networks inside our brains. One example of this is the ReLU activation function. It was recently found that the excitement level of neurons in our brains, when plotted, closely matches the graph of the ReLU function.
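
To make that comparison concrete, here is a minimal sketch (not part of this chapter's code samples) that defines and evaluates the ReLU function, f(x) = max(0, x), using NumPy:

    # A small illustration of the ReLU activation function: f(x) = max(0, x).
    # Negative inputs are clamped to zero; positive inputs pass through unchanged.
    import numpy as np

    def relu(x):
        """Rectified Linear Unit: 0 for negative inputs, x otherwise."""
        return np.maximum(0, x)

    # Evaluate ReLU over a small range of inputs.
    inputs = np.linspace(-5, 5, 11)
    print(relu(inputs))  # [0. 0. 0. 0. 0. 0. 1. 2. 3. 4. 5.]

Plotting this function produces the characteristic hinge shape that researchers compared against measured neuron excitement levels.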

In this chapter, we will explore, in some detail, convolutional and recurrent neural networks. We will look at how they solve the problem of replicating accurate vision and memory in deep learning. These network and layer types are relatively recent developments but have been responsible, in part, for many advances in deep learning. This chapter will cover the following topics:

  • Convolutional neural networks
  • Understanding convolution
  • Building a self-driving CNN
  • Memory and recurrent networks
  • Playing rock, paper, scissors with LSTMs

Be sure you understand the fundamentals outlined in the previous chapter reasonably well before proceeding. This includes running the code samples, which install this chapter's required dependencies.
