General conversational models

Conversational chatbots can be broken down further into two main forms: generative and selective. The method we will look at here is the generative one. Generative models learn by being fed sequences of words and dialog as context/reply pairs. Internally, these models use RNN (LSTM) layers to learn those sequences and predict replies back to the conversant. An example of how this system works is as follows:
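As a rough illustration of what context/reply pairs look like before training, here is a minimal sketch. The dialog, tokenizer, and vocabulary below are hypothetical, not taken from the chapter's code:

```python
# Hypothetical example: turning a short dialog into (context, reply) pairs,
# the form of training data a generative conversational model consumes.
dialog = [
    "hi there",
    "hello how are you",
    "fine thanks and you",
    "doing well",
]

# Each consecutive pair of turns becomes one training example.
pairs = [(dialog[i], dialog[i + 1]) for i in range(len(dialog) - 1)]

# A toy word-level vocabulary built from the dialog itself.
vocab = {w: i for i, w in
         enumerate(sorted({w for turn in dialog for w in turn.split()}))}

def encode(sentence):
    """Map a sentence to the integer token sequence an RNN layer would consume."""
    return [vocab[w] for w in sentence.split()]

encoded_pairs = [(encode(ctx), encode(rep)) for ctx, rep in pairs]
```

In practice the vocabulary is built from a large corpus and the sequences are padded to a fixed length, but the context-in, reply-out pairing is the same.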


Example of the generative conversational model

Note that each block in the diagram represents one LSTM cell. Each cell remembers the sequence that its text was part of. What may not be clear from the preceding diagram is that both sides of the conversation text are fed into the model during training. Thus, this model is not unlike the GANs we covered in Chapter 3, GAN for Games. In the next section, we will get into the details of setting up this type of model.
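The "memory" of each cell in the diagram is its internal cell state, which is updated at every timestep. The following NumPy sketch of a single LSTM cell step is illustrative only (the gate layout and sizes are assumptions, not the chapter's code), but it shows how state flows from one cell to the next:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h_prev, c_prev, W, U, b):
    """One LSTM cell step: four gates computed from the input and previous state."""
    n = h_prev.shape[0]
    z = W @ x + U @ h_prev + b           # stacked pre-activations, shape (4n,)
    i = sigmoid(z[:n])                   # input gate: how much new info to store
    f = sigmoid(z[n:2 * n])              # forget gate: how much old memory to keep
    o = sigmoid(z[2 * n:3 * n])          # output gate: how much memory to expose
    g = np.tanh(z[3 * n:])               # candidate values for the cell state
    c = f * c_prev + i * g               # cell state: the "memory" of the sequence
    h = o * np.tanh(c)                   # hidden state passed to the next cell
    return h, c

# Toy weights for an 8-dim embedding and a 16-unit hidden state.
rng = np.random.default_rng(0)
embed, hidden = 8, 16
W = rng.normal(scale=0.1, size=(4 * hidden, embed))
U = rng.normal(scale=0.1, size=(4 * hidden, hidden))
b = np.zeros(4 * hidden)

# Unroll the cell over a toy embedded sequence of five tokens; each step
# updates (h, c), so later cells "remember" the earlier part of the sequence.
h = np.zeros(hidden)
c = np.zeros(hidden)
for x in rng.normal(size=(5, embed)):
    h, c = lstm_step(x, h, c, W, U, b)
```

Frameworks such as Keras wrap this recurrence in a single LSTM layer, which is the form we will use when setting up the model in the next section.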
