Terms

Affordance

—A way for your user to interact with your product that you intentionally make possible. Ideally, the interaction comes naturally to the user and is easily discoverable and self-documenting.

Artificial neural network

—A computational graph for machine learning or simulation of a biological neural network (brain).

Cell

—The memory or state part of an LSTM unit that records a single scalar value and outputs it continuously.[4]

4

See the web page titled “Long short-term memory” (https://en.wikipedia.org/wiki/Long_short-term_memory).

Dark patterns

—Software patterns (usually for a user interface) that are intended to increase revenue but often backfire (“blowback”) because they manipulate your customers into using your product in ways they didn’t intend.

Feed-forward network

—A “one-way” neural network that passes all its inputs through to its outputs in a consistent direction, forming a computational directed acyclic graph (DAG) or tree.
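As a rough sketch (not code from this book), a feed-forward pass is just a loop over layers in one direction; the layer sizes and random weights here are placeholders for trained parameters:

```python
import numpy as np

def feed_forward(x, layers):
    """Pass the input through each layer in one direction: no cycles, no state."""
    for w, b in layers:
        x = np.tanh(w @ x + b)  # weighted sum plus bias, then activation
    return x

rng = np.random.default_rng(0)
layers = [(rng.normal(size=(4, 3)), np.zeros(4)),   # 3 inputs -> 4 hidden neurons
          (rng.normal(size=(2, 4)), np.zeros(2))]   # 4 hidden -> 2 outputs
y = feed_forward(np.array([1.0, -0.5, 0.2]), layers)
```

Because the data only flows forward, the same function computes the output for any input without remembering anything between calls.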

Morpheme

—A part of a token or word that contains meaning in and of itself. The morphemes that make up a token are collectively called the token’s morphology. The morphology of a token can be found using algorithms in packages like spaCy that process the token with its context (words around it).[5]

5

See the web page titled “Linguistic Features : spaCy Usage Documentation” (https://spacy.io/usage/linguistic-features#rule-based-morphology).

Net, network, or neural net

—Artificial neural network

Neuron

—A unit in a neural net whose function (such as y = tanh(w.dot(x))) takes multiple inputs and outputs a single scalar value. This value is usually the weights for that neuron (w or w_i) multiplied by all the input signals (x or x_i) and summed with a bias weight (w_0) before applying an activation function like tanh. A neuron always outputs a scalar value, which is sent to the inputs of any additional hidden or output neurons in the network. If a neuron implements a much more complicated activation function than that, like the enhancements that were made to recurrent neurons to create an LSTM, it is usually called a unit, for example, an LSTM unit.
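The function described above can be sketched in a few lines of Python; the weight and input values here are made up purely for illustration:

```python
import numpy as np

def neuron(x, w, w0):
    """One artificial neuron: weighted sum of inputs plus a bias, then tanh activation."""
    return np.tanh(w.dot(x) + w0)

x = np.array([0.5, -1.0, 2.0])   # input signals
w = np.array([0.1, 0.4, -0.2])   # one weight per input
y = neuron(x, w, w0=0.05)        # a single scalar output in (-1, 1)
```

However many inputs arrive, the output is always one scalar, which is what lets neurons be chained into layers.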

Nessvector

—An informal term for topic vectors or semantic vectors that capture concepts or qualities, such as femaleness or blueness, into the dimensions of a vector

Predicate

—In English grammar, the predicate is the main verb of a sentence that’s associated with the subject. Every complete sentence must have a predicate, just like it must also have a subject.

Skip-grams

—Pairs of tokens used as training examples for a word vector embedding, where any number of intervening words are ignored (see chapter 6).
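A minimal sketch of generating skip-gram pairs from a token sequence (the function name and `window` parameter are illustrative, not from any particular library):

```python
def skip_grams(tokens, window=2):
    """Yield (center, context) training pairs, skipping over intervening words
    up to `window` positions away on either side of the center token."""
    pairs = []
    for i, center in enumerate(tokens):
        for j in range(max(0, i - window), min(len(tokens), i + window + 1)):
            if j != i:
                pairs.append((center, tokens[j]))
    return pairs

pairs = skip_grams("the quick brown fox".split(), window=2)
```

Each pair becomes one training example for the embedding: the model learns to predict the context token from the center token (or vice versa).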

Softmax

—Normalized exponential function used to squash the real-valued vector output by a neural network so that its values all fall between 0 and 1 and sum to 1, like probabilities.
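A minimal NumPy sketch of the normalized exponential (subtracting the maximum before exponentiating is a common numerical-stability trick, not part of the definition itself):

```python
import numpy as np

def softmax(z):
    """Squash a real-valued vector into values in (0, 1) that sum to 1."""
    e = np.exp(z - z.max())  # shift by the max so exp() cannot overflow
    return e / e.sum()

scores = np.array([2.0, 1.0, 0.1])  # raw network outputs ("logits")
probs = softmax(scores)             # probability-like values, largest score wins
```

The largest input keeps the largest output, so softmax preserves the network’s ranking while making the outputs interpretable as probabilities.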

Subject

—The main noun of a sentence—every complete sentence must have a subject (and a predicate) even if the subject is implied, like in the sentence “Run!” where the implied subject is “you.”

Unit

—A neuron or small collection of neurons that performs some more complicated nonlinear function to compute the output. For example, an LSTM unit has a memory cell that records state, an input gate (neuron) that decides what value to remember, a forget gate (neuron) that decides how long to remember that value, and an output gate (neuron) that applies the activation function of the unit (usually a sigmoid or tanh()). A unit is a drop-in replacement for a neuron in a neural net that takes a vector input and outputs a scalar value; it just has more complicated behavior.
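The gates described above can be sketched as a single LSTM update step in NumPy. The parameter layout here (one row of weights per gate, randomly initialized) is an illustrative choice, not a standard API:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_step(x, h, c, W, U, b):
    """One LSTM unit update. W, U, b each hold four parameter sets:
    input gate, forget gate, output gate, and the candidate cell value."""
    i = sigmoid(W[0] @ x + U[0] @ h + b[0])  # input gate: what to remember
    f = sigmoid(W[1] @ x + U[1] @ h + b[1])  # forget gate: how long to remember it
    o = sigmoid(W[2] @ x + U[2] @ h + b[2])  # output gate: what to reveal
    g = np.tanh(W[3] @ x + U[3] @ h + b[3])  # candidate value for the cell
    c = f * c + i * g                        # update the memory cell
    h = o * np.tanh(c)                       # the unit's output
    return h, c

rng = np.random.default_rng(1)
n_in, n_h = 3, 2
W = rng.normal(size=(4, n_h, n_in))  # input-to-gate weights
U = rng.normal(size=(4, n_h, n_h))   # hidden-to-gate (recurrent) weights
b = np.zeros((4, n_h))               # gate biases
h, c = lstm_step(np.ones(n_in), np.zeros(n_h), np.zeros(n_h), W, U, b)
```

Because the output gate multiplies a tanh of the cell, each element of the output stays strictly between -1 and 1, just like a plain tanh neuron.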
