There's more...

Another way to encode text in a numerical form is the Word2Vec algorithm. It learns a distributed representation of words, with the advantage that semantically similar words end up close together in the resulting vector space.

Check out this tutorial to learn more about Word2Vec and the skip-gram model: http://mccormickml.com/2016/04/19/word2vec-tutorial-the-skip-gram-model/.

Here's how we do it in Spark:

# feat is pyspark.ml.feature; sw_remover is the
# StopWordsRemover created earlier in this recipe
w2v = feat.Word2Vec(
    vectorSize=5
    , minCount=2
    , inputCol=sw_remover.getOutputCol()
    , outputCol='vector'
)

With vectorSize=5, each word is embedded as a vector of five elements, and minCount=2 ensures that only words occurring at least twice in the corpus are included in the vocabulary. Fitting the estimator and transforming the DataFrame then produces one embedding vector per document, computed by averaging the vectors of its words.
