Perplexity

Perplexity, when applied to LDA, measures how well the topic-word probability distribution recovered by the model predicts a sample, for example, unseen text documents. It is based on the entropy H(p) of this distribution p and is computed with respect to the set of tokens w:

2^{H(p)} = 2^{-\sum_{w} p(w)\,\log_2 p(w)}

Lower values imply the distribution is better at predicting the sample; the minimum is 1, reached when the model predicts each token with certainty.
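As a minimal sketch of the definition above, the snippet below computes perplexity as 2 raised to the cross-entropy, given the probabilities a model assigns to each held-out token. The function name `perplexity` and the sample probabilities are illustrative, not part of any particular library; in practice, gensim's `LdaModel.log_perplexity` or scikit-learn's `LatentDirichletAllocation.perplexity` compute this for a fitted model.

```python
import math

def perplexity(token_probs):
    """Perplexity of a model over held-out tokens.

    token_probs: probability the model assigned to each observed token.
    """
    n = len(token_probs)
    # Cross-entropy in bits: average negative log2-probability per token.
    cross_entropy = -sum(math.log2(p) for p in token_probs) / n
    # Perplexity is 2 to the cross-entropy; lower is better (minimum 1).
    return 2 ** cross_entropy

# A model that assigns probability 0.25 to every token has perplexity 4:
# it is as uncertain as a uniform choice among four tokens.
print(perplexity([0.25, 0.25, 0.25, 0.25]))  # → 4.0
```

Intuitively, a perplexity of k means the model is, on average, as uncertain as if it were choosing uniformly among k equally likely tokens.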
