word2vec is an implementation of the Continuous Bag-of-Words (CBOW) and Skip-gram (SG) models, together with several demo scripts. Given a text corpus, the word2vec tool uses one of these two neural network architectures to learn a vector for every word in the vocabulary.
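The description above names the two training architectures. As a rough illustration of what the tool computes (this is a minimal Python sketch of Skip-gram with negative sampling, not the package's actual C implementation; all function and parameter names here are hypothetical):

```python
import numpy as np

def train_skipgram(corpus, dim=16, window=2, negatives=3,
                   epochs=50, lr=0.05, seed=0):
    """Toy Skip-gram trainer with negative sampling (illustrative only)."""
    vocab = sorted(set(corpus))
    idx = {w: i for i, w in enumerate(vocab)}
    rng = np.random.default_rng(seed)
    V = len(vocab)
    W_in = rng.normal(0, 0.1, (V, dim))   # target-word vectors
    W_out = rng.normal(0, 0.1, (V, dim))  # context-word vectors
    ids = [idx[w] for w in corpus]

    def sigmoid(x):
        return 1.0 / (1.0 + np.exp(-x))

    for _ in range(epochs):
        for pos, center in enumerate(ids):
            lo = max(0, pos - window)
            hi = min(len(ids), pos + window + 1)
            for cpos in range(lo, hi):
                if cpos == pos:
                    continue
                # One observed (positive) context word plus
                # a few uniformly sampled negative words.
                samples = [(ids[cpos], 1.0)]
                for _ in range(negatives):
                    samples.append((int(rng.integers(V)), 0.0))
                v = W_in[center]
                grad_v = np.zeros(dim)
                for out, label in samples:
                    u = W_out[out]
                    g = (sigmoid(v @ u) - label) * lr
                    grad_v += g * u
                    W_out[out] -= g * v
                W_in[center] -= grad_v
    return {w: W_in[idx[w]] for w in vocab}

if __name__ == "__main__":
    corpus = "the cat sat on the mat the cat ate the food".split()
    vectors = train_skipgram(corpus)
    print(len(vectors), vectors["cat"].shape)
```

The real tool additionally uses subsampling of frequent words, a unigram-based negative-sampling distribution, and (optionally) hierarchical softmax; those refinements are omitted here for brevity.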
@comment $NetBSD: PLIST,v 1.1 2019/12/02 02:00:41 minskim Exp $
bin/compute-accuracy
bin/distance
bin/word-analogy
bin/word2phrase
bin/word2vec