word2vec is an implementation of the Continuous Bag-of-Words (CBOW) and Skip-gram (SG) models, along with several demo scripts. Given a text corpus, the word2vec tool learns a vector for every word in the vocabulary using either of these neural network architectures.
$NetBSD: patch-word2vec.c,v 1.1 2019/12/02 02:00:41 minskim Exp $

Portability fix.
https://github.com/tmikolov/word2vec/pull/40

--- word2vec.c.orig	2017-07-16 22:46:08.000000000 +0000
+++ word2vec.c
@@ -71,7 +71,7 @@ void InitUnigramTable() {
 void ReadWord(char *word, FILE *fin, char *eof) {
   int a = 0, ch;
   while (1) {
-    ch = fgetc_unlocked(fin);
+    ch = getc_unlocked(fin);
     if (ch == EOF) {
       *eof = 1;
       break;