Is the trained word2vec really skip-gram by default? or is it CBOW? #54

@kkteru

Description
Thanks for putting this together!

The variable naming indicates that the trained word2vec model within the node2vec model is a skip-gram model. However, gensim's Word2Vec defaults to CBOW. That said, one can easily pass the required parameter to choose skip-gram instead of CBOW.

This is a tiny technical detail, but it makes the default setting of this code differ from the original node2vec, which explicitly trains a skip-gram model. Perhaps users should be made aware of that. Or did I miss something?
