Hey @Madhu009
Thanks for the reply.
I understand embedding vectors; I was just wondering if there is a quick workaround in the code so that I can plug in a word and get back the top 10 similar terms.
I tried using TensorBoard as well, but that wasn't successful either.
Currently researching other methods too.
First of all, thank you for the tutorial.
I wanted to ask if it is possible to use the final embeddings to test out a word and return the top 10 most similar terms.
e.g.
Top 10 similar words given an input word:
word="external"
word_vec = final_embeddings[dictionary[word]]
sim = np.dot(word_vec,-final_embeddings.T).argsort()[0:8]
for idx in range(8):
print (reverse_dictionary[sim[idx]])
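For reference, here is a slightly more general sketch of the same idea, wrapped in a helper. It assumes the usual tutorial variables (final_embeddings, dictionary, reverse_dictionary) are in scope; the most_similar name and the explicit re-normalization step are my own additions, added so the dot products are true cosine similarities even if final_embeddings was not already L2-normalized.

import numpy as np

def most_similar(word, final_embeddings, dictionary, reverse_dictionary, top_k=10):
    """Return the top_k words closest to `word` by cosine similarity."""
    # Re-normalize rows so dot products are cosine similarities.
    norms = np.linalg.norm(final_embeddings, axis=1, keepdims=True)
    normalized = final_embeddings / norms
    query = normalized[dictionary[word]]
    sims = normalized @ query  # similarity with every vocabulary word
    # Highest similarity first; index 0 is the query word itself, so skip it.
    nearest = (-sims).argsort()[1:top_k + 1]
    return [reverse_dictionary[i] for i in nearest]

print(most_similar("external", final_embeddings, dictionary, reverse_dictionary))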