Published January 9, 2020 | Accepted Version
Report | Open Access

Multi Sense Embeddings from Topic Models

Abstract

Distributed word embeddings have yielded state-of-the-art performance in many NLP tasks, mainly due to their success in capturing useful semantic information. These representations assign a single vector to each word, even though a large number of words are polysemous (i.e., have multiple meanings). In this work we address this critical problem in lexical semantics: representing the various senses of polysemous words in vector space. We propose a topic-modeling-based skip-gram approach for learning multi-prototype word embeddings, and we introduce a method to prune these embeddings based on the probabilistic representation of each word in each topic. Experiments show that our embeddings capture context and word similarity well and outperform various state-of-the-art implementations.
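The record itself carries no code, but the abstract outlines a recognizable pipeline: learn a topic model, use topic assignments as sense labels, train skip-gram over the sense-labeled corpus, and prune senses that a word rarely takes. The sketch below is one plausible reading of that pipeline using gensim's LdaModel and Word2Vec; the token-tagging scheme, the 0.2 pruning threshold, and the toy sentences are illustrative assumptions, not the paper's actual method.

```python
import numpy as np
from collections import Counter
from gensim.corpora import Dictionary
from gensim.models import LdaModel, Word2Vec

# Toy stand-in for a real tokenized corpus.
sentences = [
    ["apple", "released", "a", "new", "phone"],
    ["he", "ate", "an", "apple", "and", "a", "banana"],
]

# 1. Fit a topic model over the corpus.
dictionary = Dictionary(sentences)
bow_corpus = [dictionary.doc2bow(s) for s in sentences]
lda = LdaModel(bow_corpus, id2word=dictionary, num_topics=4,
               passes=10, random_state=0)
topic_word = lda.get_topics()  # (num_topics, vocab_size) matrix of P(w|z)

# 2. Tag each token with its most probable topic in its sentence,
#    producing sense-specific pseudo-words like "apple#3".
def tag(sent):
    doc_topics = np.zeros(lda.num_topics)
    for z, p in lda.get_document_topics(dictionary.doc2bow(sent),
                                        minimum_probability=0.0):
        doc_topics[z] = p
    tagged = []
    for w in sent:
        wid = dictionary.token2id[w]
        z = int(np.argmax(doc_topics * topic_word[:, wid]))  # P(z|d) * P(w|z)
        tagged.append(f"{w}#{z}")
    return tagged

tagged_corpus = [tag(s) for s in sentences]

# 3. Skip-gram (sg=1) over sense-tagged tokens: one vector per (word, topic).
w2v = Word2Vec(tagged_corpus, vector_size=50, sg=1, min_count=1, epochs=50)

# 4. Prune rare senses: drop a (word, topic) vector when that topic accounts
#    for too small a share of the word's occurrences (threshold is illustrative).
counts = Counter(t for s in tagged_corpus for t in s)
totals = Counter(t.split("#")[0] for s in tagged_corpus for t in s)
senses = {t: w2v.wv[t] for t in counts
          if counts[t] / totals[t.split("#")[0]] >= 0.2}
```

Each surviving pseudo-word keeps its own embedding, so polysemous words end up with multiple prototypes while monosemous words collapse to one; consult the attached PDF for the authors' actual formulation.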

Additional Information

Accepted at an ACL-supported conference on Natural Language & Speech Processing (this https URL), 2019.

Attached Files

Accepted Version - 1909.07746.pdf

Files

1909.07746.pdf (478.9 kB)
md5:0ad31d0beeab8123f1cc32d5523da191

Additional details

Created: August 19, 2023
Modified: October 18, 2023