Abstract
In recent years, there has been an increasing interest in learning a distributed representation of word sense. Traditional context clustering based models usually require careful tuning of model parameters, and typically perform worse on infrequent word senses. This paper presents a novel approach which addresses these limitations by first initializing the word sense embeddings through learning sentence-level embeddings from WordNet glosses using a convolutional neural network. The initialized word sense embeddings are used by a context clustering based model to generate the distributed representations of word senses. Our learned representations outperform the publicly available embeddings on 2 out of 4 metrics in the word similarity task, and 6 out of 13 subtasks in the analogical reasoning task.
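To make the described pipeline concrete, the following is a minimal sketch, not the authors' implementation, of how a WordNet gloss could be encoded with a convolutional filter bank and max-over-time pooling to produce a sentence-level embedding that initializes a word sense vector before context clustering. The dimensions, filter width, toy word vectors, and helper names (`encode_gloss`) are assumptions made for illustration.

```python
import numpy as np

EMB_DIM = 50      # dimensionality of input word vectors (assumed)
NUM_FILTERS = 50  # number of convolution filters = sense-embedding size (assumed)
WIDTH = 3         # convolution window over gloss tokens (assumed)

rng = np.random.default_rng(0)
# Toy word vectors for the gloss tokens; in practice these would be pre-trained.
gloss = "a financial institution that accepts deposits".split()
vocab = {w: rng.normal(scale=0.1, size=EMB_DIM) for w in gloss}

# Random filter bank; in the described approach such weights would be learned
# from WordNet glosses rather than drawn at random.
filters = rng.normal(scale=0.1, size=(NUM_FILTERS, WIDTH * EMB_DIM))

def encode_gloss(tokens):
    """Convolve over the gloss and max-pool over time to get a sentence embedding."""
    mat = np.stack([vocab[t] for t in tokens])           # (len, EMB_DIM)
    windows = [mat[i:i + WIDTH].reshape(-1)              # sliding windows of width WIDTH
               for i in range(len(tokens) - WIDTH + 1)]
    feats = np.tanh(np.stack(windows) @ filters.T)       # (num_windows, NUM_FILTERS)
    return feats.max(axis=0)                             # max-over-time pooling

# The gloss embedding serves as the initial vector for the corresponding word sense
# (e.g., "bank" in its financial-institution sense), which a context clustering
# based model would then refine on corpus occurrences.
sense_init = encode_gloss(gloss)
print(sense_init.shape)  # (50,)
```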
Original language | English |
---|---|
Title of host publication | Proceedings of the 53rd annual meeting of the Association for Computational Linguistics and the 7th International Joint Conference on Natural Language Processing (Short Papers) |
Publisher | Association for Computational Linguistics |
Pages | 15-20 |
Number of pages | 6 |
Volume | 2 |
ISBN (Print) | 978-1-941643-73-0 |
Publication status | Published - 2015 |
Event | 53rd annual meeting of the Association for Computational Linguistics / 7th International Joint Conference on Natural Language Processing of the Asian Federation of Natural Language Processing - Beijing, China |
Duration | 26 Jul 2015 → 31 Jul 2015 |
Meeting
Meeting | 53rd annual meeting of the Association for Computational Linguistics / 7th International Joint Conference on Natural Language Processing of the Asian Federation of Natural Language Processing |
---|---|
Abbreviated title | ACL-IJCNLP 2015 |
Country/Territory | China |
City | Beijing |
Period | 26/07/15 → 31/07/15 |