@hangdong

Joint Multi-Label Attention Networks for Social Text Annotation

Hang Dong, Wei Wang, Kaizhu Huang, and Frans Coenen. Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pages 1348–1354. Minneapolis, Minnesota, Association for Computational Linguistics, June 2019

Abstract

We propose a novel attention network for document annotation with user-generated tags. The network is designed according to human reading and annotation behaviour. Usually, users try to digest the title and obtain a rough idea about the topic first, and then read the content of the document. Previous research shows that the title metadata can largely affect the social annotation. To better utilise this information, we design a framework that separates the title from the content of a document and apply a title-guided attention mechanism over each sentence in the content. We also propose two semantic-based loss regularisers that enforce the output of the network to conform to label semantics, i.e., similarity and subsumption. We analyse each part of the proposed system with two real-world open datasets on publication and question annotation. The integrated approach, Joint Multi-label Attention Network (JMAN), significantly outperformed the Bidirectional Gated Recurrent Unit (Bi-GRU) by around 13%–26% and the Hierarchical Attention Network (HAN) by around 4%–12% on both datasets, with around 10%–30% reduction of training time.
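To make the two ideas named in the abstract more concrete, the sketch below illustrates a title-guided attention over sentence vectors and the two semantic loss regularisers (label similarity and subsumption). This is a minimal illustration only, not the authors' implementation: the function names, tensor shapes, and exact form of the regularisers are assumptions.

```python
# Hypothetical sketch of the components described in the abstract.
# Assumes PyTorch; shapes and regulariser forms are illustrative guesses.
import torch
import torch.nn.functional as F

def title_guided_attention(title_vec, sent_vecs):
    """Weight content-sentence vectors by their similarity to the title.

    title_vec: (batch, dim) encoding of the title
    sent_vecs: (batch, n_sents, dim) encodings of the content sentences
    """
    # Scaled dot-product scores between the title and each sentence.
    scores = torch.einsum("bd,bsd->bs", title_vec, sent_vecs) / title_vec.size(-1) ** 0.5
    weights = F.softmax(scores, dim=-1)                     # (batch, n_sents)
    return torch.einsum("bs,bsd->bd", weights, sent_vecs)   # (batch, dim)

def similarity_regulariser(probs, label_sim):
    """Penalise diverging predictions for label pairs marked as similar.

    probs:     (batch, n_labels) predicted label probabilities
    label_sim: (n_labels, n_labels) 0/1 matrix of similar label pairs
    """
    diff = probs.unsqueeze(2) - probs.unsqueeze(1)          # pairwise differences
    return (label_sim * diff.pow(2)).sum(dim=(1, 2)).mean()

def subsumption_regulariser(probs, parent_of):
    """Penalise a child label scoring higher than the label that subsumes it.

    parent_of: (n_labels, n_labels) 0/1 matrix, 1 where row subsumes column
    """
    # violation[b, i, j] = prob(child j) - prob(parent i), kept only if positive
    violation = probs.unsqueeze(1) - probs.unsqueeze(2)
    return (parent_of * violation.clamp(min=0)).sum(dim=(1, 2)).mean()
```

In this reading, the regularisers would simply be added to the multi-label classification loss with small weights; how the paper actually combines them is not stated in the abstract.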

Description

Joint Multi-Label Attention Networks for Social Text Annotation - ACL Anthology

Tags

community