Siamese Network with Soft Attention for Semantic Text Understanding

Research & Innovation

We propose a task-independent neural network model based on a Siamese twin architecture. The model benefits from two attention schemes used to extract high-level feature representations of the underlying texts, both at the word level (intra-attention) and at the sentence level (inter-attention). The inter-attention scheme uses one of the texts to create a contextual interlock with the other, thus paying attention to mutually important parts. We evaluate our system on three tasks: textual entailment, paraphrase detection, and answer-sentence selection. We achieve a near state-of-the-art result on the textual entailment task with the SNLI corpus while obtaining strong performance on the other tasks.
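To make the two attention schemes concrete, here is a minimal PyTorch-style sketch of a shared (Siamese) encoder with word-level intra-attention pooling and a sentence-level inter-attention that lets each text attend over the other. All module names, dimensions, and the dot-product form of the inter-attention are illustrative assumptions for exposition, not the exact formulation presented in the talk.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class SiameseAttentionEncoder(nn.Module):
    """Illustrative sketch: shared encoder, intra- and inter-attention."""

    def __init__(self, vocab_size, embed_dim=300, hidden_dim=128):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        # Shared BiLSTM: both texts pass through the same weights (Siamese twins).
        self.encoder = nn.LSTM(embed_dim, hidden_dim,
                               batch_first=True, bidirectional=True)
        # Intra-attention: scores each word within a single sentence.
        self.intra_score = nn.Linear(2 * hidden_dim, 1)

    def encode(self, tokens):
        h, _ = self.encoder(self.embed(tokens))        # (B, T, 2H)
        # Word-level (intra) attention pooling over the sentence.
        alpha = F.softmax(self.intra_score(h), dim=1)  # (B, T, 1)
        pooled = (alpha * h).sum(dim=1)                # (B, 2H)
        return h, pooled

    def forward(self, text_a, text_b):
        h_a, v_a = self.encode(text_a)
        h_b, v_b = self.encode(text_b)
        # Sentence-level (inter) attention: one text attends over the other,
        # forming a contextual interlock between the pair.
        scores = torch.bmm(h_a, h_b.transpose(1, 2))   # (B, Ta, Tb)
        a_over_b = torch.bmm(F.softmax(scores, dim=2), h_b).mean(dim=1)
        b_over_a = torch.bmm(F.softmax(scores.transpose(1, 2), dim=2),
                             h_a).mean(dim=1)
        # Combined pair representation for a task-specific classifier head
        # (entailment, paraphrase detection, or answer-sentence selection).
        return torch.cat([v_a, v_b, a_over_b, b_over_a], dim=-1)
```

The concatenated pair representation would then feed a small task-specific classifier, which is what makes the shared encoder itself task-independent.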
