NLP Highlights
58 - Learning What’s Easy: Fully Differentiable Neural Easy-First Taggers, with André Martins
- Author: Various
- Narrator: Various
- Publisher: Podcast
- Duration: 0:47:29
Synopsis
EMNLP 2017 paper by André F. T. Martins and Julia Kreutzer. André comes on the podcast to talk to us about the paper. We spend the bulk of the time on its two main contributions: how they applied the notion of "easy first" decoding to neural taggers, and the details of the constrained softmax they introduced to accomplish this. We conclude that "easy first" might not be the right name for this approach: in the end it does something very similar to stacked self-attention, with standard independent decoding on top. The particulars of the self-attention are inspired by "easy first", however, using a constrained softmax to enforce some novel constraints on the attention weights. https://www.semanticscholar.org/paper/Learning-What's-Easy%3A-Fully-Differentiable-Neural-Martins-Kreutzer/252571243aa4c0b533aa7fc63f88d07fd844e7bb
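The constrained softmax mentioned above is the piece that keeps the model fully differentiable: it behaves like a softmax but caps each coordinate's probability, so attention mass can be budgeted across easy-first sweeps. A minimal NumPy sketch of that idea, framed as the KL projection of a softmax onto a box-constrained simplex; the function name and the iterative clipping scheme here are illustrative, not the authors' exact implementation:

```python
import numpy as np

def constrained_softmax(z, u):
    """Illustrative sketch: softmax of scores z with per-coordinate upper bounds u.

    Returns p = argmin KL(p || softmax(z)) subject to sum(p) = 1 and 0 <= p <= u.
    Assumes feasibility, i.e. u.sum() >= 1.
    """
    p = np.zeros_like(z, dtype=float)
    active = np.ones(len(z), dtype=bool)  # coordinates not yet clipped to their bound
    mass = 1.0                            # probability mass left for active coordinates
    while active.any():
        e = np.where(active, np.exp(z - z.max()), 0.0)
        q = mass * e / e.sum()            # rescaled softmax over the active coordinates
        over = active & (q > u)           # coordinates violating their upper bound
        if not over.any():
            p[active] = q[active]         # no violations left: done
            break
        p[over] = u[over]                 # clip violators to their bounds...
        mass -= u[over].sum()             # ...and redistribute the remaining mass
        active &= ~over
    return p
```

With loose bounds (all ones) this reduces to an ordinary softmax; with a tight bound on one coordinate, the excess mass flows to the others, which is the mechanism that discourages attending repeatedly to the same position.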
