Date: Fri, 03 May 2019 15:31:05 +0000
<p>Kyle and Linhda discuss attention and the transformer, an encoder/decoder architecture that extends the ideas behind static vector embeddings like word2vec to context-dependent representations.</p>
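As a rough illustration of the attention mechanism discussed in the episode, here is a minimal sketch of scaled dot-product attention in NumPy. All names, shapes, and values are illustrative assumptions, not material from the episode itself:

```python
import numpy as np

def softmax(x, axis=-1):
    # subtract the row max for numerical stability
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def scaled_dot_product_attention(Q, K, V):
    # Q, K, V: (seq_len, d) matrices of query, key, and value vectors
    d = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d)       # pairwise query-key similarity
    weights = softmax(scores, axis=-1)  # each row is a distribution over tokens
    return weights @ V                  # context-weighted mix of value vectors

# toy example: self-attention over 3 tokens with 4-dimensional embeddings
rng = np.random.default_rng(0)
X = rng.normal(size=(3, 4))
out = scaled_dot_product_attention(X, X, X)
print(out.shape)  # (3, 4)
```

Unlike a static word2vec lookup, each output row here depends on the other tokens in the sequence, which is the "more contextual" behavior the episode alludes to.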