+DS IPLE: Attention Networks for Natural Language Processing

Thursday, March 26, 2020 - 4:30pm to 6:30pm

Virtual classroom


Lawrence Carin

Neural-network-based methods for natural language processing (NLP) constitute an area of significant recent technical progress, with many interesting real-world applications. The Transformer network is one of the newest and most powerful approaches of this type. It is based on the repeated application of attention networks within an encoder-decoder framework. This presentation will cover the basics of all-attention models (the Transformer) for NLP, with applications in areas such as text synthesis (e.g., suggesting email text) and language translation. This session is part of the Duke+Data Science (+DS) program's in-person learning experiences (IPLEs). To learn more, please visit
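The attention operation at the heart of the Transformer can be illustrated with a short sketch. The following is a minimal NumPy example (not the session's code) of scaled dot-product attention, where each query is compared against all keys and the resulting softmax weights form a weighted sum over the values; the matrix shapes and random inputs here are illustrative assumptions.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Compute softmax(Q K^T / sqrt(d_k)) V, the core Transformer operation."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)  # similarity of each query to every key
    # numerically stable softmax over the key dimension
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V, weights      # weighted sum of values, plus the weights

# Toy example: a sequence of 3 tokens with key/value dimension 4
rng = np.random.default_rng(0)
Q = rng.standard_normal((3, 4))
K = rng.standard_normal((3, 4))
V = rng.standard_normal((3, 4))
out, weights = scaled_dot_product_attention(Q, K, V)
print(out.shape)  # (3, 4): one output vector per input token
```

In the full Transformer, this operation is applied many times in parallel (multi-head attention) and stacked in layers within both the encoder and the decoder.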