+DS vLE: Natural Language Processing with LSTM Recurrent Neural Networks

Wednesday, October 7, 2020 - 4:30pm to 6:00pm

Lawrence Carin

Natural language processing (NLP) is a field focused on developing automated methods for analyzing text and for computer-driven text generation (synthesis, for example in translation and text summarization). Recurrent neural networks have recently become a state-of-the-art approach for NLP, with the long short-term memory (LSTM) network representing the primary methodology of this type. In this session, LSTM NLP models will be introduced, with as little math as possible and with an emphasis on intuition. The concept of word embeddings will be introduced within the context of implementing LSTMs, and it will be explained how such models are used in practice for analysis and generation of natural language (e.g., language translation).

Before the session, all registrants will receive an e-mail with a link and meeting information. This session is part of the Duke+DataScience (+DS) program virtual learning experiences (vLEs). To learn more, please visit
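To give a flavor of the two ideas the session covers, the sketch below is a toy, pure-Python illustration (not the session's actual materials): each word is looked up in an embedding table to get a dense vector, and an LSTM cell consumes those vectors one at a time, updating a hidden state and a cell state through its input, forget, and output gates. All sizes, words, and weights here are made-up assumptions for illustration; real models learn the embeddings and gate weights from data.

```python
import math
import random

random.seed(0)

EMBED_DIM = 4   # toy sizes; real models use hundreds of dimensions
HIDDEN_DIM = 3

# Toy word-embedding table: each word maps to a dense vector.
# In a trained model these vectors are learned, not random.
vocab = ["the", "cat", "sat"]
embedding = {w: [random.uniform(-1, 1) for _ in range(EMBED_DIM)]
             for w in vocab}

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def dot(row, vec):
    return sum(a * b for a, b in zip(row, vec))

def rand_matrix(rows, cols):
    return [[random.uniform(-0.5, 0.5) for _ in range(cols)]
            for _ in range(rows)]

# Randomly initialized gate weights (training would tune these).
# Gates: i = input, f = forget, o = output, c = candidate cell update.
W = {g: rand_matrix(HIDDEN_DIM, EMBED_DIM + HIDDEN_DIM) for g in "ifoc"}
b = {g: [0.0] * HIDDEN_DIM for g in "ifoc"}

def lstm_step(x, h, c):
    """One LSTM time step: consume embedding x, update states (h, c)."""
    z = x + h  # concatenate word embedding with previous hidden state
    i = [sigmoid(dot(W["i"][j], z) + b["i"][j]) for j in range(HIDDEN_DIM)]
    f = [sigmoid(dot(W["f"][j], z) + b["f"][j]) for j in range(HIDDEN_DIM)]
    o = [sigmoid(dot(W["o"][j], z) + b["o"][j]) for j in range(HIDDEN_DIM)]
    g = [math.tanh(dot(W["c"][j], z) + b["c"][j]) for j in range(HIDDEN_DIM)]
    # Forget part of the old cell state, write in new candidate content.
    c_new = [f[j] * c[j] + i[j] * g[j] for j in range(HIDDEN_DIM)]
    # Hidden state: gated view of the (squashed) cell state.
    h_new = [o[j] * math.tanh(c_new[j]) for j in range(HIDDEN_DIM)]
    return h_new, c_new

# Run the sentence through the cell, one word embedding per step.
h = [0.0] * HIDDEN_DIM
c = [0.0] * HIDDEN_DIM
for word in ["the", "cat", "sat"]:
    h, c = lstm_step(embedding[word], h, c)

print(h)  # final hidden state: a fixed-size summary of the sentence
```

The final hidden state is a fixed-size vector summarizing the whole sequence; in practice it (or the per-step states) feeds a downstream layer for tasks such as classification or, in translation, conditions a decoder that generates output text.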