[Project in collaboration with Telenor-NTNU AILab]
In recent years, advances in Deep Learning have had a strong impact on Natural Language Processing tasks such as Question Answering, conversational systems (e.g., chatbots), and text understanding.
For example, the attention mechanism allows a network to focus on a relevant subset of the input for a given task, reducing the amount of information to be processed. It has proved effective in several of these tasks, such as machine translation and Question Answering.
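To make the idea concrete, here is a minimal sketch of (scaled) dot-product attention, the core operation behind the attention-based models listed below. This is an illustrative NumPy implementation, not code from any of the cited papers; the toy query/key/value matrices are made up for the example.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """For each query, compute a softmax distribution over the keys
    and return the correspondingly weighted sum of the values."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)  # similarity of each query to each key
    # softmax over the key dimension (numerically stabilized)
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V, weights

# Toy example: 2 queries attending over 3 key/value pairs
Q = np.array([[1.0, 0.0], [0.0, 1.0]])
K = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
V = np.array([[1.0], [2.0], [3.0]])
out, w = scaled_dot_product_attention(Q, K, V)
print(w.shape)  # (2, 3): one weight distribution per query
```

Each row of the attention weights sums to 1, so the output is a convex combination of the values: this is the sense in which the network "focuses" on a subset of the data.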
In this work, the student will implement and explore the current state of the art in deep learning for one of the NLP tasks described above. The student should focus on specific methods (attention mechanisms, RNNs, etc.) and test them on existing challenges/datasets (e.g., the Stanford Question Answering Dataset). Suggested reading:
 - Effective Approaches to Attention-based Neural Machine Translation
 - Dynamic Coattention Network
 - Abstractive text summarization using sequence-to-sequence rnns and beyond 
A minimal background in Machine Learning is required (at least one or two courses).
Tags: QA, Deep Learning, NLP