Natural Language Processing Papers
https://ai-paper.github.io/nlp/
Distributed Representations of Sentences and Documents
GloVe: Global Vectors for Word Representation
Skip-Thought Vectors
Convolutional Neural Networks for Sentence Classification
Character-level Convolutional Networks for Text Classification
Hierarchical Attention Networks for Document Classification
Neural Relation Extraction with Selective Attention over Instances
End-to-end Sequence Labeling via Bi-directional LSTM-CNNs-CRF
Sequence to Sequence Learning with Neural Networks
Convolutional Sequence to Sequence Learning
Google’s Neural Machine Translation System: Bridging the Gap between Human and Machine Translation
Phrase-Based & Neural Unsupervised Machine Translation
Get To The Point: Summarization with Pointer-Generator Networks
End-To-End Memory Networks
QANet: Combining Local Convolution with Global Self-Attention for Reading Comprehension
Bidirectional Attention Flow for Machine Comprehension
Adversarial Learning for Neural Dialogue Generation
SeqGAN: Sequence Generative Adversarial Nets with Policy Gradient
Modeling Relational Data with Graph Convolutional Networks
Exploring the Limits of Language Modeling
Transformer-XL: Attentive Language Models Beyond a Fixed-Length Context
An Empirical Evaluation of Generic Convolutional and Recurrent Networks for Sequence Modeling
Deep contextualized word representations
BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding
XLNet: Generalized Autoregressive Pretraining for Language Understanding
Efficient Estimation of Word Representations in Vector Space
Neural Machine Translation by Jointly Learning to Align and Translate
Attention Is All You Need
A Convolutional Neural Network for Modelling Sentences
Bag of Tricks for Efficient Text Classification
Siamese Recurrent Architectures for Learning Sentence Similarity