
Papers/NLP (4)
[Paper Review] Text Classification Using Label Names Only: A Language Model Self-Training Approach (LOTClass, EMNLP 2020) Written by Yu Meng et al. Paper Link: https://arxiv.org/pdf/2010.07245.pdf Summary Motivation: To propose a weakly-supervised text classification model trained only on an unlabeled corpus, without external sources. Contribution: 1) No need for external sources such as Wikipedia. 2) Uses category-indicative words and predicts masked ones. 3) Comparable text classification performance compared..
[NLP Foundational Paper Review #2] Distant Supervision for Relation Extraction without Labeled Data (ACL 2009) Written by Mike Mintz et al. Paper Link: https://aclanthology.org/P09-1113.pdf Summary Motivation: To use Freebase to provide a training set of relations and the entity pairs that participate in those relations. Contribution: Avoids overfitting and domain dependence by using a large-scale knowledge base. Limitations: Wrongly labeled relations; false negatives; assumption of large data (a knowledge base such as F..
[NLP Foundational Paper Review #1] GloVe: Global Vectors for Word Representation (EMNLP 2014) by Jeffrey Pennington Paper: https://nlp.stanford.edu/pubs/glove.pdf Motivation: Word2vec is a framework for learning word vector representations. The main idea is to compute the similarity of word vectors given a context. Word2vec proposes two models, skip-gram and continuous bag-of-words (CBOW), whose objective function is to predict a word given its context. Because this approach cannot directly exploit co-occurrence statistics, within the corpus..
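The co-occurrence statistics that GloVe factorizes directly (and that skip-gram only captures implicitly through local prediction) can be illustrated with a minimal sketch. This is an illustrative toy, not the paper's implementation; the function name, window size, and unweighted counting are assumptions (GloVe itself typically down-weights pairs by their distance within the window):

```python
from collections import defaultdict

def cooccurrence_counts(tokens, window=2):
    # Count how often each ordered word pair appears within `window`
    # positions of each other across the corpus. GloVe fits word
    # vectors to this global matrix; skip-gram never builds it.
    counts = defaultdict(float)
    for i, w in enumerate(tokens):
        lo, hi = max(0, i - window), min(len(tokens), i + window + 1)
        for j in range(lo, hi):
            if i != j:
                counts[(w, tokens[j])] += 1.0
    return counts

corpus = "the cat sat on the mat".split()
X = cooccurrence_counts(corpus, window=1)
print(X[("the", "cat")])  # 1.0: "the" and "cat" are adjacent once
```

In the real model, the entries of this matrix (after the distance weighting noted above) become the targets of GloVe's weighted least-squares objective.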
[Review] Structured Prediction as Translation between Augmented Natural Languages (ICLR 2021) by Paolini et al. Paper Link: https://openreview.net/pdf?id=US-TP-xnXI Code Link: https://github.com/amazon-research/tanl Contents 1. Structured Prediction 2. Structured Prediction Tasks 3. Introduction 4. Proposed Method 5. Experiments 6. Discussion & Summary Abbreviations & Keywords SP: Structured Prediction TANL: Translation between Augmented Natural Language PLM: Pre-trained Language Mod..
