COMMONSENSE GROUNDED TEXT SUMMARIZATION WITH KNOWLEDGE GRAPHS

Abstract
Natural language processing has seen remarkable progress in conditional language modelling in the post-transformer era. However, commonsense knowledge, or the lack thereof, remains a critical shortcoming of these models. In this study, we replicate the performance of an existing graph-based neural architecture, CCM, which uses a large-scale commonsense knowledge base and a graph attention mechanism. Originally designed for conversation generation and semantic understanding, CCM can, with a few adjustments to the training hyperparameters, be extended from meaningful response generation to abstractive text summarization. To the best of our knowledge, this is the first attempt to use commonsense knowledge graphs for text summarization. Our experiments demonstrate that the generated summaries are both precise and informative when evaluated against baseline metrics.

Keywords: Knowledge Graphs, Text Summarization, NLP, Commonsense Reasoning, Sequence-to-sequence, Graph Neural Networks