Comparative Analysis of Transformer-Based Pre-Trained BERT Variants for Resume Classification
Abstract
Machine Learning has advanced rapidly over the years, and there have been attempts to integrate it into virtually every field in one form or another. One such form is Natural Language Processing (NLP), which deals with granting machines the ability to process and understand text in much the same way that humans can. Countless systems, models, and algorithms have been developed to enable machines to better understand human language. One such model is BERT, or Bidirectional Encoder Representations from Transformers. Several models, such as ALBERT, RoBERTa, DistilBERT, ELECTRA, and XLNet, were subsequently developed to improve upon BERT or to serve more specialized use cases. Our research evaluates all of the aforementioned models on resume classification, thereby obtaining a complete overview of the strengths and weaknesses of each model when performing this particular task.
Keywords: NLP, Resume Classification, BERT