ICON 2021 • Anmol Bansal, Anjali Shenoy, Krishna Chaitanya Pappu, Kay Rottmann, Anurag Dwarakanath
Fine-tuning self-supervised pre-trained language models such as BERT has significantly improved state-of-the-art performance on natural language processing tasks.