Advancing Transformer's Capabilities in Commonsense Reasoning

Y Zhou, Y Han, H Zhou, Y Wu - arXiv preprint arXiv:2310.06803, 2023 - arxiv.org
Recent advances in general-purpose pre-trained language models have shown great potential in commonsense reasoning. However, current works still perform poorly on standard commonsense reasoning benchmarks, including the Com2Sense dataset. We argue that this is due to a disconnect with current cutting-edge machine learning methods. In this work, we aim to bridge the gap by introducing current ML-based methods to improve general-purpose pre-trained language models on the task of commonsense reasoning. Specifically, we experiment with and systematically evaluate methods including knowledge transfer, model ensembling, and an additional pairwise contrastive objective. Our best model outperforms the strongest previous works by ~15% absolute gain in Pairwise Accuracy and ~8.7% absolute gain in Standard Accuracy.
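
The abstract names a pairwise contrastive objective but does not give its form. Below is a minimal PyTorch sketch of one plausible formulation: a margin ranking term over the plausibility scores of the two complementary statements in each Com2Sense pair, combined with a standard per-statement cross-entropy. The function names, `margin`, and `lambda_pair` are illustrative assumptions, not details from the paper.

```python
import torch
import torch.nn.functional as F

def pairwise_contrastive_loss(score_true: torch.Tensor,
                              score_false: torch.Tensor,
                              margin: float = 0.5) -> torch.Tensor:
    """Margin-based pairwise term over complementary statement pairs (assumed form).

    score_true:  plausibility scores for the sensical statement in each pair, shape (B,)
    score_false: plausibility scores for its complementary, non-sensical statement, shape (B,)
    Penalizes pairs where the sensical statement does not outscore its
    complement by at least `margin`.
    """
    return F.relu(margin - (score_true - score_false)).mean()

def total_loss(logits_true: torch.Tensor,
               logits_false: torch.Tensor,
               lambda_pair: float = 1.0,
               margin: float = 0.5) -> torch.Tensor:
    """Per-statement cross-entropy plus the pairwise term.

    logits_*: shape (B, 2) over classes {0: implausible, 1: plausible}.
    lambda_pair is a hypothetical weighting hyperparameter, not from the paper.
    """
    batch = logits_true.size(0)
    ce = (F.cross_entropy(logits_true, torch.ones(batch, dtype=torch.long))
          + F.cross_entropy(logits_false, torch.zeros(batch, dtype=torch.long)))
    pair = pairwise_contrastive_loss(logits_true[:, 1], logits_false[:, 1], margin)
    return ce + lambda_pair * pair
```

A loss of this shape directly targets the Pairwise Accuracy metric reported above, since the ranking term is zero only when both statements in a pair are scored consistently relative to each other.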