LIONs: An Empirically Optimized Approach to Align Language Models

Xiao Yu, Qingyang Wu, Yu Li, Zhou Yu


Abstract
Alignment is a crucial step for improving the instruction-following and conversational abilities of language models. Although many recent works propose new algorithms, datasets, and training pipelines, there is a lack of comprehensive studies measuring the impact of various design choices across the whole training process. We first conduct a rigorous analysis of a three-stage training pipeline consisting of supervised fine-tuning (SFT), offline preference learning, and online preference learning. We find that techniques such as sequence packing, loss masking in SFT, increasing the preference dataset size in direct preference optimization (DPO), and online DPO training can significantly improve the performance of language models. We then train models from Gemma-2b-base and Llama-3-8b-base, and find that our best models exceed the performance of the official instruct models tuned with closed-source data and algorithms. Our code and models can be found at https://github.com/Columbia-NLP-Lab/LionAlignment.
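To make two of the techniques named in the abstract concrete, here is a minimal PyTorch-style sketch of loss masking in SFT and the standard DPO objective (Rafailov et al., 2023). The function names, signatures, and the beta default are illustrative assumptions chosen for exposition, not the authors' released implementation; the actual code is in the repository linked above.

```python
import torch
import torch.nn.functional as F


def mask_prompt_labels(input_ids: torch.Tensor, prompt_len: int) -> torch.Tensor:
    """Loss masking for SFT: copy the token ids into the labels, then set the
    prompt positions to -100 (the default ignore_index of PyTorch's
    cross-entropy loss) so the loss is computed only on the response tokens."""
    labels = input_ids.clone()
    labels[:prompt_len] = -100
    return labels


def dpo_loss(
    policy_chosen_logps: torch.Tensor,    # log p_theta(y_chosen | x), summed over tokens
    policy_rejected_logps: torch.Tensor,  # log p_theta(y_rejected | x)
    ref_chosen_logps: torch.Tensor,       # same quantities under the frozen reference model
    ref_rejected_logps: torch.Tensor,
    beta: float = 0.1,                    # illustrative default; a tunable hyperparameter
) -> torch.Tensor:
    """Standard DPO objective: -log sigmoid(beta * (r_w - r_l)), where r_w and
    r_l are the policy-vs-reference log-probability ratios of the chosen and
    rejected responses."""
    r_w = policy_chosen_logps - ref_chosen_logps
    r_l = policy_rejected_logps - ref_rejected_logps
    return -F.logsigmoid(beta * (r_w - r_l)).mean()
```

In online DPO, the preference pairs are sampled from the current policy and labeled during training rather than drawn from a fixed offline preference dataset.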
Anthology ID: 2024.emnlp-main.496
Volume: Proceedings of the 2024 Conference on Empirical Methods in Natural Language Processing
Month: November
Year: 2024
Address: Miami, Florida, USA
Editors: Yaser Al-Onaizan, Mohit Bansal, Yun-Nung Chen
Venue: EMNLP
Publisher: Association for Computational Linguistics
Pages: 8732–8753
URL: https://aclanthology.org/2024.emnlp-main.496/
DOI: 10.18653/v1/2024.emnlp-main.496
Cite (ACL): Xiao Yu, Qingyang Wu, Yu Li, and Zhou Yu. 2024. LIONs: An Empirically Optimized Approach to Align Language Models. In Proceedings of the 2024 Conference on Empirical Methods in Natural Language Processing, pages 8732–8753, Miami, Florida, USA. Association for Computational Linguistics.
Cite (Informal): LIONs: An Empirically Optimized Approach to Align Language Models (Yu et al., EMNLP 2024)
PDF: https://aclanthology.org/2024.emnlp-main.496.pdf
Software: 2024.emnlp-main.496.software.zip