
UNIFIT: A Unified Framework for Instruction Tuning to Improve the Instruction-Following Ability of Large Language Models

Abstract

Extensive instruction tuning of large language models (LLMs) has proven to be a powerful technique, extending the strong performance of general LLMs to new tasks. Consequently, adapting state-of-the-art (SOTA) open-source general models to specific domains relies on instruction tuning to unlock the emergent capabilities of LLMs in those domains. Current practice, however, often expands the dataset without a clear strategy for ensuring data quality, which can inadvertently introduce noise and degrade model performance. Moreover, there is no unified approach governing the quantity, quality, and diversity of instruction tuning data, or the tuning methods themselves, which severely limits the practicality and generality of instruction tuning. To address these issues, we propose a UNIfied Framework for Instruction Tuning (UNIFIT): Concept Tree generation of instruction tuning data, selection of high-quality data by instruction-following difficulty, and the injection of random noise embeddings (NE) during tuning to enhance model performance. In experiments spanning multiple models, domains, and data scales, the proposed framework not only increases the diversity of instruction tuning data but also reduces training time by 60%: using only 6% of the instruction tuning data, it surpasses the performance of tuning on the full dataset by 11%. This broadly applicable framework marks a substantial advance in the generality of LLM instruction tuning, delivering efficiency gains while remaining resource-conscious.
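The data-selection component names instruction-following difficulty as its scoring criterion. A minimal sketch, assuming the commonly used definition IFD(Q, A) = PPL(A | Q) / PPL(A), i.e. the perplexity of the answer conditioned on the instruction divided by the perplexity of the answer alone; the model name, the example pool, and the top-fraction budget are placeholders, not taken from the paper:

```python
# Sketch of IFD-style data selection (assumed IFD = PPL(A|Q) / PPL(A)).
import math
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "gpt2"  # placeholder scoring model
tok = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name).eval()

@torch.no_grad()
def answer_loss(prefix: str, answer: str) -> float:
    """Mean cross-entropy over the answer tokens, optionally conditioned on a prefix.
    Prefix and answer are tokenized separately, which is an acceptable
    approximation at the boundary for a sketch like this."""
    prefix_ids = tok(prefix, return_tensors="pt").input_ids if prefix else None
    answer_ids = tok(answer, return_tensors="pt").input_ids
    ids = answer_ids if prefix_ids is None else torch.cat([prefix_ids, answer_ids], dim=1)
    labels = ids.clone()
    if prefix_ids is not None:
        labels[:, : prefix_ids.shape[1]] = -100  # score only the answer span
    return model(ids, labels=labels).loss.item()

def ifd_score(instruction: str, answer: str) -> float:
    # The perplexity ratio equals exp of the loss difference; higher values
    # mean the instruction makes the answer harder to predict, i.e. the
    # sample carries more instruction-following signal.
    return math.exp(answer_loss(instruction, answer) - answer_loss("", answer))

# Keep the highest-IFD fraction of the pool (e.g. a ~6% budget as in the abstract).
pool = [("Translate 'bonjour' to English.", "Hello.")]
selected = sorted(pool, key=lambda qa: ifd_score(*qa), reverse=True)
```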
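For the noise-embedding component, the abstract gives no formula; a NEFTune-style scheme, where uniform noise scaled by alpha / sqrt(L * d) is added to the input embeddings during fine-tuning, is one plausible reading. The scaling rule and the alpha value below are purely assumed:

```python
# Sketch of NEFTune-style random noise embeddings (assumed scheme, not
# confirmed by the abstract): perturb token embeddings only during training.
import torch

def noisy_embeddings(embeds: torch.Tensor, alpha: float = 5.0) -> torch.Tensor:
    """embeds: (batch, seq_len, dim) token embeddings produced by the model."""
    seq_len, dim = embeds.shape[1], embeds.shape[2]
    scale = alpha / (seq_len * dim) ** 0.5
    noise = torch.empty_like(embeds).uniform_(-1.0, 1.0) * scale
    return embeds + noise  # skip this at inference time
```

The intuition behind such schemes is regularization: small perturbations of the embedding space discourage the model from overfitting the exact surface form of the tuning data.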
