Our paper "Sparse is Enough in Fine-tuning Pre-trained Large Language Models" (SIFT) has been accepted at ICML 2024. It presents an efficient sparse fine-tuning framework that achieves strong performance while updating only a small fraction of the model's parameters.
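
For context, sparse fine-tuning in general means training only a sparse subset of a pre-trained model's weights. Below is a minimal PyTorch sketch of that general idea via gradient masking; the selection rule (top-k entries by one-batch gradient magnitude), the `loss_fn(model, batch)` interface, and the `density` value are illustrative assumptions for the demo, not SIFT's actual procedure — see the paper and code for the method itself.

```python
# Illustrative sketch of generic sparse fine-tuning via gradient masking.
# The top-k-by-gradient-magnitude selection below is an assumption for
# this demo, NOT the selection rule used by SIFT.
import torch

def make_sparse_masks(model, loss_fn, batch, density=0.01):
    """Pick a fixed sparse subset of entries per parameter to update."""
    loss = loss_fn(model, batch)  # assumed interface: returns a scalar loss
    loss.backward()
    masks = {}
    for name, p in model.named_parameters():
        if p.grad is None:
            continue
        k = max(1, int(density * p.numel()))
        # Keep the k entries with the largest gradient magnitude.
        idx = p.grad.abs().flatten().topk(k).indices
        mask = torch.zeros(p.numel(), dtype=torch.bool, device=p.device)
        mask[idx] = True
        masks[name] = mask.view_as(p)
        p.grad = None  # clear probe gradients before real training
    return masks

def apply_masks(model, masks):
    """Register hooks so only the selected entries receive gradients."""
    for name, p in model.named_parameters():
        if name in masks:
            p.register_hook(lambda g, m=masks[name]: g * m)
```

With a plain optimizer such as SGD without weight decay, only the masked entries move during training, so the number of updated parameters is bounded by `density`.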

Links: Paper · Code