Rethinking Learning Rate Tuning in the Era of Large Language Models
Published in 2023 IEEE 5th International Conference on Cognitive Machine Intelligence (CogMI), 2023
This paper explores the challenges of learning rate tuning for Large Language Models (LLMs) and introduces LRBench++, a benchmarking tool for evaluating learning rate policies.
Recommended citation: Jin, H., Wei, W., Wang, X., Zhang, W., & Wu, Y. (2023). "Rethinking Learning Rate Tuning in the Era of Large Language Models." 2023 IEEE 5th International Conference on Cognitive Machine Intelligence (CogMI), 112-121.
Download Paper