Base of RoPE Bounds Context Length

Abstract

Position embedding is a core component of current Large Language Models (LLMs). Rotary position embedding (RoPE), a technique that encodes position information with a rotation matrix, has become the de facto choice for position embedding in many LLMs, such as the Llama series. RoPE has further been used to extend long-context capability, largely by adjusting the base parameter of RoPE to mitigate out-of-distribution (OOD) problems in position embedding. However, in this paper, we find that LLMs may obtain only a superficial long-context ability under the OOD theory. We revisit the role of RoPE in LLMs, propose a novel long-term decay property, and derive that the base of RoPE bounds context length: there is an absolute lower bound on the base value required to obtain a given context length capability. Our work reveals the relationship between context length and RoPE base both theoretically and empirically, which may shed light on future long-context training.
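For readers unfamiliar with where the base parameter enters, the following is a minimal PyTorch sketch of RoPE, not taken from the paper: the function name rope_rotate, the tensor shapes, and the example base values are illustrative assumptions. It shows that the base sets the per-dimension rotation frequencies, which is the knob long-context methods adjust.

```python
# Minimal RoPE sketch (illustrative, not the paper's implementation).
# For head dimension d, dimension pair i rotates with frequency theta_i = base^(-2i/d);
# the paper argues this base lower-bounds the attainable context length.
import torch

def rope_rotate(x: torch.Tensor, base: float = 10000.0) -> torch.Tensor:
    """Apply rotary position embedding to x of shape (seq_len, dim), dim even."""
    seq_len, dim = x.shape
    # Frequencies theta_i = base^(-2i/dim) for i = 0 .. dim/2 - 1
    inv_freq = base ** (-torch.arange(0, dim, 2, dtype=torch.float32) / dim)
    positions = torch.arange(seq_len, dtype=torch.float32)
    angles = torch.outer(positions, inv_freq)      # (seq_len, dim/2)
    cos, sin = angles.cos(), angles.sin()
    x1, x2 = x[..., 0::2], x[..., 1::2]            # paired dimensions
    # Rotate each 2-D pair (x1, x2) by its position-dependent angle
    out = torch.empty_like(x)
    out[..., 0::2] = x1 * cos - x2 * sin
    out[..., 1::2] = x1 * sin + x2 * cos
    return out

# A larger base slows the rotation of low-frequency dimensions; e.g. (hypothetical values)
q = torch.randn(4096, 128)
q_rot = rope_rotate(q, base=500000.0)
```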

Publication
In Conference on Neural Information Processing Systems (NeurIPS 2024)
Qingyu Zhang
Master Student of Computer Science and Technology

Research interests include LLM Long Context and Post-training.
