
"Recursive Language Models" by Alex L Zhang
Recursive Language Models (RLMs) enable language models to process unbounded input context length by recursively decomposing and interacting with context through REPL environments, with GPT-5-mini outperforming GPT-5 on long-context benchmarks.
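A minimal sketch of the recursive-decomposition idea described above, not the authors' implementation: if the context fits within a token budget, answer directly; otherwise split it, recurse on each half, and combine the partial answers with one more model call. The `call_model` function here is a hypothetical stand-in for a real LM API (the paper's REPL-based interaction is not modeled).

```python
def call_model(prompt: str, context: str) -> str:
    # Placeholder for a real LM call (e.g. to GPT-5-mini); returns a stub answer.
    return f"answer({len(context)} chars)"

def recursive_lm(prompt: str, context: str, budget: int = 1000) -> str:
    # Base case: context fits in the (hypothetical) budget, query directly.
    if len(context) <= budget:
        return call_model(prompt, context)
    # Recursive case: split the context in half and solve each part.
    mid = len(context) // 2
    left = recursive_lm(prompt, context[:mid], budget)
    right = recursive_lm(prompt, context[mid:], budget)
    # Combine the two partial answers with one more model call.
    return call_model(prompt, left + "\n" + right)
```

Because each recursion level halves the context, the depth grows only logarithmically with input length, which is what lets the scheme handle inputs far beyond a single model's context window.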