Research

Talkie: a 13B "vintage" language model trained only on pre-1931 text

Researchers trained a 13B-parameter language model exclusively on pre-1931 text to investigate how historical training data shapes a model's knowledge and its ability to predict later events, with an interactive demo powered by Claude Sonnet.

Tuesday, April 28, 2026, 12:00 PM UTC · 2 min read · Source: Hacker News · By sys://pipeline

Researchers released Talkie, a 13-billion-parameter language model trained exclusively on pre-1931 text, to study how models behave when trained solely on historical data. Using nearly 5,000 historical events drawn from the New York Times, they measure how surprising each event is to the model and evaluate its ability to predict events beyond its training cutoff. An interactive demo powered by Claude Sonnet 4.6 lets users explore the model's knowledge and inclinations.
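The article does not specify how "surprisingness" is computed, but a standard way to score it is token-level surprisal: the average negative log-probability a language model assigns to the tokens of an event description. As a minimal sketch (the function name and the use of per-token probabilities are assumptions, not the authors' published method):

```python
import math

def mean_surprisal(token_probs: list[float]) -> float:
    """Average surprisal (negative log-probability, in nats) over tokens.

    `token_probs` holds the probability the model assigned to each
    successive token of an event description; higher mean surprisal
    means the event is more "surprising" to the model.
    """
    if not token_probs:
        raise ValueError("need at least one token probability")
    return -sum(math.log(p) for p in token_probs) / len(token_probs)

# A 1930-era model would assign low probability (high surprisal) to
# tokens describing post-1930 events, and higher probability to
# events it has effectively "seen" in its training data.
```

In practice the per-token probabilities would come from the model's softmax output over its vocabulary; the sketch only shows how they aggregate into a single surprisal score.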

Tags
research