LTM-1: an LLM with a 5,000,000 token context window
Magic Team, on 6/6/2023

Magic’s LTM-1 enables 50x larger context windows than transformers

Magic has trained a Large Language Model (LLM) that can take in gigantic amounts of context when generating suggestions. For our coding assistant, this means Magic can now see your entire code repository.

Towards trustworthy, grounded AI

Larger context windows allow AI models to reference more explicit, factual information and their own action history. We hope to use this research to improve reliability and coherence.


Magic’s AI coding assistant, powered by LTM Nets, is currently in closed Alpha.
