One Million Token Context Window

March 25, 2025

On June 11, 2024, former Google CEO Eric Schmidt spoke to students at the Hoover Institution at Stanford University. His interactive keynote focused on the intersection of artificial intelligence, national security, and the role of Silicon Valley.

The first thing he did in his keynote was to ask the audience a question:

"Can someone explain what a million token context window is?"

Put simply, a million token context window means that "you can ask the LLM a one million word question". (Strictly speaking, a token is roughly three-quarters of an English word on average, but the simplification gets the point across.)

🤯
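To make the words-to-tokens conversion concrete, here is a quick back-of-the-envelope sketch. It uses the common rule of thumb that one token is roughly three-quarters of an English word; the function names and the heuristic itself are illustrative, not an exact tokenizer.

```python
# Rule of thumb (an approximation, not a real tokenizer):
# 1 token ~ 3/4 of an English word, so 1,000,000 tokens ~ 750,000 words.

def estimate_tokens(word_count: int) -> int:
    """Estimate the token count for a text with `word_count` English words."""
    return round(word_count * 4 / 3)

def fits_in_context(word_count: int, context_window: int = 1_000_000) -> bool:
    """Check whether a text of `word_count` words fits in the context window."""
    return estimate_tokens(word_count) <= context_window

print(estimate_tokens(750_000))   # 1000000 — a ~750k-word "question" fills the window
print(fits_in_context(750_000))   # True
print(fits_in_context(900_000))   # False — ~1.2M tokens, over the 1M limit
```

By this rough math, a one million token window holds on the order of several long novels in a single prompt, and a ten million token window holds a small library.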

"We're going to 10," said Schmidt.

10 million token context window.

At the time, Anthropic's AI model Claude 3 had a 200,000 token context window. The incredible Claude 3.5 model hadn't even launched yet; it would be announced nine days after this talk at Stanford.

200,000 token context window going to 10 million. Schmidt was predicting a 50x increase.

Where are we today?

Fast forward to today, and we're seeing Schmidt's vision unfold faster than some may have expected. It's March 25, 2025, and Google just announced Gemini 2.5 Pro, which it claims is its "most intelligent AI model."

One line in Google’s press release really stood out to me:

"Gemini 2.5 builds on what makes Gemini models great — native multimodality and a long context window. 2.5 Pro ships today with a 1 million token context window (2 million coming soon)"

2 million token context window coming soon?

We have gone from 200,000 tokens to 1 million today, with 2 million around the corner. Schmidt's prediction of "going to 10" will be here before we know it.

The question I have for you is:

What are you going to do with these context windows with millions of tokens?
