
Before 2028, will there be enough inference capacity to generate 30T frontier model tokens per day?
40% chance
In "Situational Awareness: The Decade Ahead", Leopold Aschenbrenner claims:
"Another way of thinking about it is that given inference fleets in 2027, we should be able to generate an entire internet’s worth of tokens, every single day."
Resolves YES if, by the end of 2027, there is enough deployed inference capacity to generate 30 trillion tokens in a 24-hour period using a combination of frontier models, where "frontier model" means a model at the frontier in the sense that GPT-4 is as of mid-2024.
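The 30-trillion-tokens-per-day threshold is easy to sanity-check as a sustained throughput figure. The sketch below does the arithmetic; the per-accelerator rate is a purely hypothetical assumption for illustration, not a figure from the market or the report:

```python
# Back-of-envelope: sustained throughput implied by 30T tokens/day.
TOKENS_PER_DAY = 30e12
SECONDS_PER_DAY = 24 * 60 * 60  # 86,400

tokens_per_second = TOKENS_PER_DAY / SECONDS_PER_DAY
print(f"{tokens_per_second:,.0f} tokens/s")  # ≈ 347 million tokens/s

# Hypothetical: if one accelerator sustained 1,000 tokens/s on a frontier
# model (an assumed number, chosen only to make the scale concrete),
# the fleet would need roughly this many accelerators running around the clock:
ASSUMED_TOKENS_PER_SEC_PER_ACCELERATOR = 1_000
accelerators_needed = tokens_per_second / ASSUMED_TOKENS_PER_SEC_PER_ACCELERATOR
print(f"{accelerators_needed:,.0f} accelerators")  # ≈ 347,000
```

The market only asks whether deployed capacity could hit the aggregate rate, so any mix of models and hardware that sums to roughly 347 million tokens per second would qualify.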
This is one of a series of markets on claims made in Leopold Aschenbrenner's Situational Awareness report.
This question is managed and resolved by Manifold.
Related questions
Will an AI achieve >85% performance on the FrontierMath benchmark before 2028?
83% chance
Will any AI model achieve > 40% on Frontier Math before 2026?
89% chance
AI: Will someone train a $10T model by 2100?
59% chance
By March 14, 2025, will there be an AI model with over 10 trillion parameters?
11% chance
Will anyone train a TokenFormer model at scale before 2026?
25% chance
Will Inflection AI have a model that is 10x the size of the original GPT-4 at the end of Q1 2025?
14% chance
Will frontier AI effective training compute increase by a factor of 10 billion between 2025 and 2035?
62% chance
Will a new lab create a top-performing AI frontier model before 2028?
75% chance
Will we see a public GPU compute sharing pool for LLM training or inference before 2026?
86% chance
Will OpenAI release a tokenizer with more than 210000 tokens before 2026?
24% chance