
If we find out in 2024, was o1's Transformer base trained on at least 10× as much compute as GPT-4's?
19% chance
This question is managed and resolved by Manifold.
Related questions
Will Inflection AI have a model that is 10× the size of the original GPT-4 at the end of Q1 2025?
14% chance
Will there be an LLM (as good as GPT-4) that was trained with 1/100th the energy consumed to train GPT-4, by 2026?
82% chance
Will there be evidence in 2025 that in April 2023, OpenAI had a GPT-4.5 or higher model?
16% chance
In yottaFLOPs (10^24), how much compute will GPT-4 be trained with?
22 yottaFLOPs
Will a model be trained using at least as much compute as GPT-3 using AMD GPUs before Jan 1 2026?
84% chance
Will we have an open-source model that is equivalent to GPT-4 by the end of 2025?
82% chance
Will an open source model beat GPT-4 in 2024?
76% chance
How much compute will be used to train GPT-5?
Will a language model comparable to GPT-4 be trained with ~1/10th the energy it took to train GPT-4, by 2028?
92% chance
Will any open source LLM with <20 billion parameters outperform GPT-4 on most language benchmarks by the end of 2024?
13% chance