Will there be an LLM (as good as GPT-4) that was trained with 1/10th the energy consumed to train GPT-4, by 2026?
85% chance
The total energy consumed to train GPT-4 can be estimated at roughly 50-60 million kWh.
1/10th of this energy = 5-6 million kWh
1/100th of this energy = 0.5-0.6 million kWh
See calculations below:
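The threshold arithmetic above can be sketched directly; the 50-60 million kWh range is the market's own estimate, not a measured figure.

```python
# Energy thresholds implied by the market's GPT-4 training estimate.
# gpt4_energy_kwh is the estimated range stated in the question description.
gpt4_energy_kwh = (50e6, 60e6)  # 50-60 million kWh (estimate)

# Resolution thresholds for the 1/10th and 1/100th markets.
tenth = tuple(e / 10 for e in gpt4_energy_kwh)       # (5000000.0, 6000000.0)
hundredth = tuple(e / 100 for e in gpt4_energy_kwh)  # (500000.0, 600000.0)

print(f"1/10th:  {tenth[0]:,.0f} - {tenth[1]:,.0f} kWh")
print(f"1/100th: {hundredth[0]:,.0f} - {hundredth[1]:,.0f} kWh")
```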
This question is managed and resolved by Manifold.
Related questions
Will there be an LLM (as good as GPT-4) that was trained with 1/100th the energy consumed to train GPT-4, by 2026?
53% chance
Will any open source LLM with <20 billion parameters outperform GPT-4 on most language benchmarks by the end of 2024?
9% chance
Will there be an open source LLM as good as GPT4 by the end of 2024?
68% chance
Will there be an open source LLM as good as GPT4 by June 2024?
18% chance
Will there be a OpenAI LLM known as GPT-4.5? by 2033
35% chance
Will an open-source LLM beat or match GPT-4 by the end of 2024?
81% chance
Will a 15 billion parameter LLM match or outperform GPT4 in 2024?
24% chance
Which next-gen frontier LLMs will be released before GPT-5? (2025)
Will xAI develop a more capable LLM than GPT-5 by 2026
59% chance
Will we have an open-source model better than GPT-4-Turbo before 2025?
95% chance