How much FLOP will be used to train the best language model with freely available weights on July 1, 2025?
<1e24: 3%
[1e24, 3e24): 3%
[3e24, 1e25): 7%
[1e25, 3e25): 8%
[3e25, 1e26): 38%
[3e26, 1e27): 30%
[1e27, 3e27): 5%
[3e27, 1e28): 2%
[1e28, 3e28): 1.8%
[3e28, 1e29): 1.3%
>=1e29: 1.3%
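The bucket probabilities above can be summarized by the market-implied median. A minimal sketch (assuming the percentages as listed, which sum to roughly 100.4% because of rounding, so they are normalized first) that finds the bucket containing the median forecast:

```python
# Bucket labels and probabilities as listed on the market page.
buckets = [
    ("<1e24", 3.0),
    ("[1e24, 3e24)", 3.0),
    ("[3e24, 1e25)", 7.0),
    ("[1e25, 3e25)", 8.0),
    ("[3e25, 1e26)", 38.0),
    ("[3e26, 1e27)", 30.0),
    ("[1e27, 3e27)", 5.0),
    ("[3e27, 1e28)", 2.0),
    ("[1e28, 3e28)", 1.8),
    ("[3e28, 1e29)", 1.3),
    (">=1e29", 1.3),
]

# Normalize (listed percentages sum to ~100.4 due to rounding),
# then walk the cumulative distribution to the 50% point.
total = sum(p for _, p in buckets)
cumulative = 0.0
median_bucket = None
for label, p in buckets:
    cumulative += p / total
    if cumulative >= 0.5:
        median_bucket = label
        break

print(median_bucket)  # → [3e25, 1e26)
```

By this reading, the market's median forecast sits in the [3e25, 1e26) FLOP bucket, with most of the remaining mass one bucket above it.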
This question is managed and resolved by Manifold.
Related questions
Will any open source LLM with <20 billion parameters outperform GPT-4 on most language benchmarks by the end of 2024? (9% chance)
Will a machine learning training run exceed 10^25 FLOP in China before 2025? (72% chance)
Will a lab train a >=1e26 FLOP state space model before the end of 2025? (22% chance)
End of pre-training era for language models: Will an LM fine-tune for more FLOPs than it is pre-trained for, before 2026? (22% chance)
Will a machine learning training run exceed 10^26 FLOP in China before 2027? (53% chance)
Will a machine learning training run exceed 10^27 FLOP in China before 2028? (44% chance)
How many FLOPs will go into training the first ASL-3 model?
Will a machine learning training run exceed 10^26 FLOP in China before 2026? (52% chance)
Will a machine learning training run exceed 10^26 FLOP in China before 2028? (64% chance)
Will it cost less than 100k USD to train and run a language model that outperforms GPT-3 175B on all benchmarks by the end of 2024? (85% chance)