If there exists a super-intelligent AI, would a majority of AI researchers answer Yes to "Have we reached AGI?"?
63%
chance
Super-intelligent AI:
"Something along the lines of -> smarter than humans at most cognitive tasks, very very good at some key tasks, and can afford to be indifferent to anything it can't do." (@Duncn's comment)
"AI that is better than the majority of humans at most economically valuable tasks, but not necessarily better than the best humans at all of those tasks."
(I created this market to gauge opinion for @Primer's question)
This question is managed and resolved by Manifold.
@ShadowyZephyr Resolves whenever there is both such a survey and such a super-intelligent AI; until then, the market trades according to what that survey will point to.
Related questions
In which year will a majority of AI researchers concur that a superintelligent, fairly general AI has been realized?
Will we have an AGI as smart as a "generally educated human" by the end of 2025?
53% chance
Will artificial superintelligence exist by 2030? [resolves N/A in 2027]
38% chance
Will we have at least one more AI winter before AGI is realized?
38% chance
Will AI be capable of superhuman persuasion well before (>1yr) superhuman general intelligence?
72% chance
Will AI create the first AGI?
39% chance
Conditional on there being no AI takeoff before 2030, will the majority of AI researchers believe that AI alignment is solved?
35% chance
Will a sentient AI system have existed before 2030? [Resolves to 2100 expert consensus]
38% chance
The probability of extremely good AGI outcomes (e.g. rapid human flourishing) will be >24% in the next AI experts survey
54% chance
Will Artificial General Intelligence (AGI) lead directly to the development of Artificial Superintelligence (ASI)?
76% chance