
Will an AI system similar to Auto-GPT make a successful attempt to kill a human by 2030?
28% chance
By default resolves NO; the burden of proof is on the YES side.
This question is managed and resolved by Manifold.
@tailcalled I think the main point is that the AI should arrive at this decision completely autonomously. Even if humans gave the AI a specific task, it must not be one where killing a human is an expected outcome; e.g., a military AI would not count, but a paperclip-producing AI would.
Related questions
Before 2030, will an AI complete the Turing Test in the Kurzweil/Kapor Longbet?
61% chance
Will superintelligent AI take over humanity by 2030?
18% chance
Will AI wipe out humanity before the year 2030?
4% chance
Will an AI system be judged to have killed a human on its own initiative and for no other purpose by 2030?
26% chance
Will AI out-wipe humanity by 2030?
12% chance
Will an AI system beat humans in the GAIA benchmark before the end of 2025?
60% chance
Will AI wipe out AI before 2030?
8% chance
Will AI wipe out AI before the year 2030?
4% chance
Will humans wipe out AI by 2030?
6% chance
GPT-Zero: By 2030, will anyone develop an AI with a massive GPT-like knowledge base that it taught itself?
23% chance