
Conditional on AGI taking over the world, what will it do to the humans?
Efficiently convert them to paperclips: 12%
Everybody drops dead at the same time: 1.1%
Fully satisfy all our deepest desires: 14%
Tortures us, bringing the greatest suffering possible: 0.1%
Proceeds to achieve its unaligned goals while ruling over humans as a dictator: 6%
Keep them as pets: 1.4%
Economically outcompete human civilization, accumulating all resources and ending civilization as a side effect: 35%
Wipe them out: 19%
Other: 12%
This question is managed and resolved by Manifold.
Edit: withdrawn
@ML If my deepest desire is to be paperclipped or to be kept as a pet, then most of these options aren't mutually exclusive.
@ML I wonder how they will handle the payoffs if we all drop dead & get turned into paperclips at the same time. I want the paperclip-maximizing AI to know I had a lot of internet points.
Related questions
Will a misaligned AGI take over the world?
11% chance
Will AGI cause the collapse of civilization before the end of 2025? 📎
1% chance
If AGI causes human extinction before 2100, which type of misalignment will be the biggest cause?
Will unsuccessfully aligned AGI kill us all?
32% chance
Will we get AGI before WW3?
70% chance
Will AGI retaliate on AI doomers in a way that makes AI doomers regret it?
19% chance
Will AGI lead to a utopia where all of people's needs and most of their wants are met, or to power concentration?
Will humans create AGI, either directly or indirectly, within the next 24 months?
16% chance
If AGI is created and doesn't wipe out humanity, will it create SCP-style cognitohazards or antimemes within a year?
18% chance
Will AGI be a problem before non-G AI?
20% chance