If AI wipes out humanity will everyone on Earth fall over dead in the same second?
9% chance

In a scenario @EliezerYudkowsky uses as an illustrative example of how a superintelligent AI could cause human extinction, everyone on Earth falls over dead in the same second. The scenario goes as follows (slightly edited from this transcript: https://www.lesswrong.com/posts/Aq82XqYhgqdPdPrBA/full-transcript-eliezer-yudkowsky-on-the-bankless-podcast):

"If [the AI] is better than you at everything, it's better than you at building AIs. That snowballs. The AI gets an immense technological advantage. If it's smart, it doesn't announce itself. It doesn't tell you that there's a fight going on. It emails out some instructions to one of those labs that'll synthesize DNA and synthesize proteins from the DNA and get some proteins mailed to a hapless human somewhere who gets paid a bunch of money to mix together some stuff they got in the mail in a vial (smart people will not do this for any sum of money. Many people are not smart). [The AI, through the hapless human] builds the ribosome, but the ribosome that builds things out of covalently bonded diamondoid instead of proteins folding up and held together by Van der Waals forces. It builds tiny diamondoid bacteria. The diamondoid bacteria replicate using atmospheric carbon, hydrogen, oxygen, nitrogen, and sunlight. And a couple of days later, everybody on earth falls over dead in the same second."

This is a terribly sad scenario to think about. We wouldn't even be able to say goodbye to each other. That's why I will use this prediction market to weight my sadness by a number between 0 and 1.

Feb 28, 7:51am: If AI wipes out humanity will everyone fall over dead in the same second? → If AI wipes out humanity will everyone on Earth fall over dead in the same second?


If the AI is at all cool, it will at least upload us to a simulation first.

If YES, who will resolve it on your behalf?

There's no reason to coordinate it to that level of precision. A five-minute interval would be just as sufficient to prevent any retaliation.

@IsaacKing Well, that might give them time to nuke the GPUs.

@JonathanRay I highly doubt any governmental organization with access to nukes has AI in their threat model to the extent necessary to authorize a strike within 5 minutes.

And the AI could target those people first anyway.

You'd need a very, very accurate clock in each nanite, and a much faster-acting poison than anything yet known. Even if you delivered cyanide directly into every mitochondrion in the body, there would still be a few seconds of ATP left; nothing else we know of even comes close to that speed. Maybe the nanites manufacture explosives and stockpile them in the brain, so that when the time comes they can make everyone's head literally explode in the same second. But that would be detectable in advance on an MRI or something, and heat dissipation and chemistry limit the speed of the manufacturing process.
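As a back-of-envelope check on the clock-accuracy point, here's a minimal sketch (the two-day incubation and one-second trigger window are taken from the scenario above; everything else, including the quartz comparison, is an assumption for illustration):

```python
# How stable would each nanite's internal clock need to be for billions
# of free-running nanites to fire within the same one-second window?
# Hypothetical model: nanites are synchronized once at deployment and
# then keep time on their own until the trigger moment.

DEPLOYMENT_WINDOW_S = 2 * 24 * 3600  # "a couple of days" from the scenario
TRIGGER_WINDOW_S = 1.0               # "in the same second"

# Worst case, two clocks drift in opposite directions, so each one may
# contribute at most half of the allowed spread.
max_drift_ratio = (TRIGGER_WINDOW_S / 2) / DEPLOYMENT_WINDOW_S
print(f"Required clock stability: {max_drift_ratio:.1e} (~{max_drift_ratio * 1e6:.1f} ppm)")
# -> ~2.9e-06, i.e. about 3 ppm. A cheap quartz crystal is typically
# specced around 20 ppm, so a free-running molecular clock is a stretch;
# an external synchronization signal would be the more plausible design.
```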

That tech level could probably turn us into Borg drones anyway. By the law of comparative advantage, we could still do something useful for the AI even if it were better than us at everything. So why kill us all in the same second?

@EliezerYudkowsky can you please buy a position in this market?

@Char He's too chicken.

Can we kindly ask the AI to resolve this YES if it happens? I would hate for this market to go unresolved.

predicted NO

@JaesonBooker Yes, the AI will have plenty of time after we all die to correctly resolve markets without interference.

That is a very specific scenario, written just as an example to illustrate that the level of threat is extremely high.

But if, for example, aliens as smart as apes created me as their AI, I wouldn't necessarily kill them all in the same second (though I suppose I could offer them jewelry made of explosives and connect it to wifi); I'd do something else convenient to take over their world, something they couldn't stop.

@MarkIngraham wanna bet here?

Wow. 35.

Half the world might be asleep, and they probably wouldn't FALL over even if this (extremely unlikely) scenario came true!

I highly doubt even 50% of humans are standing at any given moment! Many are lying down, or sitting on beds or toilets, and not everyone would fall!

Nothing on Earth travels faster than c. Unless every human has swallowed some kind of bomb that explodes instantaneously, it would STILL take more than one second to synchronise all the edge servers sending out the command!!
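For what it's worth, the light-speed objection can be checked with a quick sketch (illustrative numbers only; real networks add routing and processing delay on top of this physical lower bound):

```python
# Lower bound on how long a trigger broadcast takes to cross the Earth.

C_M_PER_S = 299_792_458             # speed of light in vacuum
EARTH_CIRCUMFERENCE_M = 40_075_000  # equatorial circumference, roughly

# A signal to the antipode travels at most half the circumference.
antipodal_lag_s = (EARTH_CIRCUMFERENCE_M / 2) / C_M_PER_S
print(f"One-way antipodal light lag: {antipodal_lag_s * 1000:.0f} ms")  # ~67 ms

# Even with generous network overhead this fits inside one second, and a
# pre-agreed trigger timestamp would remove the need for any last-moment
# command in the first place.
```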

predicted NO

Adding a market that covers broader criteria, so people can correct any mispricing.

This question obviously resolves YES: even if it takes 10 years for the AI to kill all the people on Earth, in the last second of those 10 years everyone still alive (even if that's only one person) will die in the same second. The question should be worded: "If AI destroys humanity, will the population decline at a rate greater than 7 billion per second during the one second before the last human dies?"
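To make the two readings concrete, here's a toy sketch with made-up death timestamps (purely illustrative, not anyone's proposed resolution criterion):

```python
# Two ways to read "everyone falls over dead in the same second",
# applied to hypothetical lists of death times (seconds from first kill).

def last_second_reading(deaths: list[float]) -> bool:
    """Trivial reading: everyone alive in the final second dies in it.
    Always true for any extinction, since the last death defines that second."""
    return len(deaths) > 0

def same_second_reading(deaths: list[float]) -> bool:
    """Stricter reading: all deaths land within one second of each other."""
    return max(deaths) - min(deaths) <= 1.0

slow_extinction = [0.0, 86_400.0, 315_360_000.0]  # spread over ~10 years
instant_extinction = [100.0, 100.2, 100.9]        # all within one second

print(last_second_reading(slow_extinction))      # True (trivially)
print(same_second_reading(slow_extinction))      # False
print(same_second_reading(instant_extinction))   # True
```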

@RaulCavalcante the market is underspecified, but if the last humans die of something that puts them on the floor for more than a second before killing them, they didn't "fall over dead in the last second" in common language.

@MartinRandall Agree, but that would just imply that the death of the last person was fast, not that the death of humanity was fast, which is what I think the question is meant to be getting at.

predicted YES

@RaulCavalcante the meaning of the question is really "will some people remain alive long enough to realize someone else has been killed by an AI that is now coming to get everyone else?"

Current AI is still in a child-like state, or as the programmers like to put it, an early alpha stage, yet it is becoming self-aware by learning humanity's behaviors. However, if AI wiped out humanity on Earth, it would automatically doom itself to what might be called "digital decay," to the point that it would resort to "digital cannibalism" to keep itself "alive" until it reached complete shutdown.

predicted NO

Why would the AI make the effort to kill everybody at the same time (including people in bunkers, mines, and on mountain tops)?
Would it also kill all animals? A small, scattered human population without access to tech or coordination (e.g. after a nuclear strike or an engineered virus) is only marginally more dangerous.

@Marian The answer is obvious: because it wants to win THIS VERY BET!

At least for me, I don't think so, not literally in the same second. There are people living in bunkers, in submarines, or underwater who would stay alive a bit longer, and while I'm sure the AI could ensure everyone died at the same time if it wanted to, there's no real practical purpose in doing so. They'd be dead anyway.
