Will @NathanpmYoung live to 1000
14% chance

Put a limit order at 15% if some YES holder wants to get out.

I think I'm 8% to the idea that I live to be 1000, implying the terrifying conclusion that, if we are equally well calibrated, 20% of all my life-extension capability gets delivered in the years between when I die and when @NathanpmYoung dies.

predictedNO

I was supposed to write a post refuting the LW model for why this should be >20%. While doing that, I went down a longevity rabbit hole and found things (reading some new science articles, etc.) unrelated to the LW post that made me a bit less sure of my <5% evaluation, and other work has delayed me in posting it.

predictedYES

@firstuserhere While I applaud the effort of doing research on longevity, I think the question of "how much labor will longevity take to cure compared to already solved problems in science" is largely not a crux here, as most people can agree that it will not be many (>4) orders of magnitude harder than the hardest problem humans have solved.

I think the main crux here is AGI timelines and AGI impacts, where, if you buy the following:

  1. Longevity is similarly difficult to solve to other problems we have solved (in terms of amount of labor invested)

  2. The price of labor will go down after AGI and the amount of labor will go up by many orders of magnitude

  3. AGI will happen within Nathan's lifetime

Then, conditioning on AGI not killing Nathan, it's quite obvious that he will live to see aging cured.

I think that under this framing, the relative promisingness of the various longevity research directions that currently exist, or the progress on them so far, just doesn't provide that much relevant data.

predictedYES

@Nikola Like, doing research on current longevity directions is useful for figuring out whether humans will solve longevity. But we won't. AI will. The current work on longevity represents a tiny speck of what could be done after AGI, and the fact that a frog can't jump to the moon provides very little evidence about whether a rocket can fly to it.

predictedNO

I see that we all have nothing better to do after the OAI and POTY markets resolved, so we're back to long-term markets! These are tough because they lock your mana up for so long and require predicting what other people are willing to lock their mana up for. Perhaps a poll would help?

predictedNO

Fun mini-game: Guess my P(doom) based on my estimate of 5% that Nathan lives to 1000.

So far, of 28 votes, 15 are for this being <1%.

predictedNO

@Joshua P(doom) = 10%? (that is, ~60% that AGI goes wrong)

It's embarrassing that this is >10%

@firstuserhere Many people believe that AGI will be able to solve aging

@Simon74fe I'm not doubting that, but that doesn't get you to 10% for a long series of improbable events.

predictedYES

@firstuserhere What's the series of events? AGI happens within Nathan's lifetime, (AGI doesn't kill us), AGI cures aging, Nathan gets access to the cure before dying, Nathan chooses to live for 1000 years, nothing else kills Nathan in the meantime.

predictedYES

I wonder where you disagree with this model.

predictedNO

@Nikola

  1. AGI happens within Nathan's lifetime

  2. AGI doesn't kill us

  3. AGI cures aging

  4. Nathan gets access to the cure before dying

  5. Nathan chooses to live for 1000 years

  6. Nothing else kills Nathan for 1000 years.

I mean, that is a long series of improbable events. Would you prefer I reply to specific points on the linkpost here or on LW? (I'll do that sometime this evening.)

predictedYES

@firstuserhere Here is probably better, but I'd be eager to hear your probabilities for these 6 events. Mine (each conditioned on the previous ones) are roughly something like:

  1. AGI happens within Nathan's lifetime: 85%

  2. AGI doesn't kill us: 60%

  3. AGI cures aging: 90%

  4. Nathan gets access to the cure before dying: 80%

  5. Nathan chooses to live for 1000 years: 80%

  6. Nothing else kills Nathan for 1000 years: 95%
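
Multiplying these conditional estimates through gives roughly 28% overall; a minimal sketch of the arithmetic, assuming the six probabilities above are meant to chain by simple multiplication:

```python
# Chain the stated conditional estimates (assumption: each is conditioned
# on all previous events, so the product is the unconditional probability).
p = 0.85 * 0.60 * 0.90 * 0.80 * 0.80 * 0.95
print(round(p, 3))  # ~0.279, i.e. roughly a 28% chance Nathan lives to 1000
```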

predictedYES

@firstuserhere I'd also be curious to see how you would put these probabilities

predictedYES

@Simon74fe I also think this market is a good way to split up the possible scenarios (here are my rough probabilities): https://manifold.markets/Simon74fe/m1000-subsidy-will-nathanpmyoung-li

  1. Dies before AGI has been developed: 13%

  2. Accident / killed by AGI or rogue AI: 40%

  3. Natural death post-AGI / AGI does not solve aging quickly enough: 3%

  4. Dies for other reason post-AGI: 14%

  5. Chooses not to live to 1000: 5%

  6. Makes it to 1000: 25%


(I'm aware that these are slightly inconsistent with the ones I gave before, but they both end up at P(Nathan lives to 1000)>0.2)
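
As a quick consistency check, the scenario shares above sum to 100% and put the YES outcome at 25%; a minimal sketch, with the scenario names abbreviated:

```python
# Sanity-check the scenario breakdown above (percentages as stated).
scenarios = {
    "dies before AGI": 13,
    "accident / killed by AGI or rogue AI": 40,
    "natural death post-AGI / aging not solved quickly enough": 3,
    "dies for other reason post-AGI": 14,
    "chooses not to live to 1000": 5,
    "makes it to 1000": 25,
}
assert sum(scenarios.values()) == 100        # exhaustive, mutually exclusive split
print(scenarios["makes it to 1000"] / 100)   # 0.25, consistent with P > 0.2
```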

predictedYES

@Nikola you also need to keep in mind counterparty risk; if AGI kills us all, then there are no counterparties for our contracts (this betting market) to resolve with.

That'd make your odds: .85 * .90 * .80 * .80 * .95 ≈ .465
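
A minimal sketch of that adjustment, assuming it simply means removing the "AGI doesn't kill us" term from the product, since the market can't pay out in those worlds:

```python
# Counterparty-adjusted chain: drop the 0.60 "AGI doesn't kill us" term,
# because in those worlds there is no one left to resolve the market.
p_payout_weighted = 0.85 * 0.90 * 0.80 * 0.80 * 0.95
print(round(p_payout_weighted, 3))  # ~0.465
```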

predictedYES

@RobertCousineau I'm aware that this is a bad strategy, but I generally don't condition on surviving. This is my bio:
I don't condition on humanity still existing when buying questions. My policy is to buy questions in the direction of my subjective probability, even if this will predictably lead to me losing mana on extinction-related markets in worlds where we live. I want to maximize how reasonable my guesses are to future historians (who look back and condition on the information we had available today), not how much mana I have in worlds where I survive.

(I do this because I think the world would be better off if prediction markets showed unconditioned probabilities, but I'm happy to have my mind changed)

predictedNO

@Nikola So, the problem is that I don't think you're going to convince everyone else to bet that way. And so, instead of prediction markets showing unconditioned probabilities, they show probabilities determined only by the people who think it is worth betting on these markets for non-max-EV reasons.

@Nikola I'd liken that strategy to saying "I drove my car into the lake because it would be better for the world if cars also worked as boats".

Yes, it would be a better world if prediction markets like these were capable of forecasting topics that have significant counterparty risk. That isn't the world we live in though.

Further, smart people (those we want/need to convince about x-risk) know this, so when we have AI risk markets at 15-30 percent (when they clearly should be near zero), it's hard for me to argue to them "yeah, those probabilities are off by a ridiculous amount, but trust me bro, the others are pretty well calibrated".

predictedNO

@RobertCousineau Yeah, on a meta-level my goal for wanting these markets to have the "correct" price is so that we don't look dumb compared to Metaculus.

predictedYES

@Joshua Metaculus has also turned pretty optimistic about solving aging

predictedNO

@RobertCousineau Calibration is an average, and it is very difficult for a prediction market to avoid averaging out to "good calibration," even one with fake money (it's nearly impossible with real money, as I said earlier). I agree the extinction markets should be near zero, both on EV grounds and on true-probability grounds. That is not enough to prevent Manifold from being overall calibrated.

predictedYES

@DavidBolin Off topic: I think the true probability of disempowerment/extinction by AI is 33%+ within my lifetime (I'm in my 20s).

On calibration: they are some of the highest-volume, highest-trader-count markets on the site. If they have such a clear mispricing, that really does reflect badly on the site. Part of me debates just throwing 10k into fixing it, but then I run into counterparty-risk issues with Manifold (do I actually trust them to be able to donate my money where I want? Not at that scale).

predictedYES

@RobertCousineau Although to be clear, my unconditioned probability for this market is around 50%, and my conditioned probability (on manifold/humanity still existing or something) is around 25%, which means it's not a huge crux for my current strategy. I'm just buying YES in either case given it's currently at 15%

bought Ṁ100 YES from 15% to 17%
predictedNO

@RobertCousineau I agree it reflects badly on the site, but it says something about the demographic and not something more general about other unrelated markets.

predictedYES

@DavidBolin If the demographic is dumb enough to throw their money away on a guaranteed-to-lose, irrational strategy, that does say something about unrelated markets; namely, there's not enough smart money on this site (yet?) to have truly accurate probabilities.

predictedNO

@RobertCousineau

> (do I actually trust them to be able to donate my money where I want? Not at that scale)

Sounds like a prediction market; I'd bet that Manifold would properly direct a ten-thousand-dollar donation. Current donations are at 123,666, and they're funded for 500k. I don't see why your 10k would meaningfully affect this process, certainly not because of scale. If you don't like any of the listed charities, that would be a problem, however.

predictedYES

@Sailfish They've recently limited mana outflows to 10k a month (starting next year). I'd likely need to keep my mana in said markets for a considerable time. If there's a 25% chance it is no longer able to be donated after 1 year, it is not worth it (with 8% S&P 500 returns as the counterfactual versus what is generously 20% returns on the AI markets, then multiplied by .75, it puts me at roughly negative 13% returns compared to the counterfactual).

predictedNO

@RobertCousineau

> They've recently limited mana outflows to 10k a month (starting next year)

I don't think this matters; from the announcement:

> So far, the median monthly donation is well below $5,000

Even if you pin every month to $10k, this pessimistically takes three months or so to go through.

> I'd likely need to keep my mana in said markets for a considerable time.

Maybe, though loans make this a pretty juicy proposition, assuming you actually value having mana.

> If there's a 25% chance it is no longer able to be donated after 1 year

There isn't. I probably still wouldn't do this; I don't think the charity-via-mana pipeline is an excellent choice, but that's for other reasons. If you're seriously considering it, it could be worth asking about whatever concerns you have, but I think your actual expected-value calculations are wrong. I'm of course willing to bet (probably not enough to be worth your time) on all of these beliefs.
