Conditional on not having died from unaligned AGI, I will consider myself a full-time alignment researcher by the end of 2030
34% chance
I suspect that the most likely ways this market resolves NO are burnout or running out of funding. However, traders should not limit themselves to these mechanisms.
Relevant market: https://manifold.markets/AlanaXiang/will-i-consider-myself-a-fulltime-a
I do not intend to buy shares in this market (either YES or NO).
This question is managed and resolved by Manifold.
Related questions
In 2025, will I believe that aligning automated AI research AI should be the focus of the alignment community?
59% chance
Will I have a career as an alignment researcher by the end of 2024?
38% chance
Conditional on there being no AI takeoff before 2030, will the majority of AI researchers believe that AI alignment is solved?
35% chance
Will we get AGI before 2030?
55% chance
Will tailcalled think that the Brain-Like AGI alignment research program has achieved something important by October 20th, 2026?
36% chance
Will we get AGI before 2032?
65% chance
Will we reach "weak AGI" by the end of 2025?
26% chance
Conditional on there being no AI takeoff before 2050, will the majority of AI researchers believe that AI alignment is solved?
51% chance
Will we get AGI before 2031?
60% chance
Will we get AGI before 2035?
70% chance