
Conditional on not having died from unaligned AGI, I consider myself a full time alignment researcher by the end of 2030
34% chance
I suspect the most likely ways this market resolves NO are burnout or running out of funding. However, do not limit yourself to these mechanisms when trading.
Relevant market: https://manifold.markets/AlanaXiang/will-i-consider-myself-a-fulltime-a
I do not intend to buy shares in this market (either YES or NO).
This question is managed and resolved by Manifold.
Related questions
Will I have a career as an alignment researcher by the end of 2024?
38% chance
Will I still work on alignment research at Redwood Research in 3 years?
60% chance
Conditional on there being no AI takeoff before 2030, will the majority of AI researchers believe that AI alignment is solved?
34% chance
Will Meta AI start an AGI alignment team before 2026?
35% chance
Conditional on there being no AI takeoff before 2050, will the majority of AI researchers believe that AI alignment is solved?
51% chance
By the end of 2025, which piece of advice will I feel has had the most positive impact on me becoming an effective AI alignment researcher?
Will tailcalled think that the Brain-Like AGI alignment research program has achieved something important by October 20th, 2026?
20% chance
Will taking annual MRIs of the smartest alignment researchers turn out alignment-relevant by 2033?
7% chance
Will we solve AI alignment by 2026?
8% chance
I make a contribution to AI safety that is endorsed by at least one high profile AI alignment researcher by the end of 2026
59% chance