
In Jan 2027, Risks from Artificial Intelligence (or similar) will be on 80,000 Hours' top priority list
94% chance
The top 10 recommended jobs, by some kind of ordering, on a page like this.
This question is managed and resolved by Manifold.
Related questions
In 2025, what % of EA lists "AI risk" as their top cause? (44% chance)
Will >90% of Elon re/tweets/replies on 19 December 2025 be about AI risk? (7% chance)
What AI safety incidents will occur in 2025?
In January 2026, how publicly salient will AI deepfakes/media be, vs AI labor impact, vs AI catastrophic risks?
Will humanity wipe out AI x-risk before 2030? (10% chance)
Will someone take desperate measures due to expectations of AI-related risks by January 1, 2030? (67% chance)
Will someone take desperate measures due to expectations of AI-related risks by January 1, 2035? (91% chance)
At the beginning of 2026, what percentage of Manifold users will believe that an AI intelligence explosion is a significant concern before 2075? (66% chance)
Public opinion, late 2025: Out-of-control AI becoming a threat to humanity, a real threat?
The probability of "extremely bad outcomes e.g., human extinction" from AGI will be >5% in the next survey of AI experts (73% chance)