Is attention all you need? (transformers SOTA in 2027)
2027 · 48% chance

This market simulates the wager between Jonathan Frankle (@jefrankle) and Sasha Rush (@srush_nlp).

Details can be found at https://www.isattentionallyouneed.com/

Proposition

On January 1, 2027, a Transformer-like model will continue to hold the state-of-the-art position in most benchmarked tasks in natural language processing.

Other markets on the same topic:


In 2030, which AI paper will have the most citations?

What about hybrid models, like Jamba? They might be the best of both worlds.
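For context, a hybrid stack in the Jamba style interleaves Transformer attention blocks with non-attention sequence-mixing blocks. The sketch below is purely illustrative and not from the market page or the wager terms: `SimpleSSMBlock` is a hypothetical stand-in (a plain GRU), not Jamba's actual Mamba selective state-space layer; only the interleaved layout is the point.

```python
# Illustrative sketch of a hybrid (attention + recurrent) layer stack.
# The recurrent block is a GRU stand-in for an SSM layer, NOT real Mamba.
import torch
import torch.nn as nn

class AttentionBlock(nn.Module):
    """Standard pre-norm self-attention block with a residual connection."""
    def __init__(self, d_model: int, n_heads: int):
        super().__init__()
        self.norm = nn.LayerNorm(d_model)
        self.attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)

    def forward(self, x):
        h = self.norm(x)
        out, _ = self.attn(h, h, h)
        return x + out

class SimpleSSMBlock(nn.Module):
    """Hypothetical stand-in for a state-space layer (here: a GRU)."""
    def __init__(self, d_model: int):
        super().__init__()
        self.norm = nn.LayerNorm(d_model)
        self.rnn = nn.GRU(d_model, d_model, batch_first=True)

    def forward(self, x):
        out, _ = self.rnn(self.norm(x))
        return x + out

class HybridStack(nn.Module):
    """Alternate attention and non-attention blocks, Jamba-style."""
    def __init__(self, d_model: int = 64, n_heads: int = 4, n_pairs: int = 2):
        super().__init__()
        layers = []
        for _ in range(n_pairs):
            layers += [AttentionBlock(d_model, n_heads), SimpleSSMBlock(d_model)]
        self.layers = nn.Sequential(*layers)

    def forward(self, x):
        return self.layers(x)

x = torch.randn(2, 16, 64)            # (batch, seq, d_model)
print(HybridStack()(x).shape)         # torch.Size([2, 16, 64])
```

Whether such a stack counts as "Transformer-like" for resolution is exactly the question raised above.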

predicted YES

Yes, given that an architecture qualifies if it leverages a combination of Transformer models and supporting infrastructure components that wouldn't be considered breakthrough technologies on their own (e.g., RAG).

So do mixtures of experts count? The linked page does not contain any actual details.

bought Ṁ4 YES at 61%
predicted YES

@EchoNolan I talked to Sasha, and his response is basically that as long as the E in the MoE is a Transformer, it's a Transformer.
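To make that criterion concrete: in a switch-style mixture-of-experts layer, the router is new machinery, but each expert can be an ordinary Transformer feed-forward block. The sketch below is a minimal illustration under that assumption (top-1 routing, FFN experts); it is not code from either party to the wager.

```python
# Minimal top-1 (switch-style) MoE sketch where each expert "E" is a
# standard Transformer position-wise feed-forward block.
import torch
import torch.nn as nn
import torch.nn.functional as F

class TransformerFFNExpert(nn.Module):
    """One expert: the usual Transformer feed-forward block."""
    def __init__(self, d_model: int, d_ff: int):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(d_model, d_ff), nn.GELU(), nn.Linear(d_ff, d_model)
        )

    def forward(self, x):
        return self.net(x)

class MoELayer(nn.Module):
    """Route each token to its highest-scoring expert (top-1 gating)."""
    def __init__(self, d_model: int, d_ff: int, n_experts: int):
        super().__init__()
        self.router = nn.Linear(d_model, n_experts)
        self.experts = nn.ModuleList(
            [TransformerFFNExpert(d_model, d_ff) for _ in range(n_experts)]
        )

    def forward(self, x):
        tokens = x.reshape(-1, x.size(-1))               # flatten to (n_tokens, d_model)
        gate = F.softmax(self.router(tokens), dim=-1)    # (n_tokens, n_experts)
        weight, idx = gate.max(dim=-1)                   # top-1 expert per token
        out = torch.zeros_like(tokens)
        for e, expert in enumerate(self.experts):
            mask = idx == e
            if mask.any():
                out[mask] = weight[mask].unsqueeze(-1) * expert(tokens[mask])
        return out.reshape_as(x)

x = torch.randn(2, 8, 64)                                # (batch, seq, d_model)
print(MoELayer(d_model=64, d_ff=256, n_experts=4)(x).shape)  # torch.Size([2, 8, 64])
```

On Sasha's stated reading, the experts here are plainly Transformer components, so an MoE model like this would count toward the Transformer side of the bet.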

I have strong principled reasons this should stay at 50% for the next 24 hours: subsidy phase-in.

predicted YES

@jacksonpolack Hm, I will add in the subsidy at a later point, wherever the market stabilizes, to maintain that.
