Efficient fully homomorphic encryption for frontier model training by 2030?
2030 · 11% chance

Resolves YES if deep learning models at the compute frontier* could be** trained using fully homomorphic encryption (FHE) with a <10x slowdown before 2030/1/1.

[*] Say, within 1 OOM of the highest-compute model deployed.

[**] Frontier models don't actually need to be trained with FHE. Empirical evidence of smaller models trained with FHE at <10x slowdown, plus a heuristic argument (e.g. dumb extrapolation) that larger models would also satisfy this, will suffice.
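For concreteness, the extrapolation heuristic in [**] could look something like the sketch below: fit measured FHE-training slowdowns against training compute on a log-log scale and extrapolate to frontier scale. This is a hypothetical illustration; the data points, the log-linear fit, and the frontier compute figure are all made-up assumptions, not part of the resolution criteria.

    # Hypothetical sketch of the "dumb extrapolation" heuristic in [**]:
    # fit measured FHE-training slowdowns (illustrative numbers, not real
    # data) against training compute, then extrapolate to the compute
    # frontier and check the <10x bar.
    import numpy as np

    # (training FLOPs, measured FHE slowdown) for small models -- made up
    measurements = [(1e18, 6.0), (1e19, 6.5), (1e20, 7.2)]

    log_flops = np.log10([m[0] for m in measurements])
    log_slowdown = np.log10([m[1] for m in measurements])

    # Least-squares fit: log10(slowdown) = a * log10(FLOPs) + b
    a, b = np.polyfit(log_flops, log_slowdown, 1)

    frontier_flops = 1e26  # assumed frontier-scale training run
    predicted = 10 ** (a * np.log10(frontier_flops) + b)

    print(f"extrapolated slowdown at frontier: {predicted:.1f}x")
    print("resolves YES" if predicted < 10 else "resolves NO")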

predicted NO

I could do 20k on NO at 50 here, but that's a big enough % of my portfolio that I won't leave that up.

bought Ṁ10 NO

@jacksonpolack likewise, strong NO from me, but I'm too poor to bet big bucks long-term

Zachary Weiss bought Ṁ6 YES


@weissz The <10x slowdown is the unlikely part; it's a crazy "high" bar to hit.

https://arxiv.org/pdf/2202.02960.pdf "HElib is [...] almost 19M times slower for multiplication". Like, these are state-of-the-art results. I have no doubt that they'll continue to improve over time, but 19,000,000 is a loooooong way from 10.
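For scale, here is the gap between that figure and the market's bar as plain arithmetic (the 19M number is the paper's; the rest is just division):

    import math

    current_slowdown = 19_000_000  # "almost 19M times slower" (arXiv:2202.02960)
    target_slowdown = 10           # the market's <10x bar

    gap = current_slowdown / target_slowdown
    print(f"required speedup: {gap:,.0f}x (~{math.log10(gap):.1f} orders of magnitude)")
    # required speedup: 1,900,000x (~6.3 orders of magnitude)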

bought Ṁ90 NO from 12% to 11%

And that's for a single multiplication; I imagine the math involved in ML backprop would fare worse.
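One reason to expect that: under FHE only additions and multiplications come cheap, so the non-polynomial pieces of a training step (ReLU, softmax, etc.) have to be swapped for polynomial approximations, which adds both error and multiplicative depth. A minimal sketch of that trade-off, using a least-squares quadratic in place of ReLU (the interval and degree here are arbitrary choices for illustration; CryptoNets-style x^2 activations are one known workaround in the inference setting):

    import numpy as np

    xs = np.linspace(-3, 3, 601)
    relu = np.maximum(xs, 0.0)

    # Least-squares degree-2 polynomial standing in for ReLU on [-3, 3];
    # only + and * remain, which is what FHE schemes evaluate natively.
    coeffs = np.polyfit(xs, relu, 2)
    approx = np.polyval(coeffs, xs)

    print("fit: {:.3f}x^2 + {:.3f}x + {:.3f}".format(*coeffs))
    print(f"max |error| on [-3, 3]: {np.max(np.abs(approx - relu)):.3f}")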

big limit orders!
