Will we believe in 2050 that an instance of DeepSeek V3 run by 2025 possessed "consciousness"?
14% chance

I have placed a bet with a friend at 1:100 odds (in my favor) that at least one running instance of DeepSeek V3 Base before October 2025 instantiated whatever we will understand "consciousness" to be by 2050. This market publicly operationalizes that bet.

That is to say, it is not enough that there be a way in principle to use the weights of V3 to produce what might be called a conscious experience; we must have evidence that some way in which it was actually run before October 2025 (for instance, during standard forward passes) satisfied whatever condition we will later believe comprises consciousness.

If by 2050 the notion of "consciousness" is widely understood to be illusory or otherwise invalid, such that humans also should not be considered conscious, or if there is still substantial disagreement about the nature of consciousness, this market resolves N/A. If consciousness is instead understood as a collection of more fundamental properties, this market resolves YES only if a running instance of V3 possessed almost all of the component properties, leaning towards NO in any muddier circumstances. In other words, if "degree of consciousness" is quantifiable, a YES resolution requires that an actually-run instance of V3 reached roughly 90% of the way there, whatever that might be widely agreed to mean.

  • Update 2025-11-02 (PST) (AI summary of creator comment): Clarification on partial consciousness/degree of consciousness:

    • If V3 is found to be "about as conscious as an insect" or similar, this counts as YES only if we come to believe that insects/smaller minds have "most of what is important" (~90%) about being conscious

    • If consciousness has both quickly-satisfied components and components related to scalable sophistication of minds, the quickly-satisfied properties must comprise ~90% of what matters about consciousness for a YES resolution

    • The creator will resolve conservatively, preferring NO if we aren't getting to ~90% of the important properties of consciousness

  • Update 2025-11-03 (PST) (AI summary of creator comment): Resolution approach clarification:

The creator will not specify anything final about the resolution process, but will instead defer interpretation until 2050, when we are less confused about consciousness. The creator shares extrinsic intuitions about what the operationalization is trying to measure, stressing conservatism toward a NO resolution.

Examples that lean towards YES:

  • Consciousness is relatively binary and V3 forward pass meets the standard

  • Consciousness is intersection of properties (some binary, some continuous), and V3 satisfies almost all binary properties and makes great traction on continuous ones

  • Consciousness is unified and monotonic, and V3 instances possessed at least 90% of the quantitative degree of consciousness of a typical wakeful human mind

Examples that lean towards N/A:

  • Consciousness is not coherent or ontologically meaningful

  • Humans are almost never conscious

Examples that lean towards NO:

  • Substrate independence or functionalism is false in ways that exclude V3 Base running on a GPU


i could see it being conscious but 1000x less so than humans or something. would that count?

@Bayesian So, suppose we discovered that instances of V3 can be "about as conscious as an insect" or something, with the notion of consciousness having some components that are quickly satisfied and others that are closely related to the scalable sophistication of a mind. I'll say that the former standard should count if and only if we come to believe that insects / smaller minds have "most of what is important" (~90%) about being conscious.

So we get a YES resolution if almost all of the quickly-satisfied properties are present (if that is meaningful to say) and this is understood to comprise most of what we decide matters about consciousness. But thinking that the quick-to-satisfy components of consciousness comprise most of the concept is a relatively demanding standard: I'll resolve conservatively, preferring NO if we aren't getting to ~90% of the important properties, however that comes to be understood.

yeah in my mind it's pretty likely to be entirely about the scalable sophistication of a mind. iiuc if it does end up being about that, then the market would resolve based on whether that scaling is greater for v3 than the insect?

> "most of what is important" (~90%) about being conscious.
>
> if almost all of the quickly-satisfied properties are present
>
> comprise most of what we decide matters

I am skeptical of this entire line of thinking, but like, suppose a universal function approximator, when faced with linearly increasing data and size, had a logarithmic increase in some conceptualization of consciousness. Then would the idea be that any data+size has "90% of the properties" because the only property is function approximation, or would the idea be that any data+size that is at least 90% the consciousness of an insect would count (which would then just end up being a relationship of that insect's size and data vs the model's size and data)?

@Bayesian Hmm it’s very difficult to specify the question in a way that remains as agnostic as possible about the nature of consciousness. I think it’s useful to defer as much of the interpretation as possible here to a future time when we are less confused.

What I’ll do then is not specify anything final about the resolution process, but rather share a set of more extrinsic intuitions about what this operationalization is trying to measure. The goal is to be more robust to ontological shifts. I’ll mainly only stress conservatism toward a NO resolution. With that said:

The types of things that lean towards YES:

  • “Consciousness is a relatively binary property of some computational systems, and a V3 forward pass meets the standard.”

  • “Consciousness as it was normally understood turned out to be the intersection of a set of more fundamental properties, some of which are rather binary, others rather continuous. V3 forward passes satisfy almost all of the binary properties and make great traction on the continuous ones, perhaps to the degree that small animals turned out to have these continuous properties.”

  • “Consciousness is a rather unified and strictly monotonic property which can be possessed in different absolute degrees by computational systems, varying by orders of magnitude between say insects and humans. A computational artifact—constructed by meta-processes such as a particular model training algorithm or the development of a human brain—might have a degree of consciousness when run that is a particular increasing function of resource inputs. Consciousness is also a coherent enough concept that its degree can be compared between computational artifacts produced by different meta-processes. Particular ways that V3 was likely run before October 2025 possessed at least 90% of the quantitative degree of consciousness that might be ascribed to a typical wakeful human mind.”


Lean towards N/A:

  • “Consciousness is not a coherent or ontologically meaningful property.”

  • “Humans are almost never conscious.” (i.e. something that fiercely contradicts our introspections about consciousness even if consciousness turns out to be coherent).


Lean towards NO:

  • “Substrate independence or functionalism is false in ways that exclude V3 Base running on a GPU, even though consciousness is coherent and meaningful.”


@AdamK thanks!
