Will GPT-4's parameter count be known by end of 2024?
111 · Ṁ31k · Jan 1 · 5% chance

The parameter count must be officially released by OpenAI; leaks or off-hand comments do not count.

Extremely wide intervals ("between 1000 and 100 trillion") will not be accepted: endpoints must be within 25% of the mean. For example, "500-600 billion" would be acceptable because the mean is 550 billion, the ±25% band is [412.5, 687.5] billion, and the given interval is contained within it.
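For concreteness, here is a minimal Python sketch of that acceptance rule (the function name, signature, and the choice of billions as the unit are my own illustration, not part of the market):

```python
def interval_acceptable(low: float, high: float, tolerance: float = 0.25) -> bool:
    """Return True if both endpoints lie within `tolerance` (25%) of the interval's mean."""
    mean = (low + high) / 2
    return mean * (1 - tolerance) <= low and high <= mean * (1 + tolerance)

# Values in billions of parameters.
print(interval_acceptable(500, 600))       # True: [412.5, 687.5] contains [500, 600]
print(interval_acceptable(1e-6, 100_000))  # False: "between 1000 and 100 trillion" is far too wide
```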


Does this have to be GPT-4 itself, and not 4o or future variants? @Mira

Jensen Huang, CEO of NVIDIA, just revealed it to be 1.8T.

Plausible-sounding leak: https://twitter.com/Yampeleg/status/1678547812177330180

Based on paywalled content here: https://www.semianalysis.com/p/gpt-4-architecture-infrastructure

Edit: The tweet was taken down due to a copyright takedown request by SemiAnalysis. Archived: https://archive.is/2RQ8X

predicted NO

@josephrocca Leaks don't count (read description).

predicted NO

I believe the movement in the market since yesterday is from speculation that spread over Twitter (I learned of it this morning from Mastodon): https://twitter.com/soumithchintala/status/1671267150101721090?s=20

Given the shrinking moat, the very enthusiastic and competitive environment, and their stated concerns about this so far in 2023, I don't see this changing even if new regulations on LLM training arrive. I believe the parameter count will be kept a trade secret at least until model size truly becomes irrelevant for LLMs, whether through falling hardware costs or efficiency gains from software, which I put past the end of 2024 (we're only at the starting line of the singularity). Oops, meant to buy NO. Doh!

@parhizj fixed
