Will any open-source Transformers LLM that functions as a dense mixture of experts be released by end of 2024?
Ṁ18 · closes Jan 1
50% chance
Will any open-source or open-weights Transformers-based LLM emerge that is functionally a dense version of a mixture-of-experts model, i.e. whose empirical mathematical sparsity resembles that of dense models like Llama 3.1 405B or Mistral Large Enough? A tool that allows the creation of this type of model would also resolve YES, even if no model is released along with it, as long as it is actually possible to create such a model with it (for example, Mergekit, which supports various forms of model manipulation). A paper alone resolves YES only if accompanied by a released model, functional code, or a third-party implementation.
This question is managed and resolved by Manifold.
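To make the question's core idea concrete: "densifying" a mixture-of-experts layer means collapsing its per-token expert routing into a single feed-forward block that every token passes through. The sketch below shows one naive strategy, uniformly averaging the expert weights of a toy MoE feed-forward layer into one dense layer. This is purely illustrative (all names and the merge strategy are assumptions, not any specific tool's method); real merge tools such as Mergekit may use routing statistics or other merge methods instead.

```python
import numpy as np

rng = np.random.default_rng(0)
d_model, d_ff, n_experts = 8, 32, 4

# Toy MoE feed-forward layer: each expert is a pair of projection matrices.
experts = [
    {"w_in": rng.normal(size=(d_model, d_ff)),
     "w_out": rng.normal(size=(d_ff, d_model))}
    for _ in range(n_experts)
]

def densify(experts):
    """Naive densification: uniformly average expert weights into one
    dense feed-forward block (one illustrative strategy, not *the* method)."""
    return {
        "w_in": np.mean([e["w_in"] for e in experts], axis=0),
        "w_out": np.mean([e["w_out"] for e in experts], axis=0),
    }

dense = densify(experts)
# The result has the shape of a single dense FFN: no router, no sparsity.
print(dense["w_in"].shape, dense["w_out"].shape)  # (8, 32) (32, 8)
```

Every token now multiplies through the same averaged matrices, so the layer's empirical sparsity matches a dense model rather than an MoE.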
Related questions
When will a non-Transformer model become the top open source LLM?
Will Transformer based architectures still be SOTA for language modelling by 2026?
68% chance
Will Meta release a Llama 3 405B multi-modal open source before the end of 2024?
29% chance
When will OpenAI release a more capable LLM?
When will the first fully open-source advanced LLM (data, code, weights) be released?
Will superposition in transformers be mostly solved by 2026?
73% chance
Will Meta release an open-source language model that outperforms GPT-4 by the end of 2024?
67% chance
Are Mixture of Expert (MoE) transformer models generally more human interpretable than dense transformers?
50% chance
Will the most capable, public multimodal model at the end of 2027 in my judgement use a transformer-like architecture?
55% chance
By EOY 2025, will the model with the lowest perplexity on Common Crawl not be based on transformers?
27% chance