DBRX
by Databricks
DBRX is a transformer-based, decoder-only large language model developed by Databricks and released in March 2024. It uses a fine-grained Mixture-of-Experts (MoE) architecture with 132 billion total parameters, of which roughly 36 billion are active for any given input, making next-token prediction more compute-efficient than a dense model of comparable size. At release it outperformed established open-source models such as Llama 2 70B and Mixtral-8x7B on standard benchmarks. A minimal usage sketch follows below.
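
As a rough illustration, the sketch below loads the instruction-tuned DBRX checkpoint through Hugging Face `transformers` and generates a short reply. It assumes the `databricks/dbrx-instruct` checkpoint, a recent `transformers` release with DBRX support, acceptance of the model's license on the Hub, and enough GPU memory to shard the full 132B-parameter weights; details may differ in your environment.

```python
# Minimal sketch: running DBRX-Instruct with Hugging Face transformers.
# Assumes the databricks/dbrx-instruct checkpoint and multi-GPU hardware
# capable of holding the 132B-parameter (36B active) MoE weights.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("databricks/dbrx-instruct")
model = AutoModelForCausalLM.from_pretrained(
    "databricks/dbrx-instruct",
    torch_dtype=torch.bfloat16,
    device_map="auto",  # shard weights across available GPUs
)

# Build a chat prompt and generate a completion.
messages = [{"role": "user", "content": "Explain Mixture-of-Experts in one sentence."}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=128)
# Decode only the newly generated tokens, not the prompt.
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```

Because only a subset of experts is routed per token, inference compute scales with the active parameter count rather than the full 132B, which is the main efficiency argument for the MoE design.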