
SambaNova's cloud platform offers high-speed inference for open-source models such as Llama and DeepSeek, specializing in serving large models at high tokens-per-second throughput. It is aimed at enterprise teams that need maximum inference speed for open-weight model deployments.
Released: September 15, 2024
Parameters: Unknown
Context: 128K
Pricing: Paid