Reka Flash
Reka Flash is a 21B-parameter, 'turbo-class' multimodal model: fast, efficient, and competitive with larger proprietary systems such as Gemini Pro and GPT-3.5.
Reka Flash is Reka AI's high-performance, resource-efficient multimodal language model. The core model, typically seen in a 21B-parameter configuration (e.g., Reka Flash 3.1), is built for speed and quality on latency-sensitive workloads. It handles complex, interleaved inputs (text, image, video, and audio) with a context length of up to 128K tokens. Benchmarks support its 'turbo-class' positioning: it rivals or surpasses models such as Gemini Pro and Llama 2 70B on key evaluations, including MMLU and GPQA. The model is versatile, performing well on coding, agentic tasks, and multilingual reasoning across more than 32 languages.
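The interleaved multimodal input described above can be sketched as a request payload. The snippet below assumes an OpenAI-style chat schema with mixed text and image content parts; the model name and image URL are illustrative placeholders, not confirmed Reka API values.

```python
import json

# Hypothetical sketch: composing an interleaved text+image request in the
# OpenAI-style chat format that many providers accept. The model identifier
# and image URL below are assumptions for illustration only.
MODEL_NAME = "reka-flash-3.1"  # assumed model identifier

payload = {
    "model": MODEL_NAME,
    "messages": [
        {
            "role": "user",
            # Content is a list of typed parts, allowing text and images
            # to be interleaved in a single turn.
            "content": [
                {"type": "text", "text": "What is shown in this image?"},
                {
                    "type": "image_url",
                    "image_url": {"url": "https://example.com/photo.jpg"},
                },
            ],
        }
    ],
    "max_tokens": 256,
}

# Serialize the request body that would be POSTed to a chat endpoint.
print(json.dumps(payload, indent=2))
```

The same list-of-parts structure extends naturally to video or audio parts where a provider supports them; only the `type` field and payload of each part change.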