Cohere Tiny Aya

A 3.35-billion-parameter open-weight model family optimized for multilingual text generation across 70+ languages.

Cohere Tiny Aya is a specialized family of small language models (SLMs) designed to deliver robust multilingual capabilities on consumer-grade hardware. With a 3.35B-parameter architecture and an 8K context window, the model family includes five distinct variants (Base, Global, Earth, Fire, and Water) tailored to specific regional language groups and deployment needs. It emphasizes low-resource language representation (supporting languages such as Hindi, Arabic, and Tamil) while maintaining a compact 2.14 GB memory footprint. Trained on a cluster of 64 H100 GPUs with a redesigned language-balanced tokenizer, Tiny Aya provides an efficient option for on-device translation, summarization, and conversational AI without requiring cloud connectivity.

https://huggingface.co/CohereLabs/tiny-aya-global
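The 2.14 GB footprint quoted above is well below what 3.35B parameters would occupy at full precision, which suggests reduced-precision weights for on-device use. A minimal back-of-envelope sketch (the helper name and the dtype choices are illustrative assumptions, not from the page):

```python
# Hypothetical sketch: rough weight-memory estimates for a 3.35B-parameter
# dense model at different precisions, to show why quantization matters
# for a ~2 GB on-device footprint.

def param_memory_gb(n_params: float, bytes_per_param: float) -> float:
    """Approximate weight memory in GiB (ignores activations and overhead)."""
    return n_params * bytes_per_param / 2**30

N = 3.35e9  # Tiny Aya parameter count (from the page above)

fp16 = param_memory_gb(N, 2.0)   # 16-bit weights
int4 = param_memory_gb(N, 0.5)   # 4-bit quantized weights

print(f"fp16: {fp16:.2f} GiB, 4-bit: {int4:.2f} GiB")
```

At fp16 the weights alone would be roughly 6.2 GiB, while 4-bit quantization lands near 1.6 GiB; the quoted 2.14 GB sits between these, consistent with aggressive quantization plus some overhead.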