
BGE-small-en-v1.5

A high-efficiency, 33.5-million-parameter English embedding model that ranks near the top of the MTEB leaderboard for retrieval tasks.

Developed by the Beijing Academy of Artificial Intelligence (BAAI), BGE-small-en-v1.5 delivers strong performance in a compact 133MB footprint. It encodes sequences of up to 512 tokens into 384-dimensional vectors, balancing low latency with high accuracy for RAG pipelines and semantic search. The model holds top-tier rankings on the Massive Text Embedding Benchmark (MTEB), outperforming significantly larger architectures on retrieval, reranking, and clustering metrics.

https://huggingface.co/BAAI/bge-small-en-v1.5
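As a minimal sketch of how the 384-dimensional vectors described above are typically used for semantic search: BGE embeddings are usually L2-normalized, so relevance scoring reduces to a dot product. Random vectors stand in for real model output here; loading the actual checkpoint (for example via the sentence-transformers library) is assumed and not shown.

```python
# Sketch: cosine-similarity search over 384-dimensional embeddings.
# Random vectors substitute for real BGE-small-en-v1.5 output, which
# would normally come from an embedding library (assumption, not shown).
import numpy as np

DIM = 384  # BGE-small-en-v1.5 output dimension
rng = np.random.default_rng(0)

def normalize(v):
    # L2-normalize so that a dot product equals cosine similarity.
    return v / np.linalg.norm(v, axis=-1, keepdims=True)

query_emb = normalize(rng.standard_normal(DIM))          # shape (384,)
passage_embs = normalize(rng.standard_normal((3, DIM)))  # shape (3, 384)

scores = passage_embs @ query_emb  # cosine similarities, each in [-1, 1]
best = int(np.argmax(scores))      # index of the most relevant passage
```

In a real RAG pipeline the passage embeddings would be precomputed and indexed, and only the query would be embedded at request time.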
1 project · 1 city

