Technology
Mosaic
Mosaic provides the high-performance infrastructure and software libraries required to train large-scale AI models with maximum efficiency.
Mosaic (now part of Databricks) provides a specialized software stack for training large generative AI models efficiently. Its open-source Composer library implements more than 20 efficiency methods (such as Selective Backprop and BlurPool) that reduce training costs and shorten training time by up to 7x. Engineering teams use this infrastructure to train models such as MPT-7B on multi-cloud GPU clusters while retaining full ownership of their model weights and data.