Technology
TimesFM
TimesFM is Google Research's 200M-parameter, decoder-only foundation model for time-series forecasting, delivering state-of-the-art zero-shot accuracy across diverse, unseen datasets.
TimesFM (Time-series Foundation Model) is a 200-million-parameter, decoder-only transformer from Google Research, pre-trained on over 100 billion real-world time points. This large pre-training corpus enables its core strength, zero-shot forecasting: on unseen datasets such as the Monash benchmarks, its accuracy rivals that of models trained explicitly for each dataset. The architecture uses input patching (treating contiguous chunks of the series like tokens in a sentence) to efficiently process context lengths of up to 512 points (version 1.0) or 16k points (version 2.5), dramatically simplifying the predictive modeling pipeline for domains like retail, finance, and energy.
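To make the patching idea concrete, here is a minimal sketch of how a context window can be split into fixed-length patches before being fed to the transformer. It assumes an input patch length of 32 points, the value reported in the TimesFM paper; this is an illustration of the concept, not the library's actual implementation.

```python
def patch_series(series, patch_len=32):
    """Split a 1-D time series into non-overlapping, fixed-length patches.

    Each patch plays the role of one "token" in the transformer's input,
    so a 512-point context becomes a short sequence of 16 patch tokens.
    """
    if len(series) % patch_len != 0:
        raise ValueError("series length must be a multiple of patch_len")
    return [series[i:i + patch_len] for i in range(0, len(series), patch_len)]

# A 512-point context window (the version 1.0 limit) yields 16 patches.
context = list(range(512))
patches = patch_series(context)
print(len(patches), len(patches[0]))  # → 16 32
```

Because the sequence length seen by the attention layers shrinks by the patch factor, longer raw contexts (such as the 16k points supported in version 2.5) remain tractable.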