TimesFM
Google Research's pretrained time-series foundation model for forecasting, released with open Apache-2.0 code and checkpoints.
Why it is included
Listed on TAAFT (under Google Research repositories) as a widely starred time-series foundation-model project.
Best for
Analysts and ML engineers needing strong zero-shot or fine-tuned forecasting baselines.
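For the zero-shot case, usage follows the pattern below. This is a minimal sketch based on the README-style API at the time of writing; the hparams/checkpoint class names, the Hugging Face repo id, and the return values are assumptions that can change between releases, so check the repository before relying on them.

```python
import numpy as np
import timesfm  # assumed: the package from the TimesFM repository

# Assumed API (per the project README); names may differ across releases.
tfm = timesfm.TimesFm(
    hparams=timesfm.TimesFmHparams(
        backend="cpu",           # or "gpu" when an accelerator is available
        per_core_batch_size=32,
        horizon_len=128,         # number of future steps to predict
    ),
    checkpoint=timesfm.TimesFmCheckpoint(
        huggingface_repo_id="google/timesfm-1.0-200m-pytorch",  # assumed repo id
    ),
)

# Each input is one univariate context window; freq is a coarse frequency
# bucket (0 = high, 1 = medium, 2 = low) rather than an exact sampling rate.
history = [np.sin(np.linspace(0.0, 20.0, 512))]
point_forecast, quantile_forecast = tfm.forecast(history, freq=[0])
print(point_forecast[0][:5])  # first five forecast steps for the first series
```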
Strengths
- Purpose-built for time series
- Research-grade release
- Apache-2.0
Limitations
- Accuracy under domain shift and strong seasonality still needs per-dataset validation; see the baseline-comparison sketch below
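Because zero-shot accuracy can degrade on shifted or strongly seasonal domains, a cheap sanity check is to score any forecaster against a seasonal-naive baseline with MASE on a held-out window. A generic sketch (the series here are placeholders, and `seasonal_naive` is an illustrative baseline, not part of TimesFM):

```python
import numpy as np

def mase(y_true, y_pred, y_train, season=1):
    """Mean absolute scaled error: forecast MAE divided by the in-sample MAE
    of a seasonal-naive forecast. Values >= 1.0 mean the model did not beat
    simply repeating last season's values."""
    naive_mae = np.mean(np.abs(y_train[season:] - y_train[:-season]))
    return np.mean(np.abs(y_true - y_pred)) / naive_mae

def seasonal_naive(context, horizon, season):
    """Repeat the last observed season out to the forecast horizon."""
    reps = -(-horizon // season)  # ceiling division
    return np.tile(context[-season:], reps)[:horizon]

train = np.random.rand(500)  # placeholder history
test = np.random.rand(48)    # placeholder held-out window
pred = seasonal_naive(train, horizon=len(test), season=24)
print("MASE vs. seasonal-naive:", mase(test, pred, train, season=24))
```

Swap `pred` for the model's forecast of the same window to see whether the foundation model actually beats the baseline on your data.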
Good alternatives
Prophet · ARIMA stacks · Temporal Fusion Transformers
Related tools
AI & Machine Learning
PyTorch
Deep learning framework with strong research-to-production paths.
JAX
Composable transformations (grad, vmap, pmap) plus NumPy-like API for high-performance ML research on accelerators.
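The composable-transformations claim is easy to make concrete; this generic illustration (not code from any listed project) differentiates a loss, vectorizes the gradient over a batch, and compiles the result:

```python
import jax
import jax.numpy as jnp

def loss(w, x):
    # Simple quadratic loss so the gradient is easy to verify by hand.
    return jnp.sum((w * x - 1.0) ** 2)

grad_loss = jax.grad(loss)                              # d(loss)/dw, one example
batched_grads = jax.vmap(grad_loss, in_axes=(None, 0))  # map over a batch of x
fast_grads = jax.jit(batched_grads)                     # compile for accelerators

w = jnp.array([0.5, 2.0])
xs = jnp.ones((8, 2))           # batch of 8 two-feature inputs
print(fast_grads(w, xs).shape)  # (8, 2): one gradient per batch element
```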
Meta Llama (open models)
Meta’s Llama family of open **weights** (subject to the Llama license), with reference code, tooling, and downloads via the meta-llama org on Hugging Face.
Mistral AI (open models)
Mistral’s open-weight checkpoints (e.g. the 7B-era models and Mixtral MoE) and Apache-2.0-licensed **code**, alongside proprietary flagship lines; verify the license of each checkpoint.
Qwen
Alibaba’s Qwen family (dense and MoE) with strong multilingual and coding variants; weights and code on Hugging Face under the license stated for each release.
DeepSeek
DeepSeek’s open-weight models (e.g. the V3/R1 lineage), released under MIT or custom terms per release; high-capability coding and reasoning checkpoints.
