CauKer: Classification Time Series Foundation Models Can Be Pretrained on Synthetic Data
Shifeng Xie, Vasilii Feofanov, Jianfeng Zhang, Themis Palpanas, Ievgen Redko
Abstract
Time series foundation models (TSFMs) have recently gained significant attention due to their strong zero-shot capabilities and widespread real-world applications. Such models typically require a computationally costly pretraining on large-scale, carefully curated collections of real-world sequences. To allow for a sample-efficient pretraining of TSFMs, we propose CauKer, a novel algorithm designed to generate diverse, causally coherent synthetic time series with realistic trends, seasonality, and nonlinear interactions. CauKer combines Gaussian Process (GP) kernel composition with Structural Causal Models (SCM) to produce data for sample-efficient pretraining of state-of-the-art classification TSFMs having different architectures and following different pretraining approaches. Additionally, our experiments reveal that CauKer-generated datasets exhibit clear scaling laws for both dataset size (10K to 10M samples) and model capacity (1M to 783M parameters), unlike real-world datasets, which display irregular scaling behavior.
Generates diverse synthetic time series for pretraining foundation models with clear scaling laws.
- Proposes CauKer algorithm combining Gaussian Process kernels with Structural Causal Models
- Generates causally coherent synthetic data with realistic trends, seasonality, and nonlinear interactions
- Demonstrates TSFMs pretrained on CauKer-generated data match performance of larger real-world datasets
- Reveals clear scaling laws for synthetic data unlike irregular patterns in real-world datasets
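The paper does not give CauKer's exact generator here, but the two ingredients named above (Gaussian Process kernel composition and a Structural Causal Model) can be illustrated with a minimal sketch; all function names, kernel choices, and the random DAG wiring below are illustrative assumptions, not the authors' implementation:

```python
import numpy as np

rng = np.random.default_rng(0)

def rbf_kernel(t, length_scale=10.0):
    # Smooth local variation
    d = t[:, None] - t[None, :]
    return np.exp(-0.5 * (d / length_scale) ** 2)

def periodic_kernel(t, period=25.0, length_scale=1.0):
    # Seasonality component
    d = np.abs(t[:, None] - t[None, :])
    return np.exp(-2.0 * np.sin(np.pi * d / period) ** 2 / length_scale ** 2)

def linear_kernel(t):
    # Trend component (normalized to keep variances comparable)
    return np.outer(t, t) / (t[-1] ** 2 + 1e-9)

def compose_kernel(t):
    # Kernel composition: randomly combine base kernels by sum or
    # product; both operations preserve positive semi-definiteness.
    bases = [rbf_kernel(t), periodic_kernel(t), linear_kernel(t)]
    K = bases[rng.integers(len(bases))]
    for _ in range(rng.integers(1, 3)):
        B = bases[rng.integers(len(bases))]
        K = K + B if rng.random() < 0.5 else K * B
    return K

def sample_gp(t, K):
    # Draw one series from a zero-mean GP with covariance K
    return rng.multivariate_normal(np.zeros(len(t)), K + 1e-6 * np.eye(len(t)))

def scm_series(t, n_nodes=4):
    # SCM over a random DAG in topological order 0..n_nodes-1:
    # root nodes are GP samples, child nodes are nonlinear
    # transforms of their parents plus observation noise.
    nodes = []
    for i in range(n_nodes):
        parents = [j for j in range(i) if rng.random() < 0.5]
        if not parents:
            x = sample_gp(t, compose_kernel(t))
        else:
            x = sum(np.tanh(nodes[j]) * rng.normal() for j in parents)
            x = x + 0.1 * rng.normal(size=len(t))
        nodes.append(x)
    return nodes[-1]  # return a leaf node as the generated series

t = np.arange(128, dtype=float)
series = scm_series(t)
```

Each draw yields a univariate series whose trend, seasonality, and nonlinear parent interactions arise from the sampled kernel composition and causal graph, which is the kind of diversity the abstract attributes to CauKer-generated data.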
- Gaussian processes
- Structural causal models
- Synthetic data generation
Limitations
- Considered only two models following different pre-training paradigms
- Did not consider large-scale forecasting benchmarks such as Time-300B

Authors did not state explicit future directions.
Author keywords
- Time Series Foundation Model
- Time Series Classification
Related orals
Causal Structure Learning in Hawkes Processes with Complex Latent Confounder Networks
Develops causal structure learning framework for Hawkes processes identifying latent confounder subprocesses.
Global Resolution: Optimal Multi-Draft Speculative Sampling via Convex Optimization
Solves optimal multi-draft speculative sampling via convex optimization achieving 90% acceptance rates.
Conformal Robustness Control: A New Strategy for Robust Decision
CRC optimizes prediction set construction under explicit robustness constraints instead of coverage for more efficient robust decisions.
Structured Flow Autoencoders: Learning Structured Probabilistic Representations with Flow Matching
Structured Flow Autoencoders integrate flow matching with graphical models for structured representation learning.