Abstract
Fine-grained long-term (FGLT) time series forecasting is a fundamental challenge for Function-as-a-Service (FaaS) platforms. FaaS function request data are fine-grained (per-second or per-minute), often exhibit daily periodicity, and persist over the long term. Forecasting in the FGLT regime is challenging, and Transformer models can scale poorly on such long sequences. We propose FoldFormer, which combines several novel elements (time-to-latent folding, seasonal attention, and convolutions over FFT representations) as a new solution for FGLT forecasting of FaaS function requests. FoldFormer is designed to consume very fine-grained multi-day data efficiently, with almost no additional model, memory, or compute overhead compared to consuming coarse-grained data. We show either state-of-the-art or competitive performance on per-minute function requests for the top 5 most requested functions across three data sources, including two in-house Huawei Cloud sources and Azure 2019. We also show state-of-the-art performance at per-second granularity, a regime that critically limits most other methods.
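To make the named components more concrete, the sketch below illustrates what time-to-latent folding and a convolution over an FFT representation might look like in PyTorch. This is a minimal illustration under stated assumptions, not the authors' implementation: the class names, layer sizes, the daily period of 1440 minutes, and the way the two pieces are combined are all assumptions for exposition, and seasonal attention is omitted.

```python
# Illustrative sketch only; shapes, names, and composition are assumptions,
# not the FoldFormer implementation from the paper.
import torch
import torch.nn as nn


class TimeToLatentFolding(nn.Module):
    """Fold a long fine-grained series into one latent token per period.

    A per-minute series of D days (T = D * 1440 steps) becomes a sequence
    of D latent tokens, so downstream attention cost grows with the number
    of days rather than the number of minutes.
    """

    def __init__(self, period: int = 1440, d_model: int = 128):
        super().__init__()
        self.period = period
        self.proj = nn.Linear(period, d_model)  # one full period -> one token

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, T) with T divisible by self.period
        b, t = x.shape
        folded = x.reshape(b, t // self.period, self.period)  # (B, D, period)
        return self.proj(folded)                               # (B, D, d_model)


class FFTConv(nn.Module):
    """Convolution applied in the frequency domain of the latent sequence.

    Real and imaginary parts of the spectrum are stacked as channels,
    mixed by a 1D convolution, then transformed back with an inverse FFT.
    """

    def __init__(self, d_model: int = 128, kernel_size: int = 3):
        super().__init__()
        self.conv = nn.Conv1d(2 * d_model, 2 * d_model,
                              kernel_size, padding=kernel_size // 2)

    def forward(self, z: torch.Tensor) -> torch.Tensor:
        # z: (B, D, d_model); FFT along the token (day) axis
        spec = torch.fft.rfft(z, dim=1)                    # complex (B, F, d_model)
        feats = torch.cat([spec.real, spec.imag], dim=-1)  # (B, F, 2*d_model)
        feats = self.conv(feats.transpose(1, 2)).transpose(1, 2)
        real, imag = feats.chunk(2, dim=-1)
        return torch.fft.irfft(torch.complex(real, imag),
                               n=z.shape[1], dim=1)        # back to (B, D, d_model)


if __name__ == "__main__":
    x = torch.randn(4, 7 * 1440)        # 7 days of per-minute request counts
    tokens = TimeToLatentFolding()(x)   # (4, 7, 128): one token per day
    out = FFTConv()(tokens)             # (4, 7, 128)
    print(tokens.shape, out.shape)
```

The point of the sketch is the scaling argument from the abstract: after folding, a week of per-minute data is only 7 tokens, so adding finer granularity changes the per-token projection width rather than the sequence length seen by attention.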
| Original language | English |
| --- | --- |
| Title of host publication | EuroMLSys '23 |
| Subtitle of host publication | Proceedings of the 3rd Workshop on Machine Learning and Systems |
| Editors | Eiko Yoneki, Luigi Nardi |
| Place of Publication | New York, NY |
| Publisher | ACM |
| Pages | 71-77 |
| Number of pages | 7 |
| ISBN (Print) | 9798400700842 |
| DOIs | |
| Publication status | Published - 6 May 2023 |
Keywords
- Transformers
- Neural networks
- Forecasting
- Systems infrastructure
- Time series