FoldFormer: sequence folding and seasonal attention for fine-grained long-term FaaS forecasting

Luke Darlow, Artjom Joosen, Martin Asenov, Qiwen Deng, Jianfeng Wang, Adam David Barker

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution


Fine-grained long-term (FGLT) time series forecasting is a fundamental challenge in Function as a Service (FaaS) platforms. The data that FaaS function requests produce are fine-grained (per-second or per-minute), often have daily periodicity, and are persistent over the long term. Forecasting in the FGLT data regime is challenging, and Transformer models can scale poorly for long sequences. We propose FoldFormer, which combines several novel elements (time-to-latent folding, seasonal attention, and convolutions over FFT representations) as a new solution for FGLT forecasting of FaaS function requests. FoldFormer is designed to efficiently consume very fine-grained multi-day data with nearly no additional model, memory, or compute overhead compared to consuming coarse-grained data. We show either state-of-the-art or competitive performance for per-minute function requests on the top five most-requested functions across three data sources, including two in-house Huawei Cloud sources and Azure 2019. We also show state-of-the-art performance at per-second granularity, a regime that critically limits most other methods.
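The "sequence folding" idea named in the abstract can be illustrated with a minimal sketch: a long fine-grained series is reshaped so that each row holds one day and each column holds the same time-of-day across days, letting attention operate over seasonally aligned positions. The shapes, names, and reshape-based mechanism below are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

# Hypothetical sketch of time-to-latent folding (not the paper's code):
# fold a per-minute series covering several days into a 2D array of
# shape (num_days, minutes_per_day).
minutes_per_day = 24 * 60          # per-minute granularity
num_days = 7                       # one week of history

# Synthetic stand-in for FaaS request counts.
series = np.arange(num_days * minutes_per_day, dtype=np.float32)

# Fold: row d is day d; column t is minute-of-day t.
folded = series.reshape(num_days, minutes_per_day)

# Column t now contains seasonally aligned points (same minute-of-day
# across all days), so attention along axis 0 compares like with like.
print(folded.shape)        # (7, 1440)
print(folded[1, 0])        # first minute of day 2 -> 1440.0
```

A fold like this keeps sequence length per attention step short (one day's worth of positions, or one column of days), which is consistent with the abstract's claim of consuming multi-day fine-grained data with little extra compute.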
Original language: English
Title of host publication: EuroMLSys '23
Subtitle of host publication: Proceedings of the 3rd Workshop on Machine Learning and Systems
Editors: Eiko Yoneki, Luigi Nardi
Place of publication: New York, NY
Number of pages: 7
ISBN (Print): 9798400700842
Publication status: Published - 6 May 2023


Keywords:

  • Transformers
  • Neural networks
  • Forecasting
  • Systems infrastructure
  • Time series


