LottieGPT: Tokenizing Vector Animation for Autoregressive Generation

1 Tsinghua University  |  2 BAAI  |  3 The Hong Kong Polytechnic University  |  4 Nanjing University  |  5 Guangming Lab
* Equal contribution   |   † Corresponding authors
Pipeline

Condition in, animation out

Inputs are encoded, turned into animation tokens, and decoded into a valid Lottie file that can be reused in real design workflows.
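To make the decode step concrete, here is a minimal, hypothetical sketch of mapping a sequence of decoded keyframe tokens onto a valid single-layer Lottie document; the token fields and layout are illustrative, not the paper's actual vocabulary.

```python
import json

def decode_to_lottie(tokens, width=512, height=512, fps=30, frames=60):
    """Assemble decoded keyframe tokens into a minimal Lottie JSON document."""
    # Each token here is assumed to carry a frame index and a 2D position.
    keyframes = [{"t": tok["frame"], "s": [tok["x"], tok["y"]]} for tok in tokens]
    return {
        "v": "5.7.4",                 # Lottie schema version
        "fr": fps, "ip": 0, "op": frames,
        "w": width, "h": height,
        "layers": [{
            "ty": 4,                  # 4 = shape layer
            "ip": 0, "op": frames,
            "ks": {"p": {"a": 1, "k": keyframes}},  # animated position
            "shapes": [],
        }],
    }

doc = decode_to_lottie([{"frame": 0, "x": 0, "y": 0},
                        {"frame": 30, "x": 100, "y": 50}])
lottie_json = json.dumps(doc)  # serializable, ready to drop into a player
```

Because the output is plain Lottie JSON, it can be opened directly in standard players and design tools, which is what makes the generated animations reusable downstream.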

Tokenizer

Compress motion into a learnable sequence

The representation captures shape, transform, temporal structure, and easing with enough detail for editing after generation.
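One way such a representation can work is to quantize each keyframe's continuous values into discrete bins drawn from a shared vocabulary. The sketch below is a hypothetical illustration of that idea; the bin counts, easing set, and field order are assumptions, not the paper's actual scheme.

```python
# Illustrative quantization of one keyframe into discrete tokens:
# [frame, x-bin, y-bin, easing-id]
POS_BINS = 256
EASING_IDS = {"linear": 0, "ease_in": 1, "ease_out": 2, "ease_in_out": 3}

def quantize(value, lo, hi, bins=POS_BINS):
    """Clamp value to [lo, hi] and map it to an integer bin index."""
    value = max(lo, min(hi, value))
    return round((value - lo) / (hi - lo) * (bins - 1))

def tokenize_keyframe(frame, x, y, easing, canvas=512):
    # Transform values become position bins; the easing curve becomes
    # a categorical id, so timing information survives quantization.
    return [frame,
            quantize(x, 0, canvas),
            quantize(y, 0, canvas),
            EASING_IDS[easing]]

tokens = tokenize_keyframe(frame=15, x=256.0, y=0.0, easing="ease_out")
```

Keeping easing as an explicit token is what preserves editability: a designer can swap `ease_out` for `linear` after generation without retokenizing the whole clip.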

Data Curation

From noisy assets to trainable examples

Filtering, normalization, and token alignment turn raw assets into cleaner examples that support stable training at scale.
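A curation pass of this shape could be sketched as below, assuming the raw assets are Lottie JSON files; the specific checks and thresholds are illustrative, not the paper's actual criteria.

```python
import json

REQUIRED_KEYS = {"v", "fr", "ip", "op", "w", "h", "layers"}

def is_trainable(raw, max_layers=64):
    """Filter: keep only parseable, well-formed, reasonably sized animations."""
    try:
        doc = json.loads(raw)
    except json.JSONDecodeError:
        return False                       # drop unparseable assets
    if not REQUIRED_KEYS.issubset(doc):
        return False                       # drop structurally incomplete files
    if not doc["layers"] or len(doc["layers"]) > max_layers:
        return False                       # drop empty or oversized animations
    return doc["op"] > doc["ip"]           # require a positive duration

def normalize(doc, target=512):
    """Normalize: rescale the canvas so samples share one coordinate frame."""
    scale = target / max(doc["w"], doc["h"])
    doc["w"], doc["h"] = round(doc["w"] * scale), round(doc["h"] * scale)
    return doc

good = json.dumps({"v": "5.7.4", "fr": 30, "ip": 0, "op": 60,
                   "w": 1024, "h": 768, "layers": [{"ty": 4}]})
```

Normalizing every sample to one canvas size is what lets a single quantization grid (and hence one token vocabulary) cover the whole corpus.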

Dataset

A 15M-sample Lottie dataset

We introduce a large-scale vector animation dataset with 15M samples for autoregressive generation.

Citation

BibTeX

@misc{chen2026lottiegpttokenizingvectoranimation,
  title={LottieGPT: Tokenizing Vector Animation for Autoregressive Generation},
  author={Junhao Chen and Kejun Gao and Yuehan Cui and Mingze Sun and Mingjin Chen and Shaohui Wang and Xiaoxiao Long and Fei Ma and Qi Tian and Ruqi Huang and Hao Zhao},
  year={2026},
  eprint={2604.11792},
  archivePrefix={arXiv},
  primaryClass={cs.CV},
  url={https://arxiv.org/abs/2604.11792},
}