A Tree-Structure Enhanced Transformer for Cardinality Estimation
Proceedings of 2024 International Conference on Machine Learning and Intelligent Computing, PMLR 245:225-237, 2024.
Abstract
Accurate cardinality estimation is crucial for query optimization, as it guides plan selection. Traditional cardinality estimation approaches often fail to provide precise estimates, leading to suboptimal query plans. In recent years, learning-based methods have emerged as a promising alternative. For tree-based learning methods, the conventional approach of encoding only adjacent parent-child pairs and learning through iterative training hampers performance: it discards significant structural information and incurs substantial computational overhead from the iterative training process. In this paper, we propose a tree-structure positional encoding scheme that not only extracts effective features for each node but also captures the inherent structural characteristics of the tree. Based on these tree-based features, we design a novel transformer-based cardinality estimation model, which enhances the parallelism of model training and reduces the overhead caused by iterative training. On real-world datasets, our method outperforms the current state-of-the-art technique, QF, by 25% in terms of mean Q-error.
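To illustrate the general idea of a tree-structure positional encoding, the sketch below assigns each node of a binary query-plan tree a fixed-length vector derived from its root-to-node path. This is a minimal illustration of the concept under our own assumptions, not the paper's actual encoding; the function name and the one-hot-per-level layout are hypothetical.

```python
# Minimal sketch of a tree-structure positional encoding for a binary
# query-plan tree. Assumption: each node is identified by its path from
# the root (0 = left child, 1 = right child); this is illustrative only
# and not the encoding defined in the paper.

def tree_positional_encoding(path, max_depth=8):
    """Encode a node's root-to-node path as a fixed-length vector.

    Each tree level occupies two slots (one-hot for left/right), so
    distinct nodes get distinct vectors and a parent's encoding is a
    prefix of its children's encodings -- retaining the hierarchical
    structure that a plain sequence encoding would lose.
    """
    if len(path) > max_depth:
        raise ValueError("path exceeds max_depth")
    vec = [0] * (2 * max_depth)
    for depth, branch in enumerate(path):
        vec[2 * depth + branch] = 1
    return vec

# Root node: empty path -> all-zero encoding.
root = tree_positional_encoding([])
# Left child of the root's right child: path [1, 0].
node = tree_positional_encoding([1, 0])
```

Because every node's encoding can be computed independently, all nodes of a plan tree can be featurized in one pass and fed to the transformer in parallel, rather than processed pairwise through iterative bottom-up training.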