BERT-Sort: A Zero-shot MLM Semantic Encoder on Ordinal Features for AutoML
Proceedings of the First International Conference on Automated Machine Learning, PMLR 188:11/1-26, 2022.
Abstract
Data pre-processing is one of the key steps in creating machine learning pipelines for tabular data. A common pre-processing operation implemented in AutoML systems is to encode categorical features as numerical features. Typically, this is done with a simple alphabetical sort on the categorical values, using functions such as OrdinalEncoder and LabelEncoder in Scikit-Learn and H2O. However, there often exist semantic ordinal relationships among the categorical values, such as quality level (e.g., ['very good' $\succ$ 'good' $\succ$ 'normal' $\succ$ 'poor']) or month (e.g., ['Jan' $\prec$ 'Feb' $\prec$ 'Mar']). Such semantic relationships are not exploited by previous AutoML approaches. In this paper, we introduce BERT-Sort, a novel approach to semantically encode ordinal categorical values via zero-shot Masked Language Models (MLMs) and apply it to AutoML for tabular data. We created the first benchmark of 42 features from 10 public data sets for sorting categorical ordinal values, on which BERT-Sort improves semantic encoding of ordinal values by 27% over existing approaches. We perform a comprehensive evaluation of BERT-Sort on different public MLMs, such as RoBERTa, XLM, and DistilBERT. We also compare the performance of raw data sets against data sets encoded with BERT-Sort on several AutoML platforms, including AutoGluon, FLAML, H2O, and MLJAR, to evaluate the proposed approach in an end-to-end scenario, where BERT-Sort achieves performance close to that of hard-encoded (ground-truth ordered) features. The artifacts of BERT-Sort are available at https://github.com/marscod/BERT-Sort.
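To make the motivating problem concrete, here is a minimal sketch, assuming a hypothetical quality feature, of how Scikit-Learn's default alphabetical ordinal encoding scrambles a semantic order, and how the encoding looks once the correct ranking is supplied (the ranking BERT-Sort aims to infer automatically):

```python
# Minimal sketch (not the BERT-Sort algorithm): Scikit-Learn's OrdinalEncoder
# sorts categories alphabetically by default, which scrambles semantic order.
from sklearn.preprocessing import OrdinalEncoder

values = [["poor"], ["normal"], ["good"], ["very good"]]

# Default: lexicographic order 'good' < 'normal' < 'poor' < 'very good',
# so the semantic ranking is lost.
alpha = OrdinalEncoder()
print(alpha.fit_transform(values).ravel())  # [2. 1. 0. 3.]

# Supplying the semantic ranking explicitly restores a monotone encoding;
# BERT-Sort aims to discover this ranking automatically via a zero-shot MLM.
semantic = OrdinalEncoder(categories=[["poor", "normal", "good", "very good"]])
print(semantic.fit_transform(values).ravel())  # [0. 1. 2. 3.]
```

One illustrative way, not the authors' exact procedure, to probe such ordinal relations zero-shot with a masked language model is to score a comparative template with HuggingFace's fill-mask pipeline (the template wording and helper name are assumptions):

```python
# Hedged sketch of zero-shot ordinal probing with an MLM: ask the model
# whether "better" or "worse" is the more probable filler when comparing
# two categorical values through a comparative template.
from transformers import pipeline

fill = pipeline("fill-mask", model="bert-base-uncased")

def ranks_above(a, b):
    # Restrict scoring to the two comparative tokens of interest.
    results = fill(f"{a} quality is [MASK] than {b} quality.",
                   targets=["better", "worse"])
    scores = {r["token_str"]: r["score"] for r in results}
    return scores["better"] > scores["worse"]

print(ranks_above("good", "poor"))  # expected: True
```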