Probabilistic Calibration by Design for Neural Network Regression

Victor Dheur, Souhaib Ben Taieb
Proceedings of the 27th International Conference on Artificial Intelligence and Statistics (AISTATS), 2024.

As machine learning models are increasingly deployed in real-world applications, improving neural network calibration has become a primary concern. Various methods have been proposed, including post-hoc methods that adjust predictions after training and regularization methods that act during training. In regression, post-hoc methods have shown superior calibration performance compared to regularization methods. However, the base model is trained without any direct connection to the subsequent post-hoc step. To address this limitation, we introduce a novel end-to-end method called Quantile Recalibration Training, which integrates post-hoc calibration directly into the training process. We also propose an algorithm unifying our method with other post-hoc and regularization methods. We demonstrate the performance of Quantile Recalibration Training in a large-scale experiment on 57 tabular regression datasets, showing improved predictive accuracy. Additionally, we conduct an ablation study that reveals the contribution of each element of our method. To support reproducibility and fair comparisons, we implement all experiments in a common code base.
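To illustrate the post-hoc step the abstract refers to, here is a minimal NumPy sketch of standard quantile recalibration: probability integral transform (PIT) values of held-out targets under the predictive CDF are computed, and each nominal quantile level is remapped through their empirical CDF. The Gaussian model, the overconfident `sigma_pred`, and all names below are hypothetical choices for illustration, not the paper's implementation.

```python
import numpy as np
from math import erf, sqrt

rng = np.random.default_rng(0)

# Hypothetical miscalibrated model: predictive standard deviation is too
# small relative to the data, so the model is overconfident.
mu, sigma_true, sigma_pred = 0.0, 1.0, 0.5
y_cal = rng.normal(mu, sigma_true, size=2000)  # held-out calibration targets

def gauss_cdf(y, mu=0.0, sigma=1.0):
    """CDF of a Gaussian predictive distribution."""
    return 0.5 * (1.0 + erf((y - mu) / (sigma * sqrt(2.0))))

# PIT values: predictive CDF evaluated at the observed targets.
# For a calibrated model these would be uniform on [0, 1].
pit = np.array([gauss_cdf(y, mu, sigma_pred) for y in y_cal])

def recalibrated_level(p):
    """Map a nominal quantile level p through the empirical CDF of the
    calibration PIT values; by construction, recalibrated PITs are
    approximately uniform on the calibration set."""
    return float(np.mean(pit <= p))
```

The recalibration map is monotone, so composing it with the original predictive CDF yields a valid, better-calibrated CDF; the paper's contribution is to make this step differentiable and apply it inside the training loop rather than only after training.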