Self-Calibrating Vicinal Risk Minimisation for Model Calibration

Authors

Liu, Jiawei
Ye, Changkun
Cui, Ruikai
Barnes, Nick

Publisher

IEEE Computer Society

Abstract

Model calibration, which measures the alignment between prediction accuracy and model confidence, is an important indicator of model trustworthiness. Existing dense binary classification methods, lacking proper regularisation of model confidence, are prone to over-confidence. To calibrate Deep Neural Networks (DNNs), we propose Self-Calibrating Vicinal Risk Minimisation (SCVRM), which explores the vicinity space of labeled data: vicinal images that lie farther from labeled images adopt the ground-truth label with decreasing label confidence. We prove that, for the logistic regression problem, SCVRM can be seen as Vicinal Risk Minimisation plus a regularisation term that penalises over-confident predictions. In practical implementation, SCVRM is approximated with Monte Carlo sampling, which draws additional augmented training images and labels from the vicinal distributions. Experimental results demonstrate that SCVRM significantly enhances model calibration across different dense classification tasks on both in-distribution and out-of-distribution data. Code is available at https://github.com/Carlisle-Liu/SCVRM.
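The Monte Carlo step described in the abstract can be sketched roughly as follows. This is an illustrative approximation only, not the paper's implementation: the Gaussian perturbation standing in for the vicinal distribution, the exponential decay mapping perturbation distance to label confidence, and the function name `sample_vicinal` are all assumptions, with the decay rate as a free hyperparameter.

```python
import numpy as np

def sample_vicinal(image, label, sigma=0.1, decay=5.0, rng=None):
    """Draw one vicinal image/soft-label pair for a binary task.

    Hypothetical sketch: a Gaussian perturbation plays the role of the
    vicinal distribution, and label confidence decays with the distance
    of the sample from the labeled image.
    """
    rng = rng or np.random.default_rng()
    noise = rng.normal(0.0, sigma, size=image.shape)
    vicinal_image = image + noise
    # Per-pixel RMS distance of the vicinal sample from the labeled image.
    distance = np.linalg.norm(noise) / np.sqrt(noise.size)
    # Confidence in the ground-truth label shrinks as distance grows.
    confidence = np.exp(-decay * distance)
    # Soft label interpolates between the ground truth and maximal
    # uncertainty (0.5 for a binary label).
    soft_label = confidence * label + (1.0 - confidence) * 0.5
    return vicinal_image, soft_label
```

Training would then minimise the usual loss over such sampled pairs, so that samples far from the labeled data supervise the model toward lower confidence.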

Book Title

Proceedings - 2024 IEEE/CVF Conference on Computer Vision and Pattern Recognition, CVPR 2024

Entity type

Publication
