Uncertainty Estimation with Calibrated Confidence Scores

Research output: Conference materials › Poster


Modern machine learning models can achieve impressive results on a wide range of tasks, but they are usually poor at reliably expressing how confident they are in their predictions. In an industrial setting, the end goal is usually not the model's prediction itself but a decision based on that prediction. Generating high-accuracy predictions on average is often not sufficient; one also needs to estimate the uncertainty and risks involved in making those decisions. Reliable, calibrated uncertainty estimates are therefore highly useful for any model used in automated decision making.
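A confidence estimate is calibrated when, among predictions made with confidence p, a fraction p are actually correct. A common way to quantify miscalibration is the expected calibration error (ECE); the sketch below (a generic illustration, not a method from the poster) bins predictions by confidence and compares each bin's average confidence with its empirical accuracy:

```python
def expected_calibration_error(confidences, correct, n_bins=10):
    """ECE: bin predictions by confidence, then average the gap
    between mean confidence and accuracy, weighted by bin size."""
    bins = [[] for _ in range(n_bins)]
    for conf, ok in zip(confidences, correct):
        idx = min(int(conf * n_bins), n_bins - 1)
        bins[idx].append((conf, ok))
    n = len(confidences)
    ece = 0.0
    for b in bins:
        if not b:
            continue
        avg_conf = sum(c for c, _ in b) / len(b)
        accuracy = sum(ok for _, ok in b) / len(b)
        ece += (len(b) / n) * abs(avg_conf - accuracy)
    return ece
```

A model that reports 90% confidence but is right only half the time has an ECE of 0.4 in that bin; a perfectly calibrated model scores 0.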

In this paper we present a case study in which we propose a novel method to improve the uncertainty estimates of an in-production machine learning model operating on real-life data in an industrial setting. The model is used by the Finnish company Basware to extract information from invoices in the form of machine-readable PDFs. The solution we propose is shown to produce calibrated confidence estimates that outperform the legacy estimates on several relevant metrics, increasing the coverage of automatically processed invoices from 65.6% to 73.2% with no increase in error rate.
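The reported gain (coverage up from 65.6% to 73.2% at an unchanged error rate) reflects the standard selective-automation setup: only predictions above a confidence threshold are processed automatically, and better-calibrated confidences let the threshold admit more items at the same risk. The abstract does not describe Basware's actual pipeline; as a hypothetical sketch, the helper below scans candidate thresholds and returns the highest coverage whose automated error rate stays within a budget:

```python
def coverage_at_error(confidences, correct, max_error_rate):
    """Sort items by confidence (highest first) and grow the automated
    set one item at a time, keeping the best (threshold, coverage,
    error_rate) whose error rate stays within max_error_rate."""
    items = sorted(zip(confidences, correct), reverse=True)
    best = (1.0, 0.0, 0.0)  # automate nothing by default
    errors = 0
    for k, (conf, ok) in enumerate(items, start=1):
        errors += 0 if ok else 1
        err_rate = errors / k
        if err_rate <= max_error_rate:
            best = (conf, k / len(items), err_rate)
    return best
```

With calibrated confidences, the empirical error rate among items above threshold t tracks 1 - t, which is what makes a fixed error budget translate directly into higher coverage.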
Original language: English
Number of pages: 1
Publication status: Published - 16 Nov 2022
MoE publication type: Not Eligible
Event: AI Day 2022 - Espoo, Finland
Duration: 16 Nov 2022 - 16 Nov 2022



Fields of Science

  • 113 Computer and information sciences
