Failure Prediction in 2D Document Information Extraction with Calibrated Confidence Scores

Research output: Chapter in Book/Report/Conference proceeding · Conference contribution · Scientific · peer-review

Abstract

Modern machine learning models can achieve impressive results in many tasks, but often fail to express reliably how confident they are in their predictions. In an industrial setting, the end goal is usually not the model's prediction itself, but a decision based on that prediction. It is therefore often not sufficient to generate predictions that are accurate on average; one also needs to estimate the uncertainty and risks involved in the decisions based on them. Thus, having reliable and calibrated uncertainty estimates is highly useful for any model used in automated decision-making.

In this paper, we present a case study in which we propose a novel method to improve the uncertainty estimates of an in-production machine learning model operating on real-life data in an industrial setting. The model is used by Basware, a Finnish software company, to extract information from invoices in the form of machine-readable PDFs. The solution we propose is shown to produce calibrated confidence estimates that outperform the legacy estimates on several relevant metrics, increasing the coverage of automatically processed invoices from 65.6% to 73.2% with no increase in error rate.
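
The abstract does not spell out the method, but the calibrate-then-threshold workflow behind the reported coverage/error trade-off can be sketched generically. In the sketch below, `calibrate_and_threshold`, `val_scores`, `val_correct`, and `max_error_rate` are hypothetical names, and isotonic regression merely stands in for whatever calibrator the paper actually uses:

```python
# Minimal sketch, assuming held-out validation data with raw model
# confidences and 0/1 correctness labels. Isotonic regression is an
# illustrative calibrator, not necessarily the paper's method.
import numpy as np
from sklearn.isotonic import IsotonicRegression


def calibrate_and_threshold(val_scores, val_correct, max_error_rate):
    """Fit a calibrator on held-out data, then pick the lowest confidence
    threshold whose automated subset stays within the error budget.

    val_scores:  1-D numpy array of raw model confidences
    val_correct: 1-D numpy array of 0/1 flags (1 = prediction was correct)
    """
    calibrator = IsotonicRegression(out_of_bounds="clip")
    calibrator.fit(val_scores, val_correct)      # learn raw score -> P(correct)
    calibrated = calibrator.predict(val_scores)

    best_threshold, best_coverage = 1.0, 0.0
    for t in np.unique(calibrated):
        automated = calibrated >= t              # predictions accepted without review
        error_rate = 1.0 - val_correct[automated].mean()
        coverage = automated.mean()
        if error_rate <= max_error_rate and coverage > best_coverage:
            best_threshold, best_coverage = t, coverage
    return calibrator, best_threshold, best_coverage
```

Under these assumptions, the returned threshold would then be applied to calibrated confidences in production: invoices at or above it are processed automatically, while the rest are routed to manual review.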
Original language: English
Title of host publication: 2023 IEEE 47th Annual Computers, Software, and Applications Conference (COMPSAC)
Number of pages: 10
Publisher: IEEE
Publication date: 2023
Pages: 193-202
ISBN (Electronic): 979-8-3503-2697-0
Publication status: Published - 2023
MoE publication type: A4 Article in conference proceedings
Event: Annual Computers, Software, and Applications Conference - Torino, Italy
Duration: 26 Jun 2023 - 30 Jun 2023
Conference number: 47

Fields of Science

  • 113 Computer and information sciences
  • Machine learning
  • Uncertainty estimation
  • Confidence calibration
  • Failure prediction
  • Information extraction
