Gradient-based training and pruning of radial basis function networks with an application in materials physics

Jussi Määttä, Viacheslav Bazaliy, Jyri Kimari, Flyura Djurabekova, Kai Nordlund, Teemu Roos

Research output: Contribution to journal › Article › Scientific › Peer-reviewed

Abstract

Many applications, especially in physics and other sciences, call for easily interpretable and robust machine learning techniques. We propose a fully gradient-based technique for training radial basis function networks with an efficient and scalable open-source implementation. We derive novel closed-form optimization criteria for pruning the models for continuous as well as binary data, which arise in a challenging real-world materials physics problem. The pruned models are optimized to provide compact and interpretable versions of larger models based on informed assumptions about the data distribution. Visualizations of the pruned models provide insight into the atomic configurations that determine atom-level migration processes in solid matter; these results may inform future research on designing more suitable descriptors for use with machine learning algorithms. (c) 2020 The Author(s). Published by Elsevier Ltd. This is an open access article under the CC BY license (http://creativecommons.org/licenses/by/4.0/).
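The abstract's central idea, training all parameters of a radial basis function network (centers, widths, and output weights) by gradient descent rather than by a two-stage clustering-plus-least-squares procedure, can be illustrated with a short sketch. The code below is a hypothetical minimal example written with PyTorch; it is not the authors' open-source implementation, and the class name RBFNet, the log-width parameterization, and the toy regression target are assumptions made purely for illustration.

```python
# Hypothetical minimal sketch (not the paper's implementation): an RBF network
# whose centers, widths, and output weights are all trained by gradient descent.
import torch

class RBFNet(torch.nn.Module):
    def __init__(self, in_dim, n_centers, out_dim):
        super().__init__()
        # All three parameter groups receive gradients during training.
        self.centers = torch.nn.Parameter(torch.randn(n_centers, in_dim))
        self.log_widths = torch.nn.Parameter(torch.zeros(n_centers))  # log scale keeps widths positive
        self.weights = torch.nn.Parameter(0.1 * torch.randn(n_centers, out_dim))

    def forward(self, x):
        # Squared distances between inputs and centers: shape (batch, n_centers)
        d2 = torch.cdist(x, self.centers).pow(2)
        # Gaussian basis activations with per-center widths
        phi = torch.exp(-d2 * torch.exp(-2 * self.log_widths))
        return phi @ self.weights

# Toy usage: fit a 1-D regression target end-to-end with Adam.
torch.manual_seed(0)
x = torch.linspace(-3, 3, 200).unsqueeze(1)
y = torch.sin(2 * x)
model = RBFNet(in_dim=1, n_centers=10, out_dim=1)
opt = torch.optim.Adam(model.parameters(), lr=0.05)
for step in range(500):
    opt.zero_grad()
    loss = torch.nn.functional.mse_loss(model(x), y)
    loss.backward()
    opt.step()
```

Parameterizing the widths on the log scale is one common way to keep them positive without explicit constraints, so that the whole model remains differentiable end-to-end; the paper's pruning criteria, which act on a trained model, are not reproduced here.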

Original language: English
Journal: Neural Networks
Volume: 133
Pages (from-to): 123-131
Number of pages: 9
ISSN: 0893-6080
DOI
Status: Published - Jan 2021
MoE publication type: A1 Journal article (refereed)

Fields of science

  • 114 Physics
  • 113 Computer and information sciences
