Abstract
We study the complexity of the problem of training neural networks defined via various activation functions. The training problem is known to be ∃ℝ-complete with respect to linear activation functions and the ReLU activation function. We consider the complexity of the problem with respect to the sigmoid activation function and other effectively continuous functions. We show that these training problems are polynomial-time many-one bireducible to the existential theory of the reals extended with the corresponding activation functions. In particular, we establish that the sigmoid activation function leads to the existential theory of the reals with the exponential function. It is thus open, and equivalent to the decidability of the existential theory of the reals with the exponential function, whether training neural networks using the sigmoid activation function is algorithmically solvable. In contrast, we show that the training problem is undecidable if sinusoidal activation functions are considered. Finally, we obtain general upper bounds for the complexity of the training problem in the form of low levels of the arithmetical hierarchy.
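As a rough illustration of the decision problem the abstract refers to (the paper's exact formulation may differ; the symbols N_w, γ, and the single-hidden-layer form below are assumptions made for illustration), training a fixed architecture can be viewed as asking whether weights exist that fit the data within a given error bound, and with the sigmoid activation this becomes a sentence in the existential theory of the reals extended with the exponential function:

```latex
% Illustrative sketch only: N_w, gamma, and the one-hidden-layer shape are
% assumed, not taken from the paper.
% Given data (x_1, y_1), ..., (x_n, y_n) and an error bound gamma >= 0,
% the training decision problem asks whether suitable weights w exist:
\[
  \exists\, w \in \mathbb{R}^m \;:\;
  \sum_{i=1}^{n} \bigl( N_w(x_i) - y_i \bigr)^2 \;\le\; \gamma .
\]
% For a single hidden layer of k sigmoid units, N_w(x) could take the form
\[
  N_w(x) \;=\; \sum_{j=1}^{k} v_j \,
  \sigma\!\Bigl(\textstyle\sum_{\ell} w_{j\ell}\, x_\ell + b_j\Bigr),
  \qquad
  \sigma(t) \;=\; \frac{1}{1 + e^{-t}} .
\]
% Since sigma is definable from the exponential function, the existence of
% such weights is expressible in the existential theory of the reals with exp.
```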
Original language | English |
---|---|
Title | Proceedings of the Thirty-Eighth AAAI Conference on Artificial Intelligence |
Editors | Michael Wooldridge, Jennifer Dy, Sriraam Natarajan |
Number of pages | 8 |
Volume | 38(11) |
Publisher | AAAI Press |
Publication date | 2024 |
Pages | 12278-12285 |
ISBN (Print) | 978-1-57735-887-9 |
DOI - permanent links | |
Status | Published - 2024 |
OKM publication type | A4 Article in conference proceedings |
Event | Annual AAAI Conference on Artificial Intelligence - Vancouver, Canada. Duration: 20 Feb 2024 → 27 Feb 2024. Conference number: 38. https://aaai.org/aaai-conference/ |
Publication series
Name | |
---|---|
ISSN (Print) | 2159-5399 |
ISSN (Electronic) | 2374-3468 |
Additional information
Revised version of a manuscript sent for review in April 2023.

Fields of Science
- 111 Mathematics