Increasing Stability of EEG Components Extraction Using Sparsity Regularized Tensor Decomposition

Deqing Wang, Xiaoyu Wang, Yongjie Zhu, Petri Toiviainen, Minna Huotilainen, Tapani Ristaniemi, Fengyu Cong

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › Scientific › Peer reviewed

Abstract

Tensor decomposition has been widely employed for EEG signal processing in recent years. Constrained and regularized tensor decomposition often yields more meaningful and interpretable results. In this study, we applied sparse nonnegative CANDECOMP/PARAFAC tensor decomposition to ongoing EEG data recorded under a naturalistic music stimulus. Interesting temporal, spectral and spatial components highly related to the music features were extracted. We explored the ongoing EEG decomposition results and their properties over a wide range of sparsity levels, and proposed a paradigm for selecting reasonable sparsity regularization parameters. The stability of extracting the components of interest from fourteen subjects' data was analyzed in depth. Our results demonstrate that appropriate sparsity regularization can significantly increase the stability of the components of interest while removing weak components at the same …
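As a rough illustration of the technique the abstract describes, the sketch below implements nonnegative CANDECOMP/PARAFAC decomposition with an L1 sparsity penalty via multiplicative updates. This is a minimal didactic sketch, not the paper's actual algorithm; the function and helper names (`sparse_ncp`, `khatri_rao`) and the choice of multiplicative updates are illustrative assumptions.

```python
import numpy as np

def khatri_rao(B, C):
    # Column-wise Khatri-Rao product: row (j*K + k) holds B[j, :] * C[k, :].
    R = B.shape[1]
    return np.einsum('jr,kr->jkr', B, C).reshape(-1, R)

def sparse_ncp(X, rank, l1=0.0, n_iter=200, eps=1e-9, seed=0):
    """Sparsity-regularized nonnegative CP of a 3-way tensor X.

    Multiplicative updates keep factors nonnegative; the L1 weight
    `l1` enters the update denominator, shrinking small entries
    toward zero (sparser factors for larger l1).
    """
    rng = np.random.default_rng(seed)
    dims = X.shape
    factors = [rng.random((d, rank)) for d in dims]
    # Mode-m unfoldings with row-major (C-order) flattening of the
    # remaining modes, matching the khatri_rao row ordering above.
    unf = [np.moveaxis(X, m, 0).reshape(dims[m], -1) for m in range(3)]
    for _ in range(n_iter):
        for m in range(3):
            others = [factors[i] for i in range(3) if i != m]
            kr = khatri_rao(others[0], others[1])
            num = unf[m] @ kr
            gram = np.ones((rank, rank))
            for f in others:
                gram *= f.T @ f
            den = factors[m] @ gram + l1 + eps
            factors[m] *= num / den
    return factors
```

For a (time × frequency × channel) EEG tensor, the three returned factor matrices would correspond to temporal, spectral and spatial components; sweeping `l1` mimics the paper's exploration of sparsity levels.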
Original language: English
Title of host publication: Advances in Neural Networks – ISNN 2018: 15th International Symposium on Neural Networks, ISNN 2018, Minsk, Belarus, June 25–28, 2018, Proceedings
Editors: Tingwen Huang, Jiancheng Lv, Changyin Sun, Alexander V. Tuzikov
Number of pages: 11
Publisher: Springer
Publication date: 2018
Pages: 789–799
ISBN (print): 978-3-319-92536-3
ISBN (electronic): 978-3-319-92537-0
Status: Published - 2018
MoE publication type: A4 Article in conference proceedings
Event: 15th International Symposium on Neural Networks - Minsk, Belarus
Duration: 25 Jun 2018 – 28 Jun 2018

Publication series

Name: Lecture Notes in Computer Science
Publisher: Springer
Number: 10878
ISSN (print): 0302-9743
ISSN (electronic): 1611-3349

Fields of science

  • 516 Educational sciences
  • 3112 Neurosciences
  • 3111 Biomedicine

Cite this