Isotropy, Clusters, and Classifiers

Research output: Chapter in book/report/conference proceeding › Conference contribution › Scientific › Peer review

Abstract

Whether embedding spaces use all their dimensions equally, i.e., whether they are isotropic, has been a recent subject of discussion. Evidence has been accrued both for and against enforcing isotropy in embedding spaces. In the present paper, we stress that isotropy imposes requirements on the embedding space that are not compatible with the presence of clusters---which also negatively impacts linear classification objectives. We demonstrate this fact both empirically and mathematically and use it to shed light on previous results from the literature.
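The tension the abstract points to can be illustrated with a toy experiment (a hedged sketch, not the paper's own methodology or isotropy measure): under a simple isotropy score, the ratio of the smallest to largest eigenvalue of the embedding covariance, a clustered point cloud scores far lower than an isotropic Gaussian cloud, because the between-cluster direction dominates the variance.

```python
import numpy as np

def isotropy_ratio(X):
    """Crude isotropy measure: ratio of smallest to largest eigenvalue
    of the mean-centered covariance. 1.0 = perfectly isotropic,
    near 0 = one direction dominates (anisotropic)."""
    cov = np.cov(X - X.mean(axis=0), rowvar=False)
    eig = np.linalg.eigvalsh(cov)  # ascending eigenvalues
    return eig[0] / eig[-1]

rng = np.random.default_rng(0)
d = 16

# Isotropic Gaussian cloud: all directions used equally.
iso = rng.normal(size=(2000, d))

# Two well-separated clusters: the axis joining the cluster
# centers carries most of the variance.
centers = np.zeros((2, d))
centers[1, 0] = 10.0
clustered = rng.normal(size=(2000, d)) + centers[rng.integers(0, 2, 2000)]

print(isotropy_ratio(iso))        # close to 1
print(isotropy_ratio(clustered))  # much smaller
```

The cluster structure that helps a (linear) classifier separate the data is exactly what drives this ratio toward zero, which is the incompatibility the paper formalizes.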
Original language: English
Title of host publication: Proceedings of the 62nd Annual Meeting of the Association for Computational Linguistics (Volume 2: Short Papers)
Editors: Lun-Wei Ku, Andre Martins, Vivek Srikumar
Number of pages: 10
Place of publication: Kerrville
Publisher: The Association for Computational Linguistics
Publication date: 1 Aug 2024
Pages: 75-84
ISBN (electronic): 979-8-89176-095-0
Status: Published - 1 Aug 2024
MoE publication type: A4 Article in conference proceedings
Event: Annual Meeting of the Association for Computational Linguistics - Bangkok, Thailand
Duration: 11 Aug 2024 - 16 Aug 2024
Conference number: 62

Fields of science

  • 6121 Languages
  • 113 Computer and information sciences
