Supervised Classification Using Balanced Training

Mian Du, Matthew Pierce, Lidia Pivovarova, Roman Yangarber

Research output: Chapter in book/report/conference proceedings › Conference contribution › Scientific › Peer review

Abstract

We examine supervised learning for multi-class, multi-label text
classification. We are interested in exploring classification in a
real-world setting, where the distribution of labels may change
dynamically over time. First, we compare the performance of an array of
binary classifiers trained on the label distribution found in the
original corpus against classifiers trained on balanced data, where
we try to make the label distribution as nearly uniform as possible. We
discuss the performance trade-offs between balanced and unbalanced
training, and highlight the advantages of balancing the training set.
Second, we compare the performance of two classifiers, Naive Bayes and
SVM, with several feature-selection methods, using balanced training. We
combine a Named-Entity-based rote classifier with the statistical
classifiers to obtain better performance than either method alone.
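The balancing strategy described above can be illustrated with a minimal sketch. The snippet below shows one common way to make a label distribution nearly uniform, namely undersampling every label to the size of the rarest one; the function name and toy corpus are illustrative assumptions, not taken from the paper, and the sketch simplifies to the single-label case.

```python
import random
from collections import defaultdict

def balance_by_undersampling(docs, labels, seed=0):
    """Undersample so each label keeps as many examples as the rarest
    label, making the training distribution as uniform as possible."""
    rng = random.Random(seed)
    by_label = defaultdict(list)
    for doc, label in zip(docs, labels):
        by_label[label].append(doc)
    # size of the rarest label determines the per-label sample size
    n = min(len(items) for items in by_label.values())
    balanced = []
    for label, items in by_label.items():
        for doc in rng.sample(items, n):
            balanced.append((doc, label))
    rng.shuffle(balanced)
    return balanced

# toy corpus with a skewed label distribution (50 / 10 / 5)
docs = ["a"] * 50 + ["b"] * 10 + ["c"] * 5
labels = ["X"] * 50 + ["Y"] * 10 + ["Z"] * 5
train = balance_by_undersampling(docs, labels)
print(len(train))  # 15: five examples per label
```

The trade-off the paper examines follows directly from this sketch: undersampling discards data from frequent labels, so per-label recall on rare labels improves at some cost to overall accuracy on the original skewed distribution.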
Original language: English
Title of host publication: Unknown host publication
Number of pages: 12
Publisher: Springer-Verlag
Publication date: Oct 2014
Status: Published - Oct 2014
MoE publication type: A4 Article in conference proceedings
Event: International Conference on Statistical Language and Speech Processing (SLSP 2014) - Grenoble, France
Duration: 14 Oct 2014 – 16 Oct 2014
Conference number: 2

Publication series

Name: Lecture Notes in Artificial Intelligence
Number: 8791

Fields of science

  • 113 Computer and information sciences
