Supervised Classification Using Balanced Training

Mian Du, Matthew Pierce, Lidia Pivovarova, Roman Yangarber

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › Scientific › peer-review


We examine supervised learning for multi-class, multi-label text
classification. We are interested in exploring classification in a
real-world setting, where the distribution of labels may change
dynamically over time. First, we compare the performance of an array of
binary classifiers trained on the label distribution found in the
original corpus against classifiers trained on balanced data, where
we try to make the label distribution as nearly uniform as possible. We
discuss the performance trade-offs between balanced vs. unbalanced
training, and highlight the advantages of balancing the training set.
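One way to build such a balanced training set is to undersample each class down to the size of the rarest class, yielding a near-uniform label distribution. The sketch below illustrates this idea only; the function and variable names are illustrative and not taken from the paper:

```python
import random
from collections import Counter, defaultdict

def balance_training_set(examples, seed=0):
    """Undersample so every label occurs equally often (uniform distribution).

    `examples` is a list of (text, label) pairs; this is a generic sketch,
    not the paper's exact balancing procedure.
    """
    rng = random.Random(seed)
    by_label = defaultdict(list)
    for text, label in examples:
        by_label[label].append((text, label))
    # Target size: the rarest class, so all classes end up equal.
    target = min(len(items) for items in by_label.values())
    balanced = []
    for items in by_label.values():
        balanced.extend(rng.sample(items, target))
    rng.shuffle(balanced)
    return balanced

# Toy corpus: 8 documents labelled "A", 3 labelled "B".
corpus = [("doc%d" % i, "A") for i in range(8)] + \
         [("doc%d" % i, "B") for i in range(3)]
balanced = balance_training_set(corpus)
print(Counter(label for _, label in balanced))  # each label appears 3 times
```

Oversampling the minority classes is the obvious alternative when discarding majority-class data is too costly.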
Second, we compare the performance of two classifiers, Naive Bayes and
SVM, with several feature-selection methods, using balanced training. We
combine a Named-Entity-based rote classifier with the statistical
classifiers to obtain better performance than either method alone.
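One plausible way to combine a Named-Entity-based rote classifier with a statistical one is a back-off rule: trust the rote lookup when a known entity is present, otherwise fall back to the statistical model. This is a hedged sketch of that scheme, with hypothetical names and data; the paper's exact combination rule may differ:

```python
def combined_classify(doc_entities, rote_table, statistical_clf):
    """Back-off combination: rote (gazetteer) decision first, statistical
    classifier as fallback. All identifiers here are illustrative."""
    for entity in doc_entities:
        if entity in rote_table:      # a known named entity decides the label
            return rote_table[entity]
    return statistical_clf(doc_entities)  # otherwise defer to the model

# Hypothetical example data:
rote = {"Acme Corp": "business"}
stat = lambda ents: "other"  # stand-in for a trained NB/SVM classifier
print(combined_classify(["Acme Corp"], rote, stat))    # -> "business"
print(combined_classify(["Unknown Ltd"], rote, stat))  # -> "other"
```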
Original language: English
Title of host publication: Unknown host publication
Number of pages: 12
Publication date: Oct 2014
Publication status: Published - Oct 2014
MoE publication type: A4 Article in conference proceedings
Event: International Conference on Statistical Language and Speech Processing (SLSP 2014) - Grenoble, France
Duration: 14 Oct 2014 - 16 Oct 2014
Conference number: 2

Publication series

Name: Lecture notes in artificial intelligence

Fields of Science

  • 113 Computer and information sciences
