Data, Analysis, and Standardization

Gabriella Rustici, Andreas Scherer, John Quackenbush

Research output: Chapter in Book/Report/Conference proceeding › Chapter › Scientific › peer-review

Abstract

‘Reporting noise’ is generated when data and their metadata are described, stored, and exchanged. Such noise can be minimized by developing and adopting data reporting standards, which are fundamental to the effective interpretation, analysis, and integration of large data sets derived from high-throughput studies. Equally crucial is the development of experimental standards, such as quality metrics and consensus data analysis pipelines, to ensure that results can be trusted, especially in clinical settings. This chapter reviews the initiatives currently developing and disseminating computational and experimental standards in biomedical research.
Original language: English
Title of host publication: Batch Effects and Noise in Microarray Experiments: Sources and Solutions
Number of pages: 15
Publisher: Wiley
Publication date: 2009
Pages: 215–230
ISBN (Print): 978-0-470-74138-2
Publication status: Published - 2009
MoE publication type: A3 Book chapter

Fields of Science

  • 112 Statistics and probability