Data, Analysis, and Standardization

Gabriella Rustici, Andreas Scherer, John Quackenbush

Research output: Chapter in book/report/conference proceeding › Chapter › Scientific › Peer reviewed


‘Reporting noise’ is generated when data and their metadata are described, stored, and exchanged. Such noise can be minimized by developing and adopting data reporting standards, which are fundamental to the effective interpretation, analysis, and integration of large data sets derived from high-throughput studies. Equally crucial is the development of experimental standards, such as quality metrics and a consensus on data analysis pipelines, to ensure that results can be trusted, especially in clinical settings. This chapter reviews the initiatives currently developing and disseminating computational and experimental standards in biomedical research.
Title of host publication: Batch Effects and Noise in Microarray Experiments: Sources and Solutions
Number of pages: 15
ISBN (print): 978-0-470-74138-2
Status: Published - 2009
MoE publication type: A3 Part of a book or another research book


  • 112 Statistics
