Data, Analysis, and Standardization

Gabriella Rustici, Andreas Scherer, John Quackenbush

Research output: Chapter in book/report/conference proceeding (Chapter, Scientific, Peer-reviewed)

Abstract

‘Reporting noise’ is generated when data and their metadata are described, stored, and exchanged. Such noise can be minimized by developing and adopting data reporting standards, which are fundamental to the effective interpretation, analysis and integration of large data sets derived from high-throughput studies. Equally crucial is the development of experimental standards such as quality metrics and a consensus on data analysis pipelines, to ensure that results can be trusted, especially in clinical settings. This chapter provides a review of the initiatives currently developing and disseminating computational and experimental standards in biomedical research.
Original language: English
Title of host publication: Batch effects and Noise in Microarray Experiments: Sources and Solutions
Number of pages: 15
Publisher: Wiley
Publication date: 2009
Pages: 215-230
ISBN (print): 978-0-470-74138-2
Status: Published - 2009
MoE publication type: A3 Part of a book or other research book

Fields of science

  • 112 Statistics
