‘Reporting noise’ is generated when data and their metadata are described, stored, and exchanged. Such noise can be minimized by developing and adopting data reporting standards, which are fundamental to the effective interpretation, analysis, and integration of large data sets derived from high-throughput studies. Equally crucial is the development of experimental standards, such as quality metrics and a consensus on data analysis pipelines, to ensure that results can be trusted, especially in clinical settings. This chapter reviews the initiatives currently developing and disseminating computational and experimental standards in biomedical research.
|Title of host publication||Batch Effects and Noise in Microarray Experiments: Sources and Solutions|
|Status||Published - 2009|
|MoE publication type||A3 Part of a book or other research book|
- 112 Statistics