‘Reporting noise’ is generated when data and their metadata are described, stored, and exchanged. Such noise can be minimized by developing and adopting data reporting standards, which are fundamental to the effective interpretation, analysis, and integration of large data sets derived from high-throughput studies. Equally crucial is the development of experimental standards, such as quality metrics and a consensus on data analysis pipelines, to ensure that results can be trusted, especially in clinical settings. This chapter reviews the initiatives currently developing and disseminating computational and experimental standards in biomedical research.
|Title||Batch Effects and Noise in Microarray Experiments: Sources and Solutions|
|Status||Published - 2009|
|OKM publication type||A3 Part of a book or other compilation|
- 112 Statistics