Today, next-generation sequencing, qPCR, microRNA profiling, and proteomics allow molecular biology researchers to generate massive quantities of multiple data types in less time.
For example, if a researcher is trying to understand the complex cause-and-effect relationships among differentially expressed genes, lethal or loss-of-function mutations can be challenging to identify systematically without very high-quality curated knowledge.
Obviously, data analysis depends on the research question, and statistical analysis may be considered the first step. However, researchers then have to filter down their massive quantities of initial data before comprehensive biological analysis, in order to extract the most interesting and relevant information from their experimental results.
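As an illustration only, a minimal sketch of that kind of first-pass statistical filter on a differential-expression results table is shown below; the file name, column names ("padj", "log2FoldChange"), and thresholds are hypothetical and would depend on the pipeline actually used.

```python
import pandas as pd

# Hypothetical differential-expression results table; column names
# here are illustrative, not tied to any specific pipeline.
de_results = pd.read_csv("de_results.csv")

# A common first-pass filter: keep genes with an adjusted p-value
# below 0.05 and an absolute log2 fold change of at least 1.
significant = de_results[
    (de_results["padj"] < 0.05)
    & (de_results["log2FoldChange"].abs() >= 1)
]

# Rank surviving genes by effect size for downstream work such as
# pathway enrichment or literature curation.
significant = significant.sort_values(
    "log2FoldChange", key=lambda s: s.abs(), ascending=False
)
print(significant.head(20))
```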
In your experience, how do you effectively sort through and make sense of these data for meaningful biological discovery?
What are your experiences with interpreting your experimental data and building a cohesive biological story from known supporting evidence and knowledge? What approaches have worked best for you personally?