After all the advances of bioinformatics in protein modelling, it feels like scientists still do not trust most of the data generated using in-silico techniques.
I don’t think any of the big problems in biology will be solved using computational approaches alone. I think most questions have been, and will continue to be, solved by a combination of theoretical (including computational) and experimental approaches.
So, broadly speaking, in science you collect data by doing experiments and then use those data to guess, or sometimes compute, a model of the system to which the data apply. Then, with that model in hand, you do more experiments. So you need to do both experimenting and modelling in order to solve any significant question.
In-silico science and experimental science contribute in roughly equal measure, 50/50.
An important issue with in-silico science is the accuracy of a model, or the quality of the data it produces, i.e. how reliable are the data produced by a model? How accurate is your model?
It depends on the quality of data, and it also depends on the quality of the modelling process.
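To make that concrete, one common way to quantify the accuracy of a structural model is to superpose it onto an experimentally determined structure and report the Cα RMSD. The sketch below does this with Biopython; the file names model.pdb and experimental.pdb are placeholders for this illustration, and it assumes the two structures share the same residue numbering.

```python
from Bio.PDB import PDBParser, Superimposer

# Parse the predicted model and the experimental reference structure
# (file names are placeholders for this sketch).
parser = PDBParser(QUIET=True)
model_structure = parser.get_structure("model", "model.pdb")
exp_structure = parser.get_structure("exp", "experimental.pdb")

def ca_atoms(structure):
    """Return the C-alpha atoms of the first model in a structure."""
    return [res["CA"] for res in structure[0].get_residues() if "CA" in res]

model_ca = ca_atoms(model_structure)
exp_ca = ca_atoms(exp_structure)

# Assumes both structures align residue-for-residue; real comparisons
# would need a sequence alignment first.
sup = Superimposer()
sup.set_atoms(exp_ca, model_ca)  # fixed = experimental, moving = model
print(f"C-alpha RMSD after superposition: {sup.rms:.2f} angstroms")
```

A low RMSD against an independent experimental structure is only one measure of model quality, but it illustrates how "how accurate is your model?" can be turned into a number.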
We need to combine in-silico and experimental science in order to solve biological problems.
Modelling can guide and thus reduce the experimental effort and provide a sense of direction to experimentalists.
A model is a hypothesis concerning the structure of a protein. Like any hypothesis it is only valuable if it leads to predictions that can be verified experimentally. In the Plückthun lab, we use in-silico design in close conjunction with in-vitro evolution and experimental verification. In this context, in-silico design is extremely valuable.
In many cases, it can give us good candidates. But prediction alone, without further verification, may be useful for producing a paper, but does nothing for research.
Hi Dr. Miguel de Abreu, your post really caught my interest. I hope your question will get more interesting comments.
To answer your question, I would say yes. In silico studies are very helpful to the research community if you have enough data. A researcher working on hypothetical proteins, or on a protein lacking sufficient data or a structure, will really face a difficult task. In that case, we compare the predictions we have with experimental data. In the same way, in silico studies are very helpful for supporting experimental data too.
An in silico result generated from limited evidence is just a hint for other researchers.
In a number of projects, I have been able to use protein modeling to sufficiently stabilize a protein or improve its folding efficiency. This enabled us to produce sufficient amounts of the protein to obtain either an X-ray or an NMR structure.
People are quick to forget that the DNA structure itself was discovered through modeling, only *using* available experimental data as input. This should be read as computational/structural modeling being a primary approach that can lead to valuable inferences from experimental data. It also highlights the primacy of thought experiments.
The DNA structure is a prime example of what a model should do: it yielded a plausible explanation for well-studied but puzzling experimental findings, such as Chargaff's rules:
A, T, C, and G were not found in equal quantities (as some models at the time would have predicted)
The amounts of the bases varied among species, but not between individuals of the same species
The amount of A always equalled the amount of T, and the amount of C always equalled the amount of G (A = T and G = C)
and for the semi-conservative replication of DNA. It fitted Rosalind Franklin's X-ray pattern, which suggested a two-stranded, helical structure from which the overall dimensions could be deduced. It has since been well verified by X-ray crystallography, although other conformations have been shown to exist for particular sequences and environmental conditions, such as A- and Z-DNA, as well as DNA quadruplexes.
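As a small aside, the parity noted above (A = T and G = C in double-stranded DNA) is trivial to check computationally. A minimal Python sketch, using a made-up duplex purely for illustration:

```python
from collections import Counter

def chargaff_check(double_stranded_seq):
    """Count bases and return the A/T and G/C ratios for a double-stranded sequence."""
    counts = Counter(double_stranded_seq.upper())
    return counts, counts["A"] / counts["T"], counts["G"] / counts["C"]

# Toy example: a short duplex written as the concatenation of both strands.
# The bottom strand is the complement of the top, so every A pairs with a T
# and every G with a C, and the parity holds by construction.
top = "ATGCGTATTAGC"
bottom = "TACGCATAATCG"
counts, at_ratio, gc_ratio = chargaff_check(top + bottom)
print(counts, f"A/T = {at_ratio:.2f}", f"G/C = {gc_ratio:.2f}")
```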
The prime advantage Watson and Crick had over competing players, such as Linus Pauling, was access to Rosalind Franklin's data, which allowed them to identify the correct model, although the way they obtained this data was questionable.
I think it needs "dry" and "wet" work combined. The dry part is theoretical computation, such as molecular dynamics simulations, molecular docking, and so on. The wet part is the experimental results.
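For readers unfamiliar with the "dry" side, the sketch below shows a minimal molecular dynamics run with OpenMM. It assumes you already have a pre-solvated structure in a file named protein_solvated.pdb; the file name, force field choice, and run length are placeholders for illustration, not a recommended protocol.

```python
from openmm.app import PDBFile, ForceField, Simulation, PME, HBonds
from openmm import LangevinMiddleIntegrator
from openmm.unit import kelvin, picosecond, picoseconds, nanometer

# Load a pre-solvated structure (placeholder file name for this sketch).
pdb = PDBFile("protein_solvated.pdb")

# Standard Amber force field with TIP3P-FB water.
forcefield = ForceField("amber14-all.xml", "amber14/tip3pfb.xml")
system = forcefield.createSystem(pdb.topology,
                                 nonbondedMethod=PME,
                                 nonbondedCutoff=1 * nanometer,
                                 constraints=HBonds)

# Langevin dynamics at 300 K; constraining bonds to hydrogen permits the 4 fs step.
integrator = LangevinMiddleIntegrator(300 * kelvin, 1 / picosecond, 0.004 * picoseconds)
simulation = Simulation(pdb.topology, system, integrator)
simulation.context.setPositions(pdb.positions)

simulation.minimizeEnergy()   # relax clashes before dynamics
simulation.step(10_000)       # a very short illustrative run (~40 ps)
```

Docking and longer production simulations follow the same pattern: a model and a force field stand in for the experiment, and the results still need wet-lab verification.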