People have to go around measuring things. There's no escape from that for most of this type of work. There's a deep relationship between the two. No one's going to come up with a model that works without going and comparing it with experiment. But it is the intelligent use of experimental measurements that we're after here, because that goes to this concept of Bayesian methods. I will perform the right number of experiments to make measurements of, say, the time-series evolution of a given set of proteins. From those data, when things are varying in time, I can map that onto my deterministic Popperian model and infer the most likely values of all the parameters that would fit into the model. It's an intelligent interaction between them that's necessary in many complicated situations.
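To make that Bayesian step concrete, here is a minimal sketch in Python. Everything in it is an illustrative assumption, not Coveney's actual pipeline: a one-parameter exponential-decay model stands in for the deterministic "Popperian" model, synthetic noisy data stand in for the measured protein time series, and the prior is flat. It only shows the mechanics of mapping time-series data onto a deterministic model to find the most likely parameter values.

    import numpy as np

    rng = np.random.default_rng(0)

    # Deterministic "Popperian" model: exponential decay of a protein level.
    # (The model form, rate, and noise level are illustrative assumptions.)
    def model(t, k, x0=10.0):
        return x0 * np.exp(-k * t)

    # Stand-in for the experiment: a noisy time series of measurements.
    t = np.linspace(0.0, 5.0, 20)
    k_true, sigma = 0.8, 0.3
    data = model(t, k_true) + rng.normal(0.0, sigma, size=t.shape)

    # Bayesian inference on a grid of candidate rates, with a flat prior,
    # so the posterior is proportional to the Gaussian likelihood alone.
    k_grid = np.linspace(0.01, 2.0, 500)
    log_post = np.array([-0.5 * np.sum((data - model(t, k)) ** 2) / sigma**2
                         for k in k_grid])
    post = np.exp(log_post - log_post.max())
    post /= post.sum() * (k_grid[1] - k_grid[0])   # normalize to a density

    k_map = k_grid[np.argmax(post)]                # most likely parameter value
    print(f"true k = {k_true}, inferred (MAP) k = {k_map:.3f}")

With a flat prior the most likely value coincides with the best least-squares fit; in practice the grid would give way to MCMC or another sampler once the model has more than a couple of parameters, but the logic is unchanged: run the deterministic model, score it against the measured time series, and read off the most probable parameters.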
INTRODUCTION
by John Brockman
There’s a massive clash of philosophies at the heart of modern science. One philosophy, called Baconianism after Sir Francis Bacon, neglects theoretical underpinnings and says: just make observations, collect data, and interrogate them. This approach is widespread in modern biology and medicine, where it’s often called informatics. But there’s a quite different philosophy, traditionally used in physics and formulated by another British knight, Sir Karl Popper. In this approach, we make predictions from models, test them, and then iterate our theories.
You might find it strange that in modern medicine many people don’t think in theoretical terms. It comes as a shock to many physical scientists when they encounter this attitude, particularly when it is accompanied by a conflation of correlation with causation. Meanwhile, in physics, it is extremely hard to go from modeling simple situations consisting of a handful of particles to the complexity of the real world, and to combine theories that work at different levels, such as macroscopic theories (where there is an arrow of time) and microscopic ones (which are indifferent to the direction of time).
At University College London, physical chemist Peter Coveney is using theory, modeling, and supercomputing to predict material properties from basic chemical information, and to mash up biological knowledge at a range of levels, from biomolecules to organs, into timely and predictive clinical information to help doctors. In doing this, he is testing a novel way to blend the Baconian and Popperian approaches, and he has already had some success when it comes to personalized medicine and predicting the properties of next-generation composites.
—JB
PETER COVENEY holds a chair in Physical Chemistry and is director of the Centre for Computational Science at University College London. He is co-author, with Roger Highfield, of The Arrow of Time and Frontiers of Complexity. Peter Coveney's Edge Bio Page.