It may seem strange to say it, but experts are rarely interested in getting at the truth, whatever it may be. What they want to do is prove that certain things are true. Which things? Well, whatever they happen to believe is true, for whatever reasons, or whatever will benefit their careers or status or funding the most. Hawkers of diet plans need their gimmicks to help people lose weight, golf pros need their tips to take strokes off their clients' games, relationship gurus need their insights to strengthen marriages––if there's evidence that their advice doesn't in fact pay off, don't expect to learn it from them….
It's a cornerstone of science, on the other hand, that researchers aren't supposed to favor particular outcomes in their studies. And yet Penn State's Kenneth Weiss and his colleagues have noted that the beliefs of researchers are shaped by "all of the vanities, vested interests, hunches, experiences, politics, careerism, grantsmanship tactics, competing cadres of collaborators, imperfections, and backgrounds of the scientists investigating problems at any time." If a scientist wants or expects to end up with certain results, he will likely achieve them, often through some form of fudging, whether conscious or not. Bias exerts a sort of gravity over error, pulling the glitches in one direction, so that the errors tend to add up rather than cancel out.

Francis Bacon noted in the late sixteenth century that preconceived ideas shape observation, causing people, for example, to take special notice of phenomena and measurements that confirm a belief while ignoring those that contradict it. Thomas Kuhn, the MIT science historian who famously gave the world the phrase "paradigm shift," argued in the early 1960s that what scientists choose to measure, how they measure it, which measurements they keep, and what they conclude from them are all shaped by their own and their colleagues' ideas and beliefs. And Berkeley's Robert MacCoun told me that once an expert jumps to a dubious conclusion, she'll simply tend to ignore or explain away conflicting evidence.
David H. Freedman, Wrong: Why Experts Keep Failing Us—And How to Know When Not to Trust Them, pp. 113–14
caveat utilitor; de omnibus dubitandum est ("let the user beware; everything is to be doubted")