Come Reason's Apologetics Notes blog will highlight various news stories or current events and seek to explore them from a thoughtful Christian perspective. Less formal and shorter than the www.comereason.org Web site articles, we hope to give readers points to reflect on concerning topics of the day.

Showing posts with label scientism. Show all posts

Friday, December 15, 2017

Why a Scientific Consensus isn't What it's Cracked Up to Be



A couple of years ago, the Internet blew up over a huge debate—one that captured the attention of popular culture and caused fierce disagreements between friends and family members. I am, of course, talking about the infamous "What color is the dress?" meme portrayed in the accompanying image. One can perceive the dress colors to be either blue and black or white and gold, and it seems for most people once you see the colors a certain way, you simply can't see them from the other perspective.

Now, imagine you want to buy a gift for your mother's birthday and your father had sent you that same picture with the recommendation that since he's buying her a dress, you should purchase the accessories. Would your purchases make sense? We don't know. It all depends on what you see and whether your perception matches reality. Even if the one buying the accessories had the most exquisite fashion sense and was gifted in picking out the most tasteful and appropriate accoutrements, it still matters what their perception of the dress colors was.

Scientific Consensus is Founded on Paradigms

I offer the thought experiment because it helps us to better understand how paradigms influence people. We all make choices based on a specific way of seeing things, and this is true in the sciences as much as in any other discipline. In fact, Thomas Kuhn gave the terms "paradigm" and "paradigm shift" their modern currency in his earthshaking book The Structure of Scientific Revolutions. Kuhn there demonstrates how scientific knowledge hasn't been acquired in a slow, steady, progressive line. That's a myth.

Kuhn states that what really happens is young scientists accept certain assumptions about how the world works because they've been taught them by those already in the field. He writes that the student, studying in whatever scientific discipline he will eventually practice,
joins men who learned the bases of their field from the same concrete models, his subsequent practice will seldom evoke overt disagreement over fundamentals. Men whose research is based on shared paradigms are committed to the same rules and standards for scientific practice. That commitment and the apparent consensus it produces are prerequisites for normal science, i.e., for the genesis and continuation of a particular research tradition.1 (emphasis added)
What this means is that scientists within a particular field of study all start with some basic assumptions and then they rely upon those assumptions to solve problems on details within that model. So, if one were to start with the paradigm that the dress is white and gold, then the answer to the problem of what kind of accessories would complement the dress will come out differently than if one were to hold the paradigm that the dress is blue and black.

The Consensus Can be Influenced by Outside Factors

If you are basing your accessory choices on the paradigm of a white and gold dress, and you find that the majority of those who you learn from and those you work with have also accepted this paradigm, you no longer ask about the color of the dress or whether white is a better color for a handbag than black. When someone comes into your fold and suggests black for a handbag, your reaction would be one of incredulity. Certainly any fool can see that black is the wrong color choice! You might even make fun of them and dismiss them as not doing good science. But what they've questioned is the paradigm you have assumed, not the reasoning about which color would fit if the paradigm were true.

Here's the thing, though. These paradigms themselves are frequently caused by factors beyond dispassionate science. Kuhn himself discovered this when investigating the Ptolemaic and Copernican ideas of the solar system. Ptolemy's paradigm was first formed by Aristotle, who held to a couple of very Greek ideas, one of which was that some bodies are naturally inclined to move in a circular pattern. In other words, planets by their nature would move circularly because that's what they do. Aristotle's assumption set the paradigm that worked for many centuries and allowed the scientists for those days to come up with accurate predictions.

It's much like another image that takes on conflicting perceptions. Look at the drawing of the animal I have here. Is this a drawing of a rabbit or a duck? Normally, you will perceive one or the other first. Interestingly, outside factors make a difference in what you see. The Independent reports "At different times during the year, the result of the test seem to change. During the Easter period, people are more likely to see a rabbit first but in October, seeing the duck first is more common."2

Aristotle's assumption on the nature of bodies moving in a circular pattern was based on Greek philosophy. Thus it was a philosophical commitment that shaped the science of planetary orbits, and our understanding of the nature of our solar system, for centuries. It was only when instruments became more sophisticated that flaws could be seen in the model. These flaws grew to the point of crisis until those within the community had to abandon their paradigm and adopt a new one. This is what Kuhn labels a paradigm shift.

The Consensus Can Be Wrong

Before a paradigm shift occurs, there is a scientific consensus about whatever point one is discussing. But even though a consensus exists, that doesn't mean those who oppose the consensus are wrong. They may in fact be right, but they are simply offering a different paradigm.

When you read about the contentious scientific issues of our day like the origin of life, man-caused climate change, and neo-Darwinian evolution, it won't be long before someone makes the claim that given a scientific consensus exists on topic X, anyone who holds a contrary view is anti-science. That's simply wrong. It may be that those who hold to the contrary position see the flaws and wish to question the paradigm itself. The bigger question thinking people need to ask is "what are the assumptions implicit in this position and have they been tested?" The question of the color of the dress can be answered if one enlarges the frame to see more of the picture. Doing this isn't anti-science but what Kuhn calls extraordinary science.

So let's not point to the idea of a scientific consensus as the final card in a debate. The consensus may be the very thing that needs to be questioned.

References

1. Thomas Samuel Kuhn, The Structure of Scientific Revolutions, 2nd ed., University of Chicago Press, 1970, 11.
2. Chloe Farand. "Duck or Rabbit? The 100-Year-Old Optical Illusion That Could Tell You How Creative You Are." The Independent, Independent Digital News and Media, 14 Feb. 2016, www.independent.co.uk/news/science/duck-or-rabbit-the-100-year-old-optical-illusion-that-tells-you-how-creative-you-are-a6873106.html.

Monday, January 30, 2017

Scientists Have Created Human-Pig Hybrids. So Now What?



It can happen during busy news cycles that some of the more important stories are missed. That may have been the case last week as scientists announced they had successfully created human-pig chimera embryos in what the BBC called a "first proof" of concept. According to the report, scientists were able to inject human stem cells into newly formed pig embryos and then implant the embryos into a sow, allowing them to grow for 28 days. They then looked to see whether the human cells were growing before destroying the embryos.

The ultimate goal in these tests is not to develop some kind of hybrid monster, but to be able to grow human organs in animals for eventual transplant to patients whose organs are failing. In this specific experiment, the embryos would be considered less than one-ten thousandth human. Still, it marks the first time functioning human cells have been observed growing inside a large animal, according to Juan Carlos Izpisua Belmonte of the Salk Institute, causing other researchers to describe the published findings as "exciting."

More Questions than Answers

I think this report is important for some very specific reasons. First, the idea of a human-pig chimera is shocking. It opens a lot of questions about humanity and our technical abilities. The report made it clear that this research is highly inefficient and would take many years to develop more fully. But the BBC report also noted this kind of research is "ethically charged" and offered a one-sentence disclaimer stating "There was no evidence that human cells were integrating into the early form of brain tissue." 1

The fact that our technology is progressing faster than our ability to place it in its ethical context is the biggest takeaway from the story. For those who hold to moral pragmatism—meaning the ends justify the means—it makes sense to do whatever one wants in order to achieve a desired goal. But pragmatism isn't real morality; it is a position tyrants have long used to excuse their actions.

How Science Cannot Account for Morality

More interesting to me is the fact that the chimera research is a clear example of how science cannot answer all the important questions. Prof. Belmonte, who was one of the researchers on the project, was clear that at this time they were not allowing the embryos to grow longer than one month, as that's all they needed to confirm the development of human cells in the pigs:
One possibility is to let these animals be born, but that is not something we should allow to happen at this point.

Not everything that science can do we should do, we are not living in a niche in lab, we live with other people - and society needs to decide what can be done.2
I'm glad to see a bit of caution in Belmonte's words. He's right to say that not everything science can do it should do. The statement really rebuts new atheists like Sam Harris, who famously argued in his book The Moral Landscape that science itself can be the foundation of morality. Well, here's the science. Where does the morality come from? How would Harris ground any kind of moral reasoning as to whether we should do what we can do in this instance? Should we mix human and pig cells even more? What if human and pig brain cells were combined?

Ultimately, this shows that science can only tell us what is possible, not whether it should be done. Moral reasoning must come from a moral lawgiver, not from the fact that something can be done. Otherwise, we'll all be left with a real Frankenstein's monster of moral values.

References

1. Gallagher, James. "Human-pig 'chimera Embryos' Detailed." BBC News. BBC, 26 Jan. 2017. Web. 27 Jan. 2017. http://www.bbc.com/news/health-38717930.
2. Gallagher, James. 2017.

Wednesday, September 14, 2016

Trusting in Science Alone will Starve Our Ability to Know



Every group has its biases. Enlightenment thinkers believed reason could provide the ultimate answer to all questions. The Victorians stressed common manners and proprieties. Both were helpful in some ways; manners provided a common framework for engaging with large populations pushed together as modern cities developed and reason is an appropriate way to seek understanding. But they shouldn't be practiced to the exclusion of other ways we understand.

Today, the dominant framework most people assume will provide answers and meaning is neither manners nor reason, but science. Atheists and "freethinkers" especially tend to hold to an over-confidence in science as the path to discovering truth. As an example, I wrote an article entitled "Three Intractable Problems for Atheism" where I pointed out that the origin of the universe, the origin of life, and the origin of consciousness are unexplainable if all that exists is matter following physical laws. One comment I received was "We don't know YET, because we've only just in the past century begun to seriously uncover the origins of the universe. If that day comes, and you don't like the answer, what will the next goalpost be?" What those who respond in such ways never say is why they think science is even the right discipline to answer these questions at all.

Fingers and Forks

In fact, science will never be able to answer these questions because it isn't designed to do so. Let me offer an example. Early cultures primarily used their fingers to eat their food. They would pick and tear at a piece of meat or tear off a hunk of bread. Even in Jesus's day, this was pretty common. But using your fingers has some drawbacks, too. If your hands are dirty, they can contaminate the food. You can't touch things that are too hot, and the buildup of greasy food on your hands means you'll need to wash after a meal.

That's why the fork is such a great invention. It solves health issues that accompany eating only with one's fingers. But it does more than that. It allows one to keep an item from moving so it can be cut, adjusting the size of your bite to fit you individually. It skewers smaller food items, like individual beans, that would be hard to grasp with your hands. It also reflects proper manners, providing a symbol of separation from animals.

Forks have given human beings a great step forward in our culinary history, allowing us to eat in ways we couldn't have without it. However, if the chef places a bowl of tomato soup in front of me, the fork is no longer useful. The benefits that the fork conveys when consuming solid food are the very reason it fails when applied to liquids. To close the tines of the fork so it may hold liquid would rob the fork of its unique abilities to skewer other foods. I need a different tool.

Now imagine a person from "the fork is the only way to true nourishment" camp who seeks to eat the soup with his fork. He tries to eat the soup and quickly becomes frustrated. He can dip his utensil in the soup for a long, long time, but he'll never get all the soup, and he'll probably burn more calories than he consumes in the attempt. At this result, he may then conclude that soup isn't really food at all.

Choosing the Right Utensil When Searching for Truth

Science is like a fork in humanity's quest for knowledge. It can do a lot of things. It has improved our health and allowed us to create new polymers. It has shown us facts about the material universe and its laws. But from where that universe and its laws originate, science cannot answer because it simply isn't designed to do so. It cannot tell us about things like consciousness since consciousness is immaterial.

When pressed, atheists usually try to escape this dilemma in one of two ways. Some claim science will get there eventually (what I call a Science of the Gaps argument), but that's just wishful thinking. Others, as they seriously consider what human consciousness entails—things like the capacity for free will on a purely materialist framework—begin to deny that things like consciousness and free will are real at all.

Science, like a fork, is useful in the hand of humanity. It can serve us well as we seek to cut into the mysteries of the universe and digest what we discover there. However, it shouldn't be the only tool on the table. To ignore other ways of consuming knowledge is to limit, not expand, our intellectual palate.

Wednesday, August 10, 2016

Neil deGrasse Tyson Violates Rationalia's One Principle



I find it fascinating how blinded people can be to their own biases. One recent case in point is astrophysicist Neil deGrasse Tyson and his imaginary country of Rationalia. Originally spawned by a single tweet, Tyson asserted "Earth needs a virtual country: #Rationalia, with a one-line Constitution: All policy shall be based on the weight of evidence."

It's pretty easy to see the glaring holes in such a proposition, and several commentators were quick to point out a few of them. Practices such as eugenics, abortions for disability or population control, legislating against an unnatural ice age, and other disastrous consequences would have easily followed if Tyson's dream had been a reality in prior decades. Several commentators for organizations like The Federalist, U.S. News and World Report, and New Scientist pointed out the foolishness in his original tweet.

However, Tyson doubled down on his proposition with a recent Facebook post. Linking to those articles before casually dismissing them out of hand, Tyson upped the ante for his proposition, maintaining that Rationalia would not only solve deep political divisions but would usher in a new era of prosperity for humanity:
Unlike what typically occurs between adversarial politicians, in scientific discourse, when we disagree with one another there's an underlying premise: Either I'm wrong and you're right; you're wrong and I'm right; or we're both wrong. The unfolding argument actually shapes the quest for more and better data to resolve the differences so that we can agree and move on to the next unresolved problem.

In Rationalia, the Constitution stipulates that a body of convincing evidence needs to exist in support of an idea before any Policy can be established based on it. In such a country, data gathering, careful observations, and experimentation would be happening all the time, influencing practically every aspect of our modern lives. As a result, Rationalia would lead the world in discovery, because discovery would be built into the DNA of how the government operates, and how its citizens think.1

The Competitive World of Scientific Theory

Of course, Tyson's Pollyanna-ish assumption that scientists are always objective about the data while politicians are simply adversaries is ridiculous. Thomas Kuhn's The Structure of Scientific Revolutions lays out just how nonsensical such an assumption is. Kuhn argues that the scientific study of a certain concept, such as the nature of light, can have "a number of competing schools and sub-schools"2 arguing for their own understanding even when they are all using the same data. Kuhn states:
"Each of the corresponding schools derived strength from its relation to some particular metaphysic, and each emphasized, as paradigmatic observations, the particular cluster of optical phenomena that its own theory could do most to explain. Other observations were dealt with by ad hoc elaborations, or they remained as outstanding problems for further research" (emphasis added).3
These are not detached, non-emotional observations. Scientists are people and each has a dog in the fight, so to speak. It isn't surprising that they would want to see their own theories succeed, just as politicians would want to see their own legislation pass. It isn't malicious, it's being human. And in modern research, when you add research grant money into the mix, there's a potent motivator to really push to justify one's efforts.

Paradigms and Flaws

Kuhn goes on to tell of other problems that plague scientific discourse, such as the fact that the "body of evidence" Tyson looks toward may itself be limited by the limits of technology. Scientists may not be able to see how their theories are flawed simply because they have to guess at what data they should measure and where to look for it. Maybe the instrument that proves their theory false hasn't yet been invented. Charles Darwin couldn't have realized the complexity of living cells since no microscopes in his day were capable of displaying the amazing molecular machinery that allows the cell to function.

This "body of evidence" that Tyson references may also be deeply flawed. Researchers at Open Science and at Bayer labs recently found that 65 to 75% or more of results reported in some of the most prestigious scientific journals could not be repeated. There was a strong body of evidence for the researchers' conclusions, but no one had previously bothered to check whether the evidence was good or not. In turn, we get biased policies such as the Food and Drug Administration's 60-year restriction on dietary fat, when it turned out the scientist pushing for the restrictions was more concerned with his legacy than the facts.

Some of the problem lies in the technicality and specialization of the scientific disciplines themselves. Kuhn notes that as one of the competing concepts gathers a majority, it becomes a consensus and ultimately a paradigm that closes out others.4 Then, as the field becomes more specialized, the paradigm is assumed by the practitioners and "his communiques will begin to change in ways whose evolution has been too little studied but whose modern end products are obvious to all and oppressive to many."5

Tyson Ignores This Body of Evidence

Kuhn's arguments are based on historical observation for how scientific paradigms have developed. He has quite a body of evidence from which to draw: the entire history of the scientific enterprise. Yet, Tyson seems to completely ignore this in his proposal for a country of Rationalia. I find that interesting. If Tyson won't even acknowledge the body of evidence against science being just as flawed as politics or other governing methods, then he is proving the very point his critics are making. Just because a scientist comes to the conclusion of X doesn't make it right, morally good, or unbiased.

References

1. Tyson, Neil deGrasse. "Reflections on Rationalia." Facebook.com, 7 Aug. 2016. Web. 10 Aug. 2016.
2. Kuhn, Thomas. "The Structure of Scientific Revolutions." The Philosophy of Science: An Historical Anthology. By Timothy J. McGrew, Marc Alspector-Kelly, and Fritz Allhoff. Chichester, U.K.: Wiley-Blackwell, 2009. 491. Print.
3. Kuhn, 2009. 491.
4. Kuhn, 2009. 492.
5. Kuhn, 2009. 492.
Image courtesy NASA Goddard Space Flight Center from Greenbelt, MD, USA (Dr. Neil deGrasse Tyson Visits NASA Goddard) [CC BY 2.0], via Wikimedia Commons

Thursday, February 18, 2016

Why Science Cannot Ground All Knowledge



Is science the best, most assured way of learning about reality? In the minds of more and more people, the answer is "yes." Yesterday, I highlighted a quote from scientist Peter Atkins on how he relies upon science to inform him about the world, dismissing even the consideration of God's existence as "lazy." But relying on science as the only arbiter of truth claims will never work, because science cannot function as one's starting point.

When explaining reality, everyone must have a starting point. For example, one may observe an event, such as a strike of lightning, and ask "what makes that happen?" A person may respond by describing how a storm cell moving across the land scrapes off electrons until the charge builds to such a degree that they rush back to the ground, which is a reasonable scientific answer. The first person would be justified in asking "how do you know that?" More conversations could ensue about the structure of atoms, experimental testing and predictions, etc. But each time, the questioner could ask for further justification for the facts being presented. Sooner or later, there must be a starting point for science.

Four Assumptions Scientists Must Hold

Assuming the questioner drives his respondent back further and further (i.e. "But, how do you know that?"), one will quickly see the scientific method relies upon several assumptions. The first is that the world will behave consistently. Scientists assume that because electrons have behaved in a certain way in the past, they will also do so tomorrow, and next week, and fifty billion years from now. Science cannot prove this; the scientist must assume it to make predictions.

Secondly, in order to draw any conclusions at all, scientists must assume logic takes us toward the truth. Without logic, one could never infer anything. How can one infer that any electron in the universe will behave in the same manner as the electrons creating the lightning strike if one cannot build an argument? The scientific method is really a logical argument that offers support for its premises by way of experimentation and concludes with its hypothesis either confirmed or denied. The scientist gives reasons for his conclusion!

Thirdly, the scientist must assume ethics are important. Much research today draws its conclusions not simply from its own findings but from prior research and publication. Falsifying data to arrive at the conclusion one wants is considered wrong. Even unintentional bias and flawed research methods can corrupt results. That's why there's a real concern that so much of what's being published in scientific journals is irreproducible. Without assuming ethical standards of truth-telling and the importance of solid methodology, scientific endeavors would be a confusing mishmash of conflicting data, with everyone's opinion held as equally valuable.

Lastly, the scientist must assume that his or her own mind is reliable in reporting how the world works. This is a key component to the scientific process and it also poses the biggest problem in cutting God out of the picture. If your brain is the product of mutations whose only benefit to its host is that of survival, then why should you trust it? Survival is not the same thing as truth-telling. In fact, lying can make survival much easier in many circumstances. As long as survival is the outcome, it doesn't matter whether you believe you need to run from the tiger because you're in a race or because it may eat you. If you get away, the same result is achieved. So, if we evolved from some primate species, why trust our "monkey-brains" to tell us the truth? How could one argue that a mindless, random process would even act in an orderly way?

God Grounds the Starting Points

Going back to our first point, one must assume some intentional ordering of the universe in order to ground the assumption of a consistent universe. Christianity teaches that God is a consistent God. He would create his universe in such a way that it would be consistent as well. This gives us a reason to believe in the consistency of the universe, a reason which science cannot offer. Scientists certainly assume the universe is consistent in its laws, but they have no basis for doing so other than that's what they've seen. But even our dreams have an air of consistency to them until we wake up. Then we realize how inconsistent they are.

Secondly, in the assumption of logic, God also becomes the starting point. If God is the logos—that is Reason itself—then logic and reason are built into the universe as reflections of his nature. Logic works because God is a logical God and we, as rational creatures, bear his image. Thus, we can understand and use reason to discover truths about the created order.

Thirdly, morality must have its grounding in God. The concept of classifying things as right or wrong is central to theology. One cannot have absolute standards of right and wrong without appealing to a being who transcends all of creation. That is God.

Lastly, the fact that a God of reason created us with the capacity to reason gives us grounds for trusting that capacity itself. As part of God's created order, we can experience it in meaningful ways.

Science is a wonderful tool that tells us much about a very small slice of reality: the natural world. But the world is much bigger than its mechanics. Logic, ethics, aesthetics, relationships, mathematics, abstract concepts, and spiritual realities also comprise our lives and our experiences. Not only can science not explain these things, it must assume them before it gets going. It cannot explain its own assumptions, and therefore shows its incapacity for being the proper starting point.

Image courtesy Longlivetheux - Own work, CC BY-SA 4.0

Wednesday, February 17, 2016

The Unvarnished Bias of Scientism



Recently, an episode of the Unbelievable? program featured a discussion on whether it is reasonable to claim that advances in science somehow undermine the existence of God. It pivoted on the assertion that science and religion are somehow opposed to one another. In other words, once certain scientific explanations for some observed phenomena are found, it removes the need for God to "do that."

The show pitted mathematician and physicist Dr. David Glass against Oxford Emeritus Professor Peter Atkins and humanist James Croft. Peter Atkins is a physical chemist and a primary example of what it means to believe not simply in science, but in scientism.

Atkins over and over again characterizes any appeal to a divine intelligence for explaining why things are the way they are as "lazy." It's as though the more he repeats the charge, the more believable he thinks it becomes. Then, at about the 44:52 mark, he offers this statement:
I'm just taking the world as it seems to me, from an utterly unprejudiced point of view. Lying here, looking at the evidence, assessing the evidence, accepting that this purported alternative explanation has arisen from sentiment, misogyny, power, hegemony, you name it… fear of personal annihilation, manipulation. All those things don't convince me that it's a better explanation.
Is this really the viewpoint of someone who holds an "utterly unprejudiced point of view?" Such a claim is farcical on its face. This isn't a one-off comment, either. Earlier in the program, he explained why he rejects theism as holding any sort of explanatory power:
I accepted right at the beginning that you can't disprove the existence of God, because as James [Croft] said, it's such a slippery and ill-defined concept. But what you can do is to understand how people came to believe that "God did it." That is, it's driven by sentiment, fear of personal annihilation, and cultural pressures, and history, and power grabbing, and all the things that go into religious belief. But if you discard those and you're left with trying to understand a mechanism by which the world works, a mechanism how it came into existence, then the only answer is through the scientific method, which is a procedure that depends upon evidence and setting theories into a whole network of understanding.
During the conversation, Glass queries Atkins and asks him how he proposes to use science to explain things like objective moral values, mathematics, and logic. Atkins retorts that ethics indeed can be explained via evolutionary survival principles, thus completely missing the distinction between functional outcomes and moral reality.

Who's Lazy Now?

Atkins' dodge should be noted. He cannot discuss how science claims to account for mathematics or the laws of logic. That's because it is impossible to do so, for science must assume these things before it can even start.

Even leaving all that aside, any person who is even half-interested in the truth will recognize that Atkins is anything but unbiased when trying to understand how beliefs are formed. His vitriolic mischaracterization that all cultures across all societies throughout history came to the conclusion of a creator for the material world because of sentiment, power, and hegemony is shameful. Has Atkins bothered at all to look into this matter? Why doesn't he acknowledge that world-class scientists like Francis Collins, who is doing top-notch work, would be deeply offended at such a characterization?

Atkins' statements do serve a purpose. They function as evidence for only one conclusion: Atkins is the one corrupted by bias. He's the lazy one who isn't interested in seeking answers. He simply wants to throw insults, and his opinions on this issue can be ignored.

Monday, January 11, 2016

It's Crippling to Believe Only in Science



I've written several times on how today's culture holds an over-inflated view of science. Science is a great tool that helps us learn about one very specific subset of knowledge: the mechanics behind the natural world. It cannot tell us about other crucial matters, such as what constitutes knowledge, what makes a relationship meaningful, or how to stop people from being evil. Given its limited scope, then, science is of correspondingly limited value.

This isn't to say the study of science is of no value, or of only marginal value. Some of our gravest problems do come from mechanical interactions; illness is one example. But it is wrong to think that the claim "science says so" should end a discussion. With politically contentious and highly complex issues, such as how modern humanity may be affecting the climate, a large degree of caution is warranted.

The fact is, science doesn't always get it right. Thomas S. Kuhn explained that scientific advancements come not in a pattern of smooth upward growth, but in a herky-jerky set of fits and starts, as those holding to old paradigms are hesitant to give up their views. Even if there is a strong consensus of opinion on some particular point, scientists are still people, and people are capable of being wrong and of being persuaded by others who are also wrong.

Here are just a few areas where claims based on accepted science were either rushed, fraudulent, or simply wrong:
  • AETHER: Aether was believed to be an element permeating the universe. The view was held by a consensus of scientists for many centuries, including names such as Isaac Newton, Thomas Young, James Clerk Maxwell, and Lord Kelvin. In the 19th century, more and more scientists held to the theory of luminiferous aether as the all-encompassing medium through which waves of light traveled. So strongly was the theory held that published student reference works would claim: "The cumulative evidence for thinking space filled with a ponderable medium of exceedingly minute density grows stronger every day."1

    However, the entire enterprise, along with its many well-thought-out explanations of how our universe works, was completely overthrown after an 1887 experiment failed to detect the aether2 and Einstein showed the medium wasn't necessary. The theory is now considered scientifically obsolete, though as the 1914 student reference work demonstrates, it took decades to be completely abandoned.
  • PILTDOWN MAN: Paleontologists in Britain announced Piltdown Man in 1913 as one of the "missing links" between ape and man. The find was generally accepted for years, but in 1953 Piltdown "man" was exposed as a forgery. The skull was from a modern human, and the teeth on the ape's jaw had been filed down.3
  • ACADEMIC FRAUD: The US National Institutes of Health investigatory panel found the immunologist Thereza Imanishi-Kari had fabricated data in a 1986 research paper authored with the Nobel prize winner David Baltimore. The findings claimed in the paper promised a breakthrough for genetic modification of the immune system.4
  • N-RAYS: A French physicist, RenĂ© Blondlot, claimed to have discovered a new type of radiation, shortly after Roentgen had discovered X-rays. American physicist Robert Wood, however, revealed that N-rays were little more than a delusion. Wood removed the prism from the N-ray detection device, without which the machine couldn't work.5
None of these events shows that all of science is corrupted or questionable, but together they illustrate that science has no claim on being the only way to really know something. That's why anyone who says they only believe in "science" has crippled him or herself before the search for truth has even begun.

Evolutionist Stephen Jay Gould said, "Scientists cannot claim higher insight into moral truth from any superior knowledge of the world's empirical constitution."6 That is why I always take exception when, in conversation, an atheist claims to "only believe in science."

References

1. Beach, Chandler Belden, Frank Morton McMurry, and Eleanor Atkinson. "Ether." The New Student's Reference Work: For Teachers, Students and Families. Vol. II. Chicago: F.E. Compton, 1914. Online. https://en.wikisource.org/wiki/The_New_Student%27s_Reference_Work/Ether
2. "Michelson–Morley Experiment." Wikipedia. Wikimedia Foundation, 23 Nov. 2015. Web. 11 Jan. 2016. https://en.wikipedia.org/wiki/Michelson%E2%80%93Morley_experiment.
3. "Piltdown Man." Natural History Museum. The Trustees of the Natural History Museum, London, n.d. Web. 11 Jan. 2016. http://www.nhm.ac.uk/our-science/departments-and-staff/library-and-archives/collections/piltdown-man.html.
4. Research Integrity Adjudications Panel. "Thereza Imanishi-Kari, Ph.D., DAB No. 1582 (1996)." Departmental Appeals Board, Department of Health and Human Services, 21 June 1996. Web. 11 Jan. 2016. http://www.hhs.gov/dab/decisions/dab1582.html.
5. Carroll, Robert Todd. The Skeptic's Dictionary: A Collection of Strange Beliefs, Amusing Deceptions, and Dangerous Delusions. Hoboken, NJ: Wiley, 2003. 63. Print.
6. Gould, Stephen Jay. "Nonoverlapping Magisteria." Natural History 106 (March 1997): 16-22. Reprinted by The Unofficial Stephen Jay Gould Archive. 1998. Web. 11 Jan. 2016.

Wednesday, January 06, 2016

Questioning Our Over-Reliance on Science (video)



Recently, I got to sit down with the One Minute Apologist, Bobby Conway, to discuss several topics. One item that came up was our culture's over-emphasis on science as the last word in knowledge. The role of science does seem to be misunderstood these days, with people giving it more credence than it may deserve.

Interestingly, John Cleese of Monty Python fame also recently tweeted:
Cleese went on to offer a couple of other tweets, which could be viewed in different ways, although folks like John Prager at AddictingInfo felt Cleese was slamming "anti-science conservatives." I don't know if that was Cleese's intention. However, I do know that in this humorous video from his podcast, he seemed to make fun of those who would place an over-emphasis on science and scientists.

Of course, taking that tweet as it stands, Cleese is right. Science is only one method we use to know about the world and it is a fairly limited one at that. That's what I was able to explain in this short clip with Bobby Conway. You can watch it here:


For more detail on these ideas, check out my previous articles here, here, and here.

Wednesday, April 22, 2015

Biology Cannot Account for Personhood


What makes a person? A New York judge has caused a lot of confusion on that score in the last couple of days. After hearing arguments by representatives of the Nonhuman Rights Project, who are seeking the "freedom" of two chimps held at the Stony Brook University lab, Manhattan Judge Barbara Jaffe issued a writ of habeas corpus, which, according to Science magazine reporter David Grimm, who has been covering the case, "typically allows human prisoners to challenge their detention."1 The action would have been the first time non-humans were recognized as legal persons. However, Jaffe quickly amended her court order, striking out the phrase "writ of habeas corpus," according to updates of the story.2

Are chimps persons? What defines personhood? Groups like Planned Parenthood have gone out of their way to make sure that unborn children are not defined as persons. They try to justify that claim by pointing to limitations such as the inability to have complex thoughts or incomplete brain development. Those kinds of limitations are supposedly what keep unborn children from being seen as persons. Yet the chimps at Stony Brook University will never have the capacity for abstract reasoning. They may feel pain, but they will never be able to internalize the concept of pain as an idea in and of itself. So why should people petition for the recognition of chimps as legal persons when the argument can be made much more persuasively that human fetuses are human persons?

Reducing People to Biological Machines

Much of the confusion over what properties define personhood stems from the shortsightedness of relying on science to answer such questions. Science has been a great tool and has helped us understand things like human development in utero. It has also shown us that there are similarities in the way certain biological processes function in both humans and animals. We share more of these similarities with some animals, such as chimps and apes, than we do with others, such as spiders or earthworms. But is a description of the machinery of our bodies all that's required to determine personhood, or is there something more?

I think there is. In fact, biology isn't the necessary component of personhood at all. What if a human being were kept alive not by his or her biology but purely by mechanical processes? If someone has multiple artificial components surgically implanted, does that make them less a person than someone without the implants? Of course not. Even if we could one day replace all of that individual's body with machines, it wouldn't change the personhood of the individual.

Personhood is Immaterial

It isn't the biology that matters in the question of personhood. It is the fact that persons share certain non-physical attributes, such as the ability to love, to reason, to recognize other persons as persons and to have communion with God. Those are what make a person a person. Basically, we reflect certain attributes of God, attributes that are immaterial. I want to be clear here, though. I am not saying that these attributes need to be active for personhood to obtain. If that were the case, those under anesthesia or in a coma would not be considered persons. It would disqualify some with significant mental disabilities.  Rather, personhood recognizes the being as having the potential for these kinds of things, even if they aren't fully realized.

Peter Kreeft sums it up appropriately:
The reason we should love, respect, and not kill human beings is because they are persons, i.e., subjects, souls, "I's", made in the image of God Who is I AM. We revere the person, not the functioning; the doer, not the doing. If robots could do all that persons can do behaviorally, they would still not be persons. Mere machines cannot be persons. They may function as persons, but they do not understand that they do not have freedom, or free will to choose what they do. They obey their programming without free choice. They are artifacts, and artifacts are not persons. Persons are natural, not artificial. They develop from within (like fetuses!); artifacts are made from without.3
As long as the broader culture looks to biology to try to define personhood, confusion will continue. Personhood is something bigger than biology. We need to expand our thinking to include the non-physical aspects of what makes each of us persons, lest we lose the concept of personhood altogether.

References

1. Grimm, David. "Updated: Judge's Ruling Grants Legal Right to Research Chimps." Science Insider. American Association for the Advancement of Science, 23 Apr. 2015. Web. 22 Apr. 2015. http://news.sciencemag.org/plants-animals/2015/04/judge-s-ruling-grants-legal-right-research-chimps.
2. Calamur, Krishnadev. "N.Y. Judge Amends 'Habeas Corpus' Order For Chimps." NPR. NPR, 22 Apr. 2015. Web. 22 Apr. 2015. http://www.npr.org/blogs/thetwo-way/2015/04/22/401519113/n-y-judge-amends-habeas-corpus-order-for-chimps.
3. Kreeft, Peter. "Human Personhood Begins at Conception." Peter Kreeft. Peter Kreeft, n.d. Web. 22 Apr. 2015. http://www.peterkreeft.com/topics-more/personhood.htm.

Wednesday, February 25, 2015

Answering the Bias Objection

There's a concept held by many today that neutrality is to be valued when discussing important ideas or events. It pops up in conversations as diverse as abortion, the reliability of the Gospel accounts, and the debate over creation versus evolution. The claim that holding a particular position makes one "biased" and therefore unqualified to objectively weigh a matter is widely assumed, but it's completely mistaken. While biases can lead people to ignore or deny certain facts, biases are absolutely necessary to be an informed human being.


What is a bias?

Just what is a bias? The word has become associated with the concept of prejudice or, as Wikipedia puts it, the inclination to "hold a partial perspective, often accompanied by a refusal to consider the possible merits of alternative points of view."1 Yet that's not the only definition of bias. Bias can be any leaning or predisposition towards a point of view, as the Oxford English Dictionary definition notes.2 In other words, anyone who leans towards one position over another in any field will have some kind of bias. But that isn't a bad thing. For example, Jonas Salk believed that the same approach used to develop an influenza vaccine could be applied to polio, even though prior polio vaccination attempts had been disastrous, causing paralysis and even death in some who had taken them.3 Salk assembled a team and worked for seven years creating a killed-virus version of the vaccine that ultimately proved hugely successful, and it was Salk's bias towards the vaccine method that drove him to keep trying.

It makes sense that bias would be necessary for advancement in a field like medicine. It is simply unreasonable to expect a person who has spent years in study and research to remain neutral and uncommitted about his or her specialty. We expect experts to have some bias towards certain theories or procedures. Bias in this sense is a good thing. As C. John Collins puts it:
Not every bias distorts: some biases can help us decide ahead of time what's worth paying attention to and what is not… I am biased against the possibility that the number of puppies in a litter has anything to do with the number of legs the father has, so I would never pay anyone money to study what the relationship is.4

The myth of being "bias-free."

Of course, the corollary to the "bias is always bad" myth is that there are certain disciplines that are somehow bias-free. Folks assume that journalistic standards or the scientific method can provide unbiased observations about the world. This simply isn't true either. I've written before about how one man's bias became scientific dogma that we are only now finding to be false. His persistence influenced other scientists, and his bias was accepted as the scientific consensus, shaping national dietary guidelines and doctor recommendations for some fifty years. That's just one example. In any experiment, one cannot measure every aspect of a scenario, so scientists look to measure the "relevant" factors and exclude any "irrelevant" ones. But it is one's prior biases, as with Collins' puppy litter example above, that shape what one considers relevant. Thus, he notes, "Some biases can distort: people who think that all human behavior can be explained by our genes have a bias that blinds them to moral realities. So, we cannot promise that science is without bias; and we have to assess—by critical thinking—whether that leads to sound or unsound conclusions."5

Looking for the truth value

So, bias is not the determining factor in finding out truth. Some biases, like Salk's, help us discover new things. Others are unwarranted and lead us away from the truth. The big question is the one Collins asked: can we use our critical reasoning to weigh these things and determine whether the biases are appropriate or simply prejudice? That means examining the facts, something that tends to be missing from the conversations of those who seek to shut you down with the simplistic objection "you're just biased."

References

1. "Bias." Wikipedia. Wikimedia Foundation, 23 Feb. 2015. Web. 25 Feb. 2015. http://en.wikipedia.org/wiki/Bias.
2. "Bias." Oxford English Dictionary. Oxford University Press, Jan. 2005. Web. 25 Feb. 2015. http://www.oed.com/view/Entry/18564?rskey=S5Ld2w&result=1#eid.
3. Brodie, M., and W. H. Park. "Active Immunization Against Poliomyelitis." JAMA: The Journal of the American Medical Association 105.14 (1935): 1089-093. Web. 25 Feb. 2015. http://jama.jamanetwork.com/article.aspx?articleid=1154662.
4. Collins, C. John. Science & Faith: Friends or Foes? Wheaton, IL: Crossway, 2003. Print. 30.
5. Collins, 2003. 31.
Image "Research Bias"  courtesy Boundless.com and licensed under CC BY-SA 4.0 with attribution required. 

Tuesday, February 24, 2015

What is Science, Anyway?

What is science? That may seem like a simplistic question, but the answer is neither easy nor unimportant, especially in this day and age. We live in an era where the scientist is assumed to hold the answers to a wide diversity of questions, even those that are not scientific. Michael Shermer just published an article in which he credits "scientific thinking" for human moral progress since the Enlightenment.1 I've had people ask me to prove God's existence scientifically, and of course discussions on the creation of the universe or the emergence of life on earth put science right in the middle of the debate.



Given how modern society places its nearly unquestioning trust in science, it's easy to see why someone would seek to dismiss God's existence or intelligent design with a wave of a hand and the claim of "that's not science." But just what is science, then? As a recent video by Stephen C. Meyer (included below) points out, science has been notoriously difficult to define. Let's take a look at some definitions of what supposedly qualifies something to be science.

Collecting data through observation

One of the more common definitions of science pivots on how one goes about gathering evidence for a hypothesis. C. John Collins writes, "I remember being taught as a boy that 'science' is, at its simplest, collecting data from observations of the world, and then organizing those observations in a way that leads to a generalization called a 'law.'"2 Meyer states in the video that "If a theory is going to be scientific, it must not invoke unobservable entities." Yet, as he then notes, the entire field of theoretical physics currently deals in objects and concepts that are by definition unobservable. No one can see quarks. Quantum vacuums are unobservable. Does that mean Stephen Hawking and those in his field should not be considered to be "doing science" when they invoke such causes?

The criteria of falsifiability

A second definition is one that philosopher of science Karl Popper made famous: the criterion of falsifiability. Yet falsifiability is really the other side of the observability coin. Popper, who had a "teenage flirtation with Marxism,"3 noted that Marxist explanations of history conformed with observed facts, such as the greater economic influence of the lower classes. However, competing economic models used the same set of historical data to fit their explanations as well. Later, Popper found that Freud's theory of psychoanalysis was too capable of explaining every situation. There was never a situation in which Freud's theories would be shown to be false; every circumstance could be justified in some way. Thus Popper came to the conclusion that a theory is scientific only if there's a way to prove it false.4 The Stanford Encyclopedia of Philosophy sums it up this way:
If a theory is incompatible with possible empirical observations it is scientific; conversely, a theory which is compatible with all such observations, either because, as in the case of Marxism, it has been modified solely to accommodate such observations, or because, as in the case of psychoanalytic theories, it is consistent with all possible observations, is unscientific.5
The problem here, though, is similar to the one above. If certain fields of study deal in the unobservable, how can anyone observe their falsification? Modern evolutionary theory posits mutations and intermediate forms that, as Meyer points out, are unobservable. We cannot see into the past, and there is no way to know that one fossil is a transition from another; those are all inferences. Therefore, by this criterion, Neo-Darwinian theories are not based on science, but are (as Popper labeled such theories) pseudo-science.
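Popper's criterion lends itself to a toy sketch. To be clear, this is my own illustration, not Popper's formalism or anything from the Stanford Encyclopedia: model a theory as a predicate over conceivable observations, and call it falsifiable only if at least one observation would contradict it.

```python
# Toy model of falsifiability: a "theory" is a predicate over possible
# observations; it is falsifiable if some observation would contradict it.

def is_falsifiable(theory, possible_observations):
    """True if at least one conceivable observation is incompatible with the theory."""
    return any(not theory(obs) for obs in possible_observations)

observations = ["swan is white", "swan is black"]

# "All swans are white" forbids a black swan, so it is falsifiable.
all_swans_white = lambda obs: obs != "swan is black"

# A theory consistent with every possible observation forbids nothing.
explains_everything = lambda obs: True

print(is_falsifiable(all_swans_white, observations))     # True
print(is_falsifiable(explains_everything, observations)) # False
```

On this toy model, a theory that accommodates every circumstance, as Popper said of Freud's, comes out unfalsifiable, which is exactly his complaint.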

The truth-value of a proposition

All of this discussion of what makes something science is valuable, but it isn't the most important thing we need to worry about. We should be concerned first with whether or not something is true. As I've previously written, science is not the only way we know things. It isn't even the best way to know certain things. Meyer makes the same point in the video:
I don't care whether intelligent design is considered to be science or not. That is not the most important question. That is a semantic question. The most important question is whether it is true, or whether it is likely to be true, or most likely to be true given the evidence we have. What people have done to avoid answering that most important question is repair to these semantic arguments: "Intelligent design is not science; therefore we don't have to consider the case for it." I don't think that follows.
Watch the whole thing here:


References

1. Shermer, Michael. "Are We Becoming Morally Smarter?" Reason.com. Reason Foundation, 17 Feb. 2015. Web. 24 Feb. 2015. http://reason.com/archives/2015/02/17/are-we-becoming-morally-smarte/.
2. Collins, C. John. Science & Faith: Friends or Foes? Wheaton, IL: Crossway, 2003. Print. 30.
3. Thornton, Stephen. "Karl Popper." Stanford Encyclopedia of Philosophy. Stanford University, 13 Nov. 1997. Web. 24 Feb. 2015. http://plato.stanford.edu/entries/popper/#BacHisTho.
4. Thornton, 1997.
5. Thornton, 1997.
Image courtesy GeoffAPuryear and licensed under CC BY-SA 3.0.

Friday, February 20, 2015

A Lot of Hand-Waving by Evolutionists

In college, I was a physics major. In physics, we sought to provide precise answers to specific questions. For example, we know that a car must rely on a certain amount of friction to turn a corner, and we want to know how to calculate that friction so we can set safe speed limits on curves. In chemistry, we seek to know just what is happening when iron rusts or an acid and a base are combined. In medicine, we seek to know exactly why someone suffers from sickle cell disease. Doctors have traced the problem back to a single DNA point mutation which changes the coding of a single amino acid.1 This is pure science: seeking an answer while examining the details.
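The sickle-cell case can be made concrete. The mutation is a single A→T substitution in the sixth codon of the beta-globin gene, which changes GAG (glutamic acid) to GTG (valine). Here is a minimal Python sketch; the two-entry codon table covers only the codons involved, not the full genetic code.

```python
# Minimal sketch of the sickle-cell point mutation. Only the two relevant
# entries of the genetic code are included.
CODON_TABLE = {"GAG": "Glu", "GTG": "Val"}

def translate(codon):
    """Return the amino acid encoded by a DNA codon (coding-strand notation)."""
    return CODON_TABLE[codon]

normal = "GAG"                           # codon 6 of the beta-globin gene
mutant = normal[:1] + "T" + normal[2:]   # the single A -> T substitution

print(translate(normal), "->", translate(mutant))  # Glu -> Val
```

One changed letter out of three billion in the genome, one changed amino acid, and the resulting hemoglobin misbehaves; that is the level of detail this kind of science delivers.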



Of course, not all science can be done in this way. There are fields such as plate tectonics that take observed data and use them to create models of how the different plates of the earth's crust will affect each other. Still, these models attempt to be very specific in just what is moving and how, and it's this specificity that makes the difference in the explanatory value of any theory. The devil's in the details, as it were.

What's Needed to Make a Whale

Yet, when I talk with proponents of evolution, the discussions are different. Yesterday, I engaged in another exchange with a proponent of evolution. I asked him again to provide a definition, to which he replied, "Evolution Is Change in the Inherited Traits of a Population through Successive Generations" (borrowing the title from this web site). But, as I wrote yesterday, that's not a very useful definition. Just because things change doesn't mean we get new biological systems. Men can be four feet tall or seven feet tall, but not 12 inches tall or 12 feet tall. Those who inherit the sickle cell trait are resistant to malaria, but their children are at risk of a painful life and an early death. Even here, the inherited resistance isn't a new feature, but a crippling of a functioning system.

So, I asked for specifics. I offered the humpback whale as an example. Supposedly, the whale evolved from a land mammal over the course of about 10 to 12 million years.2 While one may try to explain the size increase by simple growth over time (although a recent article in the journal Science says that such an explanation fails), there is still a huge number of systems that must be changed for a land-dwelling mammal to become an ocean-dwelling one. The nose must migrate to the top of the head and become a blowhole. Breathing is no longer automatic but must take conscious effort. Walking limbs must be transformed into flippers. The kidneys must be changed to handle the intake of salt water. The testes must be relocated inside the body, which in turn requires a way to keep them cool. The young must be able to nurse under water, and on and on.3

How Much Change Does that Take?

You may imagine that changing one body part into something different, like a nose into a blowhole, would take quite a bit of DNA rearrangement. These morphological changes not only have to happen, they have to happen together, for a blowhole isn't going to help if breathing remains involuntary; the animal will simply drown. Even more problematic, the vast number of changes to the DNA must happen within that relatively short window of 10 to 12 million years, because that's what the fossil record shows. If whales came from the land mammal Pakicetus, then, using the traditional dating of the fossils found, all these changes must take place within what would be, on an evolutionary timeline, a very brief span.

Thus, my question to my interlocutor was simple: how quickly would the mutations to the DNA have had to happen to produce all of the necessary changes to get the whale from its supposed ancestor? Does any natural selection and genetic mutation that we observe now correlate to those changes? One must remember that we aren't talking about bacteria, which reproduce very quickly and have very large populations. A mammal like Pakicetus was about the size of a large dog, which means it might reproduce only a year or two after reaching maturity and produce just a few litters. With a smaller population and more time between generations, evolution via mutation is made even more difficult.
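To put a rough number on how brief that span is in generations, here is a back-of-envelope sketch. The two-year generation time is an illustrative assumption of mine for a dog-sized mammal, not a figure from the sources cited:

```python
# Back-of-envelope: how many generations fit inside the fossil-record window?
# Both numbers are illustrative assumptions, not measured values.
window_years = 10_000_000        # low end of the 10-12 million year window
generation_time_years = 2        # assumed for a dog-sized mammal like Pakicetus

generations = window_years // generation_time_years
print(generations)  # 5000000
```

Roughly five million generations, on these assumptions, is the entire budget within which every one of those coordinated changes would have to arise and spread through the population.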

So, what's the model? Where's the math? What's the actual number of beneficial mutations per generation posited to make this kind of transition? I was met with nothing but obfuscation. It was all hand-waving and talk of me supposedly ignoring "hundreds of years of hard scientific work." This is what I find all the time in discussions of evolution. Everyone claims it's an established fact, but no one offers the details. Dawkins speaks of cells that morph into light-sensitive ones, then become aligned, until eventually we get the human eye.4 However, he never tells us which genes changed, how many would have needed to change, or how fast the changes would have had to occur. It's a just-so story with no numbers behind it. Modern proponents seem a little too light on the details to claim evolution is the same kind of knowledge as the earth revolving around the sun.5 We have the formula for gravity; we have silence regarding the genetic changes.

References

1. "Sickle Cell Disease." Genetics Home Reference. National Library of Medicine, Aug. 2012. Web. 11 Feb. 2015. http://ghr.nlm.nih.gov/condition/sickle-cell-disease.
2. "Going Aquatic: Cetacean Evolution." PBS. PBS, 21 Mar. 2012. Web. 19 Feb. 2015. http://www.pbs.org/wnet/nature/ocean-giants-going-aquatic-cetacean-evolution/7577/.
3. Andrews, Max. "Darwinian Whale Evolution." Sententias. Sententias, 06 Feb. 2012. Web. 20 Feb. 2015. http://sententias.org/2012/02/06/darwinian-whale-evolution/#more-1563. See also Richard Sternberg's video at https://www.youtube.com/watch?v=KbAzZEu13_w
4. Dawkins, Richard. Climbing Mount Improbable (New York: W.W. Norton & Co., 1996), 140-178.
5. "Is Evolution a Theory or a Fact?" Evolution Resources from the National Academies. U.S. National Academy of Sciences., 2013. Web. 12 Feb. 2015. http://www.nas.edu/evolution/TheoryOrFact.html.

Thursday, August 28, 2014

Blinding with Science

When I discuss issues of science as they relate to faith, I'm often told that science shouldn't be doubted. After all, science, unlike faith, isn't about what people want to believe. It deals only in cold, hard facts, and when science reaches a consensus, as it has with the modern neo-Darwinian paradigm, it is unreasonable to reject it. Rejecting the scientific beliefs of the vast majority of scientists is equivalent to denying that the earth is round.



That's the story, but that isn't science. It's scientism. Fundamental to science is the concept of questioning the facts we think we know, even what can be considered well-established facts. Newton's laws were thought to hold in all applications for centuries until quantum mechanics came along and threw a fly in the ointment. Other assumptions, such as the steady state model for the universe, have also been upended.

But many of those ideas are too esoteric for the average man on the street to really grasp. However, there is currently a paradigm shift happening in the health sciences that perfectly illustrates how accepted science can be flimsy, biased, and based not on facts but on strong wills and politics. The story is fascinating, and it illustrates how just one man can create a belief so strong that it shapes the viewpoints of other experts, changes government regulations, and becomes embedded in the beliefs of the general population.

In her article "The science of saturated fat: A big fat surprise about nutrition?" author Nina Teicholz summarizes the findings of a nine-year investigation into the commonly accepted belief that the more saturated fat you eat, the worse it is for your heart. I recommend you read the entire article, or if you would like even more detail, grab her well-documented book The Big Fat Surprise: Why Butter, Meat, and Cheese Belong in a Healthy Diet. However, here are a few quotes showing how the myth of the unhealthy high-fat diet became the unquestioned standard:

1. One man's assumption led to bad conclusions

Teicholz writes that the idea of linking saturated fats to heart disease was proposed by Ancel Keys, a physiologist who was "an aggressive, outsized personality with a talent for persuasion."1 Keys' studies on this link "violated several basic scientific norms,"2 according to Teicholz. For example, Keys' findings were based on a single study claiming to look at the diets of some 13,000 men across seven countries. However, Teicholz reports that Keys did not select nations at random, but only those that supported his hypothesis, ignoring the others. She writes that there were other problems with the study as well:
Due to difficulties in collecting accurate nutrition data, Keys ended up sampling the diets of fewer than 500 men, far from a statistically significant sample. And the study's star subjects — men on the Greek island of Crete who tilled their fields well into old age and appeared to eat very little meat or cheese — turned out to have been partly sampled during Lent, when the study subjects were foregoing meat and cheese. This must have led Keys to undercount their saturated-fat consumption. These flaws weren't revealed until much later. By then, the misimpression left by the erroneous data had become international dogma.3

2. One man's push led to accepted dogma

The second factor that led to the widespread acceptance of Keys views was a combination of good timing and Keys' dominant personality. Teicholz reports:
He found a receptive audience for his "diet-heart hypothesis" among public-health experts who faced a growing emergency: heart disease, a relative rarity three decades earlier, had skyrocketed to be a leading cause of death. Keys managed to implant his idea into the American Heart Association and, in 1961, the group published the first-ever guidelines calling for Americans to cut back on saturated fats, as the best way to fight heart disease. The US government adopted this view in 1977 and the rest of the world followed.4
Once the idea became ingrained, it became a foregone conclusion. As Teicholz explains:
There were subsequent trials, of course. In the 1970s, half a dozen important experiments pitted a diet high in vegetable oil — usually corn or soybean, but not olive oil — against one with more animal fats. But these trials had serious methodological problems: some didn't control for smoking, for instance, or allowed men to wander in and out of the research group over the course of the experiment. The results were unreliable at best…

When Ronald M Krauss decided, in 2000, to review all the evidence purporting to show that saturated fats cause heart disease, he knew that he was putting his professional career at risk. Krauss is one of the top nutrition experts in the United States, director of atherosclerosis research at Children's Hospital Oakland Research Institute and adjunct professor of nutritional studies at the University of San Francisco at Berkeley. But challenging one of his field's most sacrosanct beliefs was a near-heretical act.

Challenging any of the conventional wisdom on dietary fat has long been a form of professional suicide for nutrition experts. And saturated fats, especially, are the third rail.

3. The power of intimidation affects consensus

Finally, Teicholz states that Keys himself was less interested in advancing the science than in keeping his findings at the center of accepted belief. He would belittle and mock those who opposed his theory:
Keys aggressively criticised these observations, which were like missiles aimed at the very heart of his theory… In response to a prominent Texas A&M University professor who wrote a critique of Keys, he said that the paper "reminds one of the distorting mirrors in the hall of jokes at the county fair".

Rolling over the opposition by sheer force of will was typical of Keys and his acolytes in defending their saturated-fat hypothesis. Keys was "tough and ruthless and would argue any point", Oliver, a prominent opponent, said. Since Keys's allies controlled so many top government health posts, critics were denied research grants and key posts on expert panels. As retribution for defending the healthiness of eggs, despite their cholesterol content, Oliver was publicly branded by two of Keys's main allies as a "notorious type" and a "scoundrel" because "he opposed us on everything".

In the end, Keys and his colleagues prevailed. Despite contrary observations from India to the Arctic, too much institutional energy and research money had already been spent trying to prove Keys's hypothesis. The bias in its favour had grown so strong that the idea just started to seem like common sense.5
The parallels between this episode and modern paradigms like global climate change or the neo-Darwinian synthesis are striking. Each was formed at an opportune time, whether by those looking to dismiss a creator or in a period of heightened environmental sensitivity. Each has had high-profile proponents. Each has claimed the scientific high ground to such a degree that any deviation from the accepted consensus would be mocked, belittled, and considered professional suicide.

Many good scientists would speak authoritatively on the saturated fat-heart disease link, even today. However, consumers need to be more skeptical of any connection between the two. While many of Keys's critics gained some clout by having well-respected journals (the Lancet and the British Medical Journal) willing to publish their work, and thus began to crack the saturated fat myth, one wonders how long it would have persisted had British medical professionals not investigated the claims.

The tale of saturated fat and Ancel Keys should serve as a warning to those who claim that "consensus" and "accepted science" are good enough to place scientific claims beyond question. It shows exactly the opposite. After all, scientists are people, and people are prone to bias. So, don't accept the tale that science is above reproach. It can be a flawed belief system, too.


References

1. Teicholz, Nina. "The Science of Saturated Fat: A Big Fat Surprise about Nutrition?" The Independent. Independent Digital News and Media, 26 Aug. 2014. Web. 28 Aug. 2014.
2. Ibid.
3. Ibid.
4. Ibid.
5. Ibid.

Monday, August 18, 2014

Beyond Science: Understanding Real Knowledge

In previous articles, we looked at how many people make the mistake of assuming that science is the only way we can know something is true. We showed how this view, known as scientism, must be false since it is self-refuting. This time, I thought we'd look at how we know that we know anything at all and how to better understand the differences between knowledge and beliefs.

Types of Knowledge

Philosophers have spent a lot of time on understanding what it means when we say we know this or that. In their book Philosophical Foundations for a Christian Worldview, J.P. Moreland and William Lane Craig identify three basic types of knowing. The most basic type is knowledge by acquaintance, which means simply that you have had some type of direct experience with an object or idea and therefore know it to be true. The authors offer the example of "I know the ball is in front of me." Because the ball is directly present in your conscious experience, you can confidently know that statement to be true.1

A more debated aspect of this type of knowledge involves basic mathematical statements and logical deductions. Some philosophers argue that we know 2+2=4 in the same sense that we know a ball is in front of us: it is directly perceived as true. You don't have to go out and observe 2+2 in different environments around the world or around the universe to confidently hold that the sum will always turn out to be 4. We understand that it just is that way. We experience the same type of understanding when we argue in this way: All men are born; Socrates was a man; therefore Socrates must have been born. That is a logical argument, but we know its conclusion to be true directly.
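For readers curious how such a deduction looks when made fully explicit, the Socrates syllogism above can be checked by a proof assistant. The sketch below (names like `Man`, `Born`, and `socrates` are our own illustrative labels) states the two premises as hypotheses and derives the conclusion by universal instantiation and modus ponens:

```lean
-- A minimal Lean 4 sketch of the syllogism:
-- "All men are born; Socrates was a man; therefore Socrates was born."

variable (Person : Type)            -- the domain of discourse
variable (Man Born : Person → Prop) -- predicates over that domain
variable (socrates : Person)

-- h1: every man is born; h2: Socrates is a man.
-- Applying h1 to socrates and then to h2 yields the conclusion,
-- which Lean's type checker verifies mechanically.
example (h1 : ∀ p, Man p → Born p) (h2 : Man socrates) : Born socrates :=
  h1 socrates h2
```

The point matches the article's: the inference is valid in virtue of its form alone, which is why we can know the conclusion directly once we grant the premises.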

A second way we know something is through know-how. Know-how defines certain skills or abilities one may possess. When someone claims "I know how to play golf", they are expressing knowledge of ability. Moreland and Craig point out that knowledge of the laws or mechanics is not necessary to hold this type of knowledge. They write "For example, one can know how to adjust one's swing for a curve ball without consciously being aware that one's stride is changing or without knowing any background theory of hitting technique." 2

The third type of knowledge is the one usually debated the most. Known as propositional knowledge, this type deals with statements that make some kind of claim to fact. Statements such as "I know Abraham Lincoln was the sixteenth president of the United States," "I know our Sun is 93 million miles away," or "I know humans evolved from apes" are all propositional statements.

Justified True Beliefs

One of the reasons propositional knowledge has been debated is because it has been more difficult than other types of knowledge to define completely and accurately. One of the most foundational definitions of propositional knowledge is the concept of "justified true belief" that Plato offered in his dialogue Theaetetus. Plato said that if we claim to know something, then what we claim must indeed be true. If a claim is not true, then we didn't really know it; we were mistaken. Further, if we claim to know something, we must actually believe the claim to be true. It makes no sense to know something but not to believe it. If I say, "I know the ball is on the floor, but I don't believe the ball is on the floor," I've spoken nonsense.

So truth and belief are what we would call necessary conditions for knowledge. For knowledge to exist, they must both be present. However, they are not sufficient conditions for knowing. Many people believe things, and those beliefs may in fact be true, but that doesn't mean they know those things. Take the statement "I know Jones had roast beef for dinner last night." Now, it may be the case that Jones did indeed have roast beef for dinner, and it may be the case that I truly believe Jones had roast beef for dinner, but by making that assertion without any basis, I've just guessed the right answer, and a lucky guess cannot be counted as knowledge.

In order to truly know something, there must be some acceptable reason to hold that belief. Justified true belief is believing something that is true with good reason. If I claim to know Jones had roast beef for dinner last night because it's a Monday and he always has roast beef on Mondays, and I smelled roast beef coming from his home, I have good reasons to believe Jones in fact had roast beef. That is a justified belief that can be counted as knowledge. If, however, I claim to know Jones had roast beef for dinner last night because I consulted my Magic 8 Ball, that's not knowledge since the reasons I've given are spurious. It becomes the same as guessing.

Knowledge and the Limits of Science

So why does all of this knowledge stuff matter? Because it helps us understand what is real knowledge and what isn't. When looking at scientific propositions, we understand we can know certain things like the speed at which an object falls or what chemical reaction is necessary to produce nitro-glycerin. Science deals with observations of the material world, so these are justified beliefs; we can say we can know such things through science. However, for other claims, such as whether God exists or whether DNA is the proper basis for measuring the similarities between humans and other animals, science has no justification to make claims of knowledge.

You see, science can only tell us facts about the material world. By definition, science has no way of meaningfully commenting on the many other ways we know things. Science can tell us whether a person's heart is beating faster and whether he is sweating, but it must fall silent as to whether the cause of that reaction is lying or love. Similarly, science cannot tell us about the most distinctive aspect of humanity: the human soul. When looking at propositions such as the existence of God, science has no way of "testing for God-ness." However, I can know through reasoning that the universe began to exist and that whatever begins to exist must have a cause. 3 I can therefore conclude that if whatever begins to exist must have a cause and the universe began to exist, then the universe must have a cause: God. That is a belief with strong justification behind it. It is knowledge that lies outside the scope of science, but it is arguably an even more authoritative basis for knowing.

So, even though popular culture looks to scientists to tell us "the facts" about all things, science is really woefully inadequate to explain many aspects of reality. Scientists may presuppose certain things, such as that miracles cannot happen or that there is no God, and then formulate theories on that basis. But that's not knowledge; that's presupposition. Personal experience, emotions, reason, logic, and revelation all address truth-claims, and all can be justifiable in their proper instances. To limit oneself to science in order to gain knowledge is like trying to build a house with only a hammer. A hammer can pound nails, but you wouldn't want to use it to drive a screw, and it would be completely useless for cutting wood.

References

1. Moreland, J.P., and William Lane Craig. Philosophical Foundations for a Christian Worldview (Downers Grove, IL: InterVarsity Press, 2003). 72.
2. Ibid. 73.
3. See my article "What the Kalam tells us about God's existence"

Wednesday, August 13, 2014

Scientism tries to Turn Man into a Monkey

Many Christians are familiar with the classic book The Pilgrim's Progress by John Bunyan. For those of you who aren't, it's an allegory of growth in the Christian faith in which the protagonist, named Christian, meets some friends (such as Evangelist and Faithful) as well as many unsavory characters like Mr. Worldly Wiseman, Hypocrisy, and Talkative as he walks down the narrow path. While Bunyan wrote in the mid-1600s, the book is amazingly poignant for today.

One particularly striking section dealt with a character named Shame. Christian's friend Faithful recounts to him Shame's accusations against believers. Specifically, Shame claims the religious are basically weak-minded individuals, not living in the real world. He goes on to point out how successful and intelligent people don't believe in such things and how believing in Christianity forces one to ignore the scientific advancements and knowledge of the day. "He moreover objected to the base and low estate and condition of those who were chiefly the pilgrims of the times in which they lived; also in their want of understanding in the natural sciences." 1 So, Shame accuses the Christians of being willfully ignorant. Ignoring what he holds to be the true knowledge of science, Shame charges Christian with substituting the crutch of religion to salve his wounds.

Our Popular Conception of Science

Today, we are even more apt to hear such objections to believing the biblical message. This is in large part due to the exalted status science is given in our modern culture. Science is understood in today's world to be the only reliable source of truth. One has only to look at the advertisements we use to sell products to see how much we esteem the notion of scientific veracity. If you really want to make the case for the potency of a product, just have your spokesman wear a white lab coat, begin his name with "Dr.," or explain how "tests have shown" the item to be more effective. If science has shown something to be true, then it must be true. And if there is a conflict between our beliefs and what science has shown, most people will assume that it is our beliefs that are in error, not science.

These assumptions are unfortunate, but not altogether surprising. As I've said before, science has helped humanity in incredible ways. Our lifespans have been extended by decades, we can modify our environment when we're too hot or too cold, and technology has made our daily chores easier. Our learning has also increased exponentially; we better understand the way the world works, we can predict certain phenomena, and we've even visited the moon! So with all science has proven it can do, how could it not be a real way to know truth?

Scientism's Claim to Truth

There are two problems we run into when discussing science and the way we know things to be true. The more egregious error is the easier one to identify and argue against: the belief that only things that are scientifically verifiable are truly knowable, and everything else is opinion and conjecture. This view is known as Scientism, and it has had proponents such as Carl Sagan, Stephen Hawking, and the late evolutionist Stephen Jay Gould. Noted skeptic Michael Shermer defined Scientism as "a scientific worldview that encompasses natural explanations for all phenomena, eschews supernatural or paranormal speculations, and embraces empiricism and reason as the twin pillars of a philosophy of life appropriate for an age of science."2

The proponents of Scientism hold that the claim "only things that are scientifically verifiable are truly knowable" is itself a true and knowable statement. However, that statement is unverifiable scientifically. One cannot construct a hypothesis to test its veracity. There's no way to go into a laboratory and run this idea through a battery of tests to see whether it can be falsified. Scientism, by setting a standard that it cannot itself meet, undercuts itself. It becomes what we call a self-refuting statement. Because it does so, Scientism should rightfully be rejected as illogical.

Who Chooses the Standard of Comparison?

The main problem with our popular view of science, though, is more subtle, and it therefore takes more care to identify. Because science has taken such a high place in our society, statements couched in a scientific framing are thought to hold more weight than other types of assertions. However, many who purport to advance a scientific view are really making philosophical statements, and they're making a lot of assumptions along the way.

A good example of this is one that Christian philosopher Francis J. Beckwith related to me at dinner one evening. He told of how he had become engaged in a discussion on origins through an Internet bulletin board whose members were primarily biologists and other scientists. One member was asserting the fact of evolution by noting how science has shown human DNA and chimpanzee DNA to be 98% identical. The biologist then concluded that this proves humans and chimps share an evolutionary ancestor.

Dr. Beckwith countered this claim by asking a simple question: why choose genetics as your basis of comparison? It seems an arbitrary choice. Why not any other field of science, say quantum mechanics? Dr. Beckwith went on to explain that if you examine humans and chimpanzees at the quantum level, we're 100% identical! Our atoms move and act in exactly the same way as the atoms of the chimp. Of course, human atoms and the atoms of the table where I'm writing this also act identically. How about if we examine each via physics? Again, we're identical: each body will remain in motion unless acted upon by an outside force, for example.

The scientists had a very difficult time understanding Dr. Beckwith's point, but it was simply this: one cannot start with science to understand the world. Science relies on certain philosophical rules in order to work at all. What was happening was that the biologist was making philosophical assumptions and then using science to try to support them. The assumption in the claim above is that all life can be reduced to its genetic make-up, and that everything you need to know about any living thing can be deduced from its DNA. It's this assumption that's flawed. It doesn't follow from the fact that humans share 98% of their DNA with chimps that evolution is therefore a fact. But scientists today aren't trained in logic or philosophy, so they have a lot of difficulty seeing that they are making flawed philosophical arguments and packaging them as scientific facts.

References:

1. Bunyan, John. The Pilgrim's Progress (Grand Rapids, MI: Baker Book House, 1984). 89.
2. Shermer, Michael. "The Shamans of Scientism." Scientific American, June 2002. Accessed online at: http://www.sciam.com/article.cfm?articleID=000AA74F-FF5F-1CDB-B4A8809EC588EEDF