Friday, December 15, 2017
A couple of years ago, the Internet blew up over a huge debate—one that captured the attention of popular culture and caused fierce disagreements between friends and family members. I am, of course, talking about the infamous "What color is the dress?" meme portrayed in the accompanying image. One can perceive the dress colors to be either blue and black or white and gold, and it seems for most people once you see the colors a certain way, you simply can't see them from the other perspective.
Now, imagine you want to buy a gift for your mother's birthday, and your father has sent you that same picture with the recommendation that since he's buying her a dress, you should purchase the accessories. Would your purchases make sense? We don't know. It all depends on what you see and whether your perception matches reality. Even if the one buying the accessories had the most exquisite fashion sense and was gifted in picking out the most tasteful and appropriate accoutrements, it would still matter how he or she perceived the dress's colors.
Scientific Consensus is Founded on Paradigms
I offer the thought experiment because it helps us better understand how paradigms influence people. We all make choices based on a specific way of seeing things, and this is as true in the sciences as it is in any other discipline. In fact, the terms "paradigm" and "paradigm shift" were popularized by Thomas Kuhn in his earthshaking book The Structure of Scientific Revolutions. There, Kuhn demonstrates that scientific knowledge hasn't been acquired in a slow, steady, progressive line. That's a myth.
Kuhn states that what really happens is young scientists accept certain assumptions about how the world works because they've been taught that from those already in the field. He writes that the student studying in whatever scientific discipline he will eventually practice,
joins men who learned the bases of their field from the same concrete models, his subsequent practice will seldom evoke overt disagreement over fundamentals. Men whose research is based on shared paradigms are committed to the same rules and standards for scientific practice. That commitment and the apparent consensus it produces are prerequisites for normal science, i.e., for the genesis and continuation of a particular research tradition.1 (emphasis added)

What this means is that scientists within a particular field of study all start with some basic assumptions and then rely upon those assumptions to solve problems of detail within that model. So, if one were to start with the paradigm that the dress is white and gold, then the answer to the problem of what kind of accessories would complement the dress will come out differently than if one were to hold the paradigm that the dress is blue and black.
The Consensus Can Be Influenced by Outside Factors
If you are basing your accessory choices on the paradigm of a white and gold dress, and you find that the majority of those you learn from and work with have also accepted this paradigm, you no longer ask about the color of the dress or whether white is a better color for a handbag than black. When someone comes into your fold and suggests black for a handbag, your reaction would be one of incredulity. Surely any fool can see that black is the wrong color choice! You might even make fun of them and dismiss them as not doing good science. But what they've questioned is the paradigm you have assumed, not the reasoning that follows from the paradigm if it were true.
Here's the thing, though. These paradigms themselves are frequently shaped by factors beyond dispassionate science. Kuhn himself discovered this when investigating the Ptolemaic and Copernican ideas of the solar system. Ptolemy's paradigm was first formed by Aristotle, who held to a couple of very Greek ideas, one of which was that some bodies are naturally inclined to move in a circular pattern. In other words, planets by their nature would move circularly because that's what they do. Aristotle's assumption set the paradigm that worked for many centuries and allowed the scientists of those days to come up with accurate predictions.
Aristotle's assumption about the nature of bodies moving in a circular pattern was based on Greek philosophy. Thus it was a philosophical commitment that shaped the science of planetary orbits and our understanding of the solar system for centuries. It was only when instruments became more sophisticated that flaws could be seen in the model. These flaws grew to the point of crisis until those within the community had to abandon their paradigm and adopt a new one. This is what Kuhn labels a paradigm shift.
The Consensus Can Be Wrong
Before a paradigm shift occurs, there is a scientific consensus about whatever point one is discussing. But even though a consensus exists, that doesn't mean those who oppose the consensus are wrong. They may in fact be right; they are simply offering a different paradigm.
When you read about the contentious scientific issues of our day, like the origin of life, man-caused climate change, and neo-Darwinian evolution, it won't be long before someone claims that since a scientific consensus exists on topic X, anyone who holds a contrary view is anti-science. That's simply wrong. It may be that those who hold the contrary position see the flaws and wish to question the paradigm itself. The bigger question thinking people need to ask is, "What are the assumptions implicit in this position, and have they been tested?" The question of the color of the dress can be answered if one enlarges the frame to see more of the picture. Doing this isn't anti-science but what Kuhn calls extraordinary science.
So let's not point to the idea of a scientific consensus as the final card in a debate. The consensus may be the very thing that needs to be questioned.
2. Chloe Farand. "Duck or Rabbit? The 100-Year-Old Optical Illusion That Could Tell You How Creative You Are." The Independent, Independent Digital News and Media, 14 Feb. 2016, www.independent.co.uk/news/science/duck-or-rabbit-the-100-year-old-optical-illusion-that-tells-you-how-creative-you-are-a6873106.html.
Tuesday, December 05, 2017
As informed citizens, we utilize the available tools around us to help make wise decisions on public policy. And public policy, of course, codifies how people are to interact with one another, the environment, and the government. Many issues wind up being a scientific/ethical/legal debate because Americans hold to different understandings of law, conceptions of ethics, and the relevant science. This article will not examine or critique our various conceptions of ethics, though it will assume ethics are employed in our decision-making process in general. The article will, however, examine the approach to science we take in responding to issues.
It is commonplace for voters to take cues from popular science and advocate accordingly. It is just as commonplace to find a strategy like this:
- identify data (commonly as an appeal to science), and then
- leverage that data to make an ethical decision (commonly via voting or advocacy).
The current default in society is to affirm policies that:
- Give choice to voters:
- For abortion, this means denouncing policies that prioritize unborn personhood, while
- Deny choice to voters:
- For climate change, this means championing policies that prioritize nature.
On the issue of climate change, there is an appeal to climate science to justify public policies for conservation, environmental restrictions, pollution controls, et cetera. Now, given the science, and regardless of where one stands on the issue, it is easy to find contention and disagreement in the public sphere over climate change. This disagreement, however, does not stop climate-change-conscious citizens from advocating climate-change-conscious policies. I have yet to see such an advocate propose (and I can bet he or she would consider it absurd): “The public disagrees on whether or not humans are influencing the climate. Therefore, the default position should be to NOT advocate or attempt to pass climate-change-conscious policies at all.”
Now, if he or she were being consistent, then he or she would consider it similarly absurd to propose: “The public disagrees on whether or not the unborn are persons. Therefore, the default position should be to NOT advocate personhood for the unborn.” That is to say, it would not at all make sense to stop trying to make laws in favor of guarding unborn human life solely because there is disagreement.
Take human embryology: the relevant scientific claims here regarding the unborn are non-controversial and uncontested. We can make statements like:
- “The organism has unique and human DNA” or
- “All things being equal, the unborn will continue its human development as the rest of us did and do.”
- “The only differences between the unborn and the born are size, level of development, environment, and degree of dependency” or
- “No combination of those differences has ever been sufficient to say that someone is or is not a person.”
So, to maintain logical consistency, a pro-choice/climate-change advocate ought to suppose that disagreement on an issue with scientific connections means one of these defaults:
A) Affirm policies that give choice to voters:
- for abortion, this means denouncing policies that prioritize unborn personhood
- for climate change, this means denouncing policies that prioritize nature, or
B) Affirm policies that deny choice to voters:
- for abortion, this means championing policies that prioritize unborn personhood
- for climate change, this means championing policies that prioritize nature, or
C) Stop advocating on the issue altogether:
- stop denouncing policies that prioritize unborn personhood, or
- stop championing policies that prioritize nature
What shall it be, then? Do not look to science? Do not make ethical policies that account for the science? Do not be logically consistent? The answer, of course, is: “None of the above.” So then, why be inconsistent with the strategy? Why not let science influence our ethical considerations? If the science affirms that humans are being bad stewards of the environment, then why not uphold policies that address responsible stewardship? If the science affirms that humans begin to exist at conception, then why not uphold policies that address the inalienable value of human life?
Let us be consistent. If it is a principle of ours to appeal to the science in addressing a science-related policy, then let us not deny the corresponding ethical position it would entail or affirm, even if it means we might have to abandon or revise our prior collection of ethical postures.
Friday, September 08, 2017
The news seems to be filled the last few days with one natural disaster on top of another. Texas has already been victimized by Hurricane Harvey, with massive flooding and untold suffering. It was the largest hurricane Texas has ever recorded and may be the most costly natural disaster in U.S. history, with estimates placing the damage at up to $180 billion.1 But Houston may not hold that record long, as Florida sits directly in the path of Hurricane Irma, with Hurricane Jose following behind her. Then came a massive 8.1 earthquake off the coast of Mexico, which may cause a tsunami. What’s going on?
Given the terrible destruction and suffering caused by Harvey and Irma, people are beginning to wonder if there isn’t some kind of divine retribution going on. Jeffrey Terry tweeted "#HurricaneHarvey is Gods punishment for those who support @realDonaldTrump may God have mercy on them" and University of Tampa Professor Ken Storey tweeted "I don’t believe in instant karma but this kinda feels like it for Texas. Hopefully this will help them realize the GOP doesn’t care about them." (Although Storey’s tweets are protected, you can view a screenshot here.)
Jennifer Lawrence didn’t blame God, but did invoke Mother Nature and insinuated that the recent destruction is somehow related to the recent election of President Trump, saying "We voted and it was really startling. You know you’re watching these hurricanes now, and it’s really hard especially while promoting this movie, not to… not to feel Mother Nature’s rage, wrath."2 Of course, Christians are not immune to the temptation, either. Newsweek reports that Rick Wiles claimed, "Here’s a city that has boasted of its LGBT devotion, its affinity for the sexual perversion movement in America. They’re underwater."3
Shark Attacks and Jumping the Shark
So, with so many out-of-the-ordinary natural disasters occurring, shouldn’t we attribute them to God’s wrath? Before we jump to that conclusion, maybe it would be wise to find out just how out of the ordinary this weather cycle is. With media channels reporting the continued destruction in our 24-hour news cycle, one could hardly be blamed for assuming 2017 was a special year for natural disasters, but that’s not necessarily the case. According to the folks over at Weather Underground, 2005 was truly a record year, with 28 storms and 15 hurricanes in the North Atlantic. Of those, five were large and/or deadly enough to have their names retired (Dennis, Katrina, Rita, Stan, Wilma). 2017 isn’t close to that yet.
Interestingly, we’ve actually been in a downward trend of hurricanes that hit the United States. Writing for the NOAA, climate scientists Gabriel A. Vecchi and Thomas R. Knutson show how the United States has been seeing a decrease in the number of storms causing damage on land. They provide the figure below with the following caption:
Since the late-19th Century global (green) and tropical Atlantic (blue) temperatures have risen – an increase that was partly driven by increased greenhouse gases. If one does not account for possible missed storms (first red line) Atlantic tropical storms appear to have increased with temperature; however, once one accounts for possible missed storms (second and third red lines) basinwide storms have not exhibited a significant increase. When one focuses only on landfalling storms (yellow lines) the nominal trend has been for a decrease.4

So, the number of hurricanes displacing people and causing widespread damage is not increasing, even though we may think it is. Part of the reason is that we forget just how bad seasons like 2005 really were and compare this year only to last year or the year before. Part of it is the continued discussion in the media, which spurs what is sometimes called "the shark attack effect," named for the panic that followed the release of the book and movie Jaws. While shark attacks in real life are very rare (more people die from bee stings in this country each year than from shark attacks), once the idea captures our psyche, we are prone to look for more examples to confirm our fears. It’s kind of like how, only after buying a new car, you notice how many of that same model are on the road.
We live in a fallen world, and natural disasters are a part of that fallenness. It is also true that God has used and will use natural calamities to punish or correct nations. However, when people jump to that conclusion first, they remind me of Jesus’ disciples in John 9, who asked him, "Rabbi, who sinned, this man or his parents, that he was born blind?" There, Jesus gave a most prescient answer: "It was not that this man sinned, or his parents, but that the works of God might be displayed in him." That should be the Christian response. Let us show the goodness of Christ’s love by reaching out to those afflicted by nature’s devastation and stop trying to pin the blame on some perceived sin. It will be a more effective way to share your faith with others.
2. Long, Jackie. "Jennifer Lawrence: ‘I’Ve Heard and Seen Things on TV That Devastate Me and Make Me Sick.’" Channel 4 News, Channel Four Television Corporation, 6 Sept. 2017, www.channel4.com/news/jennifer-lawrence-ive-heard-and-seen-things-on-tv-that-devastate-me-and-make-me-sick. Beginning about 5:14.
3. Sinclair, Harriet. "Did Gay Sex Cause Hurricane Harvey or Was It Climate Change? Some on the Right Blame LGBT Americans (No Seriously)." Newsweek, Newsweek, LLC, 3 Sept. 2017, www.newsweek.com/gay-americans-are-blame-hurricane-harvey-apparently-659059.
4. Vecchi, Gabriel A., and Thomas R. Knutson. "Historical Changes in Atlantic Hurricane and Tropical Storm Records." Geophysical Fluid Dynamics Laboratory, GFDL/NOAA Research, 29 Aug. 2017, www.gfdl.noaa.gov/historical-atlantic-hurricane-and-tropical-storm-records/.
Monday, June 19, 2017
In my college history class, I was assigned the book The Discoverers by Daniel J. Boorstin. It was an interesting and eminently readable tome, and it became a best-seller. In what is labeled "a personal note to the reader," Boorstin states that he is a champion of the discoverer and that "the obstacles to discovery—the illusions of knowledge—are also a part of our story. Only against the forgotten backdrop of the received common sense and myths of their time can we begin to sense the courage, the rashness, the heroic and imaginative thrusts of the great discoverers. They had to battle against the current 'facts' and dogmas of the learned."1
I believe Boorstin is correct in that for us to properly understand the momentous changes that paved human advancement we must look at the truth of historical setting and detail. Unfortunately, one area where Boorstin himself succumbs to the "current facts and dogma" that plague us today is the claim that the medieval period, when Christendom became dominant in Europe, ushered in some kind of dark ages.
In chapter thirteen of The Discoverers (not so subtly entitled "The Prison of Christian Dogma"), Boorstin writes that Christians in the medieval period abandoned the work of discovery in order to generate simple, theologically appealing frames that were divorced from fact. He claims "the leaders of Christendom built a grand barrier against the progress of knowledge about the earth," and that "we observe a Europe-wide phenomenon of scholarly amnesia, which afflicted the continent from A.D. 300 to at least 1300."2
The Explosion of Advancement in Medieval Europe
Boorstin's view is a popular one: the Middle Ages were a dark and regressive period for Europeans. The Church was supposedly a science-stopper, and anyone who wishes to look for scientific leaps that would lead to human flourishing must at this point in history turn to the Muslims or the Orient. But it is simply a false view. Rodney Stark, Distinguished Professor of Social Sciences at Baylor University, clarifies:
Granted, like the Muslim conquerors, the Germanic tribes that conquered Roman Europe had to acquire considerable culture before they measured up to their predecessors. But, in addition to having many Romans to instruct and guide them, they had the Church, which carefully sustained and advanced the culture inherited from Rome. What is even more significant is that the centuries labeled as the "Dark Ages" were "one of the great innovative eras of mankind," as technology was developed and put into use "on a scale no civilization had previously known." In fact, as will be seen, it was during the "Dark Ages" that Europe began the great technological leap forward that put it far ahead of the rest of the world. This has become so well known that rejection of the "Dark Ages" as an unfounded myth is now reported in the respected dictionaries and encyclopedias that only a few years previously had accepted and promulgated that same myth. Thus, while earlier editions of the Encyclopaedia Britannica had identified the five or six centuries after the fall of Rome as the "Dark Ages," the fifteenth edition, published in 1981, dismissed that as an "unacceptable" term because it incorrectly claims this to have been "a period of intellectual darkness and barbarity."3

In his book God's Battalions, Stark notes there were tremendous advancements in the technology of the day, such as swivel-axled wagons, shoes for horses, and better harnesses. The plow was also redesigned, and new farming techniques were adopted, including the rotation of crops, which allowed fields to rest and not become nutrient-drained.
Making Life Better for the Average Man
Putting the power of horses in their new harnesses together with the more efficient plow had a huge impact on lifespans. Stark notes "land that could not previously be farmed, nor not farmed effectively, suddenly became very productive, and even on thinner soil the use of the heavy moldboard plow nearly doubled crop yields."4
Adding this to the improved farming techniques, Stark concludes:
As a result, starting during the "Dark Ages" most Europeans began to eat far better than had the common people anywhere, ever. Indeed, medieval Europeans may have been the first human group whose genetic potential was not badly stunted by a poor diet, with the result that they were, on average, bigger, healthier, and more energetic than ordinary people elsewhere.5

Stark offers more, and more varied, examples of how during the Middle Ages Christian Europe's "technology and science overtook the world" in his book The Victory of Reason: How Christianity Led to Freedom, Capitalism, and Western Success, but this will serve us for now. The idea that Christianity was a science-stopper in the Middle Ages is nonsense. Christianity taught not only that God's world was there to be discovered but also that all human beings are inherently valuable, and both of these key concepts helped make the Western world the leader it is today.
2. Boorstin, 1985. 100.
3. Stark, Rodney. God's Battalions: The Case for the Crusades. New York, NY: Harper One, 2009. 66. Print.
4. Stark, 2009. 69.
5. Stark, 2009. 70.
Tuesday, May 16, 2017
It should be no secret that science plays an inordinately large role in modern culture. As I've noted before, scientific advancements have allowed human beings to banish diseases that were once fatal, create new materials in the lab that outrival nature, and generally control and command their world in ways that had heretofore been thought impossible. In short, the last 150 years of scientific discovery have changed everything about how humans live and interact with their world.
Because of these great successes, societal attitudes toward science have become distorted. People place science on a pedestal, believing that if a claim is scientific, it will be unbiased and more reliable than other forms of knowledge. Science and faith are seen as foes and atheists will challenge Christians, claiming scientific facts are incrementally undermining Christian beliefs.
In reality the war between Christianity and science is a myth and the recently released Dictionary of Christianity and Science goes a long way toward helping to dispel that myth as the fraud it is. General Editors Paul Copan, Tremper Longman III, Christopher L. Reese, and Michael G. Strauss have assembled a strong collection of writings covering a wide range of topics in what would more properly be understood as a cyclopedic volume instead of a dictionary. With over 140 top scholars writing on over 450 topics, the Dictionary serves as an excellent starting point to research various topics that most Christians will face when researching or discussing these issues.
Given the breadth of the subject matter, the articles could have all been relegated to short introductory overviews with a list of additional resources at the end of each entry. But the editors wisely chose to have three different types of articles appear in the Dictionary. For the less controversial and more agreed-upon topics (such as key historical figures in science or specific terms like empiricism), an introductory article is all that's warranted. But for other entries they chose to include longer articles labeled essays that give more background, competing views, and the evidence they rely upon. The entry on "The Genesis Flood and Geology" is an example of one such essay.
Finally, there are the multiple-view discussions, where different scholars who take up contrary positions are each allowed an extended article within the same entry. For example, if one were to look up creationism, the user would be greeted with an introductory article on the concept of creation, an article entitled "Creation, Intelligent Design and the Courts," and four essays on creationism: one critical and one supportive of old-earth creationism, and one critical and one supportive of young-earth creationism.
I'm really impressed with the level of scholarship and the wide range of topics that have been compiled in the volume. The editors included key figures like Thomas Kuhn and philosophical concepts like Inference to the Best Explanation that are not well-known outside the study of the philosophy of science. Further, articles on people like Galileo Galilei seek to strip the legendary tales of his scientific advancement and show why it would be incorrect to see his conflict with the College of Cardinals as a case of science versus religion.
There are a few drawbacks to the book. First, there is no table of contents or topical index. I suspect that is because it is marketed as a dictionary and as such will have its entries placed in alphabetical order. However, if someone looks up the aforementioned creation entry, he would be missing several other articles that focus on the topic, with multiple-view entries on the flood and on the Genesis account in the F and G areas respectively. One would then have to turn to the I section in order to read the Intelligent Design entry. And if someone doesn't know who Thomas Kuhn is and why his work is so important, it may be easy to miss this entry.
Secondly, while it cannot be avoided, the book is a product of this particular time. The articles that have the most information are those that are the most debated right now. In ten years, this volume will suffer from its age as some debates will change, others may be settled, and new discoveries will make several of the entries obsolete. I would hope an accompanying online site would be able to provide some kind of resource direction until the inevitable updated volume will be released. But these are just quibbles in an otherwise excellent product.
I think every Christian family should have a copy of the Dictionary of Christianity and Science. Anyone who has sought to understand controversial issues of science and faith by searching on Google or looking up the topic on Wikipedia knows that getting solid information from top scholars is challenging, to say the least. I've noted myself that any old fool with a modem and an opinion can post online or edit a Wikipedia entry. The Dictionary of Christianity and Science gives Christians a strong place to start in understanding how their faith does not contradict modern scientific advancement, as well as a deeper understanding of what science actually is and where the debates stand.
Thursday, February 23, 2017
The headlines were spectacular. Time Magazine pronounced "NASA Announces a Single Star Is Home to At Least 7 Earthlike Planets."1 Vox exclaimed "NASA has discovered 7 Earth-like planets orbiting a star just 40 light-years away."2 Even the official press release from NASA offered some tantalizing tidbits, noting that all seven planets of the TRAPPIST-1 system reside in the habitable zone necessary for life and it included artists' rendering of what the view may look like from one of these newly discovered sisters of earth.3
Certainly, the discovery of planets orbiting another star is an exciting one. The fact that the TRAPPIST-1 star is relatively close in astronomical terms (40 light years away) means the system is more easily observed by our telescopes; we can gather more data on the planets themselves. To find seven of them ups the chances that we may find water on them, too. But does this mean we've uncovered a bunch of earth-twins that are just ready to be populated by living organisms? Not by a long shot.
What do you mean "Earth-like"?
Since capturing eyeballs and clicks is the driving force behind both news organizations and sites like Vox, one should be a bit cautious before jumping to conclusions based on a screaming headline alone. When I saw this story, I was intrigued, but upon reading the details, certain terms don't carry the weight one may assume at first.
For example, both the Vox and the Time article called these planets "Earth-like" in their headlines. That will certainly evoke a picture in the minds of most casual readers, but what does Earth-like really mean? Both articles did unpack the term to mean a planet whose size is within a certain percentage of Earth's and is not too hot or too cold for water to exist somewhere on its surface without it being boiled away or perpetually frozen. Mars is within our solar system's habitable zone, while experts disagree about whether Venus qualifies or not.
But just having the ability for water to exist really isn't enough for life. The TRAPPIST-1 star is a much weaker star than our sun. As Hugh Ross explains, TRAPPIST-1 is very small and very dim, not putting out much heat at all. Thus, the planets orbit a whole lot closer to their star than the Earth does to the Sun, which tidally locks them into a fixed orientation – one side always light and extremely hot while the other is perpetually dark and continually freezing cold.
According to Ross, only the "twilight areas" of each planet would be able to support liquid water. Ross then states "Only in the twilight zone boundary between perpetual light and perpetual darkness will surface liquid water be possible. This possibility presumes that for each planet the twilight edge will not move. Given how close the planets are to one another, it is inevitable that the twilight edge on each planet will move. Thus, realistically none of TRAPPIST-1's planets are likely to ever possess any surface liquid water."4 Of course, it hasn't even been proven the planets have an atmosphere yet.
Also, since these planets must be very close to their weak sun, their years are very short: it takes only about twenty days for the furthest of the seven planets to complete an orbit and only one and a half days for the closest! Knowing how crucial seasonal changes are to life on Earth, there's absolutely no chance of seasons for any of these planets. What's worse, the planets' orbits and close proximity mean their gravitational pulls will affect each other. The moon's gravity causes the tides on Earth, and it is only one sixth the pull of the earth's gravity.* Imagine how the gravity of an equally sized planet orbiting close by would affect the Earth. Ross concludes, "These periodic gravitational influences rule out the possibility of life on these planets."
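For the curious, those short "years" follow directly from Kepler's third law. Here is a quick sketch (my illustration, not from the article) using round published figures for TRAPPIST-1 — a stellar mass of about 0.09 solar masses and an outermost orbit of about 0.063 AU, both of which are assumptions rather than numbers taken from the article:

```python
import math

def orbital_period_days(a_au: float, m_star_solar: float) -> float:
    """Orbital period in Earth days, via Kepler's third law in solar
    units: T (years) = sqrt(a^3 / M) for a in AU, M in solar masses."""
    t_years = math.sqrt(a_au ** 3 / m_star_solar)
    return t_years * 365.25

# Assumed round figures for TRAPPIST-1 (illustrative only):
# stellar mass ~0.09 M_sun, outermost planet at ~0.063 AU.
period = orbital_period_days(0.063, 0.09)
print(f"{period:.1f} days")  # on the order of twenty days
```

Plugging in those rough numbers yields a "year" of roughly nineteen days, consistent with the twenty-day figure above.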
Selling the Sizzle, not the Steak
The "earth-like" description of these planets in the articles is, I believe, a little misleading. All the outlets I read hyped the possibility of finding life on these planets while never mentioning the incredible difficulties any life would face on them. The Vox story is a good example:
The more Earth-like exoplanets astronomers find in the galaxy, the more they update their estimates of how many Earth-like planets could be out there. "For every transiting planet found, there should be a multitude of similar planets (20–100 times more) that, seen from Earth, never pass in front of their host star," Nature reporter Ignas Snellen explains in a feature article. And the more exoplanets there are, the more likely it is that life exists on at least one of them.5 (Emphasis added)

I highlighted that last line to make a point. While it is true mathematically that finding more planets raises the odds of finding life somewhere, it's a bit like claiming your odds of dealing four perfect bridge hands rise the more shuffled decks you use. That's true, but it remains beyond any reasonable expectation that someone will actually do so, whether you use a hundred, a thousand, or a million decks. By obscuring the difficulties these planets pose for life and only highlighting the two or three possible similarities, these reports are selling the sizzle instead of the steak. There's much we can learn from this new discovery. Learning about extra-terrestrial life forming isn't really one of them.
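The bridge-hand analogy can actually be checked with a little arithmetic. This sketch (my illustration, not from the article) computes the exact probability of a "perfect" deal — each player receiving a complete suit — and shows that even a million decks leave the chance of ever seeing one vanishingly small:

```python
from fractions import Fraction
from math import factorial, expm1, log1p

# Ordered ways to deal 52 cards into four 13-card hands: 52! / (13!)^4.
# Favorable deals: the 4! ways to assign one complete suit to each player.
deals = Fraction(factorial(52), factorial(13) ** 4)
p = Fraction(factorial(4)) / deals          # roughly 4.5e-28 per deck

# Chance of at least one perfect deal in a million decks:
# 1 - (1 - p)^n, computed with log1p/expm1 to avoid float underflow.
n = 1_000_000
p_any = -expm1(n * log1p(-float(p)))        # still roughly 4.5e-22
print(float(p), p_any)
```

Even after a million shuffled decks, the odds remain around one in 10^21 — "more likely" in the mathematical sense, yet nowhere near likely.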
2. Resnick, Brian. "NASA Has Discovered 7 Earth-like Planets Orbiting a Star Just 40 Light-years Away." Vox. Vox, 22 Feb. 2017. Web. 23 Feb. 2017. http://www.vox.com/2017/2/22/14698030/nasa-seven-exoplanet-discovery-trappist-1.
3. "NASA Telescope Reveals Record-Breaking Exoplanet Discovery." NASA. NASA, 22 Feb. 2017. Web. 23 Feb. 2017. https://www.nasa.gov/press-release/nasa-telescope-reveals-largest-batch-of-earth-size-habitable-zone-planets-around.
4. Ross, Hugh. "Earth's Seven Sisters: Are They Really Similar?" Reasons to Believe. Reasons to Believe, 23 Feb. 2017. Web. 23 Feb. 2017. http://www.reasons.org/blogs/todays-new-reason-to-believe/earths-seven-sisters--are-they-really-similar.
5. Resnick, 2017.
* This sentence has been corrected. It originally read "The moon's gravity causes the tides on Earth and it is only one sixth the mass of the earth."
Friday, February 10, 2017
Physicist and anti-theist Victor Stenger famously claimed "Science flies you to the moon. Religion flies you into buildings." This kind of throwaway line is standard fare for the new atheist types and is often repeated via memes shared on social media sites. Stenger isn't the only one who thinks religion is a way to manipulate others into doing immoral acts. Sam Harris claimed "One of the most pernicious effects of religion is that it tends to divorce morality from the reality of human and animal suffering. Religion allows people to imagine that their concerns are moral when they are not."1
I'm not sure how Harris concluded that religion divorces morality from suffering. If he were a true student of world religions, he would recognize that the question of human suffering is a primary focus of most faiths. Hindus seek to come closer to the divine, eliminating the suffering associated with the cycle of reincarnation. Buddhists teach balance to avoid pain and suffering. Islam holds that suffering may be reduced by making right choices and following Allah, and Christianity focuses on eliminating suffering by eliminating sin and its consequences. While I don't agree with the underlying assumptions of other faiths, it is disingenuous to say that religion divorces morality from suffering. The problem of human suffering is front and center in religious faith.
What about Jihadists?
So how do we explain the ISIS or Al Qaeda suicide bombers then? Isn't it obvious that such horrendous acts are religiously motivated? I would say that's true only in part. Islam is a faith that offers Muhammad as its exemplar, the model Muslim to whom all others should aspire. Muhammad was a warrior who slaughtered innocents, and the famous "sword verses" of the Qur'an command the faithful to "slay the idolaters wherever you find them, and take them, and confine them, and lie in wait for them at every place of ambush" (Sura 9:5) and "When you encounter the unbeliever, strike off their heads until you have made a great slaughter among them" (Sura 47:4). Also, the Qur'an promises a reward to the warrior who dies in his fight for Islam: "So let them fight in the way of God who sell the present life for the world to come; and whosoever fights in the way of God and is slain, or conquers, We shall bring him a mighty wage" (Sura 4:74).
Because Islam offers both the commands of the Qur'an and the example of Muhammad, it allows jihadists to kill themselves while killing the enemy in the name of martyrdom. But that doesn't mean suicide terrorism is the first resort of Muslims. In fact, it turns out that suicide terrorism isn't a historically popular strategy for followers of Islam. Robert Pape, in his book Dying to Win: The Strategic Logic of Suicide Terrorism, notes that there were no suicide attacks by Muslims or any other groups from 1945 to 1980. From 1980 through 2003, Pape catalogued 315 suicide terrorism campaigns around the world involving 462 individual suicide terrorists.2 Pape notes that "every suicide campaign from 1980 to 2003 has had as a major objective—or as its central objective—coercing a foreign government that has military forces in what they see as their homeland to take those forces out."3 Pape concludes, "The bottom line, then, is that suicide terrorism is mainly a response to foreign occupation."4 So, politics and power are the real motivation for terrorist campaigns. Terrorism thrives in Islam because the belief system doesn't contradict its use.
What about the Destructive Power of Science?
The biggest problem with Stenger's quip is that it is so self-selective. It paints a rosy picture of science by citing one of our greatest achievements and then contrasts it with one of our greatest horrors. But it isn't "science" that flies us to the moon. It's human beings who do that. Science allows human beings to understand thrust and gravity. It is a tool to help us accomplish whatever goals we have. Humans used science to develop the planes that Stenger seems to be so worried about, but he doesn't mention that. We use science to construct better weapons, too, producing some of the most incredible destructive powers on earth. Without science, we would never have had a Hiroshima or Nagasaki. And without religion, we would never have had a Mother Teresa or a Father Damien.
The Golden Rule and the concept of the Good Samaritan find their origin in Christianity. Dr. Alvin J. Schmidt explains that it was the teachings of Jesus that "elevated brutish standards of morality, halted infanticide, enhanced human life, emancipated women, abolished slavery, inspired charities and relief organizations, created hospitals, established orphanages, and founded schools."5
Harris and Stenger's comments not only show their bias, but they are demonstrably wrong. They have simply created straw men in order to easily knock them down. Perhaps if they showed a little more Christian charity toward those with whom they disagree, they wouldn't be so nasty and could see things a bit more clearly.
References
1. Harris, Sam. Letter to a Christian Nation. New York: Vintage, 2006. Print.
2. Pape, Robert Anthony. Dying to Win: The Strategic Logic of Suicide Terrorism. New York: Random House, 2005. Print. 14-16.
3. Pape, 2005. 42.
4. Pape, 2005. 23.
5. Schmidt, Alvin J. How Christianity Changed the World. Grand Rapids: Zondervan, 2008. 8.
Monday, January 30, 2017
It can happen during busy news cycles that some of the more important stories are missed. That may have been the case last week as scientists announced they had successfully created human-pig chimera embryos in what the BBC called a "first proof" of concept. According to the report, scientists were able to inject human stem cells into newly formed pig embryos and then implant the embryos into a sow, allowing them to grow for 28 days. They then looked to see whether the human cells were growing before destroying the embryos.
The ultimate goal in these tests is not to develop some kind of hybrid monster, but to be able to grow human organs in animals for eventual transplant to patients whose organs are failing. In this specific experiment, the embryos would be considered less than one-ten thousandth human. Still, it marks the first time functioning human cells have been observed growing inside a large animal, according to Juan Carlos Izpisua Belmonte of the Salk Institute, causing other researchers to describe the published findings as "exciting."
More Questions than Answers
I think this report is important for some very specific reasons. First, the idea of a human-pig chimera is shocking. It opens a lot of questions about humanity and our technical abilities. The report made it clear that this research is highly inefficient and would take many years to develop more fully. But the BBC report also noted this kind of research is "ethically charged" and offered a one-sentence disclaimer stating "There was no evidence that human cells were integrating into the early form of brain tissue."1
The fact that our technology is progressing faster than our ability to place it in its ethical context is the biggest takeaway from the story. For those who hold to moral pragmatism, meaning the ends justify the means, it makes sense to do whatever one wants in order to achieve a desired goal. But pragmatism isn't real morality; justifying any means by the ends is a position tyrants have long used.
How Science Cannot Account for Morality
More interesting to me is the fact that the chimera research is a clear example of how science cannot answer all the important questions. Prof. Belmonte, who was one of the researchers on the project, was clear that at this time they were not allowing the embryos to grow longer than one month, as that's all they needed to confirm the development of human cells in the pigs:
One possibility is to let these animals be born, but that is not something we should allow to happen at this point.

Not everything that science can do we should do, we are not living in a niche in lab, we live with other people - and society needs to decide what can be done.2

I'm glad to see a bit of caution in Belmonte's words. He's right to say that not everything science can do it should do. The statement really rebuts new atheists like Sam Harris, who famously argued in his book The Moral Landscape that science itself can be the foundation of morality. Well, here's the science. Where does the morality come from? How would Harris ground any kind of moral reasoning as to whether we should do what we can do in this instance? Should we mix human and pig cells even more? What if human and pig brain cells were combined?
Ultimately, this shows that science can only tell us what is possible, not whether it should be done. Moral reasoning must come from a moral lawgiver, not from the fact that something can be done. Otherwise, we'll all be left with a real Frankenstein's monster of moral values.
References1. Gallagher, James. "Human-pig 'chimera Embryos' Detailed." BBC News. BBC, 26 Jan. 2017. Web. 27 Jan. 2017. http://www.bbc.com/news/health-38717930.
2. Gallagher, James. 2017.
Friday, January 20, 2017
Most atheists today are materialists. They don't believe people have immaterial souls and think that all of our experiences and thoughts can be reduced to electro-chemical functions in the brain. In fact, they often point to neuroscience to make their point.
In my debate against Richard Carrier, he made just such a claim, stating:
We can break your consciousness. A bullet can go through your brain or a surgeon can go into your brain and cut out a piece of it and you will lose that function. For example, there is a part of your brain that recognizes faces. We can cut that out and then you can't recognize faces anymore. You've lost a part of your consciousness. And every single thing that we do, like vision, the seeing of color, the seeing of red, is associated with a location in the brain that we can cut out, and you won't have it anymore. So we know that there is actual machinery that is generating this stuff.1

Of course, Carrier is equivocating on the word consciousness, using it only in terms of ability rather than sentience. Using the word as he does above, a blind man is less conscious than a sighted man, while a dog hearing a dog whistle has more consciousness than any human at that moment. Obviously, such an idea is flawed.
But let's leave that aside for the moment. Instead, I'd like to focus on Carrier's assertion that we can know a certain part of the brain is responsible for us seeing red or recognizing faces. I've heard this claim many times in conversations, usually with atheists pointing to fMRI imaging of people thinking about a thing or medical studies where neurologists will point to a damaged portion of the brain inhibiting something like speech. Like Carrier above, they claim that science has proved this is the area responsible for this function and the function is therefore wholly materialistic in nature. The brain is basically a biological computer and should be understood as such.
Testing What We Know vs. Assuming What We Don't
It turns out, though, that Carrier's confidence in knowing "there's a part of your brain that recognizes faces" is over-simplistic. The case is made very well in a recent article from The Economist. Originally entitled "Does Not Compute," the article explains that neuroscientists have drawn most of their conclusions through the two methods I mentioned above. However, neuroscientists Eric Jonas of the University of California, Berkeley, and Konrad Kording of Northwestern University, Chicago, decided to go a different route and test these tests, so to speak.
Jonas and Kording reasoned that if the computer is an accurate analog for the brain, then they should be able to identify which portions of a computer chip are responsible for specific functions, either by incapacitating those portions or by imaging the chip to capture its activity. Since all aspects of a computer chip and its functions are already known, they could see how well these tests identified the structures actually responsible for a given function. The article states they chose the MOS Technology 6502 CPU, a chip used in early Atari consoles and Apple computers. It was simple enough to handle but still ran a wide enough variety of programs and functions to be tested.
Assumptions Come Crashing Down
The results of their tests were fascinating. The article reports:
One common tactic in brain science is to compare damaged brains with healthy ones. If damage to part of the brain causes predictable changes in behaviour, then researchers can infer what that part of the brain does. In rats, for instance, damaging the hippocampuses—two small, banana-shaped structures buried towards the bottom of the brain—reliably interferes with the creatures' ability to recognise objects.

When applied to the chip, though, that method turned up some interesting false positives. The researchers found, for instance, that disabling one particular group of transistors prevented the chip from running the boot-up sequence of "Donkey Kong"—the Nintendo game that introduced Mario the plumber to the world—while preserving its ability to run other games. But it would be a mistake, Dr Jonas points out, to conclude that those transistors were thus uniquely responsible for "Donkey Kong". The truth is more subtle. They are instead part of a circuit which implements a much more basic computing function that is crucial for loading one piece of software, but not some others.

Another neuroscientific approach is to look for correlations between the activity of groups of nerve cells and a particular behaviour. Applied to the chip, the researchers' algorithms found five transistors whose activity was strongly correlated with the brightness of the most recently displayed pixel on the screen. Again, though, that seemingly significant finding was mostly an illusion. Drs Jonas and Kording know that these transistors are not directly involved in drawing pictures on the screen. (In the Atari, that was the job of an entirely different chip, the Television Interface Adaptor.) They are only involved in the trivial sense that they are used by some part of the program which is ultimately deciding what goes on the screen.2

Of course, none of this proves that the consciousness of living beings comes from an immaterial source. There are other really good reasons to believe that. The big takeaway in Jonas and Kording's research is that all the Sturm und Drang made by atheists about how neuroscience has "proved" our thoughts come from our brains is shown to be bias rather than fact. Neuroscience is in its infancy and has proven nothing of the sort. In fact, even fMRI imaging is nothing more than "a conjecture or hypothesis about what we think is going on in the brains of subjects."3
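The "Donkey Kong transistors" mistake is easy to reproduce in miniature. The toy Python sketch below is my own illustration, not the researchers' code: a shared loader routine on a pretend "chip" happens to be needed by one program's boot sequence and not another's, so a lesion study wrongly fingers it as a game-specific circuit:

```python
# Toy illustration of the lesion-study fallacy (hypothetical code).

def loader(disabled=False):
    """A generic low-level routine, analogous to a shared transistor circuit."""
    if disabled:
        raise RuntimeError("circuit disabled")
    return "data"

def run_donkey_kong(lesion=False):
    """This program's boot sequence happens to depend on the loader."""
    try:
        loader(disabled=lesion)
        return "runs"
    except RuntimeError:
        return "crashes"

def run_space_invaders(lesion=False):
    """This program happens not to use the loader at all."""
    return "runs"

# The "lesion study": disable the circuit and compare behaviors.
print(run_donkey_kong(lesion=True))     # crashes
print(run_space_invaders(lesion=True))  # runs
# Naive inference: the circuit is "the Donkey Kong circuit."
# Reality: it implements a generic loading function that one program
# uses and the other does not.
```

The lesion reliably breaks one observable behavior, yet attributing that behavior to the lesioned component mistakes a shared, low-level function for a specialized one, which is the researchers' point about brain-damage inferences.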
At one time, people spoke assuredly of bloodletting as the cure for various maladies. They had confidence in the science of their day. Today, people speak confidently of how much they know from neuroscience. Carrier's assertions above are just one example. One cannot simply "cut out" the one area of the brain responsible for facial recognition. If atheists are as open to reason as they say, they need to stop making grandiose claims from very tenuous data.
2. "Testing the Methods of Neuroscience on Computer Chips Suggests They Are Wanting." The Economist. The Economist Newspaper, 21 Jan. 2017. Web. 20 Jan. 2017. http://www.economist.com/news/science-and-technology/21714978-cautionary-tale-about-promises-modern-brain-science-testing-methods.
3. Noë, Alva. Out of Our Heads: Why You Are Not Your Brain, and Other Lessons of Consciousness. New York: Hill and Wang, 2009. 20.
Image courtesy Gengiskanhg, CC BY-SA 3.0
Wednesday, September 14, 2016
Every group has its biases. Enlightenment thinkers believed reason could provide the ultimate answer to all questions. The Victorians stressed common manners and proprieties. Both were helpful in some ways: manners provided a common framework for engaging with the large populations pushed together as modern cities developed, and reason is an appropriate way to seek understanding. But neither should be practiced to the exclusion of other ways of understanding.
Today, the dominant framework most people assume will provide answers and meaning is neither manners nor reason, but science. Atheists and "freethinkers" especially tend to hold an over-confidence in science as the path to discovering truth. As an example, I wrote an article entitled "Three Intractable Problems for Atheism" where I pointed out that the origin of the universe, the origin of life, and the origin of consciousness are unexplainable if all that exists is matter following physical laws. One comment I received was, "We don't know YET, because we've only just in the past century begun to seriously uncover the origins of the universe. If that day comes, and you don't like the answer, what will the next goalpost be?" What those who respond this way never explain is why they think science is even the right discipline to answer these questions at all.
Fingers and Forks
In fact, science will never be able to answer these questions because it isn't designed to do so. Let me offer an example. Early cultures primarily used their fingers to eat their food. They would pick and tear at a piece of meat or tear off a hunk of bread. Even in Jesus's day, this was pretty common. But using your fingers has some drawbacks, too. If your hands are dirty, they can contaminate the food. You can't touch things that are too hot, and the buildup of greasy food on your hands means you'll need to wash after a meal.
That's why the fork is such a great invention. It solves health issues that accompany eating only with one's fingers. But it does more than that. It allows one to keep an item from moving so it can be cut, adjusting the size of your bite to fit you individually. It skewers smaller food items, like individual beans, that would be hard to grasp with your hands. It also reflects proper manners, providing a symbol of separation from animals.
Forks have given human beings a great step forward in our culinary history, allowing us to eat in ways we couldn't have without it. However, if the chef places a bowl of tomato soup in front of me, the fork is no longer useful. The benefits that the fork conveys when consuming solid food are the very reason it fails when applied to liquids. To close the tines of the fork so it may hold liquid would rob the fork of its unique abilities to skewer other foods. I need a different tool.
Now imagine a person from the "the fork is the only way to true nourishment" camp who seeks to eat the soup with his fork. He tries to eat the soup and quickly becomes frustrated. He can dip his utensil in the soup for a long, long time, but he'll never get all the soup, and he'll probably burn more calories than he consumes trying. At this result, he may then conclude that soup isn't really food at all.
Choosing the Right Utensil When Searching for Truth
Science is like a fork in humanity's quest for knowledge. It can do a lot of things. It has improved our health and allowed us to create new polymers. It has shown us facts about the material universe and its laws. But science cannot answer where that universe and its laws came from, because it simply isn't designed to do so. It cannot tell us about things like consciousness, since consciousness is immaterial.
When pressed, atheists usually try to escape their dilemma in one of two ways. Either they claim science will get there eventually (what I call a Science-of-the-Gaps argument), which is just wishful thinking, or, as they seriously consider what human consciousness entails on a purely materialist framework, things like the capacity for free will, they begin to deny that consciousness and free will are real at all.
Science, like a fork, is useful in the hand of humanity. It can serve us well as we seek to cut into the mysteries of the universe and digest what we discover there. However, it shouldn't be the only tool on the table. To ignore other ways of consuming knowledge is to limit, not expand, our intellectual palate.
Wednesday, August 10, 2016
I find it fascinating how blind people can be to their own biases. One recent case in point is astrophysicist Neil deGrasse Tyson and his imaginary country of Rationalia. The idea began with a single tweet, in which Tyson asserted, "Earth needs a virtual country: #Rationalia, with a one-line Constitution: All policy shall be based on the weight of evidence."
It's pretty easy to see the glaring holes in such a proposition, and several commentators were quick to point out a few of them. Practices such as eugenics, abortion for disability or population control, legislating against a coming ice age, and other disastrous consequences would easily have followed had Tyson's dream been a reality in prior decades. Commentators for outlets like The Federalist, U.S. News and World Report, and New Scientist pointed out the foolishness of his original tweet.
However, Tyson doubled down on his proposition in a recent Facebook post. Linking to those articles before casually dismissing them out of hand, Tyson upped the ante, maintaining that Rationalia would not only solve deep political divisions but would usher in a new era of prosperity for humanity:
Unlike what typically occurs between adversarial politicians, in scientific discourse, when we disagree with one another there's an underlying premise: Either I'm wrong and you're right; you're wrong and I'm right; or we're both wrong. The unfolding argument actually shapes the quest for more and better data to resolve the differences so that we can agree and move on to the next unresolved problem.
In Rationalia, the Constitution stipulates that a body of convincing evidence needs to exist in support of an idea before any Policy can be established based on it. In such a country, data gathering, careful observations, and experimentation would be happening all the time, influencing practically every aspect of our modern lives. As a result, Rationalia would lead the world in discovery, because discovery would be built into the DNA of how the government operates, and how its citizens think.1
The Competitive World of Scientific Theory
Of course, Tyson's Pollyanna-ish assumption that scientists are always objective about the data while politicians are simply adversaries is ridiculous. Thomas Kuhn's The Structure of Scientific Revolutions lays out just how nonsensical such an assumption is. Kuhn argues that the scientific study of a certain concept, such as the nature of light, can have "a number of competing schools and sub-schools"2 arguing for their own understanding even when they are all using the same data. Kuhn states:
"Each of the corresponding schools derived strength from its relation to some particular metaphysic, and each emphasized, as paradigmatic observations, the particular cluster of optical phenomena that its own theory could do most to explain. Other observations were dealt with by ad hoc elaborations, or they remained as outstanding problems for further research. (emphasis added)"3

These are not detached, non-emotional observations. Scientists are people, and each has a dog in the fight, so to speak. It isn't surprising that they would want to see their own theories succeed, just as politicians want to see their own legislation pass. It isn't malicious; it's being human. And in modern research, when you add grant money into the mix, there's a potent motivator to really push to justify one's efforts.
Paradigms and Flaws
Kuhn goes on to describe other problems that plague scientific discourse, such as the fact that the "body of evidence" Tyson looks toward may itself be limited by the technology of the day. Scientists may not be able to see how their theories are flawed simply because they must guess at what data to measure and where to look for it. Maybe the instrument that would prove a theory false hasn't yet been invented. Charles Darwin couldn't have realized the complexity of living cells because no microscope in his day was capable of displaying the amazing molecular machinery that allows the cell to function.
This "body of evidence" Tyson references may also be deeply flawed. Researchers at Open Science and at Bayer labs recently found that 65 to 75 percent or more of the results reported in some of the most prestigious scientific journals could not be repeated. There was a strong body of evidence for the researchers' conclusions, but no one had previously bothered to check whether the evidence was good or not. In turn, we get biased policies such as the Food and Drug Administration's 60-year ban on dietary fat, when it turned out the scientist pushing for the restrictions was more concerned with his legacy than the facts.
Some of the problem lies in the technicality and specialization of the scientific disciplines themselves. Kuhn notes that as one of the competing concepts gathers a majority, it becomes a consensus and ultimately a paradigm that closes out the others.4 Then, as the field becomes more specialized, the paradigm is assumed by the practitioner, and "his communiques will begin to change in ways whose evolution has been too little studied but whose modern end products are obvious to all and oppressive to many."5
Tyson Ignores This Body of Evidence
Kuhn's arguments are based on historical observation of how scientific paradigms have developed. He has quite a body of evidence from which to draw: the entire history of the scientific enterprise. Yet Tyson seems to ignore it completely in his proposal for a country of Rationalia. I find that interesting. If Tyson won't even acknowledge the body of evidence showing that science is just as flawed as politics or other governing methods, then he is proving the very point his critics are making. Just because a scientist comes to conclusion X doesn't make it right, morally good, or unbiased.
2. Kuhn, Thomas. "The Structure of Scientific Revolutions." The Philosophy of Science: An Historical Anthology. By Timothy J. McGrew, Marc Alspector-Kelly, and Fritz Allhoff. Chichester, U.K.: Wiley-Blackwell, 2009. 491. Print.
3. Kuhn, 2009. 491.
4. Kuhn, 2009. 492.
5. Kuhn, 2009. 492.
Image courtesy NASA Goddard Space Flight Center from Greenbelt, MD, USA (Dr. Neil deGrasse Tyson Visits NASA Goddard) [CC BY 2.0], via Wikimedia Commons
Monday, May 16, 2016
As I engage with atheists and skeptics, I hear many of them state that religious beliefs are nothing more than relics of a bygone era. They claim that as people of science in the 21st century, we are so much more enlightened and rational than those of other eras. Level-headed people of the modern world who place their trust in science are not nearly as gullible as people in the past, they say. Then they turn around and argue that gender has nothing to do with biology and that a person's perceived identity is all that's required to change a male into a female.
This reminds me of a sketch I saw in the early days of Saturday Night Live entitled "Theodoric of York, Medieval Barber." Host Steve Martin takes on the role of Theodoric and makes great fun of the idea that certain illnesses were treated by bloodletting. Part of the humor stems from Theodoric's modern-day rhetoric, whereby he ascribes knowledge and insight to his treatments:
You know, medicine is not an exact science but we're learning all the time. Why, just fifty years ago, we would've thought your daughter's illness was brought on by demonic possession or witchcraft. But nowadays we know that Isabel is suffering from an imbalance of bodily humors perhaps caused by a toad or small dwarf living in her stomach.1

Certainly, Martin is using great exaggeration to make a joke. Yet it is true that bloodletting was practiced widely for many centuries, ever since the prominent Roman physician Galen of Pergamum described the theory that there were four primary liquids or "humours" affecting the body: phlegm, blood, black bile, and yellow bile.2 Galen had, through both observation and inference, come to the conclusion that when a person is sick, their humours are "out of balance," as Michael Boylan explains:
When imbalance occurred, then the physician might intervene by making a correction to bring the body back into balance. For example, if the individual were too full of phlegm (making her phlegmatic or lethargic), then the phlegm must be countered. Citrus fruit was thought to be a counter-acting agent. Thus, if one feels lethargic, increasing one's citrus intake will re-create balance. The treatment is, in fact, generally effective.3
Biased Assertions Lead to Bad Diagnoses
Of course, today we see such an inference as silly and worthy of ridicule in an SNL sketch. Galen had an incorrect assumption of what blood was and how the body used it.4 Those errant assumptions are at the root of his crazy treatment methods. To be certain, bloodletting sometimes appeared to work, but the treatments probably caused far more harm than good overall.
Today's rush by the left, including the intelligentsia, to validate anyone who even hints at gender dysphoria should be disconcerting to any rational populace. I've pointed out before that we have fifty years of data under our collective medical belts on gender reassignment surgery, and we know that the suicide rate for those suffering from gender dysphoria is as high after sexual reassignment surgery (SRS) as it is prior to transitioning. Dr. Paul McHugh, who helped pioneer the procedure at Johns Hopkins University, has written extensively on the failure of SRS as an effective treatment and explained that Johns Hopkins stopped performing the procedure as a result.5
Now, powerful agencies like the Obama Administration have gone even farther off the deep end and demanded that anyone who simply claims to be a different gender be allowed to use the restrooms and locker rooms of their stated sex. The demand comes with no accountability and no requirement of proof that the claimant actually wishes to consistently live and be seen as whatever their stated gender preference is.6
Fluid Gendered Identity is the Bloodletting of Today
Just claiming it makes it so? Surely, this cannot be! Certainly, we are in a more rational time than that of the medieval barber. Certainly, we don't approach a treatment based only on whatever our initial biases are, do we? It seems we do.
Those pushing these laws, in direct disregard for the safety and wellbeing of millions of women and young girls in our nation, are sheer-willed to have their version of life play out, regardless of the facts. We are not any more rational than people of other eras. Every culture can fall victim to believing what it wants to be true and ignoring inconvenient facts when they get in the way of those desires.
I wonder if, in a century or two, we will look back on the insanity of the gender identity movement and shake our heads with the same incredulity we feel concerning the practice of bloodletting. If not, there will be untold thousands who are seriously harmed by such medical quackery disguised as treatment.
2. Boylan, Michael. "Galen (130–200 C.E.)." Internet Encyclopedia of Philosophy. Internet Encyclopedia of Philosophy, n.d. Web. 16 May 2016. http://www.iep.utm.edu/galen/.
3. Boylan, Michael. "Hippocrates (c. 450–c. 380 B.C.E.)." Internet Encyclopedia of Philosophy. Internet Encyclopedia of Philosophy, n.d. Web. 16 May 2016. http://www.iep.utm.edu/hippocra/#SH1a.
4. "Galen." Medical Discoveries. Advameg, Inc., n.d. Web. 16 May 2016. http://www.discoveriesinmedicine.com/General-Information-and-Biographies/Galen.html.
5. McHugh, Paul. "Transgender Surgery Isn't the Solution." Wall Street Journal. Dow Jones & Company, Inc., 12 June 2014. Web. 02 June 2015. http://www.wsj.com/articles/paul-mchugh-transgender-surgery-isnt-the-solution-1402615120.
6. Davis, Julie Hirschfeld, and Matt Apuzzo. "U.S. Directs Public Schools to Allow Transgender Access to Restrooms." The New York Times. The New York Times, 12 May 2016. Web. 16 May 2016. http://mobile.nytimes.com/2016/05/13/us/politics/obama-administration-to-issue-decree-on-transgender-access-to-school-restrooms.html.
Monday, May 09, 2016
"Not enough evidence!" That's the claim I hear over and over when asking atheists why they don't believe in God. Even when the famous atheist philosopher Bertrand Russell was asked what he would say if he were to come face to face with God after his death, Russell famously replied, "I probably would ask, 'Sir, why did you not give me better evidence?'"1
The demand for evidence can seem like a reasonable request, but it can also serve as a smokescreen for those who are unwilling to believe. For example, developmental biologist Lewis Wolpert stated he rejected God fairly early in his life because he could find no evidence for God at all. In a radio show where he debated intelligent design with William Dembski, Wolpert said over and over again that there is nothing he could see by studying the molecular machinery required for living cells to function that could serve as evidence for any kind of intelligence. Dembski asked, "Is there nothing that biological systems can exhibit that would point you to a designer?" Wolpert emphatically replied, "Absolutely nothing, absolutely nothing."2 This corresponded with his previous statement that "What we know about biology can all be explained in terms of the behaviors of cells."3
Intelligent Messages Hidden in DNA
Is Wolpert's claim true? Can everything one finds in the cell be explained by cellular behavior alone? Earlier in their conversation, Dembski alluded to the work of cellular biologist J. Craig Venter. Venter and his team made headlines around that same time by assembling the DNA for a replicating synthetic bacterium (M. mycoides JCVI-syn1.0) one base pair at a time using computers. Singularity University reported, "To verify that they had synthesized a new organism and not assembled the DNA from another natural bacteria, scientists encoded a series of 'watermarks' into the genes" of Venter's bacterial DNA. He coded his own name, a URL address, and other messages.4
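The watermark idea can be illustrated with a toy encoding. The Python sketch below is a simplification of my own, not Venter's actual scheme (his team used a custom codon-to-character table); it merely shows how arbitrary text such as a name or URL can be written into, and read back out of, a string of A/C/G/T bases:

```python
# Toy illustration: store ASCII text in DNA bases, two bits per base.
# This is NOT the scheme Venter's team used; it only demonstrates that
# a message can round-trip through a nucleotide sequence.

BITS_TO_BASE = {"00": "A", "01": "C", "10": "G", "11": "T"}
BASE_TO_BITS = {base: bits for bits, base in BITS_TO_BASE.items()}

def text_to_dna(text):
    # Each character becomes 8 bits, then 4 bases.
    bits = "".join(f"{ord(ch):08b}" for ch in text)
    return "".join(BITS_TO_BASE[bits[i:i + 2]] for i in range(0, len(bits), 2))

def dna_to_text(dna):
    # Reverse the mapping: bases back to bits, bits back to characters.
    bits = "".join(BASE_TO_BITS[base] for base in dna)
    return "".join(chr(int(bits[i:i + 8], 2)) for i in range(0, len(bits), 8))

watermark = text_to_dna("J. Craig Venter")
print(watermark)             # a string made only of A, C, G, T
print(dna_to_text(watermark))  # recovers "J. Craig Venter"
```

A researcher who later sequenced such DNA and applied the decoding step would recover the original text, which is exactly why a watermark of this kind serves as an unmistakable mark of an intelligent source.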
Let's now imagine a scenario where, in 50 or 100 years, people are catching a strange new disease. Scientists have narrowed the illness to an unfamiliar bacterium that doesn't behave like anything they've ever seen before. Wolpert's students isolate the bacterium in the lab and map its DNA to figure out where it came from. There, they find Venter's name encoded in the nucleotide sequence, but because they have adopted Wolpert's standard that nothing inside the cell can count as evidence, they cannot assume there was an intelligence behind the origin of this bacterium. Venter's work cannot be counted as evidence because it appears inside the cell, and appealing to an intelligence as the origin of this new bacterial strain is supposedly a science-stopper.
Wolpert's dogmatic stance shows his incredible bias and demonstrates why complaints like his demand for evidence are structured to never succeed. It's a shell game. If the complexity of something like a researcher's name or a URL is sufficient to show intelligence behind the genome, then why wouldn't other complex, specified coding sequences also serve as evidence for a designer? Certainly other factors must be considered. However, Wolpert rules out the possibility of finding evidence for design within biological systems at all. That sounds to me like the far more unreasonable position to take.
2. Brierley, Justin, William Dembski, and Lewis Wolpert. "2 Jan 2010 - Intelligent Design - William Dembski Debates Lewis Wolpert: Saturday 02 January 2010." Unbelievable? Premier Christian Radio, 2 Jan. 2010. Web. 09 May 2016. At about the 19:30 mark. http://www.premierchristianradio.com/Shows/Saturday/Unbelievable/Episodes/2-Jan-2010-Intelligent-Design-William-Dembski-debates-Lewis-Wolpert.
3. Brierley, Justin. 2010.
4. "Secret Messages Coded Into DNA Of Venter Synthetic Bacteria." Singularity HUB. Singularity Education Group, 25 May 2010. Web. 09 May 2016.