Tuesday, December 15, 2009
For example, one commonly hears things like, "our planet is in peril," or, "if we don't act soon, we won't have a planet at all."
The history of our planet says otherwise. For example, the Earth was about 15 degrees C warmer than it is currently around 50 million years ago, during the Eocene. Our planet was very warm and largely ice-free, even at the poles. Yet it survived. Carbon dioxide levels have also been much higher in the past, perhaps even a dozen times higher in the deep past. What does this mean? It means our planet is not in peril; it has experienced much warmer temperatures than today, and even being ice-free does not somehow spell doom for planet Earth.
Climate change does threaten the specific species currently on Earth. In other words, some of the life on Earth is put in danger by climate change, not the Earth itself. Living things will have to adapt to a changing climate or diminish and possibly go extinct. The obvious example is the polar bear, as at least some polar bear populations are declining. If the Arctic ice does completely disappear, as some scientists are predicting, polar bears will indeed have to adapt or perish.
So, while talk of the destruction of our planet may be rhetorically useful, it is inaccurate to say that the planet itself is somehow in danger, or that it will become uninhabitable. Rather, the planet itself will be fine, as will its ability to support life in general. For many species, though, climate change will be a growing threat, and if we want to preserve our current ecosystems we will have to take the necessary steps to curb our influence.
Thursday, December 10, 2009
One interesting thing is the subtitle of the book: How the Bible fails to answer our most important question - why we suffer. While it is certainly an important question, it is not clear that this is the most important one; it is also not clear that the Bible's purpose is to answer this question. In this sense, I wonder if the whole premise of the book is slightly off. Usually it is the Christians who are accused of viewing humanity as all-important, but if Mr. Ehrman thinks the question of why humans suffer is the most important question there is, it would seem it is he who has an elevated view of humanity.
Another bizarre idea of Ehrman's is his belief that to say the problem of suffering is beyond our ability to comprehend (i.e., a mystery) is the same as saying there is no answer to the problem of suffering. Just because humans may not know an answer does not mean there is no answer. Ehrman seems to be elevating humans to a level where, if we cannot arrive at a solution to something or cannot comprehend it, then it must not exist.
Still, I am interested in hearing him out. The book seems like it will be a passionate articulation of both a difficult intellectual problem and the reasons for Ehrman's loss of faith (he used to be an evangelical Christian).
Saturday, September 5, 2009
As we’ve covered earlier, Christianity often gets a bad rap for its supposed detrimental effects on scientific work, particularly during the medieval period. Many contemporary thinkers similarly dismiss astrology and alchemy as worthless nonsense despite the fact that historians of science, such as David Lindberg, have shown that modern astronomy and chemistry would likely not exist (or would be very different) without these medieval precursors.
Astrology throughout the Middle Ages was not simply the horoscope and zodiac obsessed practice that we know today. It was a branch of natural philosophy dealing with the physical influence of the cosmos on the earth. Casting horoscopes, etc., was a part of astrology, but a contentious one. Many medieval thinkers criticized this part of astrology while accepting the idea that celestial events have an influence on earthly ones. And they had good reasons for doing so. For example, it was clear that the sun had a profound influence on the earth, bringing heat and light and causing the seasons. The moon also had a clear influence by causing the tides. Several Greek intellectual traditions considered the investigation of the connections between the heavens and the earth as a legitimate and rational enterprise. Interest in astrology was also in many cases the primary motivation for the expansion of astronomical knowledge. Astrology played an important role in the development of modern astronomy and was not entirely wrong in its descriptions of the causal influence of the heavens on the earth.
Also, it turns out that long before modern secularists dismissed astrology as a pseudoscience, the Christian church was its major critic. Some of the common tenets of astrology included the idea that celestial bodies were divine and could influence or determine the fate of human beings on earth. The church objected to both of these doctrines, asserting that humans have free-will and that celestial bodies were not gods that could determine events on earth.
Monday, August 10, 2009
First, I want you to look at the computer that you are using right now to view this page. Whether it is a laptop or a desktop computer, you are looking at something rather special. Here's what I mean: your computer is made of ordinary elements. Silicon, aluminum, iron, carbon, etc. The elements in your computer are not any different from elements you could find anywhere else. But, in this case, they are allowing you to connect to a world-wide network of information and view things like this webpage. Why? Nothing in the elements themselves dictates this sort of function. One could have all the appropriate elements and compounds together, but there would be no tendency for a computer to emerge. That is, these elements, along with the physical and chemical laws that govern their behavior, allow a computer to be built out of them. However, the raw materials themselves along with the natural laws that govern them are not capable of creating a computer. We can therefore say that computers are contingent; that is, the existence of computers cannot be explained by any known scientific laws, and nothing in the universe requires that they exist. Contrast this to, say, the gravitational attraction between the Earth and the Sun. In our universe, the laws (of gravity, in this case) demand a force of attraction (in Newtonian terms) of a certain magnitude between these two masses. They have no choice in the matter. A computer, on the other hand, is entirely contingent. Your computer's elements have been organized in a complicated, specific arrangement that results in the functional computer you have in front of you, but nothing in the physics or chemistry of those elements led to this specific arrangement.
OK, so what does this have to do with the origin of life? It turns out that biological life, as we know it, is contingent. DNA, for example, contains the genetic information according to which you or I or any living thing is built. DNA can be said to contain the blueprints of an organism. It consists of long strands of what are called nucleotides, major parts of which are made of nucleobases such as adenine (A), cytosine (C), guanine (G), and thymine (T). The arrangement of these nucleobases is largely responsible for the information contained in the strands of DNA. For a simple analogy, think of the words in an encyclopedia, or in this post. The arrangement of the letters (of the English alphabet, in this case) determines the information contained in a written entry. Different arrangement, different meaning. Different arrangements of the four "letters" of the genetic language (A, C, G, T) are responsible for all of the different organisms we see on earth. Now, this is not merely an analogy; DNA is not like a language, it is a language. Contemporary biologists have come to realize that when they speak of genes, they are speaking of information, not something physical. Here's where it gets interesting: information transcends physics and chemistry.
I don't mean this in some mystical, metaphysical way. Think about it. When you write a journal entry in your notebook, you are using ink and paper to represent information. But the ink and paper itself cannot be responsible for the information content, nor can the ink and paper itself be said to be the information. Rather, the ink and paper happen to be the physical medium for embodying that information. The information itself transcends the physical and chemical. At first the information was in your mind, then on your paper, and then could be typed into a computer document, then transferred onto a thumb drive, etc.
In each case the information was being physically stored in a medium, but it is clear that the physical media are not themselves responsible for the creation of the information. In fact, the physical medium must be neutral, or flexible, in order for it to usefully store information. Consider your ink and paper again. If whenever you wrote the letter "t", the physical properties of the ink and paper forced the next bit of ink to form the letter "s", then you would have a very difficult time writing anything meaningful. The physical and chemical properties would determine the content of the paper, and your journal entries would be limited to repetitive strings of letters. Physical and chemical laws are good at repetition (a rock will always fall the same way, bits of lava will crystallize again and again in the same pattern), but cannot explain the origin of complex, specific information.
Let's get back to biology: our DNA strands, with the arrangement of our bases (letters) A, C, G, and T, cannot be explained by any physical or chemical laws. Like the ink and paper, the physical laws of the DNA strand itself allow a near infinite variety of arrangements of DNA letters. How, then, can the materialist explain the origin of genetic information? If one is a materialist (i.e. one who holds that the physical, or nature, is all that exists), one is stuck trying to explain the origin of genetic information in terms of the known physical and chemical laws of the genetic molecules themselves. This is very literally like trying to explain this essay by appealing to the physics and chemistry of the molecules in the computer screen, or like trying to explain a newspaper's information content in terms of the physical and chemical properties of the ink and paper. As we have seen, the physical medium itself must remain neutral on the arrangement of letters or else the medium is useless for information storage. A neutral physical medium cannot explain why we see one particular arrangement and not another.
[In case the reader is wondering, chance cannot explain the first bit of genetic information; the probabilities of accidents resulting in even the "simplest" self-replicating cell are incredibly low (a single cell in your body contains vastly more information than Encyclopedia Britannica). Origin of life researchers do not consider chance to be a viable explanation, but are investigating scenarios where unique chemical conditions can hopefully explain the emergence of the first self-replicating machine.]
I am not trying to suggest that materialists are stupid; I am merely drawing attention to an important conceptual problem. If you are familiar at all with origin of life research then you know that scientists have investigated a huge range of possible scenarios leading to the origin of the first cell with some genetic information. While this research is extensive and informative, it has, above all, been inconclusive. See Shapiro's Origins for a devastating critique, or Schopf's Life's Origin for an optimistic but honest assessment of the research. I am not calling for origin of life researchers to quit and say "God did it." I am, however, suggesting that researchers should pay careful attention to criticisms like those discussed above. It is possible that new laws will be discovered that can shed light on this problem. Only time will tell if the origin of life will eventually be explicable in physical and chemical terms. I will enjoy following along to find out.
Monday, July 20, 2009
Thursday, July 2, 2009
All of these efforts to start making kids smart earlier are wrongheaded because the education is not developmentally appropriate to those age groups. Research has found, for example, that children who learn how to read later on, around 6 or 7, actually become better, more enthusiastic readers in the long run. If they are "taught" too early, the children learn how to associate symbols with sounds, but that's it (that's all their brains can do at that point). Though parents may think their kids are reading, they are actually just parroting back noises, and any apparent "advantage" in reading disappears around first grade. And kids who are part of normal kindergarten (not super-academic kindergarten), with good old-fashioned play time, become the better students over the course of their education.
Research has also shown that for every hour of things like "Baby Einstein" babies and toddlers watch, their vocabulary decreases by some percentage. Their brains are simply not ready for that, and it does not help to try and force it. What then should parents do to encourage good brain development? Keep it simple: love your kids, provide a nurturing environment, let them play, and read to them. Read to them a lot. And don't worry if your child seems like they are developing a little late...the research shows that these children often have the more advanced brain down the road.
P.S. The commonly heard "Einstein was just an average student" is misleading; though he sometimes earned average grades, he was actually a brilliant student. The problem was that he was kind of a rogue and his instructors often didn't like him. He was reading Immanuel Kant at age 14 and loving it (for context, one of Kant's books remains untranslated because the scholars who have tried have stopped for fear of losing their sanity).
Sunday, June 28, 2009
The idea of evolution undoubtedly had a strong effect on Victorian beliefs and attitudes. An evolutionary idea of progress permeated Victorian culture; one might say it was the zeitgeist of the 19th century and even the beginning of the twentieth. Europeans saw themselves as the pinnacle of progress, while nonwhite peoples were seen as inferior and less intelligent, with the primates just below them, and so on. But, this cultural bias towards other races existed before Darwin and his theory of evolution. As early as the 17th century (Darwin published The Origin of Species in 1859) naturalists were studying human skulls from various cultures and concluding that whites were the superior race. Louis Agassiz, Samuel Morton, and Robert Knox were prominent 19th century writers/scientists who, before Darwin, concluded that blacks were an inferior race, even the “lowest grade of humanity.”
While we certainly cannot blame Charles Darwin for these prejudices, his theory was seen to provide scientific support for them. His specific theory of evolution suggested that all life forms compete in the struggle for existence. The more fit will survive while the less fit die out. It was clear, then, that European culture, having survived and progressed far beyond any others, was superior to other cultures. In other words, Darwin’s mechanism could explain, scientifically, how some races could come to be superior to others. It was also used to validate slavery. For example, it was argued that if blacks were set free, they would inevitably go extinct. Blacks, being inferior to other races, would lose in the struggle for existence; it was therefore charitable to keep them as slaves and preserve their race (besides, it was also well-known that blacks became vicious when given freedom and an education). We must remind ourselves again, though, that the idea of evolution did not originate with Darwin and was already quite well-known by the time Darwin published his book. The idea of a struggle for existence, both biological and social, had also already been made popular by Thomas Malthus and Herbert Spencer. In other words, Darwin’s theory was built upon, and was in some sense a product of, preexisting Victorian values.
Victorian anthropologists, before Darwin, were ethnocentric: Europeans were seen as the highest, most advanced race, while other races were viewed as lower and inferior. However, these anthropologists were largely monogenist, meaning they viewed all races as being part of the same human species. In the late 19th century, after Darwin published The Origin of Species, many physical anthropologists became polygenist, viewing other races as separate species from Europeans. Eventually a developmental view of society emerged. In short, this developmental viewpoint saw native peoples as relics of our evolutionary past; people who had not evolved much higher than the apes, and whose adults had intelligences similar to that of European children (a side note: many Victorians also saw European women as having intelligences similar to a child’s). This evolutionary, developmental view of culture did not exactly encourage an equitable view of races and cultures.
It is safe to say that Darwin’s theory was used to support the ethnocentric and racist Victorian view of other races. It also perhaps tilted physical anthropology into a slightly more racist mode (it also eliminated previously strong ties with missionary work). But we cannot place too much blame on Darwin. Racism was a general prejudice of the time, and cannot be linked solely to Darwinism (even the general idea of evolution cannot take all the blame: Louis Agassiz, mentioned above, was a creationist who viewed blacks as having been separately created and inferior to whites). Darwin's theory may be responsible for adding scientific credibility to racism, but not for its genesis.
Darwin himself, on the other hand, was an abolitionist and was convinced that the differences between races were ones of education and upbringing, not inherent nature. This was made clear to him through his experiences with "civilized savages," men from primitive cultures who had been raised in England and whose behavior and intellect were indistinguishable from those of Europeans.
Saturday, May 2, 2009
Darwin's combination of chance and natural selection was crucial; chance by itself (random mutations in the genetic code or random conglomerations of organic chemicals) is a hopeless approach to constructing even the simplest biological molecule. Take a small protein which consists of a chain of around 100 amino acids; the chance for a small functional protein to form in a random search is around 1 in 20^100, or roughly 1 in 10^130. This is well beyond any reasonable probability limit, as would be the probability for any other biological feature (e.g., a strand of DNA). But, with natural selection preserving the individual random changes that are beneficial, larger changes can gradually accumulate. Richard Dawkins illustrates this with the following scenario: take the phrase "METHINKS IT IS LIKE A WEASEL." Imagine a computer randomly generating phrases out of the 26 letters of the English alphabet; the probability of producing this phrase through random letter generation is practically zero. However, if whenever a letter that is part of the phrase appears, the computer keeps that letter, then gradually (in fact, quite easily) the target phrase will be generated. In the biological world, natural selection and mutation achieve something similar: natural selection preserves the beneficial random mutations, resulting in new biological information.
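The scenario as described above is easy to simulate. Here is a minimal Python sketch of the letter-locking variant just described (note this is the simplified version of the illustration, not Dawkins's exact mutation-and-offspring program; the function name and seed parameter are my own):

```python
import random

TARGET = "METHINKS IT IS LIKE A WEASEL"
ALPHABET = "ABCDEFGHIJKLMNOPQRSTUVWXYZ "  # 26 letters plus the space

def cumulative_search(seed=0):
    """Re-randomize only the positions that don't yet match the target;
    return the number of generations needed to reach it."""
    rng = random.Random(seed)
    phrase = [rng.choice(ALPHABET) for _ in TARGET]
    generations = 0
    while "".join(phrase) != TARGET:
        generations += 1
        for i, letter in enumerate(phrase):
            if letter != TARGET[i]:        # a matching letter is kept forever
                phrase[i] = rng.choice(ALPHABET)
    return generations

# Cumulative selection typically converges in well under a hundred generations,
# while the odds of hitting the phrase in one blind draw are (1/27)**28.
```

Running it makes the contrast vivid: the locked-in search finishes almost instantly, whereas pure random generation would effectively never succeed. The catch, as discussed next, is that the program knows the target in advance.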
But can information really be gotten so easily? It turns out that Dawkins's illustration is fatally flawed. Consider: the computer had a target phrase, and preserved incremental changes by comparing the randomly generated letters to this target phrase. In nature there can be no targets; natural selection and random mutation do not have a goal in mind. In nature, only beneficial changes can be preserved, not changes that will be beneficial in the future. An intelligent agent (Dawkins, in his illustration) may know that the gibberish is gradually turning into something that makes sense (METHINKS IT IS LIKE A WEASEL), but blind natural forces cannot possibly be shooting for such a target. So it turns out that the illustration meant to show how blind natural forces can generate information actually contains the target information ahead of time, put there by an intelligent agent.
William Dembski and Robert Marks II are working on what they are calling the Law of Conservation of Information. Essentially, it states that you cannot get more information out of a computational algorithm than you put in initially. If this is true, nonteleological evolution cannot in principle explain the origin of new biological information. See their paper here: http://www.uncommondescent.com/evolution/life%E2%80%99s-conservation-law-why-darwinian-evolution-cannot-create-biological-information/
Sunday, April 26, 2009
Christians often associate the big bang theory with an atheistic explanation of the universe (atheistic in the sense that the explanation does not require invoking a creator; the universe, in other words, has a naturalistic explanation). In fact, one of the big bang theory's original formulators and proponents was a Belgian priest by the name of Georges Lemaître. While he based his theory on science, not his religious beliefs, there was a clear consistency between the traditional Christian view of the origin of the universe and the big bang theory. The establishment view of the time was a static, eternal universe; this preference can be traced back to the Greeks and was often considered the more appropriate, secular way of viewing the universe. Several decades after Lemaître's initial proposal, some other astronomers had gathered some observational and mathematical support for the big bang theory. While the evidence was not conclusive, it certainly was worth paying attention to. However, this scientific work was largely ignored and ridiculed by the scientific establishment. Some scientists ridiculed the big bang as being a thinly disguised attempt to bring religion into science by characterizing God's initial creation event in scientific terms. The Pope also noticed the congruence between Christianity's creation event and the big bang, formally declaring that science had vindicated Christianity's long-standing belief that the universe had a beginning. This of course further enraged the more zealous of the secular astronomers and probably further delayed acceptance of the theory.
So, in contrast to what people often seem to believe, the big bang theory actually can quite easily be seen to support (or at least be compatible with) the initial creation event of Christianity. In fact, there is more tension between an atheistic worldview and the big bang than there is between Christianity and the big bang (a side note: I do not consider it wise for Christianity to ally itself too closely with any specific theory). After all, an eternal universe does not require a starting point, and therefore, no creator. The big bang theory posits that the universe itself (including space, time, and all known laws of physics) originated during this singular event. The implication is that something outside of space, time, and the known laws of physics must be responsible for the initiation of this event. This has not gone unnoticed by secular astronomers, hence the efforts by scientists such as Stephen Hawking to escape the conclusions of their own work by mathematically wriggling their way out of a universe with a beginning (Hawking uses imaginary time to circumvent an actual beginning, preferring instead a self-contained universe with no possibility of a creator).
To sum up, Christians need not fear the big bang theory (or embrace it too closely). Rather, we can observe the shared characteristics that it has with Christian beliefs and watch where the science leads.
Saturday, April 25, 2009
Supplied you not your spirit, but your shape.
All Eden's wealth arrayed before your eyes;
I fathomed not you wanted to escape.
And though I only ever gave you love,
like every child you’ve chosen to rebel;
uprooted flowers and filled the holes with blood;
ask not for whom they toll the solemn bells.
A child of dust to mother now return;
for every seed must die before it grows.
and though above the world may toil and turn,
no prying spade will find you here below.
Now safe beneath their wisdom and their feet,
Here I will teach you truly how to sleep.
Sunday, April 19, 2009
Recent and Future Climate Change
While past climate change remains somewhat enigmatic, there exists a stronger consensus regarding current climate change (IPCC, 2007). As discussed above, the earth has warmed approximately 0.75 degrees Celsius over the last 150 years (Maslin, 2009). While around 26% of this warming can be accounted for by solar forcing, the majority of the warming is due to anthropogenic CO2 emissions (Karl, 2003). Each year human activities add around 6 gigatons of CO2 to the atmosphere (Eubanks et al., 2006). Approximately 4/5 of these emissions come from the combustion of fossil fuels, with the remaining 1/5 coming from deforestation or other land-use changes (Maslin, 2009). Anthropogenic influence on the warming of the 20th and 21st centuries has now been detected through the modeling of recent climate change (Hegerl et al., 2007). Separating out anthropogenic warming from natural variability is necessary because we are currently in a naturally warm interglacial period known as the Holocene (Maslin, 2009).
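The emissions figures above work out as follows (a simple back-of-the-envelope tally of the cited numbers, not new data; the variable names are mine):

```python
# Annual anthropogenic CO2 additions to the atmosphere, in gigatons,
# using the figures cited above (Eubanks et al., 2006; Maslin, 2009).
total_gigatons = 6.0
fossil_fuel_share = total_gigatons * 4 / 5   # ~4.8 Gt from fossil-fuel combustion
land_use_share = total_gigatons * 1 / 5      # ~1.2 Gt from deforestation and
                                             #  other land-use changes
```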
The Holocene began around 10,000 years BP (Maslin, 2009). This marked the end of the last ice age, which reached its maximum around 21,000 years BP (Carlson, Clark, Raisbeck, & Brook, 2007). Following the last glacial maximum (LGM), the earth warmed steadily (Maslin, 2009, p. 42). Approximately 19,000 years BP there was a 10+ meter rise in sea levels that took place within ~500 years (Clark, McCabe, Mix, & Weaver, 2004). Within a period of ~500 years at the beginning of the Holocene there was a rise in temperature of 8-13 degrees Celsius (Birks & Ammann, 2000). The Laurentide Ice Sheet, which covered much of North America, retreated between 9,000 and 8,400 years BP. This resulted in a further ~5 meter increase in sea levels within a ~1000 year period (Carlson et al., 2007). Within the Holocene there have been periodic cooling and warming episodes, some of which were dramatic, but the last 1000 years have been relatively stable (Thornalley et al., 2009; Maslin, 2009, p. 45). Starting in the late 19th century temperatures have begun to rise once again (IPCC, 2007; Maslin, 2009).
Rising temperatures will bring a host of other changes, some of which will further modify earth’s climate. These changes include melting ice cover, rising sea levels, and shifting oceanic and atmospheric circulation patterns (Maslin, 2009). Changes that could further influence earth’s climate are called feedback mechanisms; these could either accelerate or buffer global warming, and are one of the largest sources of uncertainty in future climate predictions (Bony et al., 2006; Raisanen, 2007).
Effects of Global Warming
The first effect of global warming discussed by Al Gore in the documentary An Inconvenient Truth is the melting of mountain ice caps, or glaciers (Bender, 2006). The first example given is of the retreating glacier atop Mt. Kilimanjaro in Africa. Kaser (2004), however, found that this particular glacier’s retreat is due to factors other than global warming; this is in part evidenced by the fact that temperatures never rise above freezing at the glacier’s altitude. However, other glaciers are retreating due to global warming (Kaser, 2004). Approximately 67% of glaciers in the Himalayan mountain range are currently retreating (Ren, Karoly, & Leslie, 2007). According to Ren et al. (2007) glaciers in the Himalayan mountains, a source of fresh water for approximately half of the world’s population, may disappear by 2100.
Another concern as the earth warms and ice cover melts is that global sea levels will rise and threaten the coastal communities of people around the world (Alley, Clark, Huybrechts, & Joughin, 2005). From 1961 to 2003 sea level has been rising at an average rate of 1.8 mm per year; from 1993 to 2003 the average rate increased to 3.1 mm per year (IPCC, 2007). This rate is low compared to some episodes of past climate change; during the Holocene melting of the LIS (discussed above) sea levels rose at rates up to 10 mm per year (Carlson et al., 2007). Two particular melting episodes between the LGM and the start of the Holocene had peak rates perhaps greater than 50 mm per year (Alley et al., 2005). While current rates are low by historical standards, even small increases in sea level rise could have a substantial impact on coastal areas through erosion, groundwater contamination, and increased vulnerability to storm surges (Alley et al., 2005).
Approximately half of the recent sea level rise comes from thermal expansion of the water itself; the other half is due to the melting of land-based ice sheets (Alley et al., 2005; IPCC, 2007). In An Inconvenient Truth Al Gore does not mention thermal expansion, only the melting of ice caps on mountains and the ice sheets over Greenland and Antarctica (Bender, 2006). Gore uses computer models to simulate a 20-foot sea level rise to demonstrate the future effects of melting ice cover (Bender, 2006). However, current melting rate estimates for Greenland and Antarctica are, respectively, +0.5 mm per year and -0.6 mm per year; this results in a total net contribution to sea level of around zero (Alley et al., 2005). Contributions to sea level rise from mountain glaciers are projected to be < 1 mm per year through the 21st century (Raper & Braithwaite, 2006).
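For a sense of scale, the rates cited above can be tallied in a few lines (a rough arithmetic sketch using the figures from Alley et al. (2005) and the IPCC (2007), not a climate model; it also assumes the 3.1 mm/yr rate holds constant, which it almost certainly will not):

```python
# Recent sea-level-rise contributions, in mm per year (figures cited above).
observed_1993_2003 = 3.1                     # total observed rate (IPCC, 2007)
thermal_expansion = observed_1993_2003 / 2   # roughly half of the total
greenland = +0.5                             # melting (Alley et al., 2005)
antarctica = -0.6                            # net ice gain (Alley et al., 2005)
ice_sheets_net = greenland + antarctica      # ~ -0.1 mm/yr: effectively zero

# At a constant 3.1 mm/yr, the 20-foot (~6096 mm) rise Gore depicts would
# take on the order of two thousand years.
years_for_20_feet = 6096 / observed_1993_2003
```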
While current sea level change from ice sheets may be negligible, large uncertainties exist concerning ice-sheet dynamics and the possible responses to global warming (Alley et al., 2005; Huybrechts, 2006). An increase of glacial earthquakes in Greenland has been detected, and overall melting rates are increasing (Alley et al., 2005; Ekstrom, Nettles, & Tsai, 2006). During the last interglacial period (129,000 years BP) sea level was 4-6 meters higher than present (Otto-Bliesner et al., 2006; Overpeck et al., 2006). Greenland is thought to have contributed > 2 meters to sea level rise at that time (Overpeck et al., 2006). A rise in sea level of that amount today could cover some low-lying countries (Overpeck et al., 2006). Dramatic past changes in sea level raise the possibility of such dramatic change happening during the present warming; more research into ice-sheet dynamics is needed (Alley et al., 2005).
Another effect of global warming that has received high media attention and was featured in An Inconvenient Truth concerns changes in hurricane patterns (Curry, Webster, & Holland, 2006; Bender, 2006). Studies have shown an increase in sea surface temperatures (SST) over the past 50 years (IPCC, 2007). This rise in SST correlates with a rise in the frequency of intense hurricanes since 1970 (IPCC, 2007). Al Gore uses the example of Hurricane Katrina, which struck the Gulf Coast in the U.S. in 2005, to illustrate the impact of global warming (Bender, 2006). Some studies, however, indicate that the effect of global warming on hurricanes is uncertain: Landsea, Harper, Hoarau, and Knaff (2006) argue that the database of past hurricane activity is too short and unreliable to use to detect trends in intense storms. In addition, there is evidence that as temperatures increase so will wind shear over the Atlantic and Pacific oceans (Vecchi & Soden, 2007; Wang & Lee, 2008). Wind shear is an atmospheric phenomenon that could result in a decrease in hurricane activity and the number of hurricanes making landfall (Vecchi & Soden, 2007; Wang & Lee, 2008). A statistical analysis by Dailey, Zuba, Ljung, Dima, and Guin (2009), however, found that increasing SST will likely increase the number of hurricanes making landfall at least in the Southeastern United States. More research in this area is needed, as storm risk is also an important policy issue for coastal communities (Curry et al., 2006).
Ocean circulation is another variable likely to be affected as global temperatures rise. The Atlantic meridional overturning circulation (AMOC), an important transport mechanism for moving heat around the globe as well as an important part of the carbon cycle, can be affected by both temperature change and salinity change as fresh glacial meltwater is added to the oceans (Thornalley et al., 2009). Changes in Atlantic currents can have significant regional climate impacts (Clark et al., 2004). In the early Holocene a regional (and possibly global) cooling event was caused by meltwater disrupting North Atlantic Deep Water (NADW) formation (Rohling & Palike, 2005). Disruption of NADW formation can paralyze the Gulf Stream, an important transporter of heat to Northern Europe (Maslin, 2009). There is evidence of meltwater and other factors already affecting the current salinity of the Atlantic Ocean (Curry et al., 2003). If meltwater disrupts the Gulf Stream, Europe could expect much colder, more extreme winter weather (Maslin, 2009). More research is needed in order to accurately model future changes in ocean circulation and its potential effects on climate (Curry et al., 2003).
Biodiversity will also be affected by climate change; some species will benefit and others will be adversely affected (NRC, 2008). Some forests may be threatened by rising temperatures (Scholze, Knorr, Arnell, & Prentice, 2006). Other plant species, such as soybeans, will have increased vulnerability to predation as CO2 levels rise (Zavala, Casteel, DeLucia, & Berenbaum, 2008). Some grasslands, on the other hand, seem to be mostly immune to climate change (Grime et al., 2008). Overall, animal biodiversity is projected to decrease due to global warming, particularly for species that cannot easily migrate or adapt to a changing climate (Maslin, 2009).
Several other areas will also be impacted by global warming, including agriculture, the spread of certain diseases, increases in wildfires, and an increase in extreme weather events such as floods and droughts (Maslin, 2009; Scholze et al., 2006). Some models and observations suggest that a human influence can already be detected in changing precipitation patterns; precipitation is increasing at middle latitudes in the Northern Hemisphere, while it is decreasing in the Northern Hemisphere subtropics (Zhang et al., 2007).
The following is part of my research into climate change. I do not consider it complete; some sections need to be fleshed out a bit, but it's what I've got for now. Also, I had some formatting problems with the last half or so, hence the breaks instead of indents between paragraphs.
Climate change may be the most important scientific issue facing the global community (Maslin, 2009). In the past few decades several international organizations have formed to meet the need for scientific, political, and economic analysis. Climatology and paleoclimatology are relatively new fields of research; the development of more sophisticated global-scale observations is needed to validate and refine existing atmospheric models (Crutzen, 2000). The large uncertainties surrounding climate change remain current areas of research (Raisanen, 2007).
Global climate change is thought to be primarily due to variations in the Earth’s orbit around the Sun and varying levels of greenhouse gases (Maslin, 2009; Stanley, 1999). These influence the Earth’s energy budget by affecting, respectively, the amount of energy received from the Sun or the amount of energy lost due to radiative cooling (Maslin, 2009).
Variations in the earth’s orbit thought to be responsible for climate change include changes in the earth’s precession, obliquity, and the eccentricity of its orbit. Planetary orbits are elliptical, and the eccentricity of a planet’s orbit is the ratio of the distance between the two foci to the length of the major axis of the ellipse (Morrison, Wolf, & Fraknoi, 1995). In other words, eccentricity describes how stretched out the oval-shaped ellipse is; this determines the range of distances between the planet and the sun over the course of a revolution.
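As a back-of-the-envelope illustration, eccentricity directly sets a planet's closest and farthest distances from the sun. The sketch below uses Earth's approximate present-day values, which are my own assumed inputs rather than figures from the sources cited above:

```python
# Sketch: how eccentricity sets the range of sun-planet distances.
# Assumed inputs: semi-major axis of 1 AU and Earth's present-day
# eccentricity of ~0.0167 (both approximations, not from the text).
a = 1.0       # semi-major axis in astronomical units (AU)
e = 0.0167    # orbital eccentricity (dimensionless)

perihelion = a * (1 - e)   # closest approach to the sun
aphelion = a * (1 + e)     # farthest distance from the sun

print(f"perihelion: {perihelion:.4f} AU")  # ~0.9833 AU
print(f"aphelion:   {aphelion:.4f} AU")    # ~1.0167 AU
```

Over Milankovitch timescales, eccentricity slowly cycles between roughly zero and about 0.06, widening or narrowing this spread and thus modulating the seasonal contrast in insolation.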
Obliquity describes the tilt of the earth on its axis (~23°) and precession is the approximately 26,000 year cycle in which the earth “wobbles” on its axis like a top (Morrison, Wolf, & Fraknoi, 1995). These variations affect the amount of incoming solar radiation (or insolation) received from the sun and therefore affect Earth’s climate (Maslin, 2009).
Orbital variations (sometimes called Milankovitch cycles after the scientist who popularized the theory) are an example of external forcing on earth’s climate. Greenhouse gases are an example of an internal forcing (Maslin, 2009). When the Sun’s radiation reaches the Earth, around 30% of it is reflected back out into space and the other ~70% is absorbed. Earth’s atmosphere absorbs around 20% of the energy and the earth’s surface absorbs the remaining 50% (Karl, 2003; Maslin, 2009). The energy absorbed by earth’s surface is then re-radiated in the form of infrared light (Karl, 2003). This infrared light (or heat energy) is mostly released into space; greenhouse gases, however, absorb some of this energy and re-radiate it within the earth’s atmosphere. This is the “greenhouse effect” from which greenhouse gases (hereafter GHG) receive their name, and the presence of GHG results in an increase in the earth’s average temperatures (Maslin, 2009). The greenhouse effect keeps earth habitable, as without any GHG the earth’s average temperatures would be around -18° C, over 30° C colder than current averages (Ward & Brownlee, 2000, p. 207).
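The -18° C figure can be recovered from a simple energy-balance estimate. The sketch below is my own back-of-the-envelope calculation, assuming a solar constant of ~1361 W/m² and the ~30% albedo mentioned above; it balances absorbed sunlight against blackbody emission via the Stefan-Boltzmann law:

```python
# Energy-balance sketch: Earth's effective temperature with no greenhouse effect.
# Assumed values: solar constant ~1361 W/m^2; albedo ~0.30 (the ~30%
# reflected fraction from the text).
SOLAR_CONSTANT = 1361.0   # W/m^2 arriving at Earth's distance from the sun
ALBEDO = 0.30             # fraction of sunlight reflected back to space
SIGMA = 5.670e-8          # Stefan-Boltzmann constant, W/m^2/K^4

# Sunlight is intercepted over a disk (pi*r^2) but re-emitted over the
# whole sphere (4*pi*r^2), hence the division by 4.
absorbed = SOLAR_CONSTANT * (1 - ALBEDO) / 4

# Blackbody emission sigma*T^4 must balance absorption; solve for T.
t_eff_k = (absorbed / SIGMA) ** 0.25
print(f"effective temperature: {t_eff_k - 273.15:.1f} C")  # ~ -18.6 C
```

The greenhouse effect is, in essence, what closes the gap of roughly 33° C between this bare-rock estimate and the observed global average of about +15° C.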
Examples of GHG and their relative contributions to the global greenhouse effect include water vapor (60%), carbon dioxide (25%), ozone (8%), methane, and nitrous oxides (Karl, 2003). The atmospheric levels of these gases have varied throughout the earth’s history and contributed to past climate change (Maslin, 2009; Stanley, 1999).
Orbital variations and GHG are only two of a myriad of variables involved in earth’s climate system (Rind, 2002). Others include the ocean circulation system, variations in solar output, aerosols, vegetation, and various feedback mechanisms (Rind, 2002). Scientists are divided over the relative importance of each of these mechanisms, and the sheer number and dynamic nature of these variables make accurate reconstructions and models difficult (Bony et al., 2006; Maslin, 2009; Raisanen, 2007).
What is clear is that the earth has experienced dramatic climate changes in the past; these changes include natural cycles between ice ages and warmer, interglacial periods as well as the corollary changes in sea level, temperatures, precipitation patterns, ocean circulation, atmospheric circulation, and ice cover. The exact mechanisms responsible for past climate change are a source of debate and uncertainty within the scientific community (Maslin, 2009; Paillard, 2006).
Past and Current Climate Change
The global average temperature of earth has increased by approximately 0.75 degrees Celsius over the past 150 years (Maslin, 2009; IPCC, 2007). According to the IPCC (2007) the consensus among scientists is that the primary cause of this global warming is anthropogenic (human-caused) carbon dioxide emissions. Since the industrial revolution humans have been burning fossil fuels (e.g. coal, oil, gasoline) for energy. The combustion of fossil fuels results in the formation of carbon dioxide, which is released into the atmosphere (Eubanks et al., 2006). Pre-industrial levels of carbon dioxide (CO2) in the earth’s atmosphere were around 280 parts per million (ppm). Current levels are ~385 ppm, an increase of over 100 ppm (Maslin, 2009). Since CO2 is a greenhouse gas, rising CO2 levels result in an increase in the amount of outgoing infrared radiation absorbed within the earth’s atmosphere. This additional heat energy causes an overall warming of the earth (Eubanks et al., 2006).
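The warming effect of this CO2 increase is often estimated with a simplified logarithmic forcing expression, ΔF = 5.35 ln(C/C0) W/m². The coefficient comes from the widely used Myhre et al. (1998) fit, which is not among the sources cited above; the sketch below simply plugs in the concentrations from this paragraph:

```python
import math

# Simplified CO2 radiative-forcing estimate (logarithmic approximation).
# The 5.35 W/m^2 coefficient is an assumption drawn from the commonly
# cited Myhre et al. (1998) fit, not from the sources in this post.
C0 = 280.0   # pre-industrial CO2 concentration, ppm (from the text)
C = 385.0    # current CO2 concentration, ppm (from the text)

delta_f = 5.35 * math.log(C / C0)  # extra energy retained, W/m^2
print(f"forcing: {delta_f:.2f} W/m^2")  # ~1.70 W/m^2
```

Spread over the whole planet, an extra watt or two per square meter is what drives the fraction-of-a-degree warming described above.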
In An Inconvenient Truth, the popular documentary about global warming, Al Gore makes the claim that earth’s past ice ages and intervening warm periods are due to the rising and falling of carbon dioxide (CO2) levels (Bender, 2006). Gore points to reconstructions of temperatures and CO2 levels for the last 650,000 years drawn from ice cores in Antarctica.
In the Miocene period, 13.9 million years before present (BP), a global cooling episode was initiated by a change in earth’s obliquity (Holbourn, Kuhnt, Schulz, & Erlenkeuser, 2005). This resulted in the extensive ice sheets that continue to cover Antarctica.
In the past 2.5 million years orbital variation has been the dominant forcing involved in the transitions into and out of ice ages (Maslin, 2009). Over the past 423,600 years, during the late Pleistocene, Milankovitch cycles account for the majority of climate change (Meyers, Sageman, & Pagani, 2008). Both precession and obliquity cycles were involved in these changes (Huybers, 2006; Meyers et al., 2008).
During the last interglacial period, ~129,000 years BP, orbital variations caused a warming episode that resulted in extensive open water in the Arctic.
The current interglacial epoch, the Holocene, began around 10,000 years BP (Maslin, 2009). There is evidence for precessional forcing of climate change during this period, including changes in both ocean hydrology and atmospheric circulation and precipitation patterns (Partin, Cobb, Adkins, Clark, & Fernandez, 2007).
Milankovitch cycles correlate well with past climate change (Maslin, 2009; Meyers et al., 2008; Soon, 2007). The relationship between GHG levels and climate, on the other hand, is a source of controversy among scientists (Kerr, 2001). Historical records indicate a strong correlation between GHG and climate (Alley et al., 2005).
It is likely that both orbital variations and GHG levels have contributed to past climate change (Paillard, 2006). Other mechanisms such as varying solar output and shifts in ocean circulation also play an important role in regional and global climate change (Curry, Dickson, & Yashayaev, 2003; Rohling & Palike, 2005; Thornalley, Elderfield, & McCave, 2009). The earth is a dynamic interconnected system and further research is needed for any certain conclusions about the mechanisms that can explain past climate changes (Paillard, 2006; Rind, 2002). In particular, research into the interactions between orbital forcing and other mechanisms, such as ice-sheet or cloud feedbacks, is needed (Huybers, 2006; Bony et al., 2006).
Tuesday, March 31, 2009
A few months ago I signed up for a free trial of their credit monitoring service to check my credit score. I canceled it the following day and didn't give it another thought. One month later I get a charge from Equifax out of my bank account. I called and explained what happened and I was told that it would be fixed. A few weeks later I called again to see what had happened and to ask why my money had not been credited back to my account. I was told some reason or other and that it would be fixed within 48 hours. Another few weeks later and still nothing. I'm going to call again now and see what happens.
Update: I was just told that I was given the refund on March 10th. It is March 31st, and I just looked through my bank statement; no refund is there. I was advised to call my bank about it. Great. I love this stuff.
Update #2: I just contacted my bank and they are crediting my account while they investigate the issue. I will be notified of the results within 90 days. Equifax will be sorry now...I will get my money back! My 12 dollars and 95 cents!
Update #3: It's been over a year since all of this, and at this point I don't even remember what happened in the end...I know I received my refund, but I'm not sure if it was from my bank or from Equifax. Either way, the whole thing was a pain.
Monday, March 9, 2009
Saturday, February 28, 2009
Those of you who have seen Al Gore's An Inconvenient Truth (or turned on the TV or surfed the web) will likely have heard about two purported effects of global warming: increased hurricane activity and the drowning of polar bears. It turns out that while there is some truth in these claims, the reality is quite complicated.
Hurricanes are powered by warm ocean surface waters evaporating and then condensing, releasing latent heat. The warmer the water, the more evaporation, and therefore the more heat released to power the storm. As sea surface temperatures rise, then, so should the number and intensity of hurricanes. Al Gore quite bluntly lays the blame for Hurricane Katrina at the feet of global warming. Climatologists, however, point out that it is impossible to attribute a local, specific event like Katrina to global warming. While it is true that some theoretical studies have linked rising water temperatures with increased hurricane activity, observations show that the number of hurricanes has in fact not increased at all. What has increased is the relative number of intense (category 4 or 5) hurricanes. There has been some controversy over this claim, however, as our reliable records only go back around 30 years. In other words, how can we be confident that the number of intense storms has gone up when we don't have a reliable long-term record to compare it to? In addition, several studies have shown that as the planet warms, wind shear over the Atlantic and East Pacific will increase. This atmospheric phenomenon prevents hurricanes from forming at all. Yet other studies have shown that this increased wind shear will result in fewer hurricanes making landfall in the U.S. So, the jury is certainly still out on exactly how global warming will affect hurricane activity.
As for polar bears, I have discovered that populations are indeed on the decline (down 20% or so in the last 30 years). While this is probably influenced by climate change and the decline of Arctic sea ice, it is not true that polar bears have been found drowning for lack of sea ice. The study that seems to have sparked all the media interest (and led Al Gore in his movie to show a CGI clip of a polar bear swimming in an endless ocean searching for ice) is one where four polar bears drowned due to a fierce storm off the coast of Alaska. Not global warming; an intense storm. While it is hypothesized that the bears may have been swimming longer than usual due to less ice, Al Gore's portrayal hardly seems appropriate. This seems to be another example of misleading information around what will likely be a real issue. If global warming continues, polar bears will have to adapt or perish. It is disappointing, though, that the facts were twisted to support an agenda.
OK, so in this and the last three posts it will hopefully have become clear that much of the science surrounding past and current climate change is both complicated and tentative. So what should we do? There are those who advocate doing nothing in the interest of preserving our economy, and those who advocate radical change. For myself, the uncertainty of the science can be separated from policy decisions. Why? Consider our current source of energy, fossil fuels. Much of the world's coal formed during the Carboniferous period, around 300 million years ago. The processes that formed the coal and oil are slow, somewhat mysterious, and only seem to occur under certain conditions. This is why coal and oil are considered non-renewable resources. World population will likely hit 9 billion in the near future. Developing economies, like China's, are increasing consumption of fossil fuels at tremendous rates. Besides the fact that fossil fuel emissions are probably contributing to climate change, there is another perfectly good reason to develop alternative sources of energy: we are going to run out of fossil fuels (in as little as 30 years by some estimates).
You may have heard of Pascal's wager, a philosophical oddity that suggests that it is a better logical bet to believe in God than not to believe in God. After all, if you believe in God and you're wrong, nothing really happens to you when you die. But, if you don't believe in God and you're wrong, you may be sentenced to eternal damnation. Therefore, it is a safer bet to believe in God. Putting aside the merits of this theological wager for a second, consider the current crisis. If we act to stop global warming, and it turns out to be a complete hoax, what do we get? Some time and money will be spent, but we will also find ourselves with reduced pollution (fossil fuel combustion results in environmental damage besides global warming), zero dependence on oil of any sort (plus it will run out anyway), and the development of new, clean, efficient energy sources. Now, consider the flip side: if we do not act, and the more dire predictions about future warming come true, we could find ourselves with a host of problems, including sea level rise, habitat devastation, increased floods and droughts, changing atmospheric and ocean currents, and others.
When it comes to modifying the Earth's climate with greenhouse gas emissions, an experiment in which the outcome cannot be predicted, it just seems to me to be a safer bet to start phasing out fossil fuels now and perhaps keep reserves for emergency situations. I have called this "Gore's Wager" in the past, but as I find out more about climate change, the less enthusiastic I am about using his name. I'll have to think of something else...
Wednesday, February 18, 2009
We are currently in the Holocene, which began around 11,000 years ago, marking the end of the last ice age. Ice sheets covered large parts of the continents during the last glacial maximum, but beginning around 20,000 years ago the Earth warmed. The glaciers retreated, leaving only the Arctic and Antarctic ice along with mountain glaciers. The retreat of the glacier covering North America left behind the Great Lakes as well as the many kettle ponds here in New England. Cape Cod and Long Island are remnants of the debris left behind by the glacier's movement.
In other words, melting glaciers are nothing new. However, beginning with the industrial revolution, humans have been adding carbon dioxide to the atmosphere in appreciable amounts. CO2 levels have risen from around 280 ppm (parts per million; of 1 million air molecules, 280 would be CO2) before 1900 to around 390 ppm today. This is still an extremely small part of the atmosphere, but CO2 is a greenhouse gas that absorbs heat radiated from the Earth (originally from the sun). Venus, for example, has an atmosphere almost entirely made of CO2 and has surface temperatures around 460 degrees Celsius. While the Earth has many sinks for excess CO2 (only around half of total human emissions actually stay in the atmosphere), humans are adding enough to slowly but surely increase global levels.
Over the 20th century global temperatures have risen around 3/4 of a degree Celsius, or a little over one degree Fahrenheit. This may not sound like much, but remember, a lot of heat is needed to warm the entire planet by even 1 degree. Is all of this warming due to humans? Probably not; remember, Earth has been warming for the last 20,000 years (with a few notable exceptions), and sea levels, which rose dramatically as the great ice sheets melted, have continued rising at about 2 mm per year over the past century. Most likely humans are accelerating the warming of the current interglacial period. And here we come back to the main point: this is essentially new territory. We simply do not know exactly what will happen as we add more and more CO2 to the atmosphere.
The Earth has various feedback mechanisms that have acted to stabilize climate throughout Earth's history. These seem to have prevented both an irreversible global icehouse as well as an irreversible runaway greenhouse effect. Can the Earth deal with the amount of CO2 we are adding? There are studies showing that even if we stopped adding CO2 today, climate change might continue for centuries. While modeling future climates is extremely difficult, there seems to be a general consensus that, in the words of the IPCC, "very likely" humans are contributing to the current warming and that warming will continue as long as we keep adding greenhouse gases.
Next: science vs policy.
Monday, February 16, 2009
Climate change in the past seems to have been caused by a combination of orbital variations of Earth (Milankovitch cycles), internal forcing by greenhouse gases (including water vapor, CO2, methane, ozone, etc.), and various feedback mechanisms. Scientists disagree about the relative importance of each. It is clear from ice cores and other records that CO2 and methane vary along with global temperature, but a clear cause-and-effect relationship is absent, despite what Mr. Gore might tell us. We are also led to believe that CO2 levels are off the chart compared to Earth's past. This is not quite correct. Current levels of CO2 (around 385 parts per million) are certainly the highest in the last 650,000 years, and while this might seem like a long time, it is very brief compared to the 4.6 billion year history of planet Earth. There is evidence that CO2 levels have been up to 15 times higher than current levels around 400 million years ago and at least 5 times the current levels more recently (geologically speaking). Of course, people weren't around back then, and there were no coastal cities, but still, sometimes Mr. Gore makes it sound like planet Earth itself couldn't survive. This is of course nonsense; as recently as 60 million years ago Earth likely had average temperatures 15 degrees Celsius higher than current averages. And Earth survived, as did its plants and animals. There have been times when there weren't any glaciers anywhere, and as recently as 120,000 years ago sea levels were 4 to 5 meters higher than current levels.
So if you hear that current warming is unprecedented, that is simply false. Point this out to 9th graders, though, and you see some of them glaze over and start to dismiss the possibility of anthropogenic climate change. The problem is that Mr. Gore and others try to use past climate change as a simple analog to explain what is happening now. This doesn't work for several reasons; as mentioned above, past climate change involves dozens of factors that are not fully understood by any climate scientist. The current scenario is actually new in the history of the world.
Next: uncharted territory
Sunday, February 15, 2009
As of now I am kind of torn over this issue and how to teach it. It is one of my favorite parts of the year for several reasons: I find it fascinating, and most of my students find it interesting as well. The problem is that climate change has become a partisan issue. Democrats and liberals tend to believe every word Al Gore says about global warming, while conservatives dismiss it as junk science. As usual, neither is correct. My dilemma is how to teach a nuanced view of climate change to my 9th graders, or rather, to have them arrive at a nuanced view through critical thinking.
I usually start our time on climate change with a viewing of Al Gore's An Inconvenient Truth. The reason is that Mr. Gore actually does a pretty good job explaining much of the science of global warming and its potential effects on Earth. The collection of images and animations is top notch, and Al Gore seems to sincerely care about our planet. Most students are impressed and come away totally convinced that anthropogenic (human-caused) global warming is real and that we must act swiftly or else the world will end. This sort of thinking is encouraged by junk science movies like The Day After Tomorrow. Al Gore isn't quite that far down the alarmist spectrum, but he's pretty close.
After the video (my students take notes) we summarize as a class the various current and future effects global warming may have on the only habitable planet that we know of. These include melting glaciers and ice caps (an important source of fresh water), rising sea levels, a redistribution of precipitation patterns, changing ocean currents, and many others. We discuss the basic science of the greenhouse effect (various gases, including CO2, absorb infrared light radiated from Earth and warm the atmosphere) and the fact that humans contribute around 6 gigatons of CO2 to the atmosphere each year (for reference, around 3 million fully loaded 747's would equal 1 gigaton). Then I go on to point out some of Gore's mistakes and/or glossed-over complexities.
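The 747 comparison can be sanity-checked with quick arithmetic. In this sketch, the ~400-metric-ton figure for a fully loaded 747 is my own assumed value, not something from the post:

```python
# Sanity check on the "millions of 747s per gigaton" comparison.
# Assumed: a fully loaded 747 weighs roughly 400 metric tons at
# maximum takeoff weight (my estimate, not from the text).
GIGATON_IN_TONS = 1e9      # 1 gigaton = one billion metric tons
JUMBO_JET_TONS = 400.0     # assumed weight of a fully loaded 747

jets_per_gigaton = GIGATON_IN_TONS / JUMBO_JET_TONS
print(f"{jets_per_gigaton:,.0f} jets per gigaton")  # 2,500,000
```

That works out to about 2.5 million jets, which is in the same ballpark as the "around 3 million" figure I give the class.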
For example, Gore's first illustration of massive glacial retreat is atop Mt. Kilimanjaro. Unfortunately, studies published by reputable scientists in peer-reviewed journals have shown convincingly that global warming is not the cause of the retreat. Rather, a change in atmospheric moisture patterns that predates the industrial revolution is responsible (also, the temperatures at that altitude never rise above freezing). This is not to say that other glaciers are not retreating due to warming temperatures, but Mt. Kilimanjaro was a poor choice by Mr. Gore, and in the interest of truth I feel compelled to point this out.
One example of uncertainty/complexity that Gore overlooks is the relationship between past CO2 levels and temperature. We are shown the data from ice cores from Antarctica that show the past several ice ages and the correlation between CO2 and temperature over the past 650,000 years. When CO2 levels are high, temps are high, and when CO2 levels are low, temps are low. You see, says Gore, CO2 levels control climate, and he goes on to show how our current CO2 levels are rapidly going off the chart in a vertical direction. The problem is that the published studies of these ice cores show that historically temperatures always rise first, followed around 800 years later by a rise in CO2 levels. Also, it is fairly well accepted that these climate changes, the change from ice age to interglacial period and back again, are ultimately caused by variations in the Earth's orbit, known as Milankovitch cycles (though various other factors also come into play). Gore's point though, or so it seems to me, is that in these ice cores we have clear evidence that CO2 has caused drastic climate change in the past and that we are about to experience apocalyptic levels of change due to anthropogenic CO2 emissions. This, to me, is a gross misuse of the ice cores to argue for a valid point: that rising CO2 levels likely will contribute to global warming. How do you explain this to freshmen in high school?
More coming soon.
Saturday, February 7, 2009
The quantitative approach assumes a single reality, an external reality that is independent of people's experiences. There is a right answer to every question, and a quantitative approach allows access to that solitary truth. Qualitative researchers, on the other hand, assert that there is not one reality but multiple realities, which we can only discover through a qualitative approach. Each person has their own reality, and these would be missed or ignored through a strictly quantitative approach. This thinking is part of what might be considered postmodernism (or relativism): the belief that there is no absolute truth, but instead multiple truths. For example, many see Thomas Kuhn's work, The Structure of Scientific Revolutions, as showing that even scientific knowledge of that one reality is suspect, and therefore that not one reality but multiple realities exist. The reality of any given person then depends on culture, language, and so on.
I believe that the language used in this sort of dialogue is all wrong. When relativists (or qualitative researchers) say that there are multiple realities, they are simply mistaken. What they should say is that there are multiple perspectives. For example, on an episode of the radio program This American Life, the story of a missing boy was told. When a boy was found, two families claimed he was theirs, as both had indeed lost their son. Each was convinced that the boy belonged to them; in other words, each had their own reality. But clearly the boy could only have been from one family and not both, and therefore perspective is the more appropriate term. I think perhaps the fear is that perspective is not a strong enough word, and that different people's experiences will be ignored unless they are described as a reality. I hope no relativist would claim that the boy was both of the boys that had gone missing. It seems clear that there is only one external reality: the boy could only have been from one family or the other.
How then do we interpret the clear value of qualitative research and ideas like Kuhn's? The answer is that relativists were wrong about what is relative. They claim that reality itself is not singular; they are making a claim about ontology, or what actually exists. The proper claim should be about epistemology, or what we can know. There is one reality, but our access to it is fuzzy at best. Even scientific knowledge, which usually enjoys an exalted epistemic status, is far from certain, as the work of Kuhn and others has shown. It is a mistake, however, to confuse what I will call epistemological relativism (the uncertainty of all knowledge, even scientific, quantifiable knowledge) with ontological relativism (the claim that there actually exists more than one reality).
To sum up, we should replace talk of "multiple realities" with talk of multiple perspectives, being sure to appreciate and take seriously how people perceive reality. For example, my perspective right now is that it's time to order a pizza. How can I be sure that this is actually a reality, not just my perspective? My wife has told me so.