“Supernovae can diminish the ozone layer and allow more ultraviolet radiation to reach the surface”
In science, no field is completely isolated from another: research in one area can sometimes end up solving a mystery in a seemingly unconnected one. For instance, when biologists want to work out how long ago two species split from each other, and so date the branching points on the evolutionary tree of life, they can use two different techniques.
First, the fossil of an organism can be dated using the radioactivity of the rocks around it. Alternatively, counting the number of differences in the DNA sequences of two living species allows an estimate of the time since they diverged, based on the number of mutations that have accumulated – like a molecular clock. The problem is that the fossil age and the molecular-clock age don’t always agree, and there are particular disparities in the evolutionary tree of birds.
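The molecular-clock arithmetic can be sketched with toy numbers (the rate and divergence values below are illustrative assumptions, not figures from the article or from any real species pair):

```python
# Toy molecular-clock calculation. It assumes a constant mutation rate --
# exactly the assumption the rest of the article calls into question.
# All numbers are invented for illustration.

def divergence_time_myr(diffs_per_site, rate_per_site_per_myr):
    """Time since two lineages split, in millions of years.
    We divide by twice the per-lineage rate because mutations accumulate
    independently along both branches after the split."""
    return diffs_per_site / (2 * rate_per_site_per_myr)

# 2% sequence difference at 0.005 substitutions per site per Myr per lineage
print(divergence_time_myr(0.02, 0.005), "million years")  # -> 2.0 million years
```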
LEWIS DARTNELL is an astrobiologist at the University of Leicester and the author of The Knowledge: How to Rebuild our World from Scratch (www.theknowledge.org)

However, astrophysics might have an explanation, says Adrian Melott of the University of Kansas. If the accumulation rate of mutations wasn’t in fact constant, then the ticking of the molecular clock would vary over time and throw off the correspondence between fossil dates and molecular ages. A major contributor to mutations in cells is background radiation: from radioactive elements in rocks, for example, but also from cosmic rays striking Earth from deep space. Although Earth’s magnetic field and thick atmosphere protect the surface from most cosmic rays, particularly violent events such as a nearby supernova can still send a flood of ionising radiation – muons and neutrons – all the way to the ground. In addition, a supernova can diminish the ozone layer in Earth’s upper atmosphere and so allow greater levels of ultraviolet radiation from the Sun to reach the surface. Both the ionising radiation and the heightened ultraviolet exposure can damage the DNA of any organism they strike, resulting in a greater mutation rate and so a faster ticking of the molecular clock. As Melott argues, any measurement of the rate of the molecular clock made today, when Earth is not experiencing a heightened radiation hazard, would not include this acceleration.
In particular, Melott points to an increase in the isotope iron-60 in sediments dating back around 2.5 million years: evidence that one or more supernovae went off within a few hundred light years of Earth in recent evolutionary history. This event, and similar ones throughout Earth’s past, could have periodically accelerated molecular clocks and so caused the disparity seen between fossil ages and molecular dating methods.
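A toy calculation (all numbers hypothetical, not drawn from the article) shows how such a radiation episode would skew the clock: if the mutation rate were temporarily doubled, a naive estimate calibrated on today’s quiet-time rate would read too old:

```python
# Hypothetical illustration of Melott's argument: a radiation episode that
# temporarily doubles the mutation rate makes a constant-rate molecular
# clock overestimate the divergence time. All numbers are invented.

base_rate = 0.005   # substitutions/site per Myr per lineage (quiet-time rate)
quiet_myr = 1.5     # time spent at the quiet rate since the species split
burst_myr = 0.5     # time spent at double the rate (radiation episode)

# Differences accumulate along both lineages, at whichever rate applies
diffs = 2 * (base_rate * quiet_myr + 2 * base_rate * burst_myr)

true_age = quiet_myr + burst_myr        # 2.0 Myr actually elapsed
inferred_age = diffs / (2 * base_rate)  # naive constant-rate estimate
print(true_age, inferred_age)           # -> 2.0 2.5 (the clock reads too old)
```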
Melott is quick to clarify that he’s not claiming this hypothesis is necessarily better than other proposed explanations for the disparities between dating techniques, just that it is a possibility that deserves to be investigated more fully. And, like any proper scientist, he has proposed ways that his idea can be tested.
If bursts of radiation from outer space are indeed accelerating molecular clocks, then deep-sea life, shielded from this fluctuating effect, should show much closer agreement between fossil and molecular dating methods.

LEWIS DARTNELL was reading… A possible role for stochastic radiation events in the systematic disparity between molecular and fossil dates by Adrian L Melott
Read it online at http://arxiv.org/abs/1505.08125