This ocean invaded its neighbor earlier than anyone thought
News and notes about science
Arctic. Atlantic. Long ago, the two oceans existed in harmony, with warm and salty Atlantic waters gently flowing into the Arctic. The layered nature of the Arctic — sea ice on top, cool freshwater in the middle and warm, salty water at the bottom — helped hold the boundary between the polar ocean and the warmer Atlantic.
But everything changed when the larger ocean began flowing faster than the polar ocean could accommodate, weakening the distinction between the layers and transforming Arctic waters into something closer to the Atlantic. This process, called Atlantification, is part of the reason the Arctic is warming faster than any other ocean.
Satellites offer some of the clearest measurements of changes in the Arctic Ocean and sea ice. But their records only go back around 40 years, obscuring how the climate of the ocean may have changed in prior decades.
“To go back, we need a sort of time machine,” said Tommaso Tesi, a researcher at the Institute of Polar Sciences-CNR in Italy.
In a paper published Nov. 24 in the journal Science Advances, Tesi and colleagues were able to turn back time with yard-long sediment cores taken from the seafloor, which archived 800 years of historical changes in Arctic waters. Their analysis found Atlantification started at the beginning of the 20th century — decades before the process had been documented by satellite imagery. The Arctic has warmed by around 2 degrees Celsius since 1900. But this early Atlantification did not appear in existing historical climate models, a discrepancy that the authors say may reveal gaps in those estimates.
“It’s a bit unsettling because we rely on these models for future climate predictions,” Tesi said.
“It was quite a lot of surprises in one study,” said Francesco Muschitiello, an oceanographer at the University of Cambridge and an author on the paper.
The authors are not sure of the precise reasons behind the early Atlantification. If human influences are the cause, then “the whole system is much more sensitive to greenhouse gases than we previously thought,” Muschitiello said.
In another possibility, earlier natural warming may have made the Arctic Ocean much more sensitive to the accelerated Atlantification of recent decades. “Could it be that we destabilized a system that was already shifting?” Tesi said. — Sabrina Imbler
This fire-loving fungus eats charcoal, if it must
When a wildfire plows through a forest, life underground changes, too. Death comes for many microorganisms. But, like trees, some microbes are adapted to fire.
Certain fungi are known as pyrophilous, or “fire-loving.” After a fire, pyrophilous fungi “show up from nowhere, basically,” even in areas that haven’t burned for decades, said Tom Bruns, a mycologist at the University of California, Berkeley. Some sprout in fiery shades of orange and pink. “It’s a worldwide phenomenon, but we don’t really know much about them,” he said.
A new study, published in October in the journal Frontiers in Microbiology, aimed to uncover the food source that allows Pyronema, a genus of pyrophilous fungi, to appear so quickly in such big numbers after a fire. What researchers discovered is that the damage left by the fire itself may allow the fungi to thrive. That could affect how the ecosystem recovers, as well as how much carbon gets released into the atmosphere after wildfires.
During a severe wildfire, a lot of carbon in the top layer of soil goes into the atmosphere as carbon dioxide, while some of it stays put as charcoal, or what scientists call pyrolyzed organic matter.
Slightly deeper in the soil, it’s less hot — but hot enough that any living microbes and insects exploded and died, said the study’s lead author, Monika Fischer, a postdoctoral scholar at the University of California, Berkeley.
So, is Pyronema just living off this layer of death? “Or can Pyronema actually eat charcoal?” Fischer said.
To find out if Pyronema can eat charcoal, the authors grew the fungus from samples collected by Bruns’ team after the Rim fire in California in 2013.
To confirm that the fungus was actually doing what it appeared to be doing, the lab grew pine seedlings in an atmosphere of carbon dioxide containing carbon-13, an isotope whose unusual weight makes it easy to trace, and then put the trees in a specialized furnace to form charcoal, which they fed to the Pyronema. When the fungus gave off carbon-13-labeled carbon dioxide, that suggested it really was snacking on the charcoal.
The researchers also tracked ordinary carbon dioxide coming out of the fungus, and found substantially more of it than the carbon-13-labeled kind, suggesting the fungus was eating something besides the charcoal.
Fischer offered this interpretation: “Pyronema can eat charcoal, but it really doesn’t like to.” — Ellie Shechet
A tool kit to help scientists find the ultimate chickpea
When you open a can of chickpeas and fish out the nutty, savory little beans, you’re partaking in a history that began around 10,000 years ago. The modern chickpea’s ancestor, a wild Middle Eastern plant that likely had tiny, hard seeds, was cultivated by humans around the same time as wheat and barley, and began to evolve as early farmers selected plants whose seeds were larger and more succulent. Archaeologists have even found what appear to be domesticated chickpeas buried beneath Jericho in the West Bank, so deep that they would have been grown even before the inhabitants of one of history’s longest occupied cities began to make pottery.
The humble chickpea has had a somewhat rocky road to its present popularity, however, suggests a new study published in November in Nature that sequences the genomes of more than 3,000 examples, making it one of the largest plant genome sequencing efforts ever completed.
The researchers now believe that after chickpeas were first domesticated in Turkey’s southeastern Anatolia region, their cultivation may have stagnated for millenniums. The result was a genetic bottleneck that makes all chickpeas today descendants of a relatively small group from a thousand years ago.
What’s more, the modern varieties grown by most farmers are low in genetic diversity, which means that they are at risk of failing under the stress of climate change. By mapping the legume’s genetic makeup in such rich detail, the scientists hope to make it easier for plant breeders — who develop new kinds of crops — to bring diversity back into the chickpea’s genes, giving it a flexible tool kit to survive drought, flooding and diseases.
While hummus may have become ubiquitous in American grocery stores only in the past 15 years, chickpeas have long been a staple crop in the developing world, said Rajeev Varshney, a research program director at the International Crops Research Institute for the Semi-Arid Tropics in Hyderabad, India, as well as a professor at Murdoch University in Australia and an author of the new paper.
But chickpeas’ status as a developing world crop has meant that they have not received as much attention from breeders as commodities like corn, Varshney said. Chickpea farmers grow a handful of varieties that have been improved over the years without, for the most part, the benefit of genetic information that might give breeders more control over what traits the beans will have. — Veronique Greenwood