Thunderstorms create natural nuclear reactions
You’re bound to have run into all the superlatives that surround thunderstorms. A big thunderhead can be channeling several times more energy than the nuclear weapon dropped on Hiroshima. The temperature of the lightning itself can be many times hotter than the surface of the sun. So this shouldn’t be that surprising:
On 6 February, the detectors sensed an unusual event. A double lightning bolt just off the coast [of Japan] shot out an initial, one-millisecond spike of γ-rays, with relatively high energies of up to 10 megaelectronvolts. This was followed by a γ-ray afterglow of less than half a second.
That combination signals the destruction of a positron. A positron as in antimatter. That positron didn’t come crashing into the atmosphere from elsewhere, it was made right there in the sky above Japan through the incredible energy of lightning. Though only a few such particles generated by lightning have been detected, there are reasons to believe it may be far more common.
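Why does that γ-ray signature point to a positron? When a positron meets an electron, the pair annihilates, and each particle’s rest mass comes back out as a photon of about 0.511 megaelectronvolts (E = mc²), which is the fingerprint detectors look for. If you want to check that number yourself, here’s the arithmetic in Python, using standard CODATA constants:

```python
# Rest-mass energy of an electron (or positron): E = m * c^2,
# converted from joules to megaelectronvolts.
m_e = 9.1093837015e-31  # electron mass, kg (CODATA)
c = 2.99792458e8        # speed of light, m/s (exact)
eV = 1.602176634e-19    # joules per electron volt (exact)

E_MeV = m_e * c**2 / eV / 1e6
print(round(E_MeV, 3))  # 0.511
```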
Also generated in this instance was an atom of the isotope carbon-14.
The main source of the carbon-14 in the atmosphere has generally been considered to be cosmic rays. In principle, lightning could also contribute to the supply.
If the term “γ-ray” is unfamiliar to you, it’s another way of saying gamma rays. Note that this does not mean that being struck by lightning will turn you into the Incredible Hulk. It will just make you … incredibly hurt.
Come on. Let’s look at more articles.
Thinking fast may hold off dementia
If you’re playing brain games on your iPad in hopes that it will help stave off declining brain function with age, there have been plenty of indications that you may be wasting your time. Numerous previous studies have indicated that—no matter how many trains you route to the station, no matter how many pets you return to their owners—this kind of training seems to have negligible effect in keeping brain function high.
Except … there’s a new study out. It’s a large study. It’s a long-term study. And it seems to indicate that at least one form of “brain training” offers a significant chance of dodging the dementia bullet. But you have to know what kind of training to use.
The Advanced Cognitive Training for Independent and Vital Elderly (ACTIVE) study was a randomized trial on the efficacy of three different types of cognitive training to preserve cognitive and daily function in older adults. Participants were randomized to either strategy-based memory or reasoning training, speed of processing training, or no-contact control conditions.
Some of the people in the trial got no special training. One group was regularly challenged with tasks that required strategic thinking. Another group got training in the kind of spatial-list type of memory retention often associated with memory contests. The last group was given tests where they had to respond in a hurry.
And the winner is … well, I already gave it away in the title. Those folks doing training where they had to respond quickly did better long term than those who faced slower, but possibly more intense, reasoning tasks. In fact, neither the strategic thinkers nor the memory-listers showed any improvement over the no-training loungers. It was only those forced to make fast decisions who demonstrated a significant result.
Initially healthy, well-functioning older adults randomized to speed of processing cognitive training had a 29% reduction in their risk of dementia after 10 years of follow-up compared to an untreated control group.
More training also produced more improvement, a dose-response pattern that points to a real relationship. If you want to try the tool that was used in the study yourself, it’s available online. It’s not clear yet whether there’s anything particularly special about this tool, or whether any instrument that forced the same kind of decision under pressure would serve as well.
How you use language shows if you’ve ever faced serious adversity
I won’t kid you. This one was tough for me to follow and to absorb, but once you put it all together, what this article says is kind of super-duper astounding. A team composed of researchers from the University of Wisconsin–Madison and the University of Arizona has made an amazing discovery about the relationship between how people express themselves in language and the level of stress they’ve experienced in their lives.
The present data indicate a systematic relationship between personal expression and gene expression.
In their study of 143 people, the authors were able to detect which participants had been exposed to significant adversity in their lives by listening to samples of their speech—and they could do so regardless of age, sex, and race.
To understand how this works takes tackling a series of concepts. First, there’s the area of epigenetics. It’s a term that literally means “upon genetics,” and it has to do with factors that can affect the expression of genes. I wrote a lengthy Darwin vs. Lamarck section here on my first pass at explaining this, but … just trust me, there are things that can alter how your genes are expressed after your birth.
This can come from exposure to certain diseases or chemicals. The epigenetic effects of LSD once even made it into a television anti-drug ad. It can also come from something that seems somehow more ephemeral, but is just as damaging: Stress. The whole field of “social genetics” is very, very new stuff, but early work has already demonstrated that factors such as stress and isolation can not only cause genes to operate differently, but can do so in ways that are not temporary.
Which brings us to “conserved transcriptional response to adversity,” or CTRA. The idea of a CTRA is that an organism—including people—exposed to high levels of stress can have lasting change written into genetic structures. These can have visible effects. For example, the way that people sometimes get sick in high-stress situations can represent a modification to the immune system that doesn’t go away when the stress leaves. That change can be connected to inflammatory or autoimmune issues later in life. Stress isn’t just bad for you in the short term; it’s bad for you in the long term. It can even be connected to some cancers.
So … language. The claim behind this paper is that when people go through periods of adversity—whether that’s something like a combat situation, being diagnosed with a serious illness, or any other source of extraordinarily high stress—it can result in a lasting impact on their use of language. The authors studied a lot of areas, but the results are pretty easy to sum up: People who have suffered stress-related damage to their genetics talk less. They say fewer words in an average day. They’re less likely to join a conversation. When they do speak, they use shorter sentences.
On the one hand, this seems entirely fitting with a phenomenon many people have observed. Who doesn’t have some relative who tends to sit in the corner, saying little, or speaks in terse phrases—someone who has “had a hard life”? Who hasn’t had someone who came back from a war and “never wanted to talk about it”? But what’s amazing is that the study suggests that there’s a genetic reason behind those silences.
You can even think of it in historical terms. The word “laconic,” meaning someone who speaks tersely, using very few words, comes from Lakōn, the Greek name for an inhabitant of Laconia, the region around Sparta. This is supposed to be the way the ancient Spartans actually spoke—rarely, and in short phrases. That’s the way they spoke … after childhoods that emphasized isolation and lives that were dedicated to fighting. And, because epigenetic changes can be inherited, they may have even passed this changed way of handling language on to their children. Spartans might have talked like Spartans because they had no choice.
What about any of history’s so-called “dark ages?” If all of society is under a high enough level of stress, could you actually induce widespread stress-related changes that capped the ability of people to express complex ideas, reducing society’s ability to recover?
There’s just a ton to think about in here.
Your weekly scientific argument: Vacationing Caribbean crossbills
This week’s they-said/they-said example of scientists throwing elbows comes in the form of an argument about crossbills in the Bahamas.
Previously, a team headed by a scientist from the University of Florida argued that habitat loss and changing vegetation patterns in the Bahamas were leading to the extirpation of the Hispaniolan crossbill (Loxia megaplaga), which generally makes its home on larger Caribbean islands. They based this on fossil evidence that seemed to show the birds as commonly resident in the area, versus limited evidence of their continued presence.
That was followed up this week by a pair of scientists from the University of Wyoming saying not so fast. There was no extirpation of the Hispaniolan crossbill, according to this team, because there were no Hispaniolan crossbills to extirpate. And in fact, the other team had misinterpreted fossil evidence of another bird, the Red crossbill (Loxia curvirostra), which comes from the US mainland.
Which was then chased up by a reply from the original team that opens with:
We appreciate the issues raised in Benkman’s letter, which is critical of our paper. Here, we will address these issues.
The actual content of this back and forth is enjoyable for several reasons. One, you get to see scientists delivering sentences like the one above—which is highly reminiscent of those occasions when you see a senator stand up to say something beginning with “My good friend from Kentucky ...” when what you know they really mean is “McConnell, you ignorant slut.” Two, the argument here actually comes down to minuscule measurements of fossil bird bones and is unlikely to deeply alter world events. Three, it lets me talk about the term “extirpation,” which means “local extinction.” For example, not only are Red crossbills now extirpated from the Bahamas, they’re gone from the adjacent areas of the United States. Four, I have a Red crossbill on my list of birds sighted this year.
Forget the worm leg, look at the instrument
This paper from a big group of scientists at the Technical University of Munich details the structure of a single limb on a velvet worm. Velvet worms are Onychophorans—fascinating and ancient creatures that make an appearance all the way back in the Cambrian, 500 million years ago. They also have a unique way of capturing prey through the copious output of their “slime glands.” Yeah.
But the reason I’m bringing up this article has nothing to do with the details of the worm, and everything to do with how the German team got those details. They used a machine called a “nanoCT.” What this machine can do is pretty astounding.
Comparative analyses proved that the nanoCT can depict the external morphology of the limb with an image quality similar to scanning electron microscopy, while simultaneously visualizing internal muscular structures at higher resolutions than confocal laser scanning microscopy.
A device that can image the outside of something with the same kind of quality produced by an electron microscope, and then produce images of the inside of that animal with similar resolution is just darn cool. NanoCT devices first became available about a decade ago, and they’re gradually becoming both cheaper and better.
Because, like a big CAT scan, nanoCT produces its images as slices through an object, those slices can be stacked to build 3D models of what’s going on inside a living thing. And since the nanoCT slices are so incredibly thin, those models can be detailed enough to reveal structures that were impossible to see before.
Refuges that lasted out the last climate change might not make it this time
Refugia are areas that acted as long-term buffers against extinction. In the United States, areas in the Appalachians served as refugia for species pressed to the south by glaciation during the last ice age. Fir forests near the Mediterranean performed a similar service in Europe. Those areas of diverse forest survived as the glaciers retreated … but they might not make it much longer.
That’s because the trends for the 21st century are not just warmer in many areas, but drier. Less moisture and rising temperatures alone will be enough to ensure that many areas that have served as a refuge over the last 11,000 or more years will no longer be so diversity-friendly. There will also be increasing wildfires (see the western US and Canada over the last few years), spreading pests and diseases, and replacement by new species that find the warmer, drier conditions more to their liking.
Of course, not every place on the planet is going to dry out.
Models anticipate abrupt growth reductions for the late 21st century when climatic conditions will be analogous to the most severe dry/heat spells causing forest die-off in the past decades. However, growth would increase in moist refugia.
Where will these refugia be? Will they be large enough to preserve a wide variety of species? Will a drying, over-heated, over-populated planet allow these spaces to act as ecological lifeboats or grab them up for agriculture or other use? Stay tuned.
Einstein is still right. Dammit.
For anyone still hoping to find a way around that darn light speed limit so we can get on with fighting some Klingons or joining up with the Culture, there’s bad news.
Two teams of physicists have subjected Albert Einstein’s general theory of relativity to some of the strictest tests so far, and found no deviation from the theory’s predictions.
Honestly, the two factors that these groups looked into were already pretty well nailed down. They just looked at more data, and more accurate data, that covered a longer period.
But there were no obvious paths to warp five hidden in the numbers.
Hanging with the popular genes
Humans have over 19,000 genes, and with a widening array of tools, most of those DNA programs are up for examination. But they don’t all get equal attention.
One researcher did some research … on research, to see which genes get the most attention.
Heading the list, he found, is a gene called TP53. Three years ago, when Kerpedjiev first did his analysis, researchers had scrutinized the gene or the protein it produces, p53, in some 6,600 papers. Today, that number is at about 8,500 and counting. On average, around two papers are published each day describing new details of the basic biology of TP53.
Why is this one gene out of 19,000-some such a black hole of research funding?
The gene is a tumour suppressor, and widely known as the ‘guardian of the genome’. It is mutated in roughly half of all human cancers.
As you might expect, most of the top ten genes on the list are related to cancer.
Checking the calendar on written history, oral history and C-14
No matter how many times it’s misused in bad movies—or misstated by anti-science sites—you can’t establish the age of dinosaurs with carbon-14. With a half-life of around 5,700 years, carbon-14 is good for dating things back to about 60,000 years—less than 1/1000 of the way back to the Cretaceous. But that short range makes C-14 good for dating things that have happened through most of human history.
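The dating arithmetic itself is simple exponential decay: the fraction of C-14 remaining falls by half every half-life, so the raw age is just the half-life times the number of halvings. Here’s a minimal sketch in Python (note that real radiocarbon work then calibrates these raw ages against curves like IntCal, which this ignores):

```python
import math

HALF_LIFE_C14 = 5730.0  # years (the commonly used "Cambridge" half-life)

def radiocarbon_age(fraction_remaining: float) -> float:
    """Uncalibrated age in years, from the fraction of C-14 still present."""
    if not 0.0 < fraction_remaining <= 1.0:
        raise ValueError("fraction_remaining must be in (0, 1]")
    # Number of half-lives elapsed is -log2(fraction); multiply by the half-life.
    return -HALF_LIFE_C14 * math.log2(fraction_remaining)

# After one half-life, half the C-14 remains:
print(round(radiocarbon_age(0.5)))    # 5730
# With only 0.1% left, we're near the ~60,000-year practical limit:
print(round(radiocarbon_age(0.001)))
```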
An international team of scientists from both Europe and North America took a look at some events that had written records, like the Black Death in Europe, as well as some that are supported by the “adawx,” the oral records of history as kept by the Tsimshian First Nation in British Columbia. The Tsimshian adawx is passed on from generation to generation through recitations at a special feast and indicates a migration away from their traditional homeland from about 500 to 1000 CE.
In both cases, the team seemed to find a population decline at the time expected.
This represents the first formal test of indigenous oral traditions using modern radiocarbon modeling techniques.
But hopefully not the last. Oral traditions like the adawx are more than just fairytales, and putting some authoritative dates to the events recorded may make it possible to paint a better picture of what was happening in the Americas (and elsewhere) before written records appeared.