The May 21 Nature carried a special section on bees (behind a paywall), and at its heart was a detailed evaluation of the state of the science on the insecticides known as neonicotinoids, or neonics for short.
What I've been reading for some time in my scattered science sources is that the science is suggestive but inconclusive. This article got down to the nitty-gritty, explaining exactly why it's inconclusive so far, and why the nature of the problem makes a decisive experiment so difficult to design. But it also pointed out one specific arena in which we have both a smoking gun and a straightforward way (short of the politically difficult goal of banning the chemicals outright) to greatly reduce honeybee mortality, a step our neighbor to the North has already taken. A simple upgrade to the machinery used to plant maize is all it takes.
One touted advantage of neonics is their mode of application. Rather than spraying the chemical over the field, indiscriminately killing every insect it contacts, you soak the seeds in a bath of it before planting. Only the bugs that try to eat the seeds get a lethal dose. The controversy has been over the effects, if any, of the sublethal exposure bees get from the trace amounts that end up in the leaves, flowers, and pollen.
But as it turns out, the standard corn-planting machinery abrades the surface of the kernels, making the planter spit out a trail of deadly dust. The neonics in that dust are full-strength poison. Christian Krupke, a Purdue entomologist who studied corn plantings in the Midwest in 2010, says, "It was so incredibly toxic -- a bee flying behind a planter would just die on the spot."
Below, the remedies for that, and details of what little is known about the risks of pollen exposure.
Metal deflectors attached to the planters can minimize release of the dust, and in 2013 a lubricant was introduced that greatly reduces the abrasion that creates the dust in the first place. The deflectors are now required by law in Europe; Canada now requires the lubricant when planting treated maize and soya, and the government reports that associated bee mortality has dropped by 70%.
The U.S. needs to adopt similar requirements. The science is solid, the added expense is incremental, and it wouldn't require agribusiness to give up any of its beloved neonics. It should be a politically easy no-brainer.
Now, what about those sublethal doses that linger in nectar and pollen, mostly of maize and rapeseed, once the crops have grown?
The picture is murky; bees are under a variety of pressures: from the Varroa destructor mite, from a host of viruses associated with it, and above all from loss of habitat. It's not easy to separate out the role of pesticides.
Colony collapse disorder is thought to result from a loss of navigation skills in honeybees and bumblebees. It's also known that both Varroa infection and large sublethal doses of neonics wreak havoc on bees' ability to orient, especially when foraging in new fields.
That's a powerful reason for caution. But the effects of the much smaller doses actually found in the nectar and pollen of treated plants are harder to gauge, and it's tricky to design an effective experiment. In a free-ranging field trial, you can't control for dosage; in a lab, where you can control the insects' diet, their navigation systems aren't challenged by wide-ranging foraging.
Field trials, being more expensive, have all been funded by neonics manufacturers. The two main experiments, in France and Canada, found no statistical difference in survival between bees whose hives sat in treated fields and those in untreated ones. However, setting aside suspicions about conflicts of interest, the test plots were only 2 hectares, while honeybees typically forage over 150 to 600 hectares; only a small fraction of the bees' intake was likely to come from the treated plots. Honest or not, these tests have been inadequate and are not seriously reassuring.
In the first phase of 2012 lab trials, Henry in France fed a season's worth of thiamethoxam to honeybees, and Goulson in the UK did the same with imidacloprid and bumblebees. (Controls, of course, received just plain pollen and sugar water.) To test the bees' ability to orient, in the second phase they were allowed to range freely. The honeybees failed to find their way back to the hive significantly more often than controls. Bumblebees showed a similar effect; worse, they produced 85% fewer queens. That's particularly troublesome for the bumblebees: honeybee queens live for several years, but a bumblebee queen lives only one, and the colony must produce fresh queens if it is to continue into another year.
Those experiments prompted the current two-year moratorium on neonicotinoids in the EU. But they can't be regarded as conclusive, because they didn't reproduce natural conditions. Every organism has some ability to detoxify poisons it ingests. Undergraduates may fare poorly if they take a whole semester's sublethal dose of alcohol in the first month, or the first week. It's not enough that the quantities are the same; the administration schedule can also matter enormously. So these studies ring an alarm, but they don't give us smoking-gun proof that, under the more drawn-out ingestion of a real foraging season, the bees couldn't have successfully detoxified the same dosage.
I'd love to sound a more certain trumpet, but that's where we are for the moment. So before engaging the pesticide manufacturers in philosophical debate about just when a precautionary principle should kick in, let's go after that maize dust.