April 2015

Message from the Academy Editor - The Age of Endarkenment - Andrew Miall, FRSC, University of Toronto

1. Is the din really harmless? Long-term effects in auditory cortex - Jos Eggermont, FRSC, University of Calgary

2. Global Warming, the subjugation of Science and the trend to "Science Lite" - W. K. Hocking, FRSC, Western University

3. Using stellar corpses to explore fundamental physics - Harvey B. Richer, FRSC, University of British Columbia

Message from the Academy Editor - The Age of Endarkenment - Andrew Miall, FRSC, University of Toronto

Andrew Miall

All of the articles in this issue of the newsletter were written by new Fellows who spoke at the RSC Café in Quebec City last November. Thank you to them for their contributions.

In his article, Wayne Hocking deplores the trend toward "science lite" which, he argues, has come to monopolize the debate on climate change. I think this is only one part of a much broader problem, characterized by an anti-science attitude among many of our political leaders, an oversimplified treatment of science in the media, and a loss of trust in science among much of the population.

Our modern era - characterized by the wonders of scientific discovery, technological progress, the gradual elimination of poverty, the near-elimination of many diseases through the triumph of public health, and the growing (though fragile) spread of democracy and human rights - began with the Enlightenment in the mid-1600s, when reason, analysis and logic were championed by Francis Bacon, John Locke and Voltaire, and by philosophers such as Descartes, Kant and Hume. Scientific progress would never have emerged without the determination of the "naturalists" (as they were once called) to challenge the doctrine of the Bible and of tradition. Some of the greatest advances that followed, such as the discovery of evolution and of natural selection, are still contested by some, but on the whole science has prevailed.

However, in 2007 David Colquhoun wrote in The Guardian: "The past 30 years mark an age of endarkenment, a period characterized by the denial of scientific truth and in which dogma and irrationality have once again become respectable." Among other trends, he cites "supernatural and superstitious views of medicine", the conviction that the Earth is young (less than 6,000 years old), creationism, and the healing powers of crystals.

Recently, Timothy Caulfield published a book entitled "Is Gwyneth Paltrow wrong about everything?", in which he explains how the cult of celebrity has managed to mislead the public on a whole range of questions. Distrust of the pharmaceutical giants is one hallmark of this trend, and it has contributed to the popularity of "natural" medicines, dietary supplements and vaccines. It must be said that scientists have not set a good example. Many articles in medical journals touting the effectiveness of a new drug were in fact written by pharmaceutical companies, not by the researchers under whose names they were published. There have been deliberate deceptions, as in the notorious case of Andrew Wakefield, who falsely claimed a link between vaccines and autism. Many people have been taken in by "natural" cancer treatments, generally leading to premature death.

So what is going on? The politicization of legitimate scientific debate is often part of the problem. So is the media's tendency to oversimplify complex questions, which tends to inflame controversy rather than resolve it. Finally, the loss of authority is also a concern. Large public and private organizations (i.e., the corporate sector) can no longer take for granted that the statements and claims they make about their work will automatically be accepted at face value.

The truth is that real science, the kind that has been building our stock of knowledge for two hundred years, has uncovered subtleties that most people do not understand (Hocking's charge that universities avoid courses involving calculus is part of this). That is why scientists with the skills to explain science to the public are so valuable. The McNeil Medal is an important way for the Royal Society to recognize the best of them. Scientific progress often means controversy, and this is where our expert panel process can play a crucial role. Fellows should be constantly encouraged to write popular articles and to promote real science in the media. This newsletter is an ideal place to do so!

1. Is the din really harmless? Long-term effects in auditory cortex - Jos Eggermont, FRSC, University of Calgary

Jos Eggermont

If you, like me, live beside a busy four-lane street in front of a large hospital, you know that the only sounds you are aware of are those from the road and air ambulances. The continuous road traffic noise, the din, is not noticed at all. The auditory cortex is responsible for that. The process is partly due to habituation, which fairly quickly (within an hour) turns down the central auditory system's gain for that particular sound. But if the reduction in sensation level were due only to habituation, you would hear the din again after a good night's sleep, and that does not happen.

Over the last decade my lab, thanks to the efforts of some exceptional postdocs, has investigated the mechanisms behind these long-term effects of moderate-level noise. Adult cats were exposed for three weeks to noise or other multi-frequency sounds, either continuous or 12 hr on/12 hr off, in the frequency range of 4-20 kHz and presented at levels below 68 dBA. Because the legal limit for 8 hr of exposure per day is 85 dBA, and tripling the exposure duration (to cover a full day) would still allow ~80 dBA, this should not cause hearing loss. Indeed, we found no hearing loss as measured by electrical responses in the brainstem, nor any damage to the hair cells of the inner ear.
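For the curious, the ~80 dBA figure follows from the exchange-rate rule used in occupational noise standards: each doubling of exposure duration lowers the permissible level by a fixed step. Here is a minimal sketch of that calculation, assuming the 3 dB (equal-energy) exchange rate implied by the numbers above; it is illustrative only, not regulatory guidance.

```python
import math

def allowed_level_dba(duration_hr, ref_level_dba=85.0, ref_duration_hr=8.0,
                      exchange_rate_db=3.0):
    """Permissible exposure level for a given daily duration under the
    equal-energy rule: each doubling of duration lowers the allowed level
    by the exchange rate (3 dB assumed here; some jurisdictions use 5 dB)."""
    return ref_level_dba - exchange_rate_db * math.log2(duration_hr / ref_duration_hr)

# Tripling the 8 hr reference exposure to a full 24 hr day:
print(f"{allowed_level_dba(24):.1f} dBA")  # ~80.2 dBA, matching the ~80 dBA above
```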

To our surprise, however, we found that neurons in the cat's auditory cortex that code the exposure frequency region of 4-20 kHz largely ceased to respond to sound in that frequency range up to levels of 75 dB. In contrast, the neural responses in the octave regions below and above the exposure frequencies were greatly enhanced in strength, and thresholds were lowered by ~20 dB. The decrease of neural activity in the exposure frequency region likely explains the near inaudibility of these sounds. We then looked at how long it took to reach this reduced level, by exposing for durations from 2 days to 3 weeks. We found that two weeks was sufficient and that longer exposures did not change the effect. Then we studied how long it took for the effect to disappear, assuming it resulted from a plastic mechanism. We found that after at least 6 weeks cortical neurons responded again to all sound frequencies at normal sensitivity levels (3).

So, no long-term harm done? Subsequent experiments and analyses showed that the normally regular mapping of sound frequency in the auditory cortex was now completely random for frequencies between 4 and 20 kHz. This tonotopic mapping did not return to normal even after 3 months of recovery from the exposure in the quiet cat room, which held a number of freely roaming, but mostly sleeping, cats. It is currently not clear whether this map reorganization has perceptual consequences for sound discrimination.

What potentially does have an effect on sound discrimination is the difference in central gain between the exposure region (reduced) and the octaves above and below it (increased). To transpose this to adult humans - whose hearing, owing to exposure to loud occupational and recreational noise, is in practice limited to frequencies below 15 kHz - consider that street noise frequencies are mostly below 3 kHz. This frequency region would thus show a reduced gain, whereas frequencies up to 6 kHz would be enhanced. This could result in problems with speech understanding, especially in noisy environments (1,3).

A disconcerting aspect of these findings is that continuous daily exposure does not allow recovery, since the induction time is so much shorter than the recovery time. It is thus conceivable that workers exposed to machine and other noises, even if only for 8 hr/day and wearing hearing protection that reduces the effective levels below the legal limit, can over time build up a permanent perceptual deficit, probably more serious than that produced by street noise. This has been confirmed by research in Finland (2). It was found that long-time workers in a noisy factory environment, while having clinically normal hearing, could not distinguish a /ba/ phoneme from a /pa/ phoneme. In addition, they did not show the normal electrophysiological response (i.e., the mismatch negativity) that signals the brain's pre-attentive ability to differentiate those phonemes.

About 10% of people have an abnormal perception of sound level known as hyperacusis, in which sounds perceived as normal by the other 90% are found disturbingly, and sometimes painfully, loud. This is generally attributed to an increased central gain resulting from reduced neural inhibition. Recall that in the cats exposed to a 4-20 kHz sound, the central gain was increased above and below the exposed frequency region (likely through the disappearance of the lateral inhibition normally provided by the neurons responding to 4-20 kHz). This increased central gain for those frequency regions would cause hyperacusis, as the much-enhanced neural responses recorded from cat auditory cortex indicate. Central gain increases are based on changes in the transmitter-release properties of excitatory synapses. These not only increase sound-evoked release, but also increase spontaneous transmitter release, resulting in increased spontaneous firing rates in cortical neurons. This has long been considered a necessary, albeit not sufficient, condition for tinnitus (i.e., ringing in the ears). So a side effect of long-term exposure to moderate-level sounds, long considered safe for the auditory system, would be tinnitus and hyperacusis, both in the presence of clinically normal hearing (3).

Answering the question posed in the title is not straightforward. Laboratory experiments are performed in very controlled acoustic environments, whereas human sound exposure is variable and not exactly the same every day. It may well be that certain changes in sound exposure after work offset some of the induced changes, and that recovery in a quiet environment (<45 dBA) is not always beneficial. Nevertheless, our findings suggest that annual testing of workers exposed to long-duration, daily repeated noise should not be limited to measuring hearing sensitivity thresholds, i.e., an audiogram. Testing their ability to understand speech in noise would provide a sensitive diagnostic of the hearing problems that do occur in the absence of hearing sensitivity loss.

References

1) Gourévitch B, Edeline J-M, Occelli F, Eggermont JJ (2014). Is the din really harmless? Experience-related neural plasticity for non-traumatic noise levels. Nature Reviews Neuroscience 15: 483-491.

2) Kujala T, Shtyrov Y, Winkler I, et al. (2004). Long-term exposure to noise impairs cortical sound processing and attention control. Psychophysiology 41: 875-881.

3) Pienkowski M, Eggermont JJ (2012). Reversible long-term changes in auditory processing in mature auditory cortex in the absence of hearing loss induced by passive, moderate-level sound exposure. Ear and Hearing 33: 305-314.

2. Global Warming, the subjugation of Science and the trend to "Science Lite" - W. K. Hocking, FRSC, Western University

W. K. Hocking

I will begin this article by reflecting on global warming. The article is not another rant for or against global warming, nor is it about bashing any political party. Climate change is part of the story, but only part. The article looks at the evolution of science - and especially hardcore science - over recent decades.

The first alarm bells regarding global warming were sounded by scientists, as were the first warnings about stratospheric ozone depletion. And so it will be with other potential (possibly unforeseen) catastrophes. While these topics remained in the domain of science, they could be discussed with objectivity and tolerance. Yes, these effects were there, but so were other, more natural events such as El Niño, La Niña, Milankovitch cycles, and so on. What was the balance?

But then global warming became public property - rightfully so. But there were issues in dealing with a process that involved temperature changes of a fraction of a degree per decade. Was it really important? So the topic changed to "climate change" - not just global warming, but any dramatic changes that might raise public awareness. Severe weather became an area of focus. And somewhere along the path, the issue became one of great polarity. It was inevitable, of course, once mega-bucks entered the picture. Oil companies denied it. But equally, it was politically "cool" to "fight" global warming, whether the protagonist believed in it or not. Carbon taxes, originally envisaged with the best of intent, became new money-making machines. A sense of balance was lost. Scientists who wanted to see the whole picture - including astronomical cycles and natural weather variations - were dismissed as sceptics, categorized along with those who were sure there was no effect.

However, alas, they were not lone outcasts for long, for it soon came to pass that all scientists became outcasts. While we speak of the pendulum of public opinion, the pendulum in this case is not a pendulum - it is more like a magnet on a string moving between two steel walls. It oscillates for a while, but eventually it is grasped by the pull of one wall or the other and pulled unforgivingly in that direction. Political leaders of all stripes recognize the value of manipulating these steel walls, a.k.a. the extrema of public opinion. But science was a problem. It clouds the issues - a real scientist sees many sides of the picture. It is a little sad when a member of the public cannot distinguish between the stratospheric ozone layer and the mesospheric one. It is tragic when a minister of science or minister of the environment cannot, especially if the display of ignorance is a public one. So it became clear that, in order to avoid embarrassment, science should be moved out of the picture. Scientists were told - more than once - that "they had done their job and raised the alarm; now it was up to the politicians to develop future policy". New layers of bureaucracy were introduced between the political and scientific platforms, so that high-ranking public servants - and even company CEOs and university administrators - never really needed to talk directly to a scientist. It also endowed these overlords with enormous power - granting agencies, for example, no longer worked to optimize scientific productivity; rather, they controlled the scientists. Such people now speak without shame about how they like to support their favourite scientists (and, by inference, do less for less-preferred souls) - something that should reek of conflict of interest, but no longer does, it seems. Indeed, the top-down control is even more ominous - ironically, the worst administrators are often those with just enough knowledge to be dangerous. Recent PhD graduates who move directly into administration, with no postdoctoral experience but confident in their (limited) science background, can be a bigger threat than an administrator who knows little science but is smart enough to know it, and so seeks advice. In any case, good "leadership" must be subdued and supportive.

"Global warming" and "Climate Change" now has legions of experts who are not scientists. Hardcore science has lost its grip. At least one university, in setting up a so-called ""sustainability program", refused to allow courses involving calculus into the program, these being of course too challenging for the thousands of non-scientists who might take the program. Hard-core courses would attract too few students to be economically viable - after all, for universities, it's really about money, not knowledge.  Yet the basis of much of what we know about climate change comes from hard-core science. Computer models that help us see the future rely on techniques like -yikes! - calculus (and more). So we have reached the point where arguments about climate change are debated in-vaccuo. Science has much more to offer, but has been sidelined. I recently attended a seminar by an (apparently) eminent UK philosopher, who is a  government consultant on matters of climate change. The talk was about the statistics of climate change, particularly about "Bayesian statistics" and "double-counting". These topics may seem obscure to you, dear reader; but unfortunately the speaker was similarly hampered, so useful advice from this source would have to be questionable. There is much that science can and must still do - but now, with science effectively side-lined, discussions about these topics take place in a vacuum of fundamental knowledge. The science that is left in these programs is a sort of "science lite" - it wears the attire of science, and looks like science to the untrained eye, and of course to the administrators- both in governments and universities at all administrative levels - who dressed it; but take away the clothes, and underneath there is little flesh. 

Yet so much is missed without real science in the picture. There is so much more to our future, and it requires that scientists be trained with a deeper, hardcore focus. We face a period of potential asteroid strikes, excessive man-made heating, and magnetic field changes, among other potential disasters, and only scientists trained at a fundamental level can truly address these issues. Lite science, and speculation by untrained personnel, will not even foresee these events, let alone mitigate them. Yes, everyone needs to be involved - but that should include hardcore scientists. In particular, these scientists cannot function under the constraints of corporate structure - scientific freedom is at the heart of our survival. Science is not a democracy - lone opinions count, as illustrated by legions of "out-of-the-box" thinkers, from Newton to Einstein and beyond. We cannot allow science to be subjugated by the will of corporate and government overlords, and we must resist the current trend toward a downward-control principle.

3. Using stellar corpses to explore fundamental physics - Harvey B. Richer, FRSC, University of British Columbia

Harvey Richer

White dwarfs (WDs) are stars that have depleted their nuclear sources of energy but remain luminous by radiating their stored thermal energy out into space. Since they cool at a very predictable rate, they can be used as delicate probes of a wide variety of physical phenomena, including establishing the chronology of formation of our Galaxy, testing neutrino production rates, and exploring crystallization in the cores of these fascinating objects.

In 1844 the great astronomer and mathematician F. W. Bessel demonstrated that Sirius, the brightest star in the sky, wobbled as it moved through space. This was not in itself unusual: it simply meant that Sirius had a faint companion (Sirius B), and that it and Sirius A formed a binary system with the components orbiting a common centre of mass with a 50-year period. What did turn out to be unusual was that the companion, first imaged in 1862 by the American Alvan Clark, was a faint blue star. All known faint stars at that time were red, implying a low surface temperature. The very blue colour of Sirius B signified that it was very hot, and its faintness meant that it was also very small. Sirius B was the first discovered example of the class of stars called white dwarfs (WDs). These objects are the remnants of stars up to about seven times the mass of our Sun that have completed their nuclear evolution. The source of their light at this stage is simply the radiation of their stored thermal energy out into space. In a time much longer than the current age of the Universe (13.7 billion years), they will deplete their store of energy and become black dwarfs.

While these stars did not lead directly to the development of quantum mechanics, they do require quantum theory to explain their structure; as such, they can be considered a macroscopic demonstration of quantum physics. One aspect of this structure, predicted by the Indian-American astrophysicist Subrahmanyan Chandrasekhar, is that there is an upper limit to their mass, at about 40 percent more than the mass of the Sun. Any WD exceeding this limit would explode as a supernova. Such supernovae have been observed, and analysis of their brightness, coupled with the velocities of their host galaxies, has led to the concept of "dark energy" dominating the Universe.
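For readers who want the origin of that number: in its standard textbook form, the Chandrasekhar limit depends only on the mean molecular weight per electron, \(\mu_e\), of the degenerate matter (\(\mu_e = 2\) for the carbon-oxygen interior of a typical WD):

\[
M_{\mathrm{Ch}} \simeq \frac{5.83}{\mu_e^{2}}\, M_{\odot} \approx 1.4\, M_{\odot} \qquad (\mu_e = 2),
\]

which matches the figure of about 40 percent more than a solar mass quoted above.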

WDs cool at a very predictable rate, much like the embers of a fire burning down. Nature has thus provided us with precision clocks, which we have used to date various components of our Galaxy and to set a lower limit to the age of the Universe that is independent of its expansion rate. A sketch of the clock idea follows; after that, I detail two other important physical phenomena that we are exploring using these remarkable stars.
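This is a minimal sketch assuming Mestel's classical cooling law, under which the cooling age scales as t ∝ (M/L)^(5/7); it ignores neutrino losses and crystallization, and real cooling models are far more detailed, but the scaling conveys how luminosity acts as an age indicator.

```python
def age_ratio(l_faint_over_l_bright, exponent=5.0 / 7.0):
    """Ratio of cooling ages for two WDs of equal mass under Mestel's
    classical law, t ∝ (M/L)^(5/7). Only the luminosity ratio is needed;
    absolute ages require a calibrated cooling model."""
    return l_faint_over_l_bright ** -exponent

# A WD 100x fainter than an equal-mass companion is ~27x older:
print(f"{age_ratio(0.01):.1f}")  # ~26.8
```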

Rate of neutrino emission

The Sun is powered by the nuclear conversion of hydrogen into helium. During this process, energetic particles called neutrinos are produced. Neutrinos are almost massless particles that travel near the speed of light and interact very weakly with matter. These solar neutrinos provide the only direct confirmation of the source of solar energy. Attempts to detect them began in the 1960s, but almost from the very beginning it was clear that the rate of detection did not agree with theory - only about one third of the expected number were seen. Eventually it was understood that neutrinos can "oscillate" from one type to another (there are three known neutrino types) in the solar interior, and because the existing experiment was sensitive to only one type, the observed rate was only about a third of that expected. In 2002 the Nobel Prize was awarded to the two physicists who led this experimental work.

In the interiors of hot WDs, neutrinos are copiously produced. These neutrinos are almost a thousand times less energetic than those manufactured in the Sun, and they are manufactured in an exotic and entirely different manner from solar neutrinos. The production mechanism here is the plasmon neutrino process, wherein a photon, interacting with the plasma of electrons in the WD core, acquires an "effective mass". This then allows the photon to produce a virtual electron-positron pair that subsequently annihilates into a neutrino-antineutrino pair. Such a process is normally forbidden, as a photon has no mass and the decay cannot conserve both momentum and energy. Neutrino rates for this process have been calculated and experimentally inferred, but a direct astrophysical check on the rates in this energy regime had never been carried out.
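The kinematic loophole can be stated compactly. In the core plasma the photon's dispersion relation develops a cutoff at the plasma frequency \(\omega_p\), so the "plasmon" behaves like a particle with a small effective mass (written here in the standard form for a non-relativistic, non-degenerate plasma, in Gaussian units; the degenerate WD interior modifies \(\omega_p\) but not the principle):

\[
\omega^{2} = \omega_p^{2} + c^{2}k^{2}, \qquad \omega_p^{2} = \frac{4\pi n_e e^{2}}{m_e}, \qquad m_{\gamma} = \frac{\hbar\,\omega_p}{c^{2}},
\]

and a quantum with \(m_\gamma > 0\) can decay to a neutrino-antineutrino pair while conserving both energy and momentum.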

In 2013, together with colleagues, I led a program using the Hubble Space Telescope (HST) to carry out imaging of an ancient, populous star cluster. The experiment exploited HST's ultraviolet capabilities to observe the core of the cluster, where the youngest and hottest WDs reside. We identified in excess of 10,000 WDs, the hottest of which has a surface temperature of around 100,000 degrees.

Comparing with theoretical models, we determined that the very hottest WDs were cooling at the expected rate to within a factor of two or so, thereby verifying the neutrino production mechanism. While this is a superb result, several astronomical quantities entered the analysis, all with significant errors, which reduced the precision.

Core crystallization

The core of a typical WD consists of the products of helium burning, that is, a mixture of carbon and oxygen. The density is very high - about a million times that of water. When the core is hot, the carbon atoms behave as a classical gas, moving about with high thermal velocities. But as the WD cools, the high density can force the atoms into a crystal lattice and, in effect, the core of the WD "freezes". When this happens, the star emits some excess energy (latent heat) as its atoms lose their thermal velocities. This shows up as a slowing in the rate of cooling of the WDs. Observationally, it exhibits itself as a "bump" in a plot of the number of WDs versus their age.
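A toy model makes clear why a slowdown appears as a bump. For a roughly constant WD birth rate, the number of stars found per luminosity bin scales inversely with the cooling speed, so stars pile up wherever cooling stalls. In the sketch below, the placement, width, and strength of the latent-heat slowdown are invented purely for illustration.

```python
import numpy as np

# For a constant WD birth rate, counts per luminosity bin scale as
# n(L) ∝ 1 / |dL/dt|: stars accumulate where cooling is slow.
logL = np.linspace(-2.0, -5.0, 301)        # log10(L/Lsun), stars fade left to right
base_rate = 10.0 ** (1.2 * logL)           # schematic cooling speed |dL/dt|

# Latent heat released at crystallization (placed here at logL ~ -4.0,
# with invented width and amplitude) slows the cooling locally.
slowdown = 1.0 + 0.5 * np.exp(-((logL + 4.0) / 0.15) ** 2)
n_of_logL = slowdown / base_rate

excess = n_of_logL * base_rate             # counts relative to the smooth model
print(f"bump peaks at logL = {logL[excess.argmax()]:.2f}, "
      f"excess factor {excess.max():.2f}")
```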

We have looked for, and found, an excess number of WDs in our data just where the core is predicted to crystallize - a lovely confirmation of WD cooling theory. Late in their evolution, WDs in effect become huge diamonds in the sky.