
In November, PM Trudeau’s principal adviser, Gerald Butts, took time out from his presumably full schedule to block me on Twitter. My offense was, I think, trivial; I replied to a tweet in which Gerry was crowing about the Liberals’ tremendous fiscal management as outlined in their fiscal update. I asked how the oil and gas sector was doing and whether he’d like to comment on any associated drop in tax revenues. I also inquired about the health of Canada’s steel and aluminum industries and what progress had been made on expansion of the Trans Mountain pipeline. I concluded by suggesting that things were going swimmingly.
Okay, I was a little sarcastic, but didn’t name-call or use offensive language. Unlike Gerry, who has called people he disagrees with “Nazis” on Twitter, I merely suggested that things were perhaps not as rosy as the government would have us believe and I think plenty of room remains for more and harsher criticism.
Like his boss, Gerry is strangely adolescent when it comes to social media. He may not be obsessed with selfie moments, but he often engages in petty squabbles on Twitter that a grown-up in his position should be embarrassed by. Still, he’s under no obligation to pay attention to cranky old guys like me who share none of his responsibility with respect to guiding the ship of state; blocking me is his prerogative.
More worrying than his online behaviour is his undue influence over, and apparent happiness with, the federal Liberals’ rejigging of the national economy. By now, everyone must be aware that Butts is on record saying he believes there should be no fossil fuel industry in Canada by 2050. Trudeau let slip in a 2017 town hall meeting that he shares that view when he asserted that the oil sands had to be phased out. We all know, or were certainly told repeatedly, that Stephen Harper had a “hidden agenda,” even though I’m not sure what anyone would point to as evidence of its execution in whole or in part. Strangely, no one suspects the present “sunny ways” government of having any such nefarious plan, even as evidence of a consistent, deliberate and, so far, very effective strategy to cripple the fossil fuel industry in Canada is all around us.
Of course, some people welcome such an agenda and don’t care if it’s covert just as long as it satisfies the moral imperative we all share to fight climate change. And everyone is of course up to date on the science, or at least claims to be, so that discussion is over. When you’re safely atop the moral high ground it is easy to accept and even regurgitate the Liberal pap regarding the actions they’ve taken. The Great Bear Rainforest is assuredly no place for a pipeline, so Northern Gateway was justifiably cancelled. Our coastlines (well, at least a specific portion of one) must be protected, so hooray for the tanker ban. And TransCanada abandoned its Energy East project simply because market conditions had changed; that decision was in no way influenced by the introduction of onerous new regulations by the Liberal government. Trudeau was happily pro-Keystone XL during the 2015 election campaign, hiding behind the smokescreen of Obama’s purely political and obvious intention to cancel it. It never occurred to Justin and his crew that Clinton might not win the U.S. election and that Trump would immediately roll back Obama’s ideologically driven decision. Now, Trudeau is mum on the project.
Which leaves only Trans Mountain, and it increasingly looks like buying it was a stroke of genius: the government can simultaneously thump its chest about taking action while controlling the pace at which the approval and construction processes are undertaken; it must be done right, don’t you know, so it’s going to take some time. Thank goodness that this government believes in the rule of law and the authority of the courts, particularly when the courts make judgements that fully align with its own agenda. It’s so fantastic, this play, that it even provides a stream of revenue to the government from the very industry it is bent on dismantling while it maintains a chokehold on distribution growth.
But the obfuscation, the high-minded pronouncements, the outright chicanery can all be forgiven because the climate is in crisis and we must do this for the planet and the future of humanity. As the recent United Nations Framework Convention on Climate Change Conference of the Parties (COP24) in Katowice, Poland, approached, we were inundated with waves of gloomy prognostication from, to name a few, the IPCC, Britain’s Met Office and the U.S. Global Change Research Program. We were repeatedly told that the deadline for action is now closer, the costs of inaction have soared, and the situation has clearly never been more dire.
On October 31st, in the midst of this pre-COP hysteria, an important new, fully peer-reviewed paper was published in the prestigious, gold-standard science journal Nature. The paper, Quantification of ocean heat uptake from changes in atmospheric O2 and CO2 composition, was authored by researcher Laure Resplandy and nine others. It referenced 69 other papers, and the research was supported by, or used data from, a number of highly regarded scientific institutions that are front and centre in climate change research, including: the National Oceanic and Atmospheric Administration (U.S.); the Princeton Environmental Institute (U.S.); the National Center for Atmospheric Research (U.S.); the Scripps Institution of Oceanography (U.S.); and the Canadian Greenhouse Gas Program.
The research was designed to quantify the amount of heat being absorbed by the world’s oceans using a novel approach. Rather than relying on actual temperature data gathered over time from a variety of sources, the researchers calculated ocean heat uptake from changes in the levels of oxygen and carbon dioxide released to the atmosphere as the ocean warms. Using this method, they determined that the oceans have been warming 60% faster than previously thought over the past quarter-century.
This staggering result was immediately reported by media outlets around the world. It provided yet another horrifying statistic showing just what a bind the planet is in. The fact that research findings of this sort now drive headlines is faintly amusing. Research papers like Resplandy et al., 2018, make for dry and often impenetrable reading. The language is arcane, and the mathematics and statistical analyses are well beyond the grasp of most people. Any visual material, graphs and charts, provides little relief and can be all but indecipherable. It’s difficult to accept that the average journalist, untrained in science, can properly decode a research paper and fully grasp its findings and conclusions. It’s doubtful journalists ever read the actual papers; more likely they rely on the PR talking points that journals like Nature, or the institutions that fund the research, provide when a paper is released. But Resplandy et al. was pitched as being policy-relevant and therefore very newsworthy.
Most of the time, this is a smoothly functioning process that continually builds up the climate change narrative. It is relentless and has generated near-universal acceptance of climate change “truths”: that it is driven by increasing amounts of CO2 being dumped into the atmosphere through the human use of fossil fuels, that it is spiralling out of control, that the consequences are catastrophic and that the science cannot be questioned.
Periodically, however, some brave soul does the unthinkable and successfully challenges a research finding. And that is what happened with Resplandy et al. An independent climate science researcher, retired British financier Nicholas (Nic) Lewis, read the paper and almost immediately noticed problems with the math and statistical analyses underpinning it. Lewis tried to reproduce the paper’s results and contacted the authors to determine why he was unable to do so. Ultimately, he discovered that the paper grossly overstated the heat uptake of the world’s oceans and underestimated the uncertainty attached to those findings; in short, the paper was all but worthless.
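Lewis’s central complaints were that the paper’s headline trend was miscalculated and that its uncertainty was understated. As a rough illustration of the kind of calculation in dispute, here is a minimal sketch, assuming nothing about the paper’s actual data or code: an ordinary least-squares trend fitted to a synthetic annual series, with its standard error reported alongside.

```python
# Minimal illustration of a least-squares trend fit with uncertainty.
# Synthetic placeholder data -- NOT the Resplandy et al. dataset.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
years = np.arange(1991, 2017)                   # hypothetical 26-year record
series = 1.16 * (years - years[0]) + rng.normal(0, 2.0, years.size)

fit = stats.linregress(years, series)
print(f"trend: {fit.slope:.2f} units/yr +/- {fit.stderr:.2f} (1 s.e.)")
# The point at issue, in essence: report the trend AND an honest standard
# error; understating the latter overstates confidence in the former.
```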
Lewis suggested that the authors should retract the paper and advise the media they were doing so and why. He then publicized his analysis on Climate Etc., the blog of climate scientist Judith Curry. It must have been somewhat embarrassing, but Nature did acknowledge problems with the paper and advised that the team of researchers was working to resolve them. Several mainstream media outlets also reported that issues had been found with the paper. While the announcement of the paper’s findings had generated considerable media interest, the climb-down over its failings was more subdued.
This whole episode is both important and entirely inconsequential. Important because it highlights that much of the science is accepted without question, as few people seriously examine, or are even capable of seriously examining, what is being produced, including peer reviewers. Inconsequential because the impact of Lewis’s efforts to get at the truth is Lilliputian in scale; almost no one is paying attention. Additionally, with all the furious public condemnation of “deniers” and “skeptics,” few are encouraged to do what science actually demands: challenge any and all scientific knowledge, no matter how strong the consensus supporting it.
As an independent researcher, Lewis conducts his own research and collaborates with others. The peer-reviewed results from these efforts have been published in reputable journals, and Lewis has also served as an IPCC Expert Reviewer. Some of his work has focused on better approximating the magnitude of the expected temperature response to a doubling of atmospheric CO2, which is captured in a couple of key metrics, equilibrium climate sensitivity (ECS) and transient climate response (TCR), that the climate science community still expresses only as range estimates. His findings shift the range down slightly relative to estimates from other studies. These metrics are critically important for several reasons, not least because they make it possible to calculate a price for CO2 emissions that reflects their actual cost to the environment.
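For the curious, work of this kind typically uses an observational “energy-budget” approach. In simplified form (a sketch only; the published papers include adjustments and a careful uncertainty treatment that this one-liner omits), the relation is:

```latex
% Simplified energy-budget relation behind observation-based
% sensitivity estimates (a sketch; details vary from paper to paper).
\[
  \mathrm{ECS} \;\approx\; F_{2\times\mathrm{CO_2}} \,
  \frac{\Delta T}{\Delta F - \Delta Q}
\]
```

Here ΔT is the observed change in global mean surface temperature, ΔF the change in radiative forcing, ΔQ the change in the planet’s heat uptake (mostly the oceans), and F₂ₓCO₂ (roughly 3.7 W/m²) the forcing from a doubling of CO2. This is also why a paper on ocean heat uptake, like Resplandy et al., bears directly on sensitivity estimates: ΔQ sits in the denominator.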
Discovering problems in Resplandy et al. was not the first time that Lewis has revealed deficiencies in the work of well-funded and highly regarded researchers or government agencies operating at the very nexus of climate science. In 2013 he held the British Meteorological Office to account for misrepresenting how its major climate model, HadGEM2-ES, performed relative to actual climate observations. The Met Office claimed its model incorporated, and that the model’s outputs were consistent with, the findings of the most recent climate science, including work that Lewis had collaborated on. Lewis demonstrated these claims to be false: the Met Office’s climate forecasts ran high, outside the range produced by other researchers and agencies, including the IPCC.
Lewis’s success as an unfunded ‘amateur’ exposing significant failings in the work of climate science professionals is extraordinary but not without precedent. Perhaps the most noteworthy case is that of retired Canadian mining analyst Steve McIntyre. McIntyre became interested in climate science when he noticed that the ‘hockey stick’ curve, made famous in Al Gore’s film An Inconvenient Truth and purported to depict 1,000 years of global average temperature, eliminated noted climate events like the Medieval Warm Period and the Little Ice Age. McIntyre initially reached out to climate scientist Michael Mann asking for access to his data and computer code so that he could try to understand and replicate the results of Mann and his team. Mann was uncooperative from the start and eventually became hostile. It is a long and well-documented story, but McIntyre, along with fellow Canadian Ross McKitrick, an environmental economist, ultimately managed to punch serious holes in the ‘hockey stick’ as well as the data and methodology used to produce it. McIntyre and McKitrick were even targeted in some of the infamous ‘Climategate’ emails that cast several key actors in the world of climate science, including Michael Mann, in a very poor light.
What Lewis, McIntyre, and McKitrick have in common are exceptional math skills and proficiency in statistical analysis. These are essential tools in climate science, relying as it does on elaborate computer models built from a mix of actual observational data, proxy data from sources such as ice cores, lake sediments and tree rings, and parameterized values for model elements where real or proxy data are not available or where it is not yet, or will never be, possible to model the physical characteristics of the elements in question. Climate science is not a single discipline but a complex amalgam of multiple disciplines, including physics, chemistry and biology; common to all is a reliance on statistical analysis. Since the edifice is built almost entirely on statistics and statistical modelling, and since a huge number of the so-called ‘experiments’ conducted are computer simulations using datasets developed with these models, the most critical skillset on any team of climate scientists is world-class statistical analysis. That two ‘amateur’ scientists and a professor of environmental economics possessing these talents, Nic Lewis, Steve McIntyre and Ross McKitrick, have been able to critically challenge and ultimately negate the findings of highly credentialed, well-funded and esteemed professional climate scientists is proof of that.
As a point of clarification, in the context of this discussion the terms relating to statistics and statistical analysis refer to highly advanced techniques, not simple measures like the arithmetic mean of a set of numbers or the use of percentages to illustrate a simple linear trend. Much of the controversy regarding Michael Mann’s ‘hockey stick’ focused on his team’s apparent misapplication of a dimensionality-reduction technique called principal component analysis (PCA). Following is a description of PCA from a Nature Methods article published in June 2017 on the Nature.com website:
PCA reduces data by geometrically projecting them onto lower dimensions called principal components (PCs), with the goal of finding the best summary of the data using a limited number of PCs. The first PC is chosen to minimize the total distance between the data and their projection onto the PC. By minimizing this distance, we also maximize the variance of the projected points, σ². The second (and subsequent) PCs are selected similarly, with the additional requirement that they be uncorrelated with all previous PCs. For example, projection onto PC1 is uncorrelated with projection onto PC2, and we can think of the PCs as geometrically orthogonal. This requirement of no correlation means that the maximum number of PCs possible is either the number of samples or the number of features, whichever is smaller. The PC selection process has the effect of maximizing the correlation (r²) between data and their projection and is equivalent to carrying out multiple linear regression on the projected data against each variable of the original data. For example, the projection onto PC2 has maximum r² when used in multiple regression with PC1.
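To make that less abstract, here is a tiny, self-contained Python sketch of PCA on synthetic data (illustrative only; Mann’s reconstruction used proxy series and a far more elaborate pipeline). Note the centring step: how, and over what period, the data are centred before PCA was one of the points McIntyre and McKitrick contested.

```python
# Bare-bones PCA via the singular value decomposition.
# Synthetic data -- illustrative only, not climate proxies.
import numpy as np

rng = np.random.default_rng(42)
X = rng.normal(size=(100, 5))        # 100 samples, 5 features
X[:, 1] += 2 * X[:, 0]               # correlate two features so PC1 dominates

Xc = X - X.mean(axis=0)              # centre each feature on its full-sample mean
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)

scores = Xc @ Vt.T                   # projections of the data onto the PCs
explained = s**2 / np.sum(s**2)      # share of total variance per PC
print("variance explained:", np.round(explained, 3))
```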
No lay user of statistical information, e.g., a sports enthusiast comparing the save percentages of NHL goalies or a would-be home buyer tracking interest rate trends, could be expected to have even heard of PCA or other advanced analytical techniques, much less possess the remotest understanding of them. That simple truth likely also applies to nearly all policymakers who use summaries of findings generated with these methods as the basis for their policy prescriptions.
I am not arguing here that these exposures of poorly executed and effectively valueless research overturn the central climate change narrative that human activity is affecting the world’s climate. What these events do is raise the possibility that other research findings, widely accepted and endlessly propagated by policymakers and the media, might also be flawed or even fundamentally unsound. They also highlight the enormous complexity and still-prevalent uncertainty of climate science, characteristics likewise evident in many other peer-reviewed studies that explore very specific issues but receive no attention from the media, advocacy groups or political actors. Politicians, the media, and other influencers portray climate science as if it were a hard, tangible and readily understood object, an absolute truth. It is more akin to a spongiform, amorphous blob that defies easy understanding and about which almost nothing is absolute.
Richard Lindzen, Professor Emeritus in the Department of Earth, Atmospheric and Planetary Sciences at the Massachusetts Institute of Technology (MIT), recently delivered a lecture, Global Warming for the Two Cultures. He identified the two cultures as “non-scientists” and “scientists” and described the divide that exists between them. For the issue under consideration, Lindzen assumes that members of both groups are highly educated but on some level incapable of communicating with each other. Lindzen said, “The gap in understanding is also an invitation to malicious exploitation. Given the democratic necessity for non-scientists to take positions on scientific problems, belief and faith inevitably replace understanding, though trivially oversimplified false narratives serve to reassure the non-scientists that they are not totally without scientific ‘understanding.’ The issue of global warming offers numerous examples of all of this.”
Lindzen then walked his audience through a description of the climate system that he suggested should be readily understood and accepted as uncontroversial by the “scientists” in attendance and, hopefully, somewhat intelligible to the “non-scientists” present. He concluded this description by urging attendees to “Consider the massive heterogeneity and complexity of the system, and the variety of mechanisms of variability as we consider the current narrative that is commonly presented as ‘settled science’.”
In an exploration of the current prevailing political narrative, Lindzen suggests that the belief that climate can be summarized by a single variable, the globally averaged change in temperature, and that this variable is controlled by “the 1-2% perturbation in the energy budget due to a single variable – carbon dioxide,” is “based on reasoning that borders on magical thinking.” The unquestioning acceptance of this belief gives politicians confidence that they know exactly what policies are required to control CO2.
We are now experiencing firsthand in Canada the impacts of decisions being made and political actions being taken that are based on scientific findings that are speculative in nature and where the massive uncertainty attached to them is unrecognized if not completely unstated. As Lindzen points out, “our leaders are afraid to differ, and proceed, lemming-like, to plan for the suicide of industrial society.”
To take this discussion full circle, no one, not even the non-scientists, can have been fooled by the party trick Trudeau performed in 2016 at the Perimeter Institute for Theoretical Physics in Waterloo, when he discoursed on quantum computing. If anything, that stunt is the perfect analog for how his government and many others use references to science as props to justify their policy decisions. It is nothing more than the rote recitation of a few talking points to establish for an audience an understanding of something about which the speaker knows very little. Trudeau and those around him are non-scientists. They have liberal arts educations, and their presumed expertise is in politics. Gerry Butts may have been the head of an activist environmental organization, but his role was to foster and exercise political influence, not direct scientific inquiry.
An essay, I, Pencil, written in 1958 by Leonard E. Read, illustrates this point simply and eloquently. Read points out that a pencil is made from a handful of basic materials, wood, graphite, rubber, metal and paint, but no person on the planet fully understands all the processes that go into making a pencil or could make one without the support of countless others who create the required materials and tools. The breadth and depth of human experience and knowledge that led to the design and creation of something so commonplace and simple cannot be overstated.
Do we imagine that any of the key actors in the present government are capable of fully explaining the relatively straightforward science and engineering required to make a pencil? So why do we think they fully comprehend the science and know enough to support the massively dislocating solutions they propose to solve the climate problem? As a group, they are simply not qualified to make truly science-based policy decisions but only believe themselves to be.
The problem today is that the lines have become incredibly blurred. Scientists like Michael Mann, and they are legion, have become politically vocal and confront opponents not to argue the science but to delegitimize them by political means. Scientists now advocate policy solutions, which is outside their areas of expertise and functional mandates. Ordinary citizens have divided into camps of believers and deniers even though neither group has the capability to properly evaluate the information that is presented to them. This turmoil does not provide the terra firma to support policies designed to dramatically re-engineer our energy systems, our economy, indeed our whole way of life.
While I have argued here for the necessity of placing mathematical and statistical analysis front and centre in climate science, there is also an argument to be made for paying greater attention to empirical data. Empirical data is necessary to prove or disprove favoured theories and to assess the quality of modelled ‘experiments.’ So far, observations have shown that the modelling, in many cases, is overly pessimistic and the situation is possibly less dire than the catastrophists claim.
A recent aggregation of Environment Canada weather data by Ross McKitrick shows that the 30 weather stations nationally that have reported local temperatures from 1888 to 2017 record a warming trend of 0.1°C/decade over that time frame. For the 267 weather stations with data from 1978 to 2017, the recorded trend is 0.24°C/decade. McKitrick’s report, Trends in Historical Daytime Highs in Canada 1888-2017, shows that the overall warming trend is not uniform geographically or temporally: in some locations no statistically significant* trend is identified, while in others the trend exceeds the national average. Summary notes from the report:
1. There is a trade-off between the number of available stations and the length of record. There are 30 stations with data back to 1888 and 267 stations with data back to 1978.
2. Over the past 130 years the median warming rate in the average daytime high is about 0.1 degrees per decade or 1 degree per century.
3. Over long samples there is little polar amplification (increased warming with latitude) but it does appear in fall and winter months in more recent subsamples.
4. Over the past 100 years, warming has been stronger in winter than in summer or fall. October has cooled slightly. The annual average daytime high has increased by about 0.1 degrees per decade. 72 percent of stations did not exhibit statistically significant warming or cooling.
5. Since 1939 there has been virtually no change in the median July and August daytime highs across Canada, and October has cooled slightly.
6. There are 247 stations with data back to 1958. However, as the time span decreases, the range of observed trends greatly expands. All months exhibit median warming but with much wider variability.
7. Post-1958 Arctic coverage is much better than earlier. There is little indication of polar amplification.
8. Post-1978 the range of trends grows dramatically. The median trend in March and April is slightly negative.
9. Some polar amplification is observed in the post-1978 annual trend, mainly due to the late fall and early winter months.

*A p value helps determine the significance of results when conducting a hypothesis test in statistics.
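For concreteness, here is a hedged sketch, using synthetic numbers rather than McKitrick’s station data, of how a per-decade trend and its statistical significance might be computed for a single station’s series of annual daytime highs:

```python
# Trend in degrees C per decade, with a p value for significance.
# Synthetic single-station series -- NOT Environment Canada data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
years = np.arange(1888, 2018)
highs = 15.0 + 0.01 * (years - years[0]) + rng.normal(0, 0.6, years.size)

fit = stats.linregress(years, highs)
print(f"trend: {10 * fit.slope:+.2f} C/decade, p = {fit.pvalue:.3f}")
# If p exceeds the conventional 0.05 threshold, the station shows no
# statistically significant trend -- the situation at 72% of stations.
```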
Putting these data in perspective, imagine you are sitting in your home and a family member complains that it feels cold and asks that the heat be turned up. You agree and, fortunately, as a means of managing your family’s home heating costs, you have a thermostat that measures temperature in hundredths of a degree. You generously turn the heat up by 0.24°C. A few minutes later the same family member complains that they don’t notice any difference, even though the temperature has gone up nearly a quarter of a degree. Perhaps, given more time, say, a decade, they would notice, but likely not.
A 2016 report prepared for York Region, Historical and Future Climate Trends in York Region, predicts dramatic increases in temperature in the region by mid-century. York Region stretches from the northern boundary of metropolitan Toronto to the southern shores of Lake Simcoe. The report used historical data from 1981 to 2010 and climate model ensembles to develop its predictions, including the IPCC emissions scenario RCP 8.5. RCP 8.5 is a business-as-usual scenario of high economic and population growth and low technological change, and it yields the highest estimates of temperature increase of all the IPCC scenarios. In discussions of IPCC forecasts it is generally acknowledged to be unrealistic and overly dire, yet it was explicitly chosen for this report.
There is a section on methodology that discusses the datasets and models chosen and how they were used. It is likely opaque to most readers and, I suspect, to the majority of, if not all, the municipal politicians and bureaucrats for whom the report was prepared. The report also includes a lengthy section on historical climate impacts in which rainstorms, ice storms, hot spells, dry spells and other weather events are not too subtly attributed to climate change, although this type of attribution is not supported by the climate science community generally, or by the IPCC specifically.
Rather than explaining in lay terms how the report’s findings were produced, what assumptions they rely on, and what limitations should be kept in mind, the methodology section seems designed to make the report appear as technically sophisticated and scientifically defensible as possible. It may well be both of those things but, equally, it could be a smokescreen from behind which a predetermined set of conclusions, supporting a favoured policy agenda, has been delivered.
Richmond Hill, a town in York Region, recently announced the hiring of a project manager whose mandate will be to plan and implement municipal climate change priorities, addressing the allegedly increasing impacts of climate change and reducing greenhouse gases. The scale of this action is obviously far smaller than that of the actions taken by the federal government to limit emissions and wind down the oil and gas industry. But it starts from the same premise and is built on the same shaky foundation. It is also a decision taken by people who are, like most of their federal and provincial counterparts and the general public, scientifically illiterate but very happy to pronounce on subjects about which they know almost nothing.
As a nation, we should pause and try to figure out how to address this issue without dividing into camps and without wilfully destroying the institutions, industry, and infrastructure that helped create the best world humans have ever lived in. While the debate rages about the Trudeau government’s imposition of a carbon tax, a debate characterized by name-calling, divisiveness and the repetition of simplistic bromides such as the need to “put a price on pollution,” we still haven’t had the discussion we need. That discussion must include scientific information from across the spectrum, made accessible to the majority who are non-scientists. It must also include an equally accessible estimation of the uncertainty around any of the major claims made. In short, a proper discussion and resolution of the differences between the “two cultures.” Then we will have a basis for developing policy.

Living in a democracy as we do, we accept the idea of majority rule, and so the notion of a consensus view as an appropriate way to settle scientific questions has gained wide support. But another societal ideal is more analogous to the scientific method and would be more properly applied here: the requirement in our legal system that, to win a conviction, the prosecution must prove its case beyond a reasonable doubt. That is the standard we need before moving forward. As amateur and non-consensus researchers have proven time and again, we have not yet met it; there is still ample room and justification for skepticism regarding the prevailing narrative. Moving forward regardless is akin to surrendering to mob rule. We should not, and cannot, let that happen.