The ‘R’ Word – Use it Thoughtfully

July 28, 2015, was a great day for illustrating why the ‘R’ word – renewables, meaning wind and solar energy – should be used judiciously by thinking people.

First thing to note is that it was a hot day in Southern Ontario. Environment Canada issued a heat warning, forecast a high of 33°C, and recorded a temperature of 31°C at Pearson Airport at 1:00 PM.

Heat warning

On such a day, demand for electricity would have been at or near its annual peak as countless homes, businesses and public facilities cranked up their air conditioning.

The second thing to note is that the wind and solar generators that Ontarians have been forced to subsidize in vain pursuit of the provincial government’s green vision were effectively useless in terms of their contribution to meeting demand.

The Independent Electricity System Operator (IESO) manages the electrical grid in Ontario. The IESO reported that Ontario demand at 12:00 PM was 21,026 MW. The following screenshot taken from the IESO website indicates that at 12:00 PM wind generators were delivering 30 MW, and solar generators 97 MW, out of a total supply of 21,306 MW. Combined, that’s just over ½ of 1%.
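For anyone inclined to check the arithmetic, here is a quick back-of-the-envelope calculation using the IESO figures quoted above (the variable names are purely illustrative):

```python
# Back-of-the-envelope check of the wind and solar share of supply at 12:00 PM,
# using the IESO figures quoted above.
wind_mw = 30
solar_mw = 97
total_supply_mw = 21306

share = (wind_mw + solar_mw) / total_supply_mw
print(f"Wind + solar: {wind_mw + solar_mw} MW of {total_supply_mw} MW ({share:.2%})")
# Prints roughly 0.60%, i.e. just over half of one percent.
```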

The chart also shows how so-called ‘dispatchable’ power sources, i.e., those that can be turned on and off or turned up and down in fairly short order, hydro and gas generation for example, were ramped up over the course of the day as the temperature, and demand for electricity, rose. It’s also evident that the nuclear fleet was providing the base load power needed day in and day out by the province, as it always does.

The installed capacity of wind is actually more than 3,000 MW but, on a hot, windless day, its effective capacity was about zero. Solar was operating pretty much at the peak of its capacity at midday; a fairly modest contribution however, despite there being acres of rooftops and farm fields covered with photovoltaic panels in Ontario.

Generation

All of this essentially useless and ineffective generating capacity costs ratepayers in the province billions of dollars and has contributed to Ontario’s declining industrial competitiveness. Anyone, and especially any politician, who suggests we need to move more quickly to renewables should be challenged on their facts and assumptions. Personally, I’m not keen on the prospect of freezing or frying in the dark because some misguided zealots are busy ‘saving the planet’ at my expense.

A few other points to note: gas plant capacity has increased in Ontario to offset the loss of retired coal-fired generating capacity and to provide back-up when it’s either too dark/cloudy, or too still for the renewables to deliver. So, Ontarians have had to pay twice for the effective capacity available.

Nuclear and hydro generation are greenhouse gas emissions-free and accounted for 86% of the electricity generated in Ontario in 2014 (62% for nukes, 24% for hydro). Few other jurisdictions on the planet can match that green hue.

Biofuel, another class of renewable, has also ramped up slightly. For the record, ‘biofuel’ in Ontario means burning wood. Yes, wood. It’s meant to be renewable because new trees will grow and replace those burned and will absorb the carbon dioxide emitted through combustion. That only takes 30-60 years, but hey, if we get to thump our chests and trumpet our green credentials then I guess it’s worth it. I can’t really think why else we’d be going back to wood to meet our energy needs.

Stay cool.


Small Beer – Climate Change Alarm Hits a New Low Over a ‘Record High’

After months of anticipation the global warming alarmist community dolefully (gleefully?) reported that a new record high annual average surface temperature for the globe was observed in 2014. According to the believers, this new milestone in global average temperature was a record in the modern period, i.e., since 1880, and more evidence, as if any were needed, of the inexorable rise of the earth’s temperature due to human activity.

Whenever some hapless doubter, after a cool summer or a frigid winter, dares to query the legitimacy of the whole anthropogenic climate change theory, they are immediately advised that such variation is “just weather.” For example, in the winter of 2013-14 ice cover in the Great Lakes at 88.3% was the fourth highest it has been since 1973 and not far off the record of 94.7% set in 1979, according to Environment Canada. That agency was quick to note that:

A season of unusually cold weather in the Great Lakes basin is not a sign that the century long trend of rising temperatures has reversed. In fact, while Canada and the eastern U.S. froze at times this winter, many locations including Alaska and Europe, were experiencing unseasonably warm temperatures.

Fair enough, and great to see the scientists at Environment Canada were able to slip the muzzle long enough to get that comment published on their website.

With this perspective in mind, we must surely be able to assume that the new record global average temperature is not merely some variation due to weather, but a troubling shift in an already unsettling trend. The new record was announced by both the National Aeronautics and Space Administration (NASA), and National Oceanic and Atmospheric Administration (NOAA), in the US.

According to the NOAA website, the global average surface temperature (land areas and oceans combined) was 0.02°C above the previous highs recorded in 2005 and 2010. That’s right, 2/100ths of a degree warmer. Their error bounds are multiples of that value, which means the result is not statistically significant, i.e., 2014 was not meaningfully different from 2005 or 2010 and the change could simply be due to accumulated errors in the reporting and calculation of the value. But it’s a new record, and it supports the scary narrative so, hey, let’s get it out there.
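To make that comparison concrete, here is a minimal sketch. The ±0.05°C uncertainty below is an assumed, illustrative figure, not NOAA’s published error bound; the point is simply that the record margin is smaller than the measurement uncertainty:

```python
# Illustrative only: compare the 2014 record margin to an ASSUMED measurement
# uncertainty (the 0.05 value is a stand-in, not NOAA's published figure).
record_margin_c = 0.02        # 2014 minus the previous highs set in 2005 and 2010
assumed_uncertainty_c = 0.05  # hypothetical error bound, a "multiple" of the margin

if record_margin_c > assumed_uncertainty_c:
    print("Margin exceeds the uncertainty; the record is statistically meaningful")
else:
    print("Margin lies within the uncertainty; 2014 is not meaningfully "
          "different from 2005 or 2010")
```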

NASA, on its website, noted:

The 10 warmest years in the instrumental record, with the exception of 1998, have now occurred since 2000. This trend continues a long-term warming of the planet, according to an analysis of surface temperature measurements by scientists at NASA’s Goddard Institute for Space Studies (GISS) in New York.

That statement, while factually correct, is a little misleading. Depending on your start and end points, you may get a long-term warming trend, or no trend at all. As NASA observes, the ten warmest years in the modern temperature record, with the exception of 1998, have occurred since 2000. But since at least 2000 there has been no trend, as this latest result confirms. Global average temperature has not risen for over a decade, no matter the spin put on the observational record by these agencies.

Meanwhile, the level of CO2 in the atmosphere has risen since about 2000 from an average of about 370 parts per million (ppm) to around 400 ppm, an increase of approximately 8% and still rising. The climate change models, which give dimension to the ‘threat’ posed by the ever-increasing level of CO2 and are critical to the narrative, all overestimate the temperature response to that rise. This probably explains, in part, the eagerness with which this new ‘record’ was released. Without a meaningful increase in temperature soon, support for the entire enterprise might begin to wane as the discrepancy between actual temperatures and forecast temperatures increases. Already, the gap raises serious questions about the validity and usefulness of the models, and that introduces a lot of uncertainty about what an appropriate policy response looks like.

To its credit, NOAA also reports that while the surface temperature observations remain high (a 2/100ths of a degree record, let’s not forget) the satellite measurements of atmospheric temperature did not deliver a similar result. There are two main analytical datasets used to assess satellite measurements, one produced by the University of Alabama in Huntsville (UAH) and another produced by Remote Sensing Systems (RSS). They use slightly different methods, so UAH found that 2014 was the third warmest year for the lower- and mid-troposphere, while RSS found it was the sixth warmest for the same atmospheric zones. UAH measurements extend from 1979 forward and RSS’s extend from 1981. Both found that the 2014 average temperature in the stratosphere was the 13th warmest since satellite measurement began.

According to NOAA:

The stratospheric temperature is decreasing on average while the lower and middle troposphere temperatures are increasing on average, consistent with expectations in a greenhouse-warmed world.

Well, maybe, but what is missing from that statement is NOAA’s own uncertainty with respect to temperature changes in the stratosphere. NOAA is a participant in an ongoing study of stratospheric temperature that hopes to resolve unexpected results from the analysis of satellite temperature observations. On their website they reference an article, “The mystery of recent stratospheric temperature trends”, published in Nature in 2012, the abstract of which reads as follows:

A new data set of middle- and upper-stratospheric temperatures based on reprocessing of satellite radiances provides a view of stratospheric climate change during the period 1979–2005 that is strikingly different from that provided by earlier data sets. The new data call into question our understanding of observed stratospheric temperature trends and our ability to test simulations of the stratospheric response to emissions of greenhouse gases and ozone-depleting substances. Here we highlight the important issues raised by the new data and suggest how the climate science community can resolve them.

The “mystery” referenced in the title of the Nature article is that from about 1995 forward there is no trend – not up or down, just flat and strangely similar to surface temperature results over roughly the same period. Not so cut and dried then, but certainly something worth exploring. Perhaps someone at NOAA should brief the PR team.

Finally, it’s also interesting to consider NOAA’s comments on sea ice extent. Rapidly melting polar sea ice is a much-vaunted marker of a warming planet.

Recent polar sea ice extent trends continued in 2014. The average annual sea ice extent in the Arctic was 10.99 million square kilometers, the sixth smallest annual value of the 36-year period of record. The annual Antarctic sea ice extent was record large for the second consecutive year, at 13.08 million square kilometers.

It seems odd that you’d lead with the “sixth smallest” and wind up with “record large”, considering that the record Antarctic sea ice extent in 2014 was more than two standard deviations from the mean (statistically very significant), while a 2/100ths of a degree change in surface temperature, which most assuredly is not statistically significant, is worthy of a global press release.

The 2014 temperature ‘record’ is pretty small beer, but sufficient it seems to keep the whole climate change juggernaut rolling.

Lance’s Faustian Dilemma


The legend of Faust revolves around a pact with the devil struck by the central character of the same name. A magician, Faust had studied his art for years but had made little progress. Frustrated, he uses black magic to summon the devil, who agrees to assist him in exchange for Faust’s soul. The deal becomes trying for Faust as the devil proves to be a difficult partner. The tale of Faust has taken many forms, but any deal or bargain in which someone relinquishes the rewards of a pure or moral life in exchange for power or material success today is often regarded as Faustian.

Modern tales of success follow a different arc, that of the self-made man. The mainstream media, particularly the business media, and for that matter, scholarly journals devoted to business, often attempt to identify what it is that makes some business leaders more successful than others. They, and the public at large, have a particular fascination with the superstars; people like Steve Jobs, Bill Gates, Richard Branson or Jack Welch. Certainly these business leadership icons appear to have been struck from markedly different moulds, but the prevailing wisdom suggests that they all share certain characteristics and exemplify most or all of them in some combination.

Consider the following list of characteristics that a hiring committee might look for in a potential CEO:

  • Toughness or resilience;
  • Fearsome competitiveness;
  • Intelligence and shrewdness;
  • Singularity of purpose, disciplined;
  • An independent workhorse as well as a great team player;
  • A passion for their vocation or the business;
  • A strategic thinker;
  • Superb at execution;
  • Track record of success – consistently delivers value to shareholders;
  • Great communicator – always on message;
  • A visionary – someone who understands the promise of their brand and delivers on it;
  • Understands corporate social responsibility and embeds it.

Now imagine that you’re chairing a hiring committee charged with finding a new CEO and one of the résumés that you’re asked to consider belongs to someone named Lance Armstrong. Yes, that Lance Armstrong, the famous cancer survivor and now-disgraced cyclist who for years denied that performance-enhancing drugs played any part in his success and who never backed down from a fight with his critics or detractors. Easy decision, no? Just toss his résumé in the garbage can. After all, he’s a fraudster, a liar, a cheat, and he really hurt large numbers of people including teammates, associates and sponsors. He also betrayed the faith of countless people who looked to him for inspiration while they or their loved ones fought their own terrible battles with cancer.

But maybe it’s not that simple. I’m not going to argue that Armstrong shouldn’t pay the price for his misdeeds – many of his contemporaries in the cycling world have been stripped of their medals or titles because they were caught doping and he deserves the same punishment. What I will suggest is that Armstrong’s record of success is at least as impressive as that of many other public figures and he shares many of the same traits that we typically laud as positive contributors to their success. The big difference is that Armstrong chose to make his mark in an arena, professional cycling, where success depended not only on the possession and exercise of those very important traits but also on the willingness of participants to make a modern-day Faustian bargain – you had to cheat if you were to have any chance of winning.

In his confessional interview with Oprah Winfrey, Armstrong stated that he did not believe it was possible, during the time he competed, for him to win the Tour de France seven times without doping. This has been the sorry defence that his supporters have been putting forward for some time: everyone was doing it, so if he doped, he only did it to level the playing field. It is a morally bankrupt position and indefensible. But let’s set morals aside and consider the journey that led Armstrong, and large numbers of his peers, to a meeting with the devil and a decision whether or not to follow him down a different road.

Cycling is a grueling sport and the Tour de France an almost inhuman event. The foundation for success in cycling, assuming one is blessed with a suitable physique, is years of arduous training and almost unimaginable sacrifice. Cyclists are whippet-lean and must live an ascetic life. They must avoid excess weight at all costs because it takes too much energy to carry that weight over long distances and up punishing climbs. They must have the ability and the will to suffer. They must be able to respond at the moment they feel they have nothing left to give, with another push, another gear. You need only look at photos or videos of cyclists grinding up a climb or trying to outrace their competitors in a stage-ending sprint to appreciate how physically and mentally tough they have to be.

The marquee event in cycling is the Tour. Run over twenty-three days – twenty-one days of stages and two rest days – and covering about 2,000 miles, it also requires riders to climb thousands of feet in the Alps and the Pyrenees. It is a storied event and many of its champions are the stuff of legend. The Tour is not the only great cycling test; the Giro d’Italia and the Vuelta a España are similarly challenging, though probably less well known, at least to casual followers of the sport. For Lance Armstrong, winning the Tour was all that mattered – everything else was just preparation.

The Tour has suffered the taint of alleged and proven cheating by competitors almost since its inception in 1903. As in many other power and endurance sports, athletes have sought advantage through the use of various drugs, sometimes with tragic consequences. In 1967, British rider Tom Simpson collapsed while climbing Mont Ventoux and died. He was seriously dehydrated, in part because Tour regulations at the time limited riders to two bottles of water per stage, but an autopsy also revealed the presence of amphetamines in his blood and amphetamine pills were found in the pockets of his jersey. Two-time Tour winner and mid-20th century cycling legend Fausto Coppi, when asked in an interview if he had used drugs to enhance performance, replied “yes, whenever it was necessary”. Asked when it was necessary, he responded “almost all the time”. In Coppi’s day, it should be noted, drug use was permitted.

Doping, then, has many precedents within the sport and it would be hard to know with certainty which legendary performances were “clean” and which were not. Different drugs and techniques have fallen in and out of favour among competitors as regulations and testing have evolved. The challenge for users has always been maximizing the benefits while minimizing the risk of detection. By the time Lance Armstrong became a professional and set his sights on the Tour, erythropoietin (EPO), testosterone and human growth hormone (HGH), as well as other drugs, were widely used, but it appears that EPO and blood doping were the keys to achieving significant gains in performance.

EPO used to enhance performance is a synthetic version of a naturally occurring protein. It works by stimulating the body to produce more red blood cells, which increases the capacity of the blood to carry and deliver oxygen to hard-working muscles, thus increasing endurance. In a way, the 1990s were a perfect storm: there was no test for EPO until 2000, so the only way its use could be regulated was by measuring its effect on haematocrit levels, i.e., the percentage of blood made up of red blood cells. That level varies from person to person and is typically around 41% – 42%, but the regulators decided that as long as a rider’s haematocrit did not exceed 50% they were “clean”. If their haematocrit was found to be in excess of 50%, they were suspended for fifteen days, barely a slap on the wrist. This was almost an invitation to riders to use EPO. It’s easy to understand the temptation of using an undetectable drug that, in simple terms, would increase oxygen-carrying capacity by a fifth.
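For a rough, back-of-the-envelope illustration of that ‘fifth’, treat oxygen-carrying capacity as simply proportional to haematocrit (an obvious simplification) and use the levels quoted above:

```python
# Rough illustration: treat oxygen-carrying capacity as proportional to
# haematocrit (a simplification) and compare a typical level to the 50% cap.
typical_haematocrit = 0.415   # midpoint of the 41%-42% range quoted above
permitted_cap = 0.50          # the regulators' threshold

relative_gain = (permitted_cap - typical_haematocrit) / typical_haematocrit
print(f"From {typical_haematocrit:.1%} to {permitted_cap:.0%} is roughly a "
      f"{relative_gain:.0%} gain")   # about 20%, i.e. a fifth
```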

It is also easy to imagine the impact that such an increase in oxygen-carrying capacity would have on the performance of an already highly trained and incredibly fit athlete – and the advantage it would confer over non-using rivals. Former Armstrong teammate Tyler Hamilton in The Secret Race: Inside the Hidden World of the Tour de France: Doping, Cover-ups, and Winning at All Costs, described how he and his teammates, including Armstrong, were not just beaten, but destroyed in the races they contested when they first attempted to scale the heights in European cycling without chemical assistance. Hamilton also outlines how, at some point, any new rider would come to the fork in the road where they must decide whether to hold to their principles, their possibly naïve beliefs and aspirations of doing it clean, or start following the path that so many others had already chosen. Perhaps what sets Lance Armstrong apart from the rest is that he didn’t just want to level the playing field, he appears to have wanted to tilt it in his favour. And so he did, with stunning results.

Armstrong became a modern-day legend with seven consecutive Tour victories and an Olympic bronze medal. He became his own brand and leveraged his fame to establish the Livestrong charitable foundation. Through the former, he influenced thousands, if not millions, of people to take up cycling and boosted the profile of the sport, at least in North America, to a previously unimaginable level of popularity. And, through the latter, he gave hope to countless other people that they, as he had done, could overcome cancer. He became one of the highest-earning athletes anywhere and enjoyed multiple commercial endorsements. And he did it all by out-competing his opponents on and off the bike, and by relentlessly adhering to his no-dope narrative and castigating, or worse, those who dared to cast doubt on it.

This, however, is not a simple, black and white story. Yes, he’s been exposed and has confessed to most, if not all, of his wrongdoing. And it’s true that he took money under false pretences and that he’s likely going to be forced to return some of it. It’s also true that he treated some people, including some who were close to him and who aided his success, very badly. What’s remarkable is how moral outrage has soared as rapidly as Armstrong has fallen. Remarkable because, on balance and for all his deception, it could be argued that he did more good than harm. Unlike the bankers, still unpunished, who perpetrated a fraud so massive that it all but dropped the global economy to the floor and caused harm to millions of innocent people, Armstrong’s biggest offence is that he misled his sponsors, the world of cycling and his supporters. Part of the narrative is still true; he is a cancer survivor who beat the odds spectacularly and, irrespective of whatever tangible impact his charity has had on cancer research and support, he provided a positive message that many people have borne witness to.

The problem for Armstrong, consistent with the story of Faust, is that once he made his pact with the devil the deal did not end there. With each passing year and Tour victory he was forced to tie his own narrative into ever more convoluted and restrictive knots. The myth of Lance was like an inverted pyramid, a huge structure resting on a single point – the lie that he must propagate and defend in perpetuity if the whole edifice was not to ultimately tip over and crush him.

His sponsors, in the wake of the USADA (US Anti-Doping Agency) report and the subsequent stripping of his titles by the sport’s governing bodies, have all jumped ship, which means that, for the most part, they enjoyed the benefits of the pre-fall Armstrong reputation and have now been able to neatly sever their ties to him and avoid any blowback. His foundation, too, has ended its relationship with him. If we think of these entities as shareholders of the enterprise he headed up, they’ve generally done pretty well with their investment over time.

There is evidence to suggest that many successful corporate executives are psychopaths or have psychopathic tendencies. They willfully deceive people with a notable lack of remorse, they lack empathy, and they place their own interests at the centre of all that they do. Is this behaviour so different from Armstrong’s? And is it possible that some of these people’s success is contingent on making a Faustian bargain, as Armstrong’s was? The difference between Armstrong’s behaviour and that of many other successful people is the scale and highly public nature of his deception. Strike his messy public downfall from the record and Armstrong’s résumé ticks all the boxes – he’d be one hell of a candidate for that vacant CEO’s seat. In plain, black and white, moral terms, Armstrong’s conduct was inexcusable. From a different perspective, perhaps this is just life imitating art. The rewards, both real and imagined, were so enticing that he made the deal – and now, tragically for both Armstrong and all those who believed in him, he has been called to account.

And Passion, Too


I recently used the word ‘passion’, my location, and nothing else to search a national job board. It returned 2,540 local jobs that required applicants to be passionate; not just generally but in quite specific ways. Potential candidates were expected to have a passion for: results, life, excellence, quality products, travel and the outdoors, the culinary world, shoes and fashion, education, quantitative analysis, and almost anything else you can think of.

Passion has no single meaning, but generally it is associated with intense, often overpowering feelings. The Merriam-Webster online dictionary includes in its definition:

A. the state or capacity of being acted on by external agents or forces

B. emotion

C. plural: the emotions as distinguished from reason

D. intense, driving, or overmastering feeling or conviction

E. an outbreak of anger

F. ardent affection: love

G. a strong liking or desire for or devotion to some activity, object, or concept

H. sexual desire

I. an object of desire or deep interest

This is heady, potent, maybe even volatile stuff! I guess it’s safe to assume that most employers are probably seeking to hire people with feelings of “love” for, or “devotion to some activity, object or concept”, as opposed to someone possessed by “anger” or whose actions are governed by “external agents or forces”.

The range of opportunities pulled up by the search was broad, from low-wage, relatively unskilled jobs to executive positions. Expecting passion from a highly paid executive responsible for leading the way through turbulent economic waters seems reasonable; it comes with the territory. At the other end of the spectrum, though, maybe the bar should be set a little lower.

One organization is looking for “Maids” who have a passion for “ECO green cleaning”. It’s a special person who is stirred by the thought of whisking out a commode with a natural bristle brush and a biodegradable cleanser, or who looks forward to cleaning an oven without benefit of powerful chemical agents. There was also an opportunity for a “Meat Wrapper”. Only artisans who can get excited when the plastic film is perfectly tensioned over the pork chops and the price tag is precisely aligned with the edges of the foam tray need apply. For either job, I’d have thought that someone who showed up ready for work the second day would have met requirements; if they showed up raring to go, they’re possibly overachieving.

At some point, when those of us not working in HR weren’t looking, passion apparently crept up and overwhelmed more mundane qualifications like actual skills, professional experience or academic achievement. If you search on “passion at work”, Google will return more than 400 million results. Sample just a few and you’ll discover that passions run high about work and passion. There appear to be two camps – one that believes pursuing your passion will lead to career success and happiness and the other that contends passion for work is developed over time in the doing of that work. I find myself leaning toward the second camp while having sympathy for the idealism of the first. I’ve always believed that real passions find us, not the other way ’round. And those passions or pursuits may offer very limited opportunity for financial or material gain, but that’s not what they’re about.

In November, I attended the final round of the 2012 Canadian Rally Championship, The Rally of the Tall Pines, held near Bancroft, Ontario. It was a cold day and the Tall Pines is a tough event. 2012 series champions, Antoine L’Estage and Nathalie Richard, after leading most of the rally, had to settle for second place when the right rear suspension on their Mitsubishi collapsed late in the event. Their crew, working in the sub-zero evening cold under a canvas shelter, managed to repair the car during the final half hour service stop allowing them to finish. This type of drama was played out over and over again during the course of the event.

Rallying in Canada is kind of a fringe sport. There are few people making a living at it. The top teams are professional and funded with corporate sponsorships. The majority of teams, however, are self-funded amateurs with limited commercial backing. But I’d wager that virtually all the participants and those that support them – usually friends and family – go rallying because it is their passion. These people throw their hearts and souls (and wallets) into prepping their cars, trailering them across the country and then thrashing them on the rough backcountry roads where rallies are typically held. It’s a difficult, demanding and costly sport but you know the people involved love the challenge, just love to be there, and any reward for their effort is secondary.

The rally scene is perhaps a microcosm of the working world; a handful of people get paid to work at what they truly love, the rest pursue their passions on their own dime. There is an axiom that holds “we should do what we love and the rest will follow”. It’s a wonderful ideal, but seriously useful only as a rough guide. The whole social contract exists so that essential stuff will get done that otherwise wouldn’t – there will never be enough people around who are passionate about wrapping meat or green eco-cleaning to satisfy demand.

So, in the context of employment, maybe we need to be a little less cavalier in our use of the word passion. No one would dispute that it’s reasonable for employers to expect effort and commitment from employees – passion, not so much. If someone arrives at that magical intersection between what truly stirs them and an employment opportunity, that’s a great but probably rare thing.

I think employers should worry less about finding a candidate ablaze with passion for “X” and put more effort into figuring out how to create the conditions where a little spark can be nudged into a flame and then into a good steady burn. The kind of person willing to lie on their back in a frozen field servicing a damaged rally car for the chance to help achieve a good result is the kind of person who likely goes the extra mile in all they do. Employers should stop waiting for passion to walk in the door, and start figuring out how they can generate it within their organizations. All matches in a box have the same incendiary potential; without the right set of conditions, none will start a fire.

Usage Based Insurance – Big Brother Just Might Be Your Friend

A 2011 Towers Watson white paper, “The brink of a revolution – Usage based insurance promises profound change”, laid out a vision for auto insurance that the authors reported is fast becoming reality (http://www.towerswatson.com/assets/pdf/4583/1104_Towers_Watson.pdf). Usage-based insurance (UBI), sometimes called pay-as-you-drive (PAYD), is simple in concept but somewhat more complex to deliver. The basic idea is that drivers’ insurance premiums should be calculated based on their actual driving behaviour, rather than by relying exclusively on proxies such as age, gender or number of years licensed, or self-reported data such as annual distance driven. The trick is being able to reliably gather sufficient, meaningful data on each insured driver to allow usage-based pricing.

After something like a decade of experimentation by insurers and technology companies around the globe, the means to gather the data is both robust and inexpensive enough to support large-scale deployment of UBI. According to Towers Watson, insurance companies representing more than 60% of the U.S. market now have, or are in the process of developing, UBI programs. The telematics technology employed to deliver PAYD uses sophisticated electronics to measure speed, time of travel and length of journey, rate of acceleration/deceleration, cornering force and, in some instances, vehicle location. These data are acquired in real time as the vehicle is driven, and then transferred wirelessly at journey’s end to the insurers for storage and subsequent analysis.
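As a purely hypothetical illustration, a per-trip record uploaded by such a device might look something like the following (the field names are invented for the example; actual vendor formats vary):

```python
# Hypothetical example of a per-trip telematics record. The field names are
# invented for illustration and do not reflect any particular vendor's format.
from dataclasses import dataclass
from datetime import datetime

@dataclass
class TripRecord:
    device_id: str
    start_time: datetime
    end_time: datetime
    distance_km: float
    max_speed_kmh: float
    hard_braking_events: int        # decelerations beyond some threshold
    hard_acceleration_events: int
    late_night_minutes: int         # e.g., minutes driven between 11 PM and 5 AM

# One morning commute, uploaded at journey's end for storage and analysis
trip = TripRecord("OBD-0001", datetime(2012, 6, 4, 7, 45), datetime(2012, 6, 4, 8, 20),
                  28.4, 112.0, 1, 0, 0)
```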

Many telematics devices are designed for easy installation, often by the vehicle’s owner, into the onboard diagnostics port that became standard on virtually every vehicle manufactured and sold in North America about fifteen years ago, and in Europe about a decade ago. Some more complex systems require installation by qualified technicians, but this is less and less the case as ongoing development has succeeded in squeezing more and more functionality into the cheaper plug-in devices. Perhaps more importantly, vehicle manufacturers and original equipment suppliers have been hard at work on telematics hardware and software to support a variety of needs, and the day is probably not far off when the capability to support UBI will be either built into or available as an option on all new vehicles.

Insurers have long known that the more you drive the higher the probability that you will one day be involved in an accident. Similarly, they know that driving at certain times of the day, or days of the week, carries greater risk of accident than driving at other times. Consider, for example, how different the experience of driving early on Sunday morning is when compared with driving in rush hour on any weekday. With UBI, insurers can clearly identify those drivers who rarely drive in weekday rush hours or late at night on weekends (another high-risk time) and who also drive just a few thousand kilometers annually. With conventional insurance products, the premiums paid by these drivers differ little from those paid by their neighbours who commute daily in rush hour and then pile on the kilometers driving to the cottage every summer weekend. Using UBI, insurers can offer substantial discounts to demonstrated lower risk drivers. The flip side is that higher risk drivers will have to shoulder heavier premiums; ultimately, a fairer model than what we have today.
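As a toy sketch of how those usage measures might feed into a premium calculation (the base premium, weights and thresholds below are invented purely for illustration, not drawn from any insurer’s actual model):

```python
# Toy usage-based premium adjustment. The base premium, weights and thresholds
# are invented purely for illustration, not drawn from any insurer's model.
def adjusted_premium(base_premium, annual_km, rush_hour_share, late_night_share):
    """Discount low-mileage, off-peak drivers; surcharge high-exposure drivers."""
    factor = 1.0
    factor += 0.10 * (annual_km / 20000 - 1)   # mileage above or below a 20,000 km norm
    factor += 0.20 * rush_hour_share           # share of driving in weekday rush hours
    factor += 0.30 * late_night_share          # share of driving late on weekend nights
    return base_premium * max(factor, 0.5)     # cap the total discount at 50%

# A low-mileage, off-peak driver versus a daily rush-hour commuter
print(adjusted_premium(1500, 5000, 0.05, 0.00))    # ~1402: a meaningful discount
print(adjusted_premium(1500, 25000, 0.60, 0.05))   # ~1740: a surcharge
```

The sketch only illustrates the idea that exposure measures such as annual distance and time-of-day shares translate naturally into discounts or surcharges; real actuarial models are far more sophisticated.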

So why hasn’t this happened already? The principal factors have been cost and a lack of real-world data to convince insurers they should add UBI to their range of products. Momentum is gathering, however, as costs decline and other factors come into play. One of the ancillary benefits of UBI is that it makes people more aware of their vehicle usage, and in an age where emissions and resource use are of increasing concern this is seen as a positive. UBI is also thought to have the potential to positively influence people’s driving behaviour. For example, if a driver understands that excessive speeding will impact their insurance premium they may be inclined to ease up – a safer practice but also good for the environment and fuel consumption.

The main opposition to UBI has focused on its potential impact on personal privacy, its ‘Big Brother’ aspect. For this reason, many insurers have consciously decided not to incorporate GPS functionality into the telematics devices they use. Although where you drive may be an important risk factor, gathering location data is regarded by many as too invasive. Most responsible drivers who don’t drive a lot will likely be quite happy to share their other driving data in exchange for a double-digit percentage reduction in their insurance premium. That’s what the pioneering UBI providers have been discovering. Most regulatory bodies too, are comfortable with, or even openly supportive of the UBI model and see it as a means for providing much needed premium relief to responsible drivers.

In the U.S., the telecom companies have begun to recognize UBI as a potential new revenue source. Sprint’s Emerging Solutions group recently announced that it is capable of providing end-to-end solutions for insurers wanting to get into the game (http://www.youtube.com/watch?v=JSNJkeKwS08&feature=player_embedded). Sprint can look after shipping, installing and connecting devices to its network and can then deliver the collected data either directly to an insurance company or via a cloud-based platform. Sprint further claims that it can implement a pilot program of up to fifty vehicles in thirty days or less. Insurers can also buy off-the-shelf analytics to facilitate building a UBI pricing model or utilize their own in-house actuarial expertise. Allstate, which has a UBI product, believes that PAYD will have about 20% of the U.S. market in five years’ time (http://online.wsj.com/article/SB10001424052702303901504577462382411135766.html).

Ready to buy? Well, you’ll likely have to wait a bit if you live in Canada. To date, only one insurer, Aviva Canada, has offered UBI to its customers, and only on a trial basis; it pulled the plug on its program in 2011. Other companies may well be looking at offering a PAYD product, but Canada’s insurance industry is probably better known for stifling over-regulation than for being a hotbed of customer-centric innovation. As in other arenas, we won’t be pulling the bandwagon but we’ll probably hop on when the parade is well under way.

Why environmentalists and policymakers should read auto mags

I’m not sure what environmentalists and policymakers read in their spare time but I’d like to suggest they could do worse than read the odd car magazine. I’ve been reading them for years and have found them to be a window into emerging social trends, technologies and how government and industry can work together to improve the lives of ordinary citizens. For example, I first heard about cell phone technology in Road & Track. Other auto publications have helped expand my knowledge of recycling, energy use and pollution control – and all while being entertained.

The auto industry is often vilified by people who are passionate about living in a more environmentally and socially responsible way. For some, the car is a symbol of everything that is wrong with our modern industrial culture. There is no denying that private vehicles, for all the economic, convenience, and other benefits they provide, also impose a significant burden on the planet’s finite resources. And car accidents claim lives or leave people permanently injured, imposing serious social and economic costs. Still, though, the car genie is long out of the bottle and won’t be put back soon, or easily, so we really have to be clever about how we live with the genie now and in the future.

The car was born of ingenuity, coincident technological developments and entrepreneurial spirit. The industry’s early and rapid growth and its ascension to a place of prominence in our modern economy bear testimony to the power of capitalism and free markets. Automobiles directly or indirectly transformed everything, from the physical landscape to how we live, work and play. Over time, and as vehicle ownership and use became pervasive, some of the negative aspects, e.g., tailpipe emissions or safety concerns, became more and more apparent and less and less tolerable.

Even if you view the car as a problem and an impediment to realizing a safer, greener world, there are valuable lessons to be learned from our now century-plus relationship with it. Over the past 40-50 years, the industry and its products have been transformed. This was achieved by identifying specific targets and then enacting regulations that forced automakers to respond. For example, tailpipe emissions of various gases and particulates have been dramatically reduced in response to California-led legislation that over the years established ever more stringent standards. The smog over Los Angeles and other major urban centres – a visible manifestation of auto use – was the catalyst for massive, positive change. The US Environmental Protection Agency reports “…emissions from a new car purchased today are well over 90 percent cleaner than a new vehicle purchased in 1970.” Even with more vehicles on the road, the total environmental burden for specific emissions is lower (http://www.epa.gov/airquality/peg_caa/carstrucks.html).

Great strides have also been made in recycling vehicles that have reached the end of their useful life. In 2000, the European Union issued a vehicle end of life directive and there have been many other initiatives on the part of governments and automakers globally aimed at reducing the volume of material that goes to landfill. Today, attention is paid to this issue from the design stage forward.

Similarly, highway death tolls and other safety concerns prompted the introduction of regulated standards, so that modern automobiles are far more crash-worthy than their predecessors (check out this crash between a ’59 Chevy Bel Air and an ’09 Chevy Malibu staged by the Insurance Institute for Highway Safety: http://www.iihs.org/50th/default.html). In addition to all of the “passive” safety features, e.g., collapsible steering column, seat belts, deformable structures and air bags found in modern vehicles, there are also “active” safety systems such as anti-lock brakes and electronic stability control which improve drivers’ chances of avoiding collisions in the first place.

Currently, automakers are focusing considerable attention on fuel economy. The US (Canada is following) is ramping up its fuel economy standards dramatically as a means for reducing greenhouse gas (GHG) emissions. The objective is straightforward – reduced fuel use and, therefore, lower carbon emissions – even if the solution is not.  Automakers are working on combustion technology, aerodynamics, new weight-saving materials and construction techniques, and other technologies to meet the regulatory requirements. They’re even building electric vehicles. The market will also play its role in shaping the final solution(s).

This approach differs in a very important way from other approaches such as the hotly debated idea of a carbon tax. People may or may not respond as hoped depending on their willingness and ability to shift consumption away from other goods so as to maintain their same level of energy consumption. A carbon tax is a blunt instrument that will potentially impose hardship on those who can least afford it. The objective of a tax is to raise money; taxes have generally been ineffective instruments for achieving other policy objectives. Governments are notorious for collecting revenues for a specific purpose and then redirecting the funds toward unrelated objectives.

Governments also founder when they attempt to pick solutions rather than set targets and regulate industry participants. In a number of jurisdictions globally, in an effort to reduce GHG emissions associated with electricity generation, governments picked wind and solar power as the solution. They have then imposed taxpayer-funded subsidies to try to make their solution work, distorting the market in the process while also harming their domestic economies. In many cases, they simply shift emissions to a neighbouring country from which they buy power to make up for generation shortfalls.

Cars continue to play an important role in our everyday lives and in the global economy. That is why environmentalists and policymakers should read car magazines. They might then understand how regulated targets and market responses can, and do, deliver meaningful, positive outcomes. They would get some reasonable insight into the massive progress automakers have made toward making their products more socially and environmentally responsible and how that could be effected in other industries. They might also understand how consumers are willing to play along or even advocate for change when they have access to information about the intent and results of the process.