I haven’t posted anything to the Black Swan Blog for 8 months. One reason is that I have had a few other projects that have really monopolized my time. But the other reason is that I usually write blog posts in response to something that interests me in the world of renewable energy. Frankly, not much has been happening since last summer.
I was pretty certain that Texas would encounter severe problems because of the fluctuations in its wind energy generation. In fact, they had no problems at all last summer. That was helped by the addition of 1.3 GW (net of reductions) of natural gas plant capacity (according to the December 2014 Report on the Capacity, Demand, and Reserves) and a maximum peak demand of only 66.5 GW in August 2014, as compared to a peak demand of 67.25 GW in August 2013 and an all-time peak of 68.3 GW in August 2011.
I did find it interesting to note that ERCOT has now redefined the peak capacity percentages for wind resources. This is essentially the percentage of nameplate wind capacity that can be relied upon during peak demand times. Based upon 6 years' worth of data and a large installed generation base, this value is 12% for onshore resources in the summer and 19% in the winter. For offshore resources the values are 56% in the summer and 36% in the winter. Unfortunately the vast majority of Texas wind farms are onshore, and peak demand is in the summer.
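Those percentages make for a quick back-of-the-envelope calculation of "firm" wind capacity. Here is a minimal sketch; the 14,000 MW nameplate figure is my own illustrative assumption, not a number from ERCOT or from this post:

```python
# Capacity credits from the ERCOT figures quoted above.
CAPACITY_CREDIT = {
    ("onshore", "summer"): 0.12,
    ("onshore", "winter"): 0.19,
    ("offshore", "summer"): 0.56,
    ("offshore", "winter"): 0.36,
}

def firm_capacity_mw(nameplate_mw, resource, season):
    """Nameplate capacity discounted by the seasonal peak capacity credit."""
    return nameplate_mw * CAPACITY_CREDIT[(resource, season)]

# A hypothetical 14,000 MW of onshore wind counts for only 1,680 MW
# toward meeting the summer peak.
print(firm_capacity_mw(14_000, "onshore", "summer"))  # 1680.0
```

In other words, a small fraction of the turbines you build is all you can count on when demand peaks on a hot summer afternoon.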
Some of my most popular blog posts have been about the experience with renewables in Hawaii, and on that topic there have been a number of interesting developments. In many ways, at least where solar energy is concerned, Hawaii is on the bleeding edge in dealing with both the opportunities and the problems associated with incorporating large amounts of solar energy into its utility grid.
The issues I raised in an earlier post have started to reach a critical stage. The “success” of the rooftop solar program has brought many of the circuits in the state to the point where they are at risk of becoming unstable, possibly leading to failures or equipment damage. As a result new permit requirements have been put in place, and in the last quarter of 2014 there was a dramatic drop in the number of rooftop solar installations, continuing a decline that started in January 2013. (Note: I am not a big supporter of rooftop solar, even in Hawaii, for a number of technical and social-equity reasons that I have discussed previously. The lawsuit between SolarCity and the Salt River Project will determine whether or not a fixed infrastructure charge for solar panel owners will hold up in court.)
In another blog post I was very critical of the Hawaiian Electric Company’s approach to renewable energy. It seemed to me that they didn’t have a realistic plan and had essentially lost their way with regards to creating a sustainable energy environment. So it was no surprise to me that the company was sold to NextEra in December 2014. NextEra brings economic clout and a track record of successful renewable projects to the Aloha State utility, but it will also bring a focus on the “bottom line” that was missing.
Meanwhile, Kauai Island Utility Co-op (KIUC) is taking what I believe is a much different and better approach to the development of solar power. A major focus has been on utility-scale solar installations. In December, 2012 the largest solar installation in Hawaii came on-line in Port Allen. On a sunny day the 6 MW facility is able to provide almost 10% of Kauai’s daytime energy needs.
KIUC recognized that solar power output can vary by as much as 70-80% because of passing clouds. As a result the Port Allen facility was designed to have a large battery backup component that could compensate for short-duration power drops. Through real-world operational experience they found that their initial battery configuration could not stand up to the rapid cycling experienced when trying to stabilize solar power. As a result the utility is replacing the lead-acid batteries in the initial configuration with lithium-ion batteries.
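The short-duration smoothing role those batteries play can be sketched as a simple ramp-rate limiter: the battery charges or discharges to cover the difference between the raw solar output and a rate-limited grid injection. This is a toy illustration with made-up numbers, not KIUC's actual control scheme:

```python
# Toy ramp-rate limiter: the grid sees output that can change by at most
# max_ramp_kw per step; the battery covers the gap. Numbers are illustrative.
def smooth(solar_kw, max_ramp_kw):
    """Return (grid output, battery contribution) per time step."""
    grid, battery = [], []
    prev = solar_kw[0]
    for p in solar_kw:
        delta = max(-max_ramp_kw, min(max_ramp_kw, p - prev))
        prev = prev + delta
        grid.append(prev)
        battery.append(prev - p)  # positive = battery discharging
    return grid, battery

# A passing cloud cuts output from 5,000 kW to 1,500 kW in one step:
grid, batt = smooth([5000, 5000, 1500, 1500, 5000], max_ramp_kw=500)
print(grid)  # [5000, 5000, 4500, 4000, 4500]
```

The rapid charge/discharge cycling visible in the battery series is exactly what wore out the original lead-acid bank.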
KIUC has not been deterred by the operational problems it has experienced. The utility is taking the sensible position that these kinds of issues can be expected when trying to really push a new technology. They are in the process of commissioning an additional 24 MW of utility-scale solar, which will provide up to 80% of Kauai’s daytime electrical needs. And if the new batteries prove to be cost-effective, the utility can start to extend the impact of the solar arrays by releasing stored energy to the grid in the late afternoon and early evening.
As far as I am concerned KIUC is on the right track. Now if only they would combine a Concentrated Solar Power (CSP) plant with their PV installations they could provide solar power 24 hours a day, as is done at the Gemasolar plant in Spain.
I have been biking to work from time to time for many years, first in Calgary (which has a truly awesome bike path system) and now in Vancouver (which has a pretty pathetic one despite lots of hype and money spent on downtown bike lanes). But I have never biked more than a few times per month on a consistent basis.
Here at the Black Swan Blog I have discussed lots of interesting applications in the field of electrically-powered transportation: everything from cars to planes and boats. I have also been keeping a close eye on developments in the area of electric bikes.
The announcement of the e-bike by the folks that make Smart Cars seemed to me to legitimize the whole concept. And after I published my blog on electric cars my good friend and high-tech guru Steve Darden pointed me in the direction of the Copenhagen Wheel. This looks like a fascinating concept, but unfortunately (?) they have been so successful at generating interest that they have deferred actual delivery of their product until 2015. (By the way, do not visit Steve and Dorothy’s site unless you are really comfortable with escape fantasies and feelings of envy.)
I started looking really seriously at electric bikes, and after doing a bit of research I paid a visit to Evolution Bikes and test-drove a couple of different bikes manufactured by BH Bicycles. Although the brand is best known for racing bikes, their line of electric bikes offers outstanding technology and design.
These bikes are “pedal assist” meaning you have to actually pedal to engage the electric motor. But with the BH bikes you won’t be breaking a sweat no matter how fast you pedal. The bike senses the pedal pressure and provides more boost the harder you push – you feel super-human as you climb moderate hills at 25-30 kph.
In the end I purchased a BH EasyMotion Neo Carbon like the one pictured above. At about 20 kg it can easily be carried on a bike rack and the 30 gears mean that the bike is a very comfortable ride even without the electric assist (I can’t say the same about the Smart e-bike which felt slow and clunky under electric power and was really not a fun time in manual mode).
The best part about having the electric bike is that it allows me to use my Cannondale Synapse Road bike more as well. I have an 18 km commute each way with several large hills and a rather long and nasty bridge to contend with. I found biking both ways on the Cannondale took just a bit too much time and energy for me to do on a daily basis. But now I am able to take the Cannondale one way and the EasyMotion the other.
Since buying the electric bike I have only driven to work once. The 4,000 lb van can now be used for what it is really good at – hauling around 5 adults plus dog and luggage when we go on family outings and road trips.
Biking every day has been good for my fitness level too. This past week my wife brought to my attention the Vancouver Rotary Club Bike-A-Thon in aid of Deaf and Hard of Hearing in British Columbia. It is a 120 km ride and I now feel that is something I can take on (we’ll find out if that is true on July 13). Of course that will be on the Cannondale, not the electric bike.
P.S. Should you be interested in supporting me on this ride for charity you can make a donation at my Ride-A-Thon page.
Earlier this month the Black Swan Blog registered its 40,000th "Read" on energyblogs.com. I took the opportunity to compare the most popular blogs now with those from August 2013, when I celebrated 20,000 "Reads". I found it interesting to note that only 4 of the Top Ten from 8 months ago are still in that group of most popular blog postings. For example, the blog I wrote after the first anniversary of the Black Swan Blog has moved very quickly to sit at number 4. The complete list of the current Top Ten follows:
We should use Concentrated Solar Power ONLY after sunset
This posting is the only one to have maintained its position in the ranking. In this posting I discuss how photo-voltaic solar panels could be used in conjunction with a Concentrated Solar Power (CSP) plant to provide relatively economical and dispatchable power.
Post-secondary Institutions Harvest Underground Energy
This posting climbed one spot. It describes how Post-Secondary Institutions are using Geoexchange (sometimes referred to as Geothermal) systems to provide heating and cooling to their campus buildings. In my opinion all new commercial and industrial buildings should be required to implement Geoexchange systems, which use about half the electrical power of traditional HVAC systems.
What if "Climate Change" is the next "Y2K"?
This posting from July, 2013 is new to the "Top Ten". It might seem from the title of this blog that I don't believe that climate change is real or that it is at least partially caused by burning hydro-carbons. That is not the case. My point in this posting is that the considerable "hype" around climate change may be distracting us from focusing on the real fundamental problem – we are consuming non-renewable resources in an unsustainable way. I am concerned that should the climate change concerns cool down we would lose interest in what I believe to be the more important problem.
Reflections on one year of blogging
This posting from October, 2013 discusses my experience with blogging. I am a little surprised at how fast it has surpassed much older postings.
Hawaii Renewables Facing Cross-Currents and Headwinds
This posting has slipped from the #2 spot but still remains very popular. In it I discuss some of the opportunities and challenges facing the renewable industry in Hawaii. In many ways the Aloha State is leading the movement to a more sustainable future but I am not convinced that they are taking the optimal approach.
The Next 5 years for Renewables – A Best Case Scenario
This posting from July, 2013 is new to the "Top Ten". It is the second of a pair of postings where I speculate about developments in the renewable energy industry. Interestingly, the "Worst Case Scenario" posting is much less popular. I guess that means we are either eternally optimistic or unwilling to "face the music".
Power Generation – There's No Place Like Home
This posting slipped from the #4 spot. It discusses several approaches to reducing residential energy consumption and distributed generation such as solar panels, Geoexchange, and community wind projects. I need to revise this posting soon because my views on roof-top solar panels have changed.
How Much Battery Storage is Enough for Roof-Top Solar Panels?
This posting is also new to the "Top Ten". It discusses the ebb and flow of power between the utility grid and a residential roof-top solar array. It provides a link to a couple of tools that can be used to estimate the amount of battery storage needed and the net electricity generated at different latitudes.
The Hawaiian Electric Company Integrated Resource Plan – Welcome to Fantasy Island!
When HECO published its IRP I was surprised at how unrealistic the assumptions and development plans were. I reference an independent consultant's report that takes issue with the almost exclusive focus on computer-generated models of supply and demand. As far as I am concerned this plan is not realistic.
The Fright Before Christmas
This is my Christmas blog posting from 2012 – it has grown steadily in popularity.
Thanks for your continued interest and support.
Almost exactly 6 months ago The Black Swan Blog celebrated its 1st anniversary. It is about to achieve another milestone – 40,000 “reads” on the first site that I began blogging at – energyblogs.com.
At the same time that I was wondering if I should acknowledge this milestone I was attending the BCNET Conference in Vancouver which had the theme “Building Value Through Collaboration”. The presentations by two of the keynote speakers made me consider the value of my blogging and how I might be able to increase that value.
Jesse Hirsh made the case that traditional sources of “authority” from governments to professors to the legally recognized “professions” (i.e. doctors, lawyers, engineers, etc.) are being challenged and in many cases discarded or ignored. He suggested that we are entering an age in which a person’s expertise and their ability to influence will allow them to become a “Cognitive Authority”. And he further suggested that by identifying such “Cognitive Authorities” each of us can “tune in” to the “signal” – the useful information – that is becoming more and more difficult to separate from the cacophony of facts and opinions that we are bombarded with every day.
The key message I took from Jesse’s remarks is that there is an important role and even a responsibility for those of us who express opinions on forums such as The Black Swan Blog. In order to earn the right to be considered a “Cognitive Authority” we need to provide significant value for those that choose to read the material that we post.
As far as I am concerned that “value” must incorporate the following principles:
- The information should be presented with the intent to broaden the reader’s perspective and base of knowledge on the topic being discussed. There are many sides to every story and it is perfectly normal and even useful to forcefully support one point of view. But that point of view must be reasonable. In our legal system we use the concepts of “preponderance of the evidence” and “balance of probabilities” to determine the truth about a body of factual evidence. With The Black Swan Blog I mentally test the ideas I am writing about using those concepts before I publish a blog posting.
- The information and opinions should be original to at least some extent. I would much rather provide a link to an existing “in depth” study than simply regurgitate ideas that have been expressed previously. The ability to easily refer to other material on the Internet is perhaps the most powerful new capability available to us in the information age. In my postings I make a real effort to refer to original material wherever possible.
- In responding to comments from readers it is essential to be respectful in all cases and positive and supportive of comments that provide additional insights into a topic, whether or not they support the premise of the posting. However, there is also an obligation to firmly refute statements that are clearly factually incorrect or deliberately misleading. I would also not hesitate to delete any comment that belittles or is otherwise disrespectful of another participant in a discussion.
Another keynote speaker at the BCNET Conference was Dr. Alec Couros, Educational Technology & Media Professor. Alec challenged the audience to truly “think different” about approaches to learning at all levels. He used the example of the Nyan Cat to demonstrate that the content of a work can be insignificant compared to the creativity and innovation that the work inspires. He also emphasized the power of YouTube and Twitter to support rapid learning and the organic creation of groups of people with shared interests.
Alec urges us to move beyond the exchange of data and facts to begin “sharing our collective human experiences”. In a world where knowledge is becoming a commodity, with handheld computing devices leveling the playing field, a deeper understanding of the human experience is perhaps becoming the most important goal we can aspire to achieve.
I consider myself to be a “techie” venturing dangerously close to “Nerdville” but I have to admit that I have not been a very active user of either Twitter or YouTube. Based upon what I saw at BCNET, including a great presentation on Enterprise Social Collaboration, I will be trying to make better use of these tools in the future.
On a related topic I have been in discussions lately with a number of people about the urgent need for digital curators on the Internet. Within the context of a discussion thread a “Cognitive Authority” can play a role. But in the broader sense of information management there is a need to categorize, prioritize, and sort through the millions of documents, photographs, videos and other digital assets that are strewn across the Internet like the contents of a teenager’s bedroom. It is, I realize, an impossible task to complete but any efforts in this area will improve the current situation.
In the academic world authors provide abstracts and in the corporate world we use “Executive Summaries”. But within the Wild, Wild West of the Internet the closest thing we have would perhaps be sites like Wikipedia. We need to do more.
Having taken that position I have to examine The Black Swan Blog with a critical eye. My conclusion, to paraphrase Pogo, is “I have seen the enemy and he is me” … or is it “I”? The Black Swan Blog has no table of contents and no abstracts so, as usual, I am as guilty as anyone when it comes to implementing an action plan based upon my own recommendations. But at least in this case I can remedy the problem relatively easily. So watch for a table of contents, including abstracts, which should be in place within a week or two.
In a previous posting I stated my belief that the pure electric vehicle was the way of the future and that this sector of the automobile industry would grow more or less continuously for the foreseeable future. I decided to do a bit more investigation into how quickly that could happen given trends in vehicle sales over the past few years. I also decided to look into what has been happening with fuel economy rates given that retail gasoline prices have more than doubled in North America in the last ten years. Unfortunately, what I found was not terribly encouraging.
The chart below displays U.S. vehicle sales since the turn of the century.
There are a couple of things of note.
First, the shift from passenger cars to “trucks” (which includes SUVs) between 2000 and 2005 was significant. This trend did not slow down until the price of gasoline hit about $2.30/gallon and even then the impact was not dramatic. What was very dramatic was the decline in vehicle sales in the U.S. as the financial crisis of 2008/2009 battered the economy.
In those years, when cash was scarce for so many Americans, vehicle sales dropped almost 30%, pushing two of the three major U.S. automobile manufacturers (GM and Chrysler) into bankruptcy. Truck/SUV sales were hit particularly hard, dropping below sales of passenger cars for the first time in the 21st Century. Presumably this reflected a recognition that the cost of owning and operating a truck/SUV was hard to justify in tough economic times.
As the economy gradually recovered it could have been the case that this lesson would have had a lasting impact; that more economical and fuel-efficient vehicles would continue to dominate. Sadly (in my opinion), this has not been the case.
Sales of Trucks/SUVs have rebounded even more quickly than sales of passenger cars and have regained their leadership position. There is every indication that the gap will continue to grow despite historically high gasoline prices.
What impact have these buying patterns had upon the average fuel consumption for the U.S. vehicle fleet? The trends are shown in the graph below.
The gap in fuel economy between trucks/SUVs and passenger cars is large and has actually increased from 6 MPG to over 7 MPG since the turn of the Century. This is primarily because the two categories of vehicles are treated differently under the Energy Policy and Conservation Act, which mandates certain levels of fuel economy for vehicles sold in the U.S.
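To put that 7 MPG gap in dollar terms, here is a rough sketch. The annual mileage, gasoline price, and fleet-average MPG figures are my own illustrative assumptions, not numbers from the chart:

```python
# Rough annual fuel-cost difference implied by the MPG gap, assuming
# (illustrative figures, not from this post) 12,000 miles/year, $3.50/gal,
# and fleet averages of ~23 MPG for cars vs ~16 MPG for trucks/SUVs.
def annual_fuel_cost(miles, mpg, price_per_gal):
    return miles / mpg * price_per_gal

car = annual_fuel_cost(12_000, 23, 3.50)
truck = annual_fuel_cost(12_000, 16, 3.50)
print(round(truck - car))  # 799 -- roughly $800/year more for the truck/SUV
```

Even at that scale, the premium is evidently not painful enough to change buying habits.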
The bottom line is that despite having made some progress in the past few years Canada and the U.S. continue to exhibit the worst vehicle fuel economy in the world (for an in-depth analysis see “International comparison of light-duty vehicle fuel economy: An update using 2010 and 2011 new registration data”). And despite record-breaking retail gasoline prices, tough economic times, and an increasing awareness of environmental issues we keep slipping back into the habit of driving fuel-hungry vehicles.
There are justifiable reasons for that purchasing pattern. We do get some nasty weather in much of North America, including snow and ice, which makes a four wheel drive vehicle a safer ride. And because there are so many SUVs, pickup trucks and 4×4s on the road, driving a smaller, lighter passenger car can be more than a little intimidating. To some extent the whole situation becomes one of “I need to drive a big, strong vehicle because everyone else has a big, strong vehicle.”
Is there any realistic hope that vehicle buying habits will change in North America anytime soon? The incentives for such change could include significant increases in retail gasoline prices (very likely in the next 5-10 years), significant changes to the CAFE rules (unlikely because of intractable opposition from automobile manufacturers and conservative politicians), and/or a real change in public attitudes towards CO2 reductions that could moderate climate change (I have seen very little evidence of this as described in another blog posting).
Taking all factors into account, the prognosis for a significant shift to more fuel-efficient, generally more expensive and smaller vehicles is poor. That does not bode particularly well for EVs, which are even more expensive and often smaller than fuel-efficient gasoline, diesel, or propane-powered vehicles.
It was recently announced here in British Columbia that the “Clean Energy Vehicle Program Rebate” has depleted its funding pool and will not be extended. These rebates provided up to $5,000 in direct government grants for EVs, representing about 14% of the price of a Nissan Leaf. Even with this fairly generous rebate program fewer than a thousand EVs were sold in BC in the last two years – and BC considers itself (perhaps incorrectly) to be the “greenest” province in Canada.
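As a side note, those two figures together imply the sticker price the rebate was measured against. This is simple arithmetic on the numbers above, not an official price:

```python
# If $5,000 is ~14% of the price of a Nissan Leaf, the implied price is:
rebate = 5_000
share_of_price = 0.14
print(round(rebate / share_of_price))  # 35714 -- a Leaf priced near $36,000
```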
There is another concern that may start to become apparent over the next year or two. The new breed of EVs relies upon lithium-ion batteries – the same type of battery that is used to power mobile phones, laptop computers, iPads and other tablets. Having used these types of devices extensively over the past 10 years, I have never had a single device where the battery was not essentially useless after about 3-4 years. Perhaps automobile batteries will perform better – I certainly hope they do. But as the early Nissan Leafs and Teslas start to age they may degrade significantly, and that would have a chilling impact on EV sales around the world (note that the Prius uses a NiMH battery that has proved to be extremely reliable over 10 years or more).
So have I changed my opinion on EVs? The short answer is “No”. I still believe that we have embarked upon a revolutionary change that will take place at a steady pace. However, it could well be that the pace of that change will be slow for most of this decade. Only a serious spike in the price of oil, which is always a possibility, could radically speed up the EV revolution. But that would have all kinds of other negative economic impacts that we would all probably like to avoid.
Less than a week after writing this post I was on a business trip to Anaheim and finally was lucky enough to find a Nissan Leaf “in the wild”. Its owner, Matt Buchanan, was kind enough to spend a few minutes talking to me about his beautiful “Felix”. In fact he stated that he is always happy to talk to people about the car and has fielded lots of questions about it.
Matt has had the car for a few months and is very pleased with it. He noted that the acceleration was particularly impressive and he feels that the Nissan Leaf is the best engineered car he has ever driven.
In terms of range Matt feels that the 75 mile range on a charge is a reasonable claim although he has not driven more than about 50 miles with the car yet – hasn’t had the need to in his normal driving.
One issue that Matt highlighted was the situation with fast charging stations. Originally almost all the stations were free but some are now charging a fee – typically a monthly subscription plus a per minute charge. So this will impact the economics of driving the car if the trend continues.
Overall Matt is very satisfied with his “Felix” and would recommend a Nissan Leaf to anyone considering purchase of an EV.
John Lennon’s iconic song “Imagine” has been rated #3 on Rolling Stone’s list of the “500 Greatest Songs of All Time”. It envisages a world where some of the major things that divide humanity – religion, nationalism, and materialism – are discarded in order to achieve global peace and harmony.
While the ideas delivered so powerfully in this song are immensely attractive at the conceptual level, things get a bit more nuanced when you take a cold, hard look at the details. Would any of us really want to try and live with “no possessions”? I don’t think so. And yet there are social changes afoot that are heading in that general direction.
This fairly radical view of the future is based upon a growing recognition, especially amongst the millennial generation, that we don’t all need to own a copy of every possible consumer item even if we can afford to have one. Instead, it might be possible to enjoy almost exactly the same lifestyle as we do today by employing a technique which was common in rural agricultural communities not so long ago – it’s called sharing!
Evidence of the growing popularity of this approach is everywhere.
On a vacation in Chicago last summer my family used the Divvy bike-sharing system. Unlike a traditional rental shop which requires you to pick up and drop off a bike at the same location, the Divvy system encourages you to pick up a bike from any of hundreds of locations, ride it to where you want to go, and drop it off at a station near your destination. You can keep doing that as many times as you like in 24 hours for as little as $7.
Over the course of a week various combinations of family members rode bikes throughout the downtown area and along the lakeside bike paths for more than 20 hours in total. The cost? $87 including taxes and some surcharges for trips lasting more than 30 minutes. The convenience and flexibility of the system really made it a pleasure to use.
An identical strategy is taking place with car-sharing. Here in Vancouver both Zipcar and the Modo car co-op are growing steadily. As with the Divvy bike sharing system these services allow you to locate the nearest available car using a computer or Smartphone app, pick it up and drive it to your destination where you simply leave it for the next system user.
Car and bike sharing are really not that innovative in the sense that car and bike rentals have been around for a very long time. But what about specialty consumer goods?
Do we all really need to have a full set of power tools? When was the last time you used a router, circular saw, or sonic stud-finder? And what about that deep fryer, chafing dish, or food processor?
No doubt it is handy to know that you have these items around in case you need them (if you can actually find them in some dark storage cupboard buried under other “essential” items – I often can’t).
But even if you can afford to own them and even if you have a storage space for them think about this.
Think of the enormous amount of energy required to fabricate these items, package them, and deliver them to a retail outlet – energy we could avoid expending if we simply didn’t require as many of them. In order to try to assess what the potential savings could be I recently put together a video on the Fantastic Voyages of the Stackable Chair (see it on YouTube).
The idea of sharing our precious possessions is more than a little bit disconcerting. And there are certainly things (like my 1975 Stratocaster) that I personally would not feel comfortable entrusting to the use of anyone but a very close friend. But there are many other things that I use only rarely that it would make sense to make available in some sort of sharing scheme. How much damage could someone do to my 20 foot aluminium ladder or my wheelbarrow?
There might even be a few items that I wouldn’t be too upset to see damaged or destroyed – the Garden Gnome I received as a gift from my aunt Matilda comes to mind.
A recent article in a local newspaper here in North Vancouver described a new initiative which demonstrates how the concept could actually be put into practice as well as providing a great summary of the phenomenon that is becoming known as collaborative consumption.
The benefits go beyond efficiency and a reduction in energy use. Sharing within a community, be it a University, neighborhood, or club reinforces the social connectivity within the organization and builds that most precious of social commodities – trust.
I grew up in a rural community where helping neighbors take hay off the fields in the late summer was just expected behaviour. In the winter the outdoor ice rinks were built by volunteers. When the local recreation hall was destroyed by fire the entire community pitched in to build a new one. So I get the idea of sharing work. But the concept of collaborative consumption takes sharing to a new level.
I am not quite ready to go “all in” with this idea. But I do find it intriguing enough to pursue it in some form or other. Any concerns I have are definitely not enough to overcome the reality that this is just the right thing to do on so many levels.
“Imagine all the people sharing all the world…”
Unrealistic? Probably. But what a fantastic tribute it would be to John Lennon’s vision and legacy if collaborative consumption reaches even a fraction of its potential.
There has been a lot of discussion about the electric vehicle revolution and what its impacts will be. Are EVs gaining traction or getting stuck in the mud? Will they quickly replace internal combustion powered vehicles or will they represent a “green” niche market for decades to come? Will manufacturers be willing to lose billions of dollars on EV development forever or will they eventually make most of their profits from this technology?
The questions about EVs go far beyond the impact on the automobile manufacturing industry (which is one of the biggest industrial concerns in the world). The impacts upon electricity utilization and the grid, both positive and negative, will in many ways shape future decisions about generation and grid management. I am going to explore a few of these questions in this blog posting.
Firstly, what is the current status of EV sales worldwide?
In trying to answer this question we are immediately faced with another question. What is an EV?
One definition would be that an EV uses an electric motor as its primary propulsion system. Such a definition would probably exclude the Toyota Prius and other hybrids, which normally use the internal combustion engine for motive power and reserve the electric motor for very low speed driving (under 25 miles per hour) and, more importantly, to boost power during acceleration. The Chevy Volt would meet that definition, as it only uses an electric motor to power the vehicle even though it has an internal combustion engine which can generate electricity to drive the motor when the battery pack has discharged to a certain level.
In my blogs I always stress that we need to be looking at the ultimate goal, which is to eliminate our use of hydro-carbons, including gasoline. Simply reducing our use of gasoline is not sufficient. By continuing to use an internal combustion engine for long distance travel, hybrids and even the Chevy Volt sidestep the most difficult issue facing EV adoption: an unacceptably short range under normal driving conditions.
The Chevy Volt can travel approximately 45 miles on battery power alone under good conditions. The Plug-in Hybrid version of the Toyota Prius is rated at 14 miles. Of course both of these figures can be considerably less in cold winter conditions or under heavy load (for example going uphill for a long distance).
The average commute for U.S. workers is about 16 miles so the Volt would probably work using electric power only. The Prius would definitely not. Neither would work for many weekend trips under electric power alone.
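As a rough sanity check, the electric-only ranges quoted above can be compared against the average commute. This is a minimal sketch using the figures from this post, and it assumes the 16-mile commute figure is one-way (so a round trip is about 32 miles):

```python
# Quick check: can each vehicle's electric-only range cover the average
# U.S. round-trip commute? Figures are the ones quoted in this post.
AVG_COMMUTE_ONE_WAY = 16  # miles (assumed to be one-way)
round_trip = 2 * AVG_COMMUTE_ONE_WAY

electric_only_range = {
    "Chevy Volt": 45,      # miles, under good conditions
    "Prius Plug-in": 14,   # miles, rated
}

for car, rng in electric_only_range.items():
    verdict = "covers" if rng >= round_trip else "does not cover"
    print(f"{car}: {rng} mi electric range {verdict} a {round_trip} mi round trip")
```

Remember that cold weather or heavy loads can shrink both ranges well below the figures used here.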
For these reasons I am not going to include either plug-in hybrids or the Chevy Volt in my definition of Electric Vehicles. I will discuss only practical, electric-only vehicles that have no gasoline tank. These vehicles are, in my opinion, the true future of the automobile.
Using that definition there are only two mass-market EVs available today: the Nissan Leaf and the Tesla Model S.
Although it is difficult to get accurate quarterly sales figures the graph below represents a reasonable estimate of how sales of these two vehicles have grown since the launch of the Leaf in 2011 and the Model S in mid-2012.
There are now more than 80,000 Leafs on the roads of the world and about 30,000 Tesla Model S’s. These numbers have been increasing at a steady pace, helped notably by a price decrease by Nissan at the beginning of 2013 and by increasing recognition of the Model S as a vehicle with dependable long-range capability.
The EPA estimate for “average range” for the Leaf is 75 miles. That will certainly handle most commuter trips and some longer trips.
The Tesla Model S is EPA rated at 265 miles range with the largest battery available in 2013. The Model S can also be equipped with super-charging capability which is able to fully recharge the battery in less than an hour. The large battery range and the existence of super-charging stations make long road trips with a Model S quite realistic. A map of the super-charging stations in place at the end of 2013 is displayed below.
EPA ratings and marketing brochures are one thing. Real world experience can be quite different.
When I began to write this blog posting I started looking out for EVs in my home city of Vancouver, B.C. After a few days I caught sight of a Tesla (I probably missed a few Nissan Leafs, which are harder to differentiate from other similar sized vehicles). I was able to follow the Tesla into a parking lot where the owner, Barry Yates, kindly agreed to an impromptu interview.
Barry purchased his vehicle the first day they were available locally. He regularly travels to Whistler Mountain for ski trips, a distance of about 70 miles. The road to Whistler climbs to an elevation of more than 2,000 feet and often has to be driven in cold winter conditions. Barry told me that he never has trouble making the round-trip on a single charge, partially because there is regenerative charging on some of the steeper declines on the trip down.
One thing that surprised me was that Barry felt the Tesla had good winter road handling despite being a rear-wheel drive vehicle. The large battery distributes the weight very evenly between the front and rear wheels, which probably helps. Barry has installed snow tires, which are a normal requirement for all vehicles travelling to Whistler.
Barry has also made a few trips to Seattle, Washington, about 150 miles from Vancouver. He has been in the habit of stopping at a Super-charging station at Burlington, Washington, about half-way to Seattle. The 15-20 minute stop tops up his charge so that he doesn’t have to worry about being low on power as he gets closer to Seattle. Anyone who has been stuck in the traffic jams on the I-5 can appreciate that.
What is the bottom line? I think a fair evaluation would say that currently available technology can produce a vehicle that meets the everyday needs of most North Americans.
But that does not guarantee that the adoption of EVs will be quick or smooth. The Tesla Model S is an impressive vehicle. However, the price tag is also impressive, with the long range version costing more than $70,000. The Nissan Leaf, at about $30,000, is more manageable, particularly after various rebates and incentives are accounted for. But for a vehicle its size it is not inexpensive.
There are very significant savings to be had with a true EV with regards to fuel costs. Barry Yates indicated that his home electric bill had gone up about $50/month after installing a 220 V charging system for his Tesla. However, his fuel bill went down more than $500/month. An annual savings of something like $5,000 isn’t a bad return on an investment of $70,000 – as long as you have the $70,000 to put into a vehicle.
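Barry's numbers work out as a simple back-of-the-envelope payback calculation. This sketch uses only the figures quoted above (insurance, maintenance, financing, and battery replacement are ignored):

```python
# Simple payback on the Tesla example discussed above.
electric_increase = 50      # $/month added to the home electric bill
fuel_savings = 500          # $/month no longer spent on gasoline
vehicle_price = 70_000      # $, long-range Model S

net_monthly = fuel_savings - electric_increase  # $450/month
annual_savings = 12 * net_monthly               # $5,400/year
payback_years = vehicle_price / annual_savings  # ~13 years

print(f"Annual savings: ${annual_savings:,}")
print(f"Simple payback: {payback_years:.1f} years")
```

A 13-year simple payback on fuel alone is hardly compelling, which is why the comparison really only makes sense for buyers who would have spent a similar amount on a luxury vehicle anyway.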
Prices will come down as the technology matures, as manufacturers start to achieve economies of scale, and as inevitable increases in the price of gasoline make the returns more attractive. So in my opinion the transition to EVs is underway and won’t slow down anytime soon.
Given that new reality it would be wise to consider some of the non-automotive consequences that will likely result from this transition.
First, what will the impact of EVs be on electrical load factors?
As with most things involving load factors, the impacts are not easy to predict. However, it is likely that a typical charge cycle will extend from the time a person gets home from work until the battery is fully charged.
Barry Yates indicated that charging his Tesla takes about 10-12 hours on a 220 V outlet (the same type used for a clothes dryer or oven). As the number of EVs increases this new source of load will start to have an impact on demand curves and the grid.
In Northern areas where the peak demand is in winter this will be particularly problematic. The period 5:00 pm to 8:00 pm is already a high demand timeframe and adding EV charging will definitely result in new record demands unless significant changes in energy usage can be implemented.
In the south where summer air-conditioning results in peak demand the impact will not be as severe. The main impact will be to extend the typical peak demand period (2:00 pm until 5:00 pm) later into the evening but it is unlikely that higher peak demand would result.
For workers with longer commutes there will possibly be a need to charge vehicles after arriving at the workplace. This should not be much of a problem because the morning peaks are not as high as afternoon and evening peaks regardless of the season.
Based upon this very preliminary, high-level assessment it would probably be wise to try and delay home EV charging until later in the evening. A start time of 11:00 pm would still provide a long enough charging period for most users. As EVs, like other appliances, become “smarter” it may well be possible for them to be programmable to delay the charging cycle until a specified time. Perhaps, à la Siri, this could be done by voice command.
“Car – start charging at 11:00 pm” – sounds both futuristic and creepy at the same time!
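A delayed-start feature like the one described above only needs simple arithmetic: work backwards from the departure time by the required charging duration. A minimal sketch, where the 7-hour top-up and 7:00 am departure are illustrative assumptions rather than figures from any particular vehicle:

```python
from datetime import datetime, timedelta

def latest_start(departure: datetime, charge_hours: float) -> datetime:
    """Latest time charging can begin and still finish by departure."""
    return departure - timedelta(hours=charge_hours)

# Illustrative numbers: a 7-hour top-up on a 220 V outlet,
# leaving for work at 7:00 am the next morning.
depart = datetime(2015, 1, 12, 7, 0)
start = latest_start(depart, charge_hours=7.0)
print(start.strftime("%I:%M %p"))  # midnight - comfortably after 11:00 pm
```

A real scheduler would also check that the computed start time is no earlier than "now" and fall back to charging immediately if the window is too short.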
In anticipation of an eventual fleet of hundreds of thousands of EVs, considerable research has been conducted into how this resource can be used for grid stabilization and frequency smoothing services. A number of papers published by scientists from the National Renewable Energy Laboratory (NREL) have discussed implementing real time demand response by controlling when the EV fleet starts and stops charging (see for example “Value of Plug-in Vehicle Grid Support Operation”). Of course this would depend upon vehicle owners allowing the local grid operator to control the charging functions of their vehicles. It also assumes a reliable grid-to-EV communication infrastructure and protocol is in place.
There has also been some speculation that EV batteries could be used as a source of electricity for the grid when sharp drops in generation capacity occur (as a result of changing weather patterns which impact renewable generation sources such as solar or wind or as a result of an unexpected plant/unit shutdown). This is much less likely because it would require that whatever outlet the EVs were plugged into was capable of receiving electricity as well as delivering it.
Finally, there have been proposals to combine used EV batteries into an array that could act as utility-scale energy storage, capturing excess electricity at night or other low demand times and delivering it as a peak demand source. I discussed this research in one of the first postings in the Black Swan Blog.
It was more than 100 years ago that Henry Ford’s Model “T” rolled out of a factory in Detroit, Michigan, signalling the beginning of the end for steam-powered automobiles. Those were radical times; the internal combustion engine and the assembly line combined to bring affordable transportation to the masses. Our love affair with the automobile has never waned since that time.
The change we face today is no less radical.
This revolution will put you in the driver’s seat of vehicles that move so quietly they can hardly be heard; they will not pollute our atmosphere; they will not rely upon the extraction of an energy source that cannot be replenished.
I don’t know about you but I can honestly say that I can hardly wait until I have managed to trade in my 7 passenger Town & Country (can you really call a 4,000 lb vehicle a mini-van?) for an EV – maybe even a Smart Bike!
Having spent more than 25 years in the oil and gas industry I have seen my fair share of hydro-carbon price fluctuations. So it has not come as a complete surprise to me that the “shale gas” phenomenon has had such a dramatic impact on North American Natural Gas prices.
At the beginning of the 21st century Natural Gas prices were about $4.00/Million BTU and thereafter they rose rapidly to $8-$10/Million BTU in the years 2005-2007. The economic crisis that started in the fall of 2008 coincided with increasing production due to the success of shale gas development which translated into a very rapid decline in Natural Gas prices to just over $2.00/Million BTU in 2012. Since then prices have recovered somewhat to about $4/Million BTU.
The low prices since 2008 have resulted in a very predictable decline in the number of drilling rigs exploring for new natural gas reserves. The impact is displayed in the graph shown below.
There are a few very striking features of this graph.
First, the almost total elimination of vertical drilling rigs is interesting. In traditional gas fields widely spaced vertical wells are able to drain the reservoir efficiently because the gas flows quite freely through the rock. In technical terms this type of reservoir has relatively high permeability.
Reservoirs that consist of rocks with lower permeability cannot be produced very efficiently with vertical wells. It is much more efficient, although also much more expensive, to develop these reservoirs using horizontally drilled wells as shown below.
As horizontal drilling grew more common in the late 1990s it became possible to economically produce reservoirs that previously had been difficult or impossible to exploit. These so-called “tight gas” reservoirs became an ever more important source of Natural Gas in North America.
Because “tight gas” does not flow freely through the reservoir rock, these wells produce a lot more gas in the first year of production than they do in subsequent years. In the industry the rate at which production falls off is known as the production decline rate.
While a decline rate in a high quality traditional reservoir might be 1-2% (allowing fields such as the Groningen in the Netherlands which came on-stream in 1963 to produce for an estimated 80+ years) tight gas can decline at 10-20% or more annually.
The “shale gas” phenomenon is a variation on “tight gas” which involves the injection of high pressure fluids and chemicals into a horizontally drilled well to break apart or “fracture” the reservoir rock near the well bore. After years of research & development fracking techniques have become standard industry practice and reliably result in significant production from shale reservoirs.
The exploitation of “shale gas” has increased dramatically since 2005 resulting in a glut of Natural Gas in North American markets. This in turn has driven down the price of Natural Gas to near historic lows in constant dollar terms. The Energy Information Administration forecasts that Natural Gas production in the United States will continue to increase for the next two decades based upon ever-increasing production of “shale gas”. They also forecast only modest increases in Natural Gas prices to the range of $7-8/Million BTU by 2035.
I am not convinced that this scenario is at all realistic.
The steep decline in drilling activity over the past 5 years is going to catch up with us at some point in the near future. It usually takes a few years to tie new gas wells into the distribution system and put production facilities in place. Therefore there is a lag between drilling activity and production.
The other difficult obstacle to overcome is the impact of rapid decline rates on total shale gas production.
Assuming a constant amount of drilling activity and discovery success, total production flattens out after about 15-17 years with a 10% annual decline and after only 10 years with a 20% decline, as shown in the graphic below. As the amount of shale gas in production increases, the annual decline eventually equals the annual additions made through drilling for new reserves (Gary Swindell has analyzed production decline rates in great detail in a paper published in 1998 and extensively updated in 2005).
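The flattening described here falls out of simple arithmetic: if the same new productive capacity is added every year and every existing well declines by a fixed fraction, total production is a geometric series that approaches additions divided by the decline rate. A toy model (units are arbitrary, and real decline curves are steeper in year one than this single constant percentage):

```python
# Toy model: one unit of new production is added each year and all
# existing production declines by a fixed annual fraction. The total
# is a geometric series approaching (additions / decline_rate).
def total_production(years: int, decline: float, additions: float = 1.0) -> float:
    return additions * (1 - (1 - decline) ** years) / decline

for decline in (0.10, 0.20):
    plateau = 1.0 / decline
    for year in (5, 10, 15, 20):
        frac = total_production(year, decline) / plateau
        print(f"{decline:.0%} decline, year {year:2d}: {frac:.0%} of plateau")
```

With a 10% decline the total is still climbing noticeably at year 10 and has largely flattened by year 15-17; with a 20% decline it is within roughly 90% of its plateau by year 10, consistent with the figures quoted above.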
As noted above, drilling activity has not been constant over the past 5 years but has actually decreased pretty dramatically. It follows that there will probably not be any significant increase in shale gas production over the next five years.
At the same time the older gas reservoirs will continue to decline at a slow rate as they have for decades.
Putting all these factors together it seems likely that Natural Gas supplies in North America will tighten up somewhat in the next few years. This dynamic of gas “booms” and “busts” is one we have seen many times before and is primarily driven by commodity prices.
When prices reach lows such as they hit in 2012 drilling activity dries up, supplies tighten due to declines and prices go up. Eventually prices go up enough for exploration companies to be willing to renew the search for new gas reserves. That process takes a couple of years during which supplies tighten even more and prices go up further. Eventually the balance swings in the opposite direction and supply meets or exceeds demand and prices soften.
The implications of this cycle are quite worrisome when put in the context of electricity generation.
The MACT regulations will force the closure of more than 40 GW of coal-fired generating capacity in the next few years. This is firm and dispatchable generation that can be called upon at peak demand times. No amount of solar and wind can replace that loss reliably without massive amounts of affordable energy storage, which does not yet exist.
Utilities are struggling to come up with plans to replace the lost coal-fired generation capacity. In many cases the current low prices are pushing utilities towards the construction of Natural Gas fired plants.
That is not in itself a bad choice. Natural Gas burns more cleanly than coal and produces about half of the CO2 per unit of electricity generated. But there are a couple of problems with a wholesale switch to Natural Gas.
For those truly fearful about climate change, the fact that Natural Gas still produces CO2 will continue to be a problem.
Probably more important on a daily basis will be the potential impact on utility rates if Natural Gas prices escalate significantly.
There is a reason that more than half of the electricity generated in the United States up until the turn of the century came from the burning of coal. Coal was and remains the least expensive energy source available.
Coal can also be stockpiled at a generating plant. That may not seem important but congestion in pipelines can be a real problem when temperatures drop and both residential users and power plants are consuming Natural Gas at the maximum rate possible. That was an issue in the northeastern part of the continent during the recent “Polar Vortex” storm.
My fear in all of this is that utilities will spend tens of billions of dollars building Natural Gas plants which will help drive prices up – and those price increases will be passed on directly to electricity consumers.
My hope is that this rather bleak future of higher prices and continued CO2 emissions will cause utilities and governments to consider putting more time and money into developing affordable energy storage solutions. If we could store energy on a very large scale we could time-shift solar and wind generation to match our demand patterns. That is, in fact, a requirement before we can move completely away from the burning of hydro-carbons to generate electricity. There are other measures that we can pursue – many of which are described in my Sustainable Energy Manifesto.
In previous blog postings I have expressed my concerns about the relative return on investment and the economic fairness of roof-top solar panels. But I am also a big fan of solar power which is, after all, the most abundant and the most reliable energy source that we have at our disposal. In this blog I want to draw attention to some encouraging news in the industrial development of solar power. I also want to point out a few very fanciful uses of solar power that I believe demonstrate some of the future potential of this resource.
First, it is an exciting time to be involved with Concentrated Solar Power (CSP). With the commissioning of both the Solana Plant in Arizona and the Ivanpah Plant in California over the next few months the global CSP generating capacity will almost double. The Solana Plant is particularly encouraging because it incorporates molten salt storage allowing the Plant to run for up to 6 hours after sunset. It is not the first plant to incorporate molten salt storage but it is the biggest.
Half a world away CSP developments in North Africa and the Middle East are starting to gain traction. The Noor I CSP Plant broke ground in Morocco in May, 2013 with financial support from the German government. In the same month the internationally backed Climate Investment Funds approved a revised plan for the rapid development of CSP in North Africa. This plan aligns with the Desertec Foundation’s vision of utilizing solar resources in desert regions to transform local economies while supporting a transition to sustainable energy resources.
This year the government of Saudi Arabia made a massive commitment to the development of solar power with the goal of converting most of the oil-fired desalination facilities in the Kingdom to solar power. That would provide some relief for global oil supplies (currently almost 2% of global oil production is used in Middle East desalination plants) as well as representing another very substantial increase in global CSP capacity.
The only negative development in the world of CSP is the 180-degree change to support mechanisms for the development of this technology in Spain.
Prior to 2013 Spain had been a world leader in developing CSP and is home to the two premier CSP engineering firms. However, the elimination of almost all financial supports for CSP developers in August, 2013 has led to a collapse of CSP projects in Spain. Luckily there continue to be many new opportunities in Africa, the Middle East and the U.S.
Photo-Voltaic solar panels have had more of a mixed year in 2013. Module prices seem to have bottomed out, and the price competition that drove them down has led to the bankruptcy of a number of manufacturers. In jurisdictions where the penetration of solar panels has reached double digits as a percentage of normal load, incentives are being cut back and in some cases regulatory barriers are being raised, most notably the capacity studies in Hawaii. In Arizona monthly service fees are being added to the utility bills for homeowners with rooftop solar panels. The many challenges facing PV solar represent a serious risk to the further development of this resource.
Although dropping solar cell prices and the associated reductions in margins are disrupting the supply side of the PV solar business, these developments are making it possible to showcase solar power in ways never before possible.
The team behind the Solar Impulse solar-powered airplane announced that they will attempt an around-the-world flight in 2015 entirely on solar power. This well-funded and experienced team has been working for more than 10 years to make solar powered flight a reality.
Solar Impulse is not the only game in town when it comes to harnessing the energy of the sun to power an aircraft. Flying somewhat under the radar is Eric Raymond and the team behind the Sunseeker series of aircraft. The newest member of the family, the Sunseeker Duo (shown above) is currently undergoing flight tests. It will be the speediest solar-powered aircraft ever built. It will also be the first to be able to carry a passenger. I would encourage my readers to visit these sites and if you like what you see consider making a donation which will help these organizations continue their ground-breaking work.
Shifting from the skies to the oceans, the world’s largest solar-powered ship, MS Türanor, received a new life mission as a research vessel after completing the first solar-powered circumnavigation of the earth’s oceans. It has set off on a Swiss-sponsored voyage to study the seasonal changes in the behaviour of the Gulf Stream.
These innovative applications of solar power demonstrate the potential of an energy source that can meet many of our current needs. Efficient and cost-effective energy storage remains elusive but with a dedicated global effort storage solutions will be developed. In the meantime it is interesting to watch as solar power moves from the hand-held calculator to powering transcontinental flights and beyond.
Most supporters of renewable energy development are probably pretty comfortable with the way things are going. Wind and Solar generation has been increasing both in "nameplate capacity" and in actual production of electricity. There have not been any significant grid failures that can be blamed on renewables. Apart from a consolidation within the solar cell manufacturing sector there have not been any notable bankruptcies within the electricity generating sector. All visible signs are positive for a continued expansion of renewable resources.
When I talk to groups about renewable energy I start off with a YouTube video which demonstrates testing the compression strength of a concrete block. For 2 minutes and 40 seconds this is the most boring video you could imagine. The block shows absolutely no sign of stress. At 2:41 the concrete block fails and is utterly destroyed. As far as I am concerned we are at about 2 minutes and 30 seconds with respect to the electrical grid.
In order to understand what I believe to be the serious risks facing the electrical generation and distribution system it is necessary to review the structure of the system as it was before renewables began to be developed in a significant way. The chart below shows hypothetical load profiles for a peak demand day during the spring/fall, winter and summer as well as a line that represents the overall generating capacity in the system.
It can be observed that the system demand/load varies considerably throughout the day and throughout the year. It is also clear that there is a great deal of excess supply available for most hours on most days. In fact, only on the highest peak demand days of the entire year will demand come close to supply. That is by design: every well-managed electrical generation system in the world requires a reserve margin of 8-15% above peak demand. This reserve is meant to provide resiliency for the grid, accommodating scheduled maintenance shut-downs at major facilities such as nuclear, natural gas-fired, and coal-fired plants as well as unscheduled outages due to storms, switching problems, or other operational issues.
(Note: I appreciate that many people will raise objections to the demand curves presented in that their local situation might be very different. That is one of the challenges facing every Independent System/grid Operator. Local demand curves can be all over the map due to the mix of commercial, residential, and industrial users. My point is not that these particular curves are the most typical in all locations. The point is that demand varies significantly over the course of the day and through different seasons.)
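The reserve-margin requirement mentioned above translates into a simple capacity calculation. A sketch with an illustrative peak (the 8-15% band is the range quoted above; the 68 GW peak is a made-up figure, not any particular grid's):

```python
# How much firm capacity does a reserve margin require?
peak_demand_gw = 68.0  # illustrative annual system peak

for margin in (0.08, 0.15):
    required_gw = peak_demand_gw * (1 + margin)
    surplus_gw = required_gw - peak_demand_gw
    print(f"{margin:.0%} margin: {required_gw:.1f} GW installed "
          f"({surplus_gw:.1f} GW held in reserve)")
```

The point of the calculation is that a 68 GW peak demands something like 73-78 GW of dependable installed capacity, and every firm plant retired without a firm replacement eats directly into that cushion.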
So before we began to develop renewable energy there was plenty of generation capacity within the system. In fact, many generation facilities were not running at anything close to capacity most of the time.
Because of a public policy decision to reduce the burning of hydro-carbons (and the associated production of CO2 emissions) wind and solar generation sources have been subsidized through a variety of financial instruments including capital grants, tax credits, and feed-in-tariffs. Renewables have also been given preferential access to the grid in most jurisdictions.
These measures have achieved the stated policy goal. Wind and solar now make up a significant percentage of generation capacity in a number of jurisdictions and at times provide a large percentage of electrical production.
For example, Germany has developed over 30 GW of solar power and over 30 GW of Wind. On a blustery spring day in Germany renewables can meet up to 40% of the total electrical demand for a few hours at mid-day. There are regular announcements of "new records" for both solar and wind generation. A similar situation exists in Texas with regards to wind and in parts of Hawaii with regards to solar.
Remembering that there was already a surplus of generation capacity in the system before the development of renewables it is obvious that when renewables hit their generation peaks most traditional thermal generation plants are unable to sell electricity. That would not be a problem if the construction of these plants had not been financed based upon assumptions regarding how often they would be used and what wholesale electricity prices would be. In fact, the economics of running these plants has deteriorated to the point where many utilities, especially in Europe, are on a "credit watch".
The rational response of companies trying to sell electricity into a market that has a great over-supply would be to decommission some of the oldest and most polluting plants to bring supply and demand into a better balance. But there is a problem. Renewable resources cannot be relied upon, particularly at peak demand times. The chart below displays the wind resource available compared to the demand curve for a week in November, 2013 in Texas (this week was not chosen to make wind look bad; it was literally the first file I found on the ERCOT site when I started to write this blog).
In this situation demand rose throughout the week as a strong high pressure system spread across the state bringing with it colder temperatures while at the same time shorter days required more lighting. One of the more troublesome realities of meteorology is that large, stable high pressure systems are often responsible for peak electrical demand in both winter and summer because they are associated with clear skies and temperature extremes. These systems are also commonly characterized by very low winds across a wide area.
As a result, while demand continued to climb, wind energy faded away to almost nothing. At this point most of the thermal generation assets available within Texas had to come on-line in order to meet demand.
So it is impossible to decommission even the oldest and least efficient thermal generation plants in the system regardless of how many wind farms have been built and solar panels deployed. German utility E.ON came face-to-face with that reality in the spring of 2013 when they were instructed by the local grid operator to keep an old plant operational even though it would rarely be needed.
But a new day is dawning in the U.S. and it could be a darn cold (or hot) one.
The EPA announced regulations in December 2011 that will require coal-fired thermal generation plants to clean up or shut down. The reality is that for many of these plants it will not be feasible to clean them up. In fact, in some cases the EPA will not even allow them to be updated with modern pollution controls. As a result more than 30 GW of firm generation capacity will be decommissioned over the next several years.
Plans to replace this loss are in some cases vague and have been changing often. Increased conservation and better utilization of existing plants are frequently included in Integrated Resource Plans. In other cases greater reliance upon renewables is explicitly identified. These are not really replacements for firm capacity.
A number of new Natural Gas fired plants are also under construction. While current low gas prices make this an attractive option the threat of future significant price hikes as well as the EPA's stated goal to regulate CO2 emissions are worrisome and are impacting the ability to secure financing of these plants in some cases.
As more and more coal-fired plants are retired it is likely that total system firm generation capacity will drop resulting in smaller reserves. This, in turn, will make the system more susceptible to storms or other unplanned outages.
The degree to which grid security is compromised will vary from region to region depending upon the penetration of renewables, number of coal-fired plant retirements and the health of the local economy which has a major impact on electricity demand. Based upon those factors I believe Texas and the Mid-west are the areas most at risk.
It may be that the reduction in coal-fired generation will do nothing more than cull excess capacity out of the system with no negative impacts. But groups such as the Institution of Engineering and Technology in the UK have issued warnings about the progressive stress on a system that has taken decades to evolve and is now faced with unprecedented challenges.
Like the concrete block in the YouTube video, the system is not displaying any outward signs of weakness. The question is this – will the North American electricity system encounter its own version of 2:41?