Thursday, May 28, 2015

66. MARY MEEKER AND THE BATTERY                                                         

Mary Meeker is a venture capitalist at Kleiner Perkins who annually publishes a report on internet trends. Her 2015 report can be found here. One can find in it some interesting insights and subtle observations on how our new digital society is evolving and using technology. At the highest level, it says our society is going mobile, or, in a more appropriate tense, has gone mobile. This is not surprising. We see it in the office with our colleagues, at home with our teen children, with strangers on the streets, in cafes and on trains, and even in remote travel locations, where economies rated as developing or underdeveloped are adopting mobile at a rapid rate. Just look at internet penetration in India, China, and across Asia and Africa. Social media, e-commerce, revolutionized services, and instantaneous communication are some of the drivers. We now hear a new saying: Mobile is Uberizing the world.

This broad mobile transformation of society is, and will for the foreseeable future remain, powered by the lithium-ion battery. We cannot conceive of a world, let alone a mobile society, without a compact, cost-effective, portable and safe energy source like the lithium-ion battery. As much as the battery has enabled mobility, a mobile society does and will continue to drive the evolution of the battery, regardless of how challenging that may be. The stakes are just too high now!

One of Mary's slides shows the number of hours an average adult in the US spends in one single day glued to a screen watching or working on digital media. 

In the course of the seven years since the iPhone was first introduced, we have tacked on 2.5 hours of viewing time, reaching a total of nearly 3 hours of screen time on mobile devices alone. This includes time checking email, tweeting and posting messages on Facebook, booking your next vacation, following your stocks, etc. This growth corresponds to a doubling over the last 4 years -- not quite as fast as Moore's Law, but certainly a lot faster than the annual increase in the energy density of lithium-ion batteries. Will this trend continue? I don't know, but if it does, we are looking at 8 hours or more of daily screen utilization on a mobile device by 2020, and that will most certainly put some serious strains on today's battery technology.

Batteries with an energy density near 600 Wh/l were first introduced in the market around 2013 and remain state-of-the-art today, with a capacity near 3,000 mAh. It is widely accepted that for a standard 5-inch mobile device, such a capacity is sufficient for a solid and honest full day of operation. But if utilization is to increase by an additional 5 hours per day by 2020, then batteries ought to be sized to last longer. That is a very tough ask when energy density is barely increasing at 4% per year.
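The tension between these two trends is easy to make concrete with a back-of-the-envelope extrapolation. The sketch below uses the starting points quoted above (about 3 hours of daily screen time and 600 Wh/l in 2015) and the stated growth rates; the exact 2020 values are only illustrative:

```python
# Back-of-the-envelope extrapolation of the two trends discussed above:
# mobile screen time (~3 h/day in 2015, doubling roughly every 4 years)
# versus lithium-ion energy density (~600 Wh/l in 2015, growing ~4%/yr).
def screen_hours(year, base=3.0, base_year=2015, doubling_years=4.0):
    """Daily mobile screen time if the doubling trend continues."""
    return base * 2 ** ((year - base_year) / doubling_years)

def energy_density(year, base=600.0, base_year=2015, rate=0.04):
    """Battery energy density (Wh/l) at ~4% annual improvement."""
    return base * (1 + rate) ** (year - base_year)

for year in (2015, 2020):
    print(f"{year}: {screen_hours(year):.1f} h/day screen time, "
          f"{energy_density(year):.0f} Wh/l")
```

Under these assumptions, screen time more than doubles by 2020 while energy density improves only about 20% -- exactly the mismatch that makes fast charging attractive.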

The answer, barring a surprise revolution in materials discovery, is fast charging -- unless consumers are willing to accept bulkier, thicker and heavier smartphones, a trend that shows no signs of life. But fast charging is not only the ability to put charge into a battery at a fast rate; it also requires the infrastructure supporting it, from the proper chipsets to AC adapters, cabling and so on. With new chipsets from manufacturers like Qualcomm supporting new high-power protocols (e.g., QC 2.0), new standards, in particular USB Type-C, and clever battery management and charging algorithms similar to the ones developed here at Qnovo, fast charging will become a standard in the coming few years, allowing the end user to charge multiple times in one day without the agony of having to wait hours for a charge.

© Qnovo, Inc. 2015 / @QNOVOcorp @nadimmaluf #QNOVOCorp

Monday, May 18, 2015


Fact: I am able to ride a significantly longer distance on my bicycle than I can drive in my electric vehicle, and it can be frustrating to see my electric vehicle run out of juice! So on a recent cycling trip I began to wonder: to what extent is my Ford Focus Electric less energy efficient than my bicycle? Or perhaps my electric vehicle was not well designed and its battery not appropriately sized? It was time to dig a little deeper.

Curiosity meant that I had to take the thought one step further: how would these two transportation modes compare with nature-provided bipedalism, and ultimately, with the modern gasoline-powered automobile? This is most likely a purely academic exercise in the sense that none of these transportation modes is meant to replace the others, at least not today; rather, the hope is that we can learn from one mode to improve our design and engineering methodologies.

So let's start by estimating the energy consumption of each of these transportation modes. First, I know from the dashboard of my electric vehicle that I have averaged 235 Watt-hours per mile (Wh/mi) over the past 20,000 miles of driving. I also know from my bicycle instrumentation that I am averaging approximately 50 kCal per mile -- granted, that is at a faster pace than most casual riders, but it still represents a good starting number. We also know that a medium-sized gasoline-powered sedan has an average fuel economy of approximately 25 miles per gallon (mpg). And lastly, a wide range of sports publications estimate that a running human burns about 100 kCal per mile.

Next, we need to harmonize these units for a useful comparison. I will spare you the math and give you the conversion factors. I assumed here the EPA's equivalent figure of 33,700 Wh in each gallon of gasoline. The factors that matter are:

1 kCal/mi = 1.162 Wh/mi = 0.00003449 Gal/mi
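These conversions are easy to check in a few lines of code. The per-mode figures below are the ones quoted above; the e-bike and Tesla Model S figures appear only in the table and are not repeated here:

```python
# Harmonizing the energy-per-mile figures quoted above into Wh/mi.
KCAL_TO_WH = 1.162        # 1 kCal is approximately 1.162 Wh
WH_PER_GALLON = 33_700.0  # EPA equivalence: Wh per gallon of gasoline

modes_wh_per_mile = {
    "Ford Focus Electric":  235.0,                 # dashboard average
    "bicycle (fast rider)": 50.0 * KCAL_TO_WH,     # 50 kCal/mi
    "running human":        100.0 * KCAL_TO_WH,    # 100 kCal/mi
    "gasoline sedan":       WH_PER_GALLON / 25.0,  # 25 mpg
}

for mode, wh in modes_wh_per_mile.items():
    print(f"{mode:22s} {wh:7.0f} Wh/mi")
```

The gasoline sedan lands at 1,348 Wh/mi, the bicycle at about 58 Wh/mi -- spanning more than a factor of 20 between the least and most efficient modes.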

Summarizing into one table, we get:

I took the liberty of adding electric bikes and the Tesla Model S to the mix, as well as an approximate gross weight for each mode, noting that the weight of the bikes does not include the weight of the rider. So what are the results telling us?

First, something we already knew or suspected: an electric car is about 4 to 5x more energy efficient than a sedan with a gasoline engine. This is primarily due to the fact that an electric motor is more than 90% efficient, whereas a gasoline engine is near the 20% mark. In other words, 80% of the energy in a gasoline engine is lost to heat, and only 20% is used for movement. Go EVs!

My bicycle is nearly 4x more efficient than my Ford Focus Electric. That is a little odd, because the human body is not a very efficient machine. So why is the bicycle powertrain more efficient than this ultra-efficient electric motor? The answer is weight! A bicycle with a rider weighs 1/20th the weight of an electric car. The Tesla Model S is considerably heavier than my EV and consequently consumes more energy. The next time you wonder why the Toyota Prius has better fuel economy than a regular sedan, think weight. Geez, we kind of knew that, didn't we?

But now we start making observations that are less intuitive. An e-bike is more efficient than a bike, which in itself is more efficient than a human running, yet all three are sufficiently close in weight. The human body is mechanically not very efficient, especially when compared to the power train of a bicycle. The rolling motion of a bicycle involves less friction than walking or running, and is thus more efficient. An e-bike replaces the rider's leg muscles with a more efficient electric motor -- though not as healthy!

So what does it all mean? First, electric-powered transportation is the way of the future -- as long as we are not getting the electricity from dirty coal-fired power stations. Second, shed the weight: the weight of your vehicle, and your own weight if you like to ride. Third, a banana provides a human being with about 105 kCal, or equivalently, about 120 Wh of energy. That can power a human rider on a bike for over 2 miles. No power generator can turn a banana into sufficient electricity for any useful purpose. In other words, while electricity looks green and clean, by the time we consider its cost of generation, both in dollars and in environmental impact, it probably cannot compete with a healthy banana. Besides, eating a banana beats all forms of fast charging... Be safe in all your travels!


Wednesday, May 13, 2015

64. ON MOORE'S LAW AND SNAIL'S LAW                                                      

It's all over the media: Moore's Law just turned 50! What is Moore's Law? It's more an observation than a law, but it has stuck around for 50 years now, so long that we think of it as a law, like gravity.

On 19 April 1965, Gordon Moore, at the time the head of R&D at Fairchild Semiconductor and later the CEO of Intel, made an observation turned prophecy. He predicted* that the number of components and transistors on an integrated circuit (IC, or chip) would double approximately every 18 months while holding the cost of the chip constant. In layman's terms, it means that the industry would be able to double the complexity, and hence the computational power, of these chips roughly every 1.5 years without increasing the cost of the function. And for 50 years, this prediction has held remarkably well and has been hailed by the tech and semiconductor industry.

Its implications are just spectacular. Next time you hold a smartphone in your hand, pause for a moment and think about the fact that it is thousands of times more capable than the Apollo Guidance Computer that landed Neil Armstrong and Buzz Aldrin on the moon. Moore's Law is in some ways the new Bible of the tech industry, with an implicit expectation that all new technologies ought to follow this trend. But is that true? And specifically, is it, and will it be, true of energy storage and battery systems?

Left: processor computational power, measured in MIPS, shown on a logarithmic scale. Right: battery energy density in Wh/l.

The answer in a nutshell is a big fat NO! Whereas history has shown that semiconductors follow Moore's Law, that same history shows that the trend in batteries is closer to the progress of gastropods -- hence, Snail's Law. The two figures above illustrate the difference. From 1995 to 2015, the computational power of processors made by Intel increased by a factor of 300, effectively doubling roughly every 2.5 years. In contrast, over that same period of 20 years, the energy density of lithium-ion batteries increased by 4X, or about 7% annually. No one disputes this fact, because most consumers complain about their batteries, and few, if any, complain about the processor or the electronics in their devices. But why is that? Both involve materials and manufacturing, yet the differences are stark.
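The implied growth rates can be worked out directly from the 300x and 4x figures above:

```python
import math

# Growth rates implied by the figures above: processors improved 300x
# and lithium-ion energy density 4x, both over the 20 years 1995-2015.
def doubling_time(total_factor, years):
    """Years per doubling implied by a total improvement factor."""
    return years * math.log(2) / math.log(total_factor)

def annual_rate(total_factor, years):
    """Compound annual growth rate implied by a total factor."""
    return total_factor ** (1 / years) - 1

print(f"processors: doubling every {doubling_time(300, 20):.1f} years")
print(f"batteries:  {annual_rate(4, 20):.1%} per year")
```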

It boils down to a balance between the laws of physics and economics. The laws of physics dictate the amount of technical improvement that is possible for a given scientific and/or engineering problem. In the case of semiconductors, these were the laws that governed the shrinking of transistor dimensions on a silicon chip. Back in 1965, these transistors did not operate anywhere near the fundamental limits of materials or equipment. So physics was not the limiting factor in this balance; economics was. In other words, the R&D and manufacturing costs associated with shrinking transistors had to increase at a lesser pace than the technical performance of these transistors. Under such circumstances, these added costs were amortized over rapidly increasing technical performance, and hence the benefit of Moore's Law: more performance for the same cost point. Said differently, shrink the dimensions more, get more benefits, and the equation becomes seemingly a virtuous circle... that is, until it starts to hit the limits of physics, at which point the balance tips -- something the industry may soon be facing.

For batteries, that balance between technical limits and economics was really never in place. First, the cost of R&D and manufacturing was not offset by increasing performance, in particular, energy density. In other words, every increase in energy density manifested itself initially as an increase in unit cost. So there really was never an equivalent to Moore's Law's cost-constancy. As a matter of fact, as we examine closely the economics of lithium-ion batteries over the past 20 years, we find that the cost of these batteries declines as a function of cumulative production volumes, not annual production volumes. This is a much slower cost curve and is partly responsible for why raising R&D investments for battery research does not make a lot of sense to battery manufacturers. There is more supporting evidence in the fact that battery vendors live on single-digit gross margins, whereas many semiconductor companies have gross margins close to 50% -- i.e., much better profitability.
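The cumulative-volume cost decline described above is the classic learning curve (sometimes called Wright's law): cost falls by a fixed fraction for every doubling of cumulative, not annual, production. A minimal sketch follows; the 15% learning rate is an illustrative assumption, not a figure from battery industry data:

```python
import math

# Learning-curve ("Wright's law") sketch: unit cost falls by a fixed
# fraction (the learning rate) for every doubling of *cumulative*
# production volume, however long those doublings take. The 15%
# learning rate is an assumption for illustration only.
def unit_cost(cumulative_units, first_unit_cost=1.0, learning_rate=0.15):
    b = -math.log2(1.0 - learning_rate)  # progress exponent
    return first_unit_cost * cumulative_units ** (-b)

# Each doubling of cumulative volume cuts the unit cost by 15%:
for n in (1, 2, 4, 8):
    print(f"cumulative volume {n}x -> relative cost {unit_cost(n):.3f}")
```

Because cumulative volume doubles quickly early in a product's life and slowly later, this curve flattens over time -- a far slower decline than the fixed annual cadence that Moore's Law delivered.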

Second, lithium-ion batteries are already hitting some serious material and physical limits. The presently used material systems in lithium-ion batteries seem to saturate right around 650-700 Wh/l. Going above these figures means higher R&D and manufacturing investments for new materials, and these costs are difficult to amortize.

The result is that energy density begins to level off, or improve at an increasingly slower rate. Yes, a breakthrough from a university research program or an innovative company may change that, but history has shown that such breakthroughs don't come from wishful thinking, but rather from years and billions of dollars of research -- both of which are becoming scarce in the battery field.

But that may not be such a bad thing. When technologies begin to level off, cost pressures rise immensely as better manufacturing methods are introduced and as more competitors, especially in low-cost geographies, join the fray. So that means costs will drop rapidly -- this perhaps may be the implicit corollary and inverse of Moore's Law. In mathematical form, we can anecdotally write this as Snail's Law = 1/[Moore's Law].

In summary, I believe that the battery industry is entering a new era of accelerating cost pressures, accompanied by a shift toward improving the performance of the entire battery system: the individual cells, the control electronics, algorithms and software. And that will bode well for companies that are skilled in this system-integration exercise, regardless of the application and end-market.

*You can read here Dr. Moore's original article from 1965 reprinted in the Proceedings of the IEEE


Tuesday, May 5, 2015


Tesla Motors announced its new battery pack for the residential market a few days back. Called the PowerWall, it will be offered through Tesla's sister company, SolarCity, to residential customers. Tesla says that this home battery will allow consumers to charge it using solar panels and power their homes at night. Alternatively, consumers can charge it at night when electricity prices are low, and use this stored energy during the peak-rate hours, often in the afternoon. In the parlance of utilities, this is called "peak shifting," i.e., the consumer gets to shift the load. In financial terms, this is called arbitrage: buy energy at the low price, then sell it (or use it) when the price is high.

Tesla's PowerWall is rated at 10 kWh with a smaller product version rated at 7 kWh. SolarCity advertises that the fully installed cost of the PowerWall is $7,140 + tax, which would be a total of $7,730.

So let's decipher these numbers and see whether they make sense to a residential home. To make the analysis simple, I will offer my own electricity usage as a model to determine whether our home is a candidate for the PowerWall. Fortunately, PG&E, our utility, offers through its website a detailed log of our electrical usage, hour by hour. So I downloaded our home usage data and analyzed our consumption of electrical energy as well as our rate -- in other words, the amount we pay for each kWh of electricity that we consume. For the analysis, I used the data for four representative months during 2014: January, April, July and October, or the first month in each quarter of the year. 

First, let's see how much electricity we utilized. In January, we used an average of 26 kWh per day; in April the daily average was 22 kWh, rising to 28 kWh in July, then dropping to 19 kWh per day in October. Again, these are daily averages over the entire month in consideration. So on average, it nets out to about 24 kWh per day across the four seasons. According to PG&E's analysis of our data, our electricity consumption ranks in the middle of the range of "similar homes." I am not sure what that really means, but I will take it as saying that our daily averages are good representations of many homes in our area.

Given our average daily utilization, let's amortize the cost of the PowerWall over the expected 10-year lifespan of the battery. Simple math gives us a cost of about $2.12 per day just for the battery capital cost, or about $0.09 /kWh -- again, just for the cost of the battery itself; it does not include the cost of the electricity to be stored.

Now let's assume that I would want to use this battery to offset my peak afternoon pricing by purchasing the electricity at night on the cheap, storing it in this battery, then using it during the peak hours. This arbitrage would necessitate that the rate differential from PG&E, i.e. the difference in electricity price between the peak rate and the low rate, ought to be more than the $0.09 / kWh just to break even. In reality, it ought to be substantially bigger for me to realize some meaningful savings. The bigger the differential, the more the savings. If the differential is small, then the economics of the battery will simply not make sense.
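The break-even arithmetic can be sketched directly. The figures are the ones above ($7,730 installed, a 10-year life, and roughly 24 kWh of average daily household consumption); note that, following the method above, the battery's capital cost is amortized over total daily consumption. The illustrative rates are the PG&E rates discussed below:

```python
# Amortizing the PowerWall's installed cost over its expected life,
# using the figures from the post ($7,730 installed, 10-year lifespan,
# ~24 kWh of average daily household consumption).
installed_cost = 7_730.0
lifespan_days = 10 * 365
daily_use_kwh = 24.0

cost_per_day = installed_cost / lifespan_days
cost_per_kwh = cost_per_day / daily_use_kwh
print(f"${cost_per_day:.2f}/day, ${cost_per_kwh:.2f}/kWh (battery only)")

# Arbitrage only breaks even when the peak/off-peak rate spread exceeds
# the amortized battery cost per kWh.
def margin(peak_rate, offpeak_rate):
    return (peak_rate - offpeak_rate) - cost_per_kwh

print(f"Jan/Apr ($0.20 vs $0.12): ${margin(0.20, 0.12):+.3f}/kWh")
print(f"Oct     ($0.35 vs $0.12): ${margin(0.35, 0.12):+.3f}/kWh")
```

With a $0.08 spread the margin is slightly negative; only the October spread of $0.23 leaves meaningful room after the battery's amortized cost.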

The next chart shows the actual rate ($/kWh) I paid to PG&E during the four months in consideration. Our plan is a time-of-use plan, which means our rates fluctuate during the day and across seasons. For the months of January, April and October, the lowest rate was about $0.12 /kWh. This is low for the day, but not as low as one may read in the papers about rates in Texas -- those can hit a low of $0.05 /kWh. My peak rate in January and April was a little over $0.20 /kWh, so the differential in those months was marginal and not very economically compelling. The peak rate in October climbed to $0.35 /kWh, which made the battery more interesting for that month.

But here's the real shocker. My low rate in July jumped up to $0.20 /kWh at midnight, nearly double what it was in January... ouch, PG&E! It's a shocker because the rates that the utility will charge me, both the low and high rates, are subject to change in the future. I have no control over these rates, yet by buying a PowerWall battery, I would commit myself to a period of 10 years to recoup my money using a mathematical formula that is almost guaranteed to change with time. So the risk is all mine, but my upside is minuscule and questionable over these 10 years. This is not an economic incentive. This is a recipe that gives Tesla and SolarCity a great upside opportunity while shifting the economic burden onto residential consumers. The math simply does not work out. Tesla needs to drop its battery prices further, and the utilities need to support this effort (i.e., not torpedo my potential savings by raising the lowest rates in the future) before I can feel that there is a stable financial incentive for me. Until then, I will not be buying a PowerWall any time soon. As Christopher Helman at Forbes accurately pointed out, this Tesla home battery is another toy for rich green people.
