A Deeper Look at Lithium Ion Cell Self-Discharge Measurement Method


by Ed Brorein, Keysight Technologies


What is a cell’s self-discharge? Self-discharge is the loss of charge over time while the cell is not connected to any load. Some amount of self-discharge is normal. However, excessive self-discharge is indicative of underlying problems within the cell that can potentially lead to catastrophic failure. In manufacturing, self-discharge is a critical parameter for which all lithium ion cells are screened.

There are two main methods for measuring self-discharge, as illustrated in Figure 1. First is the traditional delta open circuit voltage (OCV) method, where loss of the cell’s OCV is typically measured over days to weeks of time. The amount of loss in OCV is an indicator of the amount of self-discharge in the cell. While simple to implement, the delta OCV method takes a long time to yield a result. Second is the potentiostatic method. Here, the cell is maintained at a constant state of charge (SoC) by holding it at a fixed potential with a stable external DC source. Initially, all the self-discharge current is internally furnished by the cell. After settling for a couple of hours to reach equilibrium, all the self-discharge current is furnished by the external DC source, which can be directly measured. While more complex to implement, the potentiostatic method yields a result in much less time.

Figure 1: Self-discharge measurement methods

Both methods are impacted by a variety of factors, but by different degrees, depending on the factor. Properly performed, both methods yield valid and comparable results. By understanding how these measurement methodologies work, what governs their test time, and how a variety of factors impact them, one can achieve valid and consistent results, regardless of the method chosen.

What Determines Test Time?

What determines the test time for the delta OCV method can be better understood by referring to the illustrations in Figure 2. This method relies on measuring a very small voltage drop over time that rides on top of a very large DC offset, which is the cell’s OCV. This will dictate using a 10-volt measurement range on a digital voltmeter (DVM). The cell’s OCV loss rate is governed by the parallel combination of the cell’s effective capacitance, CEFF, and its internal self-discharge resistance, RSD, as shown on the right.

Figure 2: What determines the delta OCV method test time?

For this example, testing was performed on a group of NMC 18650 cylindrical cells. CEFF was on the order of 10,000 farads and RSD was on the order of tens of kiloohms, making the OCV loss rate extremely slow. The main population of cells had a self-discharge OCV drop of 1 mV or less over a 10-day period, while a small number of outlier cells with high self-discharge lost 2 mV or more over the same period, as shown in the graph on the left. Ten days was about the minimum time needed to obtain sufficient OCV drop with acceptable uncertainty and error due to several factors, including:

  • DVM 10-volt measurement range accuracy
  • Cell OCV temperature coefficient of voltage (TCV)
  • Thermal EMF errors generated by electrical contacts
  • DVM 10-volt range temperature coefficient
  • Ambient temperature differences between the initial and final OCV measurements
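
The arithmetic behind the slow loss rate can be sketched from the parallel R-C model in Figure 2: the self-discharge current is simply i = CEFF × dV/dt. The short Python sketch below is a minimal illustration using the example's order-of-magnitude values (CEFF ~10,000 F; ~1 mV drop over 10 days for a healthy cell); the exact numbers are illustrative, not measured data.

```python
# Estimate the cell's self-discharge current from a delta OCV
# measurement, using the parallel R-C model of Figure 2.
# Values are illustrative, based on the article's example
# (CEFF ~10,000 F; ~1 mV OCV drop over 10 days for a healthy cell).

C_EFF = 10_000.0            # effective cell capacitance, farads
DELTA_OCV = 1e-3            # measured OCV drop, volts
T_TEST = 10 * 24 * 3600     # test duration in seconds (10 days)

# For the R-C model, i = C * dV/dt, so the average self-discharge
# current over the test period is:
i_sd = C_EFF * DELTA_OCV / T_TEST
print(f"self-discharge current ~ {i_sd * 1e6:.1f} uA")
```

This shows why the test takes so long: microamp-level currents on a multi-thousand-farad capacitance produce only millivolt-scale drops over many days.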

Up to two weeks may be needed for a valid result using the traditional delta OCV method, depending on how well these factors can be controlled. The advantage of the delta OCV method is its simplicity and the ability to choose an optimum test time that yields sufficient OCV drop for an acceptable level of uncertainty and error. The main downside of the delta OCV method is the long test time, during which the cells lie dormant, increasing work-in-process (WIP). This requires more factory space and brings the hazards associated with storing large volumes of cells.

In contrast to the delta OCV method, what determines the test time for the potentiostatic method can be more easily understood by referring to Figure 3. Instead of the cell being open-circuited, it is now basically short-circuited by the external potentiostatic DC source, which is first set to exactly match the cell’s OCV before being connected. The measurement settling time is now determined by the parallel combination of the cell’s effective capacitance, CEFF, and the potentiostatic method setup’s series resistance setting, RSERIES, as shown on the right.

Figure 3: What determines the potentiostatic method test time?

In practice, RSERIES is in the range of an ohm or less. This is orders of magnitude smaller than the cell’s self-discharge resistance, RSD, making the potentiostatic method’s response time orders of magnitude faster than the delta OCV method’s. As shown in the graph on the left for the 18650 cells tested in this example, the time for the measurement to fully settle out was about two hours. Note, however, that cells with high self-discharge can often be discerned well before that, due to the clear separation between their measurements and those of the main population. Top factors impacting measurement uncertainty and error for the potentiostatic method include:

  • Cell temperature coefficient of voltage (TCV)
  • Potentiostatic DC source stability
  • Potentiostatic DC source temperature coefficient
  • Net thermal EMF changes
  • Ambient temperature change over the course of the test period
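
The settling described above follows the familiar exponential response of an R-C circuit, with time constant τ = RSERIES × CEFF. A minimal sketch, using the example's CEFF and an assumed RSERIES of 0.1 Ω (a value within the "an ohm or less" range, not a figure stated in the article):

```python
import math

# Exponential settling of the potentiostatic measurement, governed
# by tau = R_SERIES * C_EFF. C_EFF is the example's order-of-magnitude
# value; R_SERIES = 0.1 ohm is an assumed setting within the
# "an ohm or less" range given above.
C_EFF = 10_000.0    # farads
R_SERIES = 0.1      # ohms

tau = R_SERIES * C_EFF      # time constant, seconds (here 1,000 s)
for hours in (0.5, 1.0, 2.0):
    t = hours * 3600
    settled = 1.0 - math.exp(-t / tau)
    print(f"after {hours:.1f} h: {settled:.1%} settled")
```

With these assumed values the measurement is better than 99.9% settled at the two-hour mark, consistent with the settling time observed for the 18650 cells in the example.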

The main advantage of the potentiostatic method is that its test time is usually no more than a couple of hours in practice. Another advantage is that the measurement equipment’s accuracy and temperature coefficient are not significant factors, because the self-discharge measurement does not have a large offset to contend with, as in the delta OCV method. The absence of a large offset also allows cells with high self-discharge to be discerned well before the full measurement settling time has elapsed. Finally, the series resistance setting in the potentiostatic method setup can be adjusted to achieve an optimum balance between measurement response time and sensitivity to the factors given. On the downside, there are several tradeoffs in using the potentiostatic method to achieve greatly shortened test time, including:

  • Overall, it is a relatively complex setup in comparison to the delta OCV method setup
  • The external DC source must be very stable, typically drifting no more than a few microvolts over the test period
  • The external DC source must also have an extremely low temperature coefficient

The method is also sensitive to the cell’s TCV, which can be rather high depending on the cell’s state of charge (SoC). This often dictates maintaining very tight temperature control of the cell during the test period, typically to within a couple of tenths of a °C or less.

The microvolt-level performance of the potentiostatic DC source requires specialized equipment or a specialized setup to achieve. And although the potentiostatic method usually requires very tight temperature control of the cell under test, the test period is relatively short, so this can often be realized passively with some combination of insulation, thermal mass, and blocking of air drafts.

Impact of Charging or Discharging

Any charging or discharging creates a gradient in the cell’s charge that takes days to redistribute and return to an equilibrium state. This has a major impact on self-discharge measurements, regardless of the methodology used. It is especially important in manufacturing, as self-discharge testing follows right after cell formation, during which charging and discharging cycles are applied to a freshly assembled cell. Thus, the charge is not at equilibrium.

In the case of charging, the impact is that it adds a large, exponentially decaying offset to the OCV loss rate. The initial peak OCV loss rate depends on how heavily the cells were charged. When the cells are finally rested and their charge is at equilibrium again, the OCV loss rate becomes constant, due only to the cell’s self-discharge. Before the cells are fully rested, this translates to adding offset error to the delta OCV method measurement, making self-discharge appear larger than it is. Discharging has the same effect, but in the opposite direction. For the 18650 cells tested in the example here, they were subjected to moderate charging, which introduced the time-dependent OCV exponential loss rate illustrated in Figure 4.

Figure 4: Impact of charging on delta OCV method measurements

Even after nine days of rest following charging, charge redistribution still contributed about 30% of additional offset measurement error to the 10-day delta OCV test that followed, for the main population of cells tested in the example. For the potentiostatic method, the impact of charging or discharging is comparable to that on delta OCV method testing, requiring a comparable amount of rest time for charge redistribution to reach equilibrium.

A rest time of nine days is a long time to wait, increasing manufacturing costs. To greatly reduce this rest time after the formation process, manufacturers will usually resort to high-temperature aging, which accelerates the charge redistribution settling. As one example, aging at 40 °C as opposed to room temperature will typically yield about a 3-fold rest time reduction, cutting the rest in the given example from nine days to three.
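
The interplay of rest time and aging temperature can be sketched with a simple decaying-offset model of the loss rate in Figure 4. The sketch below is illustrative only: BASE, A0, and TAU_DAYS are assumed placeholder values, not figures from the article, but the 3x acceleration matches the rule of thumb above.

```python
import math

# Decaying charge-redistribution offset riding on the constant
# self-discharge OCV loss rate (as in Figure 4). BASE, A0, and
# TAU_DAYS are assumed illustrative values, not measured figures.
BASE = 0.1        # true self-discharge loss rate, mV/day
A0 = 1.0          # initial redistribution offset after charging, mV/day
TAU_DAYS = 4.0    # redistribution time constant at room temperature

def loss_rate(rest_days, tau=TAU_DAYS):
    """Apparent OCV loss rate after a given rest time."""
    return BASE + A0 * math.exp(-rest_days / tau)

# High-temperature aging accelerates settling ~3x (tau shrinks ~3x),
# so ~3 days at 40 C leaves the same residual offset error as
# ~9 days at room temperature.
room = loss_rate(9.0)
warm = loss_rate(3.0, tau=TAU_DAYS / 3.0)
print(f"residual offset, 9 d room rest: {room - BASE:.3f} mV/day")
print(f"residual offset, 3 d 40 C rest: {warm - BASE:.3f} mV/day")
```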

Additional Factors Impacting Self-Discharge Measurements

In addition to what has already been covered, other factors impacting self-discharge measurements to be considered include:

The cell’s % SoC: The cell’s SoC should ideally be between 30% and 80% for self-discharge testing. Self-discharge decreases toward zero as SoC approaches 0%. Conversely, self-discharge increases greatly at high SoC, especially above 80%.

The cell’s temperature: In contrast to temperature changes impacting the measurement setup and the cell’s TCV, absolute temperature impacts the cell’s self-discharge level. A cell’s self-discharge will roughly double for every 10 °C rise, or increase about 15% for a 2 °C rise. A 2 °C ambient change is typical of what will be experienced over a day or longer in a building.
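
The doubling-per-10 °C rule of thumb implies an exponential scaling with temperature, which can be checked with a one-liner:

```python
# Temperature scaling of self-discharge using the article's rule of
# thumb: self-discharge roughly doubles for every 10 C rise.
def sd_scale(delta_t_c: float) -> float:
    """Multiplicative change in self-discharge for a temperature rise."""
    return 2.0 ** (delta_t_c / 10.0)

print(f"+2 C  -> x{sd_scale(2):.3f}")    # ~15% increase
print(f"+10 C -> x{sd_scale(10):.3f}")   # doubles
```

A 2 °C rise gives 2^0.2 ≈ 1.149, matching the ~15% figure quoted above.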

To get consistent and comparable self-discharge measurement results, it is imperative that these factors be kept the same over time and from lot to lot.

Good Practices Consistently Yield Valid Results

By carefully managing the factors impacting measurements, one can consistently achieve valid self-discharge measurement results. This is true regardless of the method used for making self-discharge measurements. To illustrate this, the results of both methods used for self-discharge measurements on the cells tested in the example are plotted in the graph in Figure 5.

Figure 5: Correlating results between the two methods

The values measured on each of the cells with the potentiostatic method are plotted on the vertical axis, while the corresponding values from the delta OCV method are plotted on the horizontal axis. When a best-fit line was drawn through the points, they lined up well, with the line projecting back through the origin. This demonstrates excellent correlation between the two measurement methods, confirming that both achieve valid and consistent results.
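
A best-fit line constrained through the origin, as in Figure 5, has a simple closed form: slope = Σxy / Σx². The sketch below illustrates it with synthetic placeholder data (not the article's measurements):

```python
# Through-origin least-squares fit, as used to correlate the two
# methods in Figure 5. The data points below are synthetic
# placeholders, not the article's measurements.
delta_ocv_mv = [0.8, 1.0, 1.1, 0.9, 2.3, 2.6]      # delta OCV drop, mV
potentio_ua = [9.5, 11.8, 13.0, 10.6, 27.1, 30.7]  # potentiostatic, uA

# For a line y = m*x constrained through the origin, the
# least-squares slope is sum(x*y) / sum(x*x).
slope = sum(x * y for x, y in zip(delta_ocv_mv, potentio_ua)) \
        / sum(x * x for x in delta_ocv_mv)
print(f"best-fit slope through origin: {slope:.2f} uA per mV")
```

A fit that passes near the origin with a consistent slope, as described above, is what confirms the two methods are measuring the same underlying quantity.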


In closing, self-discharge is an important parameter that all lithium ion cells are screened for in manufacturing as excess self-discharge is indicative of underlying problems in the cell that could lead to catastrophic failure.

The two main methods of measuring self-discharge on lithium ion cells are the traditional delta OCV method and the potentiostatic method. Each has its own unique advantages and disadvantages. The delta OCV method, while simple to implement and perform, takes days to weeks to yield results. This greatly increases WIP, requiring more factory space with the associated hazards of storing large volumes of cells. In comparison, the potentiostatic method has a test time of no more than a couple of hours, but is more complex to implement and requires tight temperature control of the cells under test.

This article has shown how these methodologies work, what governs their test times, and how different factors impact their results. By understanding these things, one can be confident of achieving valid and consistent self-discharge measurement results, regardless of the methodology chosen.

About the Author

Ed Brorein

Ed Brorein started at Hewlett Packard in 1979. During his 42 years with HP, Agilent, and now Keysight, Ed has been an engineer in a variety of roles in R&D, manufacturing, and finally marketing, helping customers with the application of DC power products and test systems for a variety of electronic devices. Presently, Ed is responsible for helping customers with Keysight’s battery and cell testing solutions, gaining application insights into this industry.

Ed holds a BSEE degree from Villanova University in PA, and an MSEE degree from the New Jersey Institute of Technology.