Article

Attention-Based Long Short-Term Memory Recurrent Neural Network for Capacity Degradation of Lithium-Ion Batteries

Tadele Mamo and Fu-Kwun Wang

1 Department of Mechanical Engineering, Mettu University, Mettu P.O. Box 318, Ethiopia
2 Department of Industrial Management, National Taiwan University of Science and Technology, Taipei 10607, Taiwan
* Author to whom correspondence should be addressed.
These authors contributed equally to this work.
Submission received: 3 August 2021 / Revised: 1 September 2021 / Accepted: 30 September 2021 / Published: 13 October 2021
(This article belongs to the Special Issue Circular Battery Technologies)

Abstract

Monitoring cycle life can provide a prediction of the remaining battery life. To improve the prediction accuracy of lithium-ion battery capacity degradation, we propose a hybrid long short-term memory recurrent neural network model with an attention mechanism. The hyper-parameters of the proposed model are also optimized by a differential evolution algorithm. Using public battery datasets, the proposed model is compared with several published models and gives better prediction performance in terms of mean absolute percentage error and root mean square error. In addition, the proposed model achieves higher prediction accuracy for the battery end of life.

1. Introduction

Lithium-ion (Li-ion) batteries are widely used in backup power supplies, portable communication equipment, consumer electronics, electric vehicles, and energy storage systems [1]. As the number of charge/discharge cycles increases, the battery capacity gradually decreases. Two central problems are therefore to diagnose the state-of-health (SOH) of a battery and to predict its remaining useful life (RUL) [1,2,3,4]. The RUL of a Li-ion battery is the length of time from the current time to the end of life (EOL), where EOL is commonly defined as the point at which the capacity has dropped to approximately 70–80% of the nominal capacity [3]. In other words, once the capacity falls to this pre-defined threshold, the battery is considered to have reached its aging limit. RUL prediction is therefore one of the main tasks of a battery management system.
There are three types of RUL prediction methods: model-based methods, data-driven methods, and hybrid methods [1,2,3,4]. Model-based methods, which include physical and mathematical models, describe the battery circuit or electrochemical degradation using observable quantities and other key indicators. Data-driven methods build regression models from large amounts of data; they can be grouped into artificial intelligence (AI) or machine learning, filtering processes, statistical methods, and stochastic processes. Hybrid methods combine two or more model-based or data-driven methods to improve the accuracy of RUL prediction. Kim et al. [5] proposed an SOH classification method based on a multilayer perceptron (MLP). Zhang et al. [6] proposed a long short-term memory (LSTM) recurrent neural network for the RUL prediction of lithium-ion batteries, while Li et al. [7] combined an LSTM model with an empirical mode decomposition algorithm and an Elman neural network for battery RUL prediction. Ren et al. [8] implemented another deep learning approach for lithium-ion battery RUL prediction. Similarly, Yang et al. [9] implemented an improved extreme learning machine algorithm for RUL prediction. Support vector machines have also been applied to battery SOH and RUL prediction in different ways. For instance, Wei et al. [10] used a hybrid model based on a particle filter and support vector regression (SVR) to predict battery SOH and RUL. Wang et al. [11] applied SVR with a differential evolution algorithm to predict battery RUL, using cycle number, current, and voltage as input features. Yang et al. [12] proposed SVR for battery SOH prediction. Zhao et al. [13] presented a data-driven method that combines feature vector selection (FVS) with an SVR model for battery SOH and RUL prediction, using the time interval of an equal charging voltage difference as the health indicator. Wang et al. [14] used the constant-voltage charging profile as a health indicator for lithium-ion battery RUL prediction. Moreover, other machine learning models have been applied to battery SOH estimation, such as relevance vector machines (RVM) [15,16,17], Gaussian process regression (GPR) [18,19,20,21], Gaussian process models [22], random forest regression (RFR) [23], and gradient boosted regression (GBR) [24,25]. This study focuses on a hybrid LSTM model with an attention mechanism, as it can give better prediction accuracy.
Deep learning models such as LSTM and the gated recurrent unit (GRU) have received considerable attention in several research fields because they mitigate the vanishing gradient problem of the traditional recurrent neural network (RNN). The long-term memory units of these models store long-term information in their cell states, whereas the traditional RNN was limited in this respect before these architectures were developed. LSTM and GRU models have therefore largely replaced traditional RNNs; they use gates to control the flow of input and output information and thus avoid the gradient vanishing of the traditional RNN. The LSTM is one of the most widely used techniques for time-series prediction and has three gates: an input gate, an output gate, and a forget gate. Its network can memorize longer sequences and manage longer dependencies in order to converge on specific problems. However, the network cannot fully memorize long-term information or state and transfer it to the next LSTM unit, which makes it difficult to avoid long-term forgetting in the LSTM model. Therefore, an LSTM model alone cannot give adequate accuracy for continuing prediction. Recently, some researchers have combined the LSTM model with an attention mechanism to improve its information-processing capability and obtain better prediction accuracy. An attention mechanism allows the network to selectively focus on more valuable information, and attention mechanisms have quickly spread to various fields, including time-series prediction. Therefore, we develop a long short-term memory recurrent neural network model with an attention mechanism to analyze the capacity degradation of lithium-ion batteries. The proposed model has two parts: the LSTM model and the attention model. The attention mechanism is placed on the output layer of the LSTM and is used to model long-term dependencies. At the same time, a differential evolution (DE) algorithm is used to obtain the optimal hyper-parameters of the model. The performance of the proposed method for capacity degradation and battery end-of-life prediction is studied using four public battery datasets. The rest of this article is organized as follows. Section 2 introduces the lithium-ion battery datasets. Section 3 describes the proposed model. Section 4 presents the analysis of the capacity degradation estimates and RUL predictions. Finally, Section 5 concludes and outlines future research directions.

2. Lithium-Ion Battery Datasets

A total of sixteen batteries from four different sources are used to compare the predictive capabilities of the models. Four 18650-size rechargeable batteries come from the NASA battery data [26], and three LiCoO2 cathode cells come from the CALCE battery data [7,27]. In addition, four pouch-shaped cells come from the Oxford battery degradation dataset [28], and five commercial lithium iron phosphate/graphite cells come from the Toyota data [29]. Table 1 lists some battery specifications; detailed experimental settings can be found in the literature [25,26,27,28,29]. Each experiment is stopped when the capacity drops to approximately 70–80% of the rated capacity. The battery capacity over the cycle life is normalized as
$$C_{\text{normalized}} = \frac{C_{\text{current}}}{C_{\text{nominal}}} \tag{1}$$
where $C_{\text{normalized}}$ represents the SOH, $C_{\text{current}}$ is the capacity of the battery at the current cycle, and $C_{\text{nominal}}$ is the rated capacity of the battery (see the last column of Table 1). The capacity of the battery is affected by the loss of available lithium ions and the loss of anode and cathode materials. Two variables, cycle number and temperature, are used as model inputs, except for the CALCE data; since temperature measurements are not available for the CALCE data, only the cycle number is used as an input variable.
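As a minimal illustration of Equation (1), the following Python snippet (not part of the original study) normalizes a capacity sequence by the rated capacity; the capacity values are invented for illustration and are not taken from the datasets.

```python
import numpy as np

def normalized_capacity(capacity_ah, nominal_ah):
    """State of health (SOH) as current capacity over rated capacity, Eq. (1)."""
    return np.asarray(capacity_ah, dtype=float) / nominal_ah

# Illustrative values for a cell with a 2.0 Ah rated capacity (cf. the NASA cells in Table 1).
soh = normalized_capacity([1.86, 1.72, 1.60, 1.41], nominal_ah=2.0)
print(soh)  # [0.93  0.86  0.8  0.705]
```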

3. Long Short-Term Memory with Attention Mechanism

The attention-based LSTM model has two parts: an LSTM layer and an attention layer. The LSTM network can learn and manage longer sequences and dependencies in order to converge on specific solutions. The three nonlinear gating units of the long short-term memory recurrent neural network are the forget gate, the input gate, and the output gate [30]. The purpose of the storage unit in the LSTM network is to decide when to acquire new information and when to forget old information. To allow the network to focus on valuable selected information, an attention layer is combined with the LSTM model in this study. The idea of an attention-based LSTM model is to add an attention layer to the output layer of the LSTM unit in order to model long-term dependencies in the network; it can also control importance-based sampling. Therefore, this study considers an LSTM model with an attention mechanism for the prediction of capacity degradation trends. Equation (2) gives the input gate of the network, which controls how much new memory is added. Equation (3) regulates the amount of memory discarded by the forget gate. Equation (6) moderates the level of the output memory in the output gate, and finally the LSTM computes the hidden state $h_i$ and the cell state $c_i$:
$$i_i = \sigma\left(W_i \cdot [h_{i-1}, x_i] + b_i\right) \tag{2}$$
$$f_i = \sigma\left(W_f \cdot [h_{i-1}, x_i] + b_f\right) \tag{3}$$
$$\tilde{c}_i = \tanh\left(W_c \cdot [h_{i-1}, x_i] + b_c\right) \tag{4}$$
$$c_i = f_i \odot c_{i-1} + i_i \odot \tilde{c}_i \tag{5}$$
$$o_i = \sigma\left(W_o \cdot [h_{i-1}, x_i] + b_o\right) \tag{6}$$
$$h_i = o_i \odot \tanh(c_i) \tag{7}$$
where $\sigma$ and $\tanh$ are the sigmoid and hyperbolic tangent activation functions, respectively; $\odot$ denotes element-wise multiplication; $[h_{i-1}, x_i]$ is the concatenation of $h_{i-1}$ and $x_i$; $W_i$, $W_f$, $W_c$, and $W_o$ are the learnable weight parameters; and $b_i$, $b_f$, $b_c$, and $b_o$ are the learnable bias parameters. Moreover, an attention mechanism is used to improve the accuracy of the LSTM model [31]. The attention layer helps select the critical outputs of the earlier layers for each subsequent phase in the model, allowing the network to focus on specific important information. The attention model is defined as
$$M = \tanh\left(W_h H + W_u u_a \otimes e_n\right) \tag{8}$$
$$\alpha = \mathrm{softmax}\left(w^{T} M\right) \tag{9}$$
$$r = H \alpha^{T} \tag{10}$$
where $H$ is the matrix of extracted features $[l_{t1}, l_{t2}, \ldots, l_{tn}]$, $e_n \in \mathbb{R}^n$ is a vector, $u_a$ is the attention embedding, $\alpha$ is the vector of attention weights over the extracted features $H$, and $r$ is the final output of the attention model, i.e., the weighted sum of the extracted features $H$. The embedding is learned during model training. Figure 1 summarizes the proposed attention-based LSTM model for predicting the capacity degradation trend of lithium-ion batteries.
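To make the data flow of the proposed architecture concrete, the following NumPy sketch passes one short sequence through Equations (2)–(10): an LSTM step per cycle followed by the attention layer that pools the hidden states into a context vector. The weights, dimensions, and the final dense mapping to SOH are random placeholders rather than the trained parameters of the proposed model.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_step(x_i, h_prev, c_prev, W, b):
    """One LSTM step following Eqs. (2)-(7); W and b hold the gate
    weights/biases keyed by 'i', 'f', 'c', 'o' (shapes are illustrative)."""
    z = np.concatenate([h_prev, x_i])          # [h_{i-1}, x_i]
    i_g = sigmoid(W['i'] @ z + b['i'])         # input gate, Eq. (2)
    f_g = sigmoid(W['f'] @ z + b['f'])         # forget gate, Eq. (3)
    c_tilde = np.tanh(W['c'] @ z + b['c'])     # candidate memory, Eq. (4)
    c_i = f_g * c_prev + i_g * c_tilde         # cell state, Eq. (5)
    o_g = sigmoid(W['o'] @ z + b['o'])         # output gate, Eq. (6)
    h_i = o_g * np.tanh(c_i)                   # hidden state, Eq. (7)
    return h_i, c_i

def attention_pool(H, W_h, W_u, u_a, w):
    """Attention over the LSTM hidden states, following Eqs. (8)-(10);
    H has shape (d, n), one column per time step."""
    n = H.shape[1]
    M = np.tanh(W_h @ H + np.outer(W_u @ u_a, np.ones(n)))  # Eq. (8)
    scores = w @ M
    alpha = np.exp(scores - scores.max())
    alpha = alpha / alpha.sum()                              # softmax, Eq. (9)
    r = H @ alpha                                            # weighted sum, Eq. (10)
    return r, alpha

# Toy forward pass: d = 8 hidden units, n = 5 time steps, p = 2 input
# features (cycle number and temperature), random untrained parameters.
rng = np.random.default_rng(0)
d, n, p = 8, 5, 2
W = {k: rng.normal(scale=0.1, size=(d, d + p)) for k in 'ifco'}
b = {k: np.zeros(d) for k in 'ifco'}
h, c, H = np.zeros(d), np.zeros(d), np.zeros((d, n))
x_seq = rng.random((n, p))
for t in range(n):
    h, c = lstm_step(x_seq[t], h, c, W, b)
    H[:, t] = h
r, alpha = attention_pool(H, rng.normal(size=(d, d)), rng.normal(size=(d, d)),
                          rng.normal(size=d), rng.normal(size=d))
soh_pred = float(rng.normal(size=d) @ r)  # a dense output layer would map r to SOH
```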
The support vector machine is a common machine learning model for prediction, classification, clustering, and other learning tasks. In this study, support vector regression (SVR) is used as a comparison model for predicting the capacity degradation trend. Let the training set be $\{(x_1, y_1), \ldots, (x_n, y_n)\}$, where $x_i \in \mathbb{R}^n$ is a feature vector and $y_i \in \mathbb{R}$ is the target output. The SVR function is given by $y_i = f(x_i) = w^{T}\phi(x_i) + b$ and
$$f(x) = \sum_{i=1}^{l} \left(\beta_i - \beta_i^{*}\right) K(x_i, x_j) + b \tag{11}$$
where $\phi(x)$ is a nonlinear mapping function, $w \in \mathbb{R}^n$ and $b \in \mathbb{R}$ are adjustable coefficients, and $K(x_i, x_j) = \exp\left(-\gamma \lVert x_i - x_j \rVert^2\right)$ is the Gaussian radial basis function kernel. The SVR hyper-parameters gamma ($\gamma$), cost ($C$), and epsilon ($\varepsilon$) are optimized by the DE algorithm in this study.
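For reference, a scikit-learn sketch of the RBF-kernel SVR comparison model is given below; the training arrays and the hyper-parameter values (which are tuned by the DE algorithm in this study) are placeholders.

```python
import numpy as np
from sklearn.svm import SVR

# Hypothetical training data: X holds (cycle number, temperature), y holds SOH.
rng = np.random.default_rng(0)
cycles = np.arange(1, 81)
X_train = np.column_stack([cycles, 24 + rng.normal(scale=0.5, size=cycles.size)])
y_train = 1.0 - 0.002 * cycles

# RBF-kernel SVR corresponding to Eq. (11); gamma, C, and epsilon are placeholders.
svr = SVR(kernel="rbf", gamma=0.1, C=10.0, epsilon=0.001)
svr.fit(X_train, y_train)
soh_pred = svr.predict(X_train)
```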
The MLP model is a feed-forward artificial neural network for classification or regression problems. A seven-layer MLP network, consisting of an input layer, five hidden layers, and an output layer, is considered in this study. The number of neurons, the dropout rate, the number of epochs, and the batch size of the MLP model are optimized by the DE algorithm.
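A possible Keras sketch of the seven-layer MLP (input layer, five hidden layers, output layer) is shown below; the number of neurons, dropout rate, epochs, and batch size are exactly the quantities tuned by the DE algorithm, so the values here are placeholders only.

```python
from tensorflow import keras
from tensorflow.keras import layers

def build_mlp(n_features=2, n_neurons=64, dropout=0.2):
    """Seven-layer MLP: input layer, five hidden layers, and an output layer."""
    model = keras.Sequential([keras.Input(shape=(n_features,))])
    for _ in range(5):
        model.add(layers.Dense(n_neurons, activation="relu"))
        model.add(layers.Dropout(dropout))
    model.add(layers.Dense(1))                   # predicted SOH
    model.compile(optimizer="adam", loss="mae")  # MAE matches the DE fitness function
    return model

# Example usage: mlp = build_mlp(); mlp.fit(X_train, y_train, epochs=200, batch_size=16)
```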
The hyper-parameters of the proposed model are likewise optimized by the DE algorithm, a search heuristic introduced by Storn and Price [33]. In this study, the DEoptim R library is used, which has parameters such as NP, F, and CR. NP is the number of parameter vectors in the population; at generation zero, the algorithm guesses candidate parameter values at random between the lower and upper bounds of each parameter. The factor F is a positive value between zero and one that scales the mutation. During crossover, components of the mutant vector are copied until either the full vector length has been processed or a random number exceeds the crossover probability (CR), which also lies between zero and one. The choices of NP, F, and CR depend on the specific problem. The following procedure is used to obtain the optimal parameters of the proposed, SVR, and MLP models with the DE algorithm (a code sketch is given after the steps).
Step 1. Normalize all features, such as capacity, cycle, and temperature.
Step 2. Choose the fitness function, which is mean absolute error (MAE) in this study. It can be obtained by
$$\mathrm{MAE} = \frac{\sum_{i=1}^{n} \left| C_i - \hat{C}_i \right|}{n} \tag{12}$$
where $C_i$ is the actual SOH at cycle $i$, $\hat{C}_i$ is the predicted SOH at cycle $i$, and $n$ is the number of cycles used in the calculation.
Step 3. Select the ranges for the model parameters.
Step 4. Decide the values of DE parameters. NP = 40, CR = 0.9, and F = 0.8 are used in this study.
Step 5. Obtain the optimal values for each parameter.
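The study itself uses the DEoptim package in R; purely as an illustration of Steps 1–5, the sketch below uses SciPy's differential evolution to tune the SVR hyper-parameters with MAE as the fitness function. Note that SciPy's popsize is a per-parameter multiplier rather than DEoptim's absolute NP, so only F = 0.8 (mutation) and CR = 0.9 (recombination) carry over directly, and the data split is a placeholder.

```python
import numpy as np
from scipy.optimize import differential_evolution
from sklearn.metrics import mean_absolute_error
from sklearn.svm import SVR

# Hypothetical data: X holds (cycle number, temperature), y holds normalized SOH.
rng = np.random.default_rng(0)
cycles = np.arange(1, 121)
X = np.column_stack([cycles, 24 + rng.normal(scale=0.5, size=cycles.size)])
y = 1.0 - 0.002 * cycles
X_tr, y_tr, X_val, y_val = X[:80], y[:80], X[80:], y[80:]

def mae_fitness(params):
    """Step 2: MAE (Eq. (12)) of an SVR fitted with a candidate (gamma, C, epsilon)."""
    gamma, C, epsilon = params
    model = SVR(kernel="rbf", gamma=gamma, C=C, epsilon=epsilon)
    model.fit(X_tr, y_tr)
    return mean_absolute_error(y_val, model.predict(X_val))

# Steps 3-4: parameter ranges and DE settings (mutation = F, recombination = CR).
result = differential_evolution(
    mae_fitness,
    bounds=[(1e-3, 1.0), (0.1, 100.0), (1e-4, 0.1)],  # gamma, C, epsilon
    mutation=0.8, recombination=0.9, seed=1)
print(result.x)  # Step 5: optimal (gamma, C, epsilon)
```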

4. Analysis Results

4.1. Model Performance for Prediction of Capacity Degradation Trend

Four Li-ion batteries from the NASA data, #5, #6, #7, and #18, are used to illustrate the effectiveness of the proposed model for SOH estimation. We compare the prediction accuracy of the proposed model with that of the SVR and MLP models, as indicated in Table 2. The first 80 cycles of each battery are used as training data, and the remaining cycles are used as test data. The mean absolute percentage error (MAPE) and root mean square error (RMSE) are chosen to evaluate the prediction accuracy of the models and are obtained as
$$\mathrm{MAPE} = \frac{\sum_{i=1}^{n} \left| (C_i - \hat{C}_i)/C_i \right|}{n} \times 100 \tag{13}$$
$$\mathrm{RMSE} = \sqrt{\frac{1}{n} \sum_{i=1}^{n} \left( C_i - \hat{C}_i \right)^2} \tag{14}$$
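For reference, a direct NumPy implementation of the two criteria in Equations (13) and (14) could look as follows.

```python
import numpy as np

def mape(actual, predicted):
    """Mean absolute percentage error, Eq. (13)."""
    actual, predicted = np.asarray(actual, float), np.asarray(predicted, float)
    return np.mean(np.abs((actual - predicted) / actual)) * 100

def rmse(actual, predicted):
    """Root mean square error, Eq. (14)."""
    actual, predicted = np.asarray(actual, float), np.asarray(predicted, float)
    return np.sqrt(np.mean((actual - predicted) ** 2))
```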
Table 2 shows that the proposed model performs better than the SVR and MLP models in terms of MAPE and RMSE on the test data. For battery #5, the RMSEs of SVR, MLP, and LSTM with attention are 0.0123, 0.0174, and 0.0078, respectively, so the proposed model is the best model, followed by the SVR model. Moreover, the performance of the models trained with cycles #1–#80 for batteries #5 and #6 is shown in Figure 2a,b, which confirms that the proposed model has better prediction accuracy than the SVR and MLP models.
We investigate the effects of capacity regeneration by using three different ranges of training data: cycles #1–#80, #1–#100, and #1–#120. Table 3 shows the RMSE values of batteries #5, #6, #7, and #18 under the three corresponding test sets. For instance, the RMSE values of the proposed model on battery #5 under the three test sets are 0.0078, 0.0047, and 0.0058, respectively. The results show that the proposed model is not significantly affected by the choice of training dataset, so we can conclude that it is a robust approach.
Pouch-shaped batteries were used in [20] to study the performance of a fusion method based on wavelet de-noising (WD) and a hybrid Gaussian process function regression (HGPFR) model under different training data. Here, three training ranges for Cell-1 and Cell-7, cycles 100–3000, 100–3500, and 100–4000, are considered. Table 4 lists the RMSE values on the corresponding test datasets to compare the proposed model with the published methods. For example, the RMSEs of GPR, HGPFR, WD-HGPFR, SVR, MLP, and LSTM with attention for Cell-1 under training data of cycles 100–3000 are 0.0600, 0.0408, 0.0108, 0.0085, 0.0101, and 0.0030, respectively. The results show that the LSTM with attention model provides higher prediction accuracy than the three models in [20] as well as the SVR and MLP models.
In addition, the performance of the proposed model on unseen datasets is validated using other cells from the same experiments. For example, batteries #5 and #6 are used as training data to predict the SOH of batteries #7 and #18. Table 5 shows the prediction accuracy of the models on these test data. The proposed model also outperforms the SVR and MLP models on unseen data; therefore, we can conclude that the proposed model provides better performance for unseen data.

4.2. Battery EOL Prediction

When the capacity drops to 70% or 80% of the rated capacity, the battery is regarded as having reached EOL. In this study, 70% of the rated capacity is used as the EOL for batteries #18, CS2_37, and CS2_38, and 75% of the rated capacity is used for battery #7. Note that the capacity of battery #6 is greater than 2.0 Ah in cycles 1–7, so its training data start after cycle #8. For Cell-7 from the Oxford data, 80% of the rated capacity is considered as the battery EOL. Moreover, 85% of the rated capacity is regarded as the EOL for the batteries from the Toyota data, because if 80% were used, the predicted EOL of all models would be similar (i.e., the end of the test). Table 6 shows the performance of the models for battery EOL prediction on the unseen datasets. The first cycle number is used as the starting point for all test data, and EOL is expressed as a cycle number. To evaluate the prediction performance of the models, we use the relative error (RE) as a performance measure, which is obtained as
$$\mathrm{RE}(\%) = \frac{\left| R - \hat{R} \right|}{R} \times 100 \tag{15}$$
where $R$ represents the actual EOL value and $\hat{R}$ represents the predicted EOL value. The results indicate that the LSTM with attention model is better than the SVR and MLP models in most cases on the unseen datasets. Therefore, we can conclude that the LSTM with attention model provides better prediction performance for battery EOL prediction.
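As a small sketch of how the EOL cycle and its relative error can be computed from a predicted SOH trajectory, the following snippet is illustrative only; the threshold is passed as an argument, and the example numbers reproduce the NASA #18 row of Table 6.

```python
import numpy as np

def predicted_eol(soh_curve, threshold):
    """First cycle at which the SOH trajectory drops below the EOL threshold.
    soh_curve is assumed to start at cycle 1; returns a cycle number or None."""
    below = np.nonzero(np.asarray(soh_curve) < threshold)[0]
    return int(below[0]) + 1 if below.size else None

def relative_error(actual_eol, pred_eol):
    """Relative error of the EOL prediction, Eq. (15)."""
    return abs(actual_eol - pred_eol) / actual_eol * 100

# Example with the actual and predicted EOL of NASA battery #18 (Table 6):
print(round(relative_error(97, 98), 2))  # 1.03
```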

5. Conclusions

Cycle life prediction plays a vital role in a battery management system. In this study, we propose an LSTM model with an attention mechanism to analyze the capacity degradation of Li-ion batteries. In addition, the DE algorithm is used to obtain the optimal hyper-parameters of the SVR, MLP, and LSTM with attention models. Using four batteries from the NASA data and two cells from the Oxford data, the proposed model performs better than the SVR and MLP models for the prediction of the capacity degradation trend in terms of the MAPE and RMSE criteria. Moreover, we found that the proposed model is not significantly affected by different training datasets, and it can accurately predict the SOH and EOL of the battery for unseen datasets. Therefore, we can conclude that an LSTM model with an attention mechanism can produce more accurate and reliable results.
In future work, we will study other hybrid artificial intelligence models for battery state-of-health estimation and RUL prediction.

Author Contributions

T.M. raised the ideas with F.-K.W.; T.M. wrote the R code and analyzed the data under the advice of F.-K.W.; together they wrote and edited the manuscript. All authors have read and agreed to the published version of the manuscript.

Funding

There is no external funding for this research.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

The battery datasets analyzed in this study are publicly available online; see Refs. [7,26,27,28,29].

Conflicts of Interest

The authors declare no conflict of interest.

References

1. Barre, A.; Deguilhem, B.; Grolleau, S.; Gerard, M.; Suard, F.; Riu, D. A review on lithium-ion battery ageing mechanisms and estimations for automotive applications. J. Power Sources 2013, 241, 680–689.
2. Wu, L.; Fu, X.; Guan, Y. Review of the remaining useful life prognostics of vehicle lithium-ion batteries using data-driven methodologies. Appl. Sci. 2016, 6, 166.
3. Su, C.; Chen, H.J. A review on prognostics approaches for remaining useful life of lithium-ion battery. IOP Conf. Ser. Earth Environ. Sci. 2017, 93, 012040.
4. Omariba, Z.B.; Zhang, L.; Sun, D. Review on health management system for lithium-ion batteries of electric vehicles. Electronics 2018, 7, 72.
5. Kim, J.; Yu, J.; Kim, M.; Kim, K.; Han, S. Estimation of Li-ion battery state of health based on multilayer perceptron: As an EV application. IFAC-PapersOnLine 2018, 51, 392–397.
6. Zhang, Y.; Xiong, R.; He, H.; Pecht, M. Long short-term memory recurrent neural network for remaining useful life prediction of lithium-ion batteries. IEEE Trans. Veh. Technol. 2018, 67, 5695–5705.
7. Li, X.; Zhang, L.; Wang, Z.; Dong, P. Remaining useful life prediction for lithium-ion batteries based on a hybrid model combining the long short-term memory and Elman neural networks. J. Energy Storage 2019, 21, 510–518.
8. Ren, L.; Zhao, L.; Hong, S.; Zhao, S.; Wang, H.; Zhang, L. Remaining useful life prediction for lithium-ion battery: A deep learning approach. IEEE Access 2018, 6, 50587–50598.
9. Yang, J.; Peng, Z.; Wang, H.; Yuan, H.; Wu, L. The remaining useful life estimation of lithium-ion battery based on improved extreme learning machine algorithm. Int. J. Electrochem. Sci. 2018, 13, 4991–5004.
10. Wei, J.; Dong, G.; Chen, Z. Remaining useful life prediction and state of health diagnosis for lithium-ion batteries using particle filter and support vector regression. IEEE Trans. Ind. Electron. 2018, 65, 5634–5643.
11. Wang, F.K.; Mamo, T. A hybrid model based on support vector regression and differential evolution for remaining useful lifetime prediction of lithium-ion batteries. J. Power Sources 2018, 401, 49–54.
12. Yang, D.; Wang, Y.; Pan, R.; Chen, R.; Chen, Z. State-of-health estimation for the lithium-ion battery based on support vector regression. Appl. Energy 2018, 227, 273–283.
13. Zhao, Q.; Qin, X.; Zhao, H.; Feng, W. A novel prediction method based on the support vector regression for the remaining useful life of lithium-ion batteries. Microelectron. Reliab. 2018, 85, 99–108.
14. Wang, Z.; Zeng, S.; Guo, J.; Qin, T. Remaining capacity estimation of lithium-ion batteries based on the constant voltage charging profile. PLoS ONE 2018, 13, e0200169.
15. Zhang, Y.; Guo, B. Online capacity estimation of lithium-ion batteries based on novel feature extraction and adaptive multi-kernel relevance vector machine. Energies 2015, 8, 12439–12457.
16. Liu, D.; Zhou, J.; Pan, D.; Peng, Y.; Peng, X. Lithium-ion battery remaining useful life estimation with an optimized relevance vector machine algorithm with incremental learning. Measurement 2015, 63, 143–151.
17. Song, Y.; Liu, D.; Hou, Y.; Yu, J.; Peng, Y. Satellite lithium-ion battery remaining useful life estimation with an iterative updated RVM fused with the KF algorithm. Chin. J. Aeronaut. 2018, 31, 31–40.
18. Yang, D.; Zhang, X.; Pan, R.; Wang, Y.; Chen, Z. A novel Gaussian process regression model for state-of-health estimation of lithium-ion battery using charging curve. J. Power Sources 2018, 384, 387–395.
19. Yu, J. State of health prediction of lithium-ion batteries: Multiscale logic regression and Gaussian process regression ensemble. Reliab. Eng. Syst. Saf. 2018, 174, 82–95.
20. Peng, Y.; Hou, Y.; Song, Y.; Pang, J.; Liu, D. Lithium-ion battery prognostics with hybrid Gaussian process function regression. Energies 2018, 11, 1420.
21. Richardson, R.R.; Birkl, C.R.; Osborne, M.A.; Howey, D. Gaussian process regression for in-situ capacity estimation of lithium-ion batteries. IEEE Trans. Ind. Inform. 2019, 15, 127–138.
22. Li, F.; Xu, J. A new prognostics method for state of health estimation of lithium-ion batteries based on a mixture of Gaussian process models and particle filter. Microelectron. Reliab. 2015, 55, 1035–1045.
23. Li, Y.; Zou, C.; Berecibar, M.; Nanini-Maury, E.; Chan, J.C.W.; van den Bossche, P.; Van Mierlo, J.; Omar, N. Random forest regression for online capacity estimation of lithium-ion batteries. Appl. Energy 2018, 232, 197–210.
24. Mansouri, S.S.; Karvelis, P.; Georgoulas, G.; Nikolakpopoulos, G. Remaining useful battery life prediction for UAVs based on machine learning. IFAC-PapersOnLine 2017, 50, 4727–4732.
25. Sun, S.; Zhang, J.; Bi, J.; Wang, Y. A machine learning method for predicting driving range of battery electric vehicles. J. Adv. Transp. 2019, 2019, 4109148.
26. Saha, B.; Goebel, K. Battery Data Set, NASA AMES Prognostics Data Repository; NASA Ames: Moffett Field, CA, USA, 2007.
27. Miao, Q.; Xie, L.; Cui, H.; Liang, W.; Pecht, M. Remaining useful life prediction of lithium-ion battery with unscented particle filter technique. Microelectron. Reliab. 2013, 53, 805–810.
28. Birkl, C. Oxford Battery Degradation Dataset 1; University of Oxford: Oxford, UK, 2017.
29. Severson, K.A.; Attia, P.M.; Jin, N.; Perkins, N.; Jiang, B.; Yang, Z.; Chen, M.H.; Aykol, M.; Herring, P.K.; Fraggedakis, D.; et al. Data-driven prediction of battery cycle life before capacity degradation. Nat. Energy 2019, 4, 383–391.
30. Hochreiter, S.; Schmidhuber, J. Long short-term memory. Neural Comput. 1997, 9, 1735–1780.
31. Raffel, C.; Ellis, D.P. Feed-forward networks with attention can solve some long-term memory problems. arXiv 2015, arXiv:1512.08756.
32. Wang, F.K.; Mamo, T.; Cheng, X.B. Bi-directional long short-term memory recurrent neural network with attention for stack voltage degradation from proton exchange membrane fuel cells. J. Power Sources 2020, 461, 228170.
33. Storn, R.; Price, K. Differential evolution-a simple and efficient heuristic for global optimization over continuous spaces. J. Glob. Optim. 1997, 11, 341–359.
Figure 1. Attention-based LSTM model [32].
Figure 2. Prediction performance of three models: (a) battery #5 and (b) battery #6.
Table 1. Some specifications of the batteries.

Data | Batteries | Ambient Temperature (°C) | Discharge Current (A) | Rated Capacity (Ah)
NASA | #5, #6, #7, and #18 | 24 | 2 | 2
CALCE | CS2_35, CS2_37, and CS2_38 | 25 | 1 | 1.1
Oxford | Cell-1, Cell-2, Cell-3, and Cell-7 | 40 | 1 | 0.74
Toyota | #16, #31, #33, #36, and #43 | 30 | 1 | 1.1
Table 2. Prediction results of different models from the NASA data (training data: cycles #1–#80).

Models | #5 MAPE (%) | #5 RMSE | #6 MAPE (%) | #6 RMSE | #7 MAPE (%) | #7 RMSE | #18 MAPE (%) | #18 RMSE
SVR | 1.4125 | 0.0123 | 2.1036 | 0.0190 | 0.6628 | 0.0074 | 1.2730 | 0.0112
MLP | 2.1388 | 0.0174 | 1.6934 | 0.0155 | 0.6953 | 0.0081 | 1.7221 | 0.0145
LSTM with attention | 0.5462 | 0.0078 | 1.1559 | 0.0123 | 0.6348 | 0.0074 | 1.1267 | 0.0112
Table 3. RMSE of the prediction results under different training data from the NASA data.

Battery | Training Data | SVR | MLP | LSTM with Attention
#5 | 1–80 | 0.0123 | 0.0174 | 0.0078
#5 | 1–100 | 0.0111 | 0.0101 | 0.0047
#5 | 1–120 | 0.0060 | 0.0087 | 0.0058
#6 | 1–80 | 0.0190 | 0.0155 | 0.0123
#6 | 1–100 | 0.0119 | 0.0205 | 0.0072
#6 | 1–120 | 0.0137 | 0.0131 | 0.0073
#7 | 1–80 | 0.0074 | 0.0081 | 0.0074
#7 | 1–100 | 0.0049 | 0.0076 | 0.0047
#7 | 1–120 | 0.0089 | 0.0068 | 0.0046
#18 | 1–80 | 0.0112 | 0.0145 | 0.0112
#18 | 1–100 | 0.0156 | 0.0138 | 0.0129
#18 | 1–120 | 0.0103 | 0.0144 | 0.0094
Table 4. RMSE of the prediction results under different training data from the Oxford data.

Battery | Training Data | GPR 1 | HGPFR 1 | WD-HGPFR 1 | SVR | MLP | LSTM with Attention
Cell-1 | 100–3000 | 0.0600 | 0.0408 | 0.0108 | 0.0085 | 0.0101 | 0.0030
Cell-1 | 100–3500 | 0.0525 | 0.0181 | 0.0072 | 0.0079 | 0.0119 | 0.0043
Cell-1 | 100–4000 | 0.0598 | 0.0163 | 0.0108 | 0.0058 | 0.0102 | 0.0032
Cell-7 | 100–3000 | 0.1026 | 0.0147 | 0.0061 | 0.0079 | 0.0041 | 0.0034
Cell-7 | 100–3500 | 0.0833 | 0.0444 | 0.0056 | 0.0034 | 0.0071 | 0.0023
Cell-7 | 100–4000 | 0.0681 | 0.0231 | 0.0145 | 0.0069 | 0.0110 | 0.0037
Note: 1 Results are obtained from Ref. [20].
Table 5. RMSE of the prediction results for unseen datasets.

Data | Training Set | Testing Set | SVR | MLP | LSTM with Attention
NASA | #5 and #6 | #7 | 0.0446 | 0.0431 | 0.0310
NASA | #5 and #6 | #18 | 0.0322 | 0.0298 | 0.0232
CALCE | CS2_35 | CS2_37 | 0.0339 | 0.0341 | 0.0315
CALCE | CS2_35 | CS2_38 | 0.0246 | 0.0243 | 0.0223
Oxford | Cell-1, Cell-2, and Cell-3 | Cell-7 | 0.0212 | 0.0211 | 0.0204
Toyota | #16, #31, and #33 | #36 | 0.0030 | 0.0045 | 0.0027
Toyota | #16, #31, and #33 | #43 | 0.0053 | 0.0044 | 0.0034
Table 6. Predicted EOL for unseen datasets.

Data | Training Set | Test Set | Actual EOL (R) | SVR (Predicted EOL / RE (%)) | MLP (Predicted EOL / RE (%)) | LSTM with Attention (Predicted EOL / RE (%))
NASA | #5 and #6 | #7 | 126 | 94 / 25.40 | 92 / 26.98 | 115 / 8.73
NASA | #5 and #6 | #18 | 97 | 95 / 2.06 | 94 / 3.09 | 98 / 1.03
CALCE | CS2_35 | CS2_37 | 721 | 745 / 3.33 | 738 / 2.36 | 721 / 0.00
CALCE | CS2_35 | CS2_38 | 780 | 771 / 1.15 | 763 / 2.18 | 778 / 0.25
Oxford | Cell-1, Cell-2, and Cell-3 | Cell-7 | 7000 | 5400 / 22.86 | 5400 / 22.86 | 6000 / 14.29
Toyota | #16, #31, and #33 | #36 | 651 | 654 / 0.46 | 653 / 0.31 | 650 / 0.15
Toyota | #16, #31, and #33 | #43 | 644 | 653 / 1.40 | 652 / 1.24 | 645 / 0.16

