Article

Evaluation of Feature Extraction and Classification for Lower Limb Motion Based on sEMG Signal

Institute of Automation, Chongqing University, Chongqing 400044, China
* Authors to whom correspondence should be addressed.
Submission received: 10 July 2020 / Revised: 29 July 2020 / Accepted: 30 July 2020 / Published: 31 July 2020
(This article belongs to the Section Signal and Data Analysis)

Abstract

The real-time performance and accuracy of motion classification play an essential role for elderly or frail people in daily activities. This study aims to determine the optimal feature extraction and classification method for the activities of daily living (ADL). In the experiment, we collected surface electromyography (sEMG) signals from the thigh semitendinosus, the lateral thigh muscle, and the calf gastrocnemius of the lower limbs to classify horizontal walking, crossing obstacles, standing up, going down the stairs, and going up the stairs. Firstly, we analyzed 11 feature extraction methods covering the time domain, frequency domain, time-frequency domain, and entropy. Additionally, a feature evaluation method was proposed, and the separability of the 11 feature extraction algorithms was calculated. Then, combining the 11 feature algorithms with five classifiers, the classification accuracy and computation time of the 55 resulting methods were calculated. The results showed that Gaussian Kernel Linear Discriminant Analysis (GK-LDA) with WAMP had the highest classification accuracy (96%), with a computation time below 80 ms. The quantitative comparative analysis of feature extraction and classification methods in this paper benefits the application of wearable sEMG sensor systems in ADL.

1. Introduction

Due to the aging of the population, an increasing number of elderly or frail people need help in daily life [1,2,3]. With the development of wireless networks and wearable sensor technology, a wearable sensor can sense the human body's biological signals and classify the movement mode or body posture [4,5,6]. Auxiliary equipment based on surface electromyography (sEMG) sensing systems, such as rehabilitation robots and booster robots, can help elderly or frail people lead a better life [7,8,9].
The sEMG sensor measures the electrical potential generated by muscle activity. sEMG signals are generated 30 to 150 ms before the corresponding human motion [10,11]. Therefore, human motion can be predicted by feature extraction and classification technology.
The sEMG is recorded from the surface of human skeletal muscle by surface electrodes and contains much essential information related to limb movement. The key problem in these studies is to extract effective features from the signals for different motions [12]. Feature extraction methods for the sEMG signal mainly cover the time domain, frequency domain, and time-frequency domain. Among them, time-domain analysis is the most commonly used, with features such as integrated sEMG (IsEMG), mean absolute value (MAV), simple squared integration (SSI), root mean square (RMS), waveform length (WL), zero-crossing (ZC), and Willison amplitude (WAMP) [13]. Feiyun Xiao et al. used RMS, waveform length, the absolute standard deviation of differences, IsEMG, and the 50 Hz low-pass filtered sEMG signal (LPFEMG) as features to quickly and accurately estimate joint motion [14]. Osama Dorgham et al. used time-domain features (such as MAV, RMS, VAR, and STD) to estimate muscle strength under different loads [15]. Shengli Zhou et al. extracted features with the frequency-domain measures of median frequency (MDF) and peak frequency (PKF) and, combined with a Gaussian model, reached a motion classification accuracy of 89.5% [16]. Erdem Yavuz et al. extracted sEMG features by calculating Mel-frequency cepstral coefficients (MFCCs) for basic motion classification [17]. Time-frequency analysis can extract a large amount of information from the sEMG signal; among such methods, wavelet transform (WT) feature extraction has been a research hotspot in recent decades. Turker Tuncer et al. applied an iterative discrete-wavelet feature extraction method to an sEMG data set of human muscle force and obtained a classification accuracy of 92.96% [18]. C. Sravani et al. used the flexible analytic wavelet transform (FAWT) to decompose the sEMG signal into eight sub-bands and extracted useful features; the average accuracy of human motion classification was 91.5% [19]. Xugang Xi et al. used wavelet transforms to decompose sEMG signals into 32 scale signals and extracted features through coherence analysis to classify six lower limb movements, with an average classification rate of 93.45% [20]. Haotian She et al. used the time-frequency Stockwell transform (S-transform) together with principal component analysis (PCA) to reduce the feature vector's dimension and improve classifier speed; the average classification accuracy was 93.62% [21]. Hongfeng Chen et al. used convolutional neural network (CNN) feature extraction to improve the accuracy of human motion classification [22]. However, sEMG is a non-stationary, complex, and nonlinear signal. Entropy measures can reflect the complexity of the sEMG signal, which helps to extract effective features [23,24,25]. Shangchun Liao et al. integrated entropy and wavelength features of samples to classify human upper limb motion; without expensive hardware support, the computation was small and the accuracy was 91.05% [26].
Another critical step of human motion classification is the selection of the classification technique. Based on the above feature extraction methods, researchers have mainly used support vector machines (SVM), decision trees (DT), random forests (RF), k-nearest neighbors (KNN), and naive Bayes (NB) to classify human motion [27,28,29,30,31,32]. Rohit Gupta et al. used time-domain analysis to classify lower limb movement and concluded that the linear discriminant analysis (LDA) classifier had the highest accuracy; for different feature subsets, the classification accuracy was between 89% and 99% [33]. Ai Qingsong et al. extracted the wavelet coefficients of sEMG and used LDA and an SVM based on the Gaussian kernel function to classify lower limb motion with an accuracy higher than 95% [34]. In recent years, neural networks have been widely used in complex human motion classification because of their powerful nonlinear fitting ability [35,36,37,38,39]. Chen Yang et al. extracted RMS, WC, and PE features of sEMG signals and used a backpropagation neural network, a generalized regression neural network, and least squares support vector regression (LS-SVR) to predict the knee angle; the root mean square error was less than 7.7°, which is suitable for a rehabilitation robot [40]. Lina Tong et al. used Butterworth filtering to extract sEMG features and proposed a joint angle estimation method for real-time sEMG signals based on a backpropagation (BP) neural network and an autoregressive (AR) model; the delay of the algorithm was 10 to 15 ms (PC), and the average angle RMS error was 4.27° [41].
The sEMG signal, recorded from the surface of human skeletal muscle through surface electrodes, contains rich feature information related to limb motion. By analyzing these features, we can distinguish daily human activities of the lower limb. Meanwhile, for a system with good performance, the selected sEMG features should have maximum class separability, high recognition accuracy, and minimum computational complexity to ensure the stability of auxiliary equipment. To the best of the authors' knowledge, there has been almost no quantitative performance comparison of feature extraction and classification methods for lower limb sEMG in daily human activities. Therefore, the purpose of this study was to determine the optimal sEMG features and classification methods.
The rest of this paper is structured as follows: Section 2 outlines the daily activities and data acquisition of the human lower limbs. Section 3 analyzes the feature extraction and classification methods for the sEMG signal and proposes a feature evaluation method. The experimental results are given in Section 4. The discussion and conclusions are given in Section 5 and Section 6.

2. Data Acquisition

We chose the five most common lower limb activities in daily life: horizontal walking (HW), crossing obstacles (CO), standing up (SU), going down the stairs (DS), and going up the stairs (GU). Analysis of the kinematics and biological characteristics of the human lower limb muscles shows that the medial gastrocnemius (MG) is helpful for walking and running, while the vastus lateralis (VL) and the semitendinosus (ST) flex the knee joint and extend the hip joint. Therefore, we selected these three muscles as the source of sEMG signal acquisition, as shown in Figure 1.
We used an sEMG acquisition system developed and manufactured by Biometrics (UK), as shown in Figure 2. The sampling frequency was 2000 Hz, and the amplifier's input impedance was higher than 10,000,000 MΩ. The skin did not require conductive gel to obtain good signal quality. The experimental computing platform was an Intel(R) Core(TM) i7-9750H CPU @ 2.60 GHz with 16 GB of memory, and the data analysis software was MATLAB 2015b.

3. Algorithm Description

3.1. Feature Extraction

For each motion, we collected 2 s of sEMG signal (4000 sample points) and analyzed 11 common sEMG feature extraction methods, as shown in Table 1.
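As a concrete illustration of this segmentation step, a minimal sketch follows; the function name and the non-overlapping windowing are our assumptions, not details from the paper:

```python
def segment_windows(signal, fs=2000, window_s=2.0):
    """Cut a raw sEMG recording into non-overlapping analysis windows.

    With fs = 2000 Hz and window_s = 2.0 s, each window holds 4000 samples,
    matching the setup described above.
    """
    n = int(fs * window_s)
    return [signal[i:i + n] for i in range(0, len(signal) - n + 1, n)]
```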
(1) Root mean square (RMS)
The root mean squared value (RMS) revealed the amount of strength yielded by a muscle.
$RMS = \sqrt{\frac{1}{N} \sum_{n=1}^{N} x_n^2}$

where $x_n$ was the sample data and $N$ was the sample length, which was 4000.
(2) Variance (VAR)
The VAR measured the power of the myoelectric signal.
$VAR = \frac{1}{N-1} \sum_{i=1}^{N} x_i^2$

where $x_i$ was the sample data and $N$ was the sample length, which was 4000.
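The two amplitude features above can be sketched in plain Python (the function names are ours; a vectorized NumPy version would normally be used for 4000-sample windows):

```python
import math

def rms(x):
    """Root mean square: overall muscle activation strength."""
    return math.sqrt(sum(v * v for v in x) / len(x))

def var(x):
    """Variance about zero, as defined above: myoelectric signal power."""
    return sum(v * v for v in x) / (len(x) - 1)
```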
(3) Willison Amplitude (WAMP)
The Willison amplitude counted the number of times the difference between two adjacent samples exceeded a threshold, reducing artifacts produced by noise.

$WAMP = \sum_{n=1}^{N-1} f(|x_n - x_{n+1}|), \quad f(x) = \begin{cases} 1, & x \ge th \\ 0, & \text{otherwise} \end{cases}$

where $x_n$ was the sample data, $th$ was the threshold, and $N$ was the sample length, which was 4000.
(4) Zero-Crossing (ZC)
The zero-crossing feature counted the sign-change events produced by muscular activity.

$ZC = \sum_{i=1}^{N-1} u(-x_i x_{i+1})$

where $u(\cdot)$ was the unit step function, $x_i$ was the sample data, and $N$ was the sample length, which was 4000.
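A hedged sketch of the two threshold-based features above; the threshold value `th` is an assumption, since the paper does not report one:

```python
def wamp(x, th=0.05):
    """Willison amplitude: count adjacent-sample differences exceeding th."""
    return sum(1 for a, b in zip(x, x[1:]) if abs(a - b) >= th)

def zc(x):
    """Zero crossings: count sign changes between consecutive samples."""
    return sum(1 for a, b in zip(x, x[1:]) if a * b < 0)
```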
(5) Mean of absolute value (MAV)
The mean of absolute value was a reflection of muscle contraction levels:
$MAV = \frac{1}{N} \sum_{i=1}^{N} |x_i|$

where $x_i$ was the sample data and $N$ was the sample length, which was 4000.
(6) Waveform length (WL)
The WL represented the amplitude, duration, and frequency of the signal.
$WL = \sum_{i=2}^{N} |x_i - x_{i-1}|$

where $x_i$ was the sample data and $N$ was the sample length, which was 4000.
(7) Integrated sEMG (IsEMG)
The IsEMG was related to the signal sequence firing point.
$IsEMG = \sum_{i=1}^{N} |x_i|$

where $x_i$ was the sample data and $N$ was the sample length, which was 4000.
(8) Simple squared integration (SSI)
The simple square integration function described the energy of the sEMG.
$SSI = \sum_{i=1}^{N} |x_i|^2$

where $x_i$ was the sample data and $N$ was the sample length, which was 4000.
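The remaining time-domain features above reduce to simple one-pass computations; a minimal sketch (function names are ours):

```python
def mav(x):
    """Mean absolute value: average rectified amplitude."""
    return sum(abs(v) for v in x) / len(x)

def wl(x):
    """Waveform length: cumulative amplitude change over the window."""
    return sum(abs(b - a) for a, b in zip(x, x[1:]))

def isemg(x):
    """Integrated sEMG: sum of rectified samples."""
    return sum(abs(v) for v in x)

def ssi(x):
    """Simple squared integral: signal energy."""
    return sum(v * v for v in x)
```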
(9) The Energy of Wavelet Packet Coefficient (EWP)
The EWP calculated the energy of the wavelet packet transform signal. It can process both high-frequency components and low-frequency components.
(10) The Energy of Wavelet Coefficient (EWC)
This feature computed the energy of the wavelet-transformed signal.
$EWC_j = \frac{1}{K} \sum_{k=1}^{K} W_{j,k}^2$

where $EWC_j$ was the wavelet energy coefficient of the $j$-th decomposition layer, $K$ was the number of coefficients in the $j$-th layer, and $W_{j,k}$ was the $k$-th coefficient of the $j$-th layer.
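A minimal sketch of this wavelet-energy idea, using a hand-rolled single-level Haar DWT; the wavelet family, decomposition depth, and per-level averaging are assumptions here (in practice a library such as PyWavelets would be used):

```python
def haar_dwt(x):
    """One level of the Haar DWT: (approximation, detail) coefficients."""
    s = 2 ** 0.5
    approx = [(a + b) / s for a, b in zip(x[0::2], x[1::2])]
    detail = [(a - b) / s for a, b in zip(x[0::2], x[1::2])]
    return approx, detail

def ewc(x, levels=3):
    """Mean squared detail coefficient at each decomposition level."""
    energies = []
    for _ in range(levels):
        x, detail = haar_dwt(x)  # keep decomposing the approximation band
        energies.append(sum(w * w for w in detail) / len(detail))
    return energies
```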
(11) Fuzzy entropy (FE)
The FE can describe the complexity of the sEMG signal and reflect the possibility of the new information in the signal.
$FuzzyEn = \ln \Phi^m(r) - \ln \Phi^{m+1}(r)$

where $\Phi^m(r)$ was the average similarity of template vectors of length $m$ under the tolerance $r$.
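A hedged pure-Python sketch of fuzzy entropy under common default choices (embedding dimension m = 2, tolerance r = 0.2, Gaussian-type membership exp(-(d/r)^2)); these parameters are assumptions, since the paper does not report them:

```python
import math

def _phi(x, m, r):
    """Mean pairwise similarity of mean-removed templates of length m."""
    templates = []
    for i in range(len(x) - m + 1):
        seg = x[i:i + m]
        mu = sum(seg) / m
        templates.append([v - mu for v in seg])
    total, pairs = 0.0, 0
    for i in range(len(templates)):
        for j in range(i + 1, len(templates)):
            d = max(abs(a - b) for a, b in zip(templates[i], templates[j]))
            total += math.exp(-((d / r) ** 2))  # fuzzy membership of the match
            pairs += 1
    return total / pairs

def fuzzy_entropy(x, m=2, r=0.2):
    """FuzzyEn = ln(phi_m(r)) - ln(phi_{m+1}(r))."""
    return math.log(_phi(x, m, r)) - math.log(_phi(x, m + 1, r))
```

A constant signal yields zero entropy, while an irregular signal yields a positive value, matching the intuition that FuzzyEn measures signal complexity.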

3.2. Feature Separability

The Euclidean distance (ED) was used to measure the distance between sample features. The longer the distance, the greater the difference between the sample features. The standard deviation (SD) was used to measure the dispersion of the sample features. The smaller the standard deviation, the more stable the sample features.
We used the ratio between the ED and the SD, which we called the RES index, as a feature separability metric. The $ED(m,n)$ was defined as

$ED(m,n) = \sqrt{(m_1 - n_1)^2 + (m_2 - n_2)^2}$
where m and n represented two of the three feature sets. The S D was defined as
$SD = \sqrt{\frac{1}{N_W} \sum_{w=1}^{N_W} (r_w - \sigma)^2}$

where $r_w$ represented the eigenvalue, $\sigma$ represented the mean of the eigenvalues, and $N_W$ represented the feature set size. The RES index was defined as
$RES(m,n) = \frac{ED(m,n)}{\overline{SD}}$

where $\overline{SD}$ was the mean of the standard deviations of the feature sets.
Additionally, we normalized the features before calculating the RES index. The feature normalization $F_{norm}$ was defined as

$F_{norm} = \frac{F - \min(F)}{\max(F - \min(F))}$
The larger the RES index, the better the separability of the extracted feature.
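The RES computation above can be sketched for two feature sets of 2-D points; taking the class centers for the ED term is our assumption:

```python
import math

def res_index(class_a, class_b):
    """RES = Euclidean distance between class centers / mean within-class SD.

    class_a, class_b: lists of 2-D feature points (f1, f2) from two motions.
    """
    def center(points):
        return [sum(col) / len(points) for col in zip(*points)]

    def spread(points):
        c = center(points)
        return math.sqrt(sum((p[0] - c[0]) ** 2 + (p[1] - c[1]) ** 2
                             for p in points) / len(points))

    ca, cb = center(class_a), center(class_b)
    ed = math.sqrt((ca[0] - cb[0]) ** 2 + (ca[1] - cb[1]) ** 2)
    return ed / ((spread(class_a) + spread(class_b)) / 2)

def normalize(feature):
    """Min-max normalization F_norm = (F - min F) / max(F - min F)."""
    shifted = [v - min(feature) for v in feature]
    top = max(shifted)
    return [v / top for v in shifted]
```

Well-separated, tight clusters give a large RES index; overlapping or scattered clusters give a small one.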

3.3. Classification

We considered and listed the following five representative classification algorithms, as shown in Table 2.
(1) Multiple Kernel Relevance Vector Machine (MKRVM)
Different kernel functions correspond to different feature spaces of the sEMG signal, and a multi-kernel function is better able to describe complex lower limb motion. Therefore, the multiple kernel relevance vector machine (MKRVM) can improve the accuracy of lower limb motion classification.
(2) Random Forest (RF)
The random forest algorithm measures each feature's contribution to the classification and ranks the features according to its evaluation criteria. In this way, we can identify the important features in the feature set, which is very helpful for improving the classification.
(3) Backpropagation neural network (BPNN)
The BPNN is a network trained according to the error backpropagation algorithm. Because of its strong nonlinear fitting ability, it is widely used in human motion classification based on sEMG signals.
(4) Gaussian Kernel Linear Discriminant Analysis (GK-LDA)
The basic idea of the GK-LDA is to use Gaussian kernel functions to project high-dimensional vectors into a low-dimensional space so that the samples have the largest inter-class distance and the smallest intra-class distance in the new subspace, improving the classification accuracy.
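For reference, the Gaussian (RBF) kernel typically assumed in such methods can be sketched as follows; the full GK-LDA additionally builds the kernel matrix over all training samples and solves a generalized eigenproblem for the discriminant directions, which is omitted here:

```python
import math

def gaussian_kernel(x, z, sigma=1.0):
    """k(x, z) = exp(-||x - z||^2 / (2 * sigma^2)).

    Measures the similarity of two feature vectors in the implicit
    high-dimensional space; sigma is a free hyperparameter.
    """
    sq_dist = sum((a - b) ** 2 for a, b in zip(x, z))
    return math.exp(-sq_dist / (2.0 * sigma ** 2))
```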
(5) Wavelet neural network (WNN)
The WNN is a local basis function network. Its basis function has an adjustable resolution scale, giving the network a stronger nonlinear learning ability. The wavelet basis function has compact support, so the interaction between neurons is small and learning is faster.

4. Results

We used three sEMG sensors to classify five lower limb motions: HW, CO, SU, DS, and GU, as shown in Figure 3.
Five healthy subjects, aged 23, 23, 25, 26, and 24 years, participated in the experiment. Their body fat rate was 17% ± 3%, and their height was 170 ± 5 cm. During the experiment, the subjects completed each action cycle in about 2 s. Therefore, regardless of how long an action lasted, we only needed the first 2 s of the sEMG data as the raw data for classification, which allowed activities to be recognized over a longer period of time. We collected sEMG signals for the five lower limb movements from each subject.
We recorded the changing trend in the three muscles’ sEMG signals in five movements, as shown in Figure 4.
There were significant signal changes in the three selected muscles during lower limb movement; the sEMG signal fluctuates only when the movement changes. Among the three channels, the crossing-obstacle signal was the most distinctive. The sEMG maps of GU, HW, and CO were similar, and DS and SU were hard to distinguish from the raw signal. However, the trend of each movement differed across the three channels, which helped distinguish the movements and verified the correctness of our muscle selection.

4.1. Feature Separability Results

To evaluate the performance of the feature extraction methods, Figure 5 shows scatter plots for the 11 methods. Each feature extraction algorithm yielded three features after reduction, and the scatter plots show two of these features for the five movements. Each movement used a specific color and ten sampling points. From Figure 5, the feature separability of the 11 feature extraction algorithms can be seen; Figure 5b,j,k had better performance.
The performance of the 11 feature extraction algorithms was evaluated by the RES index defined in Section 3.2. The results are shown in Table 3. The RES indexes of EWP were 15.1, 12.9, and 12.5; those of WAMP were 12.0, 11.2, and 8.0; and those of FE were 13.4, 14.5, and 6.5. The EWP, WAMP, and FE had good separability. The RES indexes of EWC were 13.6, 12.9, and 4.2, and those of IsEMG were 12.3, 9.2, and 8.4; the EWC and IsEMG had poorer performance.

4.2. Movements Classification Results

The feature data sets of the five movements were input into the five classifiers (Table 2). We collected 1500 sets of data from five subjects. All classifiers used 5-fold cross-validation: the data set was divided into five subsets, one subset was selected as the test data, and the remaining four subsets were used as the training data. Figure 6 shows the average classification accuracy of the five classifiers with the 11 feature algorithms. The RF and GK-LDA classification results were excellent for all feature algorithms, and the variance of GK-LDA was the smallest.
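The 5-fold split described above can be sketched as follows (index-based, assuming the 1500 samples are already shuffled):

```python
def five_fold_splits(n_samples, k=5):
    """Yield (test_idx, train_idx) index lists for k-fold cross-validation.

    Each of the k folds serves as the test set exactly once, with the
    remaining folds pooled as training data.
    """
    fold = n_samples // k
    idx = list(range(n_samples))
    for f in range(k):
        test = idx[f * fold:(f + 1) * fold]
        train = idx[:f * fold] + idx[(f + 1) * fold:]
        yield test, train
```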
We drew a box plot of the classification accuracy of the 11 feature algorithms, as shown in Figure 7a. According to Figure 7a, the accuracy of the EWP, WL, FE, and WAMP was higher than that of the other feature algorithms, and the dispersion of the EWP was the lowest. Additionally, we drew a box plot of the classification accuracy of the five classification algorithms, as shown in Figure 7b. According to Figure 7b, the GK-LDA classification accuracy was higher than that of the other algorithms, and its dispersion was the lowest. This indicated that the GK-LDA classifier with EWP, WL, FE, and WAMP features had better classification performance.
Hyperparameter tuning had an important influence on the classifier results. In the MKRVM and GK-LDA classification process, the parameter value was searched from −10 to 10 with a step size of 0.5, and the best result was selected in all experiments. In the WNN and BPNN classification process, the enumeration method was used: the number of training epochs was set to 5000, the learning rate was 0.01, the training error was 0.001, the number of hidden neurons was 6, and the best classification result was selected. In the RF classification process, the enumeration method determined that the number of trees was 10 and the number of features was 2 to obtain the best classification results.
The calculation time and classification accuracy are shown in Table 4. The GK-LDA using the WAMP feature ranked first at 96%. The GK-LDA classifier's accuracy with the EWC, EWP, IsEMG, WL, FE, and WAMP features reached more than 90%, which satisfied the accuracy requirements of lower limb classification, and the calculation time of the GK-LDA was below 80 ms. Except for the BPNN, the average accuracy of the classifiers with the EWP, FE, and WAMP features was more than 90%. This indicated that the EWP, FE, WAMP, and GK-LDA had excellent performance.
Figure 8 compares the average classification accuracy and average calculation time of the 11 features across the five classifiers. The WAMP, combined with the five classifiers, had a better classification effect in both real-time performance and accuracy.
Sensitivity analysis of the algorithms could determine the errors of the motion classification, which helps enhance the practicability and safety of the system. Hence, we selected 800 testing sets to verify the sensitivity of the 55 models. Table 5, Table 6, Table 7, Table 8 and Table 9 show the sensitivity metrics of the 11 combined algorithms. Except for the BPNN, the algorithms had high accuracy for CO and GU classification, and the average classification rates of the GK-LDA and RF algorithms were higher than those of the other classification algorithms. The accuracy of GU classification with the GK-LDA classifier was higher than 90%. All classifiers had the lowest accuracy for HW, but the GK-LDA classifier was more accurate than the other classifiers, reaching more than 80%. The classification accuracy of each action with the WAMP, FE, and EWP feature classifiers was higher than that of the other combined algorithms, reaching more than 88%. This indicated that the GK-LDA classifier with EWP, FE, and WAMP had high reliability.
Furthermore, we recruited an additional female subject with a height of 155 cm and an age of 25 years and collected 500 samples as testing samples. Because of the differences between the new subject and the original five subjects, the accuracy for the new subject was lower. However, the accuracy of the GK-LDA classifier with the EWP, FE, and WAMP features was 93%, and the average classification accuracy of the GK-LDA was 90%, higher than that of the BPNN, WNN, RF, and MKRVM. Additionally, the variance of the GK-LDA was the smallest, as shown in Figure 9. This demonstrated the adaptability of the GK-LDA.

5. Discussion

For a system with excellent performance, the selected sEMG features should have maximum class separability, high recognition accuracy, and minimum computational complexity, ensuring a low misclassification rate in implementation. This is helpful for the wide application of wearable systems based on the sEMG signal. Additionally, in many studies, the lower limb motion classification accuracy was improved by increasing the number of electrodes. However, in actual lower limb movement, the sEMG sensor is disturbed by noise; more sensors lead to instability in classification accuracy, and the wearer also feels uncomfortable. Furthermore, as the number of electrodes increases, more data dimensions must be processed and the computation grows. Therefore, this paper collected sEMG signals from three lower limb muscles, which accords with the typical application of lower limb auxiliary equipment. We proposed a feature evaluation method to analyze the effectiveness of feature extraction and compared the quantitative performance of sEMG feature extraction and classification methods in daily human activities. We aimed to determine the optimal feature extraction and classification method and to provide guidance for the design of lower extremity motion classification systems based on sEMG. The results showed that the EWP, FE, and WAMP had better feature separability. The main reason is that these methods can accurately measure the complexity of the sEMG signal and extract its features at multiple scales. Among them, the WAMP had the best real-time performance. If the computation time of the entropy and wavelet packet transform can be reduced, they will be the best choice. The accuracy of the GK-LDA with the EWC, EWP, IsEMG, WL, and FE features was more than 90%, and the response time was less than 80 ms.
After the box plot analysis, the GK-LDA with EWP, FE, WL, and WAMP features showed good classification reliability. Through sensitivity analysis, the CO and DS classification accuracies were the highest, reaching 100%. Although the HW classification accuracy was the lowest, it was still over 80%, which can satisfy the needs of daily activities. There were some limitations to this study. It would be necessary to use a wearable sEMG system to collect lower limb sEMG data from the elderly or patients in a real environment, and it remains unknown how effective the algorithm is for the lower extremity activities of the elderly or real patients. Additionally, sEMG is a bioelectrical signal recorded from the muscle surface by electrodes, and it is easily affected by electrode aging, sweat, and external electromagnetic interference; therefore, interference compensation for the sEMG signal will be considered in the next step. Furthermore, this paper only considered the classification of discrete actions, not continuous action changes. Hence, in the next step, we need to determine the starting position of different activities in the sEMG signal and obtain the different active sEMG signal regions.

6. Conclusions

To improve the accuracy and real-time performance of classification, we analyzed a series of feature extraction and classification methods and proposed a feature evaluation method to analyze the effectiveness of feature extraction. The conclusion was that the WAMP, FE, and EWP feature extraction methods were highly separable, the WAMP calculation time was the shortest, and the GK-LDA was the best method for lower limb classification; GK-LDA with WAMP was the best combination of sEMG feature extraction and classification. The results are helpful to the development of wearable sEMG systems and have important implications for other sEMG-based devices, such as clinical assistive devices, walking assist devices, and robotic or prosthetic devices. The next step is to apply the algorithm to real situations and to combine the existing sEMG sensor with physical signal sensors, such as an accelerometer or gyroscope, to improve the classification accuracy. Meanwhile, the classification of different movements in the continuous lower limb activities of the elderly or actual patients should be considered. In future applications, these algorithms could also be used to predict the risk of falling, only needing to collect and analyze the sEMG signal during a fall.

Author Contributions

Conceptualization and methodology, P.Q.; project administration, X.S. Both authors have read and agreed to the published version of the manuscript.

Funding

This work was supported by Projects U1613226 and U1813217 supported by NSFC, China, and Project 2019-INT009 from the Shenzhen Institute of Artificial Intelligence and Robotics for Society.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. He, J.; Sheng, X.; Zhu, X.; Jiang, C.; Jiang, N. Spatial Information Enhances Myoelectric Control Performance with Only Two Channels. IEEE Trans. Ind. Inform. 2019, 15, 1226–1233.
  2. Wu, Q.; Chen, B.; Wu, H. Neural-network-enhanced torque estimation control of a soft wearable exoskeleton for elbow assistance. Mechatronics 2019, 63, 102279.
  3. Cerone, G.L.; Botter, A.; Gazzoni, M. A Modular, Smart and Wearable System for High Density sEMG Detection. IEEE Trans. Biomed. Eng. 2019, 66, 3371–3380.
  4. Song, W.; Han, Q.; Lin, Z.; Yan, N.; Luo, D.; Liao, Y.; Zhang, M.; Wang, Z.; Xie, X.; Wang, A.; et al. Design of a Flexible Wearable Smart sEMG Recorder Integrated Gradient Boosting Decision Tree Based Hand Gesture Recognition. IEEE Trans. Biomed. Circuits Syst. 2019, 13, 1563–1574.
  5. Jiang, S.; Gao, Q.; Liu, H.; Shull, P.B. A novel, co-located EMG-FMG-sensing wearable armband for hand gesture recognition. Sens. Actuators A Phys. 2020, 301, 111738.
  6. Liu, H.; Dong, W.; Li, Y.; Li, F.; Geng, J.; Zhu, M.; Chen, T.; Zhang, H.; Sun, L.; Lee, C. An epidermal sEMG tattoo-like patch as a new human–machine interface for patients with loss of voice. Microsyst. Nanoeng. 2020, 6, 1–13.
  7. Li, Z.; Deng, C.; Zhao, K. Human-Cooperative Control of a Wearable Walking Exoskeleton for Enhancing Climbing Stair Activities. IEEE Trans. Ind. Electron. 2020, 67, 3086–3095.
  8. Ghaffar, A.; Dehghani-Sanij, A.A.; Xie, S.Q. A review of gait disorders in the elderly and neurological patients for robot-assisted training. Disabil. Rehabil. Assist. Technol. 2020, 15, 256–270.
  9. Li, Z.; Ren, Z.; Zhao, K.; Deng, C.; Feng, Y. Human-Cooperative Control Design of a Walking Exoskeleton for Body Weight Support. IEEE Trans. Ind. Inform. 2020, 16, 2985–2996.
  10. Fox, S.; Kotelba, A.; Marstio, I.; Montonen, J. Aligning human psychomotor characteristics with robots, exoskeletons and augmented reality. Robot. Comput.-Integr. Manuf. 2020, 63.
  11. Chathuramali, K.G.M.; Kiguchi, K. Real-time detection of the interaction between an upper-limb power-assist robot user and another person for perception-assist. Cogn. Syst. Res. 2020, 61, 53–63.
  12. Liu, H.; Tao, J.; Lyu, P.; Tian, F. Human-robot cooperative control based on sEMG for the upper limb exoskeleton robot. Robot. Auton. Syst. 2020, 125.
  13. Ramírez-Martínez, D.; Alfaro-Ponce, M.; Pogrebnyak, O.; Aldape-Pérez, M.; Argüelles-Cruz, A.J. Hand Movement Classification Using Burg Reflection Coefficients. Sensors 2019, 19, 475.
  14. Xiao, F.; Wang, Y.; He, L.; Wang, H.; Li, W.; Liu, Z. Motion Estimation From Surface Electromyogram Using Adaboost Regression and Average Feature Values. IEEE Access 2019, 7, 13121–13134.
  15. Dorgham, O.; Al-Mherat, I.; Al-Shaer, J.; Bani-Ahmad, S.; Laycock, S. Smart System for Prediction of Accurate Surface Electromyography Signals Using an Artificial Neural Network. Future Internet 2019, 11, 25.
  16. Zhou, S.; Yin, K.; Fei, F.; Zhang, K. Surface electromyography–based hand movement recognition using the Gaussian mixture model, multilayer perceptron, and AdaBoost method. Int. J. Distrib. Sens. Netw. 2019, 15, 1550147719846060.
  17. Yavuz, E.; Eyupoglu, C. A cepstrum analysis-based classification method for hand movement surface EMG signals. Med. Biol. Eng. Comput. 2019, 57, 2179–2201.
  18. Tuncer, T.; Dogan, S.; Subasi, A. Surface EMG signal classification using ternary pattern and discrete wavelet transform based feature extraction for hand movement recognition. Biomed. Signal Process. Control 2020, 58, 101872.
  19. Sravani, C.; Bajaj, V.; Taran, S.; Sengur, A. Flexible Analytic Wavelet Transform Based Features for Physical Action Identification Using sEMG Signals. IRBM 2020, 41, 18–22.
  20. Xi, X.; Yang, C.; Shi, J.; Luo, Z.; Zhao, Y.B. Surface Electromyography-Based Daily Activity Recognition Using Wavelet Coherence Coefficient and Support Vector Machine. Neural Process. Lett. 2019, 50, 2265–2280.
  21. She, H.; Zhu, J.; Tian, Y.; Wang, Y.; Yokoi, H.; Huang, Q. SEMG Feature Extraction Based on Stockwell Transform Improves Hand Movement Recognition Accuracy. Sensors 2019, 19, 4457.
  22. Chen, H.; Zhang, Y.; Li, G.; Fang, Y.; Liu, H. Surface electromyography feature extraction via convolutional neural network. Int. J. Mach. Learn. Cybern. 2019, 11, 185–196.
  23. Wu, Y.; Hu, X.; Wang, Z.; Wen, J.; Kan, J.; Li, W. Exploration of Feature Extraction Methods and Dimension for sEMG Signal Classification. Appl. Sci. 2019, 9, 5343.
  24. Ge, L.; Ge, L.-J.; Hu, J. Feature Extraction and Classification of Hand Movements Surface Electromyogram Signals Based on Multi-method Integration. Neural Process. Lett. 2018, 49, 1179–1188.
  25. Chen, X.; Chen, J.; Liang, J.; Li, Y. Entropy-Based Surface Electromyogram Feature Extraction for Knee Osteoarthritis Classification. IEEE Access 2019, 7, 164144–164151.
  26. Liao, S.; Li, G.; Li, J.; Jiang, D.; Jiang, G.; Sun, Y.; Tao, B.; Zhao, H.; Chen, D. Multi-object intergroup gesture recognition combined with fusion feature and KNN algorithm. J. Intell. Fuzzy Syst. 2020, 38, 2725–2735.
  27. Amanpreet, K. Machine learning-based novel approach to classify the shoulder motion of upper limb amputees. Biocybern. Biomed. Eng. 2019, 39, 857–867.
  28. Mukhopadhyay, A.K.; Samui, S. An experimental study on upper limb position invariant EMG signal classification based on deep neural network. Biomed. Signal Process. Control 2020, 55, 101669.
  29. Cene, V.H.; Balbinot, A. Enhancing the classification of hand movements through sEMG signal and non-iterative methods. Health Technol. 2019, 9, 561–577.
  30. Zhang, Z.; Yang, K.; Qian, J.; Zhang, L. Real-Time Surface EMG Pattern Recognition for Hand Gestures Based on an Artificial Neural Network. Sensors 2019, 19, 3170.
  31. Zhang, Y.; Yu, J.; Xia, C.; Yang, K.; Cao, H.; Wu, Q. Research on GA-SVM Based Head-Motion Classification via Mechanomyography Feature Analysis. Sensors 2019, 19, 1986.
  32. Khan, T.; Lundgren, L.E.; Järpe, E.; Olsson, M.C.; Viberg, P.A. A Novel Method for Classification of Running Fatigue Using Change-Point Segmentation. Sensors 2019, 19, 4729. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  33. Gupta, R.; Agarwal, R. Electromyographic Signal-Driven Continuous Locomotion Mode Identification Module Design for Lower Limb Prosthesis Control. Arab. J. Sci. Eng. 2018, 43, 7817–7835. [Google Scholar] [CrossRef]
  34. Ai, Q.; Zhang, Y.; Qi, W.; Liu, Q. Research on Lower Limb Motion Recognition Based on Fusion of sEMG and Accelerometer Signals. Symmetry 2017, 9, 147. [Google Scholar] [CrossRef]
  35. Luo, R.; Sun, S.; Zhang, X.; Tang, Z.; Wang, W. A Low-Cost End-to-End sEMG-Based Gait Sub-Phase Recognition System. IEEE Trans. Neural Syst. Rehabil. Eng. 2020, 28, 267–276. [Google Scholar] [CrossRef]
  36. Yoo, H.J.; Park, H.J.; Lee, B. Myoelectric Signal Classification of Targeted Muscles Using Dictionary Learning. Sensors 2019, 19, 2370. [Google Scholar] [CrossRef] [Green Version]
  37. Xu, L.; Chen, X.; Cao, S.; Zhang, X.; Chen, X. Feasibility Study of Advanced Neural Networks Applied to sEMG-Based Force Estimation. Sensors 2018, 18, 3226. [Google Scholar] [CrossRef] [Green Version]
  38. Maragliulo, S.; Lopes, P.F.; Osório, L.B.; De Almeida, A.T.; Tavakoli, M. Foot Gesture Recognition Through Dual Channel Wearable EMG System. IEEE Sens. J. 2019, 19, 10187–10197. [Google Scholar] [CrossRef]
  39. Chada, S.; Taran, S.; Bajaj, V. An efficient approach for physical actions classification using surface EMG signals. Health Inf. Sci. Syst. 2020, 8, 3. [Google Scholar] [CrossRef]
  40. Yang, C.; Xi, X.; Chen, S.; Miran, S.M.; Hua, X.; Luo, Z. SEMG-based multi-features and predictive model for knee-joint-angle estimation. AIP Adv. 2019, 9, 095042. [Google Scholar] [CrossRef]
  41. Tong, L.; Zhang, F.; Hou, Z.G.; Wang, W.; Peng, L. Bp–Ar-Based Human Joint Angle Estimation Using Multi-Channel Semg. Int. J. Robot. Autom. 2015, 30, 227–237. [Google Scholar] [CrossRef] [Green Version]
Figure 1. Surface electromyography (sEMG) signal sensor location. Channel A was located in the thigh semitendinosus, Channel B was located in the lateral thigh muscle, Channel C was located in the calf gastrocnemius.
Figure 2. The biometrics wireless sEMG sensor system.
Figure 3. Five movements of the lower limbs. (a) Horizontal walking (HW). (b) Crossing obstacles (CO). (c) Standing up (SU). (d) Going down the stairs (DS). (e) Going up the stairs (GU).
Figure 4. The raw sEMG signals of five movements.
Figure 5. Separability scatter plots of the eleven feature extraction algorithms. (a–k) Scatter plots of five motion features extracted using EWC, EWP, IsEMG, MAV, RMS, SSI, VAR, WL, ZC, FE, and WAMP, respectively.
Figure 6. Average classification accuracy rates.
Figure 7. Box plot of the classification accuracy. (a) Box plot of 11 feature extraction algorithms. (b) Box plot of 5 classifier accuracies.
Figure 8. The average classification accuracy rate and the average calculation time across the classifiers.
Figure 9. Average classification accuracy rates.
Table 1. Feature extraction method list.

| ID | Extraction Feature | Abbreviation |
|---|---|---|
| 1 | Root mean square | RMS |
| 2 | Variance | VAR |
| 3 | Willison amplitude | WAMP |
| 4 | Zero-crossing | ZC |
| 5 | Mean of absolute value | MAV |
| 6 | Waveform length | WL |
| 7 | Integrated sEMG | IsEMG |
| 8 | Simple squared integration | SSI |
| 9 | Energy of wavelet packet coefficient | EWP |
| 10 | Energy of wavelet coefficient | EWC |
| 11 | Fuzzy entropy | FE |
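The time-domain features in Table 1 can be computed directly from a single windowed sEMG channel. The pure-Python sketch below implements the eight time-domain measures; the `wamp_threshold` and `zc_threshold` values are illustrative placeholders, not the settings used in this study.

```python
import math

def td_features(window, wamp_threshold=0.02, zc_threshold=0.0):
    """Time-domain sEMG features from Table 1 for one window of samples.

    The thresholds are assumptions for illustration; the paper does not
    state its threshold values here.
    """
    n = len(window)
    rms = math.sqrt(sum(x * x for x in window) / n)          # root mean square
    mav = sum(abs(x) for x in window) / n                    # mean absolute value
    mean = sum(window) / n
    var = sum((x - mean) ** 2 for x in window) / (n - 1)     # sample variance
    isemg = sum(abs(x) for x in window)                      # integrated sEMG
    ssi = sum(x * x for x in window)                         # simple squared integral
    wl = sum(abs(window[i + 1] - window[i]) for i in range(n - 1))  # waveform length
    # Willison amplitude: count of consecutive-sample jumps above a threshold
    wamp = sum(1 for i in range(n - 1)
               if abs(window[i + 1] - window[i]) > wamp_threshold)
    # Zero-crossings about the (optional) threshold level
    zc = sum(1 for i in range(n - 1)
             if (window[i] - zc_threshold) * (window[i + 1] - zc_threshold) < 0)
    return {"RMS": rms, "MAV": mav, "VAR": var, "IsEMG": isemg,
            "SSI": ssi, "WL": wl, "WAMP": wamp, "ZC": zc}
```

The wavelet energies (EWP, EWC) and fuzzy entropy (FE) additionally require a wavelet decomposition and an entropy estimator, and are omitted from this sketch.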
Table 2. Classification method list.

| Classification Algorithm | Abbreviation |
|---|---|
| Multiple Kernel Relevance Vector Machine | MKRVM |
| Random Forest | RF |
| Back Propagation Neural Network | BPNN |
| Gaussian Kernel Linear Discriminant Analysis | GK-LDA |
| Wavelet Neural Network | WNN |
Table 3. Statistical metrics of the eleven feature extraction algorithms. SD = average standard deviation, ED = Euclidean distance, RES = ratio between ED and SD; each metric is reported for Features 1–3.

| Algorithm | SD (F1) | SD (F2) | SD (F3) | ED (F1) | ED (F2) | ED (F3) | RES (F1) | RES (F2) | RES (F3) |
|---|---|---|---|---|---|---|---|---|---|
| EWC | 20.1 | 11.5 | 12.5 | 211.9 | 155.5 | 67.2 | 13.6 | 12.9 | 4.2 |
| EWP | 8.6 | 5.7 | 6.7 | 105.1 | 80.1 | 95.1 | 15.1 | 12.9 | 12.5 |
| IsEMG | 28,929.3 | 23,548.8 | 23,281.4 | 322,538.2 | 214,969.8 | 218,668.0 | 12.3 | 9.2 | 8.4 |
| MAV | 7.2 | 5.9 | 5.9 | 82.6 | 53.1 | 58.0 | 12.6 | 9.0 | 8.9 |
| RMS | 20.4 | 11.8 | 12.7 | 223.0 | 106.3 | 168.2 | 13.9 | 8.7 | 10.2 |
| SSI | 6,338,695 | 2,144,388 | 2,767,773 | 62,093,129.1 | 22,888,219.8 | 54,247,541.0 | 14.7 | 9.3 | 11.9 |
| VAR | 1582.6 | 536.4 | 689.7 | 15,481.3 | 5726.1 | 13,461.7 | 14.6 | 9.3 | 11.8 |
| WL | 2.7 | 1.6 | 1.6 | 29.2 | 14.2 | 20.8 | 13.4 | 8.9 | 9.7 |
| ZC | 81.9 | 59.2 | 32.1 | 702.8 | 617.6 | 480.7 | 10.1 | 13.7 | 8.6 |
| FE | 0.5 | 0.5 | 0.3 | 6.7 | 5.8 | 2.6 | 13.4 | 14.5 | 6.5 |
| WAMP | 6.1 | 10.9 | 5.8 | 102.6 | 93.7 | 47.7 | 12.0 | 11.2 | 8.0 |
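A separability evaluation in the spirit of Table 3 can be sketched in pure Python. The paper's exact formulas are not reproduced here, so this version assumes SD is the mean within-class standard deviation, ED the mean distance between class centroids, and RES their ratio, each computed per feature dimension; the study's definitions may differ in detail.

```python
import math

def separability(class_features):
    """Per-dimension SD, ED, and RES over a set of labeled feature vectors.

    class_features: dict mapping a motion label (e.g. "HW") to a list of
    equal-length feature vectors. Definitions are assumptions, not the
    paper's exact formulas.
    """
    labels = list(class_features)
    dim = len(next(iter(class_features.values()))[0])
    centroids, sds = {}, {}
    for lab in labels:
        vecs = class_features[lab]
        n = len(vecs)
        centroids[lab] = [sum(v[d] for v in vecs) / n for d in range(dim)]
        sds[lab] = [math.sqrt(sum((v[d] - centroids[lab][d]) ** 2
                                  for v in vecs) / n)
                    for d in range(dim)]
    # SD: mean within-class standard deviation per dimension
    sd = [sum(sds[lab][d] for lab in labels) / len(labels) for d in range(dim)]
    # ED: mean distance between class centroids per dimension
    pairs = [(a, b) for i, a in enumerate(labels) for b in labels[i + 1:]]
    ed = [sum(abs(centroids[a][d] - centroids[b][d]) for a, b in pairs)
          / len(pairs) for d in range(dim)]
    # RES: larger means the classes are further apart relative to their spread
    res = [e / s if s else float("inf") for e, s in zip(ed, sd)]
    return sd, ed, res
```

A large RES indicates a feature whose class clusters are well separated relative to their scatter, which is the intuition behind ranking the algorithms in Table 3.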
Table 4. Classification accuracy (%) and calculation time (ms). The double underscore marks the feature with the shortest computation time for each classifier; the underline marks the classifier with the shortest computation time for each feature; the shadow marks the most accurate feature for each classifier; the background shading marks the classifier with the highest accuracy for each feature.

| Feature Extraction Method | MKRVM Time | MKRVM Acc. | RF Time | RF Acc. | BPNN Time | BPNN Acc. | GK-LDA Time | GK-LDA Acc. | WNN Time | WNN Acc. |
|---|---|---|---|---|---|---|---|---|---|---|
| EWC | 890 | 87 | 143 | 88 | 1027 | 84 | 69.8 | 92 | 255 | 80 |
| EWP | 969 | 92 | 146 | 94 | 1034 | 89 | 75.3 | 95 | 267 | 88 |
| IsEMG | 922 | 85 | 153 | 89 | 1044 | 90 | 59.4 | 90 | 246 | 80 |
| MAV | 934 | 83 | 146 | 88 | 856 | 80 | 59.8 | 89 | 254 | 80 |
| RMS | 998 | 90 | 149 | 88 | 1047 | 75 | 60.1 | 89 | 261 | 80 |
| SSI | 1058 | 82 | 149 | 89 | 1054 | 60 | 59.8 | 89 | 247 | 70 |
| VAR | 972 | 85 | 150 | 88 | 876 | 70 | 59.4 | 86 | 264 | 80 |
| WL | 1032 | 90 | 144 | 90 | 939 | 88 | 59.8 | 94 | 261 | 70 |
| ZC | 932 | 80 | 142 | 89 | 889 | 60 | 59.7 | 88 | 252 | 70 |
| FE | 895 | 92 | 188 | 94 | 1038 | 83 | 69.4 | 95 | 258 | 90 |
| WAMP | 863 | 93 | 145 | 92 | 764 | 81 | 59.9 | 96 | 253 | 90 |
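Table 4 pairs each method's accuracy with its calculation time. A minimal harness for measuring both quantities is sketched below, with a toy nearest-centroid classifier standing in for the five methods above (none of which are reimplemented here); `evaluate` reports accuracy in percent and mean per-window latency in milliseconds, matching the table's units.

```python
import time

def nearest_centroid_fit(X, y):
    """Toy stand-in classifier (not one of the paper's five methods):
    store the mean feature vector of each class."""
    cents = {}
    for c in sorted(set(y)):
        rows = [x for x, lab in zip(X, y) if lab == c]
        cents[c] = [sum(col) / len(rows) for col in zip(*rows)]
    return cents

def nearest_centroid_predict(cents, x):
    # Label of the closest class centroid (squared Euclidean distance).
    return min(cents, key=lambda c: sum((a - b) ** 2
                                        for a, b in zip(cents[c], x)))

def evaluate(cents, X_test, y_test):
    """Return (accuracy %, mean per-window latency in ms), as in Table 4."""
    t0 = time.perf_counter()
    preds = [nearest_centroid_predict(cents, x) for x in X_test]
    latency_ms = (time.perf_counter() - t0) * 1000 / len(X_test)
    acc = 100.0 * sum(p == t for p, t in zip(preds, y_test)) / len(y_test)
    return acc, latency_ms
```

Timing the prediction loop alone (excluding training) mirrors the table's focus on the per-window cost that matters for real-time ADL recognition.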
Table 5. Sensitivity metrics of the MKRVM algorithm. The shadows mark the most accurate action for each feature; the bold numbers mark the feature with the highest accuracy for each action.

| Algorithm | HW (%) | CO (%) | SU (%) | GU (%) | DS (%) |
|---|---|---|---|---|---|
| EWC_MKRVM | 85 | 89 | 86 | 91 | 84 |
| EWP_MKRVM | 89 | 94 | 91 | 96 | 90 |
| IsEMG_MKRVM | 83 | 87 | 84 | 89 | 82 |
| MAV_MKRVM | 81 | 85 | 82 | 87 | 80 |
| RMS_MKRVM | 88 | 92 | 89 | 94 | 87 |
| SSI_MKRVM | 80 | 84 | 81 | 86 | 79 |
| VAR_MKRVM | 80 | 89 | 85 | 88 | 83 |
| WL_MKRVM | 85 | 94 | 90 | 93 | 88 |
| ZC_MKRVM | 75 | 84 | 80 | 83 | 78 |
| FE_MKRVM | 87 | 96 | 92 | 95 | 90 |
| WAMP_MKRVM | 88 | 98 | 93 | 94 | 92 |
Table 6. Sensitivity metrics of the RF algorithm. The shadows mark the most accurate action for each feature; the bold numbers mark the feature with the highest accuracy for each action.

| Algorithm | HW (%) | CO (%) | SU (%) | GU (%) | DS (%) |
|---|---|---|---|---|---|
| EWC_RF | 81 | 92 | 89 | 90 | 88 |
| EWP_RF | 87 | 98 | 95 | 96 | 94 |
| IsEMG_RF | 82 | 93 | 90 | 91 | 89 |
| MAV_RF | 82 | 91 | 87 | 89 | 91 |
| RMS_RF | 80 | 92 | 88 | 93 | 87 |
| SSI_RF | 81 | 93 | 89 | 94 | 88 |
| VAR_RF | 78 | 94 | 84 | 95 | 89 |
| WL_RF | 80 | 96 | 86 | 97 | 91 |
| ZC_RF | 83 | 91 | 89 | 92 | 90 |
| FE_RF | 90 | 97 | 96 | 99 | 88 |
| WAMP_RF | 88 | 98 | 92 | 99 | 83 |
Table 7. Sensitivity metrics of the BPNN algorithm. The shadows mark the most accurate action for each feature; the bold numbers mark the feature with the highest accuracy for each action.

| Algorithm | HW (%) | CO (%) | SU (%) | GU (%) | DS (%) |
|---|---|---|---|---|---|
| EWC_BPNN | 82 | 85 | 84 | 86 | 83 |
| EWP_BPNN | 88 | 91 | 90 | 92 | 89 |
| IsEMG_BPNN | 87 | 90 | 89 | 91 | 88 |
| MAV_BPNN | 78 | 81 | 80 | 82 | 79 |
| RMS_BPNN | 73 | 76 | 75 | 77 | 74 |
| SSI_BPNN | 58 | 61 | 60 | 62 | 59 |
| VAR_BPNN | 68 | 71 | 70 | 72 | 69 |
| WL_BPNN | 86 | 89 | 88 | 90 | 87 |
| ZC_BPNN | 59 | 60 | 65 | 58 | 58 |
| FE_BPNN | 81 | 84 | 83 | 85 | 82 |
| WAMP_BPNN | 79 | 82 | 81 | 83 | 80 |
Table 8. Sensitivity metrics of the GK-LDA algorithm. The shadows mark the most accurate action for each feature; the bold numbers mark the feature with the highest accuracy for each action.

| Algorithm | HW (%) | CO (%) | SU (%) | GU (%) | DS (%) |
|---|---|---|---|---|---|
| EWC_GK-LDA | 86 | 95 | 92 | 94 | 93 |
| EWP_GK-LDA | 88 | 99 | 96 | 97 | 95 |
| IsEMG_GK-LDA | 83 | 94 | 91 | 92 | 90 |
| MAV_GK-LDA | 82 | 93 | 90 | 91 | 89 |
| RMS_GK-LDA | 81 | 92 | 87 | 94 | 91 |
| SSI_GK-LDA | 84 | 89 | 86 | 94 | 92 |
| VAR_GK-LDA | 81 | 86 | 83 | 91 | 89 |
| WL_GK-LDA | 89 | 94 | 91 | 99 | 97 |
| ZC_GK-LDA | 83 | 88 | 85 | 93 | 91 |
| FE_GK-LDA | 87 | 100 | 98 | 94 | 96 |
| WAMP_GK-LDA | 85 | 100 | 100 | 97 | 93 |
Table 9. Sensitivity metrics of the WNN algorithm. The shadows mark the most accurate action for each feature; the bold numbers mark the feature with the highest accuracy for each action.

| Algorithm | HW (%) | CO (%) | SU (%) | GU (%) | DS (%) |
|---|---|---|---|---|---|
| EWC_WNN | 75 | 82 | 79 | 83 | 81 |
| EWP_WNN | 83 | 90 | 87 | 91 | 89 |
| IsEMG_WNN | 77 | 85 | 76 | 82 | 80 |
| MAV_WNN | 73 | 89 | 79 | 80 | 79 |
| RMS_WNN | 80 | 82 | 77 | 81 | 80 |
| SSI_WNN | 70 | 72 | 67 | 71 | 70 |
| VAR_WNN | 79 | 83 | 78 | 81 | 79 |
| WL_WNN | 69 | 73 | 68 | 70 | 70 |
| ZC_WNN | 68 | 74 | 65 | 72 | 71 |
| FE_WNN | 89 | 93 | 88 | 91 | 89 |
| WAMP_WNN | 85 | 92 | 89 | 93 | 91 |
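The sensitivity values reported in Tables 5–9 are per-class recall: for each movement, the fraction of its test windows that the classifier labels correctly. A minimal pure-Python sketch:

```python
def per_class_sensitivity(y_true, y_pred,
                          classes=("HW", "CO", "SU", "GU", "DS")):
    """Sensitivity (recall, %) per movement class, as in Tables 5-9:
    true positives for a class divided by all windows of that class."""
    out = {}
    for c in classes:
        total = sum(1 for t in y_true if t == c)
        tp = sum(1 for t, p in zip(y_true, y_pred) if t == c and p == c)
        out[c] = 100.0 * tp / total if total else 0.0
    return out
```

Unlike overall accuracy, this metric exposes which individual movements a feature-classifier pair confuses, which is why the tables break the results down by action.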

Share and Cite

Qin, P.; Shi, X. Evaluation of Feature Extraction and Classification for Lower Limb Motion Based on sEMG Signal. Entropy 2020, 22, 852. https://0-doi-org.brum.beds.ac.uk/10.3390/e22080852