Article

Wireless Motion Capture System for Upper Limb Rehabilitation

Electrical and Computer Engineering Department, University of Patras, 26500 Rio, Greece
* Authors to whom correspondence should be addressed.
Appl. Syst. Innov. 2021, 4(1), 14; https://doi.org/10.3390/asi4010014
Submission received: 22 December 2020 / Revised: 3 February 2021 / Accepted: 5 February 2021 / Published: 17 February 2021

Abstract

This work presents a Wireless Sensor System implementation for upper limb rehabilitation, intended as a complementary system for supervising a patient's progress during rehabilitation exercises. A cost-effective motion capture sensor node, built around a 9 Degrees-of-Freedom (DoF) Inertial Measurement Unit (IMU), is mounted on the patient's upper limb segments and wirelessly transmits the measured signals to a base station. The sensor orientation and the movement of the individual upper limb segments in 3-Dimensional (3D) space are derived by processing the sensors' raw data. For the latter purpose, a biomechanical model resembling the kinematic model of a robotic arm, based on the Denavit-Hartenberg (DH) configuration, is used to approximate the upper limb movements in real time. The joint angles of the upper limb model are estimated from the extracted orientation angles of the sensor node. Experimental results of a human performing common rehabilitation exercises with the proposed motion capture sensor node are compared against those obtained with an off-the-shelf sensor. The comparison shows very low error rates, with a root mean square error (RMSE) of about 0.02 m.

1. Introduction

In the rehabilitation process, the patient is expected to perform sets of physical exercises and activities under the supervision of the corresponding medical staff, in order to restore a functional level to body segments that may have been impaired by an accident or surgery, or as a consequence of pathological conditions, such as in the case of a stroke. The objective in these circumstances is to decrease the recovery time by rehabilitating the patient's physiological motor capabilities. Research studies and clinical results have shown that intensive rehabilitation can lead to optimal outcomes in terms of both the patient's motor capabilities and the corresponding recovery time [1,2]. Within this challenging field in the broader context of medical science, human motion science and selected fields of biomedical engineering have contributed to the advancement of movement rehabilitation techniques by taking into account quantitative measurements and parametric models of human movements [3,4].

1.1. Types of Upper Limb Rehabilitation Systems

In the field of upper limb rehabilitation, a wide range of systems along with a variety of technologies and methodologies are presented in the literature. Human motion tracking systems have been introduced into the rehabilitation supervision procedure [4]. However, the accurate localization that such systems perform comes at a high cost. Furthermore, occlusion problems and line-of-sight (LoS) issues during the experimental process are drawbacks that reduce the effectiveness of these systems. On the other hand, robot-based systems, to which the patient's body segments are attached for rehabilitation purposes, are extensively used in cases of severe disabilities [5,6,7]. Such systems are able to move, guide, and even apply resistance to the limb motion, aiming at muscle strengthening and functionality improvement of the affected body segment.
Taking into account the issues inherent to vision-based systems, such as LoS and occlusion [4], and the problems related to expensive robotic assistive devices, which are cumbersome and require specialized medical staff, the alternative of lower-cost, small-size sensor nodes for motion tracking [3,8] has arisen. In sensor-based integrations, such nodes are worn by the patient and gather motion, position and physiological state data in an unobtrusive way, without interfering with the patient's normal behavior. This low-cost technology is motivated by several benefits regarding self-organization, flexibility and the ability to provide long-term monitoring. The technical challenges of Wireless Sensor Networks (WSNs) within this application field of rehabilitation are presented in [9].

1.2. WSN-based Upper Limb Rehabilitation Systems

WSN-based rehabilitation systems for the upper limb have been introduced in the literature since 2010, indicating that WSN rehabilitation applications are a relatively recent field of research. The sensor nodes that are attached to human body segments, forming a Body Area Network, consist of specific sensor units [10]. Individual accelerometers, gyroscopes and magnetometers, or a unified Inertial Measurement Unit (IMU), can be integrated into these nodes. These systems rely on both custom-made sensor nodes [11,12,13] and off-the-shelf items [14,15].
These sensor nodes collect data, such as accelerations or angular velocities, in order to extract information about the position, motion and/or direction of body segments. Apart from the IMU, the optical linear encoder (OLE) [16] is another type of sensor encountered in the literature on WSN-based rehabilitation applications. It can provide direct information about a joint angle (only 1 DoF), but it must be combined with at least an accelerometer to derive information about the remaining rotations performed by this joint [17,18].
Another fundamental point in an upper limb rehabilitation application is the networking protocol to be implemented. Within the literature, Wi-Fi [19], Bluetooth [14], IEEE 802.15.4 [11], ZigBee [20] and MiWi [13] are proposed, affecting to varying degrees the energy consumption of the total system, as well as the reliability of the sensors' data transmission. Depending on the particular application's objectives, either Time Division Multiple Access (TDMA) scheduling algorithms are used for node synchronization [15,20], or Carrier Sense Multiple Access with Collision Avoidance (CSMA/CA) is used to avoid collisions [21]. As far as the packet loss issue is concerned, due to the high sampling rate of the sensor, a lost data packet does not significantly reduce the accuracy of the collected data, since the missing information can be restored from neighboring sensors' data using sensor fusion methods.
The interdisciplinary research field of rehabilitation procedure supervision using WSNs appears to be emerging and promising for the future. The achievements and complete rehabilitation solutions developed by relevant companies, such as Libelium [22], Deltason [23], Xsens [24], The IoT Marketplace [25] and Shimmer [26] indicate a rising interest in this field.

1.3. Upper Limb Motion Reconstruction

In reliable WSN-based rehabilitation systems, multiple accelerometers, gyroscopes and magnetometers collect raw data related to the linear accelerations, angular rates and magnetic field during the patient's body movements. These data and the corresponding timestamps are transmitted to a base station for accurate estimation of the body segment movements in a global coordinate reference system [27]. More accurate estimations of the segments' orientations can be obtained by taking into account restrictions related to the feasible orientation directions and ranges of the joint angles. In the next step, the reconstruction of the subject's posture takes place, adopting the Denavit-Hartenberg (DH) configuration [17] or a quaternion-based [14] approach.
In the final processing stage, using kinematic analysis, the limb posture and orientation are visualized in a 3-Dimensional (3D) digital representation. Using this representation, several distinct characteristics, such as the movement amplitude, the segments' range of motion and the movement velocity, can be extracted. These can give useful information during each exercise within a rehabilitation session. Furthermore, in some cases, the extraction of motion patterns is achieved via Principal Component Analysis (PCA) [28] or Machine Learning algorithms and techniques [19].

1.4. Motivation & Objectives

Several successful examples exist in the literature regarding the estimation of upper limb movements using a wide variety of techniques along with IMU sensors. An inertial tracking system can be used in rehabilitation applications to derive the upper limb movement and its trajectory in real time. Furthermore, such a system can provide additional information, such as the movement amplitude, the range of motion, the angles between upper limb segments, or even the movement velocity. This information allows the medical staff and the patient to evaluate the upper limb exercises, either at a medical center or at home. In the latter case, the patient can be monitored by the therapists without the need to visit a medical facility.
In this work, a sensor system for monitoring the upper limb rehabilitation procedure is presented. The core of this system is a 9 Degrees-of-Freedom (DoF) IMU. The sensor node is designed to be attached to various human body segments for motion capturing purposes. In the specific case of the upper limb, such sensors can be mounted on the upper arm, forearm and hand to gather measurements and transmit the data to a base station. Further processing and filtering of these raw data result in each segment's individual orientation, while the position and orientation of the upper limb joints and segments in 3D space are estimated by a reconstruction procedure based on the upper limb model.
The overall processing procedure includes a series of well-defined mathematical expressions for the calculation of the upper limb's joint angles in closed form. This is performed by exploiting the extracted orientation of the individual segments. These joint angles then represent the DoF angles of a biomechanical model of the upper limb structure based on the Denavit-Hartenberg configuration. The efficiency of this methodology is evaluated by experimental studies on some common rehabilitation exercises applied to a human's upper limb. Furthermore, the comparison between the cost-effective motion capture sensor node proposed in this work and an off-the-shelf sensor emphasizes the effectiveness of the proposed system along with the overall processing methodology.
The present work is an extension of our previous paper [29], where a custom-made sensor node was used for motion acquisition and 3D reconstruction of the upper limb segments' movement. Here, the sensor fusion algorithm has been improved to account for the relationship between the time rates of change of the Euler angles and the angular velocity resolved in the body-fixed frame, as presented in [30]. Moreover, the previous device has been turned into a low-cost wearable whose cost does not exceed 20 dollars. Rehabilitation exercises were performed while wearing the device to ensure that it is comfortable for the patient. Finally, motion capturing results using this sensor node and a much more expensive off-the-shelf device are compared to verify the validity of the custom-made device.
The upper limb model representation based on the Denavit-Hartenberg configuration, adopted to achieve the 3D reconstruction of the upper limb motion, is presented in Section 2. In Section 3, the methodology followed for gathering, processing and filtering the sensor nodes' inertial measurements is described. This results in the calculation of the orientation of the links/segments to which the sensor nodes are attached. The estimation of the joint angles of the upper limb model from these orientations is also presented for each segment of the upper limb. In Section 4, the experimental results of a human performing a set of typical rehabilitation exercises are shown. Based on the motion capture sensor node's raw data, the proposed methodology is utilized to provide the upper limb segments' trajectories. The estimated 3D trajectories are compared with the trajectories concurrently derived by Shimmer3 motion capture sensor nodes. Finally, a discussion of the proposed methodology and the experimental results is given in Section 5, followed by the conclusions.

2. Upper Limb Modeling

2.1. Upper Limb Kinematic Model

Anatomically, the upper limb consists of the pectoral girdle, the upper arm, the forearm and the hand [31]. The three joints of the upper limb are: the glenohumeral joint, the elbow, and the wrist joint. Hence, the upper limb could be modeled as a kinematic chain consisting of these three joints along with the corresponding DoFs, which account for the feasible rotations for each joint.
Related configurations of the upper limb for motion tracking, based on the DH parameters, can be found in the literature. In [32], the authors studied the movement of the shoulder and the elbow, and designed a 4 DoF kinematic chain to express only the shoulder glenohumeral rotations and the elbow flexion. In [17], the human upper limb is also represented by rigid links connected with joints, and the upper limb kinematics are described with a 7 DoF model. This model has the shoulder as the origin of the kinematic chain and represents the shoulder joint mobility by 3 DoFs, the elbow joint by only 1 DoF, and the wrist joint by 3 DoFs. Of the latter, two correspond to wrist flexion-extension and deviation, while the third is the forearm pronation-supination motion. In our work, in contrast, the forearm pronation-supination has been assigned to the elbow as a rotational DoF (see Section 2.1).
In [33], the authors assume a 7 DoF model of the left upper limb. The motions of the scapula and clavicle are also modeled by means of the humeral head's ability to elevate and retract. This leads to two extra DoFs that represent the clavicle elevation-depression and protraction-retraction, respectively. In this model, 3 rotational DoFs account for the shoulder abduction-adduction, internal rotation, and flexion-extension, followed by the 2 elbow DoFs performing the elbow flexion-extension and the forearm pronation-supination. The wrist joint DoFs are omitted in this model. Hence, that work focuses on the clavicle range of motion and on the shoulder joint DoFs.
In [34], the authors proposed a more complex kinematic model, also based on the DH parameters. It refers to the total upper body of a rower, along with the seat rail, for the case of indoor rowing performance assessment. Each upper limb is represented as a 7 DoF kinematic chain, with 3 revolute joints for the shoulder abduction-adduction, rotation and flexion-extension, 2 DoFs for the elbow representing flexion-extension and rotation, and 2 DoFs for the wrist abduction-adduction and flexion-extension. Moreover, this model accounts for the clavicle DoF, representing it as a revolute one.
In the present paper, the forward kinematics were derived for the human upper limb, which was designed as a kinematic chain consisting of three joints, namely the shoulder, the elbow and the wrist, and 7 DoFs. The shoulder/glenohumeral joint, formed by the head of the humerus and the glenoid cavity of the scapula [35], performs three rotational DoFs: flexion-extension, abduction-adduction, and internal-external rotation. The elbow can be represented as a hinge joint permitting elbow flexion-extension. The structure of bones, muscles and ligaments in the area close to the elbow joint restricts its mobility. Nevertheless, a second DoF that originates slightly distal to the elbow is responsible for the pronation-supination of the forearm. Regarding the wrist joint, two DoFs are present, performing flexion-extension and deviation, respectively. The described kinematic chain used to model the human upper limb is illustrated in Figure 1.
The coordinate frames were defined for each DoF with respect to a set of rules from robotics theory and the DH configuration convention. These are presented in Figure 1 and summarized in Table 1. The origin point of the kinematic chain is located at the human's chest. The corresponding coordinate frame is denoted {C} and is regarded as the origin frame. The shoulder joint's 3 DoFs follow, with the frames {S1}, {S2} and {S3} assigned to them, denoting flexion-extension, abduction-adduction and internal-external rotation, accordingly. The elbow rotational DoFs are depicted as the {E1} and {E2} frames, accounting for the flexion-extension and pronation-supination DoFs, while the {W1} and {W2} frames stand for the wrist joint DoFs, namely flexion-extension and deviation. The end-effector position and orientation correspond to those of the tip of the hand.

2.2. Range of Motion for the Upper Limb Model

In [36], the shoulder joint's motion is explained in detail, along with the corresponding ranges of motion. From an anatomical point of view, the shoulder girdle consists of the clavicle, the scapula and the humerus. The shoulder joint motion is determined by a combination of these body parts' distinct motions, making the analysis more complex. A ball-and-socket joint is adopted to model the shoulder girdle's motion ability. Hence, the shoulder is described by 3 DoFs, permitting shoulder abduction-adduction, flexion-extension and internal-external rotation.
The elevation plane of the upper limb defines the kind of exercise that the subject executes. An elevation plane of 0° denotes upper limb elevation over the frontal plane. In this case, the abduction-adduction of the upper limb occurs, with the corresponding range of motion being [0°, 180°]. An elevation plane of 90° indicates that shoulder flexion-extension takes place. The corresponding range of motion is [0°, 180°] during elevation through flexion and [0°, 60°] for the shoulder's elevation through extension (hyperextension). In both cases, the rotation of 0° is defined for an upper limb posture with its shaft parallel to the vertical axis of the thorax. The upper arm internal-external rotation is performed when the shoulder's elevation angle is 0° and the elbow is flexed at 90°, leaving the forearm lying in the sagittal plane. The range of motion for the shoulder DoF that accomplishes the upper arm internal-external rotation is [−90°, 20°], with the negative values occurring when the upper arm rotates internally towards the human body.
The elbow joint's main rotation is performed around a fixed axis located between the center point of the trochlear sulcus and the center point of the capitulum of the humerus. This DoF permits elbow flexion-extension, with a range from 0°, for the case of full elbow extension, to 130°, depicting its flexion. Furthermore, a second rotation occurs slightly distal to the elbow joint, which is responsible for the forearm pronation-supination motion. In this case, the forearm rotates around an axis defined between the center points of the radial head and the distal ulna. The range of this motion is from 90° for pronation to −90° for supination. The forearm's neutral position, where its rotation is 0°, occurs when, with the shoulder and the wrist at their neutral positions, the hand is lying in the sagittal plane.
Finally, the motion of the wrist joint around the axes defined in [37] allows for wrist flexion-extension and wrist deviation. The first ranges from −70°, for the case of extension, to 70° for wrist flexion. The wrist deviation is defined as radial or ulnar. In radial deviation, the wrist deviates internally towards the body, up to 10°, while during ulnar deviation the wrist deviates outwards from the body, to an upper limit of 25°. The neutral position of 0° for both the flexion and deviation DoFs of the wrist occurs when the third metacarpal and the longitudinal axis of the forearm are aligned.
Taking into account the previous assumptions, the ranges of motion for the upper limb degrees of freedom of our model are summarized in Table 2.

2.3. DH Parameters of the Upper Limb Model

To describe the forward kinematics of the upper limb model, the DH parameters have to be derived. A quadruple of such parameters represents the relation between two sequential coordinate frames of the kinematic chain model. In Table 3, the DH parameters describing the model configuration depicted in Figure 1 are presented. The variables $l_c$, $l_{ua}$, $l_{fa}$ and $l_h$ stand for the length from the chest to the shoulder joint, the upper arm length, and the forearm and hand lengths, respectively. The estimation of these lengths is based on the biomechanics literature [35] and corresponds to appropriate fractions of the human height $H$. Hence, if these parameters cannot be measured, the upper limb segments' lengths are estimated as $l_c = 0.129H$, $l_{ua} = 0.186H$, $l_{fa} = 0.146H$ and $l_h = 0.108H$.
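For illustration, the sketch below (Python; the helper name is ours, not from the paper) computes these segment lengths for a given height. For $H = 1.80$ m it reproduces the values used in the experiments of Section 4.

```python
# Hypothetical helper: estimate upper limb segment lengths from body height H,
# using the anthropometric fractions quoted above (assumed from [35]).
def segment_lengths(H: float) -> dict:
    return {
        "l_c":  0.129 * H,  # chest to shoulder joint
        "l_ua": 0.186 * H,  # upper arm
        "l_fa": 0.146 * H,  # forearm
        "l_h":  0.108 * H,  # hand
    }

print(segment_lengths(1.80))
# {'l_c': 0.2322, 'l_ua': 0.3348, 'l_fa': 0.2628, 'l_h': 0.1944}
```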
The matrix $A_{i-1}^{i}$ of Equation (1) corresponds to the homogeneous transformation between the two sequential coordinate frames $i-1$ and $i$:

$$A_{i-1}^{i} = \begin{bmatrix} c\theta_i & -s\theta_i\, c\alpha_i & s\theta_i\, s\alpha_i & a_i\, c\theta_i \\ s\theta_i & c\theta_i\, c\alpha_i & -c\theta_i\, s\alpha_i & a_i\, s\theta_i \\ 0 & s\alpha_i & c\alpha_i & d_i \\ 0 & 0 & 0 & 1 \end{bmatrix} \qquad (1)$$
where $s$ and $c$ represent the sine and cosine of the corresponding angle, and the four quantities $\theta_i$, $\alpha_i$, $d_i$ and $a_i$ are parameters associated with link $i$ and joint $i$. These four parameters of Equation (1) are generally given the names joint angle, link twist, link offset and link length, respectively, and represent specific aspects of the geometric relationship between two coordinate frames [38]. Since the matrix $A_{i-1}^{i}$ is a function of a single variable, as shown in Table 3, it follows that three of the four parameters for a link have constant values, and the only variable is $\theta_i$ or $d_i$, in the case of a revolute or a prismatic joint, accordingly.
The transformation matrix $T_0^7$ from the chest position, where the origin frame is assumed, to the end-effector is given by Equation (2):

$$T_0^7 = \prod_{i=1}^{7} A_{i-1}^{i} \qquad (2)$$
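The chain of Equations (1) and (2) can be sketched in a few lines of Python with NumPy; the DH rows are assumed to be taken from Table 3, and the function names are ours:

```python
import numpy as np

def dh_matrix(theta: float, alpha: float, a: float, d: float) -> np.ndarray:
    """Homogeneous transform A_{i-1}^{i} of Equation (1) for one DH link."""
    ct, st = np.cos(theta), np.sin(theta)
    ca, sa = np.cos(alpha), np.sin(alpha)
    return np.array([
        [ ct, -st * ca,  st * sa, a * ct],
        [ st,  ct * ca, -ct * sa, a * st],
        [0.0,       sa,       ca,      d],
        [0.0,      0.0,      0.0,    1.0],
    ])

def forward_kinematics(dh_rows) -> np.ndarray:
    """Chain the per-link transforms into T_0^7 as in Equation (2).
    `dh_rows` is a list of (theta, alpha, a, d) tuples from Table 3."""
    T = np.eye(4)
    for row in dh_rows:
        T = T @ dh_matrix(*row)
    return T
```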

3. DoFs Angles Estimation

Having modeled the upper limb as described in the previous Section, we present a methodology to estimate the angle values of the defined DoFs of the upper limb model. This methodology starts from the inertial sensors' measurements. If a 9-DoF inertial sensor is attached to a link of the upper limb, the measurements gathered during this link's motion can give the link's orientation in 3D space. The calculation approach, described in the sequel, assumes that the coordinate frames of the sensor node resemble those of the MPU-9150 [39]. In the case of a different sensor or coordinate frame configuration, the mathematical equations should be modified accordingly. Furthermore, an important step before the extraction of the orientation is the filtering process, since the orientation is calculated from the raw sensor data, which suffer from gyroscope drift and accelerometer noise.
Nevertheless, the calculation of each individual segment's orientation is insufficient for the estimation of the human upper limb's posture and orientation in 3D space. Hence, in the proposed methodology, the orientation of each upper limb link is initially estimated individually based on the corresponding sensor node. Then, as each link is part of the upper limb model described in the previous Section, we obtain the related DoF angles in closed form. The intermediate calculation steps for each upper limb segment and the corresponding joint angles (3 DoFs for the shoulder, 2 for the elbow and 2 for the wrist) are presented in Section 3.2, Section 3.3 and Section 3.4. In the last step, the calculated joint angles $\theta_i$ are used to solve the forward kinematics of the upper limb model, following the procedure presented in Section 2. Thus, the reconstruction and representation of the upper limb motion in 3D space is finally accomplished.

3.1. Sensor Fusion and Orientation Estimation

The sensor node's orientation can be estimated either by integrating the angular velocities or by processing the accelerometer measurements and estimating the gravity vector. However, to avoid the accumulated errors due to the integration of the gyroscope measurements, and to minimize the noise in the orientation vector derived from the accelerometer signal, both estimators are implemented and fused through a complementary filter [40,41]. Although there are improved versions of the complementary filter [42,43,44], we used this simpler version, which yields satisfactory results in the experimental test cases, while also providing fast and accurate performance [45]. In the sequel, the estimation and filtering procedure that leads to the calculation of the links' orientation is presented in detail.
The simplest method to extract the orientation angles of an IMU-based sensor node is by integrating the measurements of its gyroscope adopting the first-order approximation of Taylor series:
$$\begin{aligned} \phi[n] &= \phi[n-1] + \frac{\delta\phi_g[n]}{\delta t}\,T \\ \theta[n] &= \theta[n-1] + \frac{\delta\theta_g[n]}{\delta t}\,T \\ \psi[n] &= \psi[n-1] + \frac{\delta\psi_g[n]}{\delta t}\,T \end{aligned} \qquad (3)$$
where $\phi$, $\theta$ and $\psi$ are the roll, pitch and yaw rotations around the longitudinal ($y$), transverse ($x$) and vertical ($z$) axes of the sensor, and $T$ is the sampling period. In our implementation the value of $T$ is constant, but filtering with variable sampling periods can also be adopted.
The rates of the Euler angles that appear in Equation (3) are given by the first-order differential equations [30] as:
$$\begin{bmatrix} \dfrac{\delta\phi_g}{\delta t} \\[2pt] \dfrac{\delta\theta_g}{\delta t} \\[2pt] \dfrac{\delta\psi_g}{\delta t} \end{bmatrix} = \begin{bmatrix} \dfrac{s\phi_a\, s\theta_a}{c\theta_a} & 1 & \dfrac{c\phi_a\, s\theta_a}{c\theta_a} \\ c\phi_a & 0 & -s\phi_a \\ -\dfrac{s\phi_a}{c\theta_a} & 0 & \dfrac{c\phi_a}{c\theta_a} \end{bmatrix} \begin{bmatrix} G_x \\ G_y \\ G_z \end{bmatrix} \qquad (4)$$
where $G_x$, $G_y$, $G_z$ are the gyroscope measurements for the $x$, $y$ and $z$ axes, respectively, and $s$ and $c$ represent the sine and cosine of the corresponding angle.
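As a minimal sketch (our naming, with the signs as reconstructed in Equation (4)), this mapping is a small matrix-vector product:

```python
import numpy as np

def euler_rates_from_gyro(phi, theta, gyro):
    """Map body-frame angular rates (G_x, G_y, G_z) to Euler angle rates
    via the matrix of Equation (4). Singular when cos(theta) ~ 0."""
    Gx, Gy, Gz = gyro
    sphi, cphi = np.sin(phi), np.cos(phi)
    stheta, ctheta = np.sin(theta), np.cos(theta)
    dphi   = (sphi * stheta / ctheta) * Gx + Gy + (cphi * stheta / ctheta) * Gz
    dtheta = cphi * Gx - sphi * Gz
    dpsi   = (-sphi / ctheta) * Gx + (cphi / ctheta) * Gz
    return dphi, dtheta, dpsi
```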
The transformation of the raw gyroscope measurements from the sensor's body-axis coordinate system to the fixed coordinate system is derived from the direction cosine matrix shown in Equation (4). This relation allows the orientation of the upper limb link to be updated over time, also improving its accuracy compared to our previous implementation [29], where the raw gyroscope measurements were used directly in the calculations. Using the raw measurements, we obtain the angular velocities with respect to the moving body frame and not the defined fixed 3-dimensional global frame. This is what prevented us from successfully testing the full range of the motions presented in our previous work, where only half the range of motion was covered, compared to the results presented in the sequel, in Section 4.
The parameters $\phi_a$, $\theta_a$ and $\psi_a$ are the orientation Euler angles estimated from the accelerometer measurements. Due to the discrete integration (Equation (3)), the errors and the sensor noise result in an angular error known as drift.
These orientation angles can be retrieved by identifying the direction of the gravity vector for an $R_y(\phi)\,R_x(\theta)\,R_z(\psi)$ configuration [46] as:
$$\phi_a = \operatorname{atan2}\left(A_x,\, A_z\right), \qquad \theta_a = \operatorname{atan2}\left(A_y,\, \sqrt{A_x^2 + A_z^2}\right) \qquad (5)$$
where $A_x$, $A_y$, $A_z$ correspond to the accelerometer measurements along the $x$, $y$ and $z$ axes. The notation $\operatorname{atan2}(Y, X)$ denotes the inverse tangent operation $\arctan(Y/X)$, which accepts solutions in the angle range $\left(-\frac{\pi}{2}, \frac{\pi}{2}\right)$. The estimation of the rotation $\psi_a$ about the $z$ axis is derived from the compass measurements along with the already estimated orientation angles $\phi_a$ and $\theta_a$, as shown in the following equation. This is performed in order to improve the accuracy of the rotation estimation about this axis:
$$\psi_a = \operatorname{atan2}\left(C_y,\, C_x\right) \qquad (6)$$
where
$$\begin{aligned} C_x &= M_x \cos(\phi_a) + M_y \sin(\phi_a)\sin(\theta_a) + M_z \sin(\phi_a)\cos(\theta_a) \\ C_y &= M_y \cos(\theta_a) - M_z \sin(\theta_a) \end{aligned} \qquad (7)$$
where the parameters $M_x$, $M_y$ and $M_z$ represent the compass measurements along the corresponding axes.
Consequently, the sensor's orientation can be estimated from the gyroscope measurements over a short-term horizon, due to the resulting drift, and from the accelerometer and magnetometer measurements for long-term orientation estimation.
A complementary filter is suggested in order to provide more accurate estimations of the orientation angles of the sensor node. This filter accounts for drift compensation and noise reduction and is formed as:
$$\begin{aligned} \phi[n] &= \left(\phi[n-1] + \frac{\delta\phi_g[n]}{\delta t}\,T\right) k + (1-k)\,\phi_a[n] \\ \theta[n] &= \left(\theta[n-1] + \frac{\delta\theta_g[n]}{\delta t}\,T\right) k + (1-k)\,\theta_a[n] \\ \psi[n] &= \left(\psi[n-1] + \frac{\delta\psi_g[n]}{\delta t}\,T\right) k + (1-k)\,\psi_a[n] \end{aligned} \qquad (8)$$
Taking into account Equations (3), (5) and (6), this leads to the following form:
$$\begin{aligned} \phi[n] &= \left(\phi[n-1] + \frac{\delta\phi_g[n]}{\delta t}\,T\right) k + (1-k)\operatorname{atan2}\left(A_x[n],\, A_z[n]\right) \\ \theta[n] &= \left(\theta[n-1] + \frac{\delta\theta_g[n]}{\delta t}\,T\right) k + (1-k)\operatorname{atan2}\left(A_y[n],\, \sqrt{A_x[n]^2 + A_z[n]^2}\right) \\ \psi[n] &= \left(\psi[n-1] + \frac{\delta\psi_g[n]}{\delta t}\,T\right) k + (1-k)\operatorname{atan2}\left(C_y[n],\, C_x[n]\right) \end{aligned} \qquad (9)$$
where $k \in (0.5, 1)$ is a weighting factor balancing the two terms, with a typical value close to 1.
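Putting Equations (3)-(9) together, one filter step can be sketched as follows. This is a minimal Python sketch under the sign and parenthesization conventions given above, reusing euler_rates_from_gyro from the previous sketch; it is not the node's actual firmware code.

```python
import numpy as np

def complementary_update(state, accel, gyro, mag, T=0.02, k=0.98):
    """One step of Equation (9): blend the gyro-propagated angles with the
    accelerometer/magnetometer estimates. Angles in radians; `state` is
    (phi, theta, psi) from the previous step."""
    phi, theta, psi = state
    Ax, Ay, Az = accel
    Mx, My, Mz = mag

    # Accelerometer-based roll and pitch, Equation (5)
    phi_a = np.arctan2(Ax, Az)
    theta_a = np.arctan2(Ay, np.sqrt(Ax**2 + Az**2))

    # Tilt-compensated heading from the magnetometer, Equations (6)-(7)
    Cx = (Mx * np.cos(phi_a) + My * np.sin(phi_a) * np.sin(theta_a)
          + Mz * np.sin(phi_a) * np.cos(theta_a))
    Cy = My * np.cos(theta_a) - Mz * np.sin(theta_a)
    psi_a = np.arctan2(Cy, Cx)

    # Gyro propagation through the Euler-rate map of Equation (4)
    dphi, dtheta, dpsi = euler_rates_from_gyro(phi, theta, gyro)

    # Complementary blend, Equation (8)/(9)
    phi = (phi + dphi * T) * k + (1 - k) * phi_a
    theta = (theta + dtheta * T) * k + (1 - k) * theta_a
    psi = (psi + dpsi * T) * k + (1 - k) * psi_a
    return phi, theta, psi
```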

3.2. Estimation of the Shoulder Joint Angles

The motion data collected from the IMUs mounted on the upper limb segments are filtered and processed as described in Section 3.1. Hence, the orientation of each individual link at each timestamp is determined in terms of the extracted roll, pitch and yaw angles. Using these orientation angles, the direction cosine matrix is calculated for each upper limb segment, also specifying the required rotation sequence. This is a dynamic procedure and takes place at every timestamp at which updated inertial measurements are received from the sensor nodes.
Having earlier described the shoulder joint as a spherical wrist configuration of a robotic arm model [38], the shoulder joint angle variables $\theta_1$, $\theta_2$ and $\theta_3$ can be assigned to the roll, pitch and yaw angles with respect to the coordinate frame $O_0x_0y_0z_0$ (Figure 1). The mathematical expressions that relate the two sets of angles result from the following analysis.
The shoulder joint rotations are described by a sequence of the three elemental rotations that occur around the axes of the origin coordinate system, which remains motionless, adopting the convention of the Tait-Bryan angles [47]. The sequence of elemental rotations is important in the estimation process. So, in case of a different definition in the sequence of the elemental rotations, the corresponding transformation matrix representation must be re-estimated.
The orientation transformation between frame 0 and frame 4 of the upper limb model (Figure 1) is given by Equation (10). This equation results from the transformation matrix $T_0^4$ of the forward kinematics based on the DH parameterization. Correspondingly, the representation for the Tait-Bryan angles in the $Z_\alpha X_\beta Y_\gamma$ configuration has the form of Equation (11), where $X$, $Y$ and $Z$ are the transformation matrices of the elemental rotations around the fixed frame axes $x$, $y$ and $z$, accordingly. Hence, this equation represents a rotation by an angle $\alpha$ around the $z$ axis, followed by a rotation by $\beta$ around the $x$ axis and a rotation by $\gamma$ around the $y$ axis of the fixed frame.
$$R_0^4 = \begin{bmatrix} s\theta_1 c\theta_3 + c\theta_1 s\theta_2 s\theta_3 & c\theta_1 c\theta_2 & s\theta_1 s\theta_3 - c\theta_1 s\theta_2 c\theta_3 \\ -c\theta_1 c\theta_3 + s\theta_1 s\theta_2 s\theta_3 & s\theta_1 c\theta_2 & -c\theta_1 s\theta_3 - s\theta_1 s\theta_2 c\theta_3 \\ -c\theta_2 s\theta_3 & s\theta_2 & c\theta_2 c\theta_3 \end{bmatrix} \qquad (10)$$

$$R = Z_\alpha X_\beta Y_\gamma = \begin{bmatrix} c\alpha c\gamma - s\alpha s\beta s\gamma & -s\alpha c\beta & c\alpha s\gamma + s\alpha s\beta c\gamma \\ s\alpha c\gamma + c\alpha s\beta s\gamma & c\alpha c\beta & s\alpha s\gamma - c\alpha s\beta c\gamma \\ -c\beta s\gamma & s\beta & c\beta c\gamma \end{bmatrix} \qquad (11)$$
The parameter $\alpha$ of Equation (11) should not be confused with the Denavit-Hartenberg parameter $\alpha$ noted in Table 3 and Equation (1). Equating the corresponding entries of the matrices in Equations (10) and (11), we arrive at Equation (12):

$$\theta_1 = \alpha + \pi/2, \qquad \theta_2 = \beta, \qquad \theta_3 = \gamma \qquad (12)$$
By replacing the estimated values of the angles $\theta_1$, $\theta_2$ and $\theta_3$ in the transformation matrix $T_0^4$, the posture of the upper arm is extracted with respect to the $O_Cx_Cy_Cz_C$ coordinate frame, as defined in Figure 1.
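For illustration, the inverse mapping (recovering $\alpha$, $\beta$ and $\gamma$ from a measured segment rotation matrix and applying Equation (12)) can be sketched as follows; the entry indices and signs follow our reading of Equation (11) and assume $\cos\beta > 0$:

```python
import numpy as np

def shoulder_angles(R: np.ndarray):
    """Recover the Tait-Bryan angles of Equation (11) from a rotation
    matrix R, then map them to the model DoFs via Equation (12)."""
    beta  = np.arcsin(R[2, 1])              # R[2,1] = sin(beta)
    gamma = np.arctan2(-R[2, 0], R[2, 2])   # -cb*sg and cb*cg entries
    alpha = np.arctan2(-R[0, 1], R[1, 1])   # -sa*cb and ca*cb entries
    return alpha + np.pi / 2, beta, gamma   # theta_1, theta_2, theta_3
```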

3.3. Estimation of the Elbow Joint Angles

The position and orientation of the elbow joint, as the distal point of the upper arm, can be estimated from the upper arm's position and orientation, as shown in Section 3.2. Following a similar approach for the motion data collected by the sensor node attached to the human's forearm, the orientation of this segment is extracted in terms of roll, pitch and yaw angles. The kinematic model variables related to the elbow joint are the angles $\theta_4$ and $\theta_5$. The elbow joint's revolute DoFs can be estimated from the orientation angles, taking into account the representation of Equation (11) for the Tait-Bryan angles, as:
$$\theta_4 = \operatorname{atan2}\left(R_{3,1},\, R_{3,3}\right) \qquad (13)$$

$$\theta_5 = \operatorname{atan2}\left(R_{1,2},\, R_{2,2}\right) \qquad (14)$$
By replacing these estimated angle values in the transformation matrix $T_4^6$, the posture of the forearm is extracted with respect to the $O_3x_3y_3z_3$ coordinate frame (Figure 1).

3.4. Estimation of the Wrist Joint Angles

The wrist joint rotations $\theta_6$ and $\theta_7$ of the kinematic model can be derived by a similar procedure, as follows:
$$\theta_6 = \operatorname{atan2}\left(R_{3,1},\, R_{3,3}\right) \qquad (15)$$

$$\theta_7 = \operatorname{atan2}\left(R_{1,2},\, R_{2,2}\right) \qquad (16)$$
Hence, through the transformation matrix $T_6^8$, the posture of the hand is extracted with respect to the $O_5x_5y_5z_5$ coordinate frame (Figure 1).
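Since Equations (13)-(16) share the same structure, a single sketch (our naming) covers both the elbow and the wrist, where R is the corresponding segment's rotation matrix:

```python
import numpy as np

def hinge_angles(R: np.ndarray):
    """Elbow (theta_4, theta_5) or wrist (theta_6, theta_7) angles from the
    segment's rotation matrix, per Equations (13)-(16). Note: the paper's
    1-based matrix indices become 0-based here."""
    flexion  = np.arctan2(R[2, 0], R[2, 2])  # Equations (13)/(15)
    rotation = np.arctan2(R[0, 1], R[1, 1])  # Equations (14)/(16)
    return flexion, rotation
```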

4. System Implementation and Experimental Results

4.1. Motion Sensors

The implemented device, shown in Figure 2, consists of inertial sensors that capture the node's motion with respect to the inertial frame, and a microcontroller with Wi-Fi capabilities. Specifically, the device consists of an IMU, a microcontroller and a removable, rechargeable small-size battery. The IMU used is a 9 DoF Micro-Electro-Mechanical System (MEMS) IMU, the single-chip MPU-9150 produced by InvenSense Inc. The microcontroller used is a low-cost Wi-Fi chip with a full TCP/IP stack, the ESP8266 produced by Espressif Systems [48]. The purpose of this device is to retrieve the sensor data and transmit them via Wi-Fi to a base station, where the data are processed, stored and visualized. Finally, the battery used is a removable, high-current, rechargeable RCR123A lithium battery, which allows for long operation and easy replacement.
Through an Inter-Integrated Circuit (I²C) protocol [49], the data received from the IMU sensors are used to estimate the orientation of the IMU chip. The ESP8266 microcontroller serves every device that requests data from the sensor node. The fact that the HTTP protocol [50] uses large headers, while also lacking full-duplex communication, makes it unsuitable for the designed application. Moreover, the time restrictions and the requirement of transferring a great amount of data from the sensor to the base station in a request-response messaging protocol imposed the use of a more appropriate protocol. Thus, the WebSocket protocol [51] was adopted. This protocol offers a persistent TCP/IP connection, over which client and server can exchange packets, avoiding burdening the communication channel with a large volume of irrelevant data.
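On the base station side, such a persistent connection can be consumed with a few lines of Python. The sketch below uses the third-party websockets package; the node address, port and JSON packet layout are assumptions for illustration, not the firmware's documented format.

```python
import asyncio
import json
import websockets  # third-party: pip install websockets

SENSOR_URI = "ws://192.168.4.1:81/"  # hypothetical node address and port

async def stream_samples():
    # Persistent WebSocket connection to the ESP8266 node (Section 4.1).
    # Each message is assumed here to be a JSON packet with a timestamp
    # and the 9 raw IMU channels; the actual packet format may differ.
    async with websockets.connect(SENSOR_URI) as ws:
        async for message in ws:
            sample = json.loads(message)
            # e.g. {"t": 1234, "acc": [...], "gyr": [...], "mag": [...]}
            print(sample["t"], sample["acc"])

asyncio.run(stream_samples())
```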
Each sensor node requires a calibration process before its first use. There are three sensors in the node, and there is a different procedure for each one. For the magnetometer, the sensor should be rotated multiple times in different directions, while only the minimum and maximum values on each axis are stored. After rotating for a while, the offset of each axis is found by taking the mean of the maximum and minimum values for that axis; this offset is then subtracted from the measurements. For the gyroscope, leaving the device on a surface for a few seconds yields the mean value of the sensor, which is supposed to be zero. Thus, this mean value is subtracted from each measurement in order to obtain the correct angular rotation of the device. The accelerometer calibration requires laying the sensor on each of its faces. In this way, the gravity vector can be identified on each axis, and the values are saved on the microcontroller.
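In code, the magnetometer and gyroscope steps reduce to simple per-axis statistics; a NumPy sketch of the offsets described above (function names are ours):

```python
import numpy as np

def mag_offsets(samples: np.ndarray) -> np.ndarray:
    """Hard-iron offsets: mean of per-axis min and max over a slow
    rotation of the node in many directions (samples shape: N x 3)."""
    return (samples.min(axis=0) + samples.max(axis=0)) / 2

def gyro_bias(samples: np.ndarray) -> np.ndarray:
    """Bias: mean reading while the node rests on a surface, where the
    true angular rate is supposed to be zero (samples shape: N x 3)."""
    return samples.mean(axis=0)

# Corrected measurements are then (raw - offset) for both sensors.
```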
During the experimental sessions, a motion capturing sensor node is mounted on the outer side of the selected upper limb segment, with the Y axis of the sensor aligned with the longitudinal axis of the segment. The correct placement of the sensor is ensured using a specially designed fabric case, which is strapped around the circumference of the limb segment. Then, the sensor is switched on and communication with the base station is established. The objectives are the acquisition, storage and concurrent visualization of the node's IMU measurements during the subject's exercise session.
Concurrently, a Shimmer3 sensor node [26] was aligned with the custom-made sensor proposed in this work, and both were attached to the human's upper limb segments. A custom-made piece with two sockets was used to align the two sensors precisely. The measurements from the Shimmer3 sensor node were captured directly in Matlab® using the Shimmer Matlab Instrument Driver. This driver is provided by the corresponding company and establishes communication with the device via the Bluetooth protocol. Before the motion capturing session, the Shimmer3 units were calibrated using the Shimmer 9DOF Calibration Application, also provided by the same company. This application implements an automated process that calculates the calibration parameters for the Shimmer's integrated accelerometer, gyroscope and magnetometer sensors. These calibration parameters are finally stored in the unit's memory, so that the sensor measurements can be automatically corrected before they are sent to the paired device (e.g., a computer functioning as the base station).
In the sequel, the collected sensor measurements, for both types of sensors, are processed and filtered as described in the previous Section. Then, the DoF rotations $\theta_i$ of the upper limb model are extracted by Equation (12) for the shoulder, Equations (13) and (14) for the elbow, and Equations (15) and (16) for the wrist joint, respectively. In the following Subsections, the 3D reconstructed upper limb joint trajectories are presented for common rehabilitation exercises, namely the elbow flexion-extension, the shoulder abduction-adduction and the wrist flexion-extension exercises. In the processing stage, the sensor's sampling period is set to $T = 20$ ms, while the values of the model lengths are defined as $l_c = 0.2322$ m, $l_{ua} = 0.3348$ m, $l_{fa} = 0.2628$ m and $l_h = 0.1944$ m, as estimated for the subject's height of $H = 1.80$ m. Furthermore, the first two experimental cases do not account for any wrist rotation, since no sensor node was attached to the subject's hand. Hence, the corresponding DoF angles are assumed to be $\theta_6 = \theta_7 = 0$.

4.2. Elbow Joint Flexion-Extension Exercise

During this exercise session, the subject performs flexion and extension of the elbow. The exercise is repeated a few times, and the sensor measurements of the node attached to the subject's forearm are recorded and processed. This results in the estimation of the node's orientation angles $\phi$, $\theta$ and $\psi$, from which we can estimate the elbow DoF angles. Comparisons between the proposed custom-made sensor node and the Shimmer3 sensor unit for the estimated angles $\theta_4$ and $\theta_5$ are shown in Figure 3 and Figure 4, respectively.
The trajectories of the shoulder, elbow and wrist joints and the tip of the hand along the $x$, $y$ and $z$ axes are presented in Figure 5. The distances between the segment joints, as shown in this figure, are in accordance with the corresponding fractions of the human's height. The motion takes place in the $yz$ plane, and the $x$ component of each joint is constant and equal to $l_c$, since the segments' positions depend only on the angle $\theta_4$. In this case, the position of the forearm is calculated by the transformation matrix $T_4^6$, which depends only on the variable $\theta_4$. The angle $\theta_5$ affects only the orientation of the forearm and not its position. Furthermore, the hand is assumed to be aligned with the forearm; thus, the angles $\theta_6$ and $\theta_7$ of the wrist joint are zeroed.
The resulting joint trajectories, as extracted by the upper limb model presented in Section 2 for the calculated elbow DoF angles, are illustrated in Figure 6. The trajectories extracted for the case of the custom-made sensor node are noted with the 'o' symbol and represent the exercise that the patient has performed. The upper limb segments' trajectories for the case of the Shimmer3 sensor unit's measurements are also presented in the same figure. Deviations between the trajectories, especially at the upper and lower limits of the motion range, can be identified in this figure. In the case of the Shimmer3, the trajectories are more limited (noted with deep purple and deep red color). This results from the elbow DoF angle $\theta_4$ ranging approximately from 13° to 85°, while the corresponding angle calculated from the measurements of the proposed sensor node ranges between 8° and 88°, as shown in Figure 3.
The processing and filtering methodology described in Section 3.1 was followed for the IMU measurements of both sensor nodes. The nodes were calibrated before the capturing session. It was noticed that the differences in the elbow angle ranges, observed along a series of experiments, were related to the calibration accuracy of the sensors. Moreover, it was noticed that the forearm configuration during the execution of the exercise is not affected by the gimbal lock problem. Finally, the points visualized in the 3D trajectory plots correspond to the total number of the exercise's repetitions and not to only one.
The mean absolute error (MAE), the maximum error (MaxError) and the root mean square error (RMSE) for the wrist and the tip of the hand (end-effector) trajectories, as calculated between the proposed sensor node and the Shimmer one, are summarized in Table 4,
where

$$\mathrm{MAE} = \frac{\sum_{i=1}^{n} \left| P_{sen}^{i} - P_{shim}^{i} \right|}{n}, \qquad \mathrm{MaxError} = \max_{i=1,\dots,n} \left| P_{sen}^{i} - P_{shim}^{i} \right|, \qquad \mathrm{RMSE} = \sqrt{\frac{\sum_{i=1}^{n} \left( P_{sen}^{i} - P_{shim}^{i} \right)^2}{n}} \qquad (17)$$
and the 3D trajectory points $P_{sen}^{i}$ for the custom-made sensor node and $P_{shim}^{i}$ for the Shimmer one are given as $P^{i} = \sqrt{x_i^2 + y_i^2 + z_i^2}$, while $n$ denotes the total number of points of the trajectory. The estimated error metrics, which represent the difference between the trajectories extracted by the two sensor nodes, show a quite high level of accuracy.
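These metrics are straightforward to compute; the following NumPy sketch (our function name) evaluates Equation (17) over two N x 3 trajectory arrays:

```python
import numpy as np

def trajectory_errors(traj_sen: np.ndarray, traj_shim: np.ndarray):
    """MAE, MaxError and RMSE of Equation (17); inputs are N x 3 arrays of
    trajectory points from the custom node and the Shimmer3 unit."""
    P_sen  = np.linalg.norm(traj_sen,  axis=1)  # P^i = sqrt(x^2 + y^2 + z^2)
    P_shim = np.linalg.norm(traj_shim, axis=1)
    diff = P_sen - P_shim
    mae = np.mean(np.abs(diff))
    max_error = np.max(np.abs(diff))
    rmse = np.sqrt(np.mean(diff ** 2))
    return mae, max_error, rmse
```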

4.3. Shoulder Joint Abduction-Adduction Exercise

Before the exercise session, the wearable device is attached to the subject's upper arm. During this exercise session, the subject performs abduction and adduction of the shoulder joint of his right upper limb. The sensor's on-board processor estimates the node's orientation angles $\phi$, $\theta$ and $\psi$. Then, it calculates the shoulder rotation angles $\theta_1$, $\theta_2$, $\theta_3$ based on the procedure described above. The results are presented in Figure 7, Figure 8 and Figure 9, accordingly, while the corresponding elbow angles $\theta_4$ and $\theta_5$ remain fixed during this exercise. As shown in the aforementioned figures, the shoulder joint angles, as extracted from both sensor nodes, are almost identical. Therefore, an accurate reconstruction of the upper limb movement can be achieved using either a Shimmer sensor node or the proposed custom-made one.
The trajectories of the shoulder, the elbow, the wrist and the tip of the hand along the $x$, $y$ and $z$ axes are presented in Figure 10. Deviations in the $y$ axis occur, although the subject's motion took place in the $xz$ (sagittal) plane. This was caused by the non-ideal mounting of the node on the subject's upper arm; thus, a sensor-to-body frame transformation should be defined.
An issue that might occur during this motion is the gimbal lock problem. This results from the use of the Euler orientation angles $\phi$, $\theta$ and $\psi$ to find, after processing, the position and orientation of the corresponding segment. In fact, the shoulder joint can perform three rotations. When the $\theta_2$ rotation reaches its furthest point, the shoulder joint axes $z_0$ and $z_2$ of the first and third shoulder DoFs, accordingly, become collinear. This is a singular configuration, and only the sum $\theta_1 + \theta_3$ or the difference $\theta_1 - \theta_3$ can be determined. One solution is to choose one angle arbitrarily and then determine the other using trigonometric equations. Nevertheless, the range of motion in the present exercise is not affected by the gimbal lock issue.
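This arbitrary-choice workaround can be made explicit in code. The following variant of the earlier shoulder sketch (our construction, under the same reading of Equation (11)) fixes $\theta_1$ at the singularity and recovers the remaining observable combination:

```python
import numpy as np

def shoulder_angles_safe(R: np.ndarray, theta1_fixed: float = 0.0,
                         eps: float = 1e-6):
    """Shoulder extraction with the singular pose handled explicitly.
    When cos(theta_2) ~ 0, axes z_0 and z_2 are collinear and only the sum
    (or difference) of theta_1 and theta_3 is observable, so theta_1 is
    fixed arbitrarily, as suggested above."""
    beta = np.arcsin(np.clip(R[2, 1], -1.0, 1.0))
    if abs(np.cos(beta)) > eps:                        # regular case
        gamma = np.arctan2(-R[2, 0], R[2, 2])
        alpha = np.arctan2(-R[0, 1], R[1, 1])
    elif beta > 0:                                     # beta = +pi/2
        alpha = theta1_fixed - np.pi / 2               # chosen freely
        gamma = np.arctan2(R[0, 2], R[0, 0]) - alpha   # alpha+gamma observable
    else:                                              # beta = -pi/2
        alpha = theta1_fixed - np.pi / 2
        gamma = alpha - np.arctan2(-R[0, 2], R[0, 0])  # alpha-gamma observable
    return alpha + np.pi / 2, beta, gamma              # theta_1..theta_3
```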
The 3D reconstructed trajectories of each upper limb joint are presented in Figure 11, noted with the 'o' symbol, for the case of the custom-made sensor node. A few consecutive repetitions of the abduction-adduction exercise were performed, which explains the spread of the visualized points. The extracted upper limb segments' trajectories for the case of the Shimmer3 unit are also presented in the same figure. As in the comparison of the previous exercise (elbow flexion-extension), the trajectories of the upper limb segments extracted from the Shimmer3 sensor's measurements (noted with light blue, deep purple and deep red color) are more limited than the corresponding ones resulting from the custom-made motion capture sensor node. This results from the minor differences in the calculated shoulder DoF ranges between the two sensor nodes. Once again, in this experiment, it was observed that the shoulder angle ranges were related to the accuracy of the calibration parameters of each sensor.
The mean absolute error (MAE), the maximum error (MaxError) and the root mean square error (RMSE) for the elbow, wrist and tip of the hand (end-effector) trajectories between the two sensor nodes, as extracted using Equation (17), are summarized in Table 5.

4.4. Wrist Joint Flexion-Extension Exercise

In this case, the subject performs a few repetitions of the wrist joint flexion-extension exercise with his right upper limb. The extracted sensor orientation angles $\phi$, $\theta$ and $\psi$ contribute to the estimation of the wrist DoF angles. During this exercise, the subject's elbow is kept flexed at $\theta_4 = 90°$, perpendicular to the coronal plane, and the hand is repeatedly flexed with the palm towards the chest to almost $\theta_6 = 55°$ and extended to almost $\theta_6 = -25°$, as shown in Figure 12. In Figure 13, a deviation around a mean value is presented, corresponding to the second DoF of the wrist, that of radial-ulnar deviation. In fact, a small deviation is present during the subject's wrist motion; thus, the deviation in this graph is reasonable.
The resulting joint trajectories, as extracted by the upper limb model for the wrist DoF angles $\theta_6$ and $\theta_7$, are illustrated in Figure 14. They represent the exercise that the subject has performed, along with this small deviation. The corresponding trajectories of the shoulder, the elbow, the wrist and the tip of the hand along the $x$, $y$ and $z$ axes are presented in Figure 15.
In this case, a comparison between the custom-made and the Shimmer3 sensor nodes was not performed, because it was not possible to attach the array of aligned sensors to the subject's hand, due to the hardware dimensions.
Such graphs of upper limb trajectories and joint DoF angles during exercise sessions are useful for the evaluation of a subject's upper limb status by physical therapists. In particular, the analysis of the range of motion of each joint angle, or even the estimation of the speed during the execution of an exercise, indicates the status and progress of a patient with upper limb movement disorders. Hence, the therapists can tune the subsequent rehabilitation sessions accordingly.

5. Discussion

During the elaboration of this work, and especially its experimental part, several issues were encountered while validating the sensor fusion algorithms along with the upper limb modeling for successful 3D motion reconstruction. Some of them need more attention and further investigation, aiming at the improvement of upper limb motion tracking.
The experimental results showed that the proposed custom-made motion capturing system has an acceptable performance, as indicated by the estimated error metrics in Table 4 and Table 5. Using optical motion capturing systems instead, such as Vicon, the position error would be much smaller [52]. Nevertheless, in order to avoid the occlusion problem, which is the main drawback of this kind of system, multiple cameras would have to be installed in the rehabilitation center, and the overall cost may be prohibitive for an average healthcare facility.
The Euler angle representation was found to be easier to handle and made it possible to identify, and subsequently resolve, failing results in the implementation. At the current stage of this research, the ranges of motion performed during the experimental tests are not affected by the gimbal lock problem; thus, the Euler configuration was finally adopted. A quaternion-based option that is currently being examined is the geometrical constraints modeling and fusion methodology presented in [53,54], where each IMU placement defines the corresponding segment's reference frame. When the quaternion representation of the sensors' measurement flow is provided, a fusion algorithm accounts for the conversion of quaternion sequences into axis-angle sequences. In this case, an upper limb model is also needed in order to express the biomechanical characteristics and contribute to the 3D motion reconstruction.
Regarding the current state of the presented custom-made sensor system, upper limb movements such as horizontal abduction, which has a range of 130° towards the human chest, or scapular protraction-retraction, have not been examined yet. In such cases, an extra DoF has to be included in the upper limb model. A complete model for both upper limbs can also be defined as an extension of this work.

6. Conclusions

In this paper, an upper limb kinematic model that resembles a robotic arm with rigid links is proposed. It is formed as a 7 DoF kinematic chain, and the Denavit-Hartenberg configuration is adopted to describe it. Furthermore, the forward kinematics matrices that give the position and orientation of each upper limb segment are derived.
For this model, the shoulder, elbow and wrist joint angles are estimated by exploiting the orientation vectors of the upper arm, forearm and hand segments, respectively. For this reason, inertial sensors are attached to the upper limb segments in order to capture their motion. This is accomplished by a two-step procedure. First, a sensor fusion algorithm is applied to the inertial sensors' data along with a complementary filter, in order to extract each individual upper limb segment's orientation. Then, these orientations contribute to the estimation of the rotational DoFs defined in the aforementioned upper limb model.
This leads to the 3D reconstruction of the upper limb model's motion by applying the forward kinematics of the DH configuration. The 3D reconstruction results, the trajectory tracking and the visualization of a subject's upper limb were produced for both the prototype motion capture sensor node and the off-the-shelf Shimmer3 sensor unit, for comparison purposes. The accuracy of the results derived by the custom-made sensor node is validated with an RMSE of 0.02 m and a maximum error of 0.07 m when compared to the results of the similar commercial wireless motion capturing device.
These validation results demonstrate the accuracy of the inertial-technology-based motion analysis presented in this work and can serve as a guide for further improvement. The same methodology could also be extended to estimate lower limb, or even whole-body, movements in 3D space, using additional sensors and similar robotic modeling of the human body segments.

Author Contributions

Conceptualization, O.T. and E.D.; methodology, O.T. and K.G.; software, O.T., K.G. and N.E.; investigation, O.T., K.G., N.E. and E.D.; data curation, O.T., K.G. and N.E.; writing–original draft preparation, O.T., K.G. and E.D.; writing–review and editing, O.T., K.G. and E.D.; visualization, O.T. and K.G.; supervision, E.D.; project administration, E.D.; funding acquisition, O.T., K.G., N.E. and E.D. All authors have read and agreed to the published version of the manuscript.

Funding

This work was performed under the framework of the “EDBM34 Call - Support research focusing on young researchers” project entitled “Design and implementation of a complementary sensing system for physiotherapy procedure monitoring through smart portable devices” (MIS 5005357), co-funded by the European Social Fund (ESF) and National Resources.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Informed consent was obtained from all subjects involved in the study.

Data Availability Statement

Not applicable.

Conflicts of Interest

The authors declare no conflict of interest. The funders had no role in the design of the study; in the collection, analyses, or interpretation of data; in the writing of the manuscript, or in the decision to publish the results.

Abbreviations

The following abbreviations are used in this manuscript:
3D: 3-Dimensional
CSMA/CA: Carrier Sense Multiple Access with Collision Avoidance
DH: Denavit-Hartenberg
DoF: Degrees of Freedom
I²C: Inter-Integrated Circuit
IMU: Inertial Measurement Unit
LoS: Line of Sight
MEMS: Micro-Electro-Mechanical System
OLE: Optical Linear Encoder
PCA: Principal Component Analysis
TDMA: Time Division Multiple Access
WSN: Wireless Sensor Network

References

  1. García-Rudolph, A.; Laxe, S.; Sauri, J.; Opisso, E.; Tormos, J.; Bernabeu, M. Evidence of chronic stroke rehabilitation interventions in activities and participation outcomes: Systematic review of meta-analyses of randomized controlled trials. Eur. J. Phys. Rehabil. Med. 2019, 55, 695–709.
  2. Kwakkel, G.; van Peppen, R.; Wagenaar, R.C.; Dauphinee, S.W.; Richards, C.; Ashburn, A.; Miller, K.; Lincoln, N.; Partridge, C.; Wellwood, I.; et al. Effects of augmented exercise therapy time after stroke: A meta-analysis. Stroke 2004, 35, 2529–2539.
  3. López-Nava, I.H.; Muñoz-Meléndez, A. Wearable Inertial Sensors for Human Motion Analysis: A Review. IEEE Sens. J. 2016, 16, 7821–7834.
  4. Zhou, H.; Hu, H. Human motion tracking for rehabilitation—A survey. Biomed. Signal Process. Control 2008, 3, 1–18.
  5. Onose, G.; Popescu, N.; Munteanu, C.; Ciobanu, V.; Sporea, C.; Mirea, M.D.; Daia, C.; Andone, I.; Spînu, A.; Mirea, A. Mobile Mechatronic/Robotic Orthotic Devices to Assist–Rehabilitate Neuromotor Impairments in the Upper Limb: A Systematic and Synthetic Review. Front. Neurosci. 2018, 12, 577.
  6. Maciejasz, P.; Eschweiler, J.; Gerlach-Hahn, K.; Jansen-Troy, A.; Leonhardt, S. A survey on robotic devices for upper limb rehabilitation. J. NeuroEng. Rehabil. 2014, 11, 3.
  7. Yoon, J.; Novandy, B.; Yoon, C.H.; Park, K.J. A 6-DOF gait rehabilitation robot with upper and lower limb connections that allows walking velocity updates on various terrains. IEEE/ASME Trans. Mechatron. 2010, 15, 201–215.
  8. Wong, W.Y.; Wong, M.S.; Lo, K.H. Clinical applications of sensors for human posture and movement analysis: A review. Prosthet. Orthot. Int. 2007, 31, 62–75.
  9. Hadjidj, A.; Souil, M.; Bouabdallah, A.; Challal, Y.; Owen, H. Wireless sensor networks for rehabilitation applications: Challenges and opportunities. J. Netw. Comput. Appl. 2013, 36, 1–15.
  10. Gravina, R.; Alinia, P.; Ghasemzadeh, H.; Fortino, G. Multi-sensor fusion in body sensor networks: State-of-the-art and research challenges. Inf. Fusion 2016, 35, 68–80.
  11. Macedo, P.; Afonso, J.A.; Rocha, L.A.; Simoes, R. A telerehabilitation system based on wireless motion capture sensors. In Proceedings of the International Conference on Physiological Computing Systems, Lisbon, Portugal, 7–9 January 2014; pp. 55–62.
  12. Arnold, D.; Li, X.; Lin, Y.; Wang, Z.; Yi, W.J.; Saniie, J. IoT Framework for 3D Body Posture Visualization. In Proceedings of the 2020 IEEE International Conference on Electro Information Technology (EIT), Chicago, IL, USA, 31 July–1 August 2020; pp. 117–120.
  13. Lee, G.X.; Low, K.S.; Taher, T. Unrestrained measurement of arm motion based on a wearable wireless sensor network. IEEE Trans. Instrum. Meas. 2010, 59, 1309–1317.
  14. Mazomenos, E.B.; Biswas, D.; Cranny, A.; Rajan, A.; Maharatna, K.; Achner, J.; Klemke, J.; Jöbges, M.; Ortmann, S.; Langendörfer, P. Detecting elementary arm movements by tracking upper limb joint angles with MARG sensors. IEEE J. Biomed. Health Inform. 2016, 20, 1088–1099.
  15. Hadjidj, A.; Bouabdallah, A.; Challal, Y. Rehabilitation supervision using wireless sensor networks. In Proceedings of the 2011 IEEE International Symposium on a World of Wireless, Mobile and Multimedia Networks (WoWMoM), Lucca, Italy, 20–24 June 2011; pp. 1–3.
  16. Nguyen, K.D.; Chen, I.M.; Luo, Z.; Yeo, S.H.; Duh, H.B.L. A wearable sensing system for tracking and monitoring of functional arm movement. IEEE/ASME Trans. Mechatron. 2011, 16, 213–220.
  17. Lim, C.K.; Chen, I.M.; Luo, Z.; Yeo, S.H. A low cost wearable wireless sensing system for upper limb home rehabilitation. In Proceedings of the 2010 IEEE Conference on Robotics Automation and Mechatronics (RAM), Singapore, 28–30 June 2010; pp. 1–8.
  18. Lim, K.Y.; Goh, F.Y.K.; Dong, W.; Nguyen, K.D.; Chen, I.M.; Yeo, S.H.; Duh, H.B.L.; Kim, C.G. A wearable, self-calibrating, wireless sensor network for body motion processing. In Proceedings of the IEEE International Conference on Robotics and Automation, Pasadena, CA, USA, 19–23 May 2008; pp. 1017–1022.
  19. Jiang, Y.; Qin, Y.; Kim, I.; Wang, Y. Towards an IoT-based upper limb rehabilitation assessment system. In Proceedings of the 2017 39th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC), Jeju, Korea, 11–15 July 2017; pp. 2414–2417.
  20. Papazoglou, P.; Laskari, T.; Fourlas, G. Towards a low cost open architecture wearable sensor network for health care applications. In Proceedings of the 7th International Conference on Pervasive Technologies Related to Assistive Environments, St. Petersburg, Russia, 23–25 September 2014; p. 10.
  21. Alves, R.C.; Gabriel, L.B.; de Oliveira, B.T.; Margi, C.B.; dos Santos, F.C.L. Assisting physical (hydro)therapy with wireless sensors networks. IEEE Internet Things J. 2015, 2, 113–120.
  22. Libelium Pushes Its eHealth IoT Platform with a New Cloud and Medical Development Kits. Available online: https://www.libelium.com/libeliumworld/libelium-pushes-its-ehealth-iot-platform-with-a-new-cloud-and-medical-development-kits/ (accessed on 15 December 2020).
  23. Deltason Medical Ltd. Available online: www.deltason.com (accessed on 15 December 2020).
  24. Xsens 3D Motion Tracking. Available online: www.xsens.com (accessed on 15 December 2020).
  25. The IoT Marketplace. Available online: www.the-iot-marketplace.com/mysignals-sport-performance-monitoring-development-kit-ble (accessed on 15 December 2020).
  26. Shimmer Wireless Sensing Technology. Available online: www.shimmer-research.com (accessed on 15 December 2020).
  27. De Baets, L.; van der Straaten, R.; Matheve, T.; Timmermans, A. Shoulder Assessment according to the International Classification of Functioning by means of Inertial Sensor Technologies: A Systematic Review. Gait Posture 2017, 57, 278–294.
  28. de Lucena, D.S.; Stoller, O.; Rowe, J.B.; Chan, V.; Reinkensmeyer, D.J. Wearable sensing for rehabilitation after stroke: Bimanual jerk asymmetry encodes unique information about the variability of upper extremity recovery. In Proceedings of the 2017 International Conference on Rehabilitation Robotics (ICORR), London, UK, 17–20 July 2017; pp. 1603–1608.
  29. Tsilomitrou, O.; Gkountas, K.; Evangeliou, N.; Dermatas, E. On the development of a wireless motion capture sensor node for upper limb rehabilitation. In Proceedings of the 2019 6th International Conference on Control, Decision and Information Technologies (CoDIT), Paris, France, 23–26 April 2019; pp. 1568–1573.
  30. Phillips, W.; Hailey, C.; Gebert, G. A review of attitude kinematics for aircraft flight simulation. In Proceedings of the Modeling and Simulation Technologies Conference, Denver, CO, USA, 14–17 August 2000; p. 4302.
  31. Marieb, E.N.; Hoehn, K. Human Anatomy & Physiology; Pearson Education: London, UK, 2007.
  32. Theofanidis, M.; Lioulemes, A.; Makedon, F. A motion and force analysis system for human upper-limb exercises. In Proceedings of the 9th ACM International Conference on Pervasive Technologies Related to Assistive Environments, Corfu, Greece, 29 June–1 July 2016; p. 9.
  33. Peppoloni, L.; Filippeschi, A.; Ruffaldi, E.; Avizzano, C.A. A novel 7 degrees of freedom model for upper limb kinematic reconstruction based on wearable sensors. In Proceedings of the 2013 IEEE 11th International Symposium on Intelligent Systems and Informatics (SISY), Subotica, Serbia, 26–28 September 2013; pp. 105–110.
  34. Ruffaldi, E.; Peppoloni, L.; Filippeschi, A.; Avizzano, C.A. A novel approach to motion tracking with wearable sensors based on probabilistic graphical models. In Proceedings of the 2014 IEEE International Conference on Robotics and Automation (ICRA), Hong Kong, China, 31 May–7 June 2014; pp. 1247–1252.
  35. Ethier, C.R.; Simmons, C.A. Introductory Biomechanics: From Cells to Organisms; Cambridge University Press: Cambridge, UK, 2007; pp. 381, 468–470.
  36. Holzbaur, K.R.; Murray, W.M.; Delp, S.L. A model of the upper extremity for simulating musculoskeletal surgery and analyzing neuromuscular control. Ann. Biomed. Eng. 2005, 33, 829–840.
  37. Ruby, L.; Cooney, W., III; An, K.; Linscheid, R.; Chao, E. Relative motion of selected carpal bones: A kinematic analysis of the normal wrist. J. Hand Surg. 1988, 13, 1–10.
  38. Spong, M.W.; Vidyasagar, M. Robot Dynamics and Control; John Wiley & Sons: Hoboken, NJ, USA, 2008.
  39. InvenSense MPU-9150. Available online: www.invensense.com/products/motion-tracking/9-axis/mpu-9150/ (accessed on 15 December 2020).
  40. Gallagher, A.; Matsuoka, Y.; Ang, W.T. An efficient real-time human posture tracking algorithm using low-cost inertial and magnetic sensors. In Proceedings of the 2004 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Sendai, Japan, 28 September–2 October 2004; Volume 3, pp. 2967–2972.
  41. Tian, Y.; Wei, H.; Tan, J. An adaptive-gain complementary filter for real-time human motion tracking with MARG sensors in free-living environments. IEEE Trans. Neural Syst. Rehabil. Eng. 2012, 21, 254–264.
  42. Fourati, H. Heterogeneous Data Fusion Algorithm for Pedestrian Navigation via Foot-Mounted Inertial Measurement Unit and Complementary Filter. IEEE Trans. Instrum. Meas. 2015, 64, 221–229.
  43. Valenti, R.G.; Dryanovski, I.; Xiao, J. Keeping a Good Attitude: A Quaternion-Based Orientation Filter for IMUs and MARGs. Sensors 2015, 15, 19302–19330.
  44. Mahony, R.; Hamel, T.; Pflimlin, J. Nonlinear Complementary Filters on the Special Orthogonal Group. IEEE Trans. Autom. Control 2008, 53, 1203–1218.
  45. Gui, P.; Tang, L.; Mukhopadhyay, S. MEMS based IMU for tilting measurement: Comparison of complementary and Kalman filter based data fusion. In Proceedings of the 2015 IEEE 10th Conference on Industrial Electronics and Applications (ICIEA), Auckland, New Zealand, 15–17 June 2015; pp. 2004–2009.
  46. Pedley, M. High Precision Calibration of a Three-Axis Accelerometer; Freescale Semiconductor Application Note AN4399; Freescale Semiconductor, 2013.
  47. Tait-Bryan Angles. Available online: https://en.wikipedia.org/wiki/Euler_angles (accessed on 15 December 2020).
  48. Espressif Systems. Available online: www.espressif.com (accessed on 15 December 2020).
  49. Philips Semiconductors. The I2C-Bus Specification; Philips Semiconductors, 2000.
  50. Fielding, R.; Gettys, J.; Mogul, J.; Frystyk, H.; Masinter, L.; Leach, P.; Berners-Lee, T. Hypertext Transfer Protocol—HTTP/1.1; RFC 2616. Available online: http://www.rfc-editor.org/rfc/rfc2616.txt (accessed on 17 February 2021).
  51. Fette, I.; Melnikov, A. The WebSocket Protocol; RFC 6455. Available online: http://www.rfc-editor.org/rfc/rfc6455.txt (accessed on 17 February 2021).
  52. Merriaux, P.; Dupuis, Y.; Boutteau, R.; Vasseur, P.; Savatier, X. A Study of Vicon System Positioning Performance. Sensors 2017, 17, 1591.
  53. Zhang, Z.; Huang, Z.; Wu, J. Hierarchical information fusion for human upper limb motion capture. In Proceedings of the 2009 12th International Conference on Information Fusion, Seattle, WA, USA, 6–9 July 2009; pp. 1704–1711.
  54. Zhang, S.; Jin, W.; Zhang, Y. Implementation and complexity analysis of orientation estimation algorithms for human body motion tracking using low-cost sensors. In Proceedings of the 2017 2nd International Conference on Frontiers of Sensors Technologies (ICFST), Shenzhen, China, 14–16 April 2017; pp. 49–54.
Figure 1. Upper limb representation as a robotic arm.
Figure 2. The wearable motion sensors.
Figure 3. Elbow rotation angle θ4—Flexion-Extension DoF.
Figure 4. Elbow rotation angle θ5—Pronation-Supination DoF.
Figure 5. Joint trajectory coordinates in the x, y, z axes for the elbow flexion-extension exercise—Custom-made sensor node.
Figure 6. Wrist and hand trajectories during the elbow joint flexion-extension exercise—Comparison with Shimmer3 sensor nodes.
Figure 7. Shoulder rotation angle θ1—Flexion-Extension DoF.
Figure 8. Shoulder rotation angle θ2—Abduction-Adduction DoF.
Figure 9. Shoulder rotation angle θ3—Internal-External rotation DoF.
Figure 10. Joint trajectory coordinates in the x, y, z axes for the shoulder abduction-adduction exercise—Custom-made sensor node.
Figure 11. Elbow, wrist and hand trajectories during the shoulder joint abduction-adduction exercise—Comparison with Shimmer3 sensor nodes.
Figure 12. Wrist rotation angle θ6—Flexion-Extension DoF.
Figure 13. Wrist rotation angle θ7—Radial-Ulnar deviation DoF.
Figure 14. Joint trajectory coordinates in the x, y, z axes for the wrist flexion-extension exercise—Custom-made sensor node.
Figure 15. Wrist and hand trajectories during the wrist joint flexion-extension exercise.
Table 1. Frames of the upper limb model.

Location             Description
Chest {C}            Chest Position—Origin Frame
Shoulder {S1}        Flexion and Extension
Shoulder {S2}        Abduction and Adduction
Shoulder {S3}        Internal and External Rotation
Elbow {E1}           Flexion and Extension
Elbow {E2}           Pronation and Supination
Wrist {W1}           Flexion and Extension
Wrist {W2}           Radial-Ulnar Deviation
Hand {End-effector}  End-effector
Table 2. Upper limb DoFs ranges of motion.

Motion                               DoF Range
Shoulder extension-flexion           −60° to 180°
Shoulder adduction-abduction         0° to 180°
Shoulder internal-external rotation  −90° to 20°
Elbow extension-flexion              0° to 130°
Elbow supination-pronation           −90° to 90°
Wrist extension-flexion              −70° to 70°
Wrist radial-ulnar deviation         −10° to 25°
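The limits in Table 2 can serve as a sanity check on the joint angles estimated from the sensor data. The following minimal Python sketch is not part of the paper's pipeline; the pairing of the model angles θ1–θ7 to the table rows is an assumption made here for illustration only.

```python
import numpy as np

# Anatomical limits in degrees, transcribed from Table 2.
# The mapping of DoFs to theta_1..theta_7 is a hypothetical ordering.
JOINT_LIMITS_DEG = [
    (-60.0, 180.0),   # shoulder extension-flexion
    (0.0, 180.0),     # shoulder adduction-abduction
    (-90.0, 20.0),    # shoulder internal-external rotation
    (0.0, 130.0),     # elbow extension-flexion
    (-90.0, 90.0),    # elbow supination-pronation
    (-70.0, 70.0),    # wrist extension-flexion
    (-10.0, 25.0),    # wrist radial-ulnar deviation
]

def clamp_to_limits(q_deg):
    """Clamp an estimated joint-angle vector (degrees) to the Table 2 ranges."""
    lo, hi = np.array(JOINT_LIMITS_DEG).T
    return np.clip(np.asarray(q_deg, dtype=float), lo, hi)

# Example: an out-of-range elbow flexion estimate of 140 degrees is clipped to 130.
print(clamp_to_limits([10, 45, -30, 140, 0, 20, 5]))
```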
Table 3. DH parameters for the 7-DoF robotic arm.

Frames  Links  a_i   α_i    d_i    θ_i
{C}     0      0     0°     l_c    90°
{S1}    1      0     90°    0      θ_1 + 90°
{S2}    2      0     90°    0      θ_2 + 90°
{S3}    3      0     90°    l_ua   θ_3 + 90°
{E1}    4      0     90°    0      θ_4 + 180°
{E2}    5      0     −90°   l_fa   θ_5
{W1}    6      0     90°    0      θ_6 − 90°
{W2}    7      l_h   90°    0      θ_7
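These parameters fully specify the chest-to-hand kinematic chain. Below is a minimal Python sketch of the resulting forward kinematics, assuming the standard DH transform of [38]; the segment lengths l_c, l_ua, l_fa and l_h are placeholder values chosen here for illustration, since in practice they are measured on the subject.

```python
import numpy as np

def dh_transform(a, alpha, d, theta):
    """Homogeneous transform of one link under the standard DH convention [38]."""
    ca, sa = np.cos(alpha), np.sin(alpha)
    ct, st = np.cos(theta), np.sin(theta)
    return np.array([
        [ct, -st * ca,  st * sa, a * ct],
        [st,  ct * ca, -ct * sa, a * st],
        [0.0,      sa,       ca,      d],
        [0.0,     0.0,      0.0,    1.0],
    ])

# Hypothetical segment lengths in metres (chest offset, upper arm, forearm, hand).
l_c, l_ua, l_fa, l_h = 0.20, 0.30, 0.25, 0.08

def upper_limb_fk(q):
    """Chest-to-hand transform for the 7-DoF model of Table 3.
    q = [theta_1, ..., theta_7] in radians; returns a 4x4 homogeneous matrix."""
    d90 = np.pi / 2
    rows = [  # (a, alpha, d, theta) per frame, transcribed from Table 3
        (0.0,  0.0,  l_c,  d90),           # {C}
        (0.0,  d90,  0.0,  q[0] + d90),    # {S1}
        (0.0,  d90,  0.0,  q[1] + d90),    # {S2}
        (0.0,  d90,  l_ua, q[2] + d90),    # {S3}
        (0.0,  d90,  0.0,  q[3] + np.pi),  # {E1}
        (0.0, -d90,  l_fa, q[4]),          # {E2}
        (0.0,  d90,  0.0,  q[5] - d90),    # {W1}
        (l_h,  d90,  0.0,  q[6]),          # {W2}
    ]
    T = np.eye(4)
    for a, alpha, d, theta in rows:
        T = T @ dh_transform(a, alpha, d, theta)
    return T

# Example: end-effector position (hand) for the neutral pose, all angles zero.
print(upper_limb_fk(np.zeros(7))[:3, 3])
```

Chaining the per-joint transforms in this way is what lets the system plot the elbow, wrist and hand trajectories of Figures 5–15 from the estimated joint angles alone.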
Table 4. Elbow joint's MAE, maximum and RMSE errors.

Trajectory     MAE [m]   Max Error [m]   RMSE [m]
Wrist          0.0126    0.0509          0.0161
End-Effector   0.0182    0.0706          0.0229
Table 5. Shoulder joint's MAE, maximum and RMSE errors.

Trajectory     MAE [m]   Max Error [m]   RMSE [m]
Elbow          0.0061    0.0462          0.0087
Wrist          0.0072    0.0547          0.0103
End-Effector   0.0076    0.0577          0.0109
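For reproducibility, metrics of this kind can be computed from two time-aligned trajectories; the sketch below assumes the error at each sample is the Euclidean distance between the custom node's and the reference (Shimmer3) node's joint positions, which is one common definition rather than necessarily the paper's exact procedure.

```python
import numpy as np

def trajectory_errors(p_est, p_ref):
    """MAE, maximum and RMS errors between two (N, 3) point trajectories,
    using the per-sample Euclidean distance in metres."""
    e = np.linalg.norm(np.asarray(p_est) - np.asarray(p_ref), axis=1)
    return e.mean(), e.max(), np.sqrt(np.mean(e ** 2))

# Example with synthetic data: a reference trajectory and a perturbed copy.
rng = np.random.default_rng(0)
p_ref = rng.normal(size=(500, 3))
p_est = p_ref + rng.normal(scale=0.01, size=(500, 3))
mae, emax, rmse = trajectory_errors(p_est, p_ref)
print(f"MAE={mae:.4f} m, Max={emax:.4f} m, RMSE={rmse:.4f} m")
```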
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
