Article

Monitoring Activities of Daily Living Using UWB Radar Technology: A Contactless Approach

Department of Technology, Kristiania University College, 0152 Oslo, Norway
* Author to whom correspondence should be addressed.
These authors contributed equally to this work.
Submission received: 30 September 2020 / Revised: 23 October 2020 / Accepted: 27 October 2020 / Published: 30 October 2020

Abstract

In recent years, ultra-wideband (UWB) radar technology has shown great potential in monitoring activities of daily living (ADLs) for smart homes. In this paper, we investigate the significance of using non-wearable UWB sensors for developing non-intrusive, unobtrusive, and privacy-preserving monitoring of elderly ADLs. A controlled experiment was set up, implementing multiple non-wearable sensors in a smart home laboratory setting. A total of nine (n = 9) participants were involved in conducting predefined scenarios of ADLs: cooking, eating, resting, sleeping and mobility. We employed the UWB sensing prototype and conventional implementation technologies, and the sensed data of both systems were stored and analysed, and their performances were compared. The results show that the performance of the non-wearable UWB technology is as good as that of the conventional technologies. Furthermore, we provide a proof-of-concept solution for the real-time detection of abnormal behaviour based on excessive activity levels, and a model for automatic alerts to caregivers for timely medical assistance on demand.

1. Introduction

Trends show that the population of developing nations is growing older than ever before, owing to increased life expectancy and lower birth rates [1,2,3]. This trend will inevitably lead to a shortage of both nursing home places and healthcare personnel while simultaneously increasing the demand for elderly care due to age-related diseases [4]. Consequently, there is growing concern about the sociological and economic challenges of elderly care in the future. Moreover, there is a need for technology that helps maintain the health and wellbeing of the older population with a limited health workforce or limited availability of family members [5]. Additionally, studies show that most older adults prefer to age in place, in the comfort of their own home [6]. One of the proposed solutions to these challenges is to enable the elderly to stay independent at home and age in place for as long as possible [7]. According to the authors, this triggers a demand for evaluating a person’s ability to function independently when performing the activities of daily living (ADLs). However, manual assessment of the performance of elderly ADL is not feasible in real life [8]. Thus, the ubiquitous and automated sensing of elderly activities, behaviour, and physiological and cognitive abilities has received notable attention in the ambient assisted living (AAL) research domain [5]. Moreover, the implementation of such technologies can empower the elderly towards independent living through devices that assist them in conducting ADLs and monitoring their health [4,9]. In this regard, Debes et al. [10] pointed out the great potential of deploying sensing technologies and the Internet of Things (IoT) in the homes of the elderly to classify and monitor their performance in conducting ADLs. On the grounds of such monitoring requirements, several researchers, such as in [11,12,13], have attempted to characterise the existing proximity sensing technologies, as depicted in Table 1.
Vassli and Farshchian [7] noted that one significant challenge in such implementations relates to the elderly’s acceptance of, and motivation to use, the solutions provided. For example, wearable devices are not feasible for long-term elderly monitoring because they are burdensome or can be neglected, for example, when the elderly person suffers from dementia [14]. Consequently, non-wearable solutions are preferred for better elderly acceptance [5]. On the other hand, the perceived privacy of the user of such sensors can be affected by the richness of the technology [10]. Integration of ADL monitoring technologies also needs to be cost-effective and easy to maintain [15]. In this regard, ultra-wideband (UWB) radar technology has shown great potential in recent years [16,17] and has the advantage of being usable for multiple purposes in the AAL setting while simultaneously meeting the requirements of elderly acceptance. However, the effectiveness of such sensors varies depending on the type of activity being recognised [10]. In general, the emphasis on elderly acceptance and perceived usefulness of the chosen technology is crucial for the task of continuous monitoring. Thus, we describe a non-wearable UWB sensing prototype and explore its performance with respect to conventional technologies for the monitoring of elderly ADL. The remainder of the paper is organised as follows. Section 2 provides a review of state-of-the-art sensing technologies in conventional ADL monitoring, AAL, fog computing, the potential of UWB radar sensing in health monitoring systems, and architectural challenges. In Section 3, we present the research methodology, describing the experimental setup, tools and prototypes, and the various sensing technologies used in the study. Section 4 describes the artifact design, the results and discussion are presented in Section 5, and Section 6 concludes the paper.

2. Related Work

2.1. Monitoring ADL

A person’s ability to conduct ADL is important for living a healthy life with minimal caretaker assistance [18]. In this regard, Virone et al. [19] employed a heuristic approach using passive infrared motion sensors to recognise ADL at home, in which the sensed data are wirelessly transmitted and stored on a web server. The authors described the use of circadian activity rhythm (CAR) analysis for establishing patterns and identifying irregularities in the behaviour of the resident. In another experimental setup [18], electrical, force, and contact sensors were used to monitor the usage (when and for how long) of various household items. The authors employed machine learning techniques and were able to establish indicators of the residents’ wellbeing by identifying irregularities based on excessive or neglected activities throughout the day. They argue that these indicators could trigger a message to healthcare personnel or a family member when the system detects irregular activity levels. Dawadi et al. [20] conducted a related study in which a combination of motion/light sensors and an activity recognition algorithm were used to identify and label variables for mobility, sleep, and bed/toilet transitions. The authors were also able to identify and collect time durations for various ADLs, such as cooking, eating, relaxing, personal hygiene, and leaving home, and provided a proof of concept with significant correlation between clinical tests and the proposed system. Although UWB sensing technologies have shown great potential in the AAL domain in recent years, the proposed research is still new. Indeed, to the best of our knowledge, there has been only one study [15] in which the authors used a single non-wearable UWB sensor for ADL monitoring and proposed a framework to acquire and send raw radar data to the cloud through a middleware server architecture. According to Rana et al. [15], the azimuth angle (the resident’s position in each movement) is calculated in the cloud and a short-time Fourier transform (STFT) is performed to determine the frequency distribution of the various activities. The range between the resident and the sensor is also calculated, and these attributes are used to understand the resident’s engagement at different times, meaning that the relationship between the attributes and the ADL can be extracted. This relationship is used to train a support vector machine (SVM) and thereby recognise the conducted ADL and abnormal activities. The authors argue that the implemented framework facilitates acting remotely upon detection of abnormalities.

2.2. The Potential of UWB Radar Sensing

Diraco et al. [5] describe the use of impulse-radio UWB sensing for unobtrusive detection of falls and monitoring of residents’ vital cardiorespiratory signs while they perform ADLs. The authors proposed a framework that reduces interference and noise; increases the signal-to-noise ratio (SNR) by performing clutter removal of signals reflected from static objects (e.g., furniture); and produces a Doppler spectrogram to estimate the distance between the resident and the radar. According to Diraco et al. [5], further detection of signals reflected from a person’s chest is performed to estimate vital signs and micro-motion. The signals were extracted, and noise generated from periodically moving sources (e.g., fans, curtains, doors) was removed using a band-pass filter. Next, the signals reflected from the person are transformed (with minimum noise and clutter) into intrinsic mode functions (IMF), and the reflected respiration and heart-rate signals are separated using empirical mode decomposition (EMD). Finally, heart rate and respiration rate are estimated to simulate the ADLs. However, the authors pointed out that heart-rate estimation is sensitive to activities involving high motion and to distances above three meters from the sensor. Accordingly, activities such as resting, sleeping and watching TV showed excellent results for the monitoring of cardiorespiratory signs. In addition, supervised and unsupervised machine learning algorithms were used to detect falls based on micro-motion signatures. While the supervised approach was based on simulated falls, the unsupervised approach showed greater accuracy by training the algorithm on regular ADL performance to detect abnormalities in the residents’ movement. Thus, the UWB system was able to provide enough discriminating features for the detection of events such as falls through micro-motion signatures while simultaneously monitoring vital signs. Khan et al. [21] also described a system that monitors vital signs and detects the movement of a non-stationary human subject using wearable devices, UWB radar technology and a combination of algorithms. Similarly, Baird et al. [14] presented an algorithm that uses non-contact UWB radar to determine whether a room is occupied by comparing sensed energy values against a threshold. The algorithm determines the number of people in the room using principal component analysis (PCA), calculating the first PC (the most significant variance in the data) and dividing it by the integral of the entire PC. If this value is below a threshold, the algorithm assumes there are more people present and continues by searching for another person, repeating the PC calculation without the window containing the first person detected, and the process continues until all the people are detected and counted. Nguyen and Pyun [22] proposed the Kalman filter (KF) as a clutter reduction method to remove unwanted signals from impulse-radio UWB in indoor positioning systems. The authors proposed a modification to the CLEAN algorithm for target detection as well as an extended KF to estimate the localization and tracking of the target. The results showed that the proposed methods and modifications to conventional approaches improve the efficiency and probability of detecting and tracking moving targets indoors. Mokhtari et al. [23] also described a novel alternative to wearable tags or video cameras for detecting and identifying different residents in a smart home.
The authors note that human identification can be achieved by generating unique signatures from sensors that measure an individual’s body shape and movement, using methods such as passive infrared, analysis of footsteps through a microphone, or ultrasound sensors that detect and identify residents through their height. However, the authors showed that a UWB-based approach is more suitable for detection and identification applications than ultrasound because of its low energy consumption and high data rate. Additionally, as height measurement may lead to lower accuracy (e.g., for individuals with similar heights), the authors suggested considering multiple features such as head, shoulder, and gait.

2.3. Architectural Challenges

Implementing sensing technologies (and IoT) will inevitably lead to vast amounts of data. However, although one could argue that the value of a specific sensor depends largely on how much valuable information can be extracted from the data it provides, a sensor in isolation is useless without the infrastructure to interpret and act upon the generated data. Accordingly, research such as [24] proposed a cloud-centric (private/public) data processing framework for end users. The authors argue that the proposed framework provides flexibility, allowing different concerns (e.g., computation, storage, networking, and visualization) to grow independently while complementing each other in the joint environment. However, the authors also acknowledged open challenges related to architecture, sensing efficiency, security, privacy, quality of service (QoS), protocol effectiveness, data mining, visualization, and support. Stojkoska and Trivodaliev [25] also proposed a similar cloud-centric framework which not only gathers and stores data but also acts as a gateway for application development by third-party stakeholders, enabling them to perform different tasks at different layers: sensors (or objects), hubs, the cloud and third-party applications. The authors also recognised challenges in data processing, interoperability, and networking, and discussed how fog computing could decrease the transmission of data to the cloud by implementing simple data processing algorithms locally. Furthermore, they reflected on challenges related to big data management solutions such as NoSQL databases, business intelligence tools, and distributed data processing systems such as Apache Hadoop. The ever-increasing production of sensors and the challenges of data handling with cloud-centric frameworks are also discussed in [26]. Aazam and Huh [26] commented on the benefit of methods such as smart gateway communication and trimming with fog computing, as well as pre-processing the data locally before sending, to provide efficient service by reducing the computational burden on the cloud. Indeed, by implementing fog computing, delay-sensitive applications can be computed locally and handled in real time as a result of the decrease in transmission delay.

3. Methodology

3.1. Participants

In this study, older adults were the intended target demographic. However, since the prototype system classifies ADLs based only on relative position, the age of participants was not relevant for completing the experiment. Furthermore, because the performed scenarios are normal ADLs, no special skill was required to participate. Thus, for convenience in recruiting participants, an email invitation was sent to Kristiania University College students. We stated in the email that participation was completely voluntary and that one could opt out at any point during the experiment. Participants were encouraged to experience the state-of-the-art sensing technologies used in our experiment; however, no additional incentive was provided. Accordingly, nine users (4 male and 5 female students between their mid-twenties and early thirties) were chosen to participate in the study.

3.2. Experimental Apparatus

X4M03 Xethru UWB, passive infrared, and ultrasonic sensors were employed in the experiment. A Raspberry Pi, the Express server-side framework, MongoDB, and Socket.io (a Node.js library) were used to implement the gateway, while Meteor (a JavaScript framework with Cordova integration for mobile interfaces) and the mLab cloud database service were used for the cloud-level implementation of the prototype. A detailed description of the implementation of these apparatuses is provided in Section 4.

3.3. Experimental Setup

We performed a controlled experiment in a fully integrated smart home monitoring laboratory setting. Various types of non-wearable sensors were mounted in different areas (cooking, eating, sleeping, watching TV and mobility) of the home to detect and classify the participants’ ADLs. While the conventional sensing technologies were placed in each area in order to detect motion, two UWB sensors were mounted on the wall as depicted in Figure 1. By continuously sensing and transmitting data to a fog gateway, the ADLs conducted by the participants were classified based on detection and localization within the predefined locations. The UWB sensors calculate the distance to the person by measuring the reflected radio signals and send it to a local fog gateway, which computes the resident’s relative position in the room based on the intersection of two circles. Consequently, when the resident is detected in an area associated with an activity, the system stores the activity along with a timestamp. The conventional setup, on the other hand, consisted of infrared and ultrasonic motion-detection sensors placed to detect and register the mobility event data caused by the resident for a given ADL. A Raspberry Pi gateway receives the event data, classifies the currently conducted activity, and stores the activity along with a timestamp.
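To make the localization step concrete, the following is a minimal sketch (in JavaScript, matching the Node.js stack used on the gateway) of the two-circle intersection described above. The sensor coordinates, variable names and the rule for picking one of the two mirror solutions are illustrative assumptions, not the prototype’s actual implementation.

```javascript
// Minimal sketch: estimate the resident's (x, y) position from the distances
// reported by two wall-mounted UWB sensors (intersection of two circles).
// Sensor coordinates are hypothetical placeholders for the lab layout.
const sensorA = { x: 0.0, y: 0.0 };   // metres, position of UWB sensor A
const sensorB = { x: 4.0, y: 0.0 };   // metres, position of UWB sensor B

function locateResident(rA, rB) {
  const dx = sensorB.x - sensorA.x;
  const dy = sensorB.y - sensorA.y;
  const d = Math.hypot(dx, dy);                              // distance between the sensors
  if (d > rA + rB || d < Math.abs(rA - rB)) return null;     // circles do not intersect

  // Standard two-circle intersection: point on the line between sensors plus a perpendicular offset.
  const a = (rA * rA - rB * rB + d * d) / (2 * d);
  const h = Math.sqrt(Math.max(rA * rA - a * a, 0));
  const mx = sensorA.x + (a * dx) / d;
  const my = sensorA.y + (a * dy) / d;

  // Two mirror solutions; keep the one inside the room (y >= 0 here, by assumption).
  const p1 = { x: mx + (h * dy) / d, y: my - (h * dx) / d };
  const p2 = { x: mx - (h * dy) / d, y: my + (h * dx) / d };
  return p1.y >= 0 ? p1 : p2;
}

// Example: sensor A reports 2.5 m, sensor B reports 3.0 m.
console.log(locateResident(2.5, 3.0));
```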

3.4. Scenarios

In order to test the system’s ability to detect a resident’s ADL, the participants performed multiple scenarios that simulate a “cooking” ADL as well as four further ADLs: eating, sleeping, resting and mobility. The sensing systems simultaneously detected and classified the ADL based on the resident’s position in the room, and each scenario lasted 3 min. For each of the scenarios described below, the experiment is controlled, started, and stopped through the controller shown in Figure 2.
In the cooking scenario, the participant walks to the kitchen area, grabs and fills a kettle with water, and then proceeds to boil the water. On the kitchen counter, the participant was presented with multiple instant-ramen noodle cups and a set of instructions: pour water into the cup, stir with a fork and wait for a couple of minutes, as shown in Figure 3a. The eating scenario, on the other hand, takes place at the dinner table, as illustrated in Figure 3b, where the participant brings the instant ramen noodles made in the cooking scenario, proceeds to sit down at one of the eight available seats and then starts eating. In the leisure scenario, the participant first walks over to the multimedia area where a TV show is playing and then moves to the couch to sit down and enjoy a couple of minutes of entertainment (see Figure 3c). Similarly, the participant simulated the sleeping scenario by lying down flat in a bed (see Figure 3d) located in the sleeping area behind a partition wall which blocked the direct path between the bed and the UWB sensors. The blocking enabled us to test the UWB radars’ ability to measure distance through obstacles. Finally, a scenario that measures the system’s ability to detect a resident’s mobility was implemented by having the participant lie down in bed for 1.5 min and then watch TV for the remaining 1.5 min. Accordingly, the system detects whether the resident is moving around in the home to perform various ADLs, thereby capturing changes in the resident’s behavior.

3.5. Procedure

First, the participant is instructed to conduct one specific ADL at a time for 3 min while observed through the window. The scenario is also started in the experiment controller dashboard shown in Figure 2. Simultaneously, the sensing prototypes detect and localise the participant, and the sensed data are streamed into a Raspberry Pi fog gateway for processing, thereby classifying the conducted ADL. The classified ADL is then stored in a database containing the identified activity, the source prototype, and a timestamp. When a participant completes a scenario, information about the type of scenario performed and the start/stop time is stored in the database. For each participant and scenario, the process was repeated for both the UWB and conventional sensing systems.

4. Artifact Design

4.1. Architecture

The prototype system is designed in such a way that its control and the generated data remain within the resident’s premises, meaning that data processing is performed locally through fog computing. Additionally, we integrated third-party systems to notify stakeholders when irregularities occur. Thus, we simulate the implementation of a fully integrated smart home ADL monitoring solution for the elderly. Figure 4 depicts the overall project architecture, with multiple sensors providing human presence and distance detection capabilities, connected to microcontrollers that enable seamless throughput of sensed data to the rest of the system. A Raspberry Pi-based fog gateway processes the data received over established TCP connections. This enables us to localise and track the resident’s position in the room and thereby classify and store the activities in a database for later analysis. An off-premise cloud solution receives status messages from the system and the connected sensors and alerts stakeholders when the local sensing system detects abnormal behavior. Accordingly, the resident’s health can be monitored at a glance through reactive, responsive, and interactive (web and mobile) user interfaces. A detailed description of the different levels of the architecture, technologies and frameworks is provided next.

4.2. Sensor Level

In this study, UWB, PIR and ultrasonic sensing prototypes were implemented. The UWB sensing prototype was developed by connecting a Xethru X4M300 presence sensor to a Particle microcontroller, as depicted in Figure 5a. The XeThru X4M300 is Novelda’s presence and occupancy sensor powered by the XeThru X4 ultra-wideband radar chip, which is ultrasensitive, with excellent signal-to-noise performance for detecting the smallest human movement in a room [27]. Initially, when powered up and connected to the local Wi-Fi, the Particle requests connection credentials (IP address and port) from the fog gateway through the Particle cloud, then initialises the radar and makes it ready for analysing the presence data using an open-source Xethru-Arduino library [28]. After restarting the radar using a reset pin on the Xethru board, a predefined profile for occupancy detection analysis is loaded, followed by the generation of a noise map using a pre-processing clutter-reduction technique. Consequently, noise from static objects in the room is accounted for, allowing the sensor to detect micro-movements generated by the resident. Finally, the detection zone is set to an area between 0 and 9 m and the sensitivity is set to the maximum value of nine. At the same time, the Particle establishes a TCP connection to the fog gateway and then sends JSON-encoded data (once per second) containing the sensor’s name and the radar’s current state.
The radar can be in one of the "presence", "no presence", "unknown" and "initializing" states. Once it is initialised and human movement is detected, library methods are used to fetch processed presence data (e.g., the estimated distance to the resident in millimeters, the direction, and an indicator of the signal strength). So, whenever the presence state is detected, the JSON-encoded payload is sent to the Raspberry Pi, which localises the resident in the room and classifies the conducted ADL. If the connection breaks, the Particle re-sends a request for connection information. Thus, when the fog gateway boots up, the UWB prototypes connect, reinitialise the radar, and send the sensed data automatically. As part of the conventional technologies, PIR prototypes were developed, consisting of a Luxorparts PIR sensor connected to a Particle Photon microcontroller. With a 7-m detection range and a 100-degree angle, the prototypes were mounted pointing down from the roof above the dinner table and kitchen area in order to detect motion and thereby classify the cooking and eating ADLs, as shown in Figure 5b. The PIR sensor does not calculate the distance to the resident; instead, it detects the radiation levels emitted in the room. Because of body heat, humans emit higher levels of radiation than household objects, which enables the sensor to detect motion in the area. Furthermore, the connection between the sensor and the Particle microcontroller allowed presence data to be sent continuously every second. As in the UWB prototype, the connected Particle microcontroller is implemented with a switch/case state system, but it does not require initializing a radar. The prototype sends sensed data containing the sensor’s name, presence detection and predefined activity. When the presence state is detected, the fog gateway classifies the conducted activity.
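For illustration, the per-second payloads from the UWB and PIR prototypes might look as sketched below; the field names are assumptions made for readability and are not the exact schema used by the prototypes.

```javascript
// Illustrative payload shapes (field names are assumptions, not the exact schema).
const uwbPayload = {
  sensor: "uwb-1",          // sensor name
  state: "presence",        // one of: presence, no presence, unknown, initializing
  distanceMm: 2500,         // estimated distance to the resident in millimetres
  direction: "approaching", // movement direction reported by the radar
  signal: 0.83              // indicator of the reflected signal strength
};

const pirPayload = {
  sensor: "pir-kitchen",    // sensor name
  presence: true,           // motion detected above the radiation threshold
  activity: "cooking"       // activity predefined for this sensor location
};
```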
Ultrasonic motion detection is another sensing prototype, developed using an HC-SR05 sensor connected to a Particle Photon microcontroller, as seen in Figure 5c. The sensor transmits ultrasonic sound waves and measures the time it takes for the reflected signals to be received, which makes it possible to calculate the distance between the sensor and an object in its pathway. Multiplying the measured round-trip time by the speed of sound (in cm/s) and halving the result gives the distance in centimeters. Consequently, the sensors are placed in key positions where the resident conducts an ADL in their direct pathway, and the distance (in cm) is calculated. The ultrasonic prototype sends the sensed data, including the name of the sensor, the presence state, and the predefined activity, and the fog gateway classifies the conducted activity whenever a state attribute change to “presence” is detected. Particle microcontrollers are used to manage the connectivity of the sensors with the rest of the system over an access point (e.g., Wi-Fi) [29]. The configuration automatically retrieves the IP address of the fog gateway and leverages the native publish/subscribe feature of the microcontrollers, as depicted in Figure 6 and Figure 7. When a microcontroller is online, a HelloWorld message including its unique Particle ID is periodically published to the cloud, thereby making the device known to the system and giving it access to the fog gateway connection string. The fog gateway is set up with a custom API library for communicating with the Particle cloud, which can detect messages from the microcontroller. The fog gateway then makes a call to a function on the microcontroller through the API, which enables the Particle to fetch and send data directly to the fog gateway over a local TCP connection.
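A minimal sketch of the time-of-flight calculation follows, assuming the echo time is reported in microseconds; the constant and function name are illustrative.

```javascript
// Minimal time-of-flight sketch (illustrative, not the prototype's firmware).
// The HC-SR05 reports the round-trip echo time; halving it gives the one-way distance.
const SPEED_OF_SOUND_CM_PER_US = 0.0343; // ~343 m/s expressed in cm per microsecond

function echoTimeToDistanceCm(echoTimeUs) {
  return (echoTimeUs * SPEED_OF_SOUND_CM_PER_US) / 2;
}

console.log(echoTimeToDistanceCm(5830)); // ≈ 100 cm for a 5.83 ms round trip
```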

4.3. The Gateway

The smart fog gateway provides a middleware service capable of reconciling the sensing prototypes and the cloud [30,31]. It makes it possible to classify the ADLs locally on a low-cost IoT device, thereby eliminating the need to export sensitive data off the resident’s premises. It implements a web application using Node.js frameworks on a Raspberry Pi, in which the Express server-side framework is used to set up a local RESTful API through server-side routing. This allows the sensing prototypes to interact with the fog through established HTTP methods. Furthermore, the data are processed with regard to localization and classification of the conducted ADL. Moreover, notifications about the system’s status are sent to the external cloud solution, and alerts are raised whenever irregularities are detected. RESTful APIs were set up to allow connection and interaction with the fog gateway through predefined HTTP POST methods. The APIs are accessed by specifying the IP address and port number of the Raspberry Pi in the request headers along with JSON-encoded data, as described next (a minimal route sketch follows the list):
  • <IP>:<Port>/api/event: receives JSON-encoded presence data over TCP.
  • <IP>:<Port>/api/experiment: stores performed activities along with the start and stop timestamps.
  • <IP>:<Port>/api/mobility: stores timestamp of movements during the mobility scenario.
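The following is a minimal sketch of how these routes could be set up with Express on the Raspberry Pi; the handler bodies, in-memory store and port number are illustrative assumptions, whereas the route paths are those listed above.

```javascript
// Minimal sketch of the fog gateway's RESTful API (Express on the Raspberry Pi).
// The three routes match those listed above; the handler bodies are illustrative only,
// using an in-memory store where the prototype would use its local MongoDB instance.
const express = require('express');
const app = express();
app.use(express.json()); // parse JSON-encoded request bodies

const store = { events: [], experiments: [], mobility: [] }; // placeholder persistence

app.post('/api/event', (req, res) => {
  // Presence data from a sensing prototype; handed to localisation/classification.
  store.events.push({ ...req.body, receivedAt: new Date() });
  res.sendStatus(200);
});

app.post('/api/experiment', (req, res) => {
  // Start/stop record of a performed scenario.
  store.experiments.push({ ...req.body, receivedAt: new Date() });
  res.sendStatus(201);
});

app.post('/api/mobility', (req, res) => {
  // Timestamp of a movement registered during the mobility scenario.
  store.mobility.push({ ...req.body, receivedAt: new Date() });
  res.sendStatus(201);
});

app.listen(3000); // port number is an assumption; the prototype's port may differ
```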
The external cloud solution allows authorised stakeholders to monitor the resident’s health at any time. Therefore, whenever a sensing prototype connects to the local fog gateway, the name of the sensor is added to a list along with a timestamp. The conventional sensors are associated with a location (and hence an activity), but this is not the case with the UWB prototypes. Thus, whenever the resident is present, the UWB prototypes add the estimated distance between the sensor and the resident to the JSON-encoded payload, which enables the system to localise and track the resident’s position in the room. The position of the resident is estimated using the intersection of two circles centred on the X and Y coordinates of the UWB sensors in a grid system, as shown in Figure 8.
The classification of ADLs is performed by checking whether the resident’s position is within one of the predefined areas in the room shown in Figure 1. Thus, the X and Y coordinates of the resident are used to classify the ADL by checking whether the resident is within the specified area. However, being registered inside a predefined area is not enough to classify whether the activity is being conducted. For example, the resident could be walking past the predefined area, which would result in a false classification. Thus, a simple ADL classification algorithm was implemented based on the frequency of consecutive presences (e.g., five times) of the resident in an area, and the classification data along with a timestamp are stored in the local database. Monitoring excessive or neglected performance of ADLs can establish indicators of the resident’s well-being [18]. Thus, an algorithm was implemented to detect irregularities based on the time spent conducting a specific ADL relative to a threshold value for normal behavior, enabling notifications to be sent to the cloud. The processed sensor data are sent for visualization in the user interface through Socket.io, which enables real-time, bidirectional and event-based communication [32]. Thus, the data sent from the sensing prototypes are received, processed and visualised in real time in the user interface, and the progress of the classification algorithm is shown in progress bars.
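A minimal sketch of the consecutive-presence rule and the threshold-based irregularity check follows; the area boundaries, the required count of five and the threshold values are assumptions chosen for illustration, not the values configured in the prototype.

```javascript
// Sketch of the classification rule and irregularity check described above.
// Area boundaries, the consecutive-presence count and the thresholds are assumptions.
const areas = [
  { activity: 'cooking',  xMin: 0, xMax: 2, yMin: 0, yMax: 2 },
  { activity: 'eating',   xMin: 2, xMax: 4, yMin: 0, yMax: 2 },
  { activity: 'resting',  xMin: 0, xMax: 2, yMin: 2, yMax: 4 },
  { activity: 'sleeping', xMin: 2, xMax: 4, yMin: 2, yMax: 4 },
];
const REQUIRED_CONSECUTIVE = 5; // presences in the same area before an ADL is stored
let lastActivity = null;
let consecutive = 0;

function classify(position, timestamp) {
  const area = areas.find(a =>
    position.x >= a.xMin && position.x <= a.xMax &&
    position.y >= a.yMin && position.y <= a.yMax);

  if (area && area.activity === lastActivity) {
    consecutive += 1;
  } else {
    lastActivity = area ? area.activity : null;
    consecutive = area ? 1 : 0;
  }
  // Walking past an area never reaches the required count, so it is not classified.
  if (area && consecutive === REQUIRED_CONSECUTIVE) {
    return { activity: area.activity, timestamp }; // stored in the local database
  }
  return null;
}

// Irregularity check: time spent on an ADL compared to a normal-behaviour threshold.
function isIrregular(activity, minutesSpent, thresholds = { sleeping: 600, resting: 300 }) {
  return minutesSpent > (thresholds[activity] ?? Infinity); // excessive activity level
}
```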

4.4. Integration

The cloud solution was implemented using the Meteor JavaScript framework, which comes with a set of technologies for building connected-client reactive applications, a build tool, and a curated set of packages from Node.js and JavaScript [33]. This enables rapid development with seamless connection between MongoDB, client, server, authentication, routing and mobile devices. The solution was deployed on the Heroku cloud platform as http://elderly-monitoring-hub.herokuapp.com/ [34]. The mLab MongoDB cloud service was used to deploy the database, with backup, monitoring and expert support [35]. Thus, a user can access the solution by providing a username and password, and the entire cloud solution is hosted outside of the residence. Regarding server-side routing, the local sensing system interacts with the cloud using a RESTful API consisting of <url>/api/update for updating abnormal behavior and <url>/api/ping for receiving ping messages that verify the connectivity of the sensing prototypes. The API takes JSON-encoded data as input parameters along with the current connectivity status. The designated healthcare personnel, friends and family of the resident need the incoming sensor data to be presented in a secure, yet intuitive and reliable manner while monitoring the elderly’s ADL. Thus, the stakeholders are authenticated by logging in with a registered e-mail and password. This provides a quick overview of the relevant information about the patient, the status of the local sensing system and ADL irregularities, if detected.
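As an illustration, the fog gateway could call these two endpoints as sketched below (assuming a Node.js runtime with the global fetch API); the request body fields are assumptions, while the endpoint paths and base URL are those described above.

```javascript
// Sketch of how the fog gateway might notify the cloud solution (Node.js 18+ global fetch).
// The base URL matches the deployed solution; the body fields are illustrative assumptions.
const CLOUD_URL = 'http://elderly-monitoring-hub.herokuapp.com';

async function reportIrregularity(activity, minutesSpent) {
  await fetch(`${CLOUD_URL}/api/update`, {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ activity, minutesSpent, detectedAt: new Date().toISOString() }),
  });
}

async function pingCloud(sensorNames) {
  await fetch(`${CLOUD_URL}/api/ping`, {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ sensors: sensorNames, status: 'online' }),
  });
}
```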
The cloud receives updates if any of the components stop sending data, which underpins the reliability of the presented data for decision making. Thus, whenever one of the sensing prototypes stops working, the status icon of the system changes to a yellow warning sign. Moreover, the sign indicating irregularities in ADL changes to a red cross, since the system can no longer reliably indicate whether irregularities are detected. Additionally, detailed information on a specific patient can be retrieved from the patient page through patient cards (see Figure 9).
The Meteor framework is essentially used for creating Web applications, but it also seamlessly implements Apache Cordova to create apps for mobile platforms. Cordova enables one to wrap the application written in HTML/JavaScript into a native container to access device functions of the mobile platforms. These functions are exposed via a unified JavaScript API, allowing one to write one set of code to target nearly every phone or tablet available today and publish to their app stores [36]. Thus, as illustrated in Figure 10, friends, family, and healthcare personnel can seamlessly and securely log in and monitor the elderly using a phone, tablet or the conventional web.

5. Results

Descriptive statistics, accuracy, specificity, recall, and precision are used as metrics to evaluate the performance of both the UWB and conventional sensing prototypes. Additionally, the detection frequency, the initial detection time, and the ability to detect mobility are presented next. The results reveal that the UWB and conventional sensing prototypes registered 785 and 703 detections, respectively, distributed over four different scenarios, where each scenario was conducted for 3 min by each of the nine participants. Out of these detections, 783 (UWB) and 702 (conventional) were classified correctly. A detailed description of the results is provided in Table 2.
The analysis of the system’s accuracy, specificity, recall (or sensitivity), precision and error rate for both the UWB and conventional sensing prototypes is described next (a small computation sketch follows the list):
  • Accuracy indicates how often the classification model was able to predict the correct ADL. The accuracy (A) for each scenario is calculated as Ai = (TPi + TNi)/N, where TP and TN are true positive and true negative values, for each scenario (i), and N is the number of detections. The overall accuracy of each category of sensing prototype is ΣAi.
  • Specificity, also known as true negative rate, indicates the ratio between when the activity was not conducted and when the activity was not predicted. The specificity (S) for each scenario (i) is determined as Si = TNi/(TNi + FPi), where TN and FP are true negative and false positive values, with total specificity ΣSi.
  • Recall or sensitivity, also known as the true positive rate, is the ratio between when the activity was conducted and when the activity was predicted. The recall (R) for each scenario (i) is calculated as Ri = TPi/(TPi + FNi), where TP and FN are true positive and false negative values, and the total recall is ΣRi.
  • The precision levels of the system indicate how often the correct daily activity was predicted. Precision (P) for each scenario is determined as Pi = TPi/(TPi + FPi), where TP and FP are true positive and false positive values, and the total precision is ΣPi.
  • The error rate indicates how often the classification model predicted the wrong daily activity. The error (E) for each scenario is calculated as Ei = (FPi + FNi)/N, where FP and FN are false positive and false negative values, for each scenario (i), and N is the number of detections. The overall error rate of each category of sensing prototype is ΣEi.
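For clarity, the per-scenario metrics defined above can be computed from the confusion counts as in the following sketch; the example counts are placeholders, not values from the experiment.

```javascript
// Sketch of the per-scenario metrics defined above, from a confusion-count object.
// The example counts are placeholders, not values from the experiment.
function scenarioMetrics({ tp, tn, fp, fn }) {
  const n = tp + tn + fp + fn;          // number of detections for the scenario
  return {
    accuracy:    (tp + tn) / n,
    specificity: tn / (tn + fp),
    recall:      tp / (tp + fn),
    precision:   tp / (tp + fp),
    errorRate:   (fp + fn) / n,
  };
}

console.log(scenarioMetrics({ tp: 25, tn: 70, fp: 1, fn: 1 })); // placeholder counts
```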
Accordingly, the overall system’s accuracy, specificity, recall, precision and error rate for both the UWB and conventional sensing prototypes are calculated as shown in Table 3. The table demonstrates high values for accuracy, sensitivity, specificity, and precision levels, and low values in misclassification levels. This implies that both the UWB and conventional prototypes were excellent at discriminating false data readings and classifying the correct activity.
Analysis of the detection frequency, which describes the system’s ability to classify frequently enough to exclude the possibility of missing a conducted activity, was found to be 100/108 and 94/108 potential minutes for the UWB and conventional systems, respectively (see Table 3). That is, both systems performed reasonably well, with the UWB being slightly better. Additionally, the average initial detection time for the conventional prototype was found to be slightly faster. Finally, the ability to detect the mobility of the resident was tested by having the participants first conduct the sleeping scenario and then perform the resting scenario. As shown in Table 4, the UWB system was able to detect all the participants’ mobility in such a way that sleeping was detected before the mobility change and resting after it. However, the conventional system performed slightly worse, as it failed to detect the sleeping scenario for two participants. Consequently, the results indicate excellent performance with regard to monitoring elderly ADL using both the UWB and the conventional system. In general, our results show that the non-wearable ultra-wideband technology can provide performance as good as that of conventional technologies with regard to the monitoring of elderly ADL. However, as this research focused on the limited quality characteristics mentioned above, it will be extended in our future work by concentrating on the evaluation of the usability of the gateway and cloud solutions for monitoring elderly ADL using non-wearable UWB.

6. Conclusions

This work investigated the development of a context-aware, non-wearable UWB sensing prototype capable of recognizing activities of daily living (ADL). The prototype was implemented using a non-contact UWB radar, and its performance was compared to conventional state-of-the-art sensing technologies, including ultrasonic and passive infrared. Accordingly, a controlled experiment was performed in a smart-home laboratory setting, which allowed us to measure the ability of the technologies to detect and classify the participants’ daily activities through the simulation of predefined scenarios: cooking, eating, resting, sleeping, and mobility. The classification performance was evaluated through statistical metrics and indicators, revealing valuable insights into the sensing technologies’ ability to monitor elderly ADL. The results showed excellent performance for both systems in accuracy, sensitivity, specificity, and precision. The low misclassification levels also reveal that both technologies were excellent at discriminating false data readings and classifying the activities correctly. Regarding detection frequency, both systems performed well, with the UWB system performing slightly better. Furthermore, although the average initial detection time was shorter for UWB, a closer look at the datasets revealed that the conventional implementation showed more outliers, which suggests it may be slightly faster once these are disregarded. Finally, the ability to detect a user’s mobility was tested by having the participants first perform the sleeping scenario and then the resting scenario. The results showed that the UWB system was able to detect the mobility changes of all participants in the correct order (sleeping was detected before resting), whereas the conventional implementation performed slightly worse. Overall, our study indicates excellent performance with regard to monitoring elderly ADL for both the non-wearable UWB radar sensing prototype and the conventional implementations, with the UWB being slightly better in some of the indicators.

Author Contributions

Conceptualization, S.K., G.A., S.F. and T.-M.G.; Formal analysis, S.K.; Investigation, S.K.; Methodology, G.A.; Software, S.K.; Supervision, S.F. and T.-M.G.; Visualization, G.A.; Writing—original draft, S.K. and G.A.; Writing—review & editing, G.A., S.F. and T.-M.G. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Acknowledgments

We thank all the participants who took part in our experiment with the ultra-wideband and conventional sensing prototypes across the various ADL scenarios.

Conflicts of Interest

The authors declare no conflict of interest.

Abbreviations

The following abbreviations are used in this manuscript:
ADLs: Activities of Daily Living
AAL: Ambient Assisted Living
EMD: Empirical Mode Decomposition
IMF: Intrinsic Mode Functions
IoT: Internet of Things
KF: Kalman Filter
MDPI: Multidisciplinary Digital Publishing Institute
PCA: Principal Component Analysis
QoS: Quality of Service
SNR: Signal-to-Noise Ratio
UWB: Ultra-Wideband

References

  1. Calvaresi, D.; Cesarini, D.; Sernani, P.; Marinoni, M.; Dragoni, A.F.; Sturm, A. Exploring the ambient assisted living domain: A systematic review. J. Ambient Intell. Humaniz. Comput. 2017, 8, 239–257. [Google Scholar] [CrossRef]
  2. Giannakouris, K. Ageing characterises the demographic perspectives of the European societies. Stat. Focus 2008, 72, 2008. [Google Scholar]
  3. Muszyńska, M.M.; Rau, R. The old-age healthy dependency ratio in Europe. J. Popul. Ageing 2012, 5, 151–162. [Google Scholar] [CrossRef] [Green Version]
  4. Kon, B.; Lam, A.; Chan, J. Evolution of smart homes for the elderly. In Proceedings of the 26th International Conference on World Wide Web Companion; International World Wide Web Conferences Steering Committee: Geneva, Switzerland, 2017; pp. 1095–1101. [Google Scholar]
  5. Diraco, G.; Leone, A.; Siciliano, P. A radar-based smart sensor for unobtrusive elderly monitoring in ambient assisted living applications. Biosensors 2017, 7, 55. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  6. Farber, N.; Shinkle, D.; Lynott, J.; Fox-Grage, W.; Harrell, R. Aging in Place: A State Survey of Livability Policies and Practices; National Conference of State Legislatures and AARP Public Policy Institute: Denver, CO, USA, 2011. [Google Scholar]
  7. Vassli, L.T.; Farshchian, B.A. Acceptance of health-related ICT among elderly people living in the community: A systematic review of qualitative evidence. Int. J. Hum. Comput. Interact. 2018, 34, 99–116. [Google Scholar] [CrossRef]
  8. Narins, B. The Gale Encyclopedia of Nursing and Allied Health; Gale, Cengage Learning: Boston, MA, USA, 2013. [Google Scholar]
  9. Liu, L.; Stroulia, E.; Nikolaidis, I.; Miguel-Cruz, A.; Rincon, A.R. Smart homes and home health monitoring technologies for older adults: A systematic review. Int. J. Med. Inform. 2016, 91, 44–59. [Google Scholar] [CrossRef]
  10. Debes, C.; Merentitis, A.; Sukhanov, S.; Niessen, M.; Frangiadakis, N.; Bauer, A. Monitoring activities of daily living in smart homes: Understanding human behavior. IEEE Signal Process. Mag. 2016, 33, 81–94. [Google Scholar] [CrossRef]
  11. Shit, R.C.; Sharma, S.; Puthal, D.; Zomaya, A.Y. Location of Things (LoT): A review and taxonomy of sensors localization in IoT infrastructure. IEEE Commun. Surv. Tutor. 2018, 20, 2028–2061. [Google Scholar] [CrossRef]
  12. Plikynas, D.; Žvironas, A.; Budrionis, A.; Gudauskis, M. Indoor Navigation Systems for Visually Impaired Persons: Mapping the Features of Existing Technologies to User Needs. Sensors 2020, 20, 636. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  13. Ramtohul, A.; Khedo, K.K. Mobile Positioning Techniques and Systems: A Comprehensive Review. Mob. Inf. Syst. 2020, 2020. [Google Scholar] [CrossRef]
  14. Baird, Z.; Gunasekara, I.; Bolic, M.; Rajan, S. Principal component analysis-based occupancy detection with ultra wideband radar. In Proceedings of the 2017 IEEE 60th International Midwest Symposium on Circuits and Systems (MWSCAS), Boston, MA, USA, 6–9 August 2017; IEEE: Piscataway, NJ, USA, 2017; pp. 1573–1576. [Google Scholar]
  15. Rana, S.P.; Dey, M.; Ghavami, M.; Dudley, S. Signature inspired home environments monitoring system using ir-uwb technology. Sensors 2019, 19, 385. [Google Scholar] [CrossRef] [Green Version]
  16. Alarifi, A.; Al-Salman, A.; Alsaleh, M.; Alnafessah, A.; Al-Hadhrami, S.; Al-Ammar, M.A.; Al-Khalifa, H.S. Ultra wideband indoor positioning technologies: Analysis and recent advances. Sensors 2016, 16, 707. [Google Scholar] [CrossRef]
  17. Bouchard, K.; Maitre, J.; Bertuglia, C.; Gaboury, S. Activity Recognition in Smart Homes using UWB Radars. Procedia Comput. Sci. 2020, 170, 10–17. [Google Scholar] [CrossRef]
  18. Suryadevara, N.K.; Mukhopadhyay, S.C. Wireless sensor network based home monitoring system for wellness determination of elderly. IEEE Sens. J. 2012, 12, 1965–1972. [Google Scholar] [CrossRef]
  19. Virone, G.; Alwan, M.; Dalal, S.; Kell, S.W.; Turner, B.; Stankovic, J.A.; Felder, R. Behavioral patterns of older adults in assisted living. IEEE Trans. Inf. Technol. Biomed. 2008, 12, 387–398. [Google Scholar] [CrossRef] [PubMed]
  20. Dawadi, P.N.; Cook, D.J.; Schmitter-Edgecombe, M. Automated cognitive health assessment from smart home-based behavior data. IEEE J. Biomed. Health Inform. 2015, 20, 1188–1194. [Google Scholar] [CrossRef] [Green Version]
  21. Khan, F.; Choi, J.W.; Cho, S.H. Vital sign monitoring of a non-stationary human through IR-UWB radar. In Proceedings of the 2014 4th IEEE International Conference on Network Infrastructure and Digital Content, Beijing, China, 19–21 September 2014; IEEE: Piscataway, NJ, USA, 2014; pp. 511–514. [Google Scholar]
  22. Nguyen, V.H.; Pyun, J.Y. Location detection and tracking of moving targets by a 2D IR-UWB radar system. Sensors 2015, 15, 6740–6762. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  23. Mokhtari, G.; Zhang, Q.; Hargrave, C.; Ralston, J.C. Non-wearable UWB sensor for human identification in smart home. IEEE Sens. J. 2017, 17, 3332–3340. [Google Scholar] [CrossRef]
  24. Gubbi, J.; Buyya, R.; Marusic, S.; Palaniswami, M. Internet of Things (IoT): A vision, architectural elements, and future directions. Future Gener. Comput. Syst. 2013, 29, 1645–1660. [Google Scholar] [CrossRef] [Green Version]
  25. Stojkoska, B.L.R.; Trivodaliev, K.V. A review of Internet of Things for smart home: Challenges and solutions. J. Clean. Prod. 2017, 140, 1454–1464. [Google Scholar] [CrossRef]
  26. Aazam, M.; Huh, E.N. Fog computing and smart gateway based communication for cloud of things. In Proceedings of the 2014 International Conference on Future Internet of Things and Cloud, Barcelona, Spain, 27–29 August 2014; IEEE: Piscataway, NJ, USA, 2014; pp. 464–470. [Google Scholar]
  27. Xethru. X4M300: Presence Sensor. Available online: https://www.xethru.com/x4m300-presence-sensor.html (accessed on 23 April 2019).
  28. Xethru. Xethru-Arduino. Available online: https://github.com/xethru/XeThru-Arduino-Examples (accessed on 23 April 2019).
  29. Particle.io. Hardware to Bring Your Device Online. Available online: https://www.particle.io/what-is-particle (accessed on 23 April 2019).
  30. Rahmani, A.M.; Gia, T.N.; Negash, B.; Anzanpour, A.; Azimi, I.; Jiang, M.; Liljeberg, P. Exploiting smart e-Health gateways at the edge of healthcare Internet-of-Things: A fog computing approach. Future Gener. Comput. Syst. 2018, 78, 641–658. [Google Scholar] [CrossRef]
  31. Atzori, L.; Iera, A.; Morabito, G. The internet of things: A survey. Comput. Netw. 2010, 54, 2787–2805. [Google Scholar] [CrossRef]
  32. Socket.io. Socket.io Is Here. Available online: https://socket.io/ (accessed on 23 April 2019).
  33. Meteor. What Is Meteor? Available online: https://guide.meteor.com/ (accessed on 23 April 2019).
  34. Heroku. What Is Heroku? Available online: https://www.heroku.com/what (accessed on 23 April 2019).
  35. mLab. Database-as-a-Service for MongoDB. Available online: https://mlab.com/company/ (accessed on 23 April 2019).
  36. Cordova. Supported Platforms. Available online: https://cordova.apache.org/ (accessed on 23 April 2019).
Figure 1. Layout and sensor setup overview.
Figure 2. Experiment control dashboard.
Figure 3. Graphical illustration of the scenarios: cooking (a), eating (b), resting (c) and sleeping (d).
Figure 4. Technical sketch of the overall architecture.
Figure 5. Sensing prototypes: UWB mounted on the wall (a), PIR motion detector mounted in the roof (b) and ultrasonic (c).
Figure 6. Flowchart for the ultra-wideband (UWB)/Particle prototype.
Figure 7. Particle microcontroller connection diagram.
Figure 8. Real-time analytics dashboard.
Figure 9. Screenshot of the resident-status page displayed in the web view.
Figure 10. Illustrations of the application interface on mobile and tablet.
Table 1. Comparison between categories of conventional proximity sensors.

Criteria   | Ultrasound | IR Sensor  | Laser Sensor | BLE        | NFC      | Passive RFID
Range      | 16 m       | 6 m        | 2200 m       | 100 m      | 1 m      | 12 m
Resolution | High       | Variable   | Very High    | Low        | High     | High
Linearity  | Linear     | Non-linear | Linear       | Non-linear | Linear   | Non-linear
Size       | Small      | Small      | Moderate     | Small      | Small    | Small
Mobility   | Portable   | Portable   | Non-portable | Portable   | Portable | Portable
Weight     | Light      | Light      | Heavy        | Light      | Light    | Light
Table 2. UWB and conventional classification for each scenario.

Participant | UWB Cook | UWB Eat | UWB Rest | UWB Sleep | Conv. Cook | Conv. Eat | Conv. Rest | Conv. Sleep
1       | 25   | 29   | 24   | 22   | 27   | 13   | 32   | 33
2       | 27   | 31   | 12   | 11   | 28   | 19   | 11   | 23
3       | 22   | 32   | 27   | 12   | 30   | 7    | 33   | 1
4       | 19   | 31   | 26   | 10   | 31   | 11   | 22   | 0
5       | 5    | 32   | 27   | 8    | 34   | 1    | 34   | 1
6       | 28   | 32   | 24   | 8    | 15   | 3    | 25   | 4
7       | 24   | 32   | 27   | 1    | 25   | 8    | 35   | 35
8       | 28   | 32   | 29   | 7    | 28   | 5    | 29   | 4
9       | 23   | 33   | 20   | 3    | 25   | 14   | 23   | 33
Min     | 5    | 29   | 12   | 1    | 15   | 1    | 11   | 0
Max     | 28   | 33   | 29   | 22   | 34   | 19   | 35   | 35
Median  | 24   | 32   | 26   | 8    | 28   | 8    | 29   | 4
Average | 22   | 32   | 24   | 9    | 27   | 9    | 27   | 15
SD      | 7.14 | 1.13 | 5.20 | 6.01 | 5.34 | 5.77 | 7.74 | 15.70
Total   | 201  | 284  | 216  | 82   | 243  | 81   | 244  | 134
Table 3. Performance of the UWB and conventional technologies.

Metric                 | Ultra-Wideband | Conventional
Accuracy               | 97.7%          | 99.8%
Specificity            | 99.9%          | 99.9%
Recall                 | 97.6%          | 99.8%
Precision              | 99.7%          | 99.8%
Error Rate             | >1%            | >1%
Detection Frequency    | 100/108        | 94/108
Initial Detection Time | 15.95 s        | 21.81 s
Mobility               | 18/18          | 16/18
Table 4. Initial detection time for the UWB and conventional prototypes in seconds.

Participant | UWB Cook | UWB Eat | UWB Rest | UWB Sleep | Conv. Cook | Conv. Eat | Conv. Rest | Conv. Sleep
1       | 10.53 | 11.71 | 23.87 | 29.67 | 12.14 | 31.84 | 12.96 | 11.08
2       | 15.35 | 10.34 | 40.60 | 39.96 | 6.29  | 11.75 | 10.88 | 14.95
3       | 6.40  | 8.40  | 24.64 | 39.98 | 6.87  | 26.81 | 9.95  | 139.79
4       | 6.49  | 7.85  | 23.38 | 80.49 | 9.14  | 9.14  | 14.76 | null
5       | 7.43  | 7.27  | 21.18 | 46.55 | 6.71  | 97.03 | 5.40  | 59.64
6       | 25.03 | 5.86  | 42.94 | 27.95 | 13.03 | 27.70 | 15.60 | 55.31
7       | 4.83  | 4.35  | 30.94 | 34.19 | 7.85  | 27.96 | 4.99  | 4.89
8       | 1.79  | 5.69  | 17.04 | 66.61 | 5.88  | 8.95  | 4.48  | 7.72
9       | 24.43 | 3.66  | 22.50 | 34.93 | 6.02  | 42.82 | 10.93 | 6.50
Min     | 1.79  | 3.66  | 17.04 | 27.95 | 5.88  | 8.95  | 4.48  | 4.89
Max     | 25.03 | 11.71 | 42.94 | 80.49 | 13.03 | 97.03 | 15.60 | 139.79
Median  | 7.43  | 7.27  | 23.87 | 39.96 | 6.87  | 27.70 | 10.88 | 13.02
Average | 11.36 | 7.24  | 27.45 | 44.48 | 8.21  | 31.56 | 9.99  | 37.49
SD      | 8.46  | 2.66  | 8.91  | 17.75 | 2.69  | 27.09 | 4.20  | 46.89
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
