Review

Review of Microsoft HoloLens Applications over the Past Five Years

Department of Energy Resources Engineering, Pukyong National University, Busan 48513, Korea
* Author to whom correspondence should be addressed.
Submission received: 14 June 2021 / Revised: 28 July 2021 / Accepted: 5 August 2021 / Published: 6 August 2021
(This article belongs to the Collection Virtual and Augmented Reality Systems)

Abstract

Since Microsoft HoloLens first appeared in 2016, it has been used in various industries over the past five years. This study aims to review academic papers on the applications of HoloLens in several industries. A review was performed to summarize the results of 44 papers (dated between January 2016 and December 2020) and to outline the research trends of applying HoloLens to different industries. This study determined that HoloLens is employed in medical and surgical aids and systems, medical education and simulation, industrial engineering, architecture, civil engineering and other engineering fields. The findings of this study contribute towards classifying the current uses of HoloLens in various industries and identifying the types of visualization techniques and functions provided.

1. Introduction

Virtual reality (VR), augmented reality (AR) and mixed reality (MR) have been emerging technologies for several years, but they are now able to generate realistic images, sounds and other sensations that allow users to experience a spectacular imaginary world [1]. It can be difficult to understand the differences between VR, AR and MR at a glance; therefore, definitions of these terms are required. VR is the most prevalent of these technologies. It allows people to feel completely immersed in an alternate reality. Using a head-mounted display (HMD) or headset, a user can manipulate objects while experiencing computer-generated visual effects and sounds. In contrast, AR overlays digital information on real elements. Based in the real world, it creates a new layer of perception that complements reality or the environment. MR combines real and digital elements. Using next-generation sensors and imaging technologies, users can interact with and manipulate both physical and virtual items and environments. MR provides an environment in which users can see their surroundings while interacting with the virtual environment without taking off their headsets.
VR, AR and MR technologies are currently gaining importance in several fields (including healthcare, architecture and civil engineering, manufacturing, defense, tourism, automation and education) and changing the way we work in these fields. In the field of AR, it is critical to visualize both real and virtual content simultaneously. Significant progress has been made in this area and many AR devices, such as Google Glass, Vuzix Blade and Epson Moverio, have been developed [2]. HoloLens—a state-of-the-art AR device developed by Microsoft in 2016—is an MR-based HMD, i.e., a wearable device that allows users to interact with the environment using holograms while stimulating the senses as a whole [3]. HoloLens is a pair of perspective holographic glasses that includes a central processing unit, graphics processing unit and holographic processing unit, which can handle real-time spatial mapping and processing [4]. Further, its functions are more advanced than those of conventional AR equipment, such as stereoscopic three-dimensional (3D) display, gaze design, gesture design, spatial sound design and spatial mapping [5,6].
Based on these functions, various studies have been conducted to determine methods for using HoloLens more efficiently. For instance, a study on effectively controlling the HoloLens, using the gaze, gesture and voice control functions provided by the device, and studies on methods for utilizing this device in various industrial fields have been conducted [7,8,9,10,11,12,13,14,15,16,17]. Research on methods of visualizing the scanned image or data of an actual object as 3D contents [18,19,20] and correcting them [21,22] have also been conducted. Moreover, some studies have evaluated the performance and utility of the HoloLens, while others have developed additional toolkits or hardware to improve the performance of the HoloLens [2,23,24,25,26]. Rokhsaritalemi et al. [27] conducted a comprehensive study on the development stage and analysis model, simulation toolkit, system type and architecture type for MR and presented its practical issues for stakeholders considering other domains of MR. Nevertheless, there is a lack of analysis and review on how HoloLens is being used in various industries.
Therefore, the purpose of this study was to review academic papers on the applications of HoloLens in medical and surgical aids and systems, medical education and simulation, industrial engineering, architecture, civil engineering and other engineering fields. For this purpose, 44 academic research articles published over the past five years (2016–2020) were analyzed. The current status and trends in research based on HoloLens were investigated by year and field and the type of visualization technology and functions provided to users were discussed.

2. Methods

Three major steps were designed to review academic papers on the use of HoloLens in major industries. The first step was to design a search method for collecting papers suitable for investigating the current status and trends in HoloLens-based studies. In the second step, the abstracts of the selected papers were analyzed to determine whether the research provided information suitable for the purpose of this study. Finally, relevant information was extracted, classified and structured and the contents of the papers were summarized.

2.1. Literature Search Method

To begin this review, search keywords that covered all HoloLens-related studies were derived. In this study, “HoloLens”, “virtual reality”, “augmented reality”, “mixed reality” and “visualization” were selected as keywords. Relevant research papers published in academic journals, as well as M.Sc. and Ph.D. theses, were retrieved from Google Scholar, SCOPUS and Web of Science. Only articles published in the last five years (i.e., from January 2016 to December 2020) were considered, to ensure the latest information. A total of 83 candidate papers were located using this search method.

2.2. Selection Criteria

Next, selection criteria were determined to narrow the papers for further review. Papers that were duplicates, unpublished, or linguistically inaccessible were excluded (six studies). Filtering was then performed using additional selection criteria in the following order: (1) the first filter excluded papers in which the field of application of HoloLens, the visualization technology, or the functions provided to users were unclear; (2) the second filter excluded literature reviews; (3) the third filter excluded papers describing the development of HoloLens-based methodology or similar devices, rather than the use of HoloLens in a field. As a result, 44 papers were selected from the results of the four filters for further processing (Figure 1).
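The staged exclusion process described above amounts to applying a sequence of filters to the candidate set. A minimal sketch follows; the records and flag names are invented placeholders for illustration, not the actual screening data.

```python
# Illustrative paper-screening pipeline; all field names are hypothetical.
papers = [
    {"id": 1, "duplicate": False, "published": True, "clear_scope": True,
     "is_review": False, "device_dev_only": False},
    {"id": 2, "duplicate": True,  "published": True, "clear_scope": True,
     "is_review": False, "device_dev_only": False},
    {"id": 3, "duplicate": False, "published": True, "clear_scope": False,
     "is_review": False, "device_dev_only": False},
    {"id": 4, "duplicate": False, "published": True, "clear_scope": True,
     "is_review": True,  "device_dev_only": False},
    {"id": 5, "duplicate": False, "published": True, "clear_scope": True,
     "is_review": False, "device_dev_only": True},
]

filters = [
    lambda p: not p["duplicate"] and p["published"],  # initial exclusion
    lambda p: p["clear_scope"],         # (1) application/technology unclear
    lambda p: not p["is_review"],       # (2) literature reviews
    lambda p: not p["device_dev_only"], # (3) device development, not field use
]

selected = papers
for f in filters:  # each filter narrows the surviving set
    selected = [p for p in selected if f(p)]

print([p["id"] for p in selected])  # → [1]
```

Applying the filters in sequence, rather than as one combined predicate, makes it easy to report how many papers each stage removed.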

2.3. Data Analysis and Outline of Results

The 44 articles obtained during literature selection were analyzed to classify their contents and facilitate data interpretation. Analyses were performed for each publication considering the indicators presented in Table 1. In the following sections, after classifying the papers by the fields in which HoloLens was used, the contents of the papers are summarized and the analysis results are presented.
Research trends were defined and analyzed by publication year, number of citations and the HoloLens version of individual publications. In addition, the visualization technology (AR or MR) provided by HoloLens was investigated. Finally, the functions provided to users were analyzed. The functions were divided into visualization, interaction and immersion. Visualization was defined as registering virtual objects in space and time; interaction was defined as the creation of an environment that could be manipulated without a controller, using natural communication modes such as gestures, voice and gaze; immersion referred to a virtual environment in which users could immerse themselves and interact with virtual objects or data in real time to process and analyze them.
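The tabulation by visualization technology and function described above can be sketched with Python's collections.Counter; the records below are invented placeholders, not the actual coded data from this review.

```python
from collections import Counter

# Hypothetical classification records, one per reviewed paper.
records = [
    {"field": "medical aids", "tech": "AR", "function": "interaction"},
    {"field": "medical aids", "tech": "MR", "function": "immersion"},
    {"field": "education",    "tech": "AR", "function": "visualization"},
    {"field": "industrial",   "tech": "AR", "function": "interaction"},
]

tech_counts = Counter(r["tech"] for r in records)       # AR vs. MR
func_counts = Counter(r["function"] for r in records)   # visualization/interaction/immersion

print(tech_counts["AR"], func_counts["interaction"])  # → 3 2
```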

3. Results

3.1. Applications in Medical and Healthcare Fields

In the medical and healthcare field, VR simulations that can closely reproduce the organs of actual humans can help doctors and caregivers save patients’ lives and ensure that medical equipment is used effectively; these simulations are also required for the education of medical students and trainees. HoloLens is used for surgical operations, psychiatry and rehabilitation treatment. VR technology is employed in medical care to increase the productivity and benefit of medical services [28]. In this study, the use of HoloLens in the medical and healthcare fields was analyzed by dividing the literature into medical and surgical aids and systems and medical education and simulation.

3.1.1. Medical and Surgical Aids and Systems

In medical and surgical aids and systems, AR-based HoloLens is used for purposes such as visualization of medical data, blood vessel search, targeting support for needles or drills and endoscope support. It is mainly used as an auxiliary means through which information related to the surgery is provided or the accuracy of the surgery is improved.
Hanna et al. [29] evaluated the utility of Microsoft HoloLens in clinical and nonclinical applications of pathology. They tested virtual annotations during autopsy, 3D gross and microscopic pathology specimens, navigating whole-slide images, telepathology and real-time pathology–radiology correlation. The pathology residents who performed the autopsy while wearing the HoloLens were instructed remotely with real-time diagrams, annotations and voice instructions; the 3D-scanned gross pathology specimen was implemented as a hologram, such that it could be manipulated easily. Additionally, the user could remotely contact the pathologist to receive guidance and register annotations in real time in the region of interest for the specimen. It was confirmed that the HoloLens is effective for autopsy and gross and microscopic examination; it can also support high-resolution imaging and has sufficient computing power to implement digital pathology.
Chien et al. [30] proposed an MR system for medical use that can superimpose medical data onto the physical surface of a patient by combining an Intel RealSense sensor and the Microsoft HoloLens HMD system. They used the denoised-resampled-weighted-and-perturbed-iterative closest points algorithm, which was developed in their study, to remove noise and outliers from the virtual medical data, before displaying the preoperative medical image data as a 3D image. The patient’s 3D medical images were displayed in the same coordinate space as the real patient and it was confirmed that the proposed system could perform accurate and robust mapping, despite the presence of noise and outliers in the MR system.
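As background, the core loop of a classic iterative closest point (ICP) alignment, the family of algorithms that the denoised-resampled-weighted-and-perturbed variant above extends, can be sketched as follows. This is a minimal textbook version with synthetic data, not the algorithm from the paper.

```python
import itertools
import numpy as np

def best_rigid_transform(src, dst):
    """Kabsch: least-squares rotation R and translation t such that dst ≈ src @ R.T + t."""
    cs, cd = src.mean(axis=0), dst.mean(axis=0)
    H = (src - cs).T @ (dst - cd)            # cross-covariance of centered clouds
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))   # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    return R, cd - R @ cs

def icp(src, dst, iters=20):
    """Align cloud src to dst by alternating closest-point matching and Kabsch fitting."""
    cur = src.copy()
    for _ in range(iters):
        # brute-force nearest neighbours (adequate for small clouds)
        idx = np.argmin(((cur[:, None, :] - dst[None, :, :]) ** 2).sum(-1), axis=1)
        R, t = best_rigid_transform(cur, dst[idx])
        cur = cur @ R.T + t
    return cur

# Demo: recover a small rigid motion applied to the corners of a unit cube.
src = np.array(list(itertools.product([0.0, 1.0], repeat=3)))
a = np.radians(3.0)
Rz = np.array([[np.cos(a), -np.sin(a), 0.0],
               [np.sin(a),  np.cos(a), 0.0],
               [0.0,        0.0,       1.0]])
dst = src @ Rz.T + np.array([0.02, -0.01, 0.03])
residual = np.abs(icp(src, dst) - dst).max()
```

Real medical data additionally require outlier rejection and denoising, which is precisely what motivated the variant proposed by Chien et al.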
Al Janabi et al. [3] evaluated whether the HoloLens HMD could potentially replace the conventional monitor that is used in endoscopic surgery. The evaluation was conducted with 72 participants, i.e., novices (n = 28), intermediates (n = 24) and experts (n = 20). The participants performed ureteroscopy in an inflatable operating environment using a validated training model and the HoloLens MR device as a monitor (Figure 2). Consequently, even the novice group was able to complete the assigned task successfully, whereas the experienced group found it difficult to complete the assigned task because the new device was unfamiliar to them. Nevertheless, 97% of the participants agreed that HoloLens would play a useful role in surgical education and 95% of the participants strongly agreed that it could be introduced clinically.
Kumar et al. [31] performed a qualitative user evaluation of an MR application. Furthermore, they described the detailed development and clinical-use workflows from acquisition to visualization in MR data. They also proposed an MR pilot for real-world use in surgical planning through clinical-use cases in laparoscopic liver resection and congenital heart surgery. The results from the clinical-use cases showed that the application of MR provides a better understanding of patient-specific anatomy and that it can improve surgical planning.
Wu et al. [32] proposed an AR approach that uses an improved alignment method for image-guided surgery, to provide physicians with the ability to observe the location of the lesion during the course of surgery. After building and establishing the patient’s head surface information using an RGB-depth sensor in conjunction with a point cloud library, the preoperative medical imaging information, which was obtained using the proposed improved alignment algorithm, was visualized using AR. Then, this information was placed in the same world coordinate system as the patient’s head surface information. The sorted information was merged and displayed using the HoloLens and the surgeon could view the patient’s medical images and head simultaneously. In this study, the accuracy of the alignment algorithm was measured using spatial reference points with known positions; the improved algorithm was determined to be extremely accurate, exhibiting errors within 3 mm.
Nguyen et al. [33] quantified the accuracy of virtual object placement methods for an AR device in neurosurgery. The accuracy was quantified for three different methods of virtual object placement (i.e., tap to place, 3-point correspondence matching and keyboard control). Consequently, it was found that using a keyboard that moves in 5 mm increments was the most accurate; moreover, the tap to place method was the most efficient, which was followed by the keyboard and 3-point methods.
Koyachi et al. [34] verified the reproducibility and accuracy of preoperative planning in maxilla repositioning surgery that was performed using computer-aided design/manufacturing technologies and MR surgical navigation. To achieve this, they used new registration markers and the HoloLens headset. The reproducibility and accuracy were evaluated by comparing the preoperative virtual operation 3D image with the one-month postoperative computed tomography (CT) image. Subsequently, it was confirmed that no statistically significant difference existed between any of the points on any of the axes and that the method could reproduce the maxillary position with high accuracy.
Mojica et al. [35] proposed a prototype of a holographic interface (HI) that can visualize 3D MRI data for planning neurosurgical procedures. The proposed HI immersed the user in an MR scene, which included MRI data and virtual renderings, and was implemented to interact with objects in MR. Additionally, the user can directly control the connected MRI scanner in real time. They performed a preliminary qualitative evaluation of the proposed HI and found that holographic visualization of high-resolution 3D MRI data can provide an intuitive and interactive perspective of complex brain vasculature and anatomical structures.
Kuhlemann et al. [36] developed a real-time navigation framework, which provides a 3D holographic view of the vascular system, to reduce the X-ray exposure that may occur during endovascular interventions using X-ray imaging and convert the two-dimensional (2D) X-ray images into 3D images. Because the presented framework uses calibrations based on extrinsic landmarks, the virtual objects are precisely aligned with the real world. They extracted the patient’s surface and vascular tree from preoperative CT data and registered this information to the patient using a magnetic tracking system. The developed system was evaluated by experienced vascular surgeons, radiologists and thoracic surgeons; it was confirmed that their approach reduced X-ray exposure in endovascular procedures, while allowing novices to acquire the required skills faster.
Pratt et al. [37] used AR to obtain information contained in preoperative computed tomography angiography (CTA) images. They depicted the separated volumes of osseous, vascular, skin, soft tissue structures and relevant vascular perforators on the CTA scans, to generate 3D images that were then converted to polygonal models and rendered using a custom application within the HoloLens stereo HMD. Figure 3 presents the original CTA image, segmentation and corresponding polygonal models. Moreover, during the surgery, a combination of tracked hand gestures and voice commands was used to allow the surgeon to register the model directly to their respective subjects. By comparing the subsurface location of vascular perforators through AR overlay and the positions obtained by audible Doppler ultrasound, it was determined that the HoloLens allows the surgeon to locate the perforating vessels precisely and efficiently.
Jiang et al. [38] evaluated the precision of a HoloLens-based vascular localization system. Thus, a 3D printing model for the vascular map was developed and the precision was tested in a simulated operating room under different conditions. They detected five pairs of point coordinates with a probe on the vascular map, which could be identified in the 3D printing and virtual models, and calculated the distance between the points as a navigation error. Consequently, the mean error was reported to be within the clinically acceptable range. This precision evaluation showed that the HoloLens system precisely localizes the perforator and can potentially assist the surgeon in performing the operation.
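The navigation-error metric used by Jiang et al., the Euclidean distance between corresponding probed points, can be computed as below; the coordinate values are made up for illustration only.

```python
import numpy as np

# Hypothetical corresponding landmarks (mm): probed on the 3D-printed
# vascular model versus the same points in the virtual model.
printed = np.array([[10.0,  5.0, 2.0], [20.0,  8.0, 1.0], [30.0, 12.0, 3.0],
                    [40.0, 15.0, 2.5], [50.0, 18.0, 1.5]])
virtual = printed + np.array([[ 0.8, -0.5,  0.3], [ 0.6,  0.4, -0.2],
                              [-0.7,  0.5,  0.4], [ 0.5, -0.6,  0.2],
                              [ 0.4,  0.3, -0.5]])

errors = np.linalg.norm(printed - virtual, axis=1)  # per-pair Euclidean distance
print(round(float(errors.mean()), 2), "mm mean navigation error")  # → 0.84 mm ...
```

Reporting both the mean and the maximum of `errors` is a common way to compare such results against a clinical acceptability threshold.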
Liebmann et al. [39] evaluated the possibility of using an optical see-through HMD to navigate lumbar pedicle screw placement. A novel navigation method, tailored to run on the HoloLens, involves capturing the intraoperatively reachable surface of the vertebrae to achieve registration and tool tracking with real-time visualizations, without the need for intraoperative imaging. Fiducial markers, which can be 3D printed, are used for both surface sampling and navigation.
Müller et al. [40] evaluated the surgical accuracy of holographic pedicle screw navigation using an HMD via 3D intraoperative fluoroscopy. The accuracy of surgical navigation using an HMD was evaluated by comparing its results with those obtained from navigation using a state-of-the-art pose-tracking system. Holographic navigation using the device proposed in their study showed an accuracy comparable to that of a high-end pose-tracking system.
Park et al. [41] performed a prospective trial on CT-guided lesions targeting an abdominal phantom with and without AR guidance using HoloLens. Eight operators performed needle passes and the total needle redirections, radiation dose, procedure time and puncture rates of non-targeted lesions were compared with and without AR. Consequently, the operators were able to trace their ideal trajectory easily using real needles via virtual guides. Furthermore, it was confirmed that the needle passes, radiation dose and puncture rate of a non-targeted lesion were significantly reduced when using AR. Figure 4 shows an example of using AR-assisted navigation with HoloLens.
Gu et al. [42] designed a marker-less image-based registration pipeline using HoloLens and built-in sensors—to guide glenoid drilling during the total shoulder arthroplasty (TSA) procedure—and performed a feasibility analysis. They evaluated the performance of inside-out image-based registration on the HoloLens using a short-throw time-of-flight (ToF) camera and conducted a pilot study on its application to TSA. Although the evaluation results were promising, the current end-to-end error of the system does not appear to meet the surgical accuracy requirements.
Qian et al. [43] proposed an automatic flexible endoscopy control method that tracks the surgeon’s head with respect to the object in the surgical scene. The robotic flexible endoscope captures the surgical scene from the same perspective as the surgeon and the surgeon wears an HMD to observe the endoscopic video. The frustum of the flexible endoscope is rendered as an AR overlay, to provide surgical guidance. They developed a prototype, FlexiVision, which integrates a robotic flexible endoscope based on the da Vinci Research Kit with the Microsoft HoloLens. Additionally, the AR surgical guidance was evaluated in a lesion targeting task; it was confirmed that the completion time, number of errors and subjective task load level could be significantly reduced.
Lee et al. [44] proposed a balance rehabilitation method based on AR. This method measures the movement of the head through an inertial measurement unit sensor mounted on an AR HMD and quantitatively assesses the individual’s postural stability. Moreover, it provides visual feedback to the user through holographic objects that interact with the head position in real time (Figure 5). The proposed method was validated using eight participants, who performed three postural tasks three times, with and without AR. The center of pressure (COP) displacement was measured using the Wii Balance Board and the displacement of the head was measured through the HoloLens. Consequently, significant correlations were observed between the COP and head displacement. Additionally, there were significant differences between the conditions with and without AR feedback. However, the statistical power of the experiments was limited because the sample size was small.
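A correlation analysis of this kind, comparing COP and head-displacement traces, can be sketched with NumPy; the sway signals below are synthetic stand-ins, not data from the study.

```python
import numpy as np

# Simulated postural-sway signals; in the study these came from the
# Wii Balance Board (COP) and the HoloLens head tracker.
t = np.linspace(0.0, 10.0, 500)
cop = np.sin(2 * np.pi * 0.3 * t)                     # COP displacement (a.u.)
rng = np.random.default_rng(1)
head = 0.9 * cop + 0.1 * rng.standard_normal(t.size)  # head follows COP, plus noise

r = np.corrcoef(cop, head)[0, 1]  # Pearson correlation coefficient
print(r > 0.9)  # → True
```

With only eight participants, such a coefficient would normally be accompanied by a significance test, which is where the small sample size limits the conclusions.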
Condino et al. [45] proposed the first wearable AR application for shoulder rehabilitation based on Microsoft HoloLens, which can track a user’s hand in real time. A serious game that maximizes the user’s comfort was designed—starting with the analysis of traditional rehabilitation exercises, while accounting for HoloLens specifications—and an immersive AR application was implemented. The proposed application was evaluated by rehabilitation specialists and healthy subjects. Consequently, ergonomics and motivational values were positively evaluated.
Geerse et al. [46] comprehensively evaluated the HoloLens as a tool to quantify gait parameters from position data, by determining the reliability and concurrent validity of repeated tests. The HoloLens measurement system was able to quantify walking speed, step length and cadence reliably and effectively for a broad range of speeds used by healthy young adults, as well as for self-selected comfortable speeds used by people with Parkinson’s disease.
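Deriving gait parameters from position data, as Geerse et al. did, reduces to simple arithmetic once step events have been extracted from the tracker stream. The sketch below assumes event detection has already been done upstream and uses synthetic, perfectly regular step data.

```python
import numpy as np

def gait_parameters(event_times, forward_pos):
    """Walking speed (m/s), mean step length (m) and cadence (steps/min)
    from successive step-event times (s) and forward positions (m)."""
    step_lengths = np.diff(forward_pos)
    step_times = np.diff(event_times)
    speed = (forward_pos[-1] - forward_pos[0]) / (event_times[-1] - event_times[0])
    cadence = 60.0 / step_times.mean()
    return speed, step_lengths.mean(), cadence

# Synthetic steady gait: one step every 0.5 s, 0.7 m per step.
times = np.arange(0.0, 5.01, 0.5)
pos = 0.7 * np.arange(times.size)
speed, step_len, cadence = gait_parameters(times, pos)
print(round(speed, 3), round(step_len, 3), round(cadence, 1))  # → 1.4 0.7 120.0
```

Real HoloLens position data would first need filtering and step-event detection before these formulas apply.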
Table 2 summarizes the literature reviewed on the use of Microsoft HoloLens in medical and surgical aids and systems. Except for one of the 19 papers, all studies were conducted using HoloLens 1. Thirteen used AR technology, while the rest used MR technology. Moreover, three studies provided users with a simple visualization of virtual objects and 13 studies allowed interactions between users and virtual objects. The remaining studies provided the users with an opportunity to immerse themselves in VR and interact with objects.

3.1.2. Medical Education and Simulation

AR technology is beneficial for educating students majoring in medical science and is widely used in medical training and simulation. Furthermore, AR-based telemedicine systems, which provide medical services over long distances, are helping to address the lack of access to medical care in rural areas. Although AR is not necessarily a new technology, it is gaining momentum in medical education and simulation through Microsoft HoloLens [47].
AR technology has the potential to enhance students’ understanding of physiology and anatomy, which requires 3D knowledge of human organ systems and structures. Therefore, Moro et al. [48] evaluated and compared the learning effects when delivering lectures through AR and mobile handheld tablet devices. Thirty-eight pre-clinical undergraduate participants were taught in detail about brain physiology and anatomy and pre- and post-intervention tests were performed to evaluate their knowledge. After the activity, participants completed a questionnaire to assess adverse health effects and their perception of the device. However, no significant difference was observed in the test scores for lesson delivery using HoloLens and mobile-based AR. Further, increased dizziness was reported when using HoloLens, but no other adverse health effects, such as nausea, disorientation, or fatigue, were observed. Both modes have been shown to be effective in learning; in particular, AR has been shown to support educators and learners in the fields of health and medical science.
Bulliard et al. [49] performed a virtual autopsy using AR technology (Figure 6) and investigated the possibility of using an AR HMD to enhance autopsy in real-world conditions. They used Microsoft HoloLens as the AR device and five participants performed a virtual autopsy. The participants were able to identify major pathologies in all cases on AR Virtopsy, which took an average of 122 s per case. Further, in AR autopsy, most of the interactions occurred during the first 30 min of the internal examination; it was confirmed that the frequency and duration of interactions decreased as time passed. AR headsets can help bring image data, such as post-mortem CT, into the autopsy hall and have been shown to be most beneficial during the first 30 min of an internal examination during autopsy.
Condino et al. [50] developed a hybrid training system for orthopedic open surgery using HoloLens AR technology and evaluated the potential of MR in the training system. They chose hip arthroplasty (an orthopedic surgery) as the benchmark for evaluating the proposed system. Anatomical 3D models for each patient were extracted from the CT to create the virtual contents and the physical components of the simulator were produced. A quantitative test—to estimate the accuracy of the system, by evaluating the perceived position of an AR target—and qualitative test—to evaluate the workload and usefulness of the HoloLens—were performed. The results showed that the mean and maximum errors matched the requirements of the target application; the overall workload was low; and the self-assessed performance was satisfactory. Moreover, positive feedback was obtained for visual and auditory perceptions, gestures and voice interactions.
Caligiana et al. [51] developed an MR application that is useful for modifying and cutting virtual objects and proposed digital simulations of surgical operations. Figure 7 shows a perfect match between the real and virtual objects achieved using the AR platform Vuforia. With this tool, it is possible to automatically adjust the virtual model to the correct dimensions and to use the actual 3D-printed bones themselves as markers. This approach can achieve high precision in surgical applications. Additionally, because the proposed approach is based on HoloLens, the Leap Motion device and Unity, it is hands-free and does not require the use of a mouse or computer keyboard. Case studies have confirmed that surgeons can simulate real-world surgery by tracking and rapidly cutting the virtual objects. The cutting time is reduced in this simulation compared to conventional methods; furthermore, the high flexibility of the tool and excellent fidelity of the geometry were confirmed.
Wang et al. [52] developed an AR-based telepresence application that can provide telemedicine-mentoring services. They used HoloLens to capture a first-person perspective of a simulated rural emergency room via MR capture and to provide a telemedicine platform with remote pointing capabilities. The developed application provides a controlled hand model with an attached ultrasound transducer, such that it is easy to set up and run the system. They evaluated the suitability of the developed system for practical use through a user study. The results showed that there was no significant difference from the existing telemedicine settings and the viability of the system was validated.
In this study, the literature review on using HoloLens in medical education and simulation is summarized in Table 3. All five papers studied used HoloLens 1. Of the five reviewed papers, three used AR technology and the remaining two used MR technology. Moreover, the reviewed papers were found to provide interaction (three papers) and immersion (two papers) functions because of the characteristics of students or specialists who have to practice in a virtual environment.

3.2. Applications in Engineering Fields

AR and MR, which track and visualize the relationships of objects in a space, are used in various engineering fields because they enable virtual experiences of dangerous, expensive, or otherwise difficult-to-experience situations, such as disaster training or space travel. In this study, the use of HoloLens in the engineering field was analyzed by dividing the relevant literature into industrial engineering, architecture and civil engineering and other engineering fields.

3.2.1. Industrial Engineering

Currently, industrial manufacturing is shifting toward flexible and intelligent manufacturing and the development of automation, robotics and information and communication technology (ICT) has increased the efficiency and flexibility of production. As industrial robots are introduced and become increasingly used in the manufacturing industry, it is important to isolate and monitor the workspaces of human workers and robots. Recently, HoloLens has been used to monitor the workspaces of industrial robots. Furthermore, AR devices have been used for the maintenance, repair and retrofitting of robots, as well as for product assembly and production management.
Hietanen et al. [53] proposed an interactive AR user interface (UI) to monitor and secure the minimum protective distance between a robot and an operator in the industrial manufacturing field. The AR UI was implemented with two pieces of hardware—a projector–mirror setup and wearable AR gear (HoloLens). The workspace model and UI were evaluated in a realistic diesel engine assembly task. Compared with the baseline without interaction, it was confirmed that AR-based interactive UIs reduce the time required for task completion and robot idle time. However, according to the user experience evaluation, AR using HoloLens is not yet suitable for industrial manufacturing and the projector–mirror setup needs improvement with respect to safety and work ergonomics.
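At its core, monitoring a minimum protective distance like Hietanen et al.'s UI does reduces to a closest-distance check between tracked robot and operator geometry. The sketch below uses hypothetical point sets and a hypothetical threshold value.

```python
import numpy as np

def min_separation(robot_pts, human_pts):
    """Smallest Euclidean distance between any robot point and any human point."""
    diff = robot_pts[:, None, :] - human_pts[None, :, :]
    return float(np.sqrt((diff ** 2).sum(-1)).min())

SAFETY_DISTANCE = 0.5  # metres; illustrative threshold, not from the paper

# Hypothetical tracked keypoints (metres) for robot links and operator joints.
robot = np.array([[0.0, 0.0, 1.0], [0.2, 0.0, 1.2]])
human = np.array([[1.0, 0.0, 1.0], [1.1, 0.1, 1.1]])

d = min_separation(robot, human)
print(d >= SAFETY_DISTANCE)  # → True
```

In a deployed system this check would run every frame, with the result driving both the AR warning overlay and the robot's slow-down or stop command.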
Gruenefeld et al. [54] used an AR device to convey the motion intent of an industrial robot to the user. The AR visualization of the robot’s path, preview and volume enables users to perceive the movement of the robot in the workspace. The AR devices were also used to visualize the future path of the end-effector, the complete robot configuration model projected into the future and the volume slice that the robot projected onto a 2D circle, which would be occupied in the future. The proposed visualization shows the robot’s motion only for a short time ahead, while the robot is moving; however, this visualization is sufficiently long for the operator to recognize and react to the oncoming robot. Furthermore, it has been shown that collisions between the robots and human workers can be prevented in advance, which minimizes the downtime and reduces productivity losses.
Al-Maeeni et al. [55] proposed an AR-based method for retrofitting production machinery. In a case study, they analyzed how to help machine users perform tasks in the correct sequence to save retrofit time and cost. With HoloLens support, the users effectively gained a step-by-step navigation system. Additionally, the system can be adapted to retrofit production machines for which user manuals are no longer available.
Vorraber et al. [56] presented the results of research conducted with AR-based and audio-only communication systems for remote maintenance in a real-world industrial setting. They analyzed whether HoloLens could improve the efficiency of remote maintenance operations compared with a telephone-supported maintenance process. The results showed that task completion with the audio-only system was approximately 20% slower than with the HoloLens. They also determined that HoloLens-based AR is particularly beneficial when inexperienced employees must perform complex and difficult tasks.
Mourtzis et al. [57] proposed an AR-based application program that retrieves and displays a production schedule and processed data for production management. The proposed application is designed and implemented to dynamically exchange information between a digital environment and physical production system. The data were retrieved and loaded within AR-based applications via extensible markup language (XML) format files, then projected onto the Gantt charts and shop floor monitoring modules. HoloLens users can visualize both production Gantt charts and shop floor monitoring in the real world and place them in a comfortable position on the real-world object. Because the proposed application is based on AR technology, users can more easily recognize and manipulate the data.
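The XML-based data exchange can be illustrated with a minimal parser that groups tasks per machine, as a Gantt-chart module might. The element and attribute names here are hypothetical, not the schema used by Mourtzis et al.

```python
import xml.etree.ElementTree as ET

# Hypothetical schedule snippet; the element names are illustrative only.
SCHEDULE_XML = """
<schedule>
  <task id="OP-10" resource="CNC-1" start="08:00" end="09:30"/>
  <task id="OP-20" resource="CNC-1" start="09:30" end="10:15"/>
  <task id="OP-30" resource="Lathe-2" start="08:15" end="09:00"/>
</schedule>
"""

def tasks_by_resource(xml_text):
    """Group schedule tasks per machine, one row per Gantt-chart lane."""
    rows = {}
    for task in ET.fromstring(xml_text).iter("task"):
        rows.setdefault(task.get("resource"), []).append(
            (task.get("id"), task.get("start"), task.get("end")))
    return rows

gantt = tasks_by_resource(SCHEDULE_XML)
print(gantt["CNC-1"])  # two operations queued on machine CNC-1
```

In the cited application, a structure like `gantt` would be rendered as holographic Gantt bars anchored to real-world objects on the shop floor.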
Szajna et al. [58] applied AR technology to the wire assembly and production process; in particular, they produced and tested a prototype assistance system for the wiring of control cabinets. The system identifies the elements of the environment online and provides significant support to the operator on the control cabinet assembly and production line. The system operates as follows. (1) A CAD file is loaded into an EPLAN smart wiring (ESW) system (a wiring support system). (2) When the user selects a wire from the list in the ESW system via a virtual screen or physical touch screen, extended information, such as wire length, connection point details and 3D graphics, is displayed. (3) The user collects a wire of the appropriate length from a wire holder. (4) The AR glasses software uses coordinates provided by the ESW's 3D graphics, together with markers on the cabinet and a matching algorithm, to indicate the wire's assembly point through virtual arrows, pointers, virtual images with markings, or textual prompts. (5) The user confirms the assembly of the wire on a virtual or physical touch screen with a simple gesture or voice command. (6) The user collects the next wire and repeats the process. Figure 8 shows an overview of the production station and the general system principle. Szajna et al. conducted tests to determine whether the system supported the control cabinet production process, verifying that the wire assembly process can be significantly shortened using an AR glasses-based system.
Deshpande and Kim [59] developed a HoloLens-based AR application that supports the assembly of ready-to-assemble (RTA) furniture. Using the application's visual functions and interaction modes, users can quickly grasp the spatial relationships of the parts. They tested the application on first-time users of RTA furniture and showed it to be effective in improving users' spatial problem-solving when working with RTA furniture of varying assembly complexity. The positive effect of the AR-based support was particularly pronounced for assemblies of higher complexity.
Vidal-Balea et al. [60] developed a new HoloLens-based collaboration application that allows shipyard operators to interact with a virtual clutch while assembling a turbine in a real workshop. The application acts as a virtual guide during the assembly of different parts of a ship. Through the AR device, users can receive documentation for each part of the clutch and check the relevant blueprints and physical measurements. The developed solution was tested during the assembly of a clutch comprising numerous mobile parts of various shapes and sizes. Figure 9 shows a screenshot of the application, in which the sequence of steps for the correct assembly of the clutch is visualized via animation, along with the blueprints and documents associated with the component. The virtual elements could be visualized in detail at their real size; moreover, even before production of those elements started, it was confirmed that they were placed precisely in the position where they would be integrated.
Table 4 summarizes the literature on using AR devices in the industrial engineering field. All papers in this category used HoloLens 1, except for one paper that did not indicate the version used. AR was employed in all studies in the industrial engineering category. Two of the reviewed studies provided only the visualization of virtual objects, while the remaining six also allowed interaction with objects in a virtual space.

3.2.2. Architecture and Civil Engineering

Many cities are rapidly developing into smart cities, organizing information on their various infrastructures through ICT and the Internet of Things. Alongside this development, AR and 3D reconstruction technologies are advancing and being widely used in architecture and civil engineering, primarily to map spaces and visualize data.
Microsoft HoloLens is equipped with various sensors, including four tracking cameras and a time-of-flight (ToF) range camera. The sensor images, together with their poses estimated by the built-in tracking system, can be accessed by the user, which makes HoloLens potentially interesting as an indoor mapping device [61]. Hübner et al. [61] described the various sensors mounted on the HoloLens and evaluated the system for indoor environment mapping. They assessed the performance of the depth sensor and the tracking system separately, and then evaluated the complete system's ability to map multi-room environments. Based on this evaluation, HoloLens is an easy-to-use, relatively inexpensive device that can readily capture the geometric structure of large indoor environments.
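The geometric capture evaluated above rests on back-projecting depth pixels into 3D points. A minimal pinhole-model sketch follows; the intrinsics are made-up placeholders, not the calibrated parameters of the HoloLens ToF camera.

```python
def depth_to_point(u, v, depth_m, fx, fy, cx, cy):
    """Back-project one depth pixel into a 3D camera-frame point using a
    pinhole model: X = (u - cx) * Z / fx, Y = (v - cy) * Z / fy.
    """
    x = (u - cx) * depth_m / fx
    y = (v - cy) * depth_m / fy
    return (x, y, depth_m)

# A pixel 100 px right of the principal point, 2 m away, focal length 500 px
print(depth_to_point(420, 240, 2.0, 500.0, 500.0, 320.0, 240.0))  # (0.4, 0.0, 2.0)
```

Applying this to every pixel of a depth frame, and transforming by the tracked camera pose, yields the point clouds from which indoor meshes are reconstructed.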
Zhang et al. [62] proposed an AR-based method for visualizing 3D city models of Toronto, Canada, together with various types of city data, using a HoloLens. The city model and dataset are displayed as augmented content overlaid on the real world through the lens. Users obtain an interactive view of Toronto's city data and can inspect the details of a selected data object. With the proposed method, users can efficiently retrieve and observe various types of city data; moreover, because the data can be easily manipulated through the HoloLens, the city data can be better explored and managed.
Bahri et al. [63] proposed using MR technology with HoloLens to manage the furniture in a room based on a building information modeling (BIM) system. The MR system allows interaction with BIM data to be carried from the virtual into the real environment. They mapped the occupied and empty spaces of the room using a 3D spatial mapping method. The developed system can move virtual items (furniture) in real time and rotate and resize objects.
Wu et al. [64] proposed a novel MR-based control system for the path planning of omnidirectional mobile robots. The proposed system can control the movement of a mobile robot in a real environment as well as the interaction between the robot's movement and virtual objects added to that environment. An interactive interface provided through HoloLens displays maps, paths, control commands and other information related to the mobile robot. Furthermore, by adding a virtual object to a real map, real-time interaction between the mobile robot and the virtual object can be realized. The interface comprises a virtual panel and buttons created with Unity3D combined with HoloToolKit, as shown in Figure 10a, and the system's communication flow is shown in Figure 10b. Verification in an indoor environment showed that the proposed method can generate movement paths for the mobile robot according to the operator's specific requirements while demonstrating excellent obstacle avoidance.
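Wu et al.'s planner is not detailed here; as an assumption-laden stand-in, a breadth-first search on an occupancy grid shows how a path can route around a virtual obstacle added to the map.

```python
from collections import deque

def plan_path(grid, start, goal):
    """Breadth-first shortest-path search on an occupancy grid (1 = obstacle).

    A toy stand-in for the path planning described by Wu et al.; their
    actual planner is not specified in this review.
    """
    rows, cols = len(grid), len(grid[0])
    parent = {start: None}
    queue = deque([start])
    while queue:
        cell = queue.popleft()
        if cell == goal:                      # reconstruct path back to start
            path = []
            while cell is not None:
                path.append(cell)
                cell = parent[cell]
            return path[::-1]
        r, c = cell
        for nr, nc in ((r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)):
            if 0 <= nr < rows and 0 <= nc < cols \
                    and grid[nr][nc] == 0 and (nr, nc) not in parent:
                parent[(nr, nc)] = cell
                queue.append((nr, nc))
    return None  # goal unreachable

# A virtual obstacle (the 1s) forces a detour around the middle column
grid = [[0, 1, 0],
        [0, 1, 0],
        [0, 0, 0]]
print(plan_path(grid, (0, 0), (0, 2)))
```

Dropping a virtual object onto the map corresponds to marking its cells as 1 and re-planning, which is the real-time interaction the cited system demonstrates.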
Moezzi et al. [65] described how HoloLens can be used in advanced control education for the localization, mapping and control of autonomous vehicles. HoloLens can provide 3D position information through its spatial mapping feature and thereby address the problem of simultaneous localization and mapping. They combined HoloLens with a Raspberry Pi computer connected to standard remote-control car components to create a suitable experimental platform for control education. The platform supports several levels of automatic control teaching, from simple proportional-integral-derivative (PID) control through linear model predictive control (MPC) to switched-model and nonlinear MPC. The work confirmed that HoloLens can be used in control education for autonomous vehicles. In a follow-up study, Moezzi et al. [66] investigated using HoloLens in advanced control for the positioning, mapping and trajectory tracking of autonomous robots.
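The entry level of such a control curriculum, a discrete PID loop, can be sketched as follows. The gains and the first-order plant are arbitrary teaching examples, not parameters of Moezzi et al.'s platform.

```python
class PID:
    """Discrete PID controller of the kind used in introductory control labs."""

    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = None

    def update(self, setpoint, measurement):
        error = setpoint - measurement
        self.integral += error * self.dt
        derivative = 0.0 if self.prev_error is None \
            else (error - self.prev_error) / self.dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

# Drive a pure-integrator plant x' = u toward a 1.0 setpoint for 100 s
pid, x = PID(kp=2.0, ki=0.5, kd=0.1, dt=0.05), 0.0
for _ in range(2000):
    x += pid.update(1.0, x) * 0.05
print(round(x, 3))  # settles near the setpoint
```

On the cited platform, the measurement would come from HoloLens spatial tracking and the output would drive the car's motors; the same loop structure then generalizes to MPC variants.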
HoloLens has real-time inside-out tracking that can accurately and stably visualize virtual content in the surrounding spatial environment; however, to augment an indoor environment with the corresponding building model data, a one-time localization of the AR platform inside the local coordinate frame of the building model is required. Hübner et al. [67] therefore proposed a simple marker-based localization method for HoloLens that suffices to overlay the indoor environment with virtual room-scale model data at a spatial accuracy of a few centimeters. They also demonstrated that the HoloLens mobile AR platform is suitable for spatially correct on-site visualization of building model data.
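The one-time registration idea reduces to composing rigid transforms: the marker's measured pose in the HoloLens world frame and its known pose in the building model together give the model-to-world transform. Below is a 2D simplification with invented coordinates; the actual method works with full 3D poses.

```python
import math

def compose(a, b):
    """Compose two 2D rigid poses a∘b, each given as (x, y, theta in radians)."""
    ax, ay, at = a
    bx, by, bt = b
    return (ax + bx * math.cos(at) - by * math.sin(at),
            ay + bx * math.sin(at) + by * math.cos(at),
            at + bt)

def invert(p):
    """Invert a 2D rigid pose: [R t]^-1 = [R^T  -R^T t]."""
    x, y, t = p
    c, s = math.cos(t), math.sin(t)
    return (-(x * c + y * s), x * s - y * c, -t)

# Hypothetical one-time localization: the marker is observed at (2, 1) facing
# 90 degrees in the HoloLens world frame, and sits at (5, 0) in the model frame.
world_T_marker = (2.0, 1.0, math.pi / 2)
model_T_marker = (5.0, 0.0, 0.0)
world_T_model = compose(world_T_marker, invert(model_T_marker))

# Any model vertex can now be placed in the world, e.g. a point 1 m past the marker
door = compose(world_T_model, (6.0, 0.0, 0.0))
print(round(door[0], 6), round(door[1], 6))
```

After this single registration step, every element of the building model can be rendered at its spatially correct on-site position.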
Table 5 summarizes the aims, citation counts, HoloLens versions, visualization technologies and functionality of the papers in architecture and civil engineering. Seven papers were reviewed, all of which used HoloLens 1. Five used AR technology and the remaining two used MR. All studies using AR provided the capability to interact with virtual objects, while those using MR provided an immersion function.

3.2.3. Other Engineering Fields

In addition to industrial engineering and architecture and civil engineering, AR is being used in various other engineering fields, including aeronautics and astronautics, electronics and communication.
Helin et al. [68] developed an AR system to support the manual work of astronauts. The system was designed for the HoloLens MR platform and implemented with a modular architecture. Its performance was evaluated with 39 participants, who installed a temporary stowage rack (TSR) on a physical mock-up of an International Space Station (ISS) module (Figure 11). User experience was assessed via questionnaires and interviews, and in-depth feedback on platform experience (technology acceptance, system usability, smart-glasses user satisfaction and user interaction satisfaction) was collected. The scores for user experience, usability, user satisfaction and technology acceptance were close to the desired average; in particular, the System Usability Scale score was 68, indicating that the usability of the AR platform was acceptable.
Xue et al. [69] presented the results of a survey on user satisfaction with AR applied in three industries: aeronautics, medicine and astronautics. User satisfaction was investigated separately for the interaction and for the delivery device. They surveyed 142 participants from the 3 industries; the collected information included age, gender, education level, internet knowledge level and participant roles. Although the AR users were unfamiliar with smart glasses, general computer knowledge was found to have a positive effect on user satisfaction. The results also showed that satisfaction with education and learning was acceptable, whereas gender, age, education and the role of students or experts did not affect user satisfaction.
Mehta et al. [70] used the HoloLens, a mixed reality device, to detect emotions from facial expressions. They presented preliminary emotion recognition results obtained with a HoloLens and compared them with results from a regular webcam. With both devices, the five emotional expressions of anger, neutrality, happiness, surprise and sadness were appropriately portrayed by the three subjects in the experiment. The primary factor limiting the accuracy of both devices was that the facial expression for the emotion to be predicted had to be held for a long time; because a sad expression was difficult to hold until detection, it posed a serious challenge. In experiments under varying light conditions, however, the HoloLens was more accurate than the webcam owing to its built-in sensors.
Vidal-Balea et al. [71] proposed an architecture to provide low-latency AR education services in classrooms and laboratories. Such low latency can be achieved by using an edge computing device capable of real-time data processing and communication between AR devices. If the network is not properly designed, the wireless link may become overloaded, depending on the specific AR application and the number of users; the overall performance of the application may then degrade and latency may increase. To address this, the performance of AR-supported classrooms and laboratories was modeled and simulated, and wireless channel measurements and simulation results were presented (Figure 12). A Microsoft HoloLens 2 teaching application was devised, demonstrating the feasibility of the proposed approach.
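The sensitivity of latency to the number of users can be illustrated with a toy M/M/1 queueing estimate for a shared wireless link. Every parameter value here is an assumption for illustration, not a measurement from Vidal-Balea et al.'s testbed.

```python
def mean_delay_ms(users, rate_mbps_per_user=8.0, capacity_mbps=120.0,
                  packet_kbit=12.0):
    """Toy M/M/1 estimate of mean per-packet delay on a shared link.

    With Mbps == kbit/ms, the M/M/1 sojourn time is packet_kbit / (C - load);
    as aggregate load approaches capacity, delay diverges.
    """
    load = users * rate_mbps_per_user
    if load >= capacity_mbps:
        return float("inf")   # link saturated: queue grows without bound
    return packet_kbit / (capacity_mbps - load)

# Delay grows nonlinearly as more AR headsets share the same access point
for n in (1, 5, 10, 14):
    print(n, round(mean_delay_ms(n), 2))
```

Even this crude model captures the qualitative behavior that motivates edge deployment: adding headsets raises delay slowly at first and then sharply near saturation.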
Bekele [72] proposed the “Walkable MxR Map”, an interactive immersive map that allows interaction with 3D models and various multimedia contents in museums and at historical sites. The proposed map can be applied to a specific virtual heritage (VH) setting in a predefined cultural and historical context and includes an interface that allows interaction on a map in a mixed reality environment. The researchers combined immersive reality technology, interaction methods, development platforms and mapping and cloud storage services to implement the interaction method. Users can interact with virtual objects through a map that is virtually projected onto the floor and viewed through a HoloLens. The projected maps are room-scale and walkable with potential global scalability. In addition to motion-based interaction, users can interact with virtual objects, multimedia content and 3D models using the HoloLens standard gesture, gaze and voice interaction methods.
Table 6 provides a summary of the literature reviewed thus far. Of the five research papers reviewed, one, published in 2020, used HoloLens 2 and the rest used HoloLens 1. Three used AR and the other two used MR. In addition, four of the researched cases provided the interaction function to interact with the virtual object and the remaining one provided only the data visualization function.

4. Discussion

This study reviewed academic papers on the application of HoloLens in the medical and healthcare and engineering fields. Through this review, it was possible to summarize the current status of publications by year, source, application in various industries, types of visualization technologies and functions implemented by various industries.

4.1. Current Status of Publications Related to HoloLens Research by Year and Source

The number of publications on HoloLens during the past five years is shown in Figure 13. In the year HoloLens 1 was released, not a single article was published. Papers first appeared in 2017 (three cases), in the field of medicine and healthcare. In 2018, researchers also started using HoloLens in the engineering field, and nine papers were published in each of 2018 and 2019. In 2020, the number of publications rose to 23, indicating significant growth in research using the device. The second version (HoloLens 2), upgraded in both hardware and software compared with its predecessor, was released in May 2019, and the growing demand for HoloLens in related industries is believed to explain why research using the device has recently been so active. The sources of publication were academic journals (84%) and conference proceedings (16%).

4.2. Current Applications of the HoloLens in Various Industries

For this study, 44 research papers covering a wide range of HoloLens applications in various industries were reviewed. The review indicated that the HoloLens is primarily used in the medical, healthcare and engineering fields. In particular, medical and surgical aids and systems constitute the leading category of HoloLens usage, as indicated below:
  • Medical and surgical aids and systems (19 papers);
  • Medical education and simulation (5 papers);
  • Industrial engineering (8 papers);
  • Architecture and civil engineering (7 papers);
  • Other engineering (5 papers).
Figure 14 shows the percentage distribution of HoloLens application categories across industries and the number of studies in each category. Medical and healthcare applications account for 55% of all reviewed studies. Within engineering, HoloLens has been used most in industrial engineering and in architecture and civil engineering; applications in other engineering streams accounted for 11% of the reviewed studies. Industrial engineering was the second most frequent application of the HoloLens overall.
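These shares follow directly from the per-category paper counts tallied in the section summaries (19, 5, 8, 7 and 5 papers); a quick check:

```python
counts = {
    "medical and surgical aids and systems": 19,
    "medical education and simulation": 5,
    "industrial engineering": 8,
    "architecture and civil engineering": 7,
    "other engineering": 5,
}
total = sum(counts.values())                       # 44 reviewed papers
medical_share = 100 * (counts["medical and surgical aids and systems"]
                       + counts["medical education and simulation"]) / total
other_share = 100 * counts["other engineering"] / total
print(total, round(medical_share), round(other_share))  # 44 55 11
```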

4.3. Types of Visualization Technology Used in Various Industries

The HoloLens can essentially implement AR, overlaying virtual images or data on the real world. It can also implement mixed reality, in which users manipulate and interact with both physical and virtual items and environments using real and digital elements. As the reviews conducted in this study indicate, both visualization techniques, AR and MR, were used for applications across industries. Table 7 shows the number of studies using AR and MR for each industry. A total of 32 research papers used AR, compared with 12 that used MR. AR was most common in medical and surgical aids and systems, where it was used in 13 research papers. Furthermore, MR was used in every industry except industrial engineering.

4.4. Types of Function Provided to Users

The 44 reviewed research papers were grouped into three types according to the functions they provide to users in each industry (Table 8). Medical and surgical aids and systems mostly provided an environment for interacting with virtual objects. Medical education and simulation provided learners with interactive and immersive environments grounded in real-world settings. According to the literature reviewed, most engineering fields likewise provided an interactive environment.

5. Conclusions

In this study, a literature review was performed to investigate the current status and trends in HoloLens studies published over the past five years (2016–2020). This review yielded the following four conclusions:
  • Research using HoloLens started in 2017, the year after HoloLens was first released. Since the release of HoloLens 2 in 2019, research is expected to expand further.
  • Studies on HoloLens could be divided into five categories—medical and surgical aids and systems, medical education and simulation, industrial engineering, architecture and civil engineering and other engineering. Among the current applications, HoloLens was applied the most in medical and surgical aids and systems (43%).
  • Two categories of visualization technology were employed, depending on the industry and purpose of application, and it was found that the majority utilized AR (72.7%).
  • The functions provided to users in each industry were classified into three types (visualization, interaction and immersion); interaction with virtual objects in a virtual world was the most widely adopted.
Considering the expandability and functionality of HoloLens, the device can be widely used not only in manufacturing but also in gaming, defense and tourism. In manufacturing in particular, HoloLens can be applied to the entire process, from product design through manufacturing and maintenance to staff training. However, reports on such applications remain rare in academic papers, so this is clearly a research area with ample opportunities for future work. The results of this study can help in recognizing the potential applications of HoloLens and identifying areas for future study.

Author Contributions

Conceptualization, Y.C.; methodology, Y.C.; software, S.P.; validation, S.P.; formal analysis, S.P. and S.B.; investigation, Y.C.; resources, Y.C.; data curation, S.P. and S.B.; writing—original draft preparation, S.P. and S.B.; writing—review and editing, Y.C.; visualization, S.P.; supervision, Y.C.; project administration, Y.C.; funding acquisition, Y.C. All authors have read and agreed to the published version of the manuscript.

Funding

This work was supported by a National Research Foundation of Korea (NRF) grant funded by the Korean government (MSIT) (2021R1A2C1011216).

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

Data sharing not applicable.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Demystifying the Virtual Reality Landscape. Available online: https://www.intel.com/content/www/us/en/tech-tips-and-tricks/virtual-reality-vs-augmented-reality.html (accessed on 22 July 2021).
  2. Liu, Y.; Dong, H.; Zhang, L.; El Saddik, A. Technical evaluation of HoloLens for multimedia: A first look. IEEE MultiMed. 2018, 25, 8–18.
  3. Al Janabi, H.F.; Aydin, A.; Palaneer, S.; Macchione, N.; Al-Jabir, A.; Khan, M.S.; Dasgupta, P.; Ahmed, K. Effectiveness of the HoloLens mixed-reality headset in minimally invasive surgery: A simulation-based feasibility study. Surg. Endosc. 2020, 34, 1143–1149.
  4. Wang, W.; Wu, X.; Chen, G.; Chen, Z. Holo3DGIS: Leveraging Microsoft HoloLens in 3D Geographic Information. ISPRS Int. J. Geo-Inf. 2018, 7, 60.
  5. Furlan, R. The future of augmented reality: Hololens-Microsoft's AR headset shines despite rough edges [Resources_Tools and Toys]. IEEE Spectr. 2016, 53, 21.
  6. HoloLens Hardware Details. Available online: https://developer.microsoft.com/en-us/windows/mixed-reality/hololens_hardware_details (accessed on 1 June 2021).
  7. Garon, M.; Boulet, P.O.; Doiron, J.P.; Beaulieu, L.; Lalonde, J.F. Real-Time High Resolution 3D Data on the HoloLens. In Proceedings of the 2016 IEEE International Symposium on Mixed and Augmented Reality (ISMAR-Adjunct), Merida, Mexico, 19–23 September 2016; pp. 189–191.
  8. Strzys, M.P.; Kapp, S.; Thees, M.; Kuhn, J.; Lukowicz, P.; Knierim, P.; Schmidt, A. Augmenting the thermal flux experiment: A mixed reality approach with the HoloLens. Phys. Teach. 2017, 55, 376–377.
  9. Picallo, I.; Vidal-Balea, A.; Lopez-Iturri, P.; Fraga-Lamas, P.; Klaina, H.; Fernández-Caramés, T.M.; Falcone, F. Wireless Channel Assessment of Auditoriums for the Deployment of Augmented Reality Systems for Enhanced Show Experience of Impaired Persons. Proceedings 2020, 42, 30.
  10. Jing, H.; Boxiong, Y.; Jiajie, C. Non-contact Measurement Method Research Based on HoloLens. In Proceedings of the 2017 International Conference on Virtual Reality and Visualization (ICVRV), Zhengzhou, China, 21–22 October 2017; pp. 267–271.
  11. Fernández-Caramés, T.M.; Fraga-Lamas, P.; Suárez-Albela, M.; Vilar-Montesinos, M. A Fog Computing and Cloudlet Based Augmented Reality System for the Industry 4.0 Shipyard. Sensors 2018, 18, 1798.
  12. Vaquero-Melchor, D.; Bernardos, A.M. Enhancing Interaction with Augmented Reality through Mid-Air Haptic Feedback: Architecture Design and User Feedback. Appl. Sci. 2019, 9, 5123.
  13. Blanco-Novoa, Ó.; Fraga-Lamas, P.; Vilar-Montesinos, M.A.; Fernández-Caramés, T.M. Creating the Internet of Augmented Things: An Open-Source Framework to Make IoT Devices and Augmented and Mixed Reality Systems Talk to Each Other. Sensors 2020, 20, 3328.
  14. Stark, E.; Bisták, P.; Kucera, E.; Haffner, O.; Kozák, S. Virtual Laboratory Based on Node.js Technology and Visualized in Mixed Reality Using Microsoft HoloLens. In Proceedings of the 2017 Federated Conference on Computer Science and Information Systems, Prague, Czech Republic, 3–6 September 2017; pp. 315–322.
  15. Osti, F.; Santi, G.M.; Caligiana, G. Real Time Shadow Mapping for Augmented Reality Photorealistic Rendering. Appl. Sci. 2019, 9, 2225.
  16. Coolen, B.; Beek, P.J.; Geerse, D.J.; Roerdink, M. Avoiding 3D Obstacles in Mixed Reality: Does It Differ from Negotiating Real Obstacles? Sensors 2020, 20, 1095.
  17. Vaquero-Melchor, D.; Bernardos, A.M.; Bergesio, L. SARA: A Microservice-Based Architecture for Cross-Platform Collaborative Augmented Reality. Appl. Sci. 2020, 10, 2074.
  18. Guo, N.; Wang, T.; Yang, B.; Hu, L.; Liu, H.; Wang, Y. An online calibration method for Microsoft HoloLens. IEEE Access 2019, 7, 101795–101803.
  19. Lee, J.; Hafeez, J.; Kim, K.; Lee, S.; Kwon, S. A Novel Real-Time Match-Moving Method with HoloLens. Appl. Sci. 2019, 9, 2889.
  20. Hübner, P.; Weinmann, M.; Wursthorn, S. Voxel-Based Indoor Reconstruction from HoloLens Triangle Meshes. ISPRS Ann. Photogramm. Remote Sens. Spat. Inf. Sci. 2020, V-4-2020, 79–86.
  21. Radkowski, R.; Kanunganti, S. Augmented reality system calibration for assembly support with the Microsoft HoloLens. In Proceedings of the International Manufacturing Science and Engineering Conference, College Station, TX, USA, 18–22 June 2018; p. V003T02A021.
  22. Ostanin, M.; Klimchik, A. Interactive robot programing using mixed reality. IFAC-PapersOnLine 2018, 51, 50–55.
  23. Kress, B.; Cummings, W. Towards the Ultimate Mixed Reality Experience: HoloLens Display Architecture Choices. SID Symp. Dig. Tech. Pap. 2017, 48, 127–131.
  24. Riedlinger, U.; Oppermann, L.; Prinz, W. Tango vs. HoloLens: A Comparison of Collaborative Indoor AR Visualisations Using Hand-Held and Hands-Free Devices. Multimodal Technol. Interact. 2019, 3, 23.
  25. Merino, L.; Sotomayor-Gómez, B.; Yu, X.; Salgado, R.; Bergel, A.; Sedlmair, M.; Weiskopf, D. Toward Agile Situated Visualization: An Exploratory User Study. In Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems, Honolulu, HI, USA, 25–30 April 2020; pp. 1–7.
  26. Ababsa, F.; He, J.; Chardonnet, J.R. Combining HoloLens and Leap-Motion for Free Hand-Based 3D Interaction in MR Environments. In Augmented Reality, Virtual Reality, and Computer Graphics. AVR 2020; De Paolis, L., Bourdot, P., Eds.; Lecture Notes in Computer Science; Springer: Cham, Switzerland, 2020; Volume 12242, pp. 315–327. ISBN 978-3-030-58464-1.
  27. Rokhsaritalemi, S.; Sadeghi-Niaraki, A.; Choi, S.-M. A Review on Mixed Reality: Current Trends, Challenges and Prospects. Appl. Sci. 2020, 10, 636.
  28. Chun, H.S. Application of Virtual Reality in the Medical Field. In Electronics and Telecommunications Trends; ETRI: Daejeon, Korea, 2019; Volume 34, pp. 19–28.
  29. Hanna, M.G.; Ahmed, I.; Nine, J.; Prajapati, S.; Pantanowitz, L. Augmented reality technology using Microsoft HoloLens in anatomic pathology. Arch. Pathol. Lab. Med. 2018, 142, 638–644.
  30. Chien, J.-C.; Tsai, Y.-R.; Wu, C.-T.; Lee, J.-D. HoloLens-Based AR System with a Robust Point Set Registration Algorithm. Sensors 2019, 19, 3555.
  31. Kumar, R.P.; Pelanis, E.; Bugge, R.; Brun, H.; Palomar, R.; Aghayan, D.L.; Fretland, A.A.; Edwin, B.; Elle, O.J. Use of mixed reality for surgery planning: Assessment and development workflow. J. Biomed. Inform. X 2020, 8, 100077.
  32. Wu, M.-L.; Chien, J.-C.; Wu, C.-T.; Lee, J.-D. An Augmented Reality System Using Improved-Iterative Closest Point Algorithm for On-Patient Medical Image Visualization. Sensors 2018, 18, 2505.
  33. Nguyen, N.Q.; Cardinell, J.; Ramjist, J.M.; Lai, P.; Dobashi, Y.; Guha, D.; Androutsos, D.; Yang, V.X. An augmented reality system characterization of placement accuracy in neurosurgery. J. Clin. Neurosci. 2020, 72, 392–396.
  34. Koyachi, M.; Sugahara, K.; Odaka, K.; Matsunaga, S.; Abe, S.; Sugimoto, M.; Katakura, A. Accuracy of Le Fort I osteotomy with combined computer-aided design/computer-aided manufacturing technology and mixed reality. Int. J. Oral Maxillofac. Surg. 2020, 50, 782–790.
  35. Mojica, C.M.M.; Navkar, N.V.; Tsekos, N.V.; Tsagkaris, D.; Webb, A.; Birbilis, T.; Seimenis, I. Holographic Interface for three-dimensional visualization of MRI on HoloLens: A prototype platform for MRI guided neurosurgeries. In Proceedings of the 2017 IEEE 17th International Conference on Bioinformatics and Bioengineering (BIBE), Washington, DC, USA, 23–25 October 2017; pp. 21–27.
  36. Kuhlemann, I.; Kleemann, M.; Jauer, P.; Schweikard, A.; Ernst, F. Towards X-ray free endovascular interventions–using HoloLens for on-line holographic visualisation. Healthc. Technol. Lett. 2017, 4, 184–187.
  37. Pratt, P.; Ives, M.; Lawton, G.; Simmons, J.; Radev, N.; Spyropoulou, L.; Amiras, D. Through the HoloLens™ looking glass: Augmented reality for extremity reconstruction surgery using 3D vascular models with perforating vessels. Eur. Radiol. Exp. 2018, 2, 2.
  38. Jiang, T.; Yu, D.; Wang, Y.; Zan, T.; Wang, S.; Li, Q. HoloLens-Based Vascular Localization System: Precision Evaluation Study With a Three-Dimensional Printed Model. J. Med. Internet Res. 2020, 22, e16852.
  39. Liebmann, F.; Roner, S.; von Atzigen, M.; Scaramuzza, D.; Sutter, R.; Snedeker, J.; Farshad, M.; Fürnstahl, P. Pedicle screw navigation using surface digitization on the Microsoft HoloLens. Int. J. CARS 2019, 14, 1157–1165.
  40. Müller, F.; Roner, S.; Liebmann, F.; Spirig, J.M.; Fürnstahl, P.; Farshad, M. Augmented reality navigation for spinal pedicle screw instrumentation using intraoperative 3D imaging. Spine J. 2020, 20, 621–628.
  41. Park, B.J.; Hunt, S.J.; Nadolski, G.J.; Gade, T.P. Augmented reality improves procedural efficiency and reduces radiation dose for CT-guided lesion targeting: A phantom study using HoloLens 2. Sci. Rep. 2020, 10, 18620.
  42. Gu, W.; Shah, K.; Knopf, J.; Navab, N.; Unberath, M. Feasibility of image-based augmented reality guidance of total shoulder arthroplasty using Microsoft HoloLens 1. Comput. Methods Biomech. Biomed. Eng. Imaging Vis. 2020, 1835556.
  43. Qian, L.; Song, C.; Jiang, Y.; Luo, Q.; Ma, X.; Chiu, P.W.; Li, Z.; Kazanzides, P. FlexiVision: Teleporting the Surgeon's Eyes via Robotic Flexible Endoscope and Head-Mounted Display. In Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Las Vegas, NV, USA, 25–29 October 2020.
  44. Lee, E.-Y.; Tran, V.T.; Kim, D. A Novel Head Mounted Display Based Methodology for Balance Evaluation and Rehabilitation. Sustainability 2019, 11, 6453.
  45. Condino, S.; Turini, G.; Viglialoro, R.; Gesi, M.; Ferrari, V. Wearable Augmented Reality Application for Shoulder Rehabilitation. Electronics 2019, 8, 1178. [Google Scholar] [CrossRef] [Green Version]
  46. Geerse, D.J.; Coolen, B.; Roerdink, M. Quantifying Spatiotemporal Gait Parameters with HoloLens in Healthy Adults and People with Parkinson’s Disease: Test-Retest Reliability, Concurrent Validity, and Face Validity. Sensors 2020, 20, 3216. [Google Scholar] [CrossRef] [PubMed]
  47. Herron, J. Augmented reality in medical education and training. J. Electron. Resour. Med. Libr. 2016, 13, 51–55. [Google Scholar] [CrossRef] [Green Version]
  48. Moro, C.; Phelps, C.; Redmond, P.; Stromberga, Z. HoloLens and mobile augmented reality in medical and health science education: A randomised controlled trial. Br. J. Educ. Technol. 2020, 52, 680–694. [Google Scholar] [CrossRef]
  49. Bulliard, J.; Eggert, S.; Ampanozi, G.; Affolter, R.; Gascho, D.; Sieberth, T.; Thali, M.J.; Ebert, L.C. Preliminary testing of an augmented reality headset as a DICOM viewer during autopsy. Forensic Imaging 2020, 23, 200417. [Google Scholar] [CrossRef]
  50. Condino, S.; Turini, G.; Parchi, P.D.; Viglialoro, R.M.; Piolanti, N.; Gesi, M.; Ferrari, M.; Ferrari, V. How to build a patient-specific hybrid simulator for Orthopaedic open surgery: Benefits and limits of mixed-reality using the Microsoft HoloLens. J. Healthc. Eng. 2018, 2018, 5435097. [Google Scholar] [CrossRef]
  51. Caligiana, P.; Liverani, A.; Ceruti, A.; Santi, G.M.; Donnici, G.; Osti, F. An Interactive Real-Time Cutting Technique for 3D Models in Mixed Reality. Technologies 2020, 8, 23. [Google Scholar] [CrossRef]
  52. Wang, S.; Parsons, M.; Stone-McLean, J.; Rogers, P.; Boyd, S.; Hoover, K.; Meruvia-Pastor, O.; Gong, M.; Smith, A. Augmented Reality as a Telemedicine Platform for Remote Procedural Training. Sensors 2017, 17, 2294. [Google Scholar] [CrossRef] [PubMed]
  53. Hietanen, A.; Pieters, R.; Lanz, M.; Latokartano, J.; Kämäräinen, J.K. AR-based interaction for human-robot collaborative manufacturing. Robot. Comput.-Integr. Manuf. 2020, 63, 101891. [Google Scholar] [CrossRef]
  54. Gruenefeld, U.; Prädel, L.; Illing, J.; Stratmann, T.; Drolshagen, S.; Pfingsthorn, M. Mind the ARm: Realtime visualization of robot motion intent in head-mounted augmented reality. In Proceedings of the Conference on Mensch und Computer, Magdeburg, Germany, 6–9 September 2020; pp. 259–266. [Google Scholar] [CrossRef]
  55. Al-Maeeni, S.S.H.; Kuhnhen, C.; Engel, B.; Schiller, M. Smart retrofitting of machine tools in the context of industry 4.0. Procedia CIRP 2020, 88, 369–374. [Google Scholar] [CrossRef]
  56. Vorraber, W.; Gasser, J.; Webb, H.; Neubacher, D.; Url, P. Assessing augmented reality in production: Remote-assisted maintenance with HoloLens. Procedia CIRP 2020, 88, 139–144. [Google Scholar] [CrossRef]
  57. Mourtzis, D.; Siatras, V.; Zogopoulos, V. Augmented reality visualization of production scheduling and monitoring. Procedia CIRP 2020, 88, 151–156. [Google Scholar] [CrossRef]
  58. Szajna, A.; Stryjski, R.; Woźniak, W.; Chamier-Gliszczyński, N.; Kostrzewski, M. Assessment of Augmented Reality in Manual Wiring Production Process with Use of Mobile AR Glasses. Sensors 2020, 20, 4755. [Google Scholar] [CrossRef]
  59. Deshpande, A.; Kim, I. The effects of augmented reality on improving spatial problem solving for object assembly. Adv. Eng. Inform. 2018, 38, 760–775. [Google Scholar] [CrossRef]
  60. Vidal-Balea, A.; Blanco-Novoa, O.; Fraga-Lamas, P.; Vilar-Montesinos, M.; Fernández-Caramés, T.M. Creating Collaborative Augmented Reality Experiences for Industry 4.0 Training and Assistance Applications: Performance Evaluation in the Shipyard of the Future. Appl. Sci. 2020, 10, 9073. [Google Scholar] [CrossRef]
  61. Hübner, P.; Clintworth, K.; Liu, Q.; Weinmann, M.; Wursthorn, S. Evaluation of HoloLens Tracking and Depth Sensing for Indoor Mapping Applications. Sensors 2020, 20, 1021. [Google Scholar] [CrossRef] [Green Version]
  62. Zhang, L.; Chen, S.; Dong, H.; El Saddik, A. Visualizing Toronto city data with Hololens: Using augmented reality for a city model. IEEE Consum. Electron. Mag. 2018, 7, 73–80. [Google Scholar] [CrossRef]
  63. Bahri, H.; Krcmarik, D.; Moezzi, R.; Kočí, J. Efficient use of mixed reality for bim system using microsoft HoloLens. IFAC-PapersOnLine 2019, 52, 235–239. [Google Scholar] [CrossRef]
  64. Wu, M.; Dai, S.-L.; Yang, C. Mixed Reality Enhanced User Interactive Path Planning for Omnidirectional Mobile Robot. Appl. Sci. 2020, 10, 1135. [Google Scholar] [CrossRef] [Green Version]
  65. Moezzi, R.; Krcmarik, D.; Bahri, H.; Hlava, J. Autonomous vehicle control based on HoloLens technology and raspberry pi platform: An educational perspective. IFAC-PapersOnLine 2019, 52, 80–85. [Google Scholar] [CrossRef]
  66. Moezzi, R.; Krcmarik, D.; Hlava, J.; Cýrus, J. Hybrid SLAM modelling of autonomous robot with augmented reality device. Mater. Today Proc. 2020, 32, 103–107. [Google Scholar] [CrossRef]
  67. Hübner, P.; Weinmann, M.; Wursthorn, S. Marker-based localization of the microsoft hololens in building models. Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2018, 42, 195–202. [Google Scholar] [CrossRef] [Green Version]
  68. Helin, K.; Kuula, T.; Vizzi, C.; Karjalainen, J.; Vovk, A. User experience of augmented reality system for astronaut’s manual work support. Front. Robot. AI 2018, 5, 106. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  69. Xue, H.; Sharma, P.; Wild, F. User Satisfaction in Augmented Reality-Based Training Using Microsoft HoloLens. Computers 2019, 8, 9. [Google Scholar] [CrossRef] [Green Version]
  70. Mehta, D.; Siddiqui, M.F.H.; Javaid, A.Y. Facial Emotion Recognition: A Survey and Real-World User Experiences in Mixed Reality. Sensors 2018, 18, 416. [Google Scholar] [CrossRef] [Green Version]
  71. Vidal-Balea, A.; Blanco-Novoa, O.; Picallo-Guembe, I.; Celaya-Echarri, M.; Fraga-Lamas, P.; Lopez-Iturri, P.; Azpilicueta, L.; Falcone, F.; Fernández-Caramés, T.M. Analysis, Design and Practical Validation of an Augmented Reality Teaching System Based on Microsoft HoloLens 2 and Edge Computing. Eng. Proc. 2020, 2, 52. [Google Scholar] [CrossRef]
  72. Bekele, M.K. Walkable mixed reality map as interaction interface for virtual heritage. Digit. Appl. Archaeol. Cult. Herit. 2019, 15, e00127. [Google Scholar] [CrossRef]
Figure 1. Flow diagram for selecting papers to achieve research objectives.
Figure 2. Ureteroscopy performed in a full-immersion simulation environment. (A) A nonuser’s view of the user wearing the HoloLens. (B) Simulated view of what the HoloLens user sees [3].
Figure 3. Original CTA imaging, segmentation and corresponding polygonal models. (a) CTA imaging indicating the location of the perforating arteries with yellow arrows. (b) Example of a HoloLens rendering of the segmented polygonal models [37].
Figure 4. AR-assisted navigation using HoloLens 2. (A) Participant inserts the needle while wearing HoloLens 2. (B) View of the needle insertion without AR. (C) View of the needle insertion through HoloLens 2, with a 3D model and virtual needle guide projected onto the phantom [41].
Figure 5. Conceptual diagram of an AR HMD-based rehabilitation management method that can support both therapists and patients [44] (IMU, inertial measurement unit).
Figure 6. Performing a virtual autopsy using an AR device. (Left) The setup used in this study. (Right) Simulated view seen by the participant [49].
Figure 7. Example of cutting a 3D model [51].
Figure 8. Overview of the production station and general system principle [58].
Figure 9. Interaction between two devices with a shared 3D element (Left) and the assembly process at Navantia’s Turbine workshop (Right) [60].
Figure 10. Control system for path planning of an MR-based omnidirectional mobile robot. (a) Virtual panel in Unity3D. (b) Diagram of the software and communication layout [64].
Figure 11. TSR installation in a physical mock-up of an ISS module using an AR system [68].
Figure 12. Three-dimensional model of the test classroom and the received signal power for HoloLens 2 glasses that used 2.4 GHz and 5 GHz Wi-Fi [71].
Figure 13. Number of publications related to HoloLens research by (a) year and (b) source of publications.
Figure 14. Percentage distribution and number of reviewed studies of HoloLens applications in the medical and healthcare and engineering fields.
Table 1. Analysis indicators to investigate the current status and trends of research based on HoloLens.

| Research Trend | Industry Field (Upper Level) | Industry Field (Lower Level) | Visualization Technology | Function |
|---|---|---|---|---|
| Number of papers | Medical and Healthcare | Medical auxiliary devices and systems | AR | Visualization |
| Publication year | | Medical education and simulation | MR | Interaction |
| Number of citations | Engineering | Industrial engineering | | Immersion |
| Version of HoloLens | | Architectural engineering | | |
| | | Other engineering | | |
Table 2. Summary of Microsoft HoloLens applications for medical auxiliary devices and systems.

| Reference | Year | Aim of Study | Number of Citations | HoloLens Version | Visualization Technology | Functionality |
|---|---|---|---|---|---|---|
| Hanna et al. [29] | 2018 | Evaluation of the utility of HoloLens for pathology | 95 | 1 | AR | Immersion |
| Chien et al. [30] | 2019 | Proposal of MR system for visualization of medical data | 3 | 1 | MR | Visualization |
| Al Janabi et al. [3] | 2020 | Evaluation of the utility of HoloLens in endoscopic surgery | 23 | 1 | MR | Immersion |
| Kumar et al. [31] | 2020 | Evaluation of the utility of HoloLens for visualization of medical data | 1 | 1 | MR | Interaction |
| Wu et al. [32] | 2018 | Proposal of medical data visualization method using AR | 19 | 1 | AR | Interaction |
| Nguyen et al. [33] | 2020 | Quantifying the accuracy of placement method of virtual medical objects | 4 | 1 | AR | Interaction |
| Koyachi et al. [34] | 2020 | Verification of reproducibility and accuracy of maxillary repositioning surgery | 1 | 1 | MR | Interaction |
| Mojica et al. [35] | 2017 | Proposal of holographic interface for 3D visualization of MRI data | 12 | 1 | MR | Immersion |
| Kuhlemann et al. [36] | 2017 | Development of framework for 3D hologram transformation of vascular system | 43 | 1 | AR | Interaction |
| Pratt et al. [37] | 2018 | Realization of data included in CTA video as AR | 114 | 1 | AR | Interaction |
| Jiang et al. [38] | 2020 | Accuracy evaluation of AR-based vascular localization system | 3 | 1 | AR | Visualization |
| Liebmann et al. [39] | 2019 | Evaluation of the utility of HoloLens in pedicle screw placement exploration | 34 | 1 | AR | Interaction |
| Müller et al. [40] | 2020 | Evaluation of surgical accuracy in holographic pedicle screw search | 19 | 1 | AR | Interaction |
| Park et al. [41] | 2020 | CT-guided lesion targeting using AR guidance | 3 | 2 | AR | Interaction |
| Gu et al. [42] | 2020 | Evaluation of the utility of HoloLens for glenoid drilling guides | - | 1 | AR | Visualization |
| Qian et al. [43] | 2020 | Proposal of automatic flexible endoscope control method | 1 | 1 | AR | Interaction |
| Lee et al. [44] | 2019 | Proposal of AR-based balance rehabilitation method | - | 1 | AR | Interaction |
| Condino et al. [45] | 2019 | Proposal of AR-based shoulder rehabilitation method | 7 | 1 | AR | Interaction |
| Geerse et al. [46] | 2020 | Evaluation of the utility of HoloLens for quantification of walking-related data | 3 | 1 | MR | Interaction |
Table 3. Summary of Microsoft HoloLens applications for medical education and simulation.

| Reference | Year | Aim of Study | Number of Citations | HoloLens Version | Visualization Technology | Functionality |
|---|---|---|---|---|---|---|
| Moro et al. [48] | 2020 | Evaluation of the learning effects of AR in physiology and anatomy learning | 3 | 1 | AR | Interaction |
| Bulliard et al. [49] | 2020 | Evaluation of the utility of HoloLens in virtual autopsy | 1 | 1 | AR | Interaction |
| Condino et al. [50] | 2018 | Development of training system for orthopedic open surgery using AR | 71 | 1 | MR | Immersion |
| Caligiana et al. [51] | 2020 | Digital simulation of surgical operation using HoloLens | 2 | 1 | MR | Interaction |
| Wang et al. [52] | 2017 | Development of AR-based telemedicine mentoring service | 70 | 1 | AR | Immersion |
Table 4. Summary of Microsoft HoloLens applications for industrial engineering.

| Reference | Year | Aim of Study | Number of Citations | HoloLens Version | Visualization Technology | Functionality |
|---|---|---|---|---|---|---|
| Hietanen et al. [53] | 2019 | Monitoring the safety distance between robots and workers, visualizing the movement of the robot | 22 | 1 | AR | Interaction |
| Gruenefeld et al. [54] | 2020 | Monitoring the safety distance between robots and workers, visualizing the movement of the robot | 1 | 1 | AR | Interaction |
| Al-Maeeni et al. [55] | 2020 | Proposal of a method for retrofitting production machines using AR | 1 | Unknown | AR | Interaction |
| Vorraber et al. [56] | 2020 | Proposal of AR-based and audio-only communication system for remote maintenance | 2 | 1 | AR | Visualization |
| Mourtzis et al. [57] | 2020 | Proposal of production schedule and data visualization method for production management | 1 | 1 | AR | Interaction |
| Szajna et al. [58] | 2020 | AR-based auxiliary device development for wire assembly and production process | 5 | 1 | AR | Visualization |
| Deshpande and Kim [59] | 2018 | Guide support for product assembly using AR | 21 | 1 | AR | Interaction |
| Vidal-Balea et al. [60] | 2020 | Guide support for product assembly using AR | 1 | 1 | AR | Interaction |
Table 5. Summary of Microsoft HoloLens applications for architecture and civil engineering.

| Reference | Year | Aim of Study | Number of Citations | HoloLens Version | Visualization Technology | Functionality |
|---|---|---|---|---|---|---|
| Hübner et al. [61] | 2020 | HoloLens sensor introduction and system evaluation for spatial mapping | 17 | 1 | AR | Interaction |
| Zhang et al. [62] | 2018 | Proposal of spatial mapping and data visualization method | 19 | 1 | AR | Interaction |
| Bahri et al. [63] | 2019 | Proposal of spatial mapping and data visualization method | 1 | 1 | MR | Immersion |
| Wu et al. [64] | 2020 | Proposal of spatial mapping and data visualization method | 9 | 1 | MR | Immersion |
| Moezzi et al. [65] | 2019 | Proposal of spatial mapping and data visualization method | 1 | 1 | AR | Interaction |
| Moezzi et al. [66] | 2020 | Presenting the application plan of AR in autonomous robot control | - | 1 | AR | Interaction |
| Hübner et al. [67] | 2018 | Proposal of marker-based localization technique using AR | 14 | 1 | AR | Interaction |
Table 6. Summary of Microsoft HoloLens applications for other engineering fields.

| Reference | Year | Aim of Study | Number of Citations | HoloLens Version | Visualization Technology | Functionality |
|---|---|---|---|---|---|---|
| Helin et al. [68] | 2018 | Development of AR-based system to support the work of astronauts | 13 | 1 | AR | Interaction |
| Xue et al. [69] | 2019 | Survey of AR users’ satisfaction in aeronautics and astronautics | 14 | 1 | AR | Interaction |
| Mehta et al. [70] | 2018 | Proposal of human emotion recognition method using AR device | 55 | 1 | MR | Interaction |
| Vidal-Balea et al. [71] | 2020 | Architecture proposal for real-time data processing and implementation of communication technology between AR devices | 1 | 2 | AR | Visualization |
| Bekele [72] | 2019 | Proposal of interactive and immersive maps to interact with virtual content | 7 | 1 | MR | Interaction |
Table 7. Numbers of reviewed studies by visualization types used in the medical and healthcare and engineering fields.

| Field | Subfield | Augmented Reality (AR) | Mixed Reality (MR) |
|---|---|---|---|
| Medical and Healthcare | Medical and surgical aids and systems | 13 | 6 |
| | Medical education and simulation | 3 | 2 |
| Engineering | Industry engineering | 8 | 0 |
| | Architecture and civil engineering | 5 | 2 |
| | Other engineering | 3 | 2 |
| Sum | | 32 | 12 |
Table 8. Numbers of reviewed studies by function types used in the medical and healthcare and engineering fields.

| Field | Subfield | Visualization | Interaction | Immersion |
|---|---|---|---|---|
| Medical and Healthcare | Medical and surgical aids and systems | 3 | 13 | 3 |
| | Medical education and simulation | - | 3 | 2 |
| Engineering | Industry engineering | - | 6 | 2 |
| | Architecture and civil engineering | - | 5 | 2 |
| | Other engineering | 1 | 4 | - |
| Sum | | 4 | 31 | 9 |
Park, S.; Bokijonov, S.; Choi, Y. Review of Microsoft HoloLens Applications over the Past Five Years. Appl. Sci. 2021, 11, 7259. https://0-doi-org.brum.beds.ac.uk/10.3390/app11167259