Review

Human–Machine Interface in Transport Systems: An Industrial Overview for More Extended Rail Applications

by Simon Enjalbert 1, Livia Maria Gandini 2, Alexandre Pereda Baños 3, Stefano Ricci 2,* and Frederic Vanderhaegen 1

1 Department of Automation and Control, Université Polytechnique Hauts-de-France, 59313 Valenciennes, France
2 Dipartimento di Ingegneria Civile Edile e Ambientale, Sapienza Università di Roma, 00185 Roma, Italy
3 Eurecat, Centro Tecnologico de Cataluna, 08290 Cerdanyola del Vallès, Spain
* Author to whom correspondence should be addressed.
Submission received: 3 January 2021 / Revised: 25 January 2021 / Accepted: 28 January 2021 / Published: 8 February 2021
(This article belongs to the Section Robotics, Mechatronics and Intelligent Machines)

Abstract: This paper provides an overview of Human–Machine Interface (HMI) design and command systems in commercial or experimental operation across transport modes. It presents and comments on different HMIs from the perspective of vehicle automation equipment and of simulators in different application domains. Considering the fields of cognition and automation, this investigation highlights human factors and the experiences of different industries according to industrial and literature reviews. Moreover, to better focus the objectives and extend the investigated industrial panorama, the analysis covers the most effective simulators in operation across various transport modes, used both for the training of operators and for research in the fields of safety and ergonomics. Special focus is given to new technologies that are potentially applicable in future train cabins, e.g., visual displays and haptic shared controls. Finally, a synthesis of human factors and their limits regarding support for monitoring or driving assistance is proposed.

1. Introduction

The field of automation is attracting increasing interest from researchers, and numerous milestones have been achieved which have progressively affected various application fields. This path began in the 1920s with the first radio-controlled vehicle and continues today thanks to the efforts of major vehicle manufacturers and the Tech Giants committed to designing vehicles without drivers. Companies are increasingly willing to invest in such technologies as they become aware of the advantages that the large-scale adoption of self-driving vehicles could offer them.
Studies on the user experience aboard autonomous or semi-autonomous vehicles are important as they provide insight into the question of trust in Human–Machine Interfaces (HMIs). In particular, in a semi-automated vehicle, the HMI design must give users an awareness of the driving situation so that they are able to regain control effectively if called upon to intervene; in this way, users gain awareness of the situation and confidence in the capabilities of the vehicle simultaneously. Regarding fully autonomous vehicles, studying the HMI is a way to address social fears regarding this technology and to support its introduction into the market.
The CARBODIN research project, co-funded under the Shift2Rail Joint Undertaking, aims to meet the expectations of train drivers and staff engaged with HMI automation functions in order to facilitate their progressive introduction. The focus was therefore on the identification of the required inputs and available outputs of various systems and of operator preferences. These were further investigated through dedicated surveys depicting the technology of potential HMI configurations for the design of a new driver’s cabin, as well as a virtual mock-up with immersive technology using a specific bank of sounds and gestures. The present paper describes the preliminary research work, based on an extended investigation of current advancements in the field: HMIs are categorized, their technical approaches are described and current issues are discussed.

2. Organization of State-of-the-Art Investigation

The study aims to carry out a review and analysis of HMIs, their design and the control systems currently on the market or in experimental operation. The analysis acts as a review manual on HMI technologies addressed to scholars, researchers and professionals who intend to study the subject or design their own HMI and want to analyze the technologies currently in use or under test. The investigation is original in its goal of exporting selected best practices from other transport systems to the railway field, which inspired the selection criteria detailed below.
A vast amount of research has been conducted within these fields, and this paper makes no claim to be exhaustive, narrowing its focus to the potential applicability of the results to train driving, in line with the objective of the research. Specific aspects (e.g., the effects of driver age, the management of emergency evacuations) will be the focus of subsequent research, which will include a survey of professional train drivers.
The paper is organized into sections as detailed below.
  • Automatic and semiautomatic control systems developed by manufacturers of surface vehicles outside the railway field, including trucks, cars and ships, are detailed. Without claiming universality, the selection criteria considered the potential application of automated driving functions to railway vehicles: they were required to ensure the high level of safety typical of guided transport systems and the possibility of a fast switch from autopilot to manual operation and vice versa.
  • Simulators developed for various transport systems, including rail, cars, aviation and integrated solutions, are presented. Their selection criteria were in line with those applied in Section 3, with the additional requirement of consolidated use for driver training in the respective operational fields. For rail simulation, the focus was on systems developed by the authors in previous research activities that were able to deal with operational situations, including traffic conflicts (e.g., nodes, junctions, level crossings). Some interactive functions were also imported from game simulators; however, these had a confined role due to the need to ensure safety levels compatible with strict railway requirements.
  • An extended literature review on the performance of supporting tools for driver assistance is provided. The focus is predominantly on the role of human behaviors and the extent of their consideration in various experimental applications, together with a wide review of gesture control systems as operational tools to put them into practice.
  • A synthetic analysis of the results and the corresponding conclusions is presented.

3. Systems Developed by Manufacturers of Non-Rail Surface Vehicles

3.1. Trucks

MAN PLATOONING by MAN and DB Schenker is a technical system that turns multiple vehicles into a convoy of trucks (Figure 1). The technology relies on driver assistance and control systems: every driver keeps his/her hands on the wheel, even while the truck behind reacts directly to the vehicle ahead without active driver intervention. A specially designed display provides drivers with additional information about the convoy, while camera and radar systems continuously scan the surroundings.
HIGHWAY PILOT by Daimler [2] is an autopilot mechanism for trucks. It consists of assistance and connectivity systems enabling autonomous driving on highways by adapting the speed of trucks to traffic density. Overtaking maneuvers, lane changes and exiting the highway remain the prerogative of the driver. The user interface continuously informs the driver about the activation status and manages received instructions, including activation and deactivation, meaning that the system can be overridden at any time. In addition, control is returned to the driver whenever the onboard system is no longer able to detect safety-relevant aspects, as in cases of roadworks, extreme weather conditions or an absence of lane markings.
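To make this supervisory logic concrete, the sketch below models the engage/disengage cycle as a small state machine, assuming hypothetical mode names and condition flags; it illustrates the handover principle described above, not Daimler’s implementation.

```python
from enum import Enum, auto

class Mode(Enum):
    MANUAL = auto()
    AUTOPILOT = auto()
    HANDOVER = auto()  # system has requested a driver takeover

def next_mode(mode: Mode, driver_override: bool, driver_confirms: bool,
              lane_markings_ok: bool, weather_ok: bool) -> Mode:
    """Return the next control mode given the current conditions.

    The driver can override at any time; the system requests a handover
    whenever it can no longer guarantee safe automated operation.
    """
    if driver_override:
        return Mode.MANUAL
    environment_ok = lane_markings_ok and weather_ok
    if mode is Mode.AUTOPILOT and not environment_ok:
        return Mode.HANDOVER          # e.g., roadworks, fog, missing markings
    if mode is Mode.HANDOVER:
        return Mode.MANUAL if driver_confirms else Mode.HANDOVER
    return mode

# Example: the autopilot meets a stretch of road without lane markings.
mode = Mode.AUTOPILOT
mode = next_mode(mode, driver_override=False, driver_confirms=False,
                 lane_markings_ok=False, weather_ok=True)
print(mode)  # Mode.HANDOVER -> the driver is asked to take back control
```

The key design point is that the driver override dominates every transition, mirroring the requirement that the system can be overridden at any time.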

3.2. Cars

The AUTOPILOT 2.5 equipment by Tesla [3] includes:
  • Eight cameras providing 360° visibility;
  • Twelve ultrasonic sensors for the detection of hard and soft objects at a distance and with double the accuracy of the previous system;
  • A forward-facing radar with high processing capabilities, providing additional data on the surrounding environment at wavelengths able to counteract the effects of heavy rain, fog, dust and other cars.
The autopilot function suggests when to change lanes in order to optimize the route and makes adjustments to avoid remaining behind slow vehicles. It also automatically guides the vehicle through junctions and onto highway exits based on the selected destination.
iNEXT-COPILOT by BMW (Figure 2) may be activated for autonomous driving or deactivated for traditional driving. The interior of the vehicle is readjusted according to touch or voice commands:
  • The steering wheel retracts, providing more room for passengers;
  • The pedals retract, creating a flat surface on the footwall;
  • The driver and front passenger can turn back towards other passengers in the rear seats;
  • Various displays provide information about the surrounding area.

3.3. Ships

THE FALCO by Finferries and Rolls-Royce allows the vessel to detect objects through a fusion of sensors and artificial intelligence for collision avoidance. It is the culmination of studies launched in 2008 (Figure 3) [5]; since then, its realism has improved greatly thanks to advanced sensors which provide real-time images of the surroundings with a level of accuracy higher than that of the human eye. Based on these, the vessel is able to alter course and speed automatically during the approach to a quay and to dock in a fully automated maneuver without human intervention. A unified bridge provides the operator with a functional HMI comprising control levers, touch screens for call-up and control, and logically presented information on system status. During maneuvers outside congested harbor areas, the operator can control operations remotely using a joystick or supervise them via onboard situation awareness systems. Various autonomy levels can operate selectively or in combination, depending on the vessel situation and external conditions.

4. Simulators of Transport Systems

4.1. Rail Simulators

The PSCHITT-RAIL simulator [6] (Figure 4) was designed as an additional support for research and experimentation in the field of transport and was funded by the European Union through the European Regional Development Fund (ERDF) and the Hauts-de-France Region. It aims to integrate new modular equipment through the study of driver behavior in risky situations. Its main functionalities are:
  • Immersion in a sonic and visual environment;
  • Integration between real components and a simulated environment;
  • Management of driver information.
The equipment includes an Alstom Citadis Dualis desk, a six-degrees-of-freedom motion system, OKSimRail software, five screens providing a 225° view, three eye trackers, cameras, a 6.1 audio system and scripting, synthesized into a dynamic platform by means of measurement devices such as motion capture systems, physiological sensors, etc.
SPICA RAIL, by the University of Technology of Compiègne [6], is a supervision platform that recreates real accident scenarios in the laboratory in order to analyze human behaviors and decisions in such situations. It is able to analyze and simulate an entire network by integrating traffic control functions. Moreover, to simulate crises, personnel can start from an automatic control level and progressively insert disturbances into the network.
OKTAL SYDAC simulators [7] cover solutions for trams, light rail, metros, freight (complex systems), high-speed trains, trucks, buses and airports. The full cab or desk is an exact replica of a real cab, while the compact desk simulators offer a training solution where space is limited.
The IFSTTAR-RAIL platform by IFSTTAR [8] is designed to simulate rail traffic for the assessment of European Rail Traffic Management System (ERTMS) components. It includes three subsystems:
  • A driving simulator desk used in combination with a 3D representation of tracks;
  • A traffic simulator, acting both as a single train management tool and for railway traffic control;
  • A test bench connected with onboard ERTMS equipment, in compliance with specifications and rules.

4.2. Car Simulators

IFSTTAR TS2, located in Bron and Salon-de-Provence, is a simulator of the internal and external factors influencing driver behavior and of the human–machine interface. It is capable of analyzing driver behavior relative to internal (e.g., anger, sadness) or external environmental factors and of studying driver–vehicle interactions. The instrumentation includes:
  • Sensors for the control, communication and processing of dashboard information;
  • Images of road scenes projected on five frontal screens in a visual field covering 200° × 40°;
  • A device providing rear-view images;
  • Quadraphonic sound reproducing internal (motor, rolling, starter) and external traffic noises.
NVIDIA DRIVE [9] is an open platform providing an interface that integrates environmental, vehicle and sensor models with traffic scenario and data managers. It includes two servers: a simulator, which generates output from virtual car sensors, and a computer containing the DRIVE AGX Pegasus AI car computer, which runs the complete, binary-compatible autonomous vehicle software stack. The latter processes the simulated data as if it were coming from the sensors of a car actually driving on the road: it receives the sensor data, makes decisions and sends vehicle control commands back to the simulator in a closed-loop process enabling bit-accurate, timing-accurate hardware-in-the-loop testing. The kit enables the development of Artificial Intelligence assistants for both drivers and passengers. The HMI uses data from sensors tracking the driver and the surrounding environment to keep drivers alert, anticipate passenger needs and provide insightful visualizations of journeys. The system uses deep learning networks to track head movements and gaze, and can communicate verbally via advanced speech recognition, lip reading and natural language understanding.
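A minimal sketch of such a closed simulation loop is shown below; the function names and the toy braking logic are illustrative assumptions rather than the DRIVE platform API, but they show how simulated sensor output feeds a decision stack whose commands are returned to the simulated world.

```python
def simulate_sensors(world):
    """Stand-in for the simulator server: render virtual radar output."""
    return {"radar_gap_m": world["gap"]}

def drive_stack(sensors):
    """Stand-in for the car computer: map sensor data to a command."""
    gap = sensors["radar_gap_m"]
    # Brake proportionally once the gap to the lead vehicle drops below 30 m.
    return {"brake": min(1.0, max(0.0, (30.0 - gap) / 30.0))}

world = {"gap": 50.0, "closing_speed": 2.0}
for _ in range(30):                       # closed loop, one cycle per second
    sensors = simulate_sensors(world)     # simulator -> sensor data
    command = drive_stack(sensors)        # computer  -> control command
    # The command feeds back into the simulated vehicle dynamics.
    world["closing_speed"] = max(0.0, world["closing_speed"] - 1.5 * command["brake"])
    world["gap"] -= world["closing_speed"]
print(f"gap stabilizes near {world['gap']:.1f} m")
```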
VRX-2019 by OPTIS [10,11] is a dynamic simulator whose interior proportions, shapes, placement and surfaces emphasize ergonomics and passenger comfort. From the integration of new technologies, such as driving assistance and autonomous driving, to the validation of ergonomics for future drivers and passengers, each step of the interior design has been analyzed and validated. It reproduces the feeling of a cockpit HMI and is very useful for virtual tests and for the integration of next-generation sensors before their actual release, helping to eliminate expensive real-world tests and reduce time-to-market. Through virtual displays, actuators, visual simulations, eye and finger tracking and haptic feedback, it provides a tool for full HMI evaluation. Moreover, the user can validate safety and comfort improvements for drivers and pedestrians in dangerous scenarios. Key interface features are:
  • A finger-tracking system;
  • Tactile displays and dynamic content;
  • Windshield or glasshouse reflection studies based on physically accurate reflection simulations;
  • Testing and validation of head up displays, specifying and improving optical performance and the quality of the content.

4.3. Aviation Simulators

The CAE 7000XR Series Full-Flight Simulator (FFS) surpasses the operator requirements of Level D regulations. It provides integrated access to advanced functions, such as:
  • An intuitive lesson plan builder;
  • A 3D map of flight paths with event markers;
  • Increased information density;
  • Ergonomic redesign of interiors (Figure 5).
The CAE 3000 Series helicopter flight and mission simulator [13] is helicopter-specific, designed for training in severe conditions (e.g., offshore, emergency services, high altitude), and is based on the following features:
  • A visual system with high-definition commercial projectors;
  • Up to 220° × 80° field-of-view direct projection dome, with full chin window coverage tailored to helicopter training operations.
The EXCALIBUR MP521 simulator includes a capsule with a six-axis motion system, visual and instrument displays, touch control panels and hardware for flight control. The user can enter parameters to define an aircraft through a graphical user interface (GUI). The graphics also feature airports and reconfigurable scenic elements to meet the requirements of flight training, such as runway lighting, approach aids and surroundings, with the possibility of projection on large wall screens for group lessons.
The ALSIM AL250 simulator [14] includes a high-quality compact panoramic visual display (minimum frame rate of 60 images/s) with a 250° × 49° field of view, high-definition visual systems for better weather rendering and ultra-realism, a new map display, a positioning/repositioning system, weather condition adjustment, a failure menu, and position and weather presets. Optional upset recovery training, a touch screen with a wider instructor surface and adjustable seats complete the setup.
AIRLINER is a multipurpose hybrid simulator which is able to cover the following training scenarios: Multi Crew Cooperation (MCC), Advanced Pilot Standard (APS), airline selection, preparation and skills tests, aircraft complex systems operation, Line Oriented Flight Training (LOFT), type-rating preparation and familiarization, Upset Prevention Training (UPT). It is based on the hybrid Airbus A320/Boeing B737 philosophy, with versatility to adapt flight training and interchangeable cockpit configuration.
i-VISION by OPTIS is an immersive virtual environment for the design and validation of human-centered aircraft cockpits. It is the result of a project developed and applied to accelerate the design, validation and training processes for prototype aircraft cockpits. The applied methods are also exportable to cars, trucks, boats and trains. i-VISION includes three main technological components, which were integrated and validated in collaboration with relevant industrial partners:
  • Human cockpit operations analysis module, with human factor methods demonstrated in the prototype of the project;
  • Semantic virtual cockpit, with semantic virtual scene-graph and knowledge-based reasoning of objects and intelligent querying functions, providing a semantic-based scene-graph and human task data processing and management engine;
  • Virtual cockpit design environment, with a virtual environment provided by human ergonomics evaluation software based upon the Airbus flight simulator, to develop a new general user interface for cockpits.

4.4. Integrated Simulators

MISSRAIL® and INNORAIL are tools developed entirely by the Université Polytechnique Hauts-de-France in Valenciennes [15,16,17]. They include four main modules (Figure 6): (1) railway infrastructure design, (2) route design, (3) driving cab simulation and (4) control-command room simulation. A version with an immersive helmet is also available.
MISSRAIL® is a client/server application capable of connecting different users at the same time, in several possible configurations: wired, wireless, portable or fixed. Moreover, it can be used to design various automated driving assistance tools (e.g., eco-driving assistance, collision avoidance, cruise control and vigilance control) and accident scenarios combining pedestrians, trains and cars (Figure 7) [17].

5. Support Tools for Driver Assistance

5.1. Human Factors and Their Limits

Human control engineering can use several technological means to control human factors such as workload, attention or vigilance; however, controversies exist about some of them [18]. For instance, studies have highlighted that, independently of any technology, vigilance can be improved by dieting or fasting [19], or even by chewing gum [20,21]. Furthermore, two kinds of technological support can influence the human cognitive state: passive tools, i.e., human–machine interaction supports, and active ones, which are capable of making decisions and acting accordingly. Examples include listening to music, which may improve concentration [22,23], while a dedicated decision support system can decrease workload by improving performance [24]. Moreover, due to disengagement from the driving situation under monotonous conditions, automation might lead operators to become more fatigued than they would during manual driving [25,26].
A great deal of research has reported on the utility of using physiological, behavioral, auditory and vehicle data to detect the mental state of the driver, such as presence or sleep, drowsiness or cognitive workload, with considerable accuracy [27,28,29]. Diverse parameters can provide information on targeted mental states: physiological measures include signals such as the ElectroEncephaloGram (EEG), ElectroDermal Activity (EDA), and heart rate and heart rate variability. Behavioral measures include aspects such as head direction, head movement, gaze direction, pose of the upper body, gaze dispersion, blinking, saccades, PERCLOS, pupil size, eyelid movement, postural adjustment and non-self-centered gestures. Such data may be combined in multimodal approaches with information on vehicle activity and auditory information. However, existing approaches still present clear limitations: EEG, for instance, is hardly usable in real contexts due to possible discomfort and unproven performance, as well as, in some cases, the high computational cost, which constrains implementation in real environments [30]. Note also that more traditional behavioral measures used in experimental psychology, such as the secondary task paradigm, have been shown to be quite useful in workload studies [31].
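As an illustration of how such multimodal measures can be fused, the sketch below combines a few z-scored physiological, behavioral and vehicle features into a logistic drowsiness score. The feature names, values and weights are hypothetical stand-ins for a model trained on labeled recordings.

```python
import numpy as np

# Toy multimodal feature vector for one time window (hypothetical values,
# z-scored against a per-driver baseline).
features = {
    "eeg_theta_alpha_ratio": 1.8,   # physiological: rises with drowsiness
    "heart_rate_variability": -0.9, # physiological: drops under load
    "perclos": 2.1,                 # behavioral: % of eyelid closure
    "blink_rate": 1.2,              # behavioral
    "lane_keeping_error": 1.5,      # vehicle data
}

# Hand-set weights standing in for a trained model (e.g., an SVM or a
# logistic regression fitted on labeled driving sessions).
weights = {
    "eeg_theta_alpha_ratio": 0.8,
    "heart_rate_variability": -0.5,
    "perclos": 1.0,
    "blink_rate": 0.3,
    "lane_keeping_error": 0.6,
}

score = sum(weights[k] * v for k, v in features.items())
p_drowsy = 1.0 / (1.0 + np.exp(-score))   # logistic link -> probability
print(f"estimated probability of drowsiness: {p_drowsy:.2f}")
```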
Results from studies on human factors in driving automation based on these techniques often concern questions such as how users tackle automated driving and the transitions between manual and automated control. Most such studies were motivated by the increasing prevalence of automated control in commercial and public transport vehicles, as well as by increases in the degree of automation. Notably, while automated driving significantly reduces workload, this is not the case for Adaptive Cruise Control (ACC) [32]. For instance, a study using a driving simulator and a vehicle with eye-tracking measures showed that the time required to resume control of a car is about 15 s, and up to 40 s to stabilize it [33].
Alarms comprising beeps are safer than those comprising sounds with positive or negative emotional connotations [34], and human performance can differ according to whether the interaction means involve hearing or sight [35]. Moreover, interactions with visual or audio-visual displays are more efficient than those with auditory displays only [36]. In this sense, research on multimodal perception is particularly relevant when studying the human factors of driver aid systems [37,38].
Other studies have not observed significant impacts of noise or music on human performance [39] and have even concluded that silence is able to increase attention during human disorder recovery conditions [40].
Moreover, the use of decision support systems can generate ambiguous results by leading to dissonances, affordances, contradictions or interferences with safety-critical behavior [41,42,43], potentially increasing hypo-vigilance and extending human response times [44]. For example, the well-known Head-Up Display (HUD) is useful for presenting important information without requiring drivers to move their gaze in several directions, but it is also a means of focusing attention on a reduced control area [45]. It is, therefore, a tool to prevent accidents, but it can also cause problems of focused attention.
Neuropsychological studies generally use sensors connected to the brain to assess neural activities related to cognitive processes, such as perception or problem solving. In this context, eye trackers have been demonstrated to be useful for the study of visual attention or workload via parameters such as closure percentage, blink frequency, fixation duration, etc. [46,47,48,49]. Indeed, pupil diameter increases with the demand of the performed task and with higher cognitive load [50], while an increase in physical demand has the opposite effect [51], as do external factors such as variations in ambient light, the use of drugs or strong emotions. Facial recognition is likewise incapable of detecting dissonances between expressed and felt emotions. Moreover, eye blink frequency decreases as workload increases [52,53], but increases when a secondary task is required [54,55].
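Among the behavioral measures above, PERCLOS is simply the proportion of time the eyelids are (almost) fully closed. A minimal computation over an assumed eye-tracker openness signal might look as follows.

```python
import numpy as np

def perclos(eyelid_openness, closed_threshold=0.2):
    """Fraction of samples in which the eye is at least 80% closed.

    `eyelid_openness` is a per-frame signal in [0, 1] as reported by an
    eye tracker (1 = fully open); PERCLOS is the proportion of time the
    openness falls below the threshold.
    """
    samples = np.asarray(eyelid_openness)
    return float(np.mean(samples < closed_threshold))

# One minute of 10 Hz eye-tracker output with one long eyelid closure.
rng = np.random.default_rng(0)
signal = np.clip(rng.normal(0.8, 0.1, 600), 0, 1)
signal[120:180] = 0.05      # a 6 s closure episode
print(f"PERCLOS = {perclos(signal):.2%}")   # high values flag drowsiness
```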
Eye trackers can also be used to analyze overt or covert attention: when a subject looks at a point in a scene, the analysis of the corresponding eye movement supposes that attention is focused on this point, whereas attention can also focus on other points without any eye movement [56].
Variations in heartbeat frequency correspond to variations in the level of workload, stress or emotion [57,58,59,60], but a new hypothesis considers that perceptive ability can depend on the synchronization between the frequency of dynamic events and heartbeats. Recent studies have demonstrated that flashing alarms synchronized with the heart rate can reduce the solicitation of the insula, i.e., the part of the brain dedicated to perception, and thus the ability to detect them correctly [61,62]. This performance-shaping factor based on the synchronization of dynamic events with heartbeats is relevant for human error analysis.
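One design implication of these findings is to schedule salient alarms out of phase with the heartbeat. The sketch below, under the simplifying assumption of a steady inter-beat interval estimated from recent R-peaks, places a flash midway between predicted beats; it illustrates the timing idea only, not the protocol of the cited studies.

```python
def next_asynchronous_flash(r_peaks, now):
    """Schedule an alarm flash out of phase with the predicted heartbeat.

    `r_peaks` are recent R-peak timestamps in seconds; flashing midway
    between predicted beats avoids the heart-synchronized presentation
    reported to weaken conscious detection.
    """
    intervals = [b - a for a, b in zip(r_peaks, r_peaks[1:])]
    period = sum(intervals) / len(intervals)       # mean inter-beat interval
    next_beat = r_peaks[-1] + period
    while next_beat < now:                          # roll forward to the future
        next_beat += period
    return next_beat + period / 2.0                 # off-phase, mid-cycle

beats = [0.00, 0.82, 1.63, 2.45, 3.27]             # ~73 bpm
print(f"flash at t = {next_asynchronous_flash(beats, now=3.5):.2f} s")
```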
The development of future smart tools to support driving tasks has to consider extended abilities, such as the ability:
  • To cooperate with and learn from others [63,64,65];
  • To explain results in a pedagogical way [17];
  • To discover and control dissonances between users and support tools [41,42].
Significant advances in the prediction of driver drowsiness and workload have been made through the use of more sophisticated features of physiological signals, as well as through the application of increasingly sophisticated machine learning models, although the extrapolation of these approaches to the context of commercial pilots has not yet been attempted. Some approaches have been based on the decomposition of the EDA signal into tonic and phasic components [66], the extraction of features in the time, frequency and time-frequency (wavelet-based) domains [67], or the use of signal-entropy-related features [68].
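To illustrate the tonic/phasic idea, the sketch below splits a synthetic EDA trace using a moving-median baseline and derives a few simple features; this is a deliberately crude stand-in for the model-based decompositions used in the cited work.

```python
import numpy as np

def decompose_eda(eda, fs, window_s=4.0):
    """Crude tonic/phasic split: tonic = moving median, phasic = residual."""
    half = int(window_s * fs / 2)
    padded = np.pad(eda, half, mode="edge")
    tonic = np.array([np.median(padded[i:i + 2 * half + 1])
                      for i in range(len(eda))])
    return tonic, eda - tonic

fs = 8                                           # 8 Hz EDA sampling rate
t = np.arange(0, 60, 1 / fs)
eda = 2.0 + 0.01 * t                             # slowly drifting tonic level
eda[200:220] += np.linspace(0.5, 0.0, 20)        # one phasic skin response

tonic, phasic = decompose_eda(eda, fs)
features = {
    "tonic_mean": float(tonic.mean()),           # time-domain feature
    "phasic_peak": float(phasic.max()),          # response amplitude
    "phasic_energy": float((phasic ** 2).sum()), # crude energy/entropy proxy
}
print(features)
```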
Moreover, regarding machine learning models, while the most widely used approach is the support vector machine, artificial neural networks, such as convolutional neural networks, seem to provide better performance for the detection of drowsiness and workload [69,70,71].
The combination of such approaches with multimodal data fusion has been shown to provide a very high degree of accuracy for drowsiness detection [72].
Such approaches are applicable to overcoming some of the current limitations in the detection of drowsiness and mental workload in pilots. For instance, the high accuracy accomplished with only a subset of the signals suggests that various predictive models of drowsiness and workload could be trained on different subsets of features, thereby keeping the system useful even when some specific features are momentarily unavailable (e.g., due to occlusion of the eyes or head). Recent advances can also help in the implementation of detection systems with lower computational cost, such as efficient techniques for signal filtering [73] and feature-selection methods that reduce model dimensionality and complexity [74].
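The subset-of-features idea can be sketched as follows: several minimal logistic models are trained on different feature groups so that a prediction remains available when, say, the eye features are occluded. The data, the three feature groups and the tiny gradient-descent trainer are all synthetic assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic training data: three feature groups (eye, heart, EDA) and a
# binary drowsy/alert label, standing in for real labeled recordings.
X = rng.normal(size=(200, 3))
y = (X @ np.array([1.0, 0.8, 0.6]) + rng.normal(0, 0.5, 200) > 0).astype(float)

def fit_logistic(X, y, iters=500, lr=0.1):
    """Minimal logistic regression via gradient descent."""
    w = np.zeros(X.shape[1])
    for _ in range(iters):
        p = 1 / (1 + np.exp(-X @ w))
        w -= lr * X.T @ (p - y) / len(y)
    return w

# One model per feature subset, so a prediction is still possible when,
# e.g., the eye features are unavailable due to occlusion.
subsets = {"all": [0, 1, 2], "no_eye": [1, 2], "no_heart": [0, 2]}
models = {name: fit_logistic(X[:, cols], y) for name, cols in subsets.items()}

def predict(sample, available):
    """Pick the model matching the currently available feature columns."""
    if set(available) == {0, 1, 2}:
        name = "all"
    elif 0 not in available:
        name = "no_eye"
    else:
        name = "no_heart"
    cols = subsets[name]
    return 1 / (1 + np.exp(-sample[cols] @ models[name]))

sample = rng.normal(size=3)
print(f"eyes occluded -> p(drowsy) = {predict(sample, available=[1, 2]):.2f}")
```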

5.2. Gesture Control Technology

Many technologies to control devices by gestures are already on the market. An extended, though not comprehensive, summary of them is presented below.
DEPTHSENSE CARLIB, by Sony, aims to control infotainment systems by hand movement [75].
EYEDRIVE GESTURE CONTROL by EyeLights is an infrared motion sensor that recognizes simple hand gestures while driving in order to control in-vehicle devices [76].
HAPTIX by Haptix Touch is a webcam-based environment to recognize any classical hand movement and build a virtual mouse to control screen interface [77].
KINECT by Microsoft is a web-cam based device that can capture motion and control devices with body or hand movements [78,79].
LEAP MOTION by Leap Motion Inc. (now UltraHaptics) is an environment for hand movement recognition dedicated to virtual reality. Movement detection is by infrared light, while micro-cameras detect the hands or other objects in 3D [80].
MYO BRACELET by Thalmic Labs proposes an armband to control interfaces with hand or finger movement detected via the electrical activities of activated muscles [74,81,82].
SOLI by Google comprises a mini-radar which is capable of identifying movements, from fingers to the whole body [83,84].
SWIPE by FIBARO is dedicated to home automation; it is controlled via hand motions in front of a simple, contactless tablet [85].
XPERIA TOUCH DEVICE by Sony is a smartphone application for gesture control which is capable of tracking proximate hand gestures via the phone camera [86].
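Whatever the sensing technology (camera, infrared or radar), the downstream logic ultimately classifies a hand trajectory. The sketch below shows a deliberately simple swipe detector over normalized hand x-coordinates; the thresholds and sampling rate are illustrative assumptions.

```python
def detect_swipe(x_positions, fs, min_speed=0.8, min_travel=0.15):
    """Classify a horizontal hand trajectory as a left/right swipe.

    `x_positions` are normalized hand x-coordinates (0..1) from any of
    the trackers above; a swipe is fast, mostly monotonic horizontal
    travel exceeding the distance and speed thresholds.
    """
    travel = x_positions[-1] - x_positions[0]
    duration = (len(x_positions) - 1) / fs
    speed = abs(travel) / duration
    if abs(travel) >= min_travel and speed >= min_speed:
        return "right" if travel > 0 else "left"
    return None

# 30 Hz samples of a hand moving quickly from left to right.
xs = [0.30, 0.36, 0.44, 0.53, 0.61, 0.68]
print(detect_swipe(xs, fs=30))   # -> "right"
```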
Table 1 summarizes a Strengths, Weaknesses, Opportunities and Threats (SWOT) analysis of three of the systems described above (KINECT, LEAP MOTION and MYO BRACELET), developed starting from the results of similar studies [87].

6. Analysis of Results and Conclusions

In line with its aims, this study assessed the state of the art of technologies already in use or under experimentation in order to identify suitable solutions for an innovative train driver cabin, focusing on the HMIs developed by the transport industries and on related technologies.
The study starts from systems developed by manufacturers of vehicles outside of the railway field, i.e., cars, trucks and ships, with specific regard to how they can support or substitute the driver in the context of autonomous driving.
Table 2 shows the entire set of investigated solutions applied to vehicles outside of the railway field and their relevant characteristics. Important examples, in view of potential application for trains, are:
  • HIGHWAY PILOT, which provides automation with a fallback working mode to switch back to driver responsibility;
  • iNEXT-COPILOT, which works by voice or tactile command to switch to/from automatic driving.
The maritime application THE FALCO implements full automation with a powerful HMI and shows good potential for rail applications.
Simulators represent an important starting point for the design of innovative driving functions, as well as for the safety of traffic supervision within railway systems [88,89,90] and interactions with other transport networks [91,92]. Their benefit is the possibility of testing applicability in advance, before the construction of the physical elements, obtaining information useful for the future design and reducing the cost of testing.
In this framework, IFSTTAR offers simulators that can be applied to multiple types of vehicles or adapted to different purposes, such as improved ergonomics, HMI optimization, training, visual systems, etc. Other interesting innovations have emerged from the OPTIS i-VISION project, which has provided important ideas on how to design HMIs and integrate multimodal simulators.
Table 3 shows a summary of the entire set of investigated simulators.
All of the investigated systems have reached, in their respective fields, Technology Readiness Levels (TRL) in the range of TRL7 (System prototype demonstration in operational environment), TRL8 (System complete and qualified) or TRL9 (Actual system proven in operational environment). Therefore, they appear to be highly relevant to the design of train driver cabins, where the design process is less mature and the expected TRL is variable, but in the range of TRL5 (Technology validated in relevant environment), TRL6 (Technology demonstrated in relevant environment) and TRL7 (System prototype demonstration in operational environment) [93].
Future developments will include the identification of specific performance indicators to assess the functional relevance of specific driving-related actions, selected and refined with the involvement of professional train drivers in dedicated surveys. This will take into account human factors, their constraints, reflecting on assistive technologies for driving assistance and monitoring, and the management of critical situations, such as interactions with other transport systems (e.g., level crossings) and emergency management (e.g., evacuations).

Author Contributions

All five authors contributed equally to the various sections of the paper. All authors have read and agreed to the published version of the manuscript.

Funding

This research received funding from the Shift2Rail Joint Undertaking under the European Union’s Horizon 2020 research and innovation programme, CARBODIN project, grant agreement No. 881814.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

No new data were created or analyzed in this study; data sharing is not applicable to this article.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Deng, Q. A General Simulation Framework for Modeling and Analysis of Heavy-Duty Vehicle Platooning. IEEE Trans. Intell. Transp. Syst. 2016, 17, 3252–3262. [Google Scholar] [CrossRef]
  2. Daimler. Highway Pilot. The Autopilot for Trucks. 2020. Available online: https://www.daimler.com/innovation/case/autonomous/highway-pilot-2.html (accessed on 28 January 2021).
  3. Tesla. Il Futuro Della Guida. 2020. Available online: https://www.tesla.com/it_IT/autopilot?redirect=no (accessed on 28 January 2021).
  4. Maximilan, J. 2019. Available online: https://commons.wikimedia.org/wiki/File:BMW_Vision_iNEXT_IAA_2019_JM_0166.jpg (accessed on 28 January 2021).
  5. Rekdalsbakken, W.; Styve, A. Simulation of Intelligent Ship Autopilots. In Proceedings of the 22nd European Conference on Modelling and Simulation, Nicosia, Cyprus, 3–6 June 2008. [Google Scholar]
  6. PSCHITT-Rail Collaborative. Hybrid, Intermodal Simulation Platform in Land Transport—Rail. Available online: https://www.uphf.fr/LAMIH/en/PSCHITT-Rail (accessed on 28 January 2021).
  7. OKTAL SYDAC. Conception. 2020. Available online: https://www.oktalsydac.com/en/ (accessed on 28 January 2021).
  8. IFSTTAR. Institut Français des Sciences et Technologies des Transports, de l’Aménagement et des Réseaux. 2020. Available online: https://www.ifsttar.fr/en/exceptional-facilities/simulators/ (accessed on 28 January 2021).
  9. NVIDIA DRIVE. Scalable AI Platform for Autonomous Driving. 2019. Available online: https://www.nvidia.com/en-us/self-driving-cars/drive-platform/ (accessed on 28 January 2021).
  10. Ansys. VRX Dynamic Driving Experience. 2020. Available online: https://www.ansys.com/products/systems/ansys-vrxperience (accessed on 28 January 2021).
  11. Ansys. Ansys VRXPERIENCE HMI. 2020. Available online: https://www.ansys.com/products/systems/ansys-vrxperience/hmi (accessed on 28 January 2021).
  12. Epagnoux, S. CAE Flight Simulator. 2020. Available online: https://commons.wikimedia.org/wiki/File:CAE-flight-simulator-Lockheed-Martin-Boeing-Airbus-aerospace-industry-Canada-EDIWeekly.jpg (accessed on 28 January 2021).
  13. CAE. CAE 3000 Series Flight Simulator. 2020. Available online: https://www.cae.com/civil-aviation/training-equipment-and-aviation-services/training-equipment/full-flight-simulators/cae3000/ (accessed on 28 January 2021).
  14. Alsim. Alsim Flight Training Solutions. Alsim Simulators & Technology. 2020. Available online: https://www.alsim.com/simulators (accessed on 28 January 2021).
  15. Vanderhaegen, F.; Richard, P. MissRail: A platform dedicated to training and research in railway systems. In Proceedings of the International Conference HCII, Heraklion, Greece, 22–27 June 2014; pp. 544–549. [Google Scholar]
  16. Vanderhaegen, F. MissRail® and Innorail. 2015. Available online: http://www.missrail.org (accessed on 28 January 2021).
  17. Vanderhaegen, F. Pedagogical learning supports based on human–systems inclusion applied to rail flow control. Cogn. Technol. Work 2019. [Google Scholar] [CrossRef]
  18. Vanderhaegen, F.; Jimenez, V. The amazing human factors and their dissonances for autonomous Cyber-Physical & Human Systems. In Proceedings of the First IEEE Conference on Industrial Cyber-Physical Systems, Saint-Petersburg, Russia, 15–18 May 2018; pp. 597–602. [Google Scholar]
  19. Fond, G.; MacGregor, A.; Leboyer, M.; Michalsen, A. Fasting in mood disorders: Neurobiology and effectiveness. A review of the literature. Psychiatry Res. 2013, 209, 253–258. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  20. Smith, A. Effects of chewing gum on cognitive function, mood and physiology in stressed and non-stressed volunteers. Nutr. Neurosci. 2010, 13, 7–16. [Google Scholar] [CrossRef] [PubMed]
  21. Onyper, S.V.; Carr, T.L.; Farrar, J.S.; Floyd, B.R. Cognitive advantages of chewing gum. Now you see them, now you don’t. Appetite 2011, 57, 321–328. [Google Scholar] [CrossRef] [PubMed]
  22. Mori, F.; Naghsh, F.A.; Tezuka, T. The effect of music on the level of mental concentration and its temporal change. In Proceedings of the 6th International Conference on Computer Supported Education, Barcelona, Spain, 1–3 April 2014; pp. 34–42. [Google Scholar] [CrossRef]
  23. Chtouroua, H.; Briki, W.; Aloui, A.; Driss, T.; Souissi, N.; Chaouachi, A. Relationship between music and sport performance: Toward a complex and dynamical perspective. Sci. Sports 2015, 30, 119–125. [Google Scholar] [CrossRef]
  24. Stanton, N.A.; Young, M.S. Driver behaviour with adaptive cruise control. Ergonomics 2005, 48, 1294–1313. [Google Scholar] [CrossRef] [Green Version]
  25. Schömig, N.; Hargutt, V.; Neukum, A.; Petermann Stock, I.; Othersen, I. The interaction between highly automated driving and the development of drowsiness. Procedia Manuf. 2015, 3, 6652–6659. [Google Scholar] [CrossRef] [Green Version]
  26. Vogelpohl, T.; Kühn, M.; Hummel, T.; Vollrath, M. Asleep at the automated wheel -Sleepiness and fatigue during highly automated driving. Accid. Anal. Prev. 2019, 126, 70–84. [Google Scholar] [CrossRef]
  27. Borghini, G.; Astolfi, L.; Vecchiato, G.; Mattia, D.; Babiloni, F. Measuring neurophysiological signals in aircraft pilots and car drivers for the assessment of mental workload, fatigue and drowsiness. Neurosci. Biobehav. Rev. 2014, 44, 58–75. [Google Scholar] [CrossRef]
  28. Thomas, L.C.; Gast, C.; Grube, R.; Craig, K. Fatigue detection in commercial flight operations: Results using physiological measures. Procedia Manuf. 2015, 3, 2357–2364. [Google Scholar] [CrossRef] [Green Version]
  29. Wanyan, X.; Zhuang, D.; Zhang, H. Improving pilot mental workload evaluation with combined measures. BioMed Mater. Eng. 2014, 24, 2283–2290. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  30. Pereda-Baños, A.; Arapakis, I.; Barreda-Ángeles, M. On human information processing in information retrieval (position paper). In Proceedings of the SIGIR Workshop Neuro-Physiological Methods IR, Santiago, Chile, 13 August 2015. [Google Scholar]
  31. Hensch, A.C.; Rauh, N.; Schmidt, C.; Hergeth, S.; Naujoks, F.; Krems, J.F.; Keinath, A. Effects of secondary tasks and display position on glance behavior during partially automated driving. Transp. Res. Part F Traffic Psychol. Behav. 2020, 68, 23–32. [Google Scholar] [CrossRef]
  32. De Winter, J.C.; Happee, R.; Martens, M.H.; Stanton, N.A. Effects of adaptive cruise control and highly automated driving on workload and situation awareness: A review of the empirical evidence. Transp. Res. Part F Traffic Psychol. Behav. 2014, 27, 196–217. [Google Scholar] [CrossRef] [Green Version]
  33. Merat, N.; Jamson, A.H.; Lai, F.C.; Daly, M.; Carsten, O.M. Transition to manual: Driver behaviour when resuming control from a highly automated vehicle. Transp. Res. Part F Traffic Psychol. Behav. 2014, 27, 274–282. [Google Scholar] [CrossRef] [Green Version]
  34. Di Stasi, L.L.; Contreras, D.; Cañas, J.J.; Cándido, A.; Maldonado, A.; Catena, A. The consequences of unexpected emotional sounds on driving behaviour in risky situations. Saf. Sci. 2010, 48, 1463–1468. [Google Scholar] [CrossRef]
  35. Sanderson, P.; Crawford, J.; Savill, A.; Watson, M.; Russell, W.J. Visual and auditory attention in patient monitoring: A formative analysis. Cogn. Technol. Work 2004, 6, 172–185. [Google Scholar] [CrossRef]
  36. Jakus, G.; Dicke, C.; Sodnikv, J. A user study of auditory, head-up and multi-modal displays in vehicles. Appl. Ergon. 2015, 46, 184–192. [Google Scholar] [CrossRef]
  37. Geitner, C.; Biondi, F.; Skrypchuk, L.; Jennings, P.; Birrell, S. The comparison of auditory, tactile, and multimodal warnings for the effective communication of unexpected events during an automated driving scenario. Transp. Res. Part F Traffic Psychol. Behav. 2019, 65, 23–33. [Google Scholar] [CrossRef]
  38. Salminen, K.; Farooq, A.; Rantala, J.; Surakka, V.; Raisamo, R. Unimodal and Multimodal Signals to Support Control Transitions in Semiautonomous Vehicles. In Proceedings of the 11th International Conference on Automotive User Interfaces and Interactive Vehicular Applications, Utrecht, The Netherlands, 22–25 September 2019; Association for Computing Machinery: New York, NY, USA, 2019; pp. 308–318. [Google Scholar]
  39. Dalton, B.H.; Behm, D.G. Effects of noise and music on human and task performance: A systematic review. Occup. Ergon. 2007, 7, 143–152. [Google Scholar]
  40. Prince-Paul, M.; Kelley, C. Mindful communication: Being present. Semin. Oncol. Nurs. 2017, 33, 475–482. [Google Scholar] [CrossRef]
  41. Vanderhaegen, F. Dissonance engineering: A new challenge to analyze risky knowledge when using a system. Int. J. Comput. Commun. Control 2014, 9, 750–759. [Google Scholar] [CrossRef]
  42. Vanderhaegen, F. A rule-based support system for dissonance discovery and control applied to car driving. Expert Syst. Appl. 2016, 65, 361–371. [Google Scholar] [CrossRef]
  43. Vanderhaegen, F. Towards increased systems resilience: New challenges based on dissonance control for human reliability in Cyber-Physical & Human Systems. Annu. Rev. Control 2017, 44, 316–322. [Google Scholar]
  44. Dufour, A. Driving assistance technologies and vigilance: Impact of speed limiters and cruise control on drivers’ vigilance. In Proceedings of the Seminar on the Impact of Distracted Driving and Sleepiness on Road Safety, Paris, France, 15 April 2014. [Google Scholar]
  45. JTSB. Aircraft Serious Incident—Investigation Report; Report AI2008–01; JTSB: Tokyo, Japan, 2008. Available online: https://www.mlit.go.jp/jtsb/eng-air_report/JA767F_JA8967.pdf (accessed on 28 January 2021).
  46. Galluscio, E.H.; Fjelde, K. Eye movement and reaction time measures of the effectiveness of caution signs. Saf. Sci. 1993, 16, 627–635. [Google Scholar] [CrossRef]
  47. Rosch, J.L.; Vogel-Walcutt, J.J. A review of eye-tracking applications as tools for training. Cogn. Technol. Work 2013, 15, 313–327. [Google Scholar] [CrossRef]
  48. De Winter, J.C.F.; Eisma, Y.B.; Cabrall, C.D.D.; Hancock, P.A.; Stanton, N.A. Situation awareness based on eye movements in relation to the task environment. Cogn. Technol. Work 2018, 21, 99–111. [Google Scholar] [CrossRef] [Green Version]
  49. Samima, S.; Sarma, S.; Samanta, D.; Prasad, G. Estimation and quantification of vigilance using ERPs and eye blink rate with a fuzzy model-based approach. Cogn. Technol. Work 2019, 21, 517–533. [Google Scholar] [CrossRef]
  50. Beatty, J. Task-evoked pupillary responses, processing load, and the structure of processing resources. Psychol. Bull. 1982, 91, 276–292. [Google Scholar] [CrossRef]
  51. Fletcher, K.; Neal, A.; Yeo, G. The effect of motor task precision on pupil diameter. Appl. Ergon. 2017, 65, 309–315. [Google Scholar] [CrossRef]
  52. Fogarty, C.; Stern, J.A. Eye movements and blinks: Their relationship to higher cognitive processes. Int. J. Psychophysiol. 1989, 8, 35–42. [Google Scholar] [CrossRef]
  53. Benedetto, S.; Pedrotti, M.; Minin, L.; Baccino, T.; Re, A.; Montanari, R. Driver workload and eye blink duration. Transp. Res. Part F Traffic Psychol. Behav. 2011, 14, 199–208. [Google Scholar] [CrossRef]
  54. Tsai, Y.F.; Viirre, E.; Strychacz, C.; Chase, B.; Jung, T.P. Task performance and eye activity: Predicting behavior relating to cognitive workload. Aviat. Space Environ. Med. 2007, 78, 176–185. [Google Scholar]
  55. Recarte, M.A.; Pérez, E.; Conchillo, A.; Nunes, L.M. Mental workload and visual impairment: Differences between pupil, blink, and subjective rating. Span. J. Psychol. 2008, 11, 374–385. [Google Scholar] [CrossRef] [Green Version]
  56. Findley, J.M. Visual selection, covert attention and eye movements? In Active Vision: The Psychology of Looking and Seeing; Oxford Psychology Series; Oxford University Press: Oxford, UK, 2003; pp. 35–54. [Google Scholar] [CrossRef]
  57. Taelman, J.; Vandeput, S.; Spaepen, A.; Van Huffel, S. Influence of mental stress on heart rate and heart rate variability. In 4th European Conference of the International Federation for Medical and Biological Engineering Proceedings; Springer: Berlin/Heidelberg, Germany, 2009; pp. 1366–1369. [Google Scholar] [CrossRef]
  58. Geisler, F.C.M.; Vennewald, N.; Kubiak, T.; Weber, H. The impact of heart rate variability on subjective well-being is mediated by emotion regulation. Personal. Individ. Differ. 2010, 49, 723–728. [Google Scholar] [CrossRef]
  59. Pizziol, S.; Dehais, F.; Tessier, C. Towards human operator state assessment. In Proceedings of the 1st International Conference on Application and Theory of Automation in Command and Control Systems, Barcelona, Spain, 26–27 May 2011; IRIT Press: Oxford, UK, 2011; pp. 99–106. [Google Scholar]
  60. Hidalgo-Muñoz, A.R.; Mouratille, D.; Matton, N.; Caussec, M.; Rouillard, Y.; El-Yagoubi, R. Cardiovascular correlates of emotional state, cognitive workload and time on-task effect during a realistic flight simulation. Int. J. Psychophysiol. 2018, 128, 62–69. [Google Scholar] [CrossRef] [Green Version]
  61. Salomon, R.; Ronchi, R.; Dönz, J.; Bello-Ruiz, J.; Herbelin, B.; Martet, R.; Faivre, N.; Schaller, K.; Blanke, O. The insula mediates access to awareness of visual stimuli presented synchronously to the heartbeat. J. Neurosci. 2016, 36, 5115–5127. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  62. Vanderhaegen, F.; Wolff, M.; Ibarboure, S.; Mollard, R. Heart-Computer synchronization Interface to control human-machine symbiosis: A new human availability support for cooperative systems. In Proceedings of the 14th IFAC/IFIP/IFORS/IEA Symposium on Analysis, Design, and Evaluation of Human-Machine Systems, Tallinn, Estonia, 16–19 September 2019. [Google Scholar] [CrossRef]
  63. Vanderhaegen, F. Multilevel organization design: The case of the air traffic control. Control Eng. Pract. 1997, 5, 391–399. [Google Scholar] [CrossRef]
  64. Vanderhaegen, F. Toward a model of unreliability to study error prevention supports. Interact. Comput. 1999, 11, 575–595. [Google Scholar] [CrossRef]
  65. Vanderhaegen, F. Human-error-based design of barriers and analysis of their uses. Cogn. Technol. Work 2010, 12, 133–142. [Google Scholar] [CrossRef]
  66. Dehzangi, O.; Rajendra, V.; Taherisadr, M. Wearable driver distraction identification on the road via continuous decomposition of galvanic skin responses. Sensors 2018, 18, 503. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  67. Chen, L.L.; Zhao, Y.; Ye, P.F.; Zhang, J.; Zou, J.Z. Detecting driving stress in physiological signals based on multimodal feature analysis and kernel classifiers. Expert Syst. Appl. 2017, 85, 279–291. [Google Scholar] [CrossRef]
  68. Collet, C.; Salvia, E.; Petit-Boulanger, C. Measuring workload with Electrodermal activity during common braking actions. Ergonomics 2014, 57, 886–896. [Google Scholar] [CrossRef]
  69. De Naurois, C.J.; Bourdin, C.; Stratulat, A.; Diaz, E.; Vercher, J.L. Detection and prediction of driver drowsiness using artificial neural network models. Accid. Anal. Prev. 2017, 126, 95–104. [Google Scholar] [CrossRef]
  70. Ngxande, M.; Tapamo, J.R.; Burke, M. Driver drowsiness detection using behavioral measures and machine learning techniques: A review of state-of-art techniques. In Proceedings of the 2017 Pattern Recognition Association of South Africa and Robotics and Mechatronics (PRASA-RobMech), Bloemfontein, South Africa, 30 November–1 December 2017; IEEE: Piscataway, NJ, USA, 2017; pp. 156–161. [Google Scholar]
  71. Zhao, L.; Wang, Z.; Wang, X.; Liu, Q. Driver drowsiness detection using facial dynamic fusion information and a DBN. IET Intell. Transp. Syst. 2017, 12, 127–133. [Google Scholar] [CrossRef]
  72. Lim, S.; Yang, J.H. Driver state estimation by convolutional neural network using multimodal sensor data. Electron. Lett. 2016, 52, 1495–1497. [Google Scholar] [CrossRef] [Green Version]
  73. Shukla, J.; Barreda-Ángeles, M.; Oliver, J.; Puig, D. Efficient wavelet-based artefact removal for Electrodermal activity in real-world applications. Biomed. Signal Process. Control 2018, 42, 45–52. [Google Scholar] [CrossRef]
  74. Li, J.; Cheng, K.; Wang, S.; Morstatter, F.; Trevino, R.P.; Tang, J.; Liu, H. Feature selection: A data perspective. ACM Comput. Surv. 2017, 50, 94. [Google Scholar] [CrossRef] [Green Version]
  75. Koifman, V. SoftKinetic CARlib. 2016. Available online: http://www.f4news.com/2016/06/24/softkinetic-carlib/ (accessed on 28 January 2021).
  76. Dhall, P. EyeDrive: A Smart Drive. BWCIO BUSINESSWORLD. 2019. Available online: http://bwcio.businessworld.in/article/EyeDrive-A-smart-drive-/05-07-2019-172905/ (accessed on 28 January 2021).
  77. Boulestin, R. L’Haptix Transforme Toute Surface en Interface Tactile. 2013. Available online: https://www.silicon.fr/lhaptix-transforme-toute-surface-en-interface-tactile-88560.html (accessed on 28 January 2021).
  78. Ganguly, B.; Vishwakarma, P.; Biswas, S.; Rahul, S. Kinect Sensor Based Single Person Hand Gesture Recognition for Man–Machine Interaction. Comput. Adv. Commun. Circuits Lect. Notes Electr. Eng. 2020, 575, 139–144. [Google Scholar]
  79. Saha, S.; Lahiri, R.; Konar, A. A Novel Approach to Kinect-Based Gesture Recognition for HCI Applications. In Handbook of Research on Emerging Trends and Applications of Machine Learning; IGI Global: Hershey, PA, USA, 2020; pp. 62–78. [Google Scholar]
  80. Georgiou, O.; Biscione, V.; Hardwood, A.; Griffiths, D.; Giordano, M.; Long, B.; Carter, T. Haptic In-Vehicle Gesture Controls. In Proceedings of the 9th International ACM Conference on Automotive User Interfaces and Interactive Vehicular Applications, Automotive, Oldenburg, Germany, 24–27 September 2017. [Google Scholar]
  81. He, S.; Yang, C.; Wang, M.; Cheng, L.; Hu, Z. Hand gesture recognition using MYO armband. In Proceedings of the Chinese Automation Congress, Jinan, China, 20–22 October 2017; pp. 4850–4855. [Google Scholar]
  82. Wong, A.M.H.; Furukawa, M.; Ando, H.; Maeda, T. Dynamic Hand Gesture Authentication using Electromyography (EMG). In Proceedings of the IEEE/SICE International Symposium on System Integration, Honolulu, HI, USA, 12–15 January 2020; pp. 300–304. [Google Scholar]
  83. Anderson, T. OK, Google. We’ve Got Just the Gesture for You: Hand-Tracking Project Soli Coming to Pixel 4. The Register. 2019. Available online: https://www.theregister.co.uk/2019/07/30/google_project_soli_coming_to_pixel_4/ (accessed on 28 January 2021).
  84. Raphael, J.R. Project Soli in Depth: How Radar-Detected Gestures Could Set the Pixel 4 Apart. COMPUTERWORLD. 2019. Available online: https://www.computerworld.com/article/3402019/google-project-soli-pixel-4.html (accessed on 28 January 2021).
  85. Priest, D. The Fibaro Swipe Makes Your Hand the Remote. CNET. 2016. Available online: https://www.cnet.com/reviews/fibaro-swipe-preview/ (accessed on 28 January 2021).
  86. Shankland, S. ‘Minority Report’ Gesture Control Is about to Get Very Real. CNET. 2018. Available online: https://www.cnet.com/news/sony-builds-eyesight-gesture-control-tech-into-xperia-touch/ (accessed on 28 January 2021).
  87. Zhao, L. Gesture Control Technology: An Investigation on the Potential Use in Higher Education; University of Birmingham, IT Innovation Centre: Birmingham, UK, 2016. [Google Scholar]
  88. Malavasi, G.; Ricci, S. Simulation of stochastic elements in railway systems using self-learning processes. Eur. J. Oper. Res. 2001, 131, 262–272. [Google Scholar] [CrossRef]
  89. Ricci, S.; Tieri, A. Check and forecasting of railway traffic regularity by a Petri Nets based simulation model. Ing. Ferrov. 2009, 9, 723–767. [Google Scholar]
  90. Ricci, S.; Capodilupo, L.; Tombesi, E. Discrete Events Simulation of Intermodal Terminals Operation: Modelling Techniques and Achievable Results. Civ. Comp. Proc. 2016. [Google Scholar] [CrossRef]
  91. Fang, J.; Yan, D.; Qiao, J.; Xue, J. DADA: A Large-scale Benchmark and Model for Driver Attention Prediction in Accidental Scenarios. arXiv 2019, arXiv:1912.12148. [Google Scholar]
  92. Lin, S.; Wang, K.; Yang, K.; Cheng, R. KrNet: A Kinetic Real-Time Convolutional Neural Network for Navigational Assistance. Lecture Notes in Computer Science. 2018. Available online: https://0-link-springer-com.brum.beds.ac.uk/book/10.1007/978-3-319-94274-2 (accessed on 28 January 2021).
  93. CARBODIN. Car Body Shells, Doors and Interiors. Grant Agreement n. 881814. In H2020 Shift2Rail Joint Undertaking; European Commission: Bruxelles, Belgium, 2019. [Google Scholar]
Figure 1. Concept of trucks platooning [1].
Figure 2. iNEXT-COPILOT system [4].
Figure 3. Autopilot system concept based on dynamic positioning [5].
Figure 4. PSCHITT-RAIL simulator interface.
Figure 5. Integrated CAE 7000XR flight simulator [12].
Figure 6. MISSRAIL® interface for rail simulation.
Figure 7. Train, car and pedestrian control simulation on MISSRAIL®.
Table 1. SWOT analysis of three gesture control technologies.

|  | KINECT | LEAP MOTION | MYO BRACELET |
| --- | --- | --- | --- |
| Strengths | Body motion identification; development kit | Hand and finger movement identification; low price; lightweight device; development kit | Gesture detection only for the person wearing the bracelet; lightweight device; development kit |
| Weaknesses | Operational difficulties in limited spaces; possible interference between movements and detection sensor | Deep training required; possible interference between movements and detection sensor | Limited number of identified movements; deep training required |
| Opportunities | Combining gesture control with facial or voice recognition for security purposes | Combining the use of infrared light with cameras for security purposes | Combining physiological detection (e.g., heartbeat) with gesture control for security purposes |
| Threats | Undefined gesture control intrusion recovery process | Undefined gesture control intrusion recovery process | Undefined gesture control intrusion recovery process |
Table 2. Overview of investigated systems implemented in vehicles outside the railway sector.

| Name | Operating Features | Automation Levels | Vehicles |
| --- | --- | --- | --- |
| MAN PLATOONING | Driver assistance and control | Driver always keeps hands on the wheel | Trucks |
| HIGHWAY PILOT | Autopilot | Driver can select autonomous driving manually | Trucks |
| AUTOPILOT 2.5 | Autopilot | Driver always keeps hands on the wheel | Cars |
| iNEXT-COPILOT | Easy switch to automatic mode | Driver can select autonomous driving manually or by voice command | Cars |
| THE FALCO | Full automation | No human intervention; ergonomic HMI with control levers and touch screens for call-up and control | Ships |
Table 3. Overview of investigated simulators.

| Name | Operating Features | Vehicles |
| --- | --- | --- |
| PSCHITT-RAIL | Movement capture via eye trackers and physiological measurement sensors (six-degrees-of-freedom motion) | Trains |
| SPICA RAIL | Supervision platform with increasing disturbances to evaluate human behavior | Trains |
| OKTAL SYDAC | Exact replicas of the cab for a realistic driving experience | Trains |
| IFSTTAR-RAIL | Driving and rail traffic supervision simulation | Trains |
| IFSTTAR TS2 | Impact of internal and external factors on driver behavior; fixed-base cab HMI | Cars |
| NVIDIA DRIVE | Interface among environment, vehicles and traffic scenarios by open platform sensors | Cars |
| VRX-2019 | Autonomous vehicle reproducing the cockpit HMI by advanced sensors | Cars |
| CAE 7000XR | Full flight with adaptation to operator needs and easy access to advanced functions | Aircraft |
| CAE 3000 | Helicopter flight in normal and unusual/dangerous conditions | Helicopters |
| EXCALIBUR MP521 | Flight control with graphical user interface and data editor in a capsule with six-axis motion system and visual and instrument touch displays | Aircraft |
| ALSIM | Flight training and interchangeable cockpit configuration with high-performance visual system | Aircraft |
| i-VISION | Semantic virtual cockpit design environment and validation of human-centered operations with analysis module | Aircraft |
| MISSRAIL® | Automated driving assistance combining accident scenarios including pedestrians, trains and cars with human factor control | Integrated |
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
