Effective information acquisition and processing are among the key requirements in safety-critical industries. In this second part of a two-part article, SCT’s Mario Pierobon looks at how systems design can provide solutions to the air rescue cockpit crew and how human-robot interaction should be managed.
A Coherent Operational Picture
Following traditional design principles, systems engineers would typically develop a variety of additional automated functions to support the air rescue cockpit crew, according to Maiwald and Schulte[i]. “These functionalities will act more or less independently in their limited operating range without considering superior mission objectives,” the researchers say.
A practical illustration, according to Maiwald and Schulte, is when the assistant system can reliably determine that the human operator cannot carry out the most urgent task because of overload. The system then, on its own initiative, does its best to transform that situation into one the assisted operator can handle normally; in this way the workload is reduced to a manageable level.
From a system design point of view, the ever-expanding capability of the aircraft means that much supplemental information in symbolic, digital, or pictorial form has to be furnished to enable the pilot to manage the flight, especially in conditions with impoverished perceptual cues such as at high altitude, at night, and during adverse weather, according to Vidulich et al. “Relevant information now would have to be extracted from multiple sources and integrated into one coherent picture,” the researchers say[ii].
The Rise of the Robot
During the first days of flight, apart from a streamer on the wing to help pilots see the relative direction of the wind, technology did not play a crucial role in aiding spatial orientation, affirm Vidulich et al.
However, aviation is now experiencing a technology revolution in robotic systems and the related software, according to Joseph Lyons in a paper entitled ‘Being Transparent about Transparency: A Model for Human-Robot Interaction’ from the 2013 AAAI Spring Symposium[iii]. “Robotic systems in the future will likely possess greater autonomy than current systems,” says Lyons. “Robotic systems are designed for a purpose, typically to support humans during some physical or analytical processes that humans either cannot do (or do not do well) or tasks that humans do not want to do.”
From the point of view of information acquisition and processing in air rescues, the interactions between humans and robots will also likely become more complex as systems increase in their autonomy, according to Lyons. “For instance, instead of the teleoperation of contemporary military robotic systems such as Uninhabited Aerial Vehicles (UAVs), future operators will likely execute supervisory control of multiple robots. This evolution of robotic capabilities coupled with increased supervisory control from humans adds additional layers of complexity in the human-robot interaction, thus making the humans’ trust of robotic systems a key aspect of the overall human-robot system,” he says[iv].
According to Lyons, users need to understand the purpose of the robot before they can begin to analyse the actions of the robot within a particular cognitive frame. “The task model could include an understanding of a particular task, information relating to the robot’s goals at a given time, information relating to the robot’s progress in relation to those goals, information signifying an awareness of the robot’s capabilities, and awareness of errors,” he says[v].
Concerning the communication with the robots, the robot must communicate an understanding of the task at hand to the user to promote a shared awareness between the user and robot in terms of what actions need to be accomplished for a given task. As an example, one might depict the tasks associated with a search and rescue mission to require the likes of identifying the emergency location, calculating the optimal route to the search location, travelling to the search location, searching for victims, identifying life signs of victims, notifying emergency personnel, and returning to base, according to Lyons[vi].
The role of the robot is to communicate its intent in terms of what goals it is trying to accomplish for a given task, according to Lyons. “This will prove useful to the human regarding where the robot is in terms of its task sequence and why it is performing a certain action/behaviour,” he says[vii].
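To make the idea concrete, the task model and intent reporting described above can be sketched in a few lines of code. This is a minimal illustration, not an implementation from Lyons’s paper: the class, method names, and message format are all assumptions, and the task list simply mirrors the search and rescue example given earlier.

```python
# Hypothetical sketch of a transparent task model: the robot exposes its
# goals, progress toward those goals, and any errors, so the human
# supervisor can follow what it is doing and why. All names are
# illustrative, not an actual robotics API.

SAR_TASKS = [
    "identify emergency location",
    "calculate optimal route to search location",
    "travel to search location",
    "search for victims",
    "identify life signs of victims",
    "notify emergency personnel",
    "return to base",
]

class TransparentRobot:
    def __init__(self, tasks):
        self.tasks = list(tasks)
        self.current = 0   # index of the task now in progress
        self.errors = []   # awareness of errors, shared with the user

    def complete_current_task(self):
        if self.current < len(self.tasks):
            self.current += 1

    def report_error(self, description):
        self.errors.append(description)

    def report_intent(self):
        """Communicate the current goal and progress through the mission."""
        done, total = self.current, len(self.tasks)
        if done >= total:
            return f"Mission complete ({total}/{total} tasks done)."
        return (f"Current goal: {self.tasks[done]} "
                f"(task {done + 1} of {total}; {done} completed).")

robot = TransparentRobot(SAR_TASKS)
robot.complete_current_task()   # emergency location identified
robot.complete_current_task()   # route calculated
print(robot.report_intent())
# → Current goal: travel to search location (task 3 of 7; 2 completed).
```

The point of the sketch is the shared frame: because the human and the robot refer to the same ordered task list, a single intent message tells the supervisor both what the robot is doing and where it stands in the mission.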
[i] Felix Maiwald and Axel Schulte, ‘Workload Prediction and Estimation of Human Mental Resources in Helicopter Emergency Medical Service Missions’, in proceedings of the 2014 IEEE International Conference on Systems, Man, and Cybernetics.
[ii] Michael A. Vidulich, Christopher D. Wickens, Pamela S. Tsang and John M. Flach, ‘Information Processing in Aviation’ (2010) in ‘Human Factors in Aviation’, edited by Eduardo Salas and Dan Maurino.
[iii] Joseph B. Lyons, ‘Being Transparent about Transparency: A Model for Human-Robot Interaction’, 2013 AAAI Spring Symposium.