Boeing

Telesupervised Co-Robotic Systems for Remote Confined/Hazardous Space Operations

Download full paper

The assembly of major aircraft components involves ergonomically challenging tasks in confined spaces. These tasks include gauging, fastening, sealing, cleaning, coating, and inspection inside the wings, fuel tanks, and fuselage. Boeing is addressing these tasks with co-robot technologies, in which human capabilities are augmented by robotics, with the aim of producing substantial increases in productivity and worker safety.

Telerobotic technologies deploy human technicians remotely into confined or otherwise hazardous spaces, bringing the technicians' skills to the tasks but not exposing the technicians to the dangers. This can eliminate risk factors for musculoskeletal disorders, remove workers from exposure to harmful atmospheres, and reduce cognitive workload. Additional benefits include higher productivity and quality, and allowing experienced technicians to continue using their skills over a longer period while remaining healthier.

This paper describes a co-robotics telesupervision architecture, high-fidelity immersive telepresence, co-robotic telesupervision workstation, and intelligent assisting agents.  We also describe an experimental system to be built on these principles for remote operation inside the confines of a wing bay, and the formal testing methods to be used to validate this system.

Of the many forms that human-robot collaboration may take, we focus on a systemic approach to augmenting human capabilities. This includes extending human senses and reach physically; modifying human senses in scale, both geometrically and spectrally; and expanding human cognition.

Recognizing the limits of autonomy, which preclude direct leaps from the majority of human-accomplished tasks to fully automated tasks, we take a tractable approach that selectively integrates augmentation of human sensory and cognitive capabilities. Within our open architecture of human/autonomous cooperation, we support multiple layers: from direct human teleoperation, to augmented human operations, to high-level human supervision of autonomous actions.

The open telesupervision architecture, developed by Gregg Podnar (first author of the full Boeing Technical Journal article referenced below), has supported research on planetary exploration using semi-autonomous rovers for NASA's Exploration Systems Mission Directorate, and on Harmful Algal Bloom detection by semi-autonomous ocean vessels for NASA's Earth Science Technology Office. The architecture provides a framework within which co-robotic assembly and inspection systems can be continuously improved as intelligent autonomous agents are developed, proven robust, and integrated.

The primary elements of our co-robotic telesupervision architecture used for remote confined space operation are: the distal robotic sensory and manipulation tools; the proximal immersive telepresence and manipulation controls of the telesupervisor’s workstation; and the interposed intelligent assisting agents (see Figure 1).

Our design principles for a co-robotic system for internal wing bay assembly and inspection, and our testing methodology, are based on fundamental requirements:

  • Deployed robotic equipment must be physically capable of accomplishing the domain-specific tasks;
  • Deployed sensing capability must provide situation awareness with sufficient fidelity; and
  • The workstation must provide the most natural interfaces practical.

Performance experiments with the co-robotic system on wing bay assembly and inspection tasks will evaluate the effectiveness of the integrated telepresence, teleoperation, and human-augmentation technologies. This formal approach is appropriate for validating every task-specific co-robotic system developed within our telesupervision architecture.

We recognize that although workers are remote from the confined-space environment, they will be working through an interface that may itself pose ergonomic issues. Therefore, the co-robotic telesupervision workstation developed for this testing will first be assessed using a musculoskeletal disorder risk factor checklist. Interface devices and features of the co-robotic workstation include:

  • Immersive human interfaces to remote sensory systems.
  • Intuitive human controls for telerobotic actuation.
  • Geometrically-correct binocular remote vision system.
  • Binaural stereophonic remote audition system.
  • Autonomous vision agent to detect features-of-interest.
  • 3D visual overlay with mode selections.
  • Height-adjustable, positionable chair or sit/stand stool.

We will utilize measures of task performance, usability testing and workload assessment. Objective measures include task completion times, frequency and type of errors, and quality acceptance ratios (e.g., in an inspection task, how many incomplete fasteners were correctly identified).
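As an illustration, these objective measures can be aggregated across trials. The record layout, field names, and figures below are hypothetical, chosen only to show how completion time, error rate, and a quality acceptance ratio might be computed:

```python
from dataclasses import dataclass

@dataclass
class InspectionTrial:
    """One inspection trial (hypothetical record layout)."""
    completion_time_s: float   # time to finish the task, in seconds
    errors: int                # operator errors observed during the trial
    defects_present: int       # incomplete fasteners actually present
    defects_found: int         # incomplete fasteners correctly identified

def summarize(trials):
    """Aggregate the objective measures described above."""
    n = len(trials)
    mean_time = sum(t.completion_time_s for t in trials) / n
    error_rate = sum(t.errors for t in trials) / n
    # Quality acceptance ratio: fraction of real defects correctly identified.
    found = sum(t.defects_found for t in trials)
    present = sum(t.defects_present for t in trials)
    acceptance_ratio = found / present if present else 1.0
    return mean_time, error_rate, acceptance_ratio
```

Such per-trial records would let the same measures be compared directly between manual and co-robotic conditions.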

Usability evaluations using formal testing methods will take advantage of our research in usability testing with teleoperated robots, supported by NSF (IIS-0636173) and the Army Research Laboratory (ARO 103526, W911NF-06-2-0041). We use a set of measures especially suited to remote robot operations that provide an indirect indication of interface design quality, including task completion time, the number of objectives accomplished within a given period, and subjective surveys.

Workload will be measured using the NASA TLX (Task Load Index) method. Users provide subjective assessments along scales in six dimensions: mental demands (how much thinking, looking, calculating, remembering, searching), physical demands (how much pushing, pulling, turning, activating), temporal demands (how much time pressure), performance (how well did the user think they performed the task), effort (how hard did the user have to work), and frustration (how irritated, discouraged, stressed or annoyed was the user).
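For reference, a TLX score is derived from the six subscale ratings (each on a 0-100 scale). In the standard weighted variant, each dimension's weight is the number of times it is chosen across the 15 pairwise comparisons, so the weights sum to 15. A minimal sketch, with variable names of our own choosing:

```python
TLX_DIMENSIONS = ("mental", "physical", "temporal",
                  "performance", "effort", "frustration")

def raw_tlx(ratings):
    """Raw (unweighted) TLX: mean of the six 0-100 subscale ratings."""
    return sum(ratings[d] for d in TLX_DIMENSIONS) / len(TLX_DIMENSIONS)

def weighted_tlx(ratings, tally):
    """Weighted TLX: each rating is weighted by how often its dimension
    was chosen in the 15 pairwise comparisons (weights sum to 15)."""
    assert sum(tally.values()) == 15, "pairwise-comparison tally must sum to 15"
    return sum(ratings[d] * tally[d] for d in TLX_DIMENSIONS) / 15.0
```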

Situation awareness (SA) is a critical component of robot teleoperation. We will evaluate the interface conditions in terms of their contribution to enhanced operator SA, using coding schemes modified from previous research, which found that operators spent much of their time trying to determine the state of the robot (its location, configuration, and mode).

We will also perform musculoskeletal disorder risk factor assessments while operators perform tasks manually and while they use the co-robotic system. These risk data provide quantitative information to help determine whether risk differs between manual and co-robotic operation.

Parametric statistical procedures, such as a balanced multifactor repeated-measures analysis of variance (ANOVA), will be used to analyze ratio-scale data such as task performance times. Nonparametric statistical procedures will be applied to ordinal and interval-scale data, such as numerical questionnaire responses. Simple descriptive analyses will also be completed on each study group separately. Variables of interest include typical demographic characteristics.
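To make the parametric/nonparametric distinction concrete, here is a minimal sketch using SciPy (an assumption; the study could use any statistics package) for the simplest one-factor, two-level repeated-measures case. The completion times are invented for illustration; a full balanced multifactor repeated-measures ANOVA would use a dedicated routine such as those in statsmodels.

```python
from scipy import stats

# Hypothetical completion times (s) for the same six operators under
# two interface conditions, paired by operator.
manual    = [212.0, 198.5, 240.1, 225.7, 189.3, 251.0]
corobotic = [175.4, 180.2, 190.8, 171.6, 168.9, 199.5]

# Ratio-scale data: paired-samples t-test (the one-factor, two-level
# special case of a repeated-measures ANOVA).
t_stat, t_p = stats.ttest_rel(manual, corobotic)

# Ordinal data (e.g., questionnaire responses): nonparametric
# Wilcoxon signed-rank test on the same pairing.
w_stat, w_p = stats.wilcoxon(manual, corobotic)
```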

The results of this research will increase knowledge of task performance outcomes with new teleoperation techniques and technologies; identify the types of feedback most effective in assisting operators; and improve understanding of the ergonomic gains associated with new ways of performing confined-space manufacturing operations.

The ability to scale the human worker through co-robotic systems expands the design space, supporting significantly higher-performance systems, or systems for environments so challenging that they would be nearly impossible to address by conventional means.

By Mark A. Stuart

Figure 1 - Co-robotic telesupervision architecture with autonomous agents for human augmentation.



Distal Robotic Systems

Sensory and manipulation systems are deployed into confined/hazardous spaces and include teleperception sensors for binocular stereoscopic vision, binaural stereophonic audition, proprioception, and force-reflecting haptic manipulators. These are deployed using robotic vehicles or arms adapted to the tasks, spaces and access.

High-Fidelity Immersive Telepresence

Situation awareness and the sense of presence require high-fidelity acquisition and presentation of sensory and sensorimotor data. Telepresence presentation to the telesupervisor includes geometrically correct binocular stereoscopic viewing systems and high-fidelity stereophonic audio reproduction. Force-reflecting manipulation allows the teleoperator to feel into the environment.

Co-Robotic Telesupervision Workstation

By integrating high-fidelity operator interface components for mobility, manipulation and telesensing, the co-robotic telesupervision workstation becomes the hub of planning and control.

Intelligent Assisting Agents

The immersive telepresence and teleoperation data communicated between the distal robotic systems and the telesupervision workstation are thus available to intelligent assisting agents that can autonomously monitor, interpret, indicate, automate, and limit. Some high-level robotic autonomy, such as robot path planning and navigation, is relatively mature. Other capabilities, such as automated task-specific operations and system health monitoring, are less robust or must be developed per application.

Distant Human Expert Telecollaboration

A further expansion of the concept of the "intelligent assisting agent" that is supported by our architecture is the facility to provide a subset of the telepresence data to a distant human expert who has more specific domain knowledge than the telesupervisor operating through the co-robotic system. This is especially useful when an unforeseen condition is experienced for which additional expertise is required.