Shared Control Interactions via Dialog


In shared-control systems, a human operator and an automated technical system are jointly in charge of control. For this to be effective, the operator must constantly be provided with relevant information about the current system state. This is non-trivial because the state is not exclusively defined by the internal parameters of the automation (e.g., the airspeed of a plane, remaining fuel); it also depends on the specific environment of the system (e.g., weather conditions, distance to obstacles). Moreover, the behavior of the user is determined by his or her naïve theory about the system, built from a personal interpretation of information deemed relevant (world knowledge, user instructions, system feedback, perception of the environment).

Many shared-control systems are embedded in moving objects (e.g., aircraft, cars, intelligent wheelchairs), so the majority of the data the operator needs either relate directly to spatial information or refer to space indirectly. Moreover, particularly in mobile systems, the user must be able to both give and receive information about actions to be performed by the system; the issue of mode becomes crucial. Mode confusion on the part of the user means that the system and the user can no longer communicate effectively, which can have catastrophic results.

This project therefore focuses on how to present spatial information so as to improve the mode awareness of the operator, thus avoiding mode-confusion situations. The project's major emphasis will be on the speech channel, i.e., interaction via dialogs. These must enable the user to make enquiries about the system state, the system to ask for clarification in a potentially confusing situation, and both participants to reach agreement on a compatible model of the situation.
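To illustrate the mode-confusion problem, the following minimal sketch (all mode names and function names are hypothetical, not taken from the project) shows how a system might compare its actual control mode with the mode a user's command presupposes, and open a clarification sub-dialog on mismatch rather than silently executing the command:

```python
from enum import Enum

class Mode(Enum):
    """Illustrative control modes of a mobile shared-control system."""
    MANUAL = "manual"          # operator steers directly
    OBSTACLE_AVOID = "avoid"   # automation overrides near obstacles
    AUTONOMOUS = "auto"        # automation follows a planned route

def check_mode_awareness(system_mode: Mode, user_assumed_mode: Mode) -> str:
    """Compare the system's actual mode with the mode the user's
    utterance presupposes; on mismatch, initiate a clarification
    sub-dialog instead of executing the command."""
    if system_mode == user_assumed_mode:
        return "OK: execute command"
    return (f"Clarify: you seem to assume mode '{user_assumed_mode.value}', "
            f"but I am currently in mode '{system_mode.value}'.")

# A command like "turn left now" presupposes MANUAL control;
# if the automation is currently avoiding an obstacle, the system
# asks for clarification instead of acting:
print(check_mode_awareness(Mode.OBSTACLE_AVOID, Mode.MANUAL))
```

The point of the sketch is only the detection step; a real system would additionally need dialog management to resolve the mismatch, as described above.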
The components for natural language interaction adopted within the project will provide natural language capabilities for the SFB/TR as a whole; here the re-use of existing technology via defined generic inter-module interfaces will be central. The project will also explore an innovative direction in which the dialog situation is modeled in a way similar to the shared-control situation: in both, the behavior of the user (the dialog partner) is determined by naïve theories about dialog goals and system abilities (cf. project I1-[OntoSpace]). Formal methods are expected to improve the quality and robustness of shared-control systems, and likewise of natural language interaction in shared-dialog situations.
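The kind of generic inter-module interface mentioned above could, for instance, take the shape of an abstract component contract that any natural language module implements; the sketch below is purely illustrative (the class and method names are assumptions, not the project's actual interfaces):

```python
from abc import ABC, abstractmethod

class DialogModule(ABC):
    """Hypothetical generic interface a natural language component
    could expose for re-use by other projects; names are illustrative."""

    @abstractmethod
    def interpret(self, utterance: str) -> dict:
        """Map a user utterance to a module-neutral semantic frame."""

    @abstractmethod
    def generate(self, frame: dict) -> str:
        """Render a semantic frame back into natural language."""

class EchoModule(DialogModule):
    """Trivial placeholder implementation for exercising the interface."""

    def interpret(self, utterance: str) -> dict:
        return {"act": "inform", "content": utterance}

    def generate(self, frame: dict) -> str:
        return f"You said: {frame['content']}"
```

Defining such a contract is what allows existing components to be swapped in behind a stable interface, which is the re-use idea the paragraph above refers to.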

Subject Areas

Cognitive Robotics
Formal Methods
Computational Linguistics
Artificial Intelligence

May '04 I3-[SharC]