IntEnseChoI: Intelligent Environment secures Choice of Interaction (Software Campus)

May 12th, 2020  |  Published in Research

Ambient assisted living (AAL) technologies have great potential to enable an independent life for people with disabilities, such as visually impaired people, and for the elderly. However, many AAL research projects fail to deliver market-ready products. Explicitly applying Universal Design (UD) to smart home products could make them easier to bring to market. Moreover, technologies that are usable, to the greatest extent possible, by all people address a larger user group, which can help win over industrial partners to maintain and further develop them. The application of UD also benefits non-impaired users: providing multimodal communication, for instance, lets the interaction with a system adapt to personal preferences or ambient constraints.

The IntEnseChoI project aimed to demonstrate the benefits of universally designed products, covering applications in both domestic and industrial environments. First, a universally designed remote control for the smart home was developed and evaluated. Users could interact with ambient devices via gestures, keys, and voice commands. The remote control addressed heterogeneous households, including hearing-impaired and visually impaired residents. Second, the concept of a multimodal conversational agent as an instructive assistance system (CAIAS) was developed. To this end, an elicitation study was conducted to derive a gestural grammar model and a gesture set. The system provided multimodal task guidance for maintenance work, addressing machine operators and service technicians who have to cope with dirty and noisy working environments.
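
The core idea behind such a universally designed remote control, one interchangeable input vocabulary per modality mapped onto a shared set of device actions, can be illustrated with a short sketch. The following Python snippet is purely illustrative (all names and mappings are our own assumptions, not the project's code): it normalizes gesture, key, and voice events into the same smart-home commands, so every action stays reachable regardless of which modality a resident can use or prefers.

    from enum import Enum, auto
    from typing import Optional

    class Action(Enum):
        LIGHTS_ON = auto()
        LIGHTS_OFF = auto()

    # Each modality defines its own vocabulary for the shared action set
    # (hypothetical mappings; a real gesture set would come from an
    # elicitation study like the one mentioned above).
    VOCABULARIES = {
        "gesture": {"swipe_up": Action.LIGHTS_ON, "swipe_down": Action.LIGHTS_OFF},
        "key":     {"F1": Action.LIGHTS_ON, "F2": Action.LIGHTS_OFF},
        "voice":   {"lights on": Action.LIGHTS_ON, "lights off": Action.LIGHTS_OFF},
    }

    def dispatch(modality: str, token: str) -> Optional[Action]:
        """Translate a raw input event into a device action, if recognized."""
        action = VOCABULARIES.get(modality, {}).get(token)
        if action is not None:
            print(f"{modality} input {token!r} -> {action.name}")
        return action

    # The same action is reachable through every modality:
    dispatch("gesture", "swipe_up")
    dispatch("voice", "lights on")
    dispatch("key", "F1")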

Introduction of the Haptic Home Controller (H²C)

[Video: Movie Night]

The multimodal conversational agent provided visual and aural task guidance via smartphone or headphones. The user could interact with the agent by using voice commands, gestures, or an instant messaging app.
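
As a rough illustration of such multimodal task guidance, the sketch below (hypothetical step texts, gestures, and commands, not the actual CAIAS implementation) walks a user through a maintenance procedure and accepts navigation input from any channel, answering on both a visual and an aural output path.

    # Hypothetical task-guidance loop; the steps and the gesture set are
    # illustrative assumptions, not taken from the CAIAS system.
    STEPS = [
        "Power down the machine and lock the main switch.",
        "Open the maintenance hatch.",
        "Replace the filter cartridge.",
    ]

    # Gestures and chat shortcuts are normalized onto the voice commands.
    NORMALIZE = {"swipe_right": "next", "circle": "repeat", "swipe_left": "back"}

    def guide(events):
        """Yield (visual, aural) guidance for a stream of input events."""
        i = 0
        yield STEPS[i], f"Step 1: {STEPS[i]}"
        for raw in events:
            cmd = NORMALIZE.get(raw, raw)
            if cmd == "next" and i + 1 < len(STEPS):
                i += 1
            elif cmd == "back" and i > 0:
                i -= 1
            # "repeat" (or anything unrecognized) re-issues the current step
            yield STEPS[i], f"Step {i + 1}: {STEPS[i]}"

    for visual, aural in guide(["next", "circle", "next"]):
        print("screen:", visual, "| speech:", aural)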

Group picture of the IntEnseChoI research team (from left to right): Christian Fleiner, Mehdi Dado, Jonas Fütterer, Thomas Lieth, Musa Mokhtar, Sajjad Hussain, Zhebin Jiang, Haoye Chen, and Rossen Michev (not pictured)

START/END

  • 12/2018 – 09/2019

PARTNERS

  • TRUMPF Werkzeugmaschinen GmbH + Co. KG

RESEARCH TOPICS

  • Assistive Technologies
  • Computer-assisted instruction
  • Multimodal interfaces
  • User-defined gestures

CONTACT

  • Christian Fleiner (email: fleiner(at)teco.edu)
