The DFKI Competence Center for Ambient Assisted Living

Jochen Frey, Christoph Stahl, Thomas Röfer, Bernd Krieg-Brückner, and Jan Alexandersson

DFKI GmbH, Saarbrücken, Germany
{jochen.frey, christoph.stahl, thomas.roefer, bernd.krieg-brueckner, jan.alexandersson}@dfki.de
Abstract. The DFKI Competence Center for Ambient Assisted Living (CCAAL) is a cross-project and cross-department virtual organization within the German Research Center for Artificial Intelligence that coordinates and conducts research and development in the area of Ambient Assisted Living (AAL). Our demonstrators range from multimodal speech dialog systems to fully instrumented environments that allow the development of intelligent assistance systems, for instance an autonomous wheelchair, or the recognition and processing of everyday activities in a smart home. These innovative technologies are then tested, evaluated and demonstrated in DFKI's living labs.
Keywords: intelligent environments, ambient assisted living, living labs
1 Vision
The DFKI Competence Center Ambient Assisted Living1 is a cross-project and cross-department virtual organization within the German Research Center for Artificial Intelligence that coordinates and conducts research and development in the area of Ambient Assisted Living. We investigate AAL from the following perspectives and underlying research areas: Cognitive Assistance (context awareness), Physical Assistance (robotics), Comfort Systems (home automation), Healthcare (telemedicine), Service Portal (multi-agent systems), and Social Connectedness (pervasive computing). Each area joins competences from one or more of the departments Intelligent User Interfaces, Safe and Secure Cognitive Systems, Agents and Simulated Reality, Institute for Information Systems, Augmented Vision, Knowledge Management, and Robotics Innovation Center. Following a holistic approach to AAL, the driving forces for research and development are primarily the users of the technology, society, and business partners. Our long-term vision is to promote an accessible intelligent environment beyond today's state of the art, based on standards-based open architectures and innovative solutions, in which everyone can continue to live a secure and autonomous life and thus play a role in society.
1 http://ccaal.dfki.de
Fig. 1. The concept of Dual Reality (vertical axis) refers to the mutual influence between the real and the virtual world. Both worlds are connected to the Universal Remote Console framework, thus keeping them in sync. User actions and device states in one world are reflected in the other world accordingly. The horizontal axis shows the virtual (top) and real (bottom) connections between the living labs in Bremen and Saarbrücken.
One of our current goals is to investigate social connectedness in the scope of AAL, as recent research in psychosomatic medicine indicates that social connectedness plays an important role in reducing the risk of a stroke or heart disease. Today, senior citizens are already adopting social networks such as Facebook to stay in contact with their family and friends and to share photos and news. We are interested in how pervasive computing, i.e., Lifton's concept of Dual Reality [4], can further contribute to connectedness in smart homes (Figure 1). Lifton connects real and virtual environments so that they can mutually reflect and influence each other. Similarly, Streitz and Wichert [10] put hybrid symmetric interaction on their research agenda, meaning that both worlds maintain consistency. In our living labs, we use the underlying URC technology (Section 3) to link the real environment with a virtual model thereof, so that actions performed in virtual reality (e.g., turning on a light) have a corresponding effect on the physical environment. Vice versa, the virtual world reflects the real lab situation. This allows status reports in social networks to be updated proactively, e.g., with activity reports. Finally, we will investigate how geographically distant environments can be joined in order to create the sensation of virtual presence in conjoined environments, i.e., to let users feel as if other family members were present, keeping an eye on them and noticing if something is wrong. As a technical foundation, we extend the URC framework so that it allows us to synchronize real and virtual realities. We design and develop our AAL labs according to our Geometry-Ontology-Activity ModeL (GOAL) methodology [9], which relates activity-based requirements to a detailed 3-D model of the environment.
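To make the Dual Reality synchronization idea more concrete, the following sketch shows one possible way such a bidirectional link could be organized. It is a minimal illustration under assumed names (World, DualRealityBridge, set_state); it is not the actual URC-based implementation used in our labs.

```python
# Minimal sketch of Dual Reality synchronization (hypothetical API, not the
# actual URC implementation): a device state change in one world is mirrored
# into the other so that both stay consistent.

class World:
    """Holds device states (e.g., lights, doors) for one 'reality'."""

    def __init__(self, name):
        self.name = name
        self.states = {}          # device id -> state value
        self.listeners = []       # callbacks notified on local changes

    def set_state(self, device, value, source="local"):
        if self.states.get(device) == value:
            return                # nothing changed, avoid update loops
        self.states[device] = value
        print(f"[{self.name}] {device} -> {value} ({source})")
        if source == "local":     # only propagate locally triggered changes
            for notify in self.listeners:
                notify(device, value)


class DualRealityBridge:
    """Keeps a real and a virtual world in sync, in both directions."""

    def __init__(self, real, virtual):
        real.listeners.append(lambda d, v: virtual.set_state(d, v, source="mirrored"))
        virtual.listeners.append(lambda d, v: real.set_state(d, v, source="mirrored"))


if __name__ == "__main__":
    real, virtual = World("real lab"), World("virtual model")
    DualRealityBridge(real, virtual)
    virtual.set_state("kitchen_light", "on")   # action in the virtual world ...
    real.set_state("sliding_door_3", "open")   # ... and in the real world
```

In the labs themselves, the role of this bridge is played by the URC framework described in Section 3, which connects both worlds as sketched in Figure 1.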
Fig. 2. Demonstrators at the Baall: (i) remote control on iPhone; (ii) instrumented flat; (iii) synchronized virtual 3-D model; (iv) autonomous wheelchair; (v) intelligent walker (photos © raumplus)
2 Ambient Assisted Living Labs
The technologies developed at DFKI are tested, evaluated and demonstrated in so-called living labs. We describe two of DFKI's fully instrumented AAL environments, the Bremen Ambient Assisted Living Lab (Baall) and the Smart Kitchen, and thereby highlight the corresponding projects and demonstrators.
2.1 Bremen Ambient Assisted Living Lab
Baall is an automated 60 m² apartment suitable for the elderly and for people with physical or cognitive impairments. It comprises all necessary conditions for trial living, intended for two persons, as shown in the virtual 3-D model in Figure 2 (iii). Baall aims to compensate for physical impairments of the user through mobility assistance and adaptable furniture, such as a height-adaptable kitchen. Accordingly, the lab has been equipped with five automated sliding doors that open to let the wheelchair pass through when activated by a user command or proactively by the intelligent environment. Furthermore, all lights in Baall are automated. In addition, the overall instrumentation in Baall can be controlled remotely via mobile devices, e.g., the iPhone (Figure 2 (i)). In the EU-funded project SHARE-it, the intelligent wheelchair Rolland [2] and the walker iWalker [7] have been developed. They provide intelligent assistance by navigating their user within the lab: they locally avoid obstacles when steered via the joystick, or drive autonomously to locations selected in a spoken natural language dialog [5].
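To illustrate the kind of proactive assistance described above, the sketch below shows how a door controller might react both to an explicit user command and to the wheelchair's planned route. The names (SlidingDoor, DoorController, on_route_planned) are hypothetical; the actual Baall control logic builds on the URC infrastructure and the wheelchair's navigation software, which are not reproduced here.

```python
# Hypothetical sketch: a sliding door opens on an explicit user command or
# proactively when the wheelchair's planned route passes through it.

class SlidingDoor:
    def __init__(self, name):
        self.name = name
        self.open = False

    def set_open(self, open_):
        if self.open != open_:
            self.open = open_
            print(f"door '{self.name}': {'open' if open_ else 'closed'}")


class DoorController:
    def __init__(self, doors):
        self.doors = {d.name: d for d in doors}

    def on_user_command(self, door_name):
        """Explicit command, e.g., issued from the mobile remote control."""
        self.doors[door_name].set_open(True)

    def on_route_planned(self, route):
        """Proactive behavior: open every known door on the wheelchair's route."""
        for door_name in route:
            if door_name in self.doors:
                self.doors[door_name].set_open(True)


controller = DoorController([SlidingDoor("bedroom"), SlidingDoor("bathroom")])
controller.on_user_command("bathroom")                 # user request
controller.on_route_planned(["bedroom", "kitchen"])    # route of the wheelchair
```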
Fig. 3. Demonstrators of the Smart Kitchen: (i) multimodal dialogue system; (ii) activity-based calendar; (iii) instrumented kitchen; (iv) synchronized virtual 3-D model
2.2 Smart Kitchen
The Smart Kitchen is a fully instrumented room that allows for realizing kitchen and living room scenarios. The Semantic Cookbook (SC) [8] application utilizes RFID sensors to recognize products and tools on the counter and provides the user with multimedia instructions for cooking recipes. The EU-funded i2home project incorporated technical and user partners from different European countries and aimed at developing access technologies around home appliances for persons with special needs by providing the notion of pluggable user interfaces [1] and implementations thereof. The configuration of the i2home system considered here contains a wide range of appliances, e.g., a television (Microsoft Windows Media Center), a Siemens Serve@Home kitchen (hood, oven, fridge, freezer, dishwasher and air conditioning), a SweetHeart blood pressure meter, a SmartLab Genie blood sugar meter and EnOcean lighting, as well as services, e.g., Google Calendar, Skype and Twitter. Figure 3 (i) depicts a multimodal user interface for interacting with these appliances by speech, by pointing gestures, or by a combination of both. A common challenge when creating intelligent user interfaces is to handle the complexity of modern appliances. The Activity Based Calendar (Figure 3 (ii)) enables the users to define and schedule predefined task models [6]. The users are thus not only reminded of a task: the calendar automatically triggers task models and assists the users by giving instructions or by automating the necessary steps, which can also involve the appliances connected to the system. The EU-funded VITAL project proposes a combination of advanced information and communication technologies that uses a familiar device like the television as the main vehicle for the delivery of services to elderly users in their home environments.
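The following sketch illustrates the general idea behind such an activity-based calendar: a scheduled entry refers to a predefined task model whose steps either instruct the user or drive connected appliances. The class and step names are invented for illustration and are not taken from the ANSI/CEA-2018 task model language [6] or the actual i2home implementation.

```python
# Hypothetical sketch of an activity-based calendar entry that triggers a
# predefined task model; steps either instruct the user or act on appliances.

from dataclasses import dataclass
from datetime import datetime
from typing import Callable, List, Optional


@dataclass
class TaskStep:
    description: str                              # instruction shown to the user
    action: Optional[Callable[[], None]] = None   # optional appliance automation


@dataclass
class TaskModel:
    name: str
    steps: List[TaskStep]

    def run(self):
        print(f"Task '{self.name}' started")
        for step in self.steps:
            print(f"  instruct user: {step.description}")
            if step.action:
                step.action()                     # automate the step if possible


@dataclass
class CalendarEntry:
    due: datetime
    task: TaskModel

    def trigger_if_due(self, now: datetime):
        if now >= self.due:
            self.task.run()                       # reminder plus automatic execution


# Example: a "measure blood pressure" task scheduled for right now.
measure = TaskModel("measure blood pressure", [
    TaskStep("Put on the blood pressure cuff"),
    TaskStep("Start the measurement",
             action=lambda: print("  appliance: blood pressure meter started")),
])
CalendarEntry(datetime.now(), measure).trigger_if_due(datetime.now())
```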
3 The OpenURC Alliance
One of the core concepts of the approaches demonstrated above is the use of standards-based open architectures. We propose the Universal Remote Console (URC) framework, which focuses on accessible and inclusive user interfaces by allowing any device or service to be accessed and manipulated by any controller [1]. The first project in Europe using URC technology was i2home, which pursued the ambitious goal of establishing an ecosystem around the industrial URC standard [3] and of introducing URC technology in the field of AAL. Since i2home started, in Europe alone, projects with a total of 120 partners and an accumulated budget of about €80 million are currently using URC technology, e.g., VITAL, MonAMI, Brainable, SmartSenior and SensHome. Therefore, we have joined an initiative to launch a global platform for exchanging experiences and ideas and, perhaps most importantly, for continuing the development of the URC standard: the OpenURC Alliance. Four working groups (governance, technical, marketing and user) are shaping the organization and are currently gathering partners from Europe and the USA.
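To give a rough impression of the URC idea of "any device, any controller", the sketch below models a target device that exposes an abstract description of its variables and commands, which any controller can then render in whatever modality suits its user. This is a simplified illustration only; the identifiers are invented and the sketch does not follow the actual ISO/IEC 24752 interfaces [3].

```python
# Simplified sketch (not the ISO/IEC 24752 API): a target exposes an abstract
# "socket" of variables and commands; any controller can list and manipulate
# them, independent of the concrete user interface used.

class TargetSocket:
    """Abstract interface of one appliance, e.g., an oven."""

    def __init__(self, name, variables, commands):
        self.name = name
        self.variables = dict(variables)   # state exposed to controllers
        self.commands = dict(commands)     # callable operations

    def set_variable(self, key, value):
        self.variables[key] = value
        print(f"{self.name}: {key} = {value}")

    def invoke(self, command):
        self.commands[command]()


class TextController:
    """One possible controller; others could use speech or a GUI instead."""

    def render(self, socket):
        print(f"Target '{socket.name}'")
        print("  variables:", ", ".join(socket.variables))
        print("  commands: ", ", ".join(socket.commands))


oven = TargetSocket(
    "oven",
    variables={"temperature": 0},
    commands={"start": lambda: print("oven: heating started")},
)
TextController().render(oven)
oven.set_variable("temperature", 180)
oven.invoke("start")
```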
References
1. J. Frey, C. Schulz, R. Nesselrath, V. Stein, and J. Alexandersson, Towards Pluggable User Interfaces for People with Cognitive Disabilities, in Proc. of the 3rd Int. Conf. on Health Informatics (HEALTHINF), 2010.
2. B. Krieg-Brückner, T. Röfer, H. Shi, and B. Gersdorf, Mobility Assistance in the Bremen Ambient Assisted Living Lab, in J. Nehmer, U. Lindenberger, E. Steinhagen-Thiessen (Eds.), Aging and Technology (special issue), GeroPsych: The Journal of Gerontopsychology and Geriatric Psychiatry, to appear 2010.
3. ISO, ISO/IEC 24752: Information Technology — User Interfaces — Universal Remote Console — 5 parts, International Organization for Standardization, 2008.
4. J. Lifton, Dual Reality: An Emerging Medium, Ph.D. Dissertation, Massachusetts Institute of Technology, 2007.
5. H. Shi, C. Jian, and B. Krieg-Brückner, Qualitative Spatial Modelling of Human Route Instructions to Mobile Robots, in Proceedings of the Third International Conference on Advances in Computer-Human Interactions, IEEE Computer Society, 2010.
6. C. Rich, Building Task-Based User Interfaces with ANSI/CEA-2018, Computer, 42(8), 20–27, 2009.
7. T. Röfer, T. Laue, and B. Gersdorf, iWalker - An Intelligent Walker Providing Services for the Elderly, in Technically Assisted Rehabilitation, 2009.
8. M. Schneider, The Semantic Cookbook: Sharing Cooking Experiences in the Smart Kitchen, in Proc. of IE-2007, Ulm, 416–423, The Institution of Engineering and Technology (IET), 2009.
9. C. Stahl, Spatial Modeling of Activity and User Assistance in Instrumented Environments, Ph.D. Thesis, Saarland University, 2009.
10. N. A. Streitz and R. Wichert, InterLink Deliverable D4.2: Road-Mapping Research in Ambient Computing and Communication Environments, 2009.