Interaction in Ambient Intelligence and Smart Environments

The main objective of this research direction is to study Ambient Intelligence technologies and their application in Smart Environments, following a User-Centred Design and Universal Access approach. In this context, the HCI Laboratory develops novel software development frameworks and methods, as well as ambient interactive applications and services that enable natural, intuitive, high-quality, unobtrusive, inclusive and fault-tolerant interaction between people and the intelligent environment via multiple modalities and devices. This work integrates multidisciplinary technologies such as recognition and monitoring of users' interaction with the environment, distributed processing, reasoning mechanisms, computer, sensor and actuator networks, computer vision algorithms, multimedia content design, and multimodal interaction techniques. Activities conducted in this area include:

(i) elaboration of methods and tools for user interface development in Ambient Intelligence environments;

(ii) investigation of scenarios and methodological approaches for universal access in Ambient Intelligence environments;

(iii) development of prototype applications that demonstrate the potential, added value and benefits of Ambient Intelligence technologies to (mobile) end-users;

(iv) design and development of innovative interactive public-information systems, combining state-of-the-art applied research with industrial design and art towards the creation of unique systems that blend usability, aesthetics, interaction and fun for the delivery of dynamic multimedia information. These systems can work independently but also as a coherent ensemble, providing holistic, personalized experiences. Individual systems range from hand-held mobile devices to room-sized interactive sculptures.

Additionally, the Laboratory conducts activities which aim to assess the impact of Ambient Intelligence Technologies on the individual and society as a whole, as well as to highlight the potential and the benefits of such technologies in various aspects of everyday life. Also, the Laboratory actively pursues the development of prototypical market products, as well as the transfer of technologies and know-how to the industrial sector.


Smart Box (2012): a standard cardboard box enhanced with interaction capabilities through a 3-D accelerometer and a 3-D magnetometer, which together sense the inclination and orientation of the box. The box can be used in several innovative applications, e.g. to explore virtual representations of 3-D objects, or as a steering wheel in games.
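As a rough illustration of how such a sensor pair can yield inclination and orientation (the function names and axis conventions below are hypothetical, not the Smart Box implementation): pitch and roll can be estimated from the accelerometer's reading of the gravity vector, and a compass heading from the horizontal magnetometer components.

```python
import math

def inclination_from_accel(ax, ay, az):
    """Estimate pitch and roll (degrees) from 3-axis accelerometer
    readings of the gravity vector, assuming the box is at rest."""
    pitch = math.degrees(math.atan2(-ax, math.hypot(ay, az)))
    roll = math.degrees(math.atan2(ay, az))
    return pitch, roll

def heading_from_mag(mx, my):
    """Rough compass heading (degrees) from the horizontal
    magnetometer components, assuming the box is held level."""
    return math.degrees(math.atan2(my, mx)) % 360.0
```

With the box standing on one side, gravity shifts from the z axis to the x axis and the pitch estimate moves from 0 to 90 degrees; a games application could map that angle directly onto a steering-wheel rotation.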

Hand, Feet and Body Gestures Navigation (2012): an innovative interaction technique based on human skeleton tracking, allowing users to interact through hand gestures, feet gestures, and body gestures (position and orientation).
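A minimal sketch of how skeleton-tracking output can be turned into such gestures (the joint representation and rules below are hypothetical illustrations, not the Laboratory's technique): given 3-D joint positions, simple geometric rules detect a raised hand, and the shoulder line gives body orientation.

```python
import math
from dataclasses import dataclass

@dataclass
class Joint:
    """A tracked skeleton joint in camera coordinates (metres)."""
    x: float
    y: float  # up is positive
    z: float  # distance from the sensor

def hand_raised(hand: Joint, head: Joint) -> bool:
    """Minimal rule: the hand counts as 'raised' when above the head."""
    return hand.y > head.y

def body_orientation(left_shoulder: Joint, right_shoulder: Joint) -> float:
    """Rotation of the shoulder line around the vertical axis, in
    degrees; 0 means the user squarely faces the sensor."""
    return math.degrees(math.atan2(right_shoulder.z - left_shoulder.z,
                                   right_shoulder.x - left_shoulder.x))
```

Real systems add temporal smoothing and per-user calibration on top of such rules; the point here is only that a tracked skeleton reduces gesture recognition to geometry over joint positions.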

Interaction techniques for persons with disabilities (2012): a head scanner for domotic control and a universal control wand, allowing users with severe motor impairments and users with vision disabilities to control the devices and interactive components of the surrounding environment.

Pupil (2011): a framework that facilitates the design, development and deployment of pervasive educational applications that can automatically transform according to the context of use to ensure their usability. Its collection of widgets incorporates both common basic widgets (e.g., buttons, images) and mini-interfaces frequently used in educational applications (e.g., bookViewer), as ready-to-use modules.

iTable (2010): iTable mainly targets the exploration of terrain-based information. Its main component is a plain wooden table whose surface is covered by a printed map. The map does not contain any text or other kind of data. When a visitor places a cardboard piece on the table surface, an image is projected on it, showing the area of the map located underneath the paper. Furthermore, a circled crosshair is projected at the paper's centre, along with a virtual red string connecting the paper to the closest site of interest. If the visitor moves the paper so that the site of interest lies within the boundaries of the crosshair, a multimedia slideshow starts. The slideshow comprises a series of pages, each of which may contain any combination of text, images, and video. While the cardboard piece is lying on the table, a toolbar is projected at its bottom area, containing two buttons for moving to the next/previous page. The user can interact with these "soft" buttons using her bare fingers. If the paper is lifted off the table's surface, the buttons disappear and the user can move to the next/previous page by tilting the paper right or left, respectively. In this case, the projection is appropriately distorted, so that the visual content registers correctly on the paper surface.
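The red-string and crosshair behaviour reduces to two geometric checks. The following sketch (all names, coordinates and the tolerance radius are hypothetical, not iTable's actual code) shows the idea: find the site closest to the paper's centre, and trigger the slideshow when it falls inside the crosshair.

```python
import math

CROSSHAIR_RADIUS = 0.05  # metres; hypothetical crosshair tolerance

def nearest_site(paper_centre, sites):
    """The virtual red string connects the paper to the closest
    site of interest; points are (x, y) in table coordinates."""
    return min(sites, key=lambda s: math.hypot(s[0] - paper_centre[0],
                                               s[1] - paper_centre[1]))

def slideshow_triggered(paper_centre, site, radius=CROSSHAIR_RADIUS):
    """The slideshow starts once the site lies within the circled
    crosshair projected at the paper's centre."""
    return math.hypot(site[0] - paper_centre[0],
                      site[1] - paper_centre[1]) <= radius
```

The remaining piece, warping the projection so content registers on a tilted paper, is the standard projective (homography) correction used in projector-camera systems.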

iRoom (2010): iRoom can be used for the exploration of very large-scale artifacts at real-life size, mainly targeting exhibitions and museums. It can present large-scale images of artifacts, with which one or more visitors can concurrently interact simply by walking around. The system is capable of location sensing, and also supports interaction through mobile phones and a kiosk.

iTouch (2010): a custom-made multi-touch screen that also supports interaction using three objects detected through computer vision: a magic wand, i.e., a long stick with an IR LED and a switch at its tip (when the switch is pressed against the projection screen, the LED turns on); a paper magnifying glass made of white cardboard; and an IR flashlight (which also carries a visible-light LED, used as feedback so that the user knows whether the flashlight is turned on). iTouch comes with a puzzle application.

iBlow (2010): iBlow provides an alternative to the typical information kiosks and touch screens used at museums, allowing visitors to browse item collections. The system comprises a large wooden wall on wheels (for easier transportation), two framed touch screens, a webcam, two light sensors and a windmill toy. The larger screen presents a high-resolution photo of the currently selected artifact; the smaller one presents information about the artifact and also includes some soft buttons. Item collections can be browsed through the touch screens, as well as by blowing on the windmill toy.
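One plausible way to turn a spinning windmill into a browsing command, sketched below purely as an assumption (iBlow's actual sensing pipeline, which involves light sensors and a webcam, is not described in detail here): estimate the rotation rate from the timestamps at which a blade occludes a light sensor, and treat the windmill as "blown" above a speed threshold.

```python
BLADES = 4  # hypothetical number of windmill blades

def windmill_speed(pulse_times):
    """Rotation rate (Hz) from timestamps (seconds) at which a light
    sensor is occluded by a passing blade: one pulse per blade pass."""
    if len(pulse_times) < 2:
        return 0.0
    span = pulse_times[-1] - pulse_times[0]
    return (len(pulse_times) - 1) / span / BLADES

def is_blowing(pulse_times, threshold_hz=0.5):
    """Browse to the next item while the windmill spins fast enough."""
    return windmill_speed(pulse_times) >= threshold_hz
```

Thresholding on speed rather than on single pulses makes the interaction robust to stray occlusions, e.g. a visitor's hand briefly crossing the sensor.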

Informative Art (2009): presents dynamic information in a subtle and aesthetically pleasing way, without obstructing the users' primary task. Specific information semantics are mapped to parts of an existing painting, namely "The Birth of Venus" by Sandro Botticelli. The Informative Art display initially presents a view of the original painting from which the flowers have been removed. The display tracks an e-mail account and, depending on the number and type of incoming e-mails, makes some painting elements appear (or disappear). For example, whenever a new message arrives a flower is added; messages from a list of colleagues appear as oranges on the tree; virus-infected messages appear as sharks circling Venus; and so on.
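The mapping described above is essentially a classification from message attributes to painting elements. A minimal sketch, with hypothetical message fields and function names (not the system's actual code):

```python
def classify_email(msg, colleagues):
    """Map an incoming message to the painting element it adds,
    following the mapping described above; 'msg' is a dict with
    hypothetical 'sender' and 'infected' fields."""
    if msg.get("infected"):
        return "shark"   # virus-infected mail: a shark circling Venus
    if msg["sender"] in colleagues:
        return "orange"  # colleague mail: an orange on the tree
    return "flower"      # any other new message adds a flower

def render_elements(inbox, colleagues):
    """Count how many of each element the display should show."""
    counts = {"flower": 0, "orange": 0, "shark": 0}
    for msg in inbox:
        counts[classify_email(msg, colleagues)] += 1
    return counts
```

Because each element count is a pure function of the mailbox state, the display can be redrawn from scratch on every update, which keeps the ambient visualization consistent with the inbox.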

AmIDesigner and AmIPlayer (2008): two combined tools supporting the automatic generation of accessible graphical user interfaces in AmI environments. The tools offer a simple and rapid design-and-play approach, and the generated running user interfaces integrate non-visual feedback and a scanning mechanism to support accessibility.

CAMILE (2008): an interactive application for intuitively controlling multiple sources of light in AmI environments, built so that it can be used by anyone: the young, the elderly, people with visual disabilities, and people with hand-motor disabilities alike. Control is available through multiple modalities: touch-screen interaction for sighted users without motor impairments; remote-controlled operation combined with speech for visually impaired users, or tele-operation by sighted users; switch-based scanning for motor-impaired users; and speech-based interaction for all users.

ASK-IT Home Automation Application (2008): an application which facilitates remote overview and control of the home environment through a portable device. The application's user interface can adapt according to user needs (vision and motor impairments), context of use (alternative display types and display devices), and the presence of assistive technologies (alternative input devices).

Voyager (2004): a User Interface (UI) development framework, delivered as a C++ toolkit, for developing wireless dynamically composed wearable interfaces.

Explorer (2004): a location-aware hand-held multimedia guide for museums and archaeological sites.

Projector (2004): a C++ proxy-toolkit for Java Foundation Classes with split cross-platform execution.

© Copyright 2007 FOUNDATION FOR RESEARCH & TECHNOLOGY - HELLAS, All rights reserved.