9:00 - 9:30 | Registration & Coffee
9:30 - 9:45 | Opening (Prof. Dr. Bodo Urban & Prof. Dr. Thomas Kirste)
9:45 - 10:45 | Keynote: Peripheral On-Body Interaction with Wearables for Sports and Leisure Activities (Prof. Dr. Antonio Krüger)
10:45 - 11:00 | Coffee Break
11:00 - 12:15 | Session #1: Development of Wearable Technology (Chair: Denys J.C. Matthies)
SmartMove: A Smartwatch Algorithm to Distinguish Between High- and Low-Amplitude Motions as well as Doffed-States by Utilizing Noise and Sleep (Marian Haescher, John Trimpop, Gerald Bieber and Bodo Urban)
In this paper, we describe a self-adapting algorithm for smartwatches to define individual transitions between motion intensities. The algorithm enables a distinction between high-amplitude motions (e.g. walking, running, or simply moving the extremities), low-amplitude motions (e.g. human microvibrations and heart rate), as well as a general doffed state. A prototypical implementation for detecting all three motion types was tested with a wrist-worn acceleration sensor. Since the aforementioned motion types are user-specific, SmartMove incorporates a training module based on a novel actigraphy-based sleep detection algorithm in order to learn the specific motion types. In addition, our proposed sleep algorithm reduces power consumption since it samples at a very low rate. Furthermore, the algorithm can identify suitable timeframes for an inertial sensor-based detection of vital signs (e.g. seismocardiography or ballistocardiography).
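The distinction the abstract describes can be illustrated with a minimal sketch: classify a window of wrist accelerometer samples by the standard deviation of their magnitude. The thresholds below are hypothetical placeholders; SmartMove learns user-specific values via its sleep-based training module rather than hard-coding them.

```python
import math

# Hypothetical thresholds (in g); SmartMove learns user-specific values
# from its sleep-detection training phase instead of fixing them like this.
DOFFED_MAX = 0.002   # near the sensor noise floor: watch not worn
LOW_AMP_MAX = 0.05   # microvibrations, heart-rate-induced motion

def window_std(samples):
    """Standard deviation of acceleration magnitude over one window."""
    mags = [math.sqrt(x * x + y * y + z * z) for x, y, z in samples]
    mean = sum(mags) / len(mags)
    return math.sqrt(sum((m - mean) ** 2 for m in mags) / len(mags))

def classify(samples):
    """Map one window of (x, y, z) samples to a motion-intensity class."""
    s = window_std(samples)
    if s < DOFFED_MAX:
        return "doffed"
    if s < LOW_AMP_MAX:
        return "low-amplitude"
    return "high-amplitude"
```

A perfectly still sensor (constant gravity vector) falls below the noise floor and is classified as doffed, while a worn but resting watch shows small magnitude fluctuations in the low-amplitude band.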
WeaRelaxAble: A Wearable System to Enhance Stress Resistance using Various Kinds of Feedback Stimuli (Josephin Klamet, Denys J.C. Matthies and Michael Minge)
This paper introduces a wearable feedback device that aims at relaxing the user in stressful situations. The system, called WeaRelaxAble, provides various feedback modalities, such as vibration, ambient light, acoustic stimuli and heat, in order to reduce the user's stress level. The development of WeaRelaxAble is based on two studies: first, all five kinds of feedback and appropriate body positions for stimulation were evaluated with 15 participants. Based on the findings of this initial study, we built a wearable Arduino prototype to prove the feasibility of our concept. The experience of using the system was tested with 26 test subjects under laboratory conditions. We conclude with a concept design of a wrist-worn device that provides acoustic and visual feedback. For tactile stimulation, a shirt would provide vibration at the shoulders as well as heat at the loins. Users can explicitly activate the system at any time and with any combination of feedback modalities.
okinesio – The Development of Open Hardware for Quantified Self (Michael Zöllner, Andreas Zapf, Nhan Duc Truong and Stefanie Scheer)
In this paper we describe the development process of open hardware for quantified self applications. We briefly describe the methods and results of an evaluation of a range of top-selling activity trackers regarding accuracy, underlying hardware sensors, user experience and data accessibility. The main part of the paper deals with the hardware development process of the okinesio board, its production and the lessons learned.
12:15 - 13:30 | Lunch Break
13:30 - 14:45 | Session #2: Improving Activity and Gesture Recognition (Chair: Marian Haescher)
Coping with Variability in Motion-Based Activity Recognition [Best Paper] (Matthias Kreil, Bernhard Sick and Paul Lukowicz)
A key issue in the automatic recognition of human activities with body-worn sensors is the variability of human motions and the huge space of possibilities for executing even fairly simple actions. In this article we introduce a new algorithm to address this issue. The core idea is that even highly variable actions often include short, more or less invariant parts which are due to hard physical constraints. The aim is to develop a method that can identify such invariants and use them to improve the classification of the respective activities. The method is meant to be combined with existing classification approaches in an ensemble-like fashion, being applied only to the classes for which appropriate invariants can be found and leaving the other classes to be handled by classical methods. We compare our method's results to prior publications on two well-known data sets and are able to improve the classification in 5 of 23 and 4 of 19 classes, respectively, in some cases by a large margin (the best case improves from 27% to 76% in the first and from 50% to 64% in the second). In each data set there is only one class for which we make the recognition worse, and in both cases it is one with poor results to start with and a relatively small decrease (from 54% to 45% in the first and from 65% to 62% in the second). The results are achieved for a user-independent case.
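The invariant idea can be illustrated with a toy sketch: given several executions of the same action (simplified here to equal-length 1-D sequences), find the window whose values vary least across executions; hard physical constraints would show up as such low-variance stretches. This is an illustration only, not the paper's actual algorithm.

```python
import statistics

# Toy invariant search: across several executions of one action, find the
# start index of the window whose values vary least between executions.
# Assumes equal-length, pre-aligned sequences for simplicity.
def least_variable_window(instances, width):
    length = len(instances[0])

    def spread(start):
        # Sum of per-timestep variances across all executions in the window.
        return sum(statistics.pvariance([inst[t] for inst in instances])
                   for t in range(start, start + width))

    return min(range(length - width + 1), key=spread)
```

For three executions `[1, 5, 5, 2]`, `[3, 5, 5, 9]`, `[7, 5, 5, 4]` with window width 2, the constant middle stretch (start index 1) is identified as the invariant part.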
Activity Recognition for Physical Therapy: Fusing Signal Processing Features and Movement Patterns (Lieven Billiet, Thijs Swinnen, Rene Westhovens, Kurt de Vlam and Sabine Van Huffel)
This paper discusses the classification of activities in the context of physical therapy. Usually, specific standardized activities are subjectively assessed, often by means of a patient-reported questionnaire, to estimate a patient's activity capacity, defined as the ability to execute a task. Automatic recognition of these activities is of vital importance for a more objective and quantitative approach to the problem. The proposed accelerometry-based algorithm merges standard signal processing features with information obtained from direct activity pattern matching using dynamic time warping (DTW) in a linear model. This study with 28 spondyloarthritis patients performing 10 activities shows the improvement in activity classification accuracy due to the fusion of the two approaches, up to 93.6%. This is a significant increase compared to similar models based on either of the approaches alone (p < 0.01). Although this paper mainly contributes to the activity recognition step, it also briefly discusses the advantages of the approach with regard to further automated evaluation of the recognized activities.
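The pattern-matching component the abstract refers to can be sketched with a minimal DTW distance between two 1-D sequences; the paper's feature fusion and linear model are not reproduced here.

```python
# Minimal dynamic time warping (DTW) distance between two 1-D sequences.
# DTW aligns sequences that are locally stretched or compressed in time,
# which plain point-by-point distances cannot handle.
def dtw_distance(a, b):
    n, m = len(a), len(b)
    inf = float("inf")
    # cost[i][j] = DTW distance between prefixes a[:i] and b[:j]
    cost = [[inf] * (m + 1) for _ in range(n + 1)]
    cost[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = abs(a[i - 1] - b[j - 1])
            cost[i][j] = d + min(cost[i - 1][j],      # insertion
                                 cost[i][j - 1],      # deletion
                                 cost[i - 1][j - 1])  # match
    return cost[n][m]
```

A sequence compared with a time-stretched copy of itself, e.g. `dtw_distance([1, 2, 3], [1, 1, 2, 2, 3, 3])`, yields distance 0, which is exactly the tolerance to execution-speed variability that makes DTW useful for matching activity patterns.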
CapTap - Combining Capacitive Gesture Recognition and Acoustic Touch Detection (Andreas Braun, Sebastian Zander-Walz, Stefan Krepp, Silvia Rus, Reiner Wichert and Arjan Kuijper)
Capacitive sensing is a common technology for finger-controlled touch screens. A variety of proximity sensors extends its range, thus supporting mid-air gesture interaction and application below any non-conductive material. However, this comes at the cost of limited resolution for touch detection. In this paper, we present CapTap, which uses capacitive proximity and acoustic sensing to create an interactive surface that combines mid-air and touch gestures while being invisibly integrated into living room furniture. We introduce capacitive imaging, investigating the use of computer vision methods to track hand and arm positions, and present several use cases for CapTap. In a user study we found that the system has average localization errors of 1.5 cm at touch distance and 5 cm at an elevation of 20 cm above the table. The users found the system intuitive and interesting to use.
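A common first step in localizing a hand over a grid of capacitive proximity sensors is a weighted centroid over the sensor readings. The layout and values below are illustrative, not CapTap's actual processing pipeline.

```python
# Weighted-centroid localization over a sensor grid: each sensor's reading
# weights its position, so the estimate is pulled toward strong responses.
# Sensor positions and readings here are illustrative assumptions.
def weighted_centroid(readings):
    """readings: list of ((x, y) sensor position, reading value) pairs."""
    total = sum(v for _, v in readings)
    x = sum(px * v for (px, _), v in readings) / total
    y = sum(py * v for (_, py), v in readings) / total
    return x, y

# Four sensors at the corners of a unit square; the hand is nearer
# the right edge, so the two right-hand sensors respond more strongly.
pos = weighted_centroid([((0, 0), 1.0), ((1, 0), 3.0),
                         ((0, 1), 1.0), ((1, 1), 3.0)])  # → (0.75, 0.5)
```

The estimate lands between sensors rather than snapping to the strongest one, which is what allows sub-sensor-pitch localization from a coarse grid.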
14:45 - 15:15 | Coffee Break
15:15 - 16:30 | Session #3: Domain Specific Activity Recognition (Chair: Ahmed R. Sadik)
Towards a Situation Model for Assessing Challenging Behaviour of People with Dementia (Kristina Yordanova, Sebastian Bader, Christina Heine, Stefan Teipel and Thomas Kirste)
With the increase of the elderly population, the percentage of people suffering from dementia also increases. Typically, patients with dementia are cared for at home by family members. The task of caregiving is associated with significant psychological and physical stress that affects both the caregiver and the person with dementia. One solution to improving the task of caregiving is to provide an assistive system that is able to automatically recognise when challenging behaviour is exhibited and to provide suggestions for appropriate intervention strategies. To achieve that, however, the system needs a situation model that provides the context information required to recognise the type of behaviour and to reason about its causes. To address this problem, this paper performs a systematic analysis of the elements needed for building a situation model for assessing the behaviour of people with Alzheimer's disease (AD). The analysis consists of a literature review, interviews with experts, and brainstorming sessions. As a result, the work proposes a concept of a situation model for assessing the challenging behaviour of people with dementia based on sensor observations.
AGIS: Automated Tool Detection & Hand-Arm Vibration Estimation using an unmodified Smartwatch (Denys J.C. Matthies, Gerald Bieber and Uwe Kaulbars)
It has been known for over three decades that long-lasting and intense hand-arm vibrations (HAV) can cause serious diseases, such as Raynaud's syndrome / white finger syndrome. To protect workers nowadays, the long-term use of tools such as drills, grinders, rotary hammers etc. is subject to strict legal regulations. However, users rarely comply with these regulations because it is quite hard to manually estimate vibration intensity throughout the day. Therefore, we propose a wearable system that automatically tracks the daily HAV exposure dose, which becomes possible because we are able to determine the currently used tool. With the implementation of AGIS, we demonstrate the technical feasibility of using the integrated microphone and accelerometer of a commercial smartwatch. In contrast to prior work, our approach requires neither a technical modification of the smartwatch nor an instrumentation of the environment or the tool. A pilot study shows our proof-of-concept to be applicable in real workshop environments.
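Once the tool in use and its trigger time are known, the regulated quantity is the daily vibration exposure A(8) from ISO 5349-1, normalized to an eight-hour reference period. A sketch of that calculation follows; the vibration magnitudes in the example are purely illustrative, as real values come from measurement or tool datasheets.

```python
import math

T0_HOURS = 8.0  # reference duration for the A(8) daily exposure value

def daily_exposure_a8(uses):
    """Daily vibration exposure A(8) per ISO 5349-1.

    uses: list of (a_hv, t) pairs, where a_hv is the vibration total
    value of a tool in m/s^2 and t its trigger (exposure) time in hours.
    A(8) = sqrt(sum(a_hv_i^2 * t_i) / T0).
    """
    return math.sqrt(sum(a * a * t for a, t in uses) / T0_HOURS)

# Illustrative magnitudes only: a drill used 2 h, a grinder used 1 h.
exposure = daily_exposure_a8([(5.0, 2.0), (3.0, 1.0)])
print(round(exposure, 2))  # → 2.72
```

The result can then be compared against the EU exposure action value of 2.5 m/s² and the exposure limit value of 5 m/s², which is exactly the compliance check the abstract says workers rarely perform by hand.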
Tool Support for the Online Annotation of Sensor Data (Max Schröder, Kristina Yordanova, Sebastian Bader and Thomas Kirste)
The labelling of sensor data with information about the real world, e.g. the activity a human performs, is called annotation. In this paper we analyse existing live annotation systems and derive the requirements for a general annotation approach. Based on these requirements, we propose tool support that uses a database schema for the labelling of sensor data. Instead of video logs, which are commonly used to add this information after recording, our approach aims at online annotation of the sensor data, i.e. at the moment the observation is made in the real world. Our database schema enables the automatic generation of a user interface that can be used by (human) observers. In contrast to many existing annotation tools, our schema includes the possibility to define callback functions that may be used to check the semantic correctness of the annotation. We show that our approach works with a generated online annotation system in a home environment.
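The flavor of a schema-driven live annotation store can be sketched with SQLite: labels are written as observations happen, and a constraint stands in for the semantic-correctness callbacks the abstract mentions. The table layout and field names below are hypothetical, not the paper's actual schema.

```python
import sqlite3

# Hypothetical minimal schema in the spirit of the paper's approach:
# labels are inserted live, and the CHECK constraint plays the role of a
# semantic-correctness callback (here: an end time must not precede its
# start time). Column names are illustrative assumptions.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE annotation (
        id       INTEGER PRIMARY KEY,
        subject  TEXT NOT NULL,
        label    TEXT NOT NULL,
        start_ts REAL NOT NULL,
        end_ts   REAL,
        CHECK (end_ts IS NULL OR end_ts >= start_ts)
    )
""")

def annotate(subject, label, start_ts, end_ts=None):
    """Record one live annotation; invalid intervals are rejected."""
    conn.execute(
        "INSERT INTO annotation (subject, label, start_ts, end_ts)"
        " VALUES (?, ?, ?, ?)",
        (subject, label, start_ts, end_ts),
    )

annotate("p01", "prepare coffee", 12.5, 48.0)
```

An insert with `end_ts < start_ts` raises `sqlite3.IntegrityError`, mirroring how a callback would flag a semantically invalid label at annotation time rather than after the recording session.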
17:30 - 18:30 | Get Together on a Boat
18:30 - 21:00 | Best Paper Banquet
10:00 - 11:00 | Invited Talk: Designing Embedded MEMS-Based Activity Recognition Systems (Prof. Dr. Christian Haubelt)
11:00 - 11:15 | Coffee Break
11:15 - 12:45 | Workshop Part I: Demonstration & Exhibition (Chair: Bodo Urban)
In the first part of our workshop we open the gates to our labs and present current research prototypes on the topics of Virtual Reality, Industry 4.0 and Future Worker Assistance. We also kindly invite every author to contribute their own demonstrations (each presentation should last no longer than 10 minutes).
12:45 - 13:30 | Lunch Break
13:30 - 15:00 | Workshop Part II: Building Bridges (Chair: Gerald Bieber)
In the second part of this workshop we will collect research questions arising from a thought-provoking discussion. In a follow-up we aim to sketch solutions in the form of concepts that can help all participants with their future research. We strongly encourage everybody to venture into new collaborations.
15:00 - 15:15 | Closing